Search results for: edge detection algorithm
2051 A Model to Assess Sustainability Using Multi-Criteria Analysis and Geographic Information Systems: A Case Study
Authors: Antonio Boggia, Luisa Paolotti, Gianluca Massei, Lucia Rocchi, Elaine Pace, Maria Attard
Abstract:
The aim of this paper is to present a methodology and a computer model for sustainability assessment based on the integration of Multi-criteria Decision Analysis (MCDA) with a Geographic Information System (GIS). It presents the result of a study for the implementation of a model for measuring sustainability to address the policy actions for the improvement of sustainability at the territorial level. The aim is to rank areas in order to understand the specific technical and/or financial support that is required to develop sustainable growth. Assessing sustainable development is a multidimensional problem: economic, social and environmental aspects have to be taken into account at the same time. The tool for a multidimensional representation is a proper set of indicators. The set of indicators must be integrated into a model, that is, an assessment methodology, to be used for measuring sustainability. The model, developed by the Environmental Laboratory of the University of Perugia, is called GeoUmbriaSUIT. It is a calculation procedure developed as a plugin working in the open-source GIS software QuantumGIS. The multi-criteria method used within GeoUmbriaSUIT is the TOPSIS algorithm (Technique for Order Preference by Similarity to Ideal Solution), which defines a ranking based on the distance from the worst point and the closeness to an ideal point for each of the criteria used. For the sustainability assessment procedure, GeoUmbriaSUIT uses a geographic vector file where the graphic data represent the study area and the single evaluation units within it (the alternatives, e.g. the regions of a country, or the municipalities of a region), while the alphanumeric data (attribute table) describe the environmental, economic and social aspects related to the evaluation units by means of a set of indicators (criteria). The algorithm available in the plugin allows the indicators representing the three dimensions of sustainability to be treated individually and three different indices to be computed: an environmental index, an economic index and a social index. The graphic output of the model allows for an integrated assessment of the three dimensions, avoiding aggregation. The separate indices and the graphic output make GeoUmbriaSUIT a readable and transparent tool, since it does not produce an aggregate sustainability index as the final result of the calculations, which is often cryptic and difficult to interpret. In addition, it is possible to develop a “back analysis”, able to explain the positions obtained by the alternatives in the ranking, based on the criteria used. The case study presented is an assessment of the level of sustainability in the six regions of Malta, an island state in the middle of the Mediterranean Sea and the southernmost member of the European Union. The results show that the integration of MCDA-GIS is an adequate approach for sustainability assessment. In particular, the implemented model is able to provide easy-to-understand results. This is a very important condition for a sound decision support tool, since most of the time decision makers are not experts and need understandable output. In addition, the evaluation path is traceable and transparent.
Keywords: GIS, multi-criteria analysis, sustainability assessment, sustainable development
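A minimal sketch of the TOPSIS closeness-to-ideal ranking described above, with a hypothetical indicator matrix and weights (not GeoUmbriaSUIT's actual code):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) on criteria (columns) by closeness to the ideal point."""
    # vector-normalize each criterion, then weight it
    norm = matrix / np.sqrt((matrix ** 2).sum(axis=0))
    v = norm * weights
    # ideal (best) and anti-ideal (worst) point per criterion
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_worst = np.sqrt(((v - worst) ** 2).sum(axis=1))
    return d_worst / (d_best + d_worst)   # closeness coefficient, higher = better

# hypothetical example: 3 evaluation units, 2 environmental indicators
scores = topsis(np.array([[0.3, 12.0], [0.5, 8.0], [0.4, 10.0]]),
                weights=np.array([0.6, 0.4]),
                benefit=np.array([True, False]))   # larger is better / smaller is better
print(scores.argsort()[::-1])                      # ranking, best first
```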
Procedia PDF Downloads 289
2050 Characterization of Agroforestry Systems in Burkina Faso Using an Earth Observation Data Cube
Authors: Dan Kanmegne
Abstract:
Africa will become the most populated continent by the end of the century, with around 4 billion inhabitants. Food security and climate change will become continental issues, since agricultural practices depend on climate but also contribute to global emissions and land degradation. Agroforestry has been identified as a cost-efficient and reliable strategy to address these two issues. It is defined as the integrated management of trees and crops/animals in the same land unit. Agroforestry provides benefits in terms of goods (fruits, medicine, wood, etc.) and services (windbreaks, fertility, etc.), and is acknowledged to have a great potential for carbon sequestration; therefore, it can be integrated into carbon emission reduction mechanisms. Particularly in sub-Saharan Africa, the constraint lies in the lack of information about both the area under agroforestry and the characterization (composition, structure, and management) of each agroforestry system at the country level. This study describes and quantifies “what is where?” as a first step toward the quantification of carbon stock in different systems. Remote sensing (RS) is the most efficient approach to map a dynamic practice such as agroforestry, since it gives relatively adequate and consistent information over a large area at nearly no cost. RS data fulfill the good practice guidelines of the Intergovernmental Panel on Climate Change (IPCC) for use in carbon estimation. Satellite data are getting more and more accessible, and the archives are growing exponentially. To retrieve useful information that supports decision-making out of this large amount of data, satellite data need to be organized so as to ensure fast processing, quick accessibility, and ease of use. A new solution is a data cube, which can be understood as a multi-dimensional stack (space, time, data type) of spatially aligned pixels used for efficient access and analysis. A data cube for Burkina Faso has been set up through the cooperation project between the international service provider WASCAL and Germany, which provides an accessible exploitation architecture for multi-temporal satellite data. The aim of this study is to map and characterize agroforestry systems using the Burkina Faso earth observation data cube. The approach in its initial stage is based on an unsupervised image classification of a normalized difference vegetation index (NDVI) time series from 2010 to 2018, to stratify the country based on the vegetation. Fifteen strata were identified, and four samples per location were randomly assigned to define the sampling units. For safety reasons, the northern part of the country will not be part of the fieldwork. A total of 52 locations will be visited by the end of the dry season in February-March 2020. The field campaigns will consist of identifying and describing different agroforestry systems and of qualitative interviews. A multi-temporal supervised image classification will be done with a random forest algorithm, and the field data will be used both for training the algorithm and for accuracy assessment. The expected outputs are (i) map(s) of agroforestry dynamics, (ii) characteristics of different systems (main species, management, area, etc.), and (iii) an assessment report of the Burkina Faso data cube.
Keywords: agroforestry systems, Burkina Faso, earth observation data cube, multi-temporal image classification
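A minimal sketch of the unsupervised stratification step (k-means on per-pixel NDVI time series, with 15 strata as in the study); the NDVI stack here is a synthetic stand-in, not the Burkina Faso data cube:

```python
import numpy as np
from sklearn.cluster import KMeans

# synthetic stand-in for the 2010-2018 NDVI stack: (n_dates, height, width)
rng = np.random.default_rng(0)
ndvi = rng.random((108, 60, 60)).astype(np.float32)   # e.g. 108 monthly composites

n_dates, h, w = ndvi.shape
pixels = ndvi.reshape(n_dates, -1).T                  # each pixel = its NDVI time series
valid = ~np.isnan(pixels).any(axis=1)                 # drop cloud/no-data pixels

# unsupervised stratification into 15 vegetation strata
km = KMeans(n_clusters=15, n_init=10, random_state=0).fit(pixels[valid])

strata = np.full(h * w, -1, dtype=int)
strata[valid] = km.labels_
strata_map = strata.reshape(h, w)                     # stratum id per pixel
print(np.bincount(km.labels_))                        # pixels per stratum
```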
Procedia PDF Downloads 146
2049 Identification of Disease Causing DNA Motifs in Human DNA Using Clustering Approach
Authors: G. Tamilpavai, C. Vishnuppriya
Abstract:
Studying DNA (deoxyribonucleic acid) sequences is useful for understanding biological processes and is applied in fields such as diagnostic and forensic research. DNA carries the hereditary information in humans and almost all other organisms and is passed on to subsequent generations. Early-stage detection of a defective DNA sequence may lead to many developments in the field of bioinformatics. Nowadays, various tedious techniques are used to identify defective DNA. The proposed work is to analyze and identify the cancer-causing DNA motif in a given sequence. Initially, the human DNA sequence is separated into k-mers using a k-mer separation rule. The separated k-mers are clustered using a Self-Organizing Map (SOM). Using the Levenshtein distance measure, the cancer-associated DNA motif is identified from the k-mer clusters. Experimental results of this work indicate the presence or absence of the cancer-causing DNA motif. If the cancer-associated DNA motif is found in the DNA, it is declared a cancer-causing DNA sequence; otherwise, the input human DNA is declared a normal sequence. Finally, the elapsed time for finding the cancer-causing DNA motif using cluster formation is calculated and compared with the normal process of finding the motif. Locating the cancer-associated motif is easier in the cluster formation process than in the other one. The proposed work will be an initial aid for research related to genetic diseases.
Keywords: bioinformatics, cancer motif, DNA, k-mers, Levenshtein distance, SOM
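A minimal sketch of the k-mer separation and Levenshtein-distance matching steps described above (the motif and sequence are hypothetical; the SOM clustering step is omitted):

```python
def kmers(sequence, k=8):
    """Split a DNA sequence into overlapping k-mers."""
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                # deletion
                           cur[j - 1] + 1,             # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# hypothetical motif and input sequence
motif = "GATTACAG"
sequence = "ACGTGATTACAGTTCA"
hits = [km for km in kmers(sequence, k=len(motif)) if levenshtein(km, motif) <= 1]
print("motif present" if hits else "normal sequence")
```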
Procedia PDF Downloads 188
2048 Study and Calibration of Autonomous UAV Systems With Thermal Sensing With Multi-purpose Roles
Authors: Raahil Sheikh, Prathamesh Minde, Priya Gujjar, Himanshu Dwivedi, Abhishek Maurya
Abstract:
UAVs have been part of our environment since the Austrian military first used them in warfare against Venice. At that stage, they were just pilotless balloons equipped with bombs to be dropped on enemy territory. Over time, technological advancements allowed UAVs to be controlled remotely or autonomously. This study mainly focuses on enhancing pre-existing manually operated drones by equipping them with a variety of sensors, making them autonomous and more capable, and purposing them for a variety of roles, including thermal sensing, data collection, tracking creatures, forest fire and volcano detection, hydrothermal studies, urban heat island measurement, and other environmental research. The system can also be used for reconnaissance, research, 3D mapping, and search and rescue missions. The work aims at automating tedious tasks, reducing human error as much as possible, reducing deployment time, and increasing the overall efficiency, efficacy, and reliability of the UAVs. A comprehensive Ground Control System (GCS) UI was created, enabling less-trained professionals to use the UAV with maximum potency. With the inclusion of such an autonomous system, paths can be planned with artificial intelligence, and environmental concerns such as gusts can be avoided.
Keywords: UAV, autonomous systems, drones, geo thermal imaging
Procedia PDF Downloads 86
2047 An Integrated Lightweight Naïve Bayes Based Webpage Classification Service for Smartphone Browsers
Authors: Mayank Gupta, Siba Prasad Samal, Vasu Kakkirala
Abstract:
The internet world and its priorities have changed considerably in the last decade. Browsing on smartphones has increased manifold and is set to grow much more. Users spend considerable time browsing different websites, which gives a great deal of insight into their preferences. Instead of presenting plain information, classifying different aspects of browsing such as Bookmarks, History, and the Download Manager into useful categories would improve and enhance the user's experience. Most classification solutions are server-side, which involves maintaining servers and other heavy resources; they also have security constraints and may miss contextual data during classification. On-device classification solves many such problems, but the challenge is to achieve accurate classification under resource constraints. On-device classification can be much more useful for personalization, reducing dependency on cloud connectivity and providing better privacy/security. This approach provides more relevant results compared to current standalone solutions because it uses content rendered by the browser, which is customized by the content provider based on the user's profile. This paper proposes a Naive Bayes based lightweight classification engine targeted at resource-constrained devices. Our solution integrates with the web browser, which in turn triggers the classification algorithm. Whenever a user browses a webpage, the solution extracts DOM tree data from the browser's rendering engine. This DOM data is dynamic, contextual, and secure data that cannot be replicated. The proposal extracts different features of the webpage and runs them through an algorithm that classifies the page into multiple categories. A Naive Bayes based engine is chosen in this solution for its inherent advantages in using limited resources compared to other classification algorithms like Support Vector Machines, Neural Networks, etc. Naive Bayes classification requires a small memory footprint and little computation, which suits the smartphone environment. The solution has a feature to partition the model into multiple chunks, which in turn reduces memory usage compared to loading a complete model. Classification of webpages through the integrated engine is faster, more relevant, and more energy efficient than other standalone on-device solutions. This classification engine has been tested on Samsung Z3 Tizen hardware. The engine is integrated into the Tizen Browser, which uses the Chromium rendering engine. For this solution, an extensive dataset was sourced from dmoztools.net and cleaned. This cleaned dataset has 227.5K webpages, which are divided into 8 generic categories ('education', 'games', 'health', 'entertainment', 'news', 'shopping', 'sports', 'travel'). Our browser-integrated solution has resulted in 15% less memory usage (due to the partition method) and 24% less power consumption in comparison with the standalone solution. This solution used 70% of the dataset for training the data model and the remaining 30% for testing. An average accuracy of ~96.3% is achieved across the above-mentioned 8 categories. The engine can be further extended for suggesting dynamic tags and for applying the classification to different use cases to enhance the browsing experience.
Keywords: chromium, lightweight engine, mobile computing, Naive Bayes, Tizen, web browser, webpage classification
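A minimal bag-of-words Naive Bayes sketch of the kind of on-device categorization described above (hypothetical training snippets; the paper's engine extracts features from the DOM rather than raw page text):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# hypothetical training data: text extracted from labeled webpages
pages = ["soccer match final score highlights", "discount shoes buy online cart",
         "university course exam syllabus", "flight hotel booking itinerary"]
labels = ["sports", "shopping", "education", "travel"]

# bag-of-words features + multinomial Naive Bayes: small memory footprint and
# cheap training/prediction (counting and multiplication), which suits on-device use
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(pages, labels)

print(model.predict(["cheap sneakers add to cart"]))  # expected: ['shopping']
```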
Procedia PDF Downloads 163
2046 Single-Molecule Analysis of Structure and Dynamics in Polymer Materials by Super-Resolution Technique
Authors: Hiroyuki Aoki
Abstract:
The physical properties of polymer materials are dependent on the conformation and molecular motion of a polymer chain. Therefore, the structure and dynamic behavior of the single polymer chain have been the most important concerns in the field of polymer physics. However, it has been impossible to directly observe the conformation of a single polymer chain in a bulk medium. In the current work, novel techniques to study the conformation and dynamics of a single polymer chain are proposed. Since fluorescence methods are extremely sensitive, fluorescence microscopy enables the direct detection of a single molecule. However, the structure of a polymer chain as large as 100 nm cannot be resolved by conventional fluorescence methods because of the diffraction limit of light. In order to observe the single chains, we developed a method for labeling polymer materials with a photo-switchable dye and applied super-resolution microscopy. Real-space conformational analysis of single polymer chains with a spatial resolution of 15-20 nm was achieved. Super-resolution microscopy enables us to obtain the three-dimensional coordinates; therefore, we succeeded in the conformational analysis in three dimensions. Direct observation by nanometric optical microscopy would reveal detailed information on the molecular processes in various polymer systems.
Keywords: polymer materials, single molecule, super-resolution techniques, conformation
Procedia PDF Downloads 306
2045 Auditory Rehabilitation via a VR Serious Game for Children with Cochlear Implants: Bio-Behavioral Outcomes
Authors: Areti Okalidou, Paul D. Hatzigiannakoglou, Aikaterini Vatou, George Kyriafinis
Abstract:
Young children are nowadays adept at using technology. Hence, computer-based auditory training programs (CBATPs) have become increasingly popular in aural rehabilitation for children with hearing loss and/or with cochlear implants (CI). Yet, their clinical utility for prognostic, diagnostic, and monitoring purposes has not been explored. The purposes of the study were: a) to develop an updated version of the auditory rehabilitation tool for Greek-speaking children with cochlear implants, b) to develop a database for behavioral responses, and c) to compare accuracy rates and reaction times in children differing in hearing status and other medical and demographic characteristics, in order to assess the tool's clinical utility in prognosis, diagnosis, and progress monitoring. The updated version of the auditory rehabilitation tool was developed on a tablet, retaining the User-Centered Design approach and the elements of the Virtual Reality (VR) serious game. The visual stimuli were farm animals acting in simple game scenarios designed to trigger children's responses to animal sounds, names, and relevant sentences. Based on an extended version of Erber's auditory development model, the VR game consisted of six stages, i.e., sound detection, sound discrimination, word discrimination, identification, comprehension of words in a carrier phrase, and comprehension of sentences. A familiarization (learning) stage was set prior to the game. Children's tactile responses were recorded as correct, false, or impulsive, following a child-dependent set-up of a valid delay time after stimulus offset for valid responses. Reaction times were also recorded, and the database was in Excel format. The tablet version of the auditory rehabilitation tool was piloted in 22 preschool children with Normal Hearing (NH), which led to improvements. The study took place in clinical settings or at children's homes. Fifteen children with CI, aged 5;7-12;3 years and with post-implantation times of 0;11-5;1 years, used the auditory rehabilitation tool. Eight children with CI were monolingual, two were bilingual, and five had additional disabilities. The control group consisted of 13 children with NH, aged 2;6-9;11 years. A comparison of both accuracy rates, as percent correct, and reaction times (in sec) was made at each stage, across hearing status and age, and also, within the CI group, based on the presence of additional disability and bilingualism. Both monolingual Greek-speaking children with CI with no additional disabilities and hearing peers showed high accuracy rates at all stages, with performances falling above the 3rd quartile. However, children with normal hearing scored higher than the children with CI, especially in the detection and word discrimination tasks. The reaction time differences between the two groups decreased in language-based tasks. Results for children with CI with additional disability or bilingualism varied. Finally, older children scored higher than younger ones in both groups (CI, NH), but larger differences occurred in children with CI. The interactions between familiarization with the software, age, hearing status, and demographic characteristics are discussed. Overall, the VR game is a promising tool for tracking the development of auditory skills, as it provides multi-level longitudinal empirical data.
Acknowledgment: This work is part of a project that has received funding from the Research Committee of the University of Macedonia under the Basic Research 2020-21 funding programme.
Keywords: VR serious games, auditory rehabilitation, auditory training, children with cochlear implants
Procedia PDF Downloads 89
2044 Ayurvastra: A Study on the Ancient Indian Textile for Healing
Authors: Reena Aggarwal
Abstract:
The use of textile chemicals in the various pre- and post-manufacturing processes has made the textile industry conscious of its negative contribution to environmental pollution. Popular environmentally friendly fibers such as recycled polyester and organic cotton are now increasingly used by fabric and apparel manufacturers. However, after these textiles or the finished apparel are manufactured, they have to be dyed in the same chemical dyes that are harmful and toxic to the environment. Dyeing is a major area of concern for the environment as well as for people who have chemical sensitivities, as it may cause nausea, breathing difficulties, seizures, etc. Ayurvastra, or herbal medical textiles, is one step ahead of the organic lifestyle: it supports the core concept of holistic well-being and also eliminates the impact of harmful chemicals and pesticides. There is a wide range of herbs that can be used not only for dyeing but also for providing medicinal properties to the textiles, such as antibacterial, antifungal, antiseptic, and antidepressant effects, and for treating insomnia, skin diseases, etc. The concept of herbal dyeing of fabric is to manifest herbal essence in every aspect of clothing, i.e., from production to end-use, and additionally to eliminate the impact of harmful chemical dyes and chemicals, which are known to result in problems like skin rashes, headache, trouble concentrating, nausea, diarrhea, fatigue, muscle and joint pain, dizziness, difficulty breathing, irregular heartbeat, and seizures. Herbal dyeing or finishing gives an extra edge to the textiles, as it adds an extra function to the fabric. The herbal extracts can be applied to the textiles by a simple process like the pad-dry-cure method and act on the human body mainly through the skin, aiding in the treatment of disease or the management of medical conditions through their herbal properties. This paper, therefore, delves into producing Ayurvastra, which is a perfect amalgamation of cloth and wellness. Keeping that in mind, a range of antifungal socks and antibacterial napkins treated with turmeric and aloe vera was developed, which are recommended for the treatment of fungal and bacterial infections, respectively. Both the herbal antifungal socks and the antibacterial napkins have proved to be efficient in managing and treating fungal and bacterial infections of the skin, respectively.
Keywords: ayurvastra, ayurveda, herbal, pandemic, sustainable
Procedia PDF Downloads 130
2043 A Distinct Method Based on Mamba-Unet for Brain Tumor Image Segmentation
Authors: Djallel Bouamama, Yasser R. Haddadi
Abstract:
Accurate brain tumor segmentation is crucial for diagnosis and treatment planning, yet it remains a challenging task due to the variability in tumor shapes and intensities. This paper introduces a distinct approach to brain tumor image segmentation by leveraging an advanced architecture known as Mamba-Unet. Building on the well-established U-Net framework, Mamba-Unet incorporates distinct design enhancements to improve segmentation performance. Our proposed method integrates a multi-scale attention mechanism and a hybrid loss function to effectively capture fine-grained details and contextual information in brain MRI scans. Using a comprehensive dataset of annotated brain MRI scans, we demonstrate that Mamba-Unet significantly enhances segmentation accuracy compared to conventional U-Net models. Quantitative evaluations reveal that Mamba-Unet surpasses traditional U-Net architectures and other contemporary segmentation models in terms of Dice coefficient, sensitivity, and specificity. The improvements are attributed to the method's ability to better manage class imbalance and resolve complex tumor boundaries. This work advances the state-of-the-art in brain tumor segmentation and holds promise for improving clinical workflows and patient outcomes through more precise and reliable tumor detection.
Keywords: brain tumor classification, image segmentation, CNN, U-NET
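A minimal sketch of the Dice coefficient used in the evaluation above, computed on hypothetical binary masks (this illustrates the metric only, not the Mamba-Unet implementation):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice overlap between two binary segmentation masks (1 = tumor, 0 = background)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# hypothetical 4x4 predicted and ground-truth masks
pred   = np.array([[0,1,1,0],[0,1,1,0],[0,0,1,0],[0,0,0,0]])
target = np.array([[0,1,1,0],[0,1,0,0],[0,0,1,1],[0,0,0,0]])
print(round(dice_coefficient(pred, target), 3))   # 2*4 / (6+5) ~= 0.727
```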
Procedia PDF Downloads 38
2042 The Fit of the Partial Pair Distribution Functions of BaMnFeF7 Fluoride Glass Using the Buckingham Potential by the Hybrid RMC Simulation
Authors: Sidi Mohamed Mesli, Mohamed Habchi, Arslane Boudghene Stambouli, Rafik Benallal
Abstract:
The BaMnMF7 (M = Fe, V) transition metal fluoride glasses, assuming isomorphous replacement, have been structurally studied through the simultaneous simulation of their neutron diffraction patterns by reverse Monte Carlo (RMC) and by Hybrid Reverse Monte Carlo (HRMC) analysis. The latter is applied to remedy the problem of the artificial satellite peaks that appear in the partial pair distribution functions (PDFs) produced by the RMC simulation. The HRMC simulation is an extension of the RMC algorithm which introduces an energy penalty term (potential) in the acceptance criteria. The idea of this work is to apply the Buckingham potential to the title glass, ignoring the van der Waals terms, in order to fit the partial pair distribution functions and give the most realistic features possible. When displaying the partial PDFs, we suggest that the Buckingham potential is useful to describe average correlations, especially in similar interactions.
Keywords: fluoride glasses, RMC simulation, hybrid RMC simulation, Buckingham potential, partial pair distribution functions
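A minimal sketch of the Buckingham pair potential and an HRMC-style acceptance test with an energy penalty; the parameter values and weighting are illustrative only, not the ones fitted in the paper:

```python
import math, random

def buckingham(r, A, rho, C=0.0):
    """Buckingham pair potential U(r) = A*exp(-r/rho) - C/r**6.
    The study ignores the van der Waals term, i.e. C = 0."""
    return A * math.exp(-r / rho) - C / r**6

def hrmc_accept(chi2_old, chi2_new, e_old, e_new, w, T):
    """HRMC acceptance: RMC chi-square cost plus a weighted energy penalty,
    accepted with Metropolis probability."""
    delta = (chi2_new - chi2_old) + w * (e_new - e_old) / T
    return delta <= 0 or random.random() < math.exp(-delta)

# hypothetical parameters for one cation-fluorine pair at r = 2.1 angstroms
print(buckingham(2.1, A=1000.0, rho=0.3))
print(hrmc_accept(chi2_old=120.0, chi2_new=118.5, e_old=-3.2, e_new=-3.1, w=0.1, T=300.0))
```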
Procedia PDF Downloads 503
2041 Development of Quasi Real-Time Comprehensive System for Earthquake Disaster
Authors: Zhi Liu, Hui Jiang, Jin Li, Kunhao Chen, Langfang Zhang
Abstract:
Fast acquisition of seismic information and accurate assessment of the earthquake disaster are the key problems for emergency rescue after a destructive earthquake. In order to meet the requirements of earthquake emergency response and rescue for cities and counties, a quasi real-time comprehensive evaluation system for earthquake disaster is developed. Based on monitoring data from a Micro-Electro-Mechanical Systems (MEMS) strong-motion network, a structure database of a county area, and the real-time disaster information reported via mobile terminals after an earthquake, a fragility analysis method and a dynamic correction algorithm are combined in the developed system. Real-time evaluation of the seismic disaster in the county region is finally realized to provide a scientific basis for seismic emergency command, rescue, and decision support.
Keywords: quasi real-time, earthquake disaster data collection, MEMS accelerometer, dynamic correction, comprehensive evaluation
Procedia PDF Downloads 213
2040 Plasma Electrolytes and Gamma Glutamyl Transpeptidase (GGT) Status in Dementia Subjects in Southern Nigeria
Authors: Salaam Mujeeb, Adeola Segun, Abdullahi Olasunkanmi
Abstract:
Dementia is becoming a major concern as the world population is increasing and elderly populations are being neglected. Liver and kidney diseases have been implicated as risk factors in the etiology of dementia. This study, therefore, evaluates plasma Gamma Glutamyl Transferase (GGT) activity and plasma electrolytes in order to find an association between these biomarkers and dementia. The subjects (38) were age- and sex-matched with their corresponding controls, and structured questionnaires were used to obtain medical information. Using spectrophotometric and ion-selective electrode techniques, respectively, we found an elevated GGT activity in the dementia subjects. Remarkably, no association was found between plasma electrolyte levels and dementia. It was also observed that the severity of dementia worsens with age. Moreover, the condition of the dementia subjects worsens with reducing weight. Furthermore, the presence of comorbidities (e.g., hypertension, obesity, diabetes) and habits like smoking, drug use, and alcohol consumption interferes with electrolyte balance. Weight loss monitoring and BMI checks are advised in elderly individuals, particularly females, as they may be indicative of early or future cognitive impairment and might therefore be useful as an early detection tool. Government and society should invest more in the geriatric population by establishing old people's homes and providing social care services.
Keywords: clinical characteristics, dementia, electrolytes, gamma glutamyl transpeptidase, GGT
Procedia PDF Downloads 325
2039 Optimal Maintenance Policy for a Three-Unit System
Authors: A. Abbou, V. Makis, N. Salari
Abstract:
We study the condition-based maintenance (CBM) problem of a system subject to stochastic deterioration. The system is composed of three units (or modules): (i) Module 1 deterioration follows a Markov process with two operational states and one failure state; the operational states are partially observable through periodic condition monitoring. (ii) Module 2 deterioration follows a Gamma process with a known failure threshold; the deterioration level of this module is fully observable through periodic inspections. (iii) Only operating-age information is available for Module 3; the lifetime of this module has a general distribution. A CBM policy prescribes when to initiate a maintenance intervention and which modules to repair during the intervention. Our objective is to determine the optimal CBM policy minimizing the long-run expected average cost of operating the system. This is achieved by formulating a Markov decision process (MDP) and developing the value iteration algorithm for solving the MDP. We provide numerical examples illustrating the cost-effectiveness of the optimal CBM policy through a comparison with heuristic policies commonly found in the literature.
Keywords: reliability, maintenance optimization, Markov decision process, heuristics
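A minimal value-iteration sketch on a toy, fully observable maintenance MDP with hypothetical transition and cost matrices; it only illustrates the iteration itself, not the paper's partially observable three-unit model or its average-cost formulation (a discount factor is used here for simplicity):

```python
import numpy as np

# toy MDP: 3 condition states (good, degraded, failed), 2 actions (0 = do nothing, 1 = repair)
# P[a][s][s'] = transition probability, cost[a][s] = immediate cost -- all values hypothetical
P = np.array([[[0.8, 0.2, 0.0],
               [0.0, 0.7, 0.3],
               [0.0, 0.0, 1.0]],      # do nothing
              [[1.0, 0.0, 0.0],
               [1.0, 0.0, 0.0],
               [1.0, 0.0, 0.0]]])     # repair returns the unit to "good"
cost = np.array([[0.0, 1.0, 10.0],    # operating / failure costs
                 [5.0, 5.0,  8.0]])   # repair costs

gamma, V = 0.95, np.zeros(3)
for _ in range(1000):                 # value iteration
    V_new = (cost + gamma * P @ V).min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new
policy = (cost + gamma * P @ V).argmin(axis=0)   # best action per condition state
print(V, policy)
```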
Procedia PDF Downloads 219
2038 Comparison Between Genetic Algorithms and Particle Swarm Optimization Optimized Proportional Integral Derivative and PSS for Single Machine Infinite System
Authors: Benalia Nadia, Zerzouri Nora, Ben Si Ali Nadia
Abstract:
Among the many different modern heuristic optimization methods, genetic algorithms (GA) and the particle swarm optimization (PSO) technique have been attracting a lot of interest. The GA has gained popularity in academia and business mostly because of its simplicity, its ability to solve highly nonlinear mixed-integer optimization problems that are typical of complex engineering systems, and its intuitiveness. The mechanics of the PSO methodology, a relatively recent heuristic search tool, are modeled after the swarming or cooperative behavior of biological groups. It is appropriate to compare the performance of the two techniques, since they both aim to optimize a particular objective function but make use of distinct computing methods. In this article, PSO and GA optimization approaches are used for the parameter tuning of the power system stabilizer (PSS) and the proportional-integral-derivative (PID) regulator. Load angle and rotor speed variations in the single machine infinite bus system are used to measure the performance of the suggested solutions.
Keywords: SMIB, genetic algorithm, PSO, transient stability, power system stabilizer, PID
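A minimal particle swarm optimization sketch for tuning three controller gains; the objective below is a hypothetical stand-in for the error index that would, in the paper's setting, come from simulating the SMIB model's rotor-speed and load-angle response:

```python
import numpy as np

def pso(objective, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer over box bounds (dim = len(bounds))."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

def cost(gains):
    """Hypothetical stand-in for an ITAE-style index of the closed-loop response."""
    kp, ki, kd = gains
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2 + (kd - 0.1) ** 2

best_gains, best_cost = pso(cost, bounds=[(0, 10), (0, 5), (0, 1)])
print(best_gains, best_cost)
```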
Procedia PDF Downloads 84
2037 Modified Naive Bayes-Based Prediction Modeling for Crop Yield Prediction
Authors: Kefaya Qaddoum
Abstract:
Most greenhouse growers desire a predetermined amount of yield in order to accurately meet market requirements. The purpose of this paper is to model a simple but often satisfactory supervised classification method. The original naive Bayes has a serious weakness, namely its sensitivity to redundant predictors. In this paper, a regularization technique is used to obtain a computationally efficient classifier based on naive Bayes. The suggested construction, which uses an L1 penalty, is capable of clearing out redundant predictors; a modification of the LARS algorithm is devised to solve this problem, making the method applicable to a wide range of data. In the experimental section, a study is conducted to examine the effect of redundant and irrelevant predictors and to test the method on a WSG data set of tomato yields, where there are many more predictors than data points and the urgent need to predict weekly yield is the goal of this approach. Finally, the modified approach is compared with several naive Bayes variants and other classification algorithms (SVM and kNN) and is shown to be fairly good.
Keywords: tomato yield prediction, naive Bayes, redundancy, WSG
Procedia PDF Downloads 237
2036 A Novel Meta-Heuristic Algorithm Based on Cloud Theory for Redundancy Allocation Problem under Realistic Condition
Authors: H. Mousavi, M. Sharifi, H. Pourvaziri
Abstract:
The Redundancy Allocation Problem (RAP) is a well-known mathematical problem for modeling series-parallel systems. It is a combinatorial optimization problem which focuses on determining an optimal assignment of components in a system design. In this paper, to be more practical, we have considered the problem of redundancy allocation for a series system with interval-valued component reliabilities. Therefore, during the search process, the reliabilities of the components are treated as stochastic variables with lower and upper bounds. In order to optimize the problem, we propose a simulated annealing algorithm based on cloud theory (CBSAA). Also, Monte Carlo simulation (MCS) is embedded into the CBSAA to handle the random component reliabilities. This novel approach has been investigated on numerical examples, and the experimental results have shown that the CBSAA combined with MCS is an efficient tool to solve the RAP for systems with interval-valued component reliabilities.
Keywords: redundancy allocation problem, simulated annealing, cloud theory, Monte Carlo simulation
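A minimal sketch of simulated annealing over redundancy levels with Monte Carlo sampling of interval-valued reliabilities; the reliability intervals, costs, and the classic geometric cooling schedule are hypothetical stand-ins for the paper's cloud-theory-based annealing:

```python
import math, random

random.seed(0)
# hypothetical series system: per-subsystem reliability intervals and unit costs
rel_bounds = [(0.80, 0.90), (0.70, 0.85), (0.90, 0.95)]
unit_cost  = [3.0, 2.0, 4.0]
budget     = 30.0

def expected_reliability(alloc, samples=200):
    """Monte Carlo estimate of series-system reliability with interval-valued reliabilities."""
    total = 0.0
    for _ in range(samples):
        r = 1.0
        for n, (lo, hi) in zip(alloc, rel_bounds):
            r *= 1.0 - (1.0 - random.uniform(lo, hi)) ** n   # parallel redundancy per subsystem
        total += r
    return total / samples

def cost(alloc):
    return sum(n * c for n, c in zip(alloc, unit_cost))

alloc = [1, 1, 1]
best, best_r, T = alloc[:], expected_reliability(alloc), 0.05
for step in range(2000):                                     # simulated annealing
    cand = alloc[:]
    i = random.randrange(len(cand))
    cand[i] = max(1, cand[i] + random.choice([-1, 1]))
    if cost(cand) > budget:
        continue
    r_new, r_old = expected_reliability(cand), expected_reliability(alloc)
    if r_new > r_old or random.random() < math.exp((r_new - r_old) / T):
        alloc = cand
        if r_new > best_r:
            best, best_r = cand[:], r_new
    T *= 0.998                                               # geometric cooling
print(best, round(best_r, 4))
```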
Procedia PDF Downloads 412
2035 Predicting Indonesia External Debt Crisis: An Artificial Neural Network Approach
Authors: Riznaldi Akbar
Abstract:
In this study, we examined the performance of an Artificial Neural Network (ANN) model with the back-propagation algorithm in correctly predicting in-sample and out-of-sample external debt crises in Indonesia. We found that the exchange rate, foreign reserves, and exports are the major determinants of experiencing an external debt crisis. The ANN in-sample performance provides relatively superior results: the ANN model is able to correctly classify 89.12 per cent of crises with a reasonably low false alarm rate of 7.01 per cent. Out of sample, the prediction performance deteriorates fairly compared to the in-sample performance. This could be explained by the ANN model tending to over-fit the in-sample data, so that it could not fit the out-of-sample data very well. 10-fold cross-validation has been used to improve the out-of-sample prediction accuracy. The results also offer policy implications. The out-of-sample performance can be very sensitive to the size of the samples, as it can yield a higher total misclassification error and lower prediction accuracy. The ANN model could be used to identify past crisis episodes with some accuracy, but predicting crises outside the estimation sample is much more challenging because of the presence of uncertainty.
Keywords: debt crisis, external debt, artificial neural network, ANN
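A minimal sketch of a back-propagation-trained feed-forward classifier evaluated with 10-fold cross-validation; the macro-indicator data here are synthetic placeholders, not the Indonesian series used in the study:

```python
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# synthetic quarterly data: exchange rate change, foreign reserves, exports -> crisis flag
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))
y = (X[:, 0] - 0.5 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(scale=0.5, size=120) > 0.8).astype(int)

# feed-forward network trained by back-propagation, scored with 10-fold cross-validation
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
scores = cross_val_score(model, X, y,
                         cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0))
print(scores.mean())
```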
Procedia PDF Downloads 444
2034 A Self-Heating Gas Sensor of SnO2-Based Nanoparticles Electrophoretic Deposited
Authors: Glauco M. M. M. Lustosa, João Paulo C. Costa, Sonia M. Zanetti, Mario Cilense, Leinig Antônio Perazolli, Maria Aparecida Zaghete
Abstract:
The contamination of the environment has been one of the biggest problems of our time, mostly due to the development of many industries. SnO2 is an n-type semiconductor with a band gap of about 3.5 eV whose electrical conductivity depends on the type and amount of modifier agents added into the ceramic matrix during the synthesis process, allowing applications such as the sensing of gaseous pollutants in the ambient. The chemical synthesis by the polymeric precursor method consists of a complexation reaction between tin ions and citric acid at 90 °C/2 hours and the subsequent addition of ethylene glycol for polymerization at 130 °C/2 hours. Polymeric resins of zinc, cobalt, and niobium ions were also prepared. Stoichiometric amounts of the solutions were mixed to obtain the systems (Zn, Nb)-SnO2 and (Co, Nb)-SnO2. The metal immobilization reduces segregation during calcination, resulting in a crystalline oxide with high chemical homogeneity. The resin was pre-calcined at 300 °C/1 hour, milled in an attritor mill at 500 rpm/1 hour, and then calcined at 600 °C/2 hours. X-ray diffraction (XRD) indicated the formation of the SnO2 rutile phase (JCPDS card nº 41-1445). Characterization by high-resolution scanning electron microscopy showed nanostructured spherical ceramic powder 10-20 nm in diameter. 20 mg of SnO2-based powder was kept in 20 ml of isopropyl alcohol and then taken to an electrophoretic deposition (EPD) system. The EPD method allows the film thickness to be controlled through the voltage or current applied in the electrophoretic cell and by the time used for the deposition of the ceramic particles. This procedure obtains films in a short time at low cost, bringing prospects for a new generation of smaller devices with easy integration technology. In this research, films were obtained on an alumina substrate with interdigital electrodes after applying 2 kV for 5 and 10 minutes in cells containing alcoholic suspensions of (Zn, Nb)-SnO2 and (Co, Nb)-SnO2 powders, forming a sensing layer. The substrate has integrated micro-hotplates that provide an instantaneous and precise temperature control capability when a voltage is applied. The films were sintered at 900 and 1000 °C in a 770 W microwave oven, adapted by the research group itself with a temperature controller. This sintering is a fast process with a homogeneous heating rate, which promotes controlled grain growth and also the diffusion of modifier agents, inducing the creation of intrinsic defects which change the electrical characteristics of the SnO2-based powders. This study has successfully demonstrated a microfabricated system with an integrated micro-hotplate for the detection of CO and NO2 gas at different concentrations and temperatures, with self-heating SnO2-based nanoparticle films, suitable both for industrial process monitoring and for the detection of low concentrations in buildings/residences in order to safeguard human health. The results indicate the possibility of developing gas sensor devices with low power consumption for integration in portable electronic equipment with fast analysis.
Acknowledgments: The authors thank the LMA-IQ for providing the FEG-SEM images and acknowledge the financial support of this project by the Brazilian research funding agencies CNPq, FAPESP 2014/11314-9, and CEPID/CDMF-FAPESP 2013/07296-2.
Keywords: chemical synthesis, electrophoretic deposition, self-heating, gas sensor
Procedia PDF Downloads 275
2033 Biochemical and Molecular Analysis of Staphylococcus aureus Various Isolates from Different Places
Authors: Kiran Fatima, Kashif Ali
Abstract:
Staphylococcus aureus is an opportunistic human as well as animal pathogen that causes a variety of diseases. A total of 70 staphylococci isolates were obtained from soil, water, yogurt, and clinical samples. The presumptive staphylococcal clinical isolates were identified phenotypically by different biochemical tests. Molecular identification was done by PCR using species-specific 16S rRNA primer pairs, and finally, 50 isolates were found to be positive as Staphylococcus aureus, S. sciuri, S. xylosus, and S. cohnii. Screened isolates were further analyzed by several microbiological diagnostic tests, including Gram staining, coagulase, capsule, and hemolysis tests, fermentation of glucose, lactose, maltose, and sucrose, and enzymatic reactions. It was found that 78%, 81%, and 51% of isolates were positive for gelatin hydrolysis, protease, and lipase activities, respectively. Antibiogram analysis of the isolated Staphylococcus aureus strains with respect to different antimicrobial agents revealed resistance patterns ranging from 57 to 96%. Our study also shows 70% of strains to be MRSA, 54.3% to be VRSA, and 54.3% to be both MRSA and VRSA. All the identified isolates were subjected to detection of the mecA, nuc, and hlb genes, and 70%, 84%, and 40% were found to harbour the mecA, nuc, and hlb genes, respectively. The current investigation is highly important and informative regarding high-level multidrug-resistant Staphylococcus aureus infections, including resistance to methicillin and vancomycin.
Keywords: MRSA, VRSA, mecA, MSSA
Procedia PDF Downloads 130
2032 An Approaching Index to Evaluate a Forward Collision Probability
Authors: Yuan-Lin Chen
Abstract:
This paper presents an approaching forward collision probability index (AFCPI) for alerting and assisting the driver in keeping a safe distance to avoid forward collision accidents in highway driving. The time to collision (TTC) and time headway (TH) are used to evaluate the TTC forward collision probability index (TFCPI) and the TH forward collision probability index (HFCPI), respectively. A Mamdani fuzzy inference algorithm is presented that combines the TFCPI and HFCPI to calculate the approaching collision probability index of the vehicle. The AFCPI is easy to understand even for drivers who have no professional knowledge of the vehicle field. At the same time, the driver's behavior is taken into account so that the index suits each driver. For the approaching index, the value 0 indicates a 0% probability of forward collision, and the values 0.5 and 1 indicate 50% and 100% probabilities of forward collision, respectively. The AFCPI is useful and easy to understand for alerting the driver to avoid forward collision accidents when driving on the highway.
Keywords: approaching index, forward collision probability, time to collision, time headway
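A minimal sketch of the TTC and TH measures and a simple 0-1 risk mapping; the thresholds and the max aggregation are hypothetical placeholders for the paper's Mamdani membership functions and rule base:

```python
def time_to_collision(gap_m, v_follow, v_lead):
    """TTC = gap / closing speed; infinite when the gap is not closing."""
    closing = v_follow - v_lead
    return gap_m / closing if closing > 0 else float("inf")

def time_headway(gap_m, v_follow):
    """TH = gap / follower speed."""
    return gap_m / v_follow if v_follow > 0 else float("inf")

def risk(value, safe, critical):
    """Map a time measure to a 0-1 collision-probability index (1 = critical)."""
    if value >= safe:
        return 0.0
    if value <= critical:
        return 1.0
    return (safe - value) / (safe - critical)

# hypothetical scenario: 30 m gap, follower at 30 m/s, leader at 22 m/s
ttc = time_to_collision(30.0, 30.0, 22.0)     # 3.75 s
th  = time_headway(30.0, 30.0)                # 1.0 s
tfcpi = risk(ttc, safe=6.0, critical=2.0)
hfcpi = risk(th,  safe=2.0, critical=0.5)
afcpi = max(tfcpi, hfcpi)                     # simple aggregation in place of Mamdani inference
print(round(afcpi, 2))
```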
Procedia PDF Downloads 294
2031 Dynamic Cardiac Mitochondrial Proteome Alterations after Ischemic Preconditioning
Authors: Abdelbary Prince, Said Moussa, Hyungkyu Kim, Eman Gouda, Jin Han
Abstract:
We compared the dynamic alterations of the mitochondrial proteome of control, ischemia-reperfusion (IR), and ischemic preconditioned (IPC) rabbit hearts. Using 2-DE, we identified 29 mitochondrial proteins that were differentially expressed in the IR heart compared with the control and IPC hearts. For two of the spots, the expression patterns were confirmed by Western blotting analysis. These proteins included the succinate dehydrogenase complex, acyl-CoA dehydrogenase, carnitine acetyltransferase, dihydrolipoamide dehydrogenase, ATPase, ATP synthase, dihydrolipoamide succinyltransferase, ubiquinol-cytochrome c reductase, translation elongation factor, acyl-CoA dehydrogenase, actin alpha, succinyl-CoA ligase, dihydrolipoamide S-succinyltransferase, citrate synthase, acetyl-coenzyme A dehydrogenase, creatine kinase, isocitrate dehydrogenase, pyruvate dehydrogenase, prohibitin, NADH dehydrogenase (ubiquinone) Fe-S protein, enoyl-coenzyme A hydratase, superoxide dismutase [Mn], and the 24-kDa subunit of complex I. Interestingly, most of these proteins are associated with the mitochondrial respiratory chain, the antioxidant enzyme system, and energy metabolism. The results provide clues as to the cardioprotective mechanism of ischemic preconditioning at the protein level, and the identified proteins may serve as potential biomarkers for the detection of ischemia-induced cardiac injury.
Keywords: ischemic preconditioning, mitochondria, proteome, cardioprotection
Procedia PDF Downloads 349
2030 High Harmonics Generation in Hexagonal Graphene Quantum Dots
Authors: Armenuhi Ghazaryan, Qnarik Poghosyan, Tadevos Markosyan
Abstract:
We have considered high-order harmonic generation in planar graphene quantum dots of hexagonal shape within the independent quasiparticle approximation and the tight-binding model. We have investigated how such a nonlinear effect is affected by a strong optical wave field, the quantum dot's typical band gap and lateral size, and dephasing processes. The equation of motion for the density matrix is solved by performing the time integration with the eighth-order Runge-Kutta algorithm. If the optical wave frequency is much less than the quantum dot's intrinsic band gap, the main aspects of multiphoton high harmonic emission in quantum dots are revealed. In such a case, the dependence of the cutoff photon energy on the strength of the optical pump wave is almost linear. But when the wave frequency is comparable to the band gap of the quantum dot, the cutoff photon energy shows saturation behavior with an increase in the wave field strength.
Keywords: strong wave field, multiphoton, bandgap, wave field strength, nanostructure
Procedia PDF Downloads 157
2029 Contention Window Adjustment in IEEE 802.11-based Industrial Wireless Networks
Authors: Mohsen Maadani, Seyed Ahmad Motamedi
Abstract:
The use of wireless technology in industrial networks has gained considerable attention in recent years. In this paper, we have thoroughly analyzed the effect of contention window (CW) size on the performance of IEEE 802.11-based industrial wireless networks (IWNs) from a delay and reliability perspective. Results show that the default values of CWmin, CWmax, and retry limit (RL) are far from optimal due to the characteristics of industrial applications, including short packets and noisy environments. An adaptive (payload-dependent) CW algorithm has been proposed to minimize the average delay. Finally, a simple but effective CW and RL setting has been proposed for industrial applications, which outperforms the minimum-average-delay solution from a maximum delay and jitter perspective, at the cost of a slightly higher average delay. Simulation results show improvements of up to 20%, 25%, and 30% in average delay, maximum delay, and jitter, respectively.
Keywords: average delay, contention window, distributed coordination function (DCF), jitter, industrial wireless network (IWN), maximum delay, reliability, retry limit
Procedia PDF Downloads 417
2028 Renovation Planning Model for a Shopping Mall
Authors: Hsin-Yun Lee
Abstract:
In this study, the pedestrian simulation software VISWALK and a program implementing an ant algorithm are integrated into one platform to construct a renovation engineering schedule planning model. The simulation platform models the construction site while users walk through it; after the users' walking delays caused by the construction are calculated, the ant algorithm searches for the schedule plan with the minimum delay time, and the loss of business due to deactivated floor area is also computed. Finally, the best schedule plan is selected by weighing the two different standpoints of the owners and the users. To assess and validate its effectiveness, this study applied the model to a floor renovation engineering case in a shopping mall. The case verifies that the schedule plan found by the proposed model can effectively reduce both the delay time and the loss of business caused by users' walking in the mall, keeping the impact of the renovation work on the facilities and operation of the building to a minimum.
Keywords: pedestrian, renovation, schedule, simulation
Procedia PDF Downloads 413
2027 Capnography for Detection of Return of Spontaneous Circulation in Pseudo-PEA
Authors: Yiyuan David Hu, Alex Lindqwister, Samuel B. Klein, Karen Moodie, Norman A. Paradis
Abstract:
Introduction: Pseudo-Pulseless Electrical Activity (p-PEA) is a lifeless form of profound cardiac shock characterized by measurable cardiac mechanical activity without clinically detectable pulses. Patients in pseudo-PEA carry different prognoses than those in true PEA and may require different therapies. End-tidal carbon dioxide (ET-CO2) is a reliable indicator of the return of spontaneous circulation (ROSC) in ventricular fibrillation and true PEA but has not been studied in p-PEA. Hypothesis: ET-CO2 can be used as an independent indicator of ROSC in p-PEA resuscitation. Methods: 30 kg female swine (N = 14) under intravenous anesthesia were instrumented with aortic and right atrial micromanometer pressure measurement. ECG and ET-CO2 were measured continuously. p-PEA was induced by ventilation with 6% oxygen in 94% nitrogen and was defined as a systolic aortic pressure less than 40 mmHg. The statistical relationships between ET-CO2 and ROSC are reported. Results: ET-CO2 during resuscitation strongly correlated with ROSC (Figure 1). Mean ET-CO2 during p-PEA was 28.4 ± 8.4, while mean ET-CO2 at ROSC for the 100% O2 cohort was 42.2 ± 12.6 (p < 0.0001) and mean ET-CO2 at ROSC for the 100% O2 + CPR cohort was 33.0 ± 15.4 (p < 0.0001). Analysis of slope was limited to one minute of resuscitation data to capture local linearity; assessment began 10 seconds after resuscitation started to allow the ventilator to mix 100% O2. Pigs that would recover with 100% O2 had a slope of 0.023 ± 0.001, those given oxygen + CPR had a slope of 0.018 ± 0.002, and those given oxygen + CPR + epinephrine had a slope of 0.0050 ± 0.0009. Conclusions: During resuscitation from porcine hypoxic p-PEA, a rise in ET-CO2 is indicative of ROSC.
Keywords: ET-CO2, resuscitation, capnography, pseudo-PEA
Procedia PDF Downloads 187
2026 Process Improvement and Redesign of the Immuno Histology (IHC) Lab at MSKCC: A Lean and Ergonomic Study
Authors: Samantha Meyerholz
Abstract:
MSKCC offers patients cutting-edge cancer care with the highest quality standards. However, many patients and industry members do not realize that the operations of the Immuno Histology (IHC) lab are the backbone for carrying out this mission. The IHC lab manufactures blocks and slides containing critical tissue samples that will be read by a pathologist to diagnose and dictate a patient's treatment course. The lab processes 200 requests daily, leading to the generation of approximately 2,000 slides and 1,100 blocks each day. Lab material is transported through labeling, cutting, staining, and sorting manufacturing stations, while being managed by multiple techs throughout the space. The quality of the stain, as well as the wait times associated with processing requests, is directly associated with patients receiving rapid treatments and having a wider range of care options. This project aims to improve slide request turnaround time for rush and non-rush cases, while increasing the quality of each request filled (no missing slides or poorly stained items). Rush cases are to be filled in less than 24 hours, while standard cases are allotted a 48-hour time period. Reducing turnaround times enables patients to communicate sooner with their clinical team regarding their diagnosis, ultimately leading to faster treatments and potentially better outcomes. Additional project goals included streamlining tech and material workflow, while reducing waste and increasing efficiency. This project followed a DMAIC structure with emphasis on lean and ergonomic principles that could be integrated into an evolving lab culture. Load times and batching processes were analyzed using process mapping, FMEA analysis, waste analysis, engineering observation, 5S, and spaghetti diagramming. Reducing lab technician movement and improving their body position at each workstation were of top concern to pathology leadership. With new equipment being brought into the lab to carry out workflow improvements, screen and tool placement was discussed with the techs in focus groups, to reduce variation and increase comfort throughout the workspace. 5S analysis was completed in two phases in the IHC lab, helping to drive solutions that reduced rework and tech motion. The IHC lab plans to continue utilizing these techniques to further reduce the time gap between tissue analysis and cancer care.
Keywords: engineering, ergonomics, healthcare, lean
Procedia PDF Downloads 223
2025 An Integrative Computational Pipeline for Detection of Tumor Epitopes in Cancer Patients
Authors: Tanushree Jaitly, Shailendra Gupta, Leila Taher, Gerold Schuler, Julio Vera
Abstract:
Genomics-based personalized medicine is a promising approach to fight aggressive tumors based on a patient's specific tumor mutation and expression profiles. A remarkable case is dendritic cell-based immunotherapy, in which tumor epitopes targeting a patient's specific mutations are used to design a vaccine that helps stimulate cytotoxic T cell-mediated anticancer immunity. Here we present a computational pipeline for epitope-based personalized cancer vaccines using patient-specific haplotype and cancer mutation profiles. In the proposed workflow, we analyze Whole Exome Sequencing and RNA Sequencing patient data to detect patient-specific mutations and their expression levels. Epitopes containing the tumor mutations are computationally predicted using the patient's haplotype and filtered based on their expression level, binding affinity, and immunogenicity. We calculate the binding energy for each filtered major histocompatibility complex (MHC)-peptide complex using docking studies and use this feature to further select good epitope candidates.
Keywords: cancer immunotherapy, epitope prediction, NGS data, personalized medicine
Procedia PDF Downloads 254
2024 Point-of-Interest Recommender Systems for Location-Based Social Network Services
Authors: Hoyeon Park, Yunhwan Keon, Kyoung-Jae Kim
Abstract:
Location-Based Social Network services (LBSNs) is a new term that combines location-based services and social network services (SNS). Unlike traditional SNS, LBSNs emphasize empirical elements tied to the user's actual physical location. Point-of-Interest (POI) information, i.e., the most popular spots in an area, is the most important factor in implementing an LBSN recommendation system. In this study, we would like to recommend POIs to users in a specific area through a recommendation system using collaborative filtering. The process is as follows: first, we use different data sets based on Seoul and New York to find interesting results on human behavior. Secondly, based on the location-based activity information obtained from the personalized LBSNs, we devise a new rating that defines the user's preference for the area. Finally, we develop an automated rating algorithm from massive raw data using distributed systems to reduce the advertising costs of LBSNs.
Keywords: location-based social network services, point-of-interest, recommender systems, business analytics
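A minimal user-based collaborative filtering sketch over a hypothetical user-POI rating matrix (cosine similarity plus a similarity-weighted average; not the study's actual rating definition):

```python
import numpy as np

# hypothetical implicit ratings: rows = users, columns = POIs (e.g. derived from check-ins)
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 0, 5, 4]], dtype=float)

def cosine_sim(M):
    norm = np.linalg.norm(M, axis=1, keepdims=True)
    norm[norm == 0] = 1.0
    U = M / norm
    return U @ U.T

def recommend(R, user, k=2, top_n=2):
    """Score unvisited POIs for a user by similarity-weighted ratings of the k nearest users."""
    sim = cosine_sim(R)
    neighbors = np.argsort(sim[user])[::-1][1:k + 1]          # most similar users, excluding self
    weights = sim[user, neighbors]
    scores = weights @ R[neighbors] / (weights.sum() + 1e-9)  # weighted average rating
    scores[R[user] > 0] = -np.inf                             # hide POIs already visited
    return np.argsort(scores)[::-1][:top_n]

print(recommend(R, user=1))   # POI indices recommended to user 1
```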
Procedia PDF Downloads 229
2023 SA-SPKC: Secure and Efficient Aggregation Scheme for Wireless Sensor Networks Using Stateful Public Key Cryptography
Authors: Merad Boudia Omar Rafik, Feham Mohammed
Abstract:
Data aggregation in wireless sensor networks (WSNs) provides a great reduction in energy consumption. The limited resources of sensor nodes make the choice of an encryption algorithm very important for providing security for data aggregation. Asymmetric cryptography involves large ciphertexts and heavy computations but solves, on the other hand, the key distribution problem of the symmetric one. The latter provides smaller ciphertexts and faster computations. Also, recent research has shown that achieving end-to-end confidentiality and end-to-end integrity at the same time is a challenging task. In this paper, we propose SA-SPKC, a novel security protocol which addresses both security services for WSNs, and in which only the base station can verify the individual data and identify the malicious node. Our scheme is based on stateful public key encryption (StPKE), which combines the best features of both kinds of encryption along with state in order to reduce the computation overhead. Our analysis
Keywords: secure data aggregation, wireless sensor networks, elliptic curve cryptography, homomorphic encryption
Procedia PDF Downloads 297
2022 A Sensitive Uric Acid Electrochemical Sensing in Biofluids Based on Ni/Zn Hydroxide Nanocatalyst
Authors: Nathalia Florencia Barros Azeredo, Josué Martins Gonçalves, Pamela De Oliveira Rossini, Koiti Araki, Lucio Angnes
Abstract:
This work demonstrates the electroanalysis of uric acid (UA) at a very low working potential (0 V vs Ag/AgCl) directly in body fluids such as saliva and sweat, using electrodes modified with mixed Ni0.75Zn0.25(OH)2 nanoparticles that exhibit stable electrocatalytic responses from alkaline down to weakly acidic media (pH 14 to 3 range). These materials were prepared for the first time and fully characterized by TEM, XRD, and spectroscopic techniques. The electrochemical properties of the modified electrodes were evaluated in a fast and simple procedure for uric acid analyses based on cyclic voltammetry and chronoamperometry, pushing down the detection and quantification limits (2.3×10⁻⁸ and 7.6×10⁻⁸ mol L⁻¹, respectively) with good repeatability (RSD = 3.2% for 30 successive analyses at pH 14). Finally, the possibility of real application was demonstrated through the realization of unexpectedly robust and sensitive modified FTO (fluorine-doped tin oxide) glass and screen-printed sensors for the measurement of uric acid directly in real saliva and sweat samples, with no significant interference from the usual concentrations of ascorbic acid, acetaminophen, lactate, and glucose present in those body fluids (Fig. 1).
Keywords: nickel hydroxide, mixed catalyst, uric acid sensors, biofluids
Procedia PDF Downloads 127