Search results for: RLS identification algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6330

1920 Defining Priority Areas for Biodiversity Conservation to Support Zoning of Protected Areas: A Case Study from Vietnam

Authors: Xuan Dinh Vu, Elmar Csaplovics

Abstract:

There has been an increasing need for methods to define priority areas for biodiversity conservation, since the effectiveness of biodiversity conservation in protected areas largely depends on the availability of material resources. The identification of priority areas requires the integration of biodiversity data with social data on human pressures and responses. However, the deficit of comprehensive data and reliable methods becomes a key challenge in zoning where the demand for conservation is most urgent and where the outcomes of conservation strategies can be maximized. In order to fill this gap, the study applied the Condition–Pressure–Response environmental model to suggest a set of criteria for identifying priority areas for biodiversity conservation. Our empirical data were compiled from 185 respondents, categorized into three main groups: governmental administration, research institutions, and protected areas in Vietnam, using a well-designed questionnaire. Then, the Analytic Hierarchy Process (AHP) was used to identify the weight of each criterion. Our results show that the priority level for biodiversity conservation can be identified by three main indicators: condition, pressure, and response, with weights of 26%, 41%, and 33%, respectively. Based on the three indicators, 7 criteria and 15 sub-criteria were developed to support the definition of priority areas for biodiversity conservation and the zoning of protected areas. In addition, our study revealed that the governmental administration and protected area groups focused on the 'Pressure' indicator, while the research institution group emphasized the importance of the 'Response' indicator in the evaluation process. Our results provide recommendations for applying the developed criteria to identify priority areas for biodiversity conservation in Vietnam.
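The AHP weighting step described above can be sketched as follows: criterion weights are read off the normalised principal eigenvector of a pairwise-comparison matrix. The 3x3 judgment matrix below is purely illustrative and is not the study's actual expert data.

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP criterion weights: the principal (Perron) eigenvector of the
    pairwise-comparison matrix, normalised to sum to 1."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)        # largest eigenvalue = Perron root
    w = np.abs(vecs[:, k].real)     # its eigenvector is elementwise positive
    return w / w.sum()

# Hypothetical judgments for condition / pressure / response,
# chosen so that 'pressure' comes out heaviest, as in the paper.
w = ahp_weights([[1, 1/2, 1],
                 [2, 1,   1],
                 [1, 1,   1]])
```

With these made-up judgments, `w[1]` (pressure) receives the largest weight, mirroring the 26%/41%/33% ordering the authors report.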

Keywords: biodiversity conservation, condition–pressure–response model, criteria, priority areas, protected areas

Procedia PDF Downloads 170
1919 Analysis and Detection of Facial Expressions in People with Autism Spectrum Disorder Using Machine Learning

Authors: Muhammad Maisam Abbas, Salman Tariq, Usama Riaz, Muhammad Tanveer, Humaira Abdul Ghafoor

Abstract:

Autism Spectrum Disorder (ASD) refers to a developmental disorder that impairs an individual's communication and interaction ability. Affected individuals find it difficult to read facial expressions while communicating or interacting. Facial Expression Recognition (FER) is a method of classifying basic human expressions, i.e., happiness, fear, surprise, sadness, disgust, neutral, and anger, from static and dynamic sources. This paper conducts a comprehensive comparison and proposes an optimal method for a continuing research project: a system that can assist people who have Autism Spectrum Disorder (ASD) in recognizing facial expressions. The comparison covers three supervised learning algorithms: EigenFace, FisherFace, and LBPH. The JAFFE, CK+, and TFEID (I&II) datasets were used to train and test the algorithms. The results were then evaluated based on variance, standard deviation, and accuracy. The experiments showed that FisherFace has the highest accuracy on all datasets and is considered the best algorithm to implement in our system.
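Of the three compared algorithms, LBPH is the simplest to illustrate: each pixel is encoded by thresholding its 3x3 neighbourhood against the centre value, and a face is represented by the histogram of those codes. This is a minimal plain-NumPy sketch of the feature extraction, not the authors' implementation (real LBPH also splits the image into a grid and concatenates per-cell histograms).

```python
import numpy as np

def lbp_code(patch):
    """3x3 local binary pattern: threshold the 8 neighbours against the
    centre pixel and pack the bits clockwise from the top-left corner."""
    c = patch[1, 1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    return sum((1 if patch[r, col] >= c else 0) << i
               for i, (r, col) in enumerate(order))

def lbp_histogram(img):
    """Normalised histogram of LBP codes over interior pixels -- the
    feature vector LBPH compares between face images."""
    img = np.asarray(img)
    codes = [lbp_code(img[r - 1:r + 2, c - 1:c + 2])
             for r in range(1, img.shape[0] - 1)
             for c in range(1, img.shape[1] - 1)]
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()
```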

Keywords: autism spectrum disorder, ASD, EigenFace, facial expression recognition, FisherFace, local binary pattern histogram, LBPH

Procedia PDF Downloads 174
1918 Revising Our Ideas on Revisions: Non-Contact Bridging Plate Fixation of Vancouver B1 and B2 Periprosthetic Femoral Fractures

Authors: S. Ayeko, J. Milton, C. Hughes, K. Anderson, R. G. Middleton

Abstract:

Background: Periprosthetic femoral fractures (PFF) in association with hip hemiarthroplasty or total hip arthroplasty are a common and serious complication. In the Vancouver classification algorithm, B1 fractures should be treated with open reduction and internal fixation (ORIF), while B2 and B3 fractures should preferentially be revised in combination with ORIF. This study aims to assess patient outcomes after plate osteosynthesis alone for Vancouver B1 and B2 fractures. The main outcome is the 1-year re-revision rate; secondary outcomes are 30-day and 1-year mortality. Method: This is a retrospective single-centre case-series review from January 2016 to June 2021. Vancouver B1 and B2 non-malignancy fractures in adults over 18 years of age treated with polyaxial Non-Contact Bridging plate osteosynthesis were included. Outcomes were gathered from electronic notes and radiographs. Results: There were 50 B1 and 64 B2 fractures. 26 B2 fractures were managed with ORIF and revision, and 39 with ORIF alone. Of the revision group, one patient died within 30 days (3.8%), one within one year (3.8%), and two were revised within one year (7.7%). Of the B2 ORIF group, three died within 30 days (7.96%), eight within one year (21.1%), and none were revised within one year. Conclusion: This study demonstrates that satisfactory outcomes can be achieved with ORIF without revision in the management of B2 fractures.

Keywords: arthroplasty, bridging plate, periprosthetic fracture, revision surgery

Procedia PDF Downloads 101
1917 Geophysical Methods of Mapping Groundwater Aquifer System: Perspectives and Inferences From Lisana Area, Western Margin of the Central Main Ethiopian Rift

Authors: Esubalew Yehualaw Melaku, Tigistu Haile Eritro

Abstract:

In this study, two basic geophysical methods were applied to map the groundwater aquifer system in the Lisana area along the Guder River, northeast of Hosanna town, near the western margin of the Central Main Ethiopian Rift. The main target of the study is to map the potential aquifer zone and investigate the groundwater potential for current and future development of the resource in the Gode area. The geophysical methods employed include Vertical Electrical Sounding (VES) and magnetic survey techniques. Electrical sounding was used to examine and map the depth to the potential aquifer zone of the groundwater and its distribution over the area. A magnetic survey, on the other hand, was used to delineate contacts between lithologic units and geological structures. The 2D magnetic models and the geoelectric sections were used to identify weak zones, which control the groundwater flow and storage system. The geophysical survey comprises twelve VES readings collected using a Schlumberger array along six profile lines, and more than four hundred (400) magnetic readings at about 10 m station intervals along four profiles and 20 m intervals along three random profiles. The results reveal that the potential aquifer in the area lies at depths ranging from 45 m to 92 m. This corresponds to the response of the highly weathered/fractured ignimbrite and pumice layer with sandy soil, which is the main water-bearing horizon. Overall, the neighborhoods of four VES points, VES-2, VES-3, VES-10, and VES-11, show good water-bearing zones in the study area.
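The VES workflow rests on converting each measured voltage/current pair into an apparent resistivity via the array's geometric factor. For a Schlumberger array the standard formula can be sketched as below; this is a generic textbook helper, not the authors' processing code, and the electrode spacings in the test are illustrative.

```python
import math

def schlumberger_rho_a(ab_half, mn, delta_v, current):
    """Apparent resistivity (ohm-m) for a Schlumberger array:
    rho_a = K * (dV / I), with geometric factor
    K = pi * ((AB/2)^2 - (MN/2)^2) / MN.
    ab_half: half the current-electrode spacing AB/2, in metres.
    mn: potential-electrode spacing MN, in metres."""
    k = math.pi * (ab_half ** 2 - (mn / 2) ** 2) / mn
    return k * delta_v / current
```

Plotting `rho_a` against `AB/2` on log-log axes gives the sounding curve that is then inverted into the layered geoelectric section.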

Keywords: vertical electrical sounding, magnetic survey, aquifer, groundwater potential

Procedia PDF Downloads 79
1916 A Multi-Objective Optimization Tool for Dual-Mode Operating Active Magnetic Regenerator Model

Authors: Anna Ouskova Leonteva, Michel Risser, Anne Jeannin-Girardon, Pierre Parrend, Pierre Collet

Abstract:

This paper proposes an efficient optimization tool for an active magnetic regenerator (AMR) model operating in two modes: as a magnetic refrigeration system (MRS) and as a thermo-magnetic generator (TMG). The aim of this optimizer is to improve the design of the AMR by applying a multi-physics, multi-scale numerical model as the core of the evaluation functions, in order to meet industrial requirements for refrigeration and energy conservation systems. Based on the multi-objective non-dominated sorting genetic algorithm III (NSGA-III), it maximizes four different objectives: efficiency and power density for both MRS and TMG. The main contribution of this work is the simultaneous application of a CPU-parallel NSGA-III version to the AMR model in both modes, to study the impact of control and design parameters on performance. A parametric study of the optimization results is presented. The main conclusion is that optimal parameters common to the TMG and MRS modes can be found by the proposed tool.
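NSGA-III's core building block is non-dominated sorting: candidate designs are ranked into Pareto fronts by the dominance relation over the objectives. A minimal sketch of the first-front extraction under a maximisation convention (illustrative only, not the authors' CPU-parallel implementation):

```python
def dominates(a, b):
    """a dominates b when a is no worse in every objective and strictly
    better in at least one (all objectives maximised)."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def first_pareto_front(points):
    """The non-dominated set: the first front that NSGA-style sorting builds
    before assigning the remaining points to later fronts."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# toy (efficiency, power density) pairs for candidate AMR designs
candidates = [(1, 4), (2, 3), (3, 1), (0, 0), (2, 2)]
front = first_pareto_front(candidates)
```

NSGA-III then preserves diversity along this front using a set of reference directions, which matters once there are four objectives as here.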

Keywords: ecological refrigeration systems, active magnetic regenerator, thermo-magnetic generator, multi-objective evolutionary optimization, industrial optimization problem, real-world application

Procedia PDF Downloads 114
1915 Combining Chiller and Variable Frequency Drives

Authors: Nasir Khalid, S. Thirumalaichelvam

Abstract:

In most buildings, according to the US Department of Energy Data Book, the electrical consumption attributable to the centralized heating, ventilation, and air-conditioning (HVAC) components can be as high as 40-60% of the total electricity consumption of the entire building. To provide efficient energy management for today's market, researchers are finding new ways to develop systems that can further reduce the electrical consumption of buildings. In this concept paper, a system known as the Intelligent Chiller Energy Efficiency (iCEE) System is being developed that is capable of saving up to 25% of the chiller's existing electrical energy consumption. For variable frequency drives (VFDs), research has found significant savings of up to 30% of electrical energy consumption. Together with VFDs at specific Air Handling Units (AHUs) of the HVAC system, this system will save even more electrical energy. The iCEE System is compatible with any make, model, or age of centrifugal, rotary, or reciprocating chiller air-conditioning systems which are electrically driven. The iCEE system uses engineering principles of efficiency analysis, enthalpy analysis, heat transfer, mathematical prediction, a modified genetic algorithm, psychrometric analysis, and optimization formulation to achieve true and tangible energy savings for consumers.
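The scale of the quoted VFD savings follows from the fan/pump affinity laws, under which shaft power scales roughly with the cube of speed, so a modest speed reduction yields a disproportionate power saving. A small idealised sketch (real AHU fans deviate from the pure cube law because of static pressure setpoints and motor/drive losses):

```python
def vfd_power_fraction(speed_fraction):
    """Affinity laws: flow ~ N, pressure ~ N^2, power ~ N^3.
    Returns the fraction of full-speed power drawn at a given speed."""
    return speed_fraction ** 3

def percent_saving(speed_fraction):
    """Percentage of full-speed fan power saved by slowing to speed_fraction."""
    return (1 - vfd_power_fraction(speed_fraction)) * 100
```

For example, running an AHU fan at 80% speed draws about half of full-speed power, which is the order of magnitude behind the 30% annual savings figures cited for VFD retrofits.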

Keywords: variable frequency drives, adjustable speed drives, ac drives, chiller energy system

Procedia PDF Downloads 558
1914 Identification and Characterization of Groundwater Recharge Sites in Kuwait

Authors: Dalal Sadeqi

Abstract:

Groundwater is an important component of Kuwait’s water resources. Although limited in quantity and often poor in quality, the significance of this natural source of water cannot be overemphasized. Recharge of groundwater in Kuwait occurs during periodical storm events, especially in open desert areas. Runoff water dissolves accumulated surficial meteoric salts and subsequently leaches them into the groundwater following a period of evaporative enrichment at or near the soil surface. Geochemical processes governing groundwater recharge vary in time and space. Stable isotope (18O and 2H) and geochemical signatures are commonly used to gain some insight into recharge processes and groundwater salinization mechanisms, particularly in arid and semiarid regions. This article addresses the mechanism used in identifying and characterizing the main water shed areas in Kuwait using stable isotopes in an attempt to determine favorable groundwater recharge sites in the country. Stable isotopes of both rainwater and groundwater were targeted in different hydrogeological settings. Additionally, data and information obtained from subsurface logs in the study area were collected and analyzed to develop a better understanding of the lateral and vertical extent of the groundwater aquifers. Geographic Information System (GIS) and RockWorks 3D modelling software were used to map out the hydrogeomorphology of the study area and the subsurface lithology of the investigated aquifers. The collected data and information, including major ion chemistry, isotopes, subsurface characteristics, and hydrogeomorphology, were integrated in a GIS platform to identify and map out suitable natural recharge areas as part of an integrated water resources management scheme that addresses the challenges of the sustainability of the groundwater reserves in the country.

Keywords: scarcity, integrated, recharge, isotope

Procedia PDF Downloads 115
1913 Bioinformatic Approaches in Population Genetics and Phylogenetic Studies

Authors: Masoud Sheidai

Abstract:

Biologists working in population genetics and phylogeny face research tasks such as quantifying populations' genetic variability and divergence, species relatedness, the evolution of genetic and morphological characters, and the identification of DNA SNPs with adaptive potential. To tackle these problems and reach concise conclusions, they must use proper and efficient statistical and bioinformatic methods as well as suitable genetic and morphological characteristics. In recent years, different bioinformatic and statistical methods, based on various well-documented assumptions, have become the proper analytical tools in the hands of researchers. Species delineation is usually carried out with clustering methods like K-means clustering, based on distance measures appropriate to the studied features of the organisms. Well-defined species are assumed to be separable from other taxa by molecular barcodes. Species relationships are studied using molecular markers, which are analyzed by methods like multidimensional scaling (MDS) and principal coordinate analysis (PCoA). Species population structuring and genetic divergence are usually investigated by PCoA and PCA methods and network diagrams, based on bootstrapping of the data. The association of different genes and DNA sequences with ecological and geographical variables is determined by latent factor mixed models (LFMM) and redundancy analysis (RDA), which are based on Bayesian and distance methods, respectively. Molecular and morphological characters that differentiate the studied species may be identified by linear discriminant analysis (DA) and discriminant analysis of principal components (DAPC). We illustrate these methods and the related conclusions with examples from different edible and medicinal plant species.
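PCoA, mentioned above, is classical metric MDS: the squared distance matrix is double-centred (Gower's transformation) and the low-dimensional coordinates are read off the top eigenvectors. A minimal NumPy sketch, not tied to any particular marker dataset:

```python
import numpy as np

def pcoa(D, k=2):
    """Principal coordinate analysis from an n x n distance matrix D:
    double-centre -0.5 * J D^2 J and embed along the top-k eigenvectors,
    scaled by the square roots of their eigenvalues."""
    D = np.asarray(D, dtype=float)
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centring matrix
    B = -0.5 * J @ (D ** 2) @ J              # Gower's double centring
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]         # largest eigenvalues first
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))
```

For a Euclidean-embeddable distance matrix (e.g. three samples at positions 0, 1, and 3 on a line) the embedding reproduces the original pairwise distances exactly.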

Keywords: GWAS analysis, K-Means clustering, LFMM, multidimensional scaling, redundancy analysis

Procedia PDF Downloads 124
1912 Adopting the Flocks of Birds Prey-Predator Approach to Anomaly Detection on Industrial Control Systems

Authors: M. Okeke, A. Blyth

Abstract:

Industrial Control Systems (ICS) such as Supervisory Control And Data Acquisition (SCADA) can be seen in many different critical infrastructures, from nuclear management to utilities, medical equipment, power, waste, and engine management on ships and planes. The role SCADA plays in critical infrastructure has resulted in a call to secure it: many lives depend on it for daily activities, and the attack vectors are becoming more sophisticated. Hence, the security of ICS is vital, as a malfunction might result in huge risk. This paper describes how applying the Prey Predator (PP) approach observed in flocks of birds could enhance the detection of malicious activities on ICS. The PP approach describes how these animals detect predators in groups or flocks by following some simple rules. They are not necessarily very intelligent animals, but their approach to solving complex problems such as detection through cooperation, coordination, and communication is worth emulating. This paper emulates the flocking behavior seen in birds for detecting predators. The PP approach adopts a six-nearest-birds rule for detecting any predator; local and global bests are based on individual detection as well as group detection. The PP algorithm was designed following a MapReduce methodology that follows a Split Detection Convergence (SDC) approach.
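The six-nearest-birds rule can be sketched as follows. The 2D geometry and the detection radius here are hypothetical stand-ins for whatever anomaly-distance metric an actual ICS deployment would use; the point is only the local-alarm / flock-alarm structure.

```python
import math

def six_nearest(birds, i):
    """Indices of the six nearest flock-mates of bird i (fewer if the
    flock is small), ranked by Euclidean distance."""
    ranked = sorted((math.dist(birds[i], birds[j]), j)
                    for j in range(len(birds)) if j != i)
    return [j for _, j in ranked[:6]]

def flock_alert(birds, predator, radius):
    """Each bird raises a local alarm if the predator is inside its
    detection radius; the flock-level (global) alert is any local alarm,
    which neighbours then propagate via the six-nearest rule."""
    local = [math.dist(b, predator) <= radius for b in birds]
    return local, any(local)
```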

Keywords: artificial life, industrial control system (ICS), IDS, prey predator (PP), SCADA, SDC

Procedia PDF Downloads 301
1911 Effect of Rainflow Cycle Number on Fatigue Lifetime of an Arm of Vehicle Suspension System

Authors: Hatem Mrad, Mohamed Bouazara, Fouad Erchiqui

Abstract:

Fatigue is considered one of the main causes of degradation of the mechanical properties of mechanical parts. Probability and reliability methods are appropriate for fatigue analysis using the uncertainties that exist in fatigue material or process parameters. The current work studies the effect of the number of counted Rainflow cycles on the fatigue lifetime (cumulative damage) of an upper arm of a vehicle suspension system. The major part of the fatigue damage induced in the suspension arm is caused by two main classes of parameters. The first is related to the material properties, and the second is the road excitation or the applied load, which depends on the number of passengers. Therefore, Young's modulus and road excitation are selected as input parameters for repetitive simulations driven by a Monte Carlo (MC) algorithm. The Latin hypercube sampling method is used to generate these parameters. A response surface is established from the fatigue lifetime of each combination of input parameters according to the strain-life method. A Python script was developed to automate the finite element simulations of the upper arm according to a design of experiments.
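Latin hypercube sampling, used here to generate the Young's-modulus and road-excitation inputs, stratifies each input dimension into n equal intervals and places exactly one sample in each. A minimal sketch on the unit hypercube (rescaling the columns to the physical parameter ranges is a separate, problem-specific step):

```python
import numpy as np

def latin_hypercube(n, d, rng=None):
    """n samples in d dimensions on [0, 1): each dimension is split into
    n equal strata, each stratum receives exactly one sample, and the
    strata are paired across dimensions by independent permutations."""
    rng = np.random.default_rng(rng)
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n  # one point per stratum
    perm = np.stack([rng.permutation(n) for _ in range(d)], axis=1)
    return np.take_along_axis(u, perm, axis=0)            # decouple the dimensions
```

Compared with plain Monte Carlo, this guarantees coverage of the whole range of each input even for small sample budgets, which is why it pairs well with expensive finite element runs.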

Keywords: fatigue, monte carlo, rainflow cycle, response surface, suspension system

Procedia PDF Downloads 256
1910 Prediction of B-Cell Epitope for 24 Mite Allergens: An in Silico Approach towards Epitope-Based Immune Therapeutics

Authors: Narjes Ebrahimi, Soheila Alyasin, Navid Nezafat, Hossein Esmailzadeh, Younes Ghasemi, Seyed Hesamodin Nabavizadeh

Abstract:

Allergy vaccines are of great importance in allergen-specific immunotherapy. In recent years, B-cell epitope-based vaccines have attracted considerable attention, and the prediction of epitopes is crucial to the design of these types of allergy vaccines. B-cell epitopes may be linear or conformational. The prerequisite for the identification of conformational epitopes is information about the allergens' tertiary structures. Bioinformatics approaches have paved the way towards the design of epitope-based allergy vaccines through the prediction of tertiary structures and epitopes. Mite allergens are one of the major allergy contributors. Several mite allergens can elicit allergic reactions; however, their structures and epitopes are not well established. So, B-cell epitopes of various groups of mite allergens (24 allergens in 6 allergen groups) were predicted in the present work. Tertiary structures of the 17 allergens with unknown structure were predicted and refined with the RaptorX and GalaxyRefine servers, respectively. The predicted structures were further evaluated by the Rampage, ProSA-web, ERRAT, and Verify 3D servers. Linear and conformational B-cell epitopes were identified with the Ellipro, Bcepred, and DiscoTope 2 servers. To improve the accuracy level, consensus epitopes were selected. Fifty-four conformational and 133 linear consensus epitopes were predicted. Furthermore, overlapping epitopes in each allergen group were defined, following sequence alignment of the allergens in each group. The predicted epitopes were also compared with experimentally identified epitopes. The presented results provide valuable information for further studies on allergy vaccine design.
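The consensus-epitope step can be sketched as a simple vote across prediction servers: a residue position is retained when at least a minimum number of predictors call it epitopic. The threshold and the toy position sets below are illustrative; the paper does not specify its exact voting rule.

```python
from collections import Counter

def consensus_epitopes(predictions, min_servers=2):
    """Residue positions called epitopic by at least `min_servers` of the
    per-server prediction sets (e.g. outputs of Ellipro, Bcepred,
    DiscoTope 2), returned in ascending order."""
    counts = Counter(pos for server in predictions for pos in set(server))
    return sorted(p for p, c in counts.items() if c >= min_servers)
```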

Keywords: B-cell epitope, immunotherapy, in silico prediction, mite allergens, tertiary structure

Procedia PDF Downloads 160
1909 Improvement of Microscopic Detection of Acid-Fast Bacilli for Tuberculosis by Artificial Intelligence-Assisted Microscopic Platform and Medical Image Recognition System

Authors: Hsiao-Chuan Huang, King-Lung Kuo, Mei-Hsin Lo, Hsiao-Yun Chou, Yusen Lin

Abstract:

The most robust and economical method for laboratory diagnosis of TB is to identify mycobacterial acid-fast bacilli (AFB) under acid-fast staining, despite its disadvantages of low sensitivity and labor intensiveness. Though digital pathology is becoming popular in medicine, an automated microscopic system for microbiology is still not available. A new AI-assisted automated microscopic system, consisting of a microscopic scanner and a recognition program powered by big data and deep learning, may significantly increase the sensitivity of TB smear microscopy. Thus, the objective is to evaluate such an automatic system for the identification of AFB. A total of 5,930 smears were enrolled in this study. An intelligent microscope system (TB-Scan, Wellgen Medical, Taiwan) was used for microscopic image scanning and AFB detection. 272 AFB smears were used for transfer learning to increase the accuracy. Referee medical technicians served as the gold standard for discrepant results. Results showed that, over a total of 1,726 AFB smears, the automated system's accuracy, sensitivity, and specificity were 95.6% (1,650/1,726), 87.7% (57/65), and 95.9% (1,593/1,661), respectively. Compared to culture, the sensitivity of human technicians was only 33.8% (38/142); the automated system, however, achieved 74.6% (106/142), which is significantly higher, and this is the first such automated microscope system for TB smear testing in a controlled trial. This automated system could achieve higher TB smear sensitivity and laboratory efficiency and may complement molecular methods (e.g., GeneXpert) to reduce the total cost of TB control. Furthermore, the system is capable of remote access over the internet and can be deployed in areas with limited medical resources.
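The reported accuracy, sensitivity, and specificity all follow from a standard 2x2 confusion table. Plugging in the counts implied by the abstract's fractions (57 true positives, 8 false negatives, 1,593 true negatives, 68 false positives) reproduces the quoted figures:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard screening statistics from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),          # detected positives / all positives
        "specificity": tn / (tn + fp),          # detected negatives / all negatives
        "accuracy": (tp + tn) / (tp + fn + tn + fp),
    }

m = diagnostic_metrics(tp=57, fn=8, tn=1593, fp=68)
```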

Keywords: TB smears, automated microscope, artificial intelligence, medical imaging

Procedia PDF Downloads 229
1908 Design and Implementation of a Platform for Adaptive Online Learning Based on Fuzzy Logic

Authors: Budoor Al Abid

Abstract:

Educational systems are increasingly provided as open online services, providing guidance and support for individual learners. To make learning systems adaptive, a proper evaluation must be made. This paper builds the evaluation model Fuzzy C-Means Adaptive System (FCMAS), based on data mining techniques, to assess the difficulty of questions. The following steps are implemented. First, a dataset from an online international learning system called slepemapy.cz is used; the dataset contains over 1,300,000 records with 9 features covering student, question, and answer information with feedback evaluation. Next, normalization is applied as a preprocessing step. Then, the FCM clustering algorithm is used to adapt the difficulty of the questions. The result is data labeled into three clusters according to the highest membership weight (easy, intermediate, difficult); the FCM algorithm assigns a label to each question. Then, a Random Forest (RF) classifier model is constructed on the clustered dataset, using 70% of the dataset for training and 30% for testing; the model achieves a 99.9% accuracy rate. This approach improves the adaptive e-learning system because it depends on student behavior and gives more accurate results in the evaluation process than an evaluation system that depends on feedback only.
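Fuzzy c-means itself alternates two closed-form updates: centroids as membership-weighted means, and memberships from inverse relative distances to the centroids. A minimal NumPy sketch with c=3, matching the easy/intermediate/difficult clusters; the one-dimensional toy data in the test are made up, not the slepemapy.cz features.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: alternate the membership and centroid
    updates for a fixed number of iterations. Returns (centroids, U),
    where U[i, j] is the membership of sample i in cluster j."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # rows sum to 1
    for _ in range(iters):
        w = U ** m                               # fuzzified memberships
        centroids = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        inv = 1.0 / d ** (2.0 / (m - 1.0))       # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centroids, U
```

Taking `U.argmax(axis=1)` on the converged memberships yields the hard easy/intermediate/difficult labels that the Random Forest is then trained on.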

Keywords: machine learning, adaptive, fuzzy logic, data mining

Procedia PDF Downloads 196
1907 Application and Limitation of Heavy Metal Pollution Indicators in Coastal Environment of Pakistan

Authors: Noor Us Saher

Abstract:

Oceans and marine areas are of great importance, mainly regarding food resources, fishery products, and the livelihoods that rely on them. Aquatic pollution is common due to the input of various chemicals, mainly from urbanization and industrial and commercial facilities, such as oil and chemical spills. Many hazardous wastes and industrial effluents contaminate nearby areas and begin to affect the marine environment. These contaminated conditions may become worse in aquatic environments situated beside the world's largest cities, which are hubs of various commercial activities. Heavy metal contamination is one of the most important predicaments for marine environments, and during past decades this problem has intensified due to increasing urbanization and industrialization. The coastal regions of Pakistan face severe threats from various organic and inorganic pollutants, especially the estuarine and coastal areas of Karachi, the most populated and industrialized city situated along the coastline. Metal contamination causes severe toxicity in biota, resulting in the degradation of marine environments and the depletion of fishery resources and sustainability. There are several abiotic (air, water, and sediment) and biotic (fauna and flora) indicators of metal contamination. However, all these indicators have certain limitations and complexities, which delay their implementation for rehabilitation and conservation in the marine environment. Inadequate evidence has been presented on this significant topic to date, and this study discusses metal pollution and its consequences along the marine environment of Pakistan. This study further helps in the identification of possible hazards to the ecological system and allied resources, for management strategies and decision making towards sustainable approaches.
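One widely used sediment indicator of the kind discussed here is Mueller's geoaccumulation index, which compares a measured metal concentration with a geochemical background value. The formula is standard; the concentrations used in the test below are invented for illustration, not measurements from the Pakistani coast.

```python
import math

def geoaccumulation_index(concentration, background):
    """Mueller's geoaccumulation index for sediment metal pollution:
    Igeo = log2(Cn / (1.5 * Bn)), where Cn is the measured concentration
    and Bn the geochemical background; the factor 1.5 absorbs natural
    background variation. Igeo <= 0 means practically unpolluted;
    classes rise to 'extremely polluted' above Igeo = 5."""
    return math.log2(concentration / (1.5 * background))
```

The same single-number-per-metal structure is what makes such indices easy to apply and, as the abstract notes, also what limits them: they say nothing about bioavailability or biotic response.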

Keywords: coastal and estuarine environment, heavy metals pollution, pollution indicators, Pakistan

Procedia PDF Downloads 249
1906 Genetic Characterization of Acanthamoeba Isolates from Amoebic Keratitis Patients

Authors: Sumeeta Khurana, Kirti Megha, Amit Gupta, Rakesh Sehgal

Abstract:

Background: Amoebic keratitis is a painful, vision-threatening infection caused by the free-living pathogenic amoeba Acanthamoeba. It can be misdiagnosed and is very difficult to treat if not suspected early. The epidemiology of the Acanthamoeba genotypes causing infection in our geographical area is, to the best of our knowledge, not yet known. Objective: To characterize Acanthamoeba isolates from amoebic keratitis patients. Methods: A total of 19 isolates obtained from patients with amoebic keratitis presenting to the Advanced Eye Centre at the Postgraduate Institute of Medical Education and Research, a tertiary care centre in North India, over a period of 10 years were included. Corneal scrapings, lens solution, and lens cases (for lens wearers) were collected for microscopic examination, culture, and molecular diagnosis. All the isolates were maintained on non-nutrient agar overlaid with E. coli, and 13 strains were axenised and maintained in modified peptone-yeast-dextrose medium. Identification of Acanthamoeba genotypes was based on amplification of the diagnostic fragment 3 (DF3) region of the 18S rRNA gene, followed by sequencing. Nucleotide similarity searches were performed by BLAST against the GenBank database (http://www.ncbi.nlm.nih.gov/blast). Multiple sequence alignments were generated using CLUSTAL X. Results: Nine of the 19 Acanthamoeba isolates were found to belong to genotype T4, followed by 6 isolates of genotype T11, 3 of T5, and 1 of T3. Conclusion: T4 is the predominant Acanthamoeba genotype in our geographical area. Further studies should focus on differences in the pathogenicity of these genotypes and their clinical significance.

Keywords: Acanthamoeba, free living amoeba, keratitis, genotype, ocular

Procedia PDF Downloads 238
1905 Proxisch: An Optimization Approach of Large-Scale Unstable Proxy Servers Scheduling

Authors: Xiaoming Jiang, Jinqiao Shi, Qingfeng Tan, Wentao Zhang, Xuebin Wang, Muqian Chen

Abstract:

Nowadays, big companies such as Google and Microsoft, which have adequate proxy servers, have perfectly implemented parallel web crawlers for given websites. But for lack of expensive proxy servers, it is still a puzzle for researchers to crawl large amounts of information from a single website in parallel. In this case, a good choice for researchers is to use free public proxy servers crawled from the Internet. To improve the efficiency of a web crawler, the following two issues should be considered first: (1) tasks may fail owing to the instability of free proxy servers; (2) a proxy server will be blocked if it visits a single website too frequently. In this paper, we propose Proxisch, an optimization approach for scheduling large numbers of unstable proxy servers, which allows anyone to run a web crawler efficiently at extremely low cost. Proxisch is designed to work efficiently by making maximum use of reliable proxy servers. To solve the second problem, it establishes a frequency control mechanism which keeps the visiting frequency of any chosen proxy server below the website's limit. The results show that our approach performs better than other scheduling algorithms.
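The frequency-control idea can be sketched with a min-heap priority queue keyed on each proxy's next allowed visit time: the most "rested" proxy is dispatched first, and no proxy is handed out again before its per-site interval has elapsed. This is a toy reconstruction of the mechanism, not the authors' Proxisch code (which also weights proxies by measured reliability).

```python
import heapq
import time

class ProxyScheduler:
    """Min-heap of (next_allowed_time, proxy): acquire() dispatches the
    longest-rested proxy, and re-queues it `min_interval` seconds into
    the future so its visit frequency stays under the site's limit."""

    def __init__(self, proxies, min_interval):
        self.min_interval = min_interval
        self.heap = [(0.0, p) for p in proxies]
        heapq.heapify(self.heap)

    def acquire(self, now=None):
        now = time.monotonic() if now is None else now
        next_free, proxy = heapq.heappop(self.heap)
        if next_free > now:                      # even the best proxy is rate-limited
            heapq.heappush(self.heap, (next_free, proxy))
            return None
        heapq.heappush(self.heap, (now + self.min_interval, proxy))
        return proxy
```

A caller that receives `None` simply waits (or rotates to another target site) until a proxy's interval expires.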

Keywords: proxy server, priority queue, optimization algorithm, distributed web crawling

Procedia PDF Downloads 211
1904 Trajectory Design and Power Allocation for Energy-Efficient UAV Communication Based on Deep Reinforcement Learning

Authors: Yuling Cui, Danhao Deng, Chaowei Wang, Weidong Wang

Abstract:

In recent years, unmanned aerial vehicles (UAVs) have been widely used in wireless communication, attracting more and more attention from researchers. UAVs can not only serve as relays for auxiliary communication but also as aerial base stations for ground users (GUs). However, limited energy means that they cannot work all the time and can cover only a limited service range. In this paper, we investigate 2D UAV trajectory design and power allocation in order to maximize the UAV's service time and downlink throughput. Based on deep reinforcement learning, we propose a deep deterministic policy gradient algorithm for trajectory design and power allocation (TDPA-DDPG) to solve the joint energy-efficiency and communication service quality problem. The simulation results show that TDPA-DDPG can extend the service time of the UAV as much as possible, improve the communication service quality, and maximize the downlink throughput, which is significantly improved compared with existing methods.
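The downlink-throughput objective that the trajectory/power policy optimises can be illustrated with a simple line-of-sight Shannon-rate model summed over the ground users. All constants below (bandwidth, noise power, reference channel gain) are placeholders, not the paper's simulation parameters.

```python
import math

def downlink_throughput(uav_xy, uav_height, users, power,
                        bandwidth=1e6, noise=1e-13, ref_gain=1e-3):
    """Sum downlink rate (bit/s) over ground users under free-space LoS:
    rate_k = B * log2(1 + P * g0 / (d_k^2 * N)), where d_k is the 3D
    distance from the UAV to user k."""
    total = 0.0
    for ux, uy in users:
        d2 = (uav_xy[0] - ux) ** 2 + (uav_xy[1] - uy) ** 2 + uav_height ** 2
        snr = power * ref_gain / (d2 * noise)
        total += bandwidth * math.log2(1 + snr)
    return total
```

A DDPG agent would treat `uav_xy` and `power` as continuous actions and this quantity (traded off against energy drawn) as part of the reward.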

Keywords: UAV trajectory design, power allocation, energy efficient, downlink throughput, deep reinforcement learning, DDPG

Procedia PDF Downloads 150
1903 Efficiency of PCR-RFLP for the Identification of Adulterations in Meat Formulations

Authors: Hela Gargouri, Nizar Moalla, Hassen Hadj Kacem

Abstract:

Meat adulteration, which affects the safety and quality of food, is becoming one of the main public concerns across the world. Its drastic consequences for the meat industry highlight the urgent necessity to control product quality and point out the complexity of both supply and processing circuits. Due to the expansion of this problem, authenticity testing of foods, particularly meat and its products, is deemed crucial to avoid unfair market competition and to protect consumers from fraudulent practices of meat adulteration. The adoption of authentication methods by food quality-control laboratories is becoming a priority issue. However, in some developing countries, the number of food tests is still insignificant, although a variety of processed and traditional meat products are widely consumed. Little attention has been paid to providing an easy, fast, reproducible, and low-cost molecular test which could be conducted in a basic laboratory. In the current study, a 359 bp fragment of the cytochrome-b gene was mapped by PCR-RFLP, first using fresh biological material (DNA and meat) and then turkey salami as an example of commercial processed meat. The technique was established through several optimizations, notably the selection of restriction enzymes. Digestion with BsmAI, SspI, and TaaI succeeded in identifying the seven included animal species, both when the meat came from a single species and when it was a mixture of species of different origins. In this study, the PCR-RFLP technique with universal primers met our needs by providing an indirect sequencing method in which restriction enzymes reveal the species-specific patterns on the same amplicon, reducing the number of potential tests.
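The RFLP discrimination step amounts to predicting the fragment lengths produced when an enzyme's recognition site occurs in the amplicon, since those lengths are what distinguish the species-specific banding patterns. A simplified sketch follows; for clarity it cuts at the start of each site, whereas real enzymes such as BsmAI cut at a defined offset from their recognition sequence, and the test sequences are invented.

```python
def digest_fragments(seq, site):
    """Lengths of the fragments obtained by cutting `seq` at every
    occurrence of the recognition `site` (cut placed at the start of
    the site -- sufficient to compare banding patterns)."""
    cuts, start = [], 0
    while (i := seq.find(site, start)) != -1:
        cuts.append(i)
        start = i + 1
    bounds = [0] + [c for c in cuts if c > 0] + [len(seq)]
    return [b - a for a, b in zip(bounds, bounds[1:])]
```

Two species whose 359 bp amplicons carry the site at different positions therefore yield different length lists, i.e. different gel bands, from the same universal-primer PCR product.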

Keywords: adulteration, animal species, authentication, meat, mtDNA, PCR-RFLP

Procedia PDF Downloads 112
1902 Identification of Significant Genes in Rheumatoid Arthritis, Melanoma Metastasis, Ulcerative Colitis and Crohn’s Disease

Authors: Krishna Pal Singh, Shailendra Kumar Gupta, Olaf Wolkenhauer

Abstract:

Background: Our study aimed to identify common genes and potential targets across the four diseases, which include rheumatoid arthritis, melanoma metastasis, ulcerative colitis, and Crohn’s disease. We used a network and systems biology approach to identify the hub gene, which can act as a potential target for all four disease conditions. The regulatory network was extracted from the PPI using the MCODE module present in Cytoscape. Our objective was to investigate the significance of hub genes in these diseases using gene ontology and KEGG pathway enrichment analysis. Methods: Our methodology involved collecting disease gene-related information from DisGeNET databases and performing protein-protein interaction (PPI) network and core genes screening. We then conducted gene ontology and KEGG pathway enrichment analysis. Results: We found that IL6 plays a critical role in all disease conditions and in different pathways that can be associated with the development of all four diseases. Conclusions: The theoretical importance of our research is that we employed various systems and structural biology techniques to identify a crucial protein that could serve as a promising target for treating multiple diseases. Our data collection and analysis procedures involved rigorous scrutiny, ensuring high-quality results. Our conclusion is that IL6 plays a significant role in all four diseases, and it can act as a potential target for treating them. Our findings may have important implications for the development of novel therapeutic interventions for these diseases.
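The hub-gene idea can be illustrated with a minimal degree-count sketch in plain Python; the edge list below is a toy network with placeholder interactors, not the PPI network extracted from DisGeNET and the Cytoscape MCODE module in the study:

```python
from collections import Counter

# Toy PPI edge list. Interactor names besides IL6 are illustrative placeholders;
# in the study the network comes from DisGeNET disease genes.
edges = [
    ("IL6", "IL6R"), ("IL6", "STAT3"), ("IL6", "TNF"),
    ("TNF", "STAT3"), ("IL6", "JAK2"), ("JAK2", "STAT3"),
]

# Degree centrality: the hub is the most connected node in the network
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

hub, hub_degree = degree.most_common(1)[0]
```

In this toy network IL6 participates in four of the six interactions, so the degree count singles it out as the hub, mirroring the role the study attributes to it across the four diseases.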

Keywords: melanoma metastasis, rheumatoid arthritis, inflammatory bowel diseases, integrated bioinformatics analysis

Procedia PDF Downloads 89
1901 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis

Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro

Abstract:

Risk management of company-owned assets, risk assessment of real estate portfolios, and risk identification for an entire region all require considering simultaneous damage to multiple buildings. This research focuses on tsunamis generated by Sagami Trough earthquakes, which could have a significant effect on the Japanese capital region, and proposes a method for simultaneous damage assessment using copulas that can take into consideration the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at two sites were simulated by using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depth (five cases) for each of 10 sources of the Sagami Trough. For each source, the frequency distributions of the tsunami inundation depth were evaluated by using the response surface method. Then, Monte Carlo simulation was conducted, and frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough. These are the marginal distributions. Kendall's tau for the tsunami inundation simulation at the two sites was 0.83. Based on this value, the Gaussian copula, t-copula, Clayton copula, and Gumbel copula (n = 10,000) were generated. Then, the joint distributions of the damage rate were evaluated using the marginal distributions and the copulas. When the correlation of the tsunami inundation depth at the two sites was considered, the expected value hardly changed compared with the uncorrelated case, but with the Gumbel copula, the ninety-ninth percentile of the damage rate was approximately 2% and the maximum value was approximately 6%.
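A minimal sketch of the copula step, for the Gaussian copula only, might look as follows in Python; the conversion from Kendall's tau to the Gaussian correlation uses the standard relation rho = sin(pi*tau/2), and the resulting uniforms would then be pushed through each site's marginal inundation-depth distribution (inverse CDF), which is omitted here:

```python
import math
import numpy as np

rng = np.random.default_rng(0)

tau = 0.83                             # Kendall's tau reported for the two sites
rho = math.sin(math.pi * tau / 2)      # Gaussian-copula correlation from tau

n = 10_000
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Transform the correlated normal margins to uniforms via the normal CDF;
# feeding these uniforms through each site's marginal distribution gives
# correlated inundation-depth samples.
phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
u = np.vectorize(phi)(z)
```

The Gumbel, Clayton, and t-copulas used in the study require their own generators, but the overall recipe (sample the copula, then apply the marginals) is the same.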

Keywords: copulas, Monte-Carlo simulation, probabilistic risk assessment, tsunamis

Procedia PDF Downloads 143
1900 FLIME - Fast Low Light Image Enhancement for Real-Time Video

Authors: Vinay P., Srinivas K. S.

Abstract:

Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource intensive during the inference step and take considerable time for processing. To process a real-time video feed at 24 frames per second, the algorithm should take considerably less than 41 milliseconds per frame, and even less for a video at 30 or 60 frames per second. The paper presents a fast and efficient solution with two main advantages: it has the potential to be used on a real-time video feed, and its lightweight nature allows it to run in low compute environments. The proposed solution is a pipeline of three steps: first, a simple function maps input RGB values to output RGB values; second, the colors are balanced; and finally, the contrast of the image is adjusted. A custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model, and the processing time are discussed in detail, and the quality of the images enhanced using different methods is compared.
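The three-step pipeline could be sketched as follows with NumPy; the gamma value, the gray-world color balance, and the min-max contrast stretch are illustrative assumptions, not the paper's actual mapping function or constants:

```python
import numpy as np

def enhance(img):
    """Three-step sketch on a float RGB image in [0, 1]. The gamma value and
    the balancing/stretching choices are assumptions, not the paper's map."""
    out = np.power(img, 0.45)                      # 1) brighten dark pixels
    means = out.reshape(-1, 3).mean(axis=0)        # 2) gray-world color balance
    out = out * (means.mean() / np.maximum(means, 1e-6))
    lo, hi = out.min(), out.max()                  # 3) full-range contrast stretch
    return np.clip((out - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

dark = np.linspace(0.01, 0.2, 48).reshape(4, 4, 3)   # synthetic under-exposed frame
bright = enhance(dark)
```

Every step is a vectorized per-pixel operation with no learned inference, which is what keeps such a pipeline within a real-time frame budget.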

Keywords: low light image enhancement, real-time video, computer vision, machine learning

Procedia PDF Downloads 206
1899 [Keynote Talk]: Unlocking Transformational Resilience in the Aftermath of a Flood Disaster: A Case Study from Cumbria

Authors: Kate Crinion, Martin Haran, Stanley McGreal, David McIlhatton

Abstract:

Past research has demonstrated that disasters continue to escalate in frequency and magnitude worldwide, representing a key concern for the global community. Understanding and responding to the increasing risk posed by disaster events has become a central concern for disaster managers. An emerging trend in the literature acknowledges the need to move beyond a state of coping and reinstatement of the status quo, towards incremental adaptive change and transformational actions for long-term sustainable development. As such, a growing body of research concerns understanding the change required to address ever-increasing and unpredictable disaster events. Capturing transformational capacity and resilience, however, is not without its difficulties, which explains the dearth of attempts to capture this capacity. Adopting a case study approach, this research seeks to enhance awareness of transformational resilience by identifying key components and indicators that determine the resilience of flood-affected communities within Cumbria. Grounding and testing a theoretical resilience framework within the case studies permits the identification of how perceptions of risk influence community resilience actions. Further, it assesses how levels of social capital and connectedness affect the extent of interplay between the resources and capacities that drive transformational resilience. Thus, this research seeks to expand the existing body of knowledge by enhancing awareness of resilience in post-disaster affected communities, investigating indicators of community capacity building and resilience actions that facilitate transformational resilience during the recovery and reconstruction phase of a flood disaster.

Keywords: capacity building, community, flooding, transformational resilience

Procedia PDF Downloads 289
1898 Interactive Winding Geometry Design of Power Transformers

Authors: Paffrath Meinhard, Zhou Yayun, Guo Yiqing, Ertl Harald

Abstract:

Winding geometry design is an important part of power transformer electrical design. Conventionally, the winding geometry is designed manually, which is a time-consuming job because it involves many iteration steps in order to meet all cost, manufacturing, and electrical requirements. Here a method is presented which automatically generates the winding geometry for given user parameters and allows the user to interactively set and change parameters. To achieve this goal, the winding problem is formulated as a mixed integer nonlinear optimization problem. The relevant geometrical design parameters are defined as optimization variables, and the cost and other requirements are modeled as constraints. For the solution, a stochastic ant colony optimization algorithm is applied. It is well known that an optimizer can get stuck in a local minimum. For the winding problem, we present efficient strategies for escaping local minima; furthermore, a reduced variable search range helps to accelerate the solution process. Numerical examples show that the optimization result is delivered within seconds, so that the user can interactively change the variable search area and constraints to improve the design.
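A heavily simplified ant-colony-style search over discrete design choices might look like this in Python; the variable names, candidate values, and cost/penalty function are invented stand-ins for the real winding model and its constraints:

```python
import random

random.seed(1)

# Toy stand-in for winding design: pick one discrete value per geometric
# variable to minimize a material cost under a penalty constraint.
candidates = {"turns": [80, 100, 120], "height_mm": [400, 500, 600],
              "radial_mm": [30, 40, 50]}

def cost(sol):
    material = sol["turns"] * sol["radial_mm"] * 0.01
    penalty = 100.0 if sol["height_mm"] < 500 else 0.0   # e.g. insulation limit
    return material + penalty

pheromone = {k: [1.0] * len(v) for k, v in candidates.items()}

best, best_cost = None, float("inf")
for _ in range(50):                      # iterations
    for _ in range(10):                  # ants per iteration
        # each ant samples a value per variable, weighted by pheromone
        sol = {k: random.choices(v, weights=pheromone[k])[0]
               for k, v in candidates.items()}
        c = cost(sol)
        if c < best_cost:
            best, best_cost = sol, c
    # evaporate, then reinforce the best-so-far choice of each variable
    for k, vals in candidates.items():
        pheromone[k] = [0.9 * p for p in pheromone[k]]
        pheromone[k][vals.index(best[k])] += 1.0
```

The evaporation step (factor 0.9) keeps earlier reinforcement from dominating forever, which is one simple mechanism by which an ACO can leave a local minimum.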

Keywords: ant colony optimization, mixed integer nonlinear programming, power transformer, winding design

Procedia PDF Downloads 380
1897 Support Vector Regression Combined with Different Optimization Algorithms to Predict Global Solar Radiation on Horizontal Surfaces in Algeria

Authors: Laidi Maamar, Achwak Madani, Abdellah El Ahdj Abdellah

Abstract:

The aim of this work is to use Support Vector Regression (SVR) combined with the dragonfly, firefly, bee colony, and particle swarm optimization algorithms to predict global solar radiation on horizontal surfaces in some cities in Algeria. Combining these optimization algorithms with SVR aims principally to enhance accuracy by fine-tuning the model parameters (the regularization parameter C, the kernel parameters, and the epsilon parameter), speeding up the convergence of the SVR model, and exploring a larger search space efficiently. By doing so, the aim is to improve the generalization and predictive accuracy of the SVR model. Overall, the goal is to leverage the strengths of both SVR and the optimization algorithms to create a more powerful and effective regression model for various cities and under different climate conditions. Results demonstrate close agreement between predicted and measured data in terms of different metrics. In summary, SVR has proven to be a valuable tool in modeling global solar radiation, offering accurate predictions and demonstrating versatility when combined with other algorithms or used in hybrid forecasting models.
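The parameter-tuning loop can be sketched with a bare-bones particle swarm optimizer in Python; the objective below is a synthetic stand-in for the cross-validated SVR error, and the bounds on (C, epsilon, gamma) are illustrative assumptions:

```python
import random

random.seed(0)

# Stand-in objective: in practice this would train an SVR with the given
# (C, epsilon, gamma) and return its cross-validated prediction error.
def objective(c, eps, gamma):
    return (c - 10.0) ** 2 / 100 + (eps - 0.1) ** 2 + (gamma - 0.5) ** 2

bounds = [(0.1, 100.0), (0.001, 1.0), (0.01, 5.0)]   # C, epsilon, gamma
n_particles, n_iter, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5

pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
vel = [[0.0] * 3 for _ in range(n_particles)]
pbest = [p[:] for p in pos]
pbest_val = [objective(*p) for p in pos]
g = min(range(n_particles), key=lambda i: pbest_val[i])
gbest, gbest_val = pbest[g][:], pbest_val[g]

for _ in range(n_iter):
    for i in range(n_particles):
        for d in range(3):
            r1, r2 = random.random(), random.random()
            # velocity update: inertia + pull toward personal and global bests
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            lo, hi = bounds[d]
            pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
        val = objective(*pos[i])
        if val < pbest_val[i]:
            pbest[i], pbest_val[i] = pos[i][:], val
            if val < gbest_val:
                gbest, gbest_val = pos[i][:], val
```

Swapping the velocity-update rule for dragonfly, firefly, or bee colony dynamics changes how the search space is explored while leaving the overall tune-then-train loop intact.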

Keywords: support vector regression (SVR), optimization algorithms, global solar radiation prediction, hybrid forecasting models

Procedia PDF Downloads 35
1896 Identification of Breeding Objectives for Begait Goat in Western Tigray, North Ethiopia

Authors: Hagos Abraham, Solomon Gizaw, Mengistu Urge

Abstract:

A sound breeding objective is the basis for genetic improvement in the overall economic merit of farm animals. The Begait goat is one of the identified breeds in Ethiopia; it is a multipurpose breed serving as a source of cash income and of food (meat and milk). Despite its importance, no formal breeding objectives exist for the Begait goat. The objective of the present study was to identify breeding objectives for the breed through two approaches, an own-flock ranking experiment and deterministic bio-economic models, as a preliminary step towards designing sustainable breeding programs for the breed. In the own-flock ranking experiment, a total of forty-five households were visited at their homesteads and asked to select, with reasons, the first-best, second-best, third-best, and most inferior does from their own flocks. Age and previous reproduction and production information for the identified animals were obtained, and live body weight and some linear body measurements were taken. The bio-economic model included performance traits (weights, daily weight gain, kidding interval, litter size, milk yield, kid mortality, pregnancy and replacement rates) and economic parameters (revenues and costs). There was close agreement between the farmers' rankings and the bio-economic model results. In general, the results of the present study indicated that Begait goat owners could improve the performance of their goats and the profitability of their farms by selecting for litter size, six-month weight, pre-weaning kid survival rate, and milk yield.

Keywords: bio-economic model, economic parameters, own-flock ranking, performance traits

Procedia PDF Downloads 67
1895 A Phenomenological Approach to Computational Modeling of Analogy

Authors: José Eduardo García-Mendiola

Abstract:

In this work, a phenomenological approach to computational modeling of analogy processing is carried out. The paper considers the structure of analogy, based on the possibility of grounding the genesis of its elements in Husserl's genetic theory of association. Among the particular processes that take place in order to produce analogical inferences, one is crucial for enabling efficient retrieval of base cases from long-term memory, namely analogical transference grounded on familiarity. In general, it has been argued that analogical reasoning is a way by which a conscious agent tries to determine or define a certain scope of objects and the relationships between them using previous knowledge of another, familiar domain of objects and relations. However, insofar as the aim is to simulate the analogy process with computational programs, a complete description requires a deeper consideration of its phenomenological nature. Such a consideration also gives an idea of how complex a fully computational account of the elements of analogy would be. In fact, familiarity is not the result of a mere chain of repetitions of objects or events; it is generated insofar as the object, attribute, or event in question can be integrated into a context that takes shape as functionalities and functional approaches or perspectives on the object are defined. Familiarity is not generated by identifying an object's parts or objective determinations as if they were isolated from those functionalities and approaches. Rather, at the core of such familiarity between entities of different kinds lies the way they are functionally encoded. Hoping to make deeper inroads into these topics, this essay considers how cognitive-computational perspectives can build on a phenomenological projection of the analogy process, both by reviewing achievements already obtained and by exploring new theoretical-experimental configurations for implementing analogy models in special-purpose as well as general-purpose machines.

Keywords: analogy, association, encoding, retrieval

Procedia PDF Downloads 121
1894 Dissection of Genomic Loci for Yellow Vein Mosaic Virus Resistance in Okra (Abelmoschus esculentus)

Authors: Rakesh Kumar Meena, Tanushree Chatterjee

Abstract:

Okra (Abelmoschus esculentus L. Moench), or lady's finger, is an important vegetable crop belonging to the Malvaceae family. Unfortunately, the production and productivity of okra are majorly affected by Yellow Vein Mosaic Virus (YVMV). The cross AO:189 (resistant parent) × AO:191 (susceptible parent) was used for the development of the mapping population, which comprised 143 F₂:F₃ individuals and was characterized by physiological and pathological observations. A set of 360 DNA markers was screened for polymorphism between the contrasting parents AO:189 and AO:191; of these, 84 polymorphic markers were used for genotyping the mapping population. The markers were distributed into four linkage groups (LG1, LG2, LG3, and LG4). LG3 covered the longest span (106.8 cM) with the maximum number of markers (27), while LG1 was the smallest linkage group in terms of length (71.2 cM). QTL identification using the composite interval mapping approach detected two prominent QTLs, QTL1 and QTL2, for resistance against YVMV disease. These QTLs were placed within the marker intervals NBS-LRR72-Path02 and NBS-LRR06-NBS-LRR65 on linkage groups 02 and 04, respectively. The LOD values of QTL1 and QTL2 were 5.7 and 6.8, accounting for 19% and 27% of the total phenotypic variation, respectively. The findings of this study provide two linked markers which can be used as efficient diagnostic tools to distinguish between YVMV-resistant and susceptible okra cultivars/genotypes. Lines identified as highly resistant to YVMV infection can be used as donor lines for this trait. This will be instrumental in accelerating the trait improvement program in okra and will substantially reduce the yield losses due to this viral disease.
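The LOD statistic used to declare QTLs can be illustrated with a single-marker regression sketch in Python (a simplification of composite interval mapping, which additionally conditions on background markers); the genotype and phenotype vectors below are toy data, not the okra mapping population:

```python
import math

def lod_score(genotypes, phenotypes):
    """Single-marker LOD: (n/2) * log10(RSS_null / RSS_marker),
    with genotypes coded 0/1/2 and a simple linear fit."""
    n = len(phenotypes)
    mean_p = sum(phenotypes) / n
    rss0 = sum((p - mean_p) ** 2 for p in phenotypes)   # null model: mean only
    # fit phenotype = a + b * genotype by least squares
    mean_g = sum(genotypes) / n
    sxx = sum((g - mean_g) ** 2 for g in genotypes)
    sxy = sum((g - mean_g) * (p - mean_p) for g, p in zip(genotypes, phenotypes))
    b = sxy / sxx
    a = mean_p - b * mean_g
    rss1 = sum((p - (a + b * g)) ** 2 for g, p in zip(genotypes, phenotypes))
    return (n / 2) * math.log10(rss0 / rss1)

geno = [0, 0, 1, 1, 2, 2, 0, 2]
pheno = [1.0, 1.2, 2.1, 1.9, 3.0, 3.2, 0.9, 2.8]   # strongly linked toy trait
score = lod_score(geno, pheno)
```

A score above the customary threshold of about 3, like the 5.7 and 6.8 reported for QTL1 and QTL2, indicates that the marker model explains the phenotype far better than chance.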

Keywords: Okra, yellow vein mosaic virus, resistant, linkage map, QTLs

Procedia PDF Downloads 215
1893 Roasting Process of Sesame Seeds Modelling Using Gene Expression Programming: A Comparative Analysis with Response Surface Methodology

Authors: Alime Cengiz, Talip Kahyaoglu

Abstract:

The roasting process is of major importance for obtaining the desired aromatic taste of nuts. In this study, two kinds of roasting were applied to hulled sesame seeds: vacuum oven and hot air roasting. The efficiency of Gene Expression Programming (GEP), a soft computing technique of evolutionary algorithms that describes cause-and-effect relationships in data modelling, and of response surface methodology (RSM) was examined in modelling the roasting processes over a range of temperatures (120-180°C) for various times (30-60 min). Color attributes (L*, a*, b*, Browning Index (BI)), textural properties (hardness and fracturability), and moisture content were evaluated and modelled by RSM and GEP. The GEP-based formulations and the RSM approach were compared with experimental results and evaluated according to correlation coefficients. The results showed that both GEP and RSM were able to adequately learn the relation between roasting conditions and the physical and textural parameters of roasted seeds. However, GEP had better prediction performance than RSM, with high correlation coefficients (R² > 0.92) for all quality parameters. This result indicates that soft computing techniques have better capability for describing the physical changes occurring in sesame seeds during the roasting process.
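The RSM side of the comparison amounts to fitting a second-order polynomial in temperature and time by least squares; a sketch with synthetic response data (not the measured sesame data) could be:

```python
import numpy as np

# Full second-order response surface: y = b0 + b1*T + b2*t + b3*T*t + b4*T^2 + b5*t^2
# over a 3x3 factorial of temperature T (°C) and time t (min), matching the
# study's ranges. The response values are synthetic stand-ins for a quality
# attribute such as the Browning Index.
T = np.array([120, 120, 120, 150, 150, 150, 180, 180, 180], dtype=float)
t = np.array([30, 45, 60, 30, 45, 60, 30, 45, 60], dtype=float)
y = 5 + 0.02 * T + 0.1 * t + 0.001 * T * t          # synthetic response

X = np.column_stack([np.ones_like(T), T, t, T * t, T**2, t**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
r2 = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The R² computed this way is the correlation-based criterion the abstract uses to compare RSM against the GEP formulations.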

Keywords: genetic expression programming, response surface methodology, roasting, sesame seed

Procedia PDF Downloads 418
1892 Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset

Authors: Adrienne Kline, Jaydip Desai

Abstract:

Electroencephalography (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under the cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) based on real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands.
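One common feature-extraction step in such a BMI is band power per channel; a sketch with NumPy's FFT is shown below, using a 128 Hz sampling rate typical of Emotiv consumer headsets and a synthetic 10 Hz test tone rather than the study's recordings:

```python
import numpy as np

# Band-power features for one EEG channel. Sampling rate and band edges are
# typical values, not necessarily the study's exact Simulink settings.
fs = 128                                   # Hz
tsec = np.arange(0, 2.0, 1.0 / fs)
signal = (np.sin(2 * np.pi * 10 * tsec)    # 10 Hz test tone (alpha band)
          + 0.1 * np.random.default_rng(0).standard_normal(tsec.size))

spectrum = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)

def band_power(lo, hi):
    """Sum spectral power over [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].sum()

alpha = band_power(8, 13)                  # the 10 Hz tone falls here
beta = band_power(13, 30)
```

Per-channel band powers like these form the feature vectors that a classifier maps to 'right' or 'left' commands for the robotic hands.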

Keywords: brain-machine interface, EEGLAB, Emotiv EEG Neuroheadset, OpenViBE, Simulink

Procedia PDF Downloads 502
1891 The Optimization of TICSI in the Convergence Mechanism of Urban Water Management

Authors: M. Macchiaroli, L. Dolores, V. Pellecchia

Abstract:

With the recent Resolution n. 580/2019/R/idr, the Italian Regulatory Authority for Energy, Networks and Environment (ARERA) has introduced, for urban water managements characterized by persistent critical issues regarding the planning and organization of the service and the implementation of the interventions necessary to improve infrastructure and management quality, a new mechanism for determining tariffs: the regulatory scheme of Convergence. The aim of this regulatory scheme is to overcome the fragmentation of the water service in order to improve the stability of local institutional structures, technical quality, and contractual quality, as well as to guarantee elements of transparency for users of the service. The Convergence scheme presupposes the identification of the cost items to be considered in the tariff in parametric terms, distinguishing three possible cases according to the type of historical data available to the manager. The study focuses, in particular, on operations that have neither data on tariff revenues nor data on operating costs. In this case, the manager's constraint on revenues (VRG) is estimated on the basis of a reference benchmark and becomes the starting point for defining the structure of the tariff classes, in compliance with the TICSI provisions (Integrated Text for tariff classes, ARERA Resolution n. 665/2017/R/idr). The proposed model implements recent studies on optimization models for the definition of tariff classes in compliance with the constraints dictated by TICSI under the Convergence mechanism, offering a support tool for managers and the local water regulatory authority in the decision-making process.

Keywords: decision-making process, economic evaluation of projects, optimizing tools, urban water management, water tariff

Procedia PDF Downloads 118