Search results for: genetic breeding models
7314 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option pricing with Black-Scholes models with jumps ensures that market movement is taken into account. However, this model can only be solved numerically. Furthermore, not all numerical methods are efficient for solving these models, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is. Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model
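A minimal sketch of the complexity reduction claimed above: discretizing the jump integral yields a Toeplitz matrix, which can be embedded in a circulant matrix, and a circulant matrix-vector product costs O(M log M) via the FFT. The matrix and vectors below are random stand-ins, not the paper's actual discretization.

```python
import numpy as np

# A circulant matrix C is diagonalized by the DFT, so C @ v can be computed
# as ifft(fft(c) * fft(v)), where c is the first column of C.
# That is O(M log M) instead of the O(M^2) dense product.
M = 8
rng = np.random.default_rng(0)
c = rng.standard_normal(M)   # first column (stand-in for the jump kernel)
v = rng.standard_normal(M)   # stand-in for the option value vector

# Dense O(M^2) reference product
C = np.column_stack([np.roll(c, j) for j in range(M)])
dense = C @ v

# FFT-based O(M log M) product
fast = np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)).real
assert np.allclose(dense, fast)
print(np.round(fast, 3))
```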
Procedia PDF Downloads 151
7313 Reinforcing the Nagoya Protocol through a Coherent Global Intellectual Property Framework: Effective Protection for Traditional Knowledge Associated with Genetic Resources in Biodiverse African States
Authors: Oluwatobiloba Moody
Abstract:
On October 12, 2014, the Nagoya Protocol, negotiated by Parties to the Convention on Biological Diversity (CBD), entered into force. The Protocol was negotiated to implement the third objective of the CBD, which relates to the fair and equitable sharing of benefits arising from the utilization of genetic resources (GRs). The Protocol aims to 'protect' GRs and traditional knowledge (TK) associated with GRs from 'biopiracy' through the establishment of a binding international regime on access and benefit sharing (ABS). In reflecting on the question of 'effectiveness' in the Protocol's implementation, this paper argues that the underlying problem of 'biopiracy', which the Protocol seeks to address, is one which goes beyond the ABS regime. It rather thrives due to indispensable factors emanating from the global intellectual property (IP) regime. It contends that biopiracy therefore constitutes an international problem of 'borders' as much as of 'regimes' and, therefore, while the implementation of the Protocol may effectively address the 'trans-border' issues which have hitherto troubled African provider countries in their establishment of regulatory mechanisms, it remains unable to address the 'trans-regime' issues related to the eradication of biopiracy, especially those issues which involve the IP regime. This is due to the glaring incoherence between the Nagoya Protocol's implementation and the existing global IP system. In arriving at conclusions, the paper examines the ongoing related discussions within the IP regime, specifically those within the WIPO Intergovernmental Committee on Intellectual Property and Genetic Resources, Traditional Knowledge and Folklore (IGC) and the WTO TRIPS Council. It concludes that the Protocol's effectiveness in protecting TK associated with GRs is conditional on the attainment of outcomes, within the ongoing negotiations of the IP regime, which could be implemented in a coherent manner with the Nagoya Protocol, and it proposes specific ways to achieve this coherence. Three main methodological steps have been incorporated in the paper's development. First, a review of data accumulated over a two-year period arising from the coordination of six important negotiating sessions of the WIPO IGC; in this respect, the research benefits from reflections on the political, institutional and substantive nuances which have coloured the IP negotiations and which provide both the context and subtext to emerging texts. Second, a desktop review of the history, nature and significance of the Nagoya Protocol, using relevant primary and secondary literature from international and national sources. Third, a comparative analysis of selected biopiracy cases, undertaken for the purpose of establishing the inseparability of the IP regime and the ABS regime in the conceptualization and development of solutions to biopiracy. A comparative analysis of select African regulatory mechanisms for the protection of TK (Kenya, South Africa, Ethiopia, and the ARIPO Swakopmund Protocol) is also undertaken. Keywords: biopiracy, intellectual property, Nagoya Protocol, traditional knowledge
Procedia PDF Downloads 429
7312 SARS-CoV-2: Prediction of Critical Charged Amino Acid Mutations
Authors: Atlal El-Assaad
Abstract:
Viruses change with time through mutations and result in new variants that may persist or disappear. A mutation refers to an actual change in the virus genetic sequence, and a variant is a viral genome that may contain one or more mutations. Critical mutations may cause the virus to be more transmissible, cause more severe disease, and be more resistant to diagnostics, therapeutics, and vaccines. Thus, variants carrying such mutations may increase the risk to human health and are considered variants of concern (VOC). Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the positive-sense single-stranded RNA virus, contagious in humans, that caused coronavirus disease 2019 (COVID-19), has been studied thoroughly, and several variants have been identified across the world along with their corresponding mutations. SARS-CoV-2 has four structural proteins, known as the S (spike), E (envelope), M (membrane), and N (nucleocapsid) proteins, but prior studies and vaccine development focused on genetic mutations in the S protein due to its vital role in allowing the virus to attach and fuse with the membrane of a host cell. Specifically, subunit S1 catalyzes attachment, whereas subunit S2 mediates fusion. In this perspective, we studied all charged amino acid mutations of the SARS-CoV-2 viral spike protein S1 when bound to antibody CC12.1 in a crystal structure and assessed the effect of the different mutations. We generated all missense mutants of SARS-CoV-2 protein amino acids (AAs) within the SARS-CoV-2:CC12.1 complex model. To generate the family of mutants in each complex, we mutated every charged amino acid with all other charged amino acids (lysine (K), arginine (R), glutamic acid (E), and aspartic acid (D)) and studied the new binding of the complex after each mutation. We applied Poisson-Boltzmann electrostatic calculations feeding into free energy calculations to determine the effect of each mutation on binding. After analyzing our data, we identified the charged amino acids that are key for binding. Furthermore, we validated those findings against published experimental genetic data. Our results are the first to propose in silico potential life-threatening mutations of SARS-CoV-2 beyond the mutations present in the five common variants found worldwide. Keywords: SARS-CoV-2, variant, ionic amino acid, protein-protein interactions, missense mutation, AESOP
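The mutant-enumeration step described above (every charged residue swapped with each of the other charged residues K, R, E, D) can be sketched in a few lines. The example sequence and the position numbering below are illustrative, not taken from the SARS-CoV-2:CC12.1 structure.

```python
# Charged amino acids considered in the study
CHARGED = "KRED"

def charged_mutants(seq):
    """Yield (position, wild-type, substitute, mutated sequence) tuples."""
    for i, aa in enumerate(seq):
        if aa in CHARGED:
            for sub in CHARGED:
                if sub != aa:
                    yield i, aa, sub, seq[:i] + sub + seq[i + 1:]

# Each mutant would then be fed to the Poisson-Boltzmann / free energy step
for pos, wt, mut, mutant_seq in charged_mutants("ACDEFKR"):
    print(f"{wt}{pos + 1}{mut}: {mutant_seq}")
```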
Procedia PDF Downloads 113
7311 Modeling and Simulation Methods Using MATLAB/Simulink
Authors: Jamuna Konda, Umamaheswara Reddy Karumuri, Sriramya Muthugi, Varun Pishati, Ravi Shakya
Abstract:
This paper investigates the challenges involved in the mathematical modeling of plant simulation models, ensuring that the performance of the plant models is much closer to that of the real-time physical model. The paper includes the analysis performed and an investigation of different methods of modeling, design, and development for plant models. Issues which impact design time, model accuracy as a real-time model, and tool dependence are analyzed. The real-time hardware plant would be a combination of multiple physical models, and it is challenging to test the complete system with all possible test scenarios, since there are possibilities of failure or damage to the system due to unwanted test execution in real time. Keywords: model based design (MBD), MATLAB, Simulink, Stateflow, plant model, real-time model, real-time workshop (RTW), target language compiler (TLC)
Procedia PDF Downloads 343
7310 Application of Human Biomonitoring and Physiologically-Based Pharmacokinetic Modelling to Quantify Exposure to Selected Toxic Elements in Soil
Authors: Eric Dede, Marcus Tindall, John W. Cherrie, Steve Hankin, Christopher Collins
Abstract:
Current exposure models used in contaminated land risk assessment are highly conservative. Use of these models may lead to over-estimation of actual exposures, possibly resulting in negative financial implications due to unnecessary remediation. Thus, we are carrying out a study seeking to improve our understanding of human exposure to selected toxic elements in soil resulting from allotment land-use: arsenic (As), cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb). The study employs biomonitoring and physiologically-based pharmacokinetic (PBPK) modelling to quantify human exposure to these elements. We recruited 37 allotment users (adults > 18 years old) in Scotland, UK, to participate in the study. Concentrations of the elements (and their bioaccessibility) were measured in allotment samples (soil and allotment produce). Records of the amount of produce consumed by the participants, together with participants' biological samples (urine and blood), were collected for up to 12 consecutive months. Ethical approval was granted by the University of Reading Research Ethics Committee. PBPK models (coded in MATLAB) were used to estimate the distribution and accumulation of the elements in key body compartments, thus indicating the internal body burden. Simulating low element intake (based on estimated 'doses' from produce consumption records), predictive models suggested that detection of these elements in urine and blood was possible within a given period of time following exposure. This information was used in planning the biomonitoring and is currently being used in the interpretation of test results from biological samples. Evaluation of the models is being carried out using the biomonitoring data, by comparing model-predicted concentrations with measured biomarker concentrations. The PBPK models will be used to generate bioavailability values, which could be incorporated into contaminated land exposure models. Thus, the findings from this study will promote a more sustainable approach to contaminated land management. Keywords: biomonitoring, exposure, PBPK modelling, toxic elements
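To make the modelling step concrete, here is a minimal one-compartment sketch of the kind of kinetics a PBPK model resolves in far more detail (the study's MATLAB models track many body compartments). The rate constants and dose below are assumed values, not the study's parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-compartment toy model: gut -> blood -> elimination (urine).
# ka, ke and the dose are illustrative assumptions.
ka, ke = 0.5, 0.05   # absorption and elimination rates (1/day)

def model(t, y):
    gut, blood = y
    return [-ka * gut, ka * gut - ke * blood]

# Single oral intake of 1 unit of the element at t = 0
sol = solve_ivp(model, (0, 60), [1.0, 0.0], dense_output=True)
t = np.linspace(0, 60, 7)
print(np.round(sol.sol(t)[1], 4))   # blood burden over time -> biomarker window
```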
Procedia PDF Downloads 319
7309 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra
Authors: Armin Rahimi
Abstract:
The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project in determining mass redistribution within the Earth system. Large deformations caused by earthquakes are in the high-frequency band. Unfortunately, GRACE is only capable of providing reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off-Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals using Gaussian filtering with a 350 km radius commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity changes predicted from all slip models became similar at the spatial resolution attainable by GRACE observations, and the predicted gravity changes were consistent with the GRACE-detected gravity changes. Nevertheless, the fault models, which give different slip amplitudes, lead proportionally to different amplitudes in the predicted gravity changes. Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution
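The Gaussian filtering mentioned above is conventionally applied as degree-dependent weights on the SH coefficients; one common recursion for those weights (due to Jekeli) is sketched below, under the assumption of the standard normalization with w(0) = 1. The smoothing radius is the 350 km used in the study; the Earth radius is the usual mean value.

```python
import numpy as np

# Gaussian averaging weights for spherical-harmonic coefficients
# (one common form, after Jekeli, 1981). Multiply degree-l coefficients
# by w[l] to apply the smoothing.
def gaussian_weights(lmax, r_km=350.0, R_km=6371.0):
    b = np.log(2.0) / (1.0 - np.cos(r_km / R_km))
    w = np.zeros(lmax + 1)
    w[0] = 1.0
    w[1] = (1.0 + np.exp(-2.0 * b)) / (1.0 - np.exp(-2.0 * b)) - 1.0 / b
    for l in range(1, lmax):
        w[l + 1] = -(2 * l + 1) / b * w[l] + w[l - 1]
        if w[l + 1] < 0:          # the recursion turns unstable at high degree
            w[l + 1:] = 0.0
            break
    return w

w = gaussian_weights(60)
print(np.round(w[:8], 4))
```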
Procedia PDF Downloads 355
7308 Artificial Intelligence in Disease Diagnosis
Authors: Shalini Tripathi, Pardeep Kumar
Abstract:
The method of translating observed symptoms into disease names is known as disease diagnosis. The ability to solve clinical problems in a complex manner is critical to a doctor's effectiveness in providing health care, and the accuracy of his or her expertise is crucial to the survival and well-being of his or her patients. Artificial Intelligence (AI) has a huge economic influence depending on how well it is applied. In the medical sector, human brain-simulated intellect can help not only with classification accuracy but also with reducing the diagnostic time, cost, and pain associated with pathology tests. In light of AI's present and prospective applications in biomedicine, we identify them in this paper based on potential benefits and risks, social and ethical consequences, and issues that might be contentious but have not been thoroughly discussed in the literature. Current apps, personal tracking tools, genetic tests and editing programmes, customizable models, web environments, virtual reality (VR) technologies, and surgical robotics will all be investigated in this study. While AI holds a lot of potential in medical diagnostics, it is still a very new method, and many clinicians are uncertain about its reliability and specificity and about how it can be integrated into clinical practice without jeopardising clinical expertise. To validate their effectiveness, more systematic refinement of these implementations will be needed, as well as training of physicians and healthcare facilities on how to effectively incorporate these strategies into clinical practice. Keywords: artificial intelligence, medical diagnosis, virtual reality, healthcare ethical implications
Procedia PDF Downloads 132
7307 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4
Authors: Ryan A. Black, Stacey A. McCaffrey
Abstract:
Over the past few decades, great strides have been made towards improving the science of measuring psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now widely used to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ the models and decide which are the most appropriate to use in their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated in a given model, the function which links the expected response and the predictor, response option formats, and dimensionality. As a result, inferior methods (a.k.a. Classical Test Theory methods) continue to be employed in efforts to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models; that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover binary IRT models from the most basic, the 1-parameter logistic (1-PL) model dating back over 50 years, up to the most recent and complex, the 4-parameter logistic (4-PL) model. Binary IRT models will be defined mathematically, and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1. simulated data of N=500,000 subjects who responded to four dichotomous items, and 2. a pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to the emotional consequences of alcohol use. Real-world data were based on responses collected on items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). IRT analyses conducted on both the simulated data and the real-world pilot data will provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate the simulated data and analyses will be available upon request to allow for replication of results. Keywords: instrument development, item response theory, latent trait theory, psychometrics
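The 4-PL item response function referenced above has the standard form P(theta) = c + (d - c) / (1 + exp(-a(theta - b))), and the simpler models fall out as special cases: the 3-PL fixes d = 1, the 2-PL additionally fixes c = 0, and the 1-PL further fixes a common discrimination. A minimal sketch (in Python rather than the SAS used in the paper; the parameter values are illustrative):

```python
import numpy as np

def p_correct(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """P(correct | ability theta) under the 4-parameter logistic model.

    a: discrimination, b: difficulty, c: guessing (lower asymptote),
    d: carelessness / upper asymptote.
    """
    return c + (d - c) / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)
print(np.round(p_correct(theta, a=1.5, b=0.5, c=0.2, d=0.95), 3))
```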
Procedia PDF Downloads 356
7306 Unlocking Health Insights: Studying Data for Better Care
Authors: Valentina Marutyan
Abstract:
Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. It is the process of examining huge amounts of data to extract useful information that can be applied to improve patient care, treatment effectiveness, and overall healthcare delivery. This field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. To accomplish this, it uses advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. This enables doctors to get involved early to prevent problems or improve results for patients, and it assists in early disease detection and customized treatment planning for every person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies; in this way, treatments can be more effective and have fewer negative consequences. Moreover, beyond helping patients, it improves the efficiency of hospitals, helping them determine the number of beds or doctors they require with regard to the number of patients they expect. This project uses models such as logistic regression, random forests, and neural networks for predicting diseases and analyzing medical images. Patients were grouped using algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Ultimately, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and operate more efficiently. It comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients. Keywords: data mining, healthcare, big data, large amounts of data
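A minimal sketch of two of the methods named above (random forest prediction and k-means grouping), using scikit-learn on synthetic stand-in data; the project's real EHR data and feature set are obviously not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split

# Synthetic stand-in for patient records: 6 numeric features, binary outcome
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 6))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Disease prediction with a random forest, as named in the abstract
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))

# Patient grouping with k-means, as named in the abstract
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("patients per cluster:", np.bincount(groups))
```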
Procedia PDF Downloads 76
7305 Genetic Diversity in Capsicum Germplasm Based on Inter Simple Sequence Repeat Markers
Authors: Siwapech Silapaprayoon, Januluk Khanobdee, Sompid Samipak
Abstract:
Chili peppers are the fruits of Capsicum pepper plants, well known for the fiery burning sensation they cause on the tongue after consumption. They are members of the Solanaceae or common nightshade family, along with potato, tomato, and eggplant. Thai cuisine has gained popularity for its distinct flavors due to the use of various spices and the heat added by chili pepper. Though used in small quantities in each dish, chili pepper holds a special place in Thai cuisine. There are many varieties of chili peppers in Thailand, and thirty accessions were collected at Rajamangala University of Technology Lanna, Lampang, Thailand. To effectively manage any germplasm, it is essential to know the diversity and relationships among its members. Thirty-six Inter Simple Sequence Repeat (ISSR) DNA markers were used to analyze the germplasm. A total of 335 polymorphic bands was obtained, giving an average of 9.3 alleles per marker. Unweighted pair group method with arithmetic mean (UPGMA) clustering of the data using NTSYS-pc software indicated that the accessions showed varied levels of genetic similarity, ranging from 0.57 to 1.00 in similarity coefficient index, indicating significant levels of variation. At an SM coefficient of 0.81, the germplasm was separated into four groups. Phenotypic variation is discussed in the context of the phylogenetic tree clustering. Keywords: diversity, germplasm, chili pepper, ISSR
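The clustering step above can be sketched compactly: score each accession's ISSR bands as a 0/1 row, take 1 minus the simple matching (SM) coefficient as the distance, and run UPGMA (average linkage). The tiny band matrix below is made up for illustration; the study scored 335 polymorphic bands over 30 accessions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

# Rows are accessions, columns are ISSR bands scored 1 (present) / 0 (absent)
bands = np.array([[1, 0, 1, 1, 0, 1],
                  [1, 0, 1, 0, 0, 1],
                  [0, 1, 0, 1, 1, 0],
                  [0, 1, 1, 1, 1, 0]])

# Hamming distance on 0/1 data equals 1 - simple matching (SM) similarity
dist = pdist(bands, metric='hamming')
tree = linkage(dist, method='average')   # 'average' linkage == UPGMA
print(tree)                              # or dendrogram(tree) to plot
```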
Procedia PDF Downloads 152
7304 Non-Dominated Sorting Genetic Algorithm (NSGA-II) for the Redistricting Problem in Mexico
Authors: Antonin Ponsich, Eric Alfredo Rincon Garcia, Roman Anselmo Mora Gutierrez, Miguel Angel Gutierrez Andrade, Sergio Gerardo De Los Cobos Silva, Pedro Lara Velázquez
Abstract:
The electoral zone design problem consists of redrawing the boundaries of legislative districts for electoral purposes in such a way that federal or state requirements are fulfilled. In Mexico, this process has historically been carried out by the National Electoral Institute (INE) by optimizing an integer nonlinear programming model, in which population equality and compactness of the designed districts are considered as two conflicting objective functions, while contiguity is included as a hard constraint. The solution technique used by the INE is a Simulated Annealing (SA) based algorithm, which handles the multi-objective nature of the problem through an aggregation function. The present work represents the first attempt to apply a classical Multi-Objective Evolutionary Algorithm (MOEA), the second version of the Non-dominated Sorting Genetic Algorithm (NSGA-II), to this hard combinatorial problem. First results show that, when compared with the SA algorithm, the NSGA-II obtains promising results. The MOEA manages to produce well-distributed solutions over a wide-spread front, even though convergence trouble on some instances remains an issue to be corrected in future adaptations of MOEAs to the redistricting problem. Keywords: multi-objective optimization, NSGA-II, redistricting, zone design problem
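At the heart of NSGA-II is the Pareto dominance test over the two (minimized) objectives named above, population deviation and a compactness penalty; a minimal sketch follows, with district plans and scores invented for illustration.

```python
def dominates(a, b):
    """True if plan a is no worse than b in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

plans = {  # plan -> (population deviation, compactness penalty), both minimized
    "P1": (0.02, 0.90), "P2": (0.05, 0.40),
    "P3": (0.04, 0.95), "P4": (0.01, 1.20),
}

# First non-dominated front: plans not dominated by any other plan
front = [p for p in plans
         if not any(dominates(plans[q], plans[p]) for q in plans if q != p)]
print("first non-dominated front:", front)   # P3 is dominated by P1
```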
Procedia PDF Downloads 367
7303 A Review Investigating the Potential of Zooxanthellae to Be Genetically Engineered to Combat Coral Bleaching
Authors: Anuschka Curran, Sandra Barnard
Abstract:
Coral reefs are among the most diverse and productive ecosystems on the planet, but due to the impact of climate change, these ecosystems are dying off, primarily through coral bleaching. Coral bleaching can be described as the process by which zooxanthellae (algal endosymbionts) are expelled from the gastrodermal cavity of the respective coral host, causing increased coral whitening. The general consensus is that mass coral bleaching is due to the dysfunction of photosynthetic processes in the zooxanthellae as a result of the combined action of elevated temperature and light stress. The question then is: do zooxanthellae have the potential to play a key role in the future of coral reef restoration through genetic engineering? The aim of this study is, firstly, to review the different zooxanthellae taxa and their traits with respect to environmental stress and, secondly, to review the information available on the protective mechanisms present in zooxanthellae cells when experiencing temperature fluctuations, concentrating specifically on heat shock proteins and the antioxidant stress response of zooxanthellae. The eight clades (A-H) previously recognized have been redefined into seven genera. Different zooxanthellae taxa exhibit different traits, such as their photosynthetic stress responses to light and temperature. Zooxanthellae have the ability to determine the amount and type of heat shock proteins (hsps) present during a heat response, and they can regulate both their own hsps and those of the respective coral host. Hsps generally found in genotype C3 zooxanthellae, such as Hsp70 and Hsp90, contribute to the thermal stress response of the respective coral host. Antioxidant activity, found both within exposed coral tissue and in the zooxanthellae cells, can prevent coral hosts from expelling their endosymbionts. The up-regulation of gene expression, which may mitigate the thermal stress induction of any of the physiological aspects discussed, can ensure stable coral-zooxanthellae symbiosis in the future and presents a viable alternative strategy to preserve reefs amidst climate change. In conclusion, despite the unusual molecular design of zooxanthellae, genetic engineering poses a useful tool for understanding and manipulating variables and systems within zooxanthellae and therefore presents a solution that can ensure stable coral-zooxanthellae symbiosis in the future. Keywords: antioxidant enzymes, genetic engineering, heat-shock proteins, Symbiodinium
Procedia PDF Downloads 189
7302 Automatic and Highly Precise Modeling for System Optimization
Authors: Stephanie Chen, Mitja Echim, Christof Büskens
Abstract:
To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems, this approach can be incomplete and hence imprecise, and moreover too slow to be computed efficiently. Therefore, these models might not be applicable for the numerical optimization of real systems, since such techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for every system detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables, and not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. In this way, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent. Moreover, the automatic identification of the correlations was able to discover previously unknown relationships. To summarize, the above-mentioned approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods will be illustrated with different examples. Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization
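A minimal sketch of a regression that, like the approach above, includes products of variables as terms of a truncated series expansion. The sensor data below is synthetic, with the true relation containing an x1*x2 interaction; this illustrates the idea only, not the proprietary method itself.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic "sensor" data whose true law contains the product term 3 * x1 * x2
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
y = 3.0 * X[:, 0] * X[:, 1] + 0.5 * X[:, 0] + rng.normal(scale=0.01, size=200)

# Degree-2 expansion adds x1^2, x2^2 and the cross term x1*x2 as regressors
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression()).fit(X, y)
names = model[0].get_feature_names_out(["x1", "x2"])
coefs = np.round(model[1].coef_, 3)
print(dict(zip(names, coefs)))   # the x1*x2 coefficient recovers ~3.0
```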
Procedia PDF Downloads 409
7301 Model Updating Based on Modal Parameters Using Hybrid Pattern Search Technique
Authors: N. Guo, C. Xu, Z. C. Yang
Abstract:
In order to ensure the high reliability of an aircraft, accurate structural dynamics analysis has become an indispensable part of the design of an aircraft structure. Therefore, a structural finite element model which can be used to accurately calculate the structural dynamics and their transfer relations is the prerequisite in structural dynamic design. A dynamic finite element model updating method is presented to correct the uncertain parameters of the finite element model of a structure using measured modal parameters. The coordinate modal assurance criterion is used to evaluate the correlation level at each coordinate between the experimental and the analytical mode shapes. Then, the weighted summation of the natural frequency residual and the coordinate modal assurance criterion residual is used as the objective function. Moreover, the hybrid pattern search (HPS) optimization technique, which synthesizes the advantages of the pattern search (PS) optimization technique and the genetic algorithm (GA), is introduced to solve the dynamic FE model updating problem. A numerical simulation and a model updating experiment on the GARTEUR aircraft model are performed to validate the feasibility and effectiveness of the present dynamic model updating method, respectively. The updated results show that the proposed method can be successfully used to modify the incorrect parameters with good robustness. Keywords: model updating, modal parameter, coordinate modal assurance criterion, hybrid genetic/pattern search
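The modal assurance criterion underpinning the correlation step above has the standard form MAC = |phi_a . phi_e|^2 / ((phi_a . phi_a)(phi_e . phi_e)), equal to 1 for perfectly correlated mode shapes. A minimal sketch, with two illustrative shape vectors (not GARTEUR data):

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal assurance criterion between analytical and experimental shapes."""
    num = np.abs(phi_a @ phi_e) ** 2
    return num / ((phi_a @ phi_a) * (phi_e @ phi_e))

phi_analytical = np.array([1.0, 0.8, 0.3, -0.2])
phi_measured   = np.array([0.9, 0.85, 0.25, -0.3])
print(round(mac(phi_analytical, phi_measured), 4))   # close to 1 -> good match
```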
Procedia PDF Downloads 161
7300 Screening Some Accessions of Lentil (Lens culinaris M.) for Salt Tolerance at Germination and Early Seedling Stage in Eastern Ethiopia
Authors: Azene Tesfaye, Yohannes Petros, Habtamu Zeleke
Abstract:
To evaluate genetic variation among Ethiopian lentil, a laboratory experiment was conducted to screen 12 accessions of lentil (Lens culinaris M.) for salt tolerance. Seeds of the 12 lentil accessions were grown under laboratory (Petri dish) conditions with different levels of salinity (0, 2, 4, and 8 dSm-1 NaCl) for 4 weeks. The experimental design was a completely randomized design (CRD) in factorial combination with three replications. Data analysis was carried out using SAS software. Average germination time, germination percentage, seedling shoot and root traits, and seedling shoot and root weight were evaluated. The two-way ANOVA revealed statistically significant variation among lentil accessions, NaCl levels, and their interactions (p<0.001) with respect to all parameters. It was found that salt stress significantly delays germination rate and decreases germination percentage, shoot and root length, and seedling shoot and root weight of lentil accessions. The degree of decrease varied with accession and salinity level. Accessions 36120, 9235, and 36004 were more salt tolerant than the other accessions. As a result, they are recommended as genetic resources for the development of lentil accessions and other very salt-sensitive crops with improved germination under salt stress conditions. Keywords: accession, germination, lentil, NaCl, screening, seedling stage
Procedia PDF Downloads 339
7299 The Diversity of DRB1 Locus of Exon 2 of MHC Molecule of Sudanese Indigenous Desert Sheep
Authors: Muna A. Eissawi, Safaa Abed Elfataah, Haytham Hago, Fatima E. Abukunna, Ibtisam Amin Goreish, Nahid Gornas
Abstract:
The study examined and analyzed the genetic diversity of the DRB1 locus of exon 2 of the major histocompatibility complex of Sudanese desert sheep using PCR-RFLP and DNA sequencing. Five hundred samples belonging to five ecotypes of Sudanese desert sheep (Abrag (Ab), Ashgar (Ash), Hamari (H), Kabashi (K), and Watish (W)) were included. Amplification of exon 2 of the DRB1 gene yielded a 300 bp product in the different ecotypes. Nine different digestion patterns corresponding to five distinct alleles were observed with Rsa1 digestion. Genotype (ag) was the most common among all ecotypes, at 40.4%. The Hardy-Weinberg equilibrium (HWE) test showed that the studied ecotypes deviated significantly from the theoretical proportions of Rsa1 patterns; probability values of the chi-square test for HWE for the MHC-DRB1 gene were 0.00 in all ecotypes. The constructed phylogenetic tree revealed the relationships of 22 Sudanese isolates with each other and showed the sequences shared with 47 published foreign sequences randomly selected from different geographic regions. The results of this study highlight the effect of heterozygosity of MHC genes in the desert sheep of Sudan, which may clarify some of the genetic background of their disease resistance and environmental adaptation. Keywords: desert sheep, MHC, Ovar-DRB1, polymerase chain reaction (PCR), restriction fragment length polymorphism (RFLP)
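The HWE chi-square test reported above compares observed genotype counts with the counts expected from allele frequencies. A minimal sketch for the biallelic case (the study's Rsa1 patterns involve five alleles, which generalizes the same idea); the genotype counts below are invented for illustration.

```python
from scipy.stats import chisquare

# Observed genotype counts at a biallelic locus (illustrative numbers)
n_AA, n_Aa, n_aa = 120, 230, 150
N = n_AA + n_Aa + n_aa

p = (2 * n_AA + n_Aa) / (2 * N)   # allele frequency of A
q = 1.0 - p

# Expected counts under Hardy-Weinberg proportions p^2 : 2pq : q^2
expected = [p * p * N, 2 * p * q * N, q * q * N]
stat, pval = chisquare([n_AA, n_Aa, n_aa], expected, ddof=1)  # 1 df for HWE
print(round(stat, 3), round(pval, 4))
```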
Procedia PDF Downloads 77
7298 Adaptation of Requirement Engineering Practices in Pakistan
Authors: Waqas Ali, Nadeem Majeed
Abstract:
Requirement engineering is an essential part of the software development life cycle. The more time we spend on requirement engineering, the higher the probability of success. Effective requirement engineering ensures and predicts a successful software product. This paper presents the adaptation of requirement engineering practices in small and medium-sized companies in Pakistan. The study was conducted using questionnaires to show how far requirement engineering models and practices are followed in Pakistan. Keywords: requirement engineering, Pakistan, models, practices, organizations
Procedia PDF Downloads 719
7297 Machine Learning-Assisted Selective Emitter Design for Solar Thermophotovoltaic Systems
Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko
Abstract:
Solar thermophotovoltaic (STPV) systems have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, a selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The use of machine learning brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods, allowing for a more comprehensive exploration of the design space and potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency. Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic
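A minimal sketch of the surrogate-plus-GA loop described above: a random forest learns the mapping from layer thicknesses to a figure of merit from sampled simulations, and a simple genetic algorithm then searches the surrogate. The "simulator" here is a toy function standing in for full electromagnetic simulation of the SiC/W/SiO2/W stack; the bounds and GA parameters are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
BOUNDS = np.array([[20, 200]] * 4)               # layer thicknesses (nm), assumed

def toy_fom(t):                                   # stand-in for an EM simulation
    return -np.sum((t - np.array([60, 120, 80, 150])) ** 2, axis=-1)

# Train the random forest surrogate on sampled "simulations"
X = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(400, 4))
surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
surrogate.fit(X, toy_fom(X))

# Simple GA over the surrogate: selection, blend crossover, Gaussian mutation
pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(40, 4))
for gen in range(30):
    fit = surrogate.predict(pop)
    parents = pop[np.argsort(fit)[-20:]]          # keep the fitter half
    idx = rng.integers(0, 20, size=(40, 2))
    alpha = rng.random((40, 1))
    children = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]
    children += rng.normal(scale=2.0, size=children.shape)
    pop = np.clip(children, BOUNDS[:, 0], BOUNDS[:, 1])

best = pop[np.argmax(surrogate.predict(pop))]
print("best thicknesses (nm):", np.round(best, 1))
```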
Procedia PDF Downloads 61
7296 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study
Authors: Natália Botica, Luís Luís, Paulo Bernardes
Abstract:
The Côa Valley, listed as World Heritage since 1998, presents more than 1,300 open-air engraved rock panels. The Archaeological Park of the Côa Valley recorded the rock art motifs, testing various techniques based on direct tracing processes on the rock, using natural and artificial lighting. In this work, integrated in the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vector drawing of the rock art motifs based on orthophotos taken from the photogrammetric survey and 3D models of the rocks. We also present the information system designed to integrate the vector drawings and the characterization data of the motifs, as well as their open-access sharing, in order to promote their reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and its iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated. Keywords: rock art, archaeology, Iron Age, 3D models
Procedia PDF Downloads 83
7295 Models of Environmental Crack Propagation of Some Aluminum Alloys (7xxx)
Authors: H. Jawan
Abstract:
This review describes models of environment-related crack propagation in aluminum alloys (7xxx) developed over the last few decades. Knowledge of the effects of different factors on the susceptibility to SCC permits proposing valuable mechanisms of crack advancement. A reliable cracking mechanism in turn makes it possible to propose the optimum chemical composition and thermal treatment conditions, resulting in the microstructure most suitable for the real environmental conditions and stress state. Keywords: microstructure, environment, propagation, mechanism
Procedia PDF Downloads 390
7294 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images
Authors: Masood Varshosaz, Kamyar Hasanpour
Abstract:
In recent years, there has been growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep the data acquisition cost low, augmentation techniques can be used to create additional data from existing images. There are many such techniques that can help generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or complexity of implementing the techniques. To this end, it is important to evaluate the impact of data augmentation on the performance of the deep learning models. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained for recognizing humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better compared to the base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. Thus, we suggest a new method that employs simulated 3D human models to generate new data for training the network. Keywords: human recognition, deep learning, drones, disaster mitigation
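The 2D augmentations evaluated above can be composed in a few lines with torchvision; a minimal sketch follows, with magnitudes chosen for illustration rather than taken from the paper.

```python
from torchvision import transforms

# Rotation, scaling + random cropping, flipping, and shifting, as listed above
augment = transforms.Compose([
    transforms.RandomRotation(degrees=15),                    # rotation
    transforms.RandomResizedCrop(224, scale=(0.8, 1.0)),      # scaling + crop
    transforms.RandomHorizontalFlip(p=0.5),                   # flipping
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1)), # shifting
    transforms.ToTensor(),
])

# augmented = augment(pil_image)   # apply to each PIL training image
```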
Procedia PDF Downloads 93
7293 Pricing European Continuous-Installment Options under Regime-Switching Models
Authors: Saghar Heidari
Abstract:
In this paper, we study the valuation problem of European continuous-installment options under Markov-modulated models with a partial differential equation approach. Due to the opportunity to continue or stop paying installments, the valuation problem under regime-switching models can be formulated as coupled partial differential equations (CPDEs) with free boundary features. To value the installment options, we express the truncated CPDE as a linear complementarity problem (LCP); a finite element method is then proposed to solve the resulting variational inequality. Under some appropriate assumptions, we establish the stability of the method and present numerical results to examine the rate of convergence and accuracy of the proposed method for the pricing problem under the regime-switching model. Keywords: continuous-installment option, European option, regime-switching model, finite element method
Procedia PDF Downloads 137
7292 A Comparative Analysis of Machine Learning Techniques for PM10 Forecasting in Vilnius
Authors: Mina Adel Shokry Fahim, Jūratė Sužiedelytė Visockienė
Abstract:
With the growing concern over air pollution (AP), it is clear that this issue has gained more prominence than ever before. Awareness has increased, and those who understand the issue have a duty to disseminate that knowledge to others. This realisation often comes with an understanding of how poor air quality, as measured by air quality indices (AQI), damages human health. The study focuses on assessing air pollution prediction models specifically for Lithuania, addressing a substantial need for empirical research within the region. Concentrating on Vilnius, it specifically examines concentrations of particulate matter 10 micrometres or less in diameter (PM10). Utilizing Gaussian Process Regression (GPR), Regression Tree Ensemble, and Regression Tree methodologies, predictive forecasting models are validated and tested using hourly data from January 2020 to December 2022. The study explores the classification of AP data into anthropogenic and natural sources, the impact of AP on human health, and its connection to cardiovascular diseases. The study revealed varying levels of accuracy among the models, with GPR achieving the highest accuracy, indicated by an RMSE of 4.14 in validation and 3.89 in testing. Keywords: air pollution, anthropogenic and natural sources, machine learning, Gaussian process regression, tree ensemble, forecasting models, particulate matter
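A minimal sketch of the best-performing model above (GPR), fitted on synthetic hourly-like data; the kernel choice, features, and split are assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import mean_squared_error

# Synthetic stand-in for an hourly PM10-like signal driven by hour of day
rng = np.random.default_rng(3)
X = rng.uniform(0, 24, size=(300, 1))
y = 20 + 8 * np.sin(X[:, 0] / 24 * 2 * np.pi) + rng.normal(scale=2, size=300)

# Smooth RBF trend plus a white-noise term for measurement noise
kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X[:250], y[:250])

pred, std = gpr.predict(X[250:], return_std=True)   # mean and uncertainty
rmse = np.sqrt(mean_squared_error(y[250:], pred))
print("test RMSE:", round(rmse, 2))
```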
Procedia PDF Downloads 53
7291 Combining Laser Scanning and High Dynamic Range Photography for the Presentation of Bloodstain Pattern Evidence
Authors: Patrick Ho
Abstract:
Bloodstain Pattern Analysis (BPA) forensic evidence can be complex, requiring effective courtroom presentation to ensure a clear and comprehensive understanding of the analyst's findings. BPA witness statements can often involve references to spatial information (such as the location of rooms, objects, and walls) which, when coupled with classified blood patterns, may illustrate the reconstructed movements of suspects and injured parties. However, it may be difficult to communicate this information through photography alone, despite this remaining the UK's established method for presenting BPA evidence. Through an academic-police partnership between the University of Warwick and West Midlands Police (WMP), an integrated 3D scanning and HDR photography workflow for BPA was developed. Homicide scenes were laser scanned and, after processing, the 3D models were utilised in the BPA peer-review process. The same 3D models were made available for court but were not always utilised. This workflow has improved the ease of presentation for analysts and provided 3D scene models that assist with the investigation. However, the effects of incorporating 3D scene models in judicial processes may need to be studied before they are adopted more widely. 3D models from a simulated crime scene and from West Midlands Police cases approved for conference disclosure are presented. We describe how the workflow was developed and integrated into established practices at WMP, including peer-review processes and witness statement delivery in court, and explain the impact the work has had on the Criminal Justice System in the West Midlands. Keywords: bloodstain pattern analysis, forensic science, criminal justice, 3D scanning
Procedia PDF Downloads 96
7290 A Graph-Based Retrieval Model for Passage Search
Authors: Junjie Zhong, Kai Hong, Lei Wang
Abstract:
Passage Retrieval (PR) plays an important role in many Natural Language Processing (NLP) tasks. Traditional efficient retrieval models relying on exact term matching, such as TF-IDF or BM25, have nowadays been surpassed by pre-trained language models, which match by semantics. Though they gain effectiveness, deep language models often require large memory as well as time cost. To tackle the trade-off between efficiency and effectiveness in PR, this paper proposes the Graph Passage Retriever (GraphPR), a graph-based model inspired by the development of graph learning techniques. Different from existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are regarded as nodes in the constructed graph and are embedded in dense vectors. PR can then be implemented using the embeddings and a fast vector-similarity search. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models in several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1-score) while maintaining relatively low query latency and memory usage. Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model
Procedia PDF Downloads 148
7289 Fault Diagnosis of Squirrel-Cage Induction Motors by Neural Network Multi-Models
Authors: Yahia Kourd, N. Guersi, D. Lefebvre
Abstract:
In this paper, we propose to study fault diagnosis in squirrel-cage induction motors using MLP neural networks. We use neural models of healthy and faulty behavior in order to detect and isolate faults in the machine. In the first part of this work, we created a neural model for the healthy state using MATLAB and a motor located at the LGEB, by acquiring input and output data from this machine. We then detected faults in the machine by residual generation. These residuals are not sufficient to isolate the existing faults. For this reason, we proposed additional neural networks to represent the faulty behaviors. From the analysis of these residuals and the choice of a threshold, we propose a method capable of performing the detection and diagnosis of faults in asynchronous machines with squirrel-cage rotors. Keywords: fault diagnosis, neural networks, multi-models, squirrel-cage induction motor
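A minimal sketch of the multi-model residual logic just described: a healthy model generates the detection residual, and a bank of fault models is used for isolation. The toy functions stand in for the trained MLPs; the fault names, signals, and threshold are invented for illustration.

```python
import numpy as np

# Stand-ins for the trained healthy and faulty MLP models
def healthy_model(u):          return 2.0 * u
def broken_bar_model(u):       return 2.0 * u + 0.8 * np.sin(5 * u)
def eccentricity_model(u):     return 1.6 * u

FAULT_BANK = {"broken rotor bar": broken_bar_model,
              "eccentricity": eccentricity_model}
THRESHOLD = 0.2

def diagnose(u, y_measured):
    # Detection: residual against the healthy model
    residual = y_measured - healthy_model(u)
    if np.abs(residual).mean() < THRESHOLD:
        return "healthy"
    # Isolation: the fault model whose prediction best explains the data
    errors = {name: np.abs(y_measured - m(u)).mean()
              for name, m in FAULT_BANK.items()}
    return min(errors, key=errors.get)

u = np.linspace(0, 10, 200)
noisy = broken_bar_model(u) + np.random.default_rng(0).normal(0, 0.02, 200)
print(diagnose(u, noisy))   # -> "broken rotor bar"
```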
Procedia PDF Downloads 636
7288 Location Quotients Model in Turkey's Provinces and NUTS II Regions
Authors: Semih Sözer
Abstract:
One of the most common issues in economic systems is understanding the characteristics of economic activities in cities and regions. Although economic base models have been criticized on conceptual and empirical grounds, they are useful tools for examining the economic structure of a nation, its regions, or its cities. This paper uses one of the methodologies of economic base models, namely the location quotients model. Data for this model comprise employment numbers for provinces and NUTS II regions in Turkey, with a time series covering the years 1990, 2000, 2003, and 2009. The aim of this study is to find which sectors are export-base and which are import-base in each province and region. Model results show that big provinces or powerful regions (in population, size, etc.) mostly have basic sectors in their economic systems. However, interesting findings emerged for different sectors in different provinces and regions. Keywords: economic base, location quotients model, regional economics, regional development
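The location quotient behind this model is the regional employment share of a sector relative to the national share: LQ greater than 1 flags an export-base sector, LQ less than 1 an import-base one. A minimal sketch, with invented employment figures (not the Turkish data):

```python
# LQ = (regional sector share) / (national sector share)
region = {"agriculture": 40_000, "manufacturing": 20_000, "services": 40_000}
nation = {"agriculture": 1_000_000, "manufacturing": 2_500_000, "services": 6_500_000}

region_total = sum(region.values())
nation_total = sum(nation.values())

for sector in region:
    lq = (region[sector] / region_total) / (nation[sector] / nation_total)
    label = "export-base" if lq > 1 else "import-base"
    print(f"{sector}: LQ = {lq:.2f} ({label})")
```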
Procedia PDF Downloads 424
7287 Modeling and Simulation of Practical Metamaterial Structures
Authors: Ridha Salhi, Mondher Labidi, Fethi Choubani
Abstract:
Metamaterials have attracted much attention in recent years because of their exquisite electromagnetic properties. We present, in this paper, the modeling of three metamaterial structures by equivalent circuit models. We begin by modeling the SRR (split ring resonator), then we model the HIS (high impedance surface), and finally, we present the model of the CPW (coplanar waveguide). In order to validate the models, we compare the results obtained from the equivalent circuit models with numerical simulation. Keywords: metamaterials, SRR, HIS, CPW, IDC
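In the lumped-element view used by such equivalent circuit models, the SRR behaves as an LC resonator with f0 = 1 / (2*pi*sqrt(L*C)). A minimal sketch; the L and C values below are illustrative, not extracted from the paper.

```python
import math

L = 2.5e-9    # equivalent ring inductance (H), assumed value
C = 0.4e-12   # equivalent gap capacitance (F), assumed value

f0 = 1.0 / (2.0 * math.pi * math.sqrt(L * C))
print(f"SRR resonance = {f0 / 1e9:.2f} GHz")   # about 5 GHz for these values
```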
Procedia PDF Downloads 429
7286 Organic Agriculture Harmony in Nutrition, Environment and Health: Case Study in Iran
Authors: Sara Jelodarian
Abstract:
Organic agriculture is a kind of living and dynamic agriculture that was introduced in the early 20th century. The fundamental basis of organic agriculture is harmony with nature. This version of farming emphasizes removing growth hormones, chemical fertilizers, toxins, radiation, and genetic manipulation, and instead integrating modern scientific techniques (such as biological and microbial control) that lead to the production of healthy food and the preservation of the environment, making use of agricultural products such as forage and manure. Government support for markets producing organic products, and taking advantage of the experiences of other societies successful in this field, can help advance the positive and effective aspects of this technology, especially in developing countries. This research indicates that by 2030, 25% of global agricultural land will be covered by organic farming. Consequently, Iran, due to its rich genetic resources and various climates, can be a pioneer in promoting organic products. In addition, sustainable farming requires a blend of organic and other innovative systems. Important limitations exist in the acceptance of these systems, and a diversity of policy instruments will be required to ease their development and implementation. The paper draws its results from a compilation of reports, periodicals, books, and articles related to the subject, together with library studies and research; likewise, we combined experimental and survey methods to obtain data. Keywords: development, production markets, progress, strategic role, technology
Procedia PDF Downloads 117
7285 Development and Verification of the IDOM Shielding Optimization Tool
Authors: Omar Bouhassoun, Cristian Garrido, César Hueso
Abstract:
Radiation shielding design is an optimization problem with multiple, constrained objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by the designer's previous experience. The result is therefore an empirical solution, but not an optimal one, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capability to read/write input files, run calculations, and parse output files for different radiation transport codes. In the first stage, the software was set up to adjust the input files for two well-known Monte Carlo codes (MCNP and Serpent) and to optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Nevertheless, its modular implementation easily allows the inclusion of more radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems. Keywords: optimization, shielding, nuclear, genetic algorithm
Procedia PDF Downloads 110