Search results for: predicting models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7245

6555 Modeling Fertility and Production of Hazelnut Cultivars through the Artificial Neural Network under Climate Change of Karaj

Authors: Marziyeh Khavari

Abstract:

In recent decades, climate change, global warming, and a growing world population have posed challenges such as rising food consumption and resource shortages. Assessing how climate change could disturb crops, especially hazelnut production, is crucial for sustainable agricultural production. For hazelnut cultivation in mid-warm conditions such as Iran's, we present an investigation of climate parameters and how strongly they affect the fertility and nut production of hazelnut trees. The climate of Iran's northern zones was therefore investigated (1960-2017), revealing an upward trend in temperature. Furthermore, a descriptive analysis performed on six cultivars over seven years shows how even a small-scale survey can demonstrate the effects of climate change on hazelnut production and its stability. Results showed that some climate parameters, such as solar radiation, soil temperature, relative humidity, and precipitation, have a more significant effect on nut production. Moreover, some cultivars, for instance Negret and Segorbe, produced more stable yields, while Mervill de Boliver recorded the greatest variation during the study. A further aim was to train a model to simulate nut production through a neural network and linear regression simulation. The study developed an ANN model and estimated its generalization capability with criteria such as RMSE, SSE, and accuracy factors for the dependent and independent variables (environmental and yield traits). The models were trained and tested, and their accuracy is adequate to predict hazelnut production under fluctuating weather parameters.
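The evaluation criteria named in the abstract (RMSE, SSE) are standard and easy to make concrete; a minimal sketch follows, with hypothetical yield figures rather than the study's data:

```python
import numpy as np

def sse(y_true, y_pred):
    """Sum of squared errors between observed and predicted yields."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sum((y_true - y_pred) ** 2))

def rmse(y_true, y_pred):
    """Root mean squared error, in the same units as the yield itself."""
    return float(np.sqrt(sse(y_true, y_pred) / len(y_true)))

# Hypothetical nut yields (kg per tree): observed vs. model-predicted
observed = [2.1, 1.8, 2.5, 3.0]
predicted = [2.0, 1.9, 2.4, 3.2]
print(sse(observed, predicted), rmse(observed, predicted))
```

A lower RMSE on held-out seasons is what would indicate the trained network generalizes.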

Keywords: climate change, neural network, hazelnut, global warming

Procedia PDF Downloads 113
6554 Comparing Machine Learning Estimation of Fuel Consumption of Heavy-Duty Vehicles

Authors: Victor Bodell, Lukas Ekstrom, Somayeh Aghanavesi

Abstract:

Fuel consumption (FC) is one of the key factors determining the expense of operating a heavy-duty vehicle. A customer may therefore request an estimate of the FC of a desired vehicle. The modular design of heavy-duty vehicles allows their construction by specifying building blocks such as gear box, engine, and chassis type. If the combination of building blocks is unprecedented, it is unfeasible to measure the FC, since this would first require the construction of the vehicle. This paper proposes a machine learning approach to predict FC. The study uses specification and operational environmental-condition information, such as road slopes and driver profiles, from around 40,000 vehicles. All vehicles have diesel engines and a mileage of more than 20,000 km. The data are used to investigate the accuracy of the machine learning algorithms linear regression (LR), k-nearest neighbors (KNN), and artificial neural networks (ANN) in predicting fuel consumption for heavy-duty vehicles. Performance is evaluated by reporting the prediction error on both simulated data and operational measurements, and the algorithms are compared using nested cross-validation and statistical hypothesis testing. The statistical evaluation finds that ANNs have the lowest prediction error compared to LR and KNN on both simulated and operational data. The models have a mean relative prediction error of 0.3% on simulated data and 4.2% on operational data.
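The nested cross-validation procedure described above can be sketched with scikit-learn: an inner loop tunes hyperparameters, while an outer loop gives an unbiased error estimate for each algorithm. The data here are synthetic stand-ins, not the authors' fleet data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in for vehicle specification features (e.g. mass, engine size, mean slope)
X = rng.normal(size=(200, 3))
y = 25 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=200)  # fuel use, L/100 km

outer = KFold(n_splits=5, shuffle=True, random_state=0)
inner = KFold(n_splits=3, shuffle=True, random_state=0)

# Inner loop tunes KNN's k; the outer loop scores the whole tuning procedure
knn = GridSearchCV(KNeighborsRegressor(), {"n_neighbors": [3, 5, 10]},
                   cv=inner, scoring="neg_root_mean_squared_error")
lr = LinearRegression()

knn_rmse = -cross_val_score(knn, X, y, cv=outer, scoring="neg_root_mean_squared_error").mean()
lr_rmse = -cross_val_score(lr, X, y, cv=outer, scoring="neg_root_mean_squared_error").mean()
print(f"KNN RMSE: {knn_rmse:.2f}  LR RMSE: {lr_rmse:.2f}")
```

On this deliberately linear toy data LR wins; the paper's finding is that ANNs win on real fleet data.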

Keywords: artificial neural networks, fuel consumption, friedman test, machine learning, statistical hypothesis testing

Procedia PDF Downloads 157
6553 A Computational Study Concerning the Biological Effects of the Most Commonly Used Phthalates

Authors: Dana Craciun, Daniela Dascalu, Adriana Isvoran

Abstract:

Phthalates are a class of plastic additives used to enhance the physical properties of plastics and as solvents in paints, and some of them have proved to be of particular concern for human health. Data on the health risks of phthalates are insufficient, and further research evaluating their effects in humans is needed. As humans cannot volunteer for such experiments, computational analysis may be used to predict the biological effects of phthalates in humans. Within this study, we used several computational tools (SwissADME, admetSAR, FAFDrugs) to predict the absorption, distribution, metabolism, excretion, and toxicity (ADME-Tox) profiles and pharmacokinetics of the most commonly used phthalates. These tools are based on the quantitative structure-activity relationship (QSAR) modeling approach. The predictions are compared to the known effects of each considered phthalate in humans, and correlations between computational results and experimental data are discussed. Our data revealed that phthalates are a class of compounds exhibiting high toxicity both when ingested and when inhaled, with even greater toxicity by inhalation. The predicted harmful effects of phthalates are toxicity and irritation of the respiratory and gastrointestinal tracts, dyspnea, skin and eye irritation, and disruption of liver and reproductive system function. Many of the investigated phthalates are predicted to inhibit some of the cytochromes involved in the metabolism of numerous drugs, and consequently to reduce the efficiency of administered treatments for many diseases and to intensify adverse drug reactions. The obtained predictions are in good agreement with clinical data concerning the observed effects of some phthalates in cases of acute exposure. Our study emphasizes the possible health effects of numerous phthalates and underlines the applicability of computational methods for predicting the biological effects of xenobiotics.
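The tools named above are web services, but the underlying QSAR idea, fitting a classifier from molecular descriptors to an activity/toxicity label, can be sketched in a few lines. The descriptors and labels below are synthetic, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical descriptors per compound (e.g. scaled logP and molecular weight)
X = rng.normal(size=(120, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # toy "toxic" label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain gradient-descent logistic regression, the core of many QSAR classifiers
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

Real QSAR models use hundreds of computed descriptors and validated training sets; this only shows the mechanism.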

Keywords: phthalates, ADME-Tox, pharmacokinetics, biological effects

Procedia PDF Downloads 235
6552 Models of Environmental Crack Propagation of Some Aluminium Alloys (7xxx)

Authors: H. A. Jawan

Abstract:

This review describes models of environment-related crack propagation in 7xxx-series aluminum alloys developed over the last few decades. Knowledge of the effects of different factors on susceptibility to stress corrosion cracking (SCC) makes it possible to propose plausible mechanisms of crack advancement. A reliable cracking mechanism, in turn, makes it possible to propose the optimum chemical composition and thermal treatment conditions, resulting in the microstructure best suited to the real environmental conditions and stress state.

Keywords: microstructure, environmental, propagation, mechanism

Procedia PDF Downloads 399
6551 Baseline Data from Specialist Obesity Clinic in a Large Tertiary Care Facility, Karachi, Pakistan

Authors: Asma Ahmed, Farah Khalid, Sahlah Sohail, Saira Banusokwalla, Sabiha Banu, Inaara Akbar, Safia Awan, Syed Iqbal Azam

Abstract:

Background and Objectives: Knowledge of obesity as a disease condition and health-seeking behavior regarding its management are grossly lacking. We present data from our multidisciplinary obesity clinic at a large tertiary care facility in Karachi, Pakistan, to provide baseline profiles and outcomes of patients attending these clinics. Methods: 260 patients who attended the obesity clinic between June 2018 and March 2020 were enrolled in this study. The analysis included descriptive statistics and ROC analysis to identify the best cut-offs of the anthropometric measurements for diagnosing obesity-related comorbid conditions. Results: The majority of the studied population were women (72.3%) and employed (43.7%), with a mean age of 35.5 years. Mean BMI was 37.4, waist circumference 112.4 cm, visceral fat 11.7%, and HbA1c 6.9%. The most common comorbidities were hypertension and diabetes mellitus (33% and 31%, respectively). The prevalence of metabolic syndrome (MetS) was 16.3% and was slightly higher in males. Visceral fat was the best predictor of D.M. (0.750; 95% CI: 0.665, 0.836) and MetS (0.709; 95% CI: 0.590, 0.838) compared to total body fat, waist circumference, and BMI. For predicting D.M. and MetS, a visceral fat cutoff above 9.5% in women had the highest sensitivity (80% for D.M. and 79% for MetS) and NPV (92.75% for D.M. and 95% for MetS). Conclusions: This study establishes the characteristics of these obese individuals, which can help inform clinical practice, for example by using visceral fat for earlier identification and counseling-based interventions to prevent more severe surgical interventions later.
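The ROC-based search for the best anthropometric cutoff can be illustrated with a small Youden's J sketch; the visceral fat values and outcomes below are invented, not the clinic's data:

```python
import numpy as np

def best_cutoff(scores, labels):
    """Return the cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, int)
    best_c, best_j = None, -1.0
    for c in np.unique(scores):          # candidate cutoffs, ascending
        pred = scores >= c               # "positive" if at or above the cutoff
        sens = np.mean(pred[labels == 1])
        spec = np.mean(~pred[labels == 0])
        j = sens + spec - 1
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

# Hypothetical visceral fat (%) and diabetes status (1 = D.M.)
fat = [6, 7, 8, 9, 9.5, 10, 11, 12, 13, 14]
status = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]
print(best_cutoff(fat, status))
```

In practice one would also report the AUC and confidence intervals, as the abstract does.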

Keywords: obesity, metabolic syndrome, tertiary care facility, BMI, waist circumference, visceral fat

Procedia PDF Downloads 138
6550 Application of the Micropolar Beam Theory for the Construction of the Discrete-Continual Model of Carbon Nanotubes

Authors: Samvel H. Sargsyan

Abstract:

Alongside the study of the electron-optical properties of nanostructures, and proceeding from experiment-based data, the study of the mechanical properties of nanostructures has become quite topical. For studying the mechanical properties of fullerene, carbon nanotubes, graphene, and other nanostructures, one of the crucial issues is the construction of adequate mathematical models. Among the mathematical models of graphene and carbon nanotubes, the so-called discrete-continual model is especially important: it substitutes the interactions between atoms with elastic beams or springs. The present paper demonstrates the construction of the discrete-continual beam model for carbon nanotubes and graphene, where a micropolar beam model based on the theory of moment elasticity is adopted. Using the energy balance principle, the elastic moment constants of the beam model, expressed through the physical and geometrical parameters of the carbon nanotube or graphene, are determined. By passing from the discrete-continual beam model to a continual one, the micropolar elastic cylindrical shell and the micropolar elastic plate are confirmed as continual models for the carbon nanotube and graphene, respectively.

Keywords: carbon nanotube, discrete-continual, elastic, graphene, micropolar, plate, shell

Procedia PDF Downloads 135
6549 Measurement of Project Success in Construction Using Performance Indices

Authors: Annette Joseph

Abstract:

Background: The construction industry is dynamic owing to increasing uncertainties in technology, budgets, and development processes, making projects more complex; predicting project performance and the chances of success has therefore become difficult. The goal of all parties involved in a construction project is to complete it on schedule, within the planned budget, with the highest quality, and in the safest manner. However, the concept of project success remains ambiguously defined in the minds of construction professionals. Purpose: This paper aims to analyze a project in terms of its performance and to measure its success. Methodology: The parameters for evaluating project success and the indices for measuring the success/performance of a project are identified through a literature study. Through questionnaire surveys aimed at project stakeholders, data are collected from two live case studies (an ongoing and a completed project) on overall performance in terms of success/failure. Finally, with the help of SPSS, the survey data are analyzed and applied to the selected performance indices. Findings: The score calculated using the indices and models helps in assessing the overall performance of the project and in interpreting whether the project will be a success or a failure. This study acts as a reference for firms to carry out performance evaluation and success measurement on a regular basis, helping projects identify areas that are performing well and those that require improvement. Originality & Value: The study shows that, by measuring project performance, a project's deviation towards success or failure can be assessed, suggesting early remedial measures to bring it back on track and ensure that it is completed successfully.

Keywords: project, performance, indices, success

Procedia PDF Downloads 168
6548 Gait Analysis in Total Knee Arthroplasty

Authors: Neeraj Vij, Christian Leber, Kenneth Schmidt

Abstract:

Introduction: Total knee arthroplasty is a common procedure. It is well known that the biomechanics of the knee do not fully return to their normal state, and motion analysis has been used to study knee biomechanics after total knee arthroplasty. The purpose of this scoping review is to summarize the current use of gait analysis in total knee arthroplasty and to identify the preoperative motion analysis parameters for which a systematic review of reliability and validity may be warranted. Materials and Methods: This IRB-exempt scoping review strictly followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist. Five search engines were searched, yielding 279 articles, which underwent title-and-abstract screening followed by full-text screening. Included articles were placed in the following sections: gait analysis as a research tool for operative decisions, other research applications of motion analysis in total knee arthroplasty, gait analysis as a tool for predicting radiologic outcomes, and gait analysis as a tool for predicting clinical outcomes. Results: Eleven articles studied gait analysis as a research tool for operative decisions; motion analysis is currently used to study surgical approaches, surgical techniques, and implant choice. Five articles studied other research applications of motion analysis in total knee arthroplasty, including the role of unicompartmental knee arthroplasty and novel physical therapy protocols aimed at optimizing post-operative care. Two articles studied motion analysis as a tool for predicting radiographic outcomes: preoperative gait analysis has identified parameters that can predict postoperative tibial component migration. Fifteen articles studied motion analysis in conjunction with clinical scores. Conclusions: There is a broad range of applications within the research domain of total knee arthroplasty, and the potential application is likely larger. However, the current literature is limited by vague definitions of 'gait analysis' or 'motion analysis' and by the small number of articles with both preoperative and postoperative functional and clinical measures. Knee adduction moment, knee adduction impulse, total knee range of motion, varus angle, cadence, stride length, and velocity have potential for integration into composite clinical scores. A systematic review aimed at determining the validity, reliability, sensitivity, and specificity of these variables is warranted.

Keywords: motion analysis, joint replacement, patient-reported outcomes, knee surgery

Procedia PDF Downloads 77
6547 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option prices with Black-Scholes models with jumps guarantees that market movement is taken into account. However, such models can only be solved numerically, and not all numerical methods handle them efficiently, because the payoffs are nonsmooth, with discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, reducing the complexity from O(M²) to O(M log M). A partial-fraction form of the Padé schemes is used to avoid the cost of inverting polynomials of matrices. Together, these two tools yield efficient and accurate numerical solutions. We construct a parallel and easy-to-implement version of the numerical scheme, and numerical experiments are given to show how fast and accurate the scheme is.
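Under Merton's model the European call price also has a closed-form benchmark, a Poisson-weighted mixture of Black-Scholes prices, which is commonly used to validate numerical schemes such as the one above. A sketch with illustrative (not the paper's) parameters:

```python
import math

def bs_call(S, K, r, sigma, T):
    """Black-Scholes European call (no jumps)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def merton_call(S, K, r, sigma, T, lam, mu_j, delta, n_terms=50):
    """Merton's series: Poisson-weighted mixture of adjusted BS prices.
    Jumps are lognormal with log-mean mu_j and log-std delta; lam is intensity."""
    kappa = math.exp(mu_j + 0.5 * delta**2) - 1  # mean relative jump size
    lam_p = lam * (1 + kappa)
    price = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma**2 + n * delta**2 / T)
        r_n = r - lam * kappa + n * math.log(1 + kappa) / T
        weight = math.exp(-lam_p * T) * (lam_p * T) ** n / math.factorial(n)
        price += weight * bs_call(S, K, r_n, sigma_n, T)
    return price

# With jump intensity lam = 0 this collapses to plain Black-Scholes
print(merton_call(100, 100, 0.05, 0.2, 1.0, lam=0.5, mu_j=-0.1, delta=0.15))
```

Comparing a numerical ETD/FFT solver against this series is a standard correctness check.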

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 131
6546 Modeling and Simulation Methods Using MATLAB/Simulink

Authors: Jamuna Konda, Umamaheswara Reddy Karumuri, Sriramya Muthugi, Varun Pishati, Ravi Shakya

Abstract:

This paper investigates the challenges involved in the mathematical modeling of plant simulation models, ensuring that the plant models' performance is much closer to the real-time physical model. The paper includes the analysis performed and an investigation of different methods of modeling, design, and development for the plant model. Issues that impact design time, model accuracy as a real-time model, and tool dependence are analyzed. The real-time hardware plant is a combination of multiple physical models, so it is challenging to test the complete system with all possible test scenarios, and unwanted test execution in real time risks failure of or damage to the system.

Keywords: model based design (MBD), MATLAB, Simulink, stateflow, plant model, real time model, real-time workshop (RTW), target language compiler (TLC)

Procedia PDF Downloads 331
6545 Application of Human Biomonitoring and Physiologically-Based Pharmacokinetic Modelling to Quantify Exposure to Selected Toxic Elements in Soil

Authors: Eric Dede, Marcus Tindall, John W. Cherrie, Steve Hankin, Christopher Collins

Abstract:

Current exposure models used in contaminated land risk assessment are highly conservative. Use of these models may lead to over-estimation of actual exposures, possibly resulting in negative financial implications due to unnecessary remediation. We are therefore carrying out a study seeking to improve our understanding of human exposure to selected toxic elements in soil: arsenic (As), cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb) resulting from allotment land use. The study employs biomonitoring and physiologically-based pharmacokinetic (PBPK) modelling to quantify human exposure to these elements. We recruited 37 allotment users (adults > 18 years old) in Scotland, UK, to participate in the study. Concentrations of the elements (and their bioaccessibility) were measured in allotment samples (soil and allotment produce). Records of the produce consumed by the participants and the participants' biological samples (urine and blood) were collected for up to 12 consecutive months. Ethical approval was granted by the University of Reading Research Ethics Committee. PBPK models (coded in MATLAB) were used to estimate the distribution and accumulation of the elements in key body compartments, indicating the internal body burden. Simulating low element intake (based on estimated 'doses' from produce consumption records), the predictive models suggested that detection of these elements in urine and blood was possible within a given period following exposure. This information was used in planning the biomonitoring and is currently being used in the interpretation of test results from biological samples. The models are being evaluated against the biomonitoring data by comparing model-predicted concentrations with measured biomarker concentrations. The PBPK models will be used to generate bioavailability values, which could be incorporated into contaminated land exposure models. The findings from this study will thus promote a more sustainable approach to contaminated land management.
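The compartment kinetics underlying a PBPK model can be illustrated with the simplest possible case, a single compartment with first-order elimination; all parameter values below are hypothetical, not the study's:

```python
import math

def concentration(dose_mg, V_L, k_per_h, t_h):
    """Plasma concentration after a single bolus dose:
    C(t) = (dose / V) * exp(-k t), the solution of dC/dt = -k C."""
    return dose_mg / V_L * math.exp(-k_per_h * t_h)

# Hypothetical element with a 12 h elimination half-life
k = math.log(2) / 12                   # elimination rate constant (1/h)
c0 = concentration(1.0, 5.0, k, 0)     # 1 mg dose into a 5 L volume
c12 = concentration(1.0, 5.0, k, 12)   # one half-life later
print(c0, c12)                         # c12 is half of c0
```

A full PBPK model chains many such compartments (gut, blood, liver, kidney) with flow-dependent transfer rates, but each obeys the same mass-balance ODE form.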

Keywords: biomonitoring, exposure, PBPK modelling, toxic elements

Procedia PDF Downloads 302
6544 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra

Authors: Armin Rahimi

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project in determining mass redistribution within the Earth system. Large deformations caused by earthquakes lie in the high-frequency band, but GRACE can only provide reliable estimates of gravitational change in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals with a 350 km Gaussian filter, commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity change predicted from all slip models became similar at the resolution attainable by GRACE, and the predicted gravity changes were consistent with the GRACE-detected ones. Nevertheless, fault models with different slip amplitudes lead proportionally to different amplitudes in the predicted gravity changes.
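A common way to build such a degree-dependent Gaussian filter for SH coefficients is Jekeli's recursion; the sketch below assumes that convention (normalisation choices vary across the literature), with the weights normalised so the degree-0 weight is 1:

```python
import math

def gaussian_weights(radius_km, max_degree, R_km=6371.0):
    """Degree-dependent damping factors for a Gaussian filter of the given
    averaging radius, via Jekeli's recursion (normalised so W[0] = 1)."""
    b = math.log(2) / (1 - math.cos(radius_km / R_km))
    W = [1.0, (1 + math.exp(-2 * b)) / (1 - math.exp(-2 * b)) - 1 / b]
    for n in range(1, max_degree):
        # three-term recursion in spherical-harmonic degree
        W.append(-(2 * n + 1) / b * W[n] + W[n - 1])
    return W[: max_degree + 1]

W = gaussian_weights(350.0, 60)
print(W[0], W[30], W[60])  # damping increases with degree
```

Each SH coefficient of degree l is multiplied by W[l] before synthesising the smoothed gravity field; the recursion becomes numerically unstable at high degree, which does not matter at GRACE's truncation.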

Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution

Procedia PDF Downloads 339
6543 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4

Authors: Ryan A. Black, Stacey A. McCaffrey

Abstract:

Over the past few decades, great strides have been made towards improving the science of measuring psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now widely used to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ the models and decide which are most appropriate for their line of work. Moreover, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated, the function linking the expected response and the predictor, response option formats, and dimensionality. As a result, inferior methods (i.e., Classical Test Theory methods) continue to be employed to measure psychological constructs, despite evidence that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study provides a comprehensive overview of binary IRT models, that is, measurement models for test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, it covers models from the most basic, the 1-parameter logistic (1-PL) model dating back over 50 years, up to the most recent and complex 4-parameter logistic (4-PL) model. Binary IRT models are defined mathematically and the interpretation of each parameter is provided. Next, all four binary IRT models are employed on two sets of data: (1) simulated data of N = 500,000 subjects who responded to four dichotomous items, and (2) a pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to the emotional consequences of alcohol use. The real-world data were based on responses to items administered as part of a scale-development study (NIDA Grant No. R44 DA023322). The IRT analyses conducted on both the simulated and the real-world pilot data provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses are performed using the new IRT procedure in SAS 9.4. SAS code to generate the simulated data and analyses is available upon request to allow replication of the results.
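The four binary models mentioned share one item response function, with each simpler model obtained by fixing parameters; a minimal sketch (in Python rather than SAS, purely to show the parameterisation):

```python
import math

def irf(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """4-PL item response function: probability of endorsing/answering an
    item correctly at ability theta. a: discrimination, b: difficulty,
    c: lower (guessing) asymptote, d: upper asymptote.
    Fixing d=1 gives the 3-PL; also c=0 the 2-PL; also a=1 the 1-PL."""
    return c + (d - c) / (1 + math.exp(-a * (theta - b)))

print(irf(0.0))                            # 1-PL at theta == difficulty: 0.5
print(irf(3.0, a=2, b=0, c=0.2, d=0.95))   # bounded between c and d
```

Fitting these parameters to response data is what PROC IRT in SAS 9.4 automates.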

Keywords: instrument development, item response theory, latent trait theory, psychometrics

Procedia PDF Downloads 332
6542 Performance of the SOFA and APACHE II Scoring Systems in Predicting the Mortality of ICU Cases

Authors: Yu-Chuan Huang

Abstract:

Introduction: Unplanned transfers to intensive care units carry a higher mortality rate and a longer length of stay, which prevents intensive care beds from being used effectively. This affects the immediate medical treatment of critically ill patients, resulting in a drop in the quality of medical care. Purpose: The purpose of this study was to use SOFA and APACHE II scores to analyze the mortality rate of cases transferred from the emergency department (ED) to the ICU, so that appropriate care can be provided as early as possible according to the score. Methods: This study was a descriptive experimental design. The sample size was estimated at 220 to reach a power of 0.8 for detecting a medium effect size of 0.30 at a 0.05 significance level, using G*Power; allowing for estimated follow-up loss, the required sample size was 242 participants. SOFA and APACHE II scores were calculated from the medical records of cases transferred from the ED to the ICU in 2016. Results: 233 participants met the study criteria, of whom 33 died. Age and sex were not significantly associated with qSOFA or SOFA, nor was sex with APACHE II (p > 0.05). Age correlated with APACHE II in the ED and ICU (r = 0.150 and 0.268, p < 0.001). The scores were associated with mortality risk as follows: ED qSOFA, r = 0.235 (p < 0.001), exp(B) = 1.685 (p = 0.007); ICU SOFA, r = 0.78 (p < 0.001), exp(B) = 1.205 (p < 0.001); APACHE II in the ED and ICU, r = 0.253 and 0.286 (p < 0.001), exp(B) = 1.041 and 1.073 (p = 0.017 and 0.001). For SOFA, a cutoff score above 15 points was identified as a predictor of a 95% mortality risk. Conclusions: The SOFA and APACHE II scores were calculated from initial laboratory data in the emergency department and during the first 24 hours of ICU admission. Both scores are significantly associated with mortality and predict it strongly. Used as early predictors of morbidity and mortality, these scores allow patients to be given a detailed assessment and proper care, thereby reducing mortality and length of stay.
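Cutoff-based performance figures such as sensitivity and NPV follow mechanically from a 2x2 classification against the outcome; the scores and outcomes below are invented for illustration, not the study's records:

```python
def cutoff_performance(scores, deaths, cutoff):
    """Sensitivity and negative predictive value when scores above the
    cutoff are treated as predicting mortality."""
    tp = sum(s > cutoff and d for s, d in zip(scores, deaths))
    fn = sum(s <= cutoff and d for s, d in zip(scores, deaths))
    tn = sum(s <= cutoff and not d for s, d in zip(scores, deaths))
    sensitivity = tp / (tp + fn)   # deaths correctly flagged
    npv = tn / (tn + fn)           # survivors among those not flagged
    return sensitivity, npv

# Hypothetical SOFA scores with outcome (True = died)
sofa = [4, 8, 10, 16, 17, 12, 18, 6, 14, 20]
died = [False, False, False, True, True, False, True, False, True, True]
print(cutoff_performance(sofa, died, cutoff=15))
```

Sweeping the cutoff and plotting sensitivity against 1 - specificity yields the ROC curve from which such thresholds are chosen.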

Keywords: SOFA, APACHE II, mortality, ICU

Procedia PDF Downloads 132
6541 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

Mathematical models are formulated to describe and predict the behavior of a system, and parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, hence imprecise, and moreover too slow to compute efficiently. Such models may therefore be inapplicable to the numerical optimization of real systems, since optimization techniques require numerous evaluations of the model. Moreover, not all quantities necessary for the identification may be available, so the model must be adapted manually. We therefore describe an approach that generates models overcoming these limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for any system, detached from the scientific background, and it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations involving products of variables, not only single variables, which enables a far more precise representation of causal correlations. The basis and justification of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of adapting the generated models in real time during operation. Thereby, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has been tested successfully in several complex applications with strong industrial requirements. The generated models simulated the given systems with a precision error of less than one percent, and the automatic identification of correlations discovered previously unknown relationships. In summary, the approach efficiently computes highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm such as WORHP (We Optimize Really Huge Problems), complex systems can be represented by a high-precision model and optimized according to the user's wishes. The proposed methods are illustrated with different examples.
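The key idea, regressing on products of variables as in a series expansion, can be sketched in a few lines of NumPy with synthetic sensor data (the variable names and coefficients are invented, not the authors'):

```python
import numpy as np

rng = np.random.default_rng(2)
x1, x2 = rng.normal(size=200), rng.normal(size=200)
# Hypothetical sensor output whose behaviour depends on a product of inputs
y = 3.0 * x1 + 2.0 * x1 * x2 + rng.normal(scale=0.01, size=200)

# Regress on the single variables AND their product, as a truncated series expansion
A = np.column_stack([x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 2))  # the product term's coefficient is recovered
```

A single-variable regression would miss the x1*x2 interaction entirely; including product terms is what lets the data-driven model capture such causal couplings.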

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 383
6540 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering

Authors: Sara Hasani

Abstract:

This research focuses on natural sudden-onset disasters, characterised as occurring with little or no warning and often causing excessive injuries far surpassing national response capacities. Based on panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where data were available and could be cross-examined from various humanitarian sources. The records were filtered to the 4,252 disasters for which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure combines pattern recognition techniques with rule-based clustering for prediction, and discrimination analysis was used to validate the results further. The results indicate a relationship between a disaster's human impact and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward that can predict a disaster's human impact, based on its severity rank, in the early hours after a disaster strikes. Predictions in this model are given as worst- and best-case scenarios, which give the lower and upper range of the prediction, respectively. The need for such a framework is highlighted by the fact that, despite existing research, a framework for predicting the human impact and estimating needs at the time of a disaster had yet to be developed. It can further be used to allocate resources in the response phase of a disaster, when data are scarce.
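A minimal sketch of rule-based clustering with best/worst-case prediction ranges follows; the records, rules, and thresholds are invented for illustration (the study's actual variables are disaster type, HDI, DRI, population, and population density):

```python
# Hypothetical historical records: (disaster_type, HDI, population_density, fatalities)
history = [
    ("flood", 0.45, 900, 1200), ("flood", 0.48, 850, 950),
    ("flood", 0.90, 400, 12), ("quake", 0.50, 1100, 4100),
    ("quake", 0.52, 1000, 3800), ("quake", 0.91, 300, 45),
]

def rule_key(dtype, hdi, density):
    """Rule-based cluster: disaster type x development level x crowding."""
    return (dtype, "low-HDI" if hdi < 0.7 else "high-HDI",
            "dense" if density > 500 else "sparse")

clusters = {}
for dtype, hdi, dens, deaths in history:
    clusters.setdefault(rule_key(dtype, hdi, dens), []).append(deaths)

def predict(dtype, hdi, density):
    """Best/worst-case fatality range from the matching cluster's history."""
    impacts = clusters[rule_key(dtype, hdi, density)]
    return min(impacts), max(impacts)

print(predict("flood", 0.5, 880))  # best/worst-case fatality range
```

A new disaster is matched to its rule-based cluster in the early hours, and the cluster's historical spread supplies the lower and upper bounds of the impact estimate.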

Keywords: disaster management, natural disaster, pattern recognition, prediction

Procedia PDF Downloads 138
6539 Adaptation of Requirement Engineering Practices in Pakistan

Authors: Waqas Ali, Nadeem Majeed

Abstract:

Requirement engineering is the essence of the software development life cycle. The more time spent on requirement engineering, the higher the probability of success. Effective requirement engineering ensures and predicts a successful software product. This paper presents the adoption of requirement engineering practices in small and medium-sized companies in Pakistan. The study was conducted through questionnaires to show the extent to which requirement engineering models and practices are followed in Pakistan.

Keywords: requirement engineering, Pakistan, models, practices, organizations

Procedia PDF Downloads 696
6538 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study

Authors: Natália Botica, Luís Luís, Paulo Bernardes

Abstract:

The Côa Valley, listed as World Heritage since 1998, presents more than 1,300 open-air engraved rock panels. The Archaeological Park of the Côa Valley has recorded the rock art motifs, testing various techniques based on direct tracing on the rock using natural and artificial lighting. In this work, integrated in the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vector drawing of the rock art motifs based on orthophotos taken from the photogrammetric survey and 3D models of the rocks. We also present the information system designed to integrate the vector drawings and the characterisation data of the motifs, as well as their open-access sharing, in order to promote reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and its iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated.

Keywords: rock art, archaeology, iron age, 3D models

Procedia PDF Downloads 65
6537 Models of Environmental Crack Propagation of Some Aluminum Alloys (7xxx)

Authors: H. Jawan

Abstract:

This review describes models of environment-related crack propagation in aluminum alloys (7xxx) developed during the last few decades. Knowledge of the effects of different factors on susceptibility to stress corrosion cracking (SCC) makes it possible to propose plausible mechanisms of crack advancement. A reliable cracking mechanism, in turn, makes it possible to propose the optimum chemical composition and thermal treatment conditions, resulting in the microstructure most suitable for the actual environmental conditions and stress state.

Keywords: microstructure, environmental, propagation, mechanism

Procedia PDF Downloads 374
6536 Assessing Performance of Data Augmentation Techniques for a Convolutional Network Trained for Recognizing Humans in Drone Images

Authors: Masood Varshosaz, Kamyar Hasanpour

Abstract:

In recent years, we have seen growing interest in recognizing humans in drone images for post-disaster search and rescue operations. Deep learning algorithms have shown great promise in this area, but they often require large amounts of labeled data to train the models. To keep data acquisition costs low, augmentation techniques can be used to create additional data from existing images. Many such techniques can generate variations of an original image to improve the performance of deep learning algorithms. While data augmentation is generally assumed to improve the accuracy and robustness of the models, it is important to ensure that the performance gains are not outweighed by the additional computational cost or complexity of implementing the techniques. To this end, the impact of data augmentation on the performance of deep learning models must be evaluated. In this paper, we evaluated the most common currently available 2D data augmentation techniques on a standard convolutional network trained for recognizing humans in drone images. The techniques include rotation, scaling, random cropping, flipping, shifting, and their combinations. The results showed that the augmented models perform 1-3% better than the base network. However, as the augmented images only contain the human parts already visible in the original images, a new data augmentation approach is needed to include the invisible parts of the human body. We therefore suggest a new method that employs simulated 3D human models to generate new training data for the network.
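The basic 2D augmentations compared here (flipping, rotation, shifting) can be sketched on a plain list-of-lists "image"; a production pipeline would use a deep-learning framework's own transforms, so this is only an illustration of the operations themselves.

```python
# Toy 2D augmentations on a list-of-lists image; real pipelines would use a
# deep-learning framework's transform utilities, so these are illustrative only.
def hflip(img):
    """Horizontal flip (mirror each row)."""
    return [row[::-1] for row in img]

def vflip(img):
    """Vertical flip (reverse row order)."""
    return img[::-1]

def rot90(img):
    """Rotate 90 degrees counter-clockwise."""
    return [list(row) for row in zip(*img)][::-1]

def shift_right(img, fill=0):
    """Shift each row one pixel to the right, padding with `fill`."""
    return [[fill] + row[:-1] for row in img]

img = [[1, 2],
       [3, 4]]
augmented = [hflip(img), vflip(img), rot90(img), shift_right(img)]
```

Each call yields one variant of the original image, which is how a small labeled set is expanded before training.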

Keywords: human recognition, deep learning, drones, disaster mitigation

Procedia PDF Downloads 74
6535 Pricing European Continuous-Installment Options under Regime-Switching Models

Authors: Saghar Heidari

Abstract:

In this paper, we study the valuation problem of European continuous-installment options under Markov-modulated models using a partial differential equation approach. Due to the opportunity to continue or stop paying installments, the valuation problem under regime-switching models can be formulated as coupled partial differential equations (CPDE) with free boundary features. To value the installment options, we express the truncated CPDE as a linear complementarity problem (LCP); a finite element method is then proposed to solve the resulting variational inequality. Under some appropriate assumptions, we establish the stability of the method and present numerical results examining the rate of convergence and accuracy of the proposed method for the pricing problem under the regime-switching model.
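The complementarity structure of the LCP can be illustrated without any finite element machinery: projected SOR (PSOR) is a classic iterative LCP solver and stands in here for the paper's method. The matrix and vector below are invented toy data, not a discretized option-pricing problem.

```python
# Hedged sketch of a linear complementarity problem (LCP) like the one the
# truncated CPDE reduces to, solved with projected SOR (PSOR) rather than the
# paper's finite element method. Find z >= 0 with M z + q >= 0 and
# z . (M z + q) = 0; M and q are invented toy data.
def psor(M, q, omega=1.0, tol=1e-10, max_iter=1000):
    """Projected SOR for LCP(M, q); assumes M has a positive diagonal."""
    n = len(q)
    z = [0.0] * n
    for _ in range(max_iter):
        err = 0.0
        for i in range(n):
            residual = q[i] + sum(M[i][j] * z[j] for j in range(n))
            z_new = max(0.0, z[i] - omega * residual / M[i][i])
            err = max(err, abs(z_new - z[i]))
            z[i] = z_new
        if err < tol:
            break
    return z

z = psor([[2.0, 0.0], [0.0, 2.0]], [-2.0, 1.0])  # solution: z = [1, 0]
```

At the solution, each component either sits on the constraint (z_i = 0) or makes its residual vanish, which is exactly the free-boundary dichotomy of the installment option.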

Keywords: continuous-installment option, European option, regime-switching model, finite element method

Procedia PDF Downloads 118
6534 A Comparative Analysis of Machine Learning Techniques for PM10 Forecasting in Vilnius

Authors: Mina Adel Shokry Fahim, Jūratė Sužiedelytė Visockienė

Abstract:

With growing concern over air pollution (AP), the issue has gained more prominence than ever before, particularly once its damage to human health, reflected in poor air quality indices (AQI), is understood. The study focuses on assessing air pollution prediction models specifically for Lithuania, addressing a substantial need for empirical research within the region. Concentrating on Vilnius, it specifically examines particulate matter with a diameter of 10 micrometers or less (PM10). Using Gaussian Process Regression (GPR), Regression Tree Ensemble, and Regression Tree methodologies, predictive forecasting models are validated and tested on hourly data from January 2020 to December 2022. The study explores the classification of AP data into anthropogenic and natural sources, the impact of AP on human health, and its connection to cardiovascular diseases. The study revealed varying levels of accuracy among the models, with GPR achieving the highest accuracy, indicated by an RMSE of 4.14 in validation and 3.89 in testing.
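The models are ranked by RMSE; a minimal sketch of that metric, with invented PM10 values (micrograms per cubic metre) purely for illustration:

```python
# RMSE, the comparison metric used for the forecasting models; the PM10
# observations and predictions below are invented for illustration only.
import math

def rmse(y_true, y_pred):
    """Root-mean-square error between observed and predicted series."""
    n = len(y_true)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)

observed  = [20.0, 35.0, 50.0, 28.0]
predicted = [22.0, 31.0, 47.0, 30.0]
error = rmse(observed, predicted)
```

A lower RMSE on held-out hourly data is what distinguishes the GPR model from the tree-based alternatives in the study.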

Keywords: air pollution, anthropogenic and natural sources, machine learning, Gaussian process regression, tree ensemble, forecasting models, particulate matter

Procedia PDF Downloads 33
6533 Combining Laser Scanning and High Dynamic Range Photography for the Presentation of Bloodstain Pattern Evidence

Authors: Patrick Ho

Abstract:

Bloodstain Pattern Analysis (BPA) forensic evidence can be complex, requiring effective courtroom presentation to ensure clear and comprehensive understanding of the analyst’s findings. BPA witness statements can often involve reference to spatial information (such as location of rooms, objects, walls) which, when coupled with classified blood patterns, may illustrate the reconstructed movements of suspects and injured parties. However, it may be difficult to communicate this information through photography alone, despite this remaining the UK’s established method for presenting BPA evidence. Through an academic-police partnership between the University of Warwick and West Midlands Police (WMP), an integrated 3D scanning and HDR photography workflow for BPA was developed. Homicide scenes were laser scanned and, after processing, the 3D models were utilised in the BPA peer-review process. The same 3D models were made available for court but were not always utilised. This workflow has improved the ease of presentation for analysts and provided 3D scene models that assist with the investigation. However, the effects of incorporating 3D scene models in judicial processes may need to be studied before they are adopted more widely. 3D models from a simulated crime scene and West Midlands Police cases approved for conference disclosure are presented. We describe how the workflow was developed and integrated into established practices at WMP, including peer-review processes and witness statement delivery in court, and explain the impact the work has had on the Criminal Justice System in the West Midlands.

Keywords: bloodstain pattern analysis, forensic science, criminal justice, 3D scanning

Procedia PDF Downloads 70
6532 A Graph-Based Retrieval Model for Passage Search

Authors: Junjie Zhong, Kai Hong, Lei Wang

Abstract:

Passage Retrieval (PR) plays an important role in many Natural Language Processing (NLP) tasks. Traditional efficient retrieval models that rely on exact term-matching, such as TF-IDF or BM25, have nowadays been surpassed by pre-trained language models that match by semantics. Though more effective, deep language models often incur large memory and time costs. To tackle the trade-off between efficiency and effectiveness in PR, this paper proposes Graph Passage Retriever (GraphPR), a graph-based model inspired by developments in graph learning techniques. Unlike existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are regarded as nodes in the constructed graph and are embedded as dense vectors; PR can then be implemented using the embeddings and a fast vector-similarity search. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models on several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1-score) while maintaining relatively low query latency and memory usage.
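Since the passage graph is seeded from BM25 results, the BM25 scoring function itself is worth sketching. The toy corpus, query, and parameter choices (common defaults k1 = 1.5, b = 0.75) below are illustrative, not taken from the paper.

```python
# Compact sketch of BM25 scoring over a tokenized toy corpus; the graph model
# in the paper builds on top of rankings like these.
import math

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Score each tokenized document against the query terms."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    scores = [0.0] * N
    for term in query_terms:
        df = sum(1 for d in docs if term in d)  # document frequency
        if df == 0:
            continue
        idf = math.log(1 + (N - df + 0.5) / (df + 0.5))
        for i, d in enumerate(docs):
            tf = d.count(term)
            denom = tf + k1 * (1 - b + b * len(d) / avgdl)
            scores[i] += idf * tf * (k1 + 1) / denom
    return scores

docs = [["passage", "retrieval", "graph"],
        ["language", "model", "semantics"]]
scores = bm25_scores(["graph", "retrieval"], docs)
```

The top-ranked passages under this score become the candidate nodes whose embeddings the GCN-like model then refines.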

Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model

Procedia PDF Downloads 110
6531 Fault Diagnosis of Squirrel-Cage Induction Motor by a Neural Network Multi-Models

Authors: Yahia Kourd, N. Guersi, D. Lefebvre

Abstract:

In this paper, we propose to study fault diagnosis in squirrel-cage induction motors using MLP neural networks. We use neural models of healthy and faulty behaviour in order to detect and isolate faults in the machine. In the first part of this work, we created a neural model for the healthy state using Matlab and a motor located in the LGEB laboratory, acquiring input and output data from this machine. We then detected faults in the machine by residual generation. These residuals are not sufficient to isolate the existing faults; for this reason, we propose additional neural networks to represent the faulty behaviours. From the analysis of these residuals and the choice of a threshold, we propose a method capable of performing the detection and diagnosis of some faults in asynchronous machines with a squirrel-cage rotor.
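The residual-generation step can be sketched as follows: compare measured outputs with a healthy-model prediction and raise an alarm when the residual crosses a chosen threshold. The healthy model here is a placeholder linear function, not the trained MLP, and all numbers are invented.

```python
# Hedged sketch of residual-based fault detection: the healthy model below is
# a placeholder stand-in for the trained MLP, and the data are invented.
def healthy_model(u):
    """Stand-in for the trained healthy-state neural model."""
    return 2.0 * u  # illustrative linear behaviour

def detect_faults(inputs, measured, threshold=0.5):
    """Return sample indices where |measured - predicted| exceeds threshold."""
    faults = []
    for k, (u, y) in enumerate(zip(inputs, measured)):
        residual = abs(y - healthy_model(u))
        if residual > threshold:
            faults.append(k)
    return faults

# samples 0-2 match the healthy model closely; sample 3 deviates
alarms = detect_faults([1.0, 2.0, 3.0, 4.0], [2.0, 4.1, 5.9, 10.5])
```

Isolation, as the abstract notes, needs more than this: additional models of each faulty behaviour are required to tell which fault produced the residual.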

Keywords: faults diagnosis, neural networks, multi-models, squirrel-cage induction motor

Procedia PDF Downloads 612
6530 Location Quotients Model in Turkey’s Provinces and NUTS II Regions

Authors: Semih Sözer

Abstract:

One of the most common issues in economic systems is understanding the characteristics of economic activities in cities and regions. Although economic base models attract criticism on conceptual and empirical grounds, they are useful tools for examining the economic structure of a nation, region, or city. This paper uses one such methodology, the location quotients model. Data for the model comprise employment numbers for provinces and NUTS II regions in Turkey, with a time series covering the years 1990, 2000, 2003, and 2009. The aim of this study is to find which sectors are export-base and which are import-base in each province and region. Model results show that large provinces and powerful regions (by population, size, etc.) mostly have basic sectors in their economic systems. However, interesting findings emerged for different sectors in different provinces and regions.
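The location quotient itself is a one-line formula: LQ = (e_i / e) / (E_i / E), the ratio of a sector's local employment share to its national share, with LQ > 1 conventionally read as an export-base sector. The employment figures below are invented for illustration.

```python
# Location quotient LQ = (local sector share) / (national sector share);
# employment figures are invented for illustration.
def location_quotient(local_sector, local_total, national_sector, national_total):
    """LQ > 1 suggests an export-base sector; LQ < 1 an import-base one."""
    return (local_sector / local_total) / (national_sector / national_total)

# a province with 20,000 of 100,000 jobs in a sector, vs 1M of 10M nationally
lq = location_quotient(20_000, 100_000, 1_000_000, 10_000_000)
is_export_base = lq > 1
```

Applied across every sector and every province or NUTS II region, this yields the export-base/import-base classification the study reports.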

Keywords: economic base, location quotients model, regional economics, regional development

Procedia PDF Downloads 412
6529 Modeling and Simulation of Practical Metamaterial Structures

Authors: Ridha Salhi, Mondher Labidi, Fethi Choubani

Abstract:

Metamaterials have attracted much attention in recent years because of their exquisite electromagnetic properties. In this paper, we present the modeling of three metamaterial structures by equivalent circuit models. We begin by modeling the SRR (Split Ring Resonator), then the HIS (High Impedance Surface), and finally the CPW (Coplanar Waveguide). To validate the models, we compare the results obtained from the equivalent circuit models with numerical simulations.
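In the equivalent-circuit view, an SRR behaves as an LC resonator, so its resonant frequency follows the standard formula f0 = 1 / (2π √(LC)). The inductance and capacitance values below are invented, chosen only to land in a typical microwave range.

```python
# Resonant frequency of an LC equivalent circuit, the core of the SRR model;
# the L and C values are invented placeholders, not fitted to any structure.
import math

def srr_resonance(L, C):
    """Resonant frequency in Hz: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

f0 = srr_resonance(L=2.5e-9, C=0.4e-12)  # henries, farads -> ~5 GHz range
```

Fitting L and C so that f0 matches the full-wave simulation is essentially what validating the equivalent circuit against numerical results amounts to.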

Keywords: metamaterials, SRR, HIS, CPW, IDC

Procedia PDF Downloads 406
6528 Effective Charge Coupling in Low Dimensional Doped Quantum Antiferromagnets

Authors: Suraka Bhattacharjee, Ranjan Chaudhury

Abstract:

The interaction between the charge degrees of freedom in itinerant antiferromagnets is investigated in terms of a generalized charge stiffness constant corresponding to the nearest-neighbour t-J model and the t1-t2-t3-J model. Low-dimensional hole-doped antiferromagnets are well-known systems that can be described by t-J-like models. Accordingly, we have used these models to investigate fermionic pairing possibilities and the coupling between the itinerant charge degrees of freedom. A detailed comparison between spin and charge couplings highlights that the two show very similar behaviour in the over-doped region, whereas they follow completely different trends in the lower doping regimes. Moreover, a qualitative equivalence between the generalized charge stiffness and the effective Coulomb interaction is also established, based on comparisons with other theoretical and experimental results. It is thus evident that the enhanced possibility of fermionic pairing is inherent in the reduction of Coulomb repulsion with increasing doping concentration. However, this increased possibility cannot give rise to pairing without some other pair-producing mechanism outside the t-J model. One can therefore conclude that the t-J-like models by themselves are not capable of producing conventional momentum-based superconducting pairing on their own.

Keywords: generalized charge stiffness constant, charge coupling, effective Coulomb interaction, t-J-like models, momentum-space pairing

Procedia PDF Downloads 140
6527 Assimilating Remote Sensing Data Into Crop Models: A Global Systematic Review

Authors: Luleka Dlamini, Olivier Crespo, Jos van Dam

Abstract:

Accurately estimating crop growth and yield is pivotal for timely, sustainable agricultural management and for ensuring food security. Crop models and remote sensing can complement each other and, when combined, form a robust analysis tool to improve crop growth and yield estimations. This study thus aims to systematically evaluate how research that exclusively focuses on assimilating remote sensing (RS) data into crop models varies among countries, crops, data assimilation methods, and farming conditions. A strict search string was applied in the Scopus and Web of Science databases, and 497 potential publications were obtained. After screening for relevance with predefined inclusion/exclusion criteria, 123 publications were considered in the final review. Results indicate that over 81% of the studies were conducted in countries associated with high socio-economic and technological advancement, mainly China, the United States of America, France, Germany, and Italy. Many of these studies integrated MODIS or Landsat data into WOFOST to improve crop growth and yield estimation of staple crops at the field and regional scales. Most studies use recalibration or updating methods alongside various algorithms to assimilate remotely sensed leaf area index into crop models. However, these methods cannot account for the uncertainties in the remote sensing observations or in the crop model itself. Over 85% of the studies were based on commercial and irrigated farming systems. Despite great global interest in data assimilation into crop models, limited research has been conducted in resource- and data-limited regions such as Africa. We foresee great potential for such applications in those conditions and therefore advocate facilitating and expanding the use of this approach, from which developing farming communities could benefit.
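The "updating" assimilation step can be sketched as a scalar Kalman-style blend: nudge the crop model's leaf area index (LAI) state toward a remotely sensed observation, weighted by the two error variances. This is a minimal illustration of the principle, not any reviewed study's implementation, and all numbers are invented.

```python
# Minimal scalar Kalman-style update: blend a modelled LAI state with a
# remotely sensed LAI observation by their error variances. Values invented.
def assimilate_lai(model_lai, obs_lai, model_var, obs_var):
    """Return the updated LAI state and its reduced error variance."""
    gain = model_var / (model_var + obs_var)      # trust the obs more when
    updated = model_lai + gain * (obs_lai - model_lai)  # the model is uncertain
    updated_var = (1.0 - gain) * model_var
    return updated, updated_var

state, var = assimilate_lai(model_lai=3.0, obs_lai=3.8,
                            model_var=0.4, obs_var=0.4)
```

With equal variances the update lands halfway between model and observation; the review's point about unaccounted uncertainties is precisely that these variances are rarely known well in practice.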

Keywords: crop models, remote sensing, data assimilation, crop yield estimation

Procedia PDF Downloads 106