Search results for: ensembled models
6052 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors
Authors: Sudhir Kumar Singh, Debashish Chakravarty
Abstract:
Slope stability analysis is an important aspect of geotechnical engineering. It is also important from a safety and economic point of view, as any slope failure leads to the loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the Factor of Safety. Numerous cases have been studied for analyzing the stability of slopes using the popular Finite Element Method, and the data thus obtained have been used as training data for the supervised machine learning models. The input data have been trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data have been used for measuring the performance and accuracy of the different models. Although all models have performed well on the test dataset, Random Forest stands out from the others due to its high accuracy of greater than 95%, providing a valuable tool at our disposal that is neither computationally expensive nor time consuming and is in good accordance with the numerical analysis results.
Keywords: finite element method, geotechnical engineering, machine learning, slope stability
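For illustration, a minimal sketch of the kind of classification workflow this abstract describes: training a Random Forest on FEM-derived stability labels. The data and feature names (saturation, rainfall, seismic coefficient, joint-set count) are synthetic stand-ins, not the authors' dataset:

```python
# Hedged sketch, not the authors' code: a Random Forest labelling slopes as
# stable/unstable from FEM-style features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Columns: degree of saturation, rainfall intensity, seismic coefficient, joint-set count
X = rng.random((500, 4))
y = (X @ np.array([-1.0, -0.8, -1.2, -0.5]) + 1.8 > 0).astype(int)  # 1 = FoS above threshold

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {accuracy_score(y_te, model.predict(X_te)):.2f}")
```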
Procedia PDF Downloads 101
6051 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes
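A hedged sketch of the cross-validated regression evaluation described above, using scikit-learn's gradient boosting in place of the full model suite; the features and targets are synthetic stand-ins for the WBS attributes:

```python
# Hedged sketch: 5-fold cross-validated regression of cost impact on synthetic
# task attributes (hypothetical stand-ins for the features in the abstract).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.random((400, 5))  # productivity rate, change magnitude, timing, est. cost, duration
y = 3 * X[:, 1] + 2 * X[:, 3] + 0.5 * rng.standard_normal(400)  # synthetic cost impact

model = GradientBoostingRegressor(random_state=0)
mse = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()
r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
print(f"CV MSE: {mse:.3f}, CV R^2: {r2:.3f}")
```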
Procedia PDF Downloads 39
6050 Churn Prediction for Savings Bank Customers: A Machine Learning Approach
Authors: Prashant Verma
Abstract:
Commercial banks are facing immense pressure, including financial disintermediation, interest rate volatility, and the digitalization of finance. Retaining an existing customer is 5 to 25 times less expensive than acquiring a new one. This paper explores customer churn prediction based on various statistical and machine learning models and uses under-sampling to improve the predictive power of these models. The results show that, of the various machine learning models, Random Forest, which predicts churn with 78% accuracy, is the most powerful model for this scenario. Customer vintage, customer's age, average balance, occupation code, population code, average withdrawal amount, and average number of transactions were found to be the variables with high predictive power for the churn prediction model. The model can be deployed by commercial banks in order to avoid customer churn, so that they may retain the funds kept by savings bank (SB) customers. The article suggests a customized campaign to be initiated by commercial banks to avoid SB customer churn. Hence, by giving better customer satisfaction and experience, commercial banks can limit customer churn and maintain their deposits.
Keywords: savings bank, customer churn, customer retention, random forests, machine learning, under-sampling
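A minimal sketch of the under-sampling step the abstract relies on, assuming synthetic data and a scikit-learn Random Forest; it is not the bank's model:

```python
# Hedged sketch: random under-sampling of the majority (non-churn) class
# before fitting a Random Forest, mirroring the approach described.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
X = rng.random((2000, 6))                  # vintage, age, balance, codes, withdrawals, txns
y = (rng.random(2000) < 0.1).astype(int)   # ~10% churners (imbalanced, synthetic)

churn, stay = np.where(y == 1)[0], np.where(y == 0)[0]
keep = rng.choice(stay, size=len(churn), replace=False)  # under-sample majority class
idx = np.concatenate([churn, keep])

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[idx], y[idx])
print("churn rate in balanced training set:", y[idx].mean())
```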
Procedia PDF Downloads 143
6049 Comparing Performance of Neural Network and Decision Tree in Prediction of Myocardial Infarction
Authors: Reza Safdari, Goli Arji, Robab Abdolkhani, Maryam Zahmatkeshan
Abstract:
Background and purpose: Cardiovascular diseases are among the most common diseases in all societies. The most important step in minimizing myocardial infarction and its complications is to minimize its risk factors. The amount of medical data is growing rapidly, and medical data mining has great potential for transforming these data into information. Using data mining techniques to generate predictive models for identifying those at risk, in order to reduce the effects of the disease, is very helpful. The present study aimed to collect data related to risk factors of myocardial infarction from patients' medical records and to develop predictive models using data mining algorithms. Methods: The present work was an analytical study conducted on a database containing 350 records. Data were related to patients admitted to Shahid Rajaei specialized cardiovascular hospital, Iran, in 2011. Data were collected using a four-sectioned data collection form. Data analysis was performed using SPSS and Clementine version 12. Seven predictive algorithms and one algorithm-based model for predicting association rules were applied to the data. Accuracy, precision, sensitivity, specificity, as well as positive and negative predictive values were determined, and the final model was obtained. Results: Five parameters, including hypertension, DLP, tobacco smoking, diabetes, and A+ blood group, were the most critical risk factors of myocardial infarction. Among the models, the neural network model was found to have the highest sensitivity, indicating its ability to successfully diagnose the disease. Conclusion: Risk prediction models have great potential to facilitate the management of a patient with a specific disease. Therefore, health interventions or changes in lifestyle can be conducted based on these models to improve the health conditions of the individuals at risk.
Keywords: decision trees, neural network, myocardial infarction, data mining
Procedia PDF Downloads 429
6048 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models
Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko
Abstract:
The present paper describes a high-performance parallel realization of an exact diagonalization solver for quantum-electron models on a shared-memory computing system. The proposed algorithm contains a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
Keywords: sparse matrix, compressed format, Hubbard model, Anderson model
Procedia PDF Downloads 402
6047 Optimizing the Passenger Throughput at an Airport Security Checkpoint
Authors: Kun Li, Yuzheng Liu, Xiuqi Fan
Abstract:
High security standards and high screening efficiency seem to be contradictory in the airport security check process. Improving efficiency as far as possible while maintaining the same security standard is significantly meaningful. This paper utilizes knowledge of operations research and stochastic processes to establish mathematical models to explore this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models from queuing theory to describe it. We find that the least efficient part is the Pre-Check lane, the bottleneck of the queuing system. To improve passenger throughput and reduce the variance of passengers' waiting time, we adjust our models, apply the Monte Carlo method, and put forward three modifications: adjust the ratio of Pre-Check lanes to regular lanes flexibly, determine the optimal number of security screening lines based on cost analysis, and adjust the distribution of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
Keywords: queuing theory, security check, stochastic process, Monte Carlo simulation
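The M/G/1 analysis mentioned above admits a closed-form mean wait via the Pollaczek-Khinchine formula; a small sketch with illustrative arrival and screening figures (not the paper's data):

```python
# Hedged sketch: mean waiting time in an M/G/1 queue from the
# Pollaczek-Khinchine formula, W = lambda * E[S^2] / (2 * (1 - rho)).
lam = 1.5          # passenger arrivals per minute (illustrative)
mean_s = 0.5       # mean screening time in minutes (general distribution)
var_s = 0.15       # variance of screening time

rho = lam * mean_s             # server utilisation, must be < 1 for stability
e_s2 = var_s + mean_s**2       # second moment of the service time
wait = lam * e_s2 / (2 * (1 - rho))
print(f"utilisation {rho:.2f}, mean queue wait {wait:.2f} min")
```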
Procedia PDF Downloads 200
6046 Application of Signature Verification Models for Document Recognition
Authors: Boris M. Fedorov, Liudmila P. Goncharenko, Sergey A. Sybachin, Natalia A. Mamedova, Ekaterina V. Makarenkova, Saule Rakhimova
Abstract:
In modern economic conditions, the correct recognition of a signature on digital documents, in order to verify an expression of will or confirm a certain operation, is a relevant problem. Additional processing complexity lies in the dynamic variability of the signature for each individual, as well as in the way the information is processed, because a signature constitutes biometric data. The article discusses the use of artificial intelligence models to improve the quality of signature confirmation in document recognition. An analysis of several possible options for using such models is carried out. The results of the study are given, showing that it is possible to correctly determine the authenticity of a signature even on small samples.
Keywords: signature recognition, biometric data, artificial intelligence, neural networks
Procedia PDF Downloads 148
6045 Analog Input Output Buffer Information Specification Modelling Techniques for Single Ended Inter-Integrated Circuit and Differential Low Voltage Differential Signaling I/O Interfaces
Authors: Monika Rawat, Rahul Kumar
Abstract:
Input/Output Buffer Information Specification (IBIS) models are used for describing the analog behavior of the input/output (I/O) buffers of a digital device. They are widely used to perform signal integrity analysis. Advantages of using IBIS models include their simple structure, IP protection, and fast simulation time with reasonable accuracy. As the design complexity of drivers and receivers increases, capturing the exact behavior of a transistor-level model in an IBIS model becomes an essential task for achieving better accuracy. In this paper, an improvement to the existing methodology for generating IBIS models for complex I/O interfaces such as Inter-Integrated Circuit (I2C) and Low Voltage Differential Signaling (LVDS) is proposed. Furthermore, the accuracy and computational performance of the standard method and the proposed approach with respect to SPICE are presented. The investigations will be useful for further improving the accuracy of IBIS models and enhancing their wider acceptance.
Keywords: IBIS, signal integrity, open-drain buffer, low voltage differential signaling, behavior modelling, transient simulation
Procedia PDF Downloads 196
6044 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans
Authors: Rene Hellmuth
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity) lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Use as a planning basis for restructuring measures in factories only succeeds if the BIM model has adequate data quality. Under this aspect and the industrial requirement, three data quality factors are particularly important for this paper regarding the BIM model: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied in a short period of time on the construction site during conversion measures? An as-is analysis is made of how BIM models and digital factory models (including laser scans) are currently kept up to date. Industrial companies are interviewed, and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for how cost-effective and time-saving updating processes can be carried out. The availability of low-cost hardware and the simplicity of the process are important to enable service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and the insertion into the overall digital twin. Finally, an overview of the possibilities for visualizations suitable for construction sites is compiled. An augmented reality application is created based on an updated BIM model of a factory and installed on a tablet. Conversion scenarios with costs and time expenditure are displayed. A user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: an as-is analysis of current update processes for BIM models and laser scans, the development of a time-saving and cost-effective update process, and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.
Keywords: building information modeling, digital factory model, factory planning, restructuring
Procedia PDF Downloads 114
6043 Surface-Enhanced Raman Spectroscopy on Gold Nanoparticles in the Kidney Disease
Authors: Leonardo C. Pacheco-Londoño, Nataly J Galan-Freyle, Lisandro Pacheco-Lugo, Antonio Acosta-Hoyos, Elkin Navarro, Gustavo Aroca-Martinez, Karin Rondón-Payares, Alberto C. Espinosa-Garavito, Samuel P. Hernández-Rivera
Abstract:
At the Life Science Research Center at Simon Bolivar University, a primary focus is the diagnosis of various diseases, and the use of gold nanoparticles (Au-NPs) in diverse biomedical applications is continually expanding. In the present study, Au-NPs were employed as substrates for Surface-Enhanced Raman Spectroscopy (SERS) aimed at diagnosing kidney diseases arising from Lupus Nephritis (LN), preeclampsia (PC), and Hypertension (H). Discrimination models for distinguishing patients with and without kidney diseases were developed from the SERS signals of urine samples by partial least squares-discriminant analysis (PLS-DA). A comparative study of the Raman signals across the three conditions was conducted, leading to the identification of potential metabolite signals. Additionally, a secondary analysis was performed using machine learning (ML) models, wherein different ML algorithms were evaluated for their efficiency. The models were validated using cross-validation and external validation, and performance parameters such as sensitivity and specificity were determined; the models showed average values of 0.9 for both parameters. Finally, it is worth highlighting that this collaborative effort involved two university research centers and two healthcare institutions, ensuring ethical treatment and informed consent for patient samples.
Keywords: SERS, Raman, PLS-DA, kidney diseases
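A hedged sketch of PLS-DA as commonly implemented: PLS regression against a binary class label, thresholded at 0.5. The spectra below are synthetic stand-ins for the SERS signals:

```python
# Hedged sketch: PLS-DA on synthetic "spectra" with a planted marker band.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.standard_normal((120, 300))           # 120 spectra x 300 Raman shifts (synthetic)
y = (rng.random(120) < 0.5).astype(float)     # 1 = kidney disease, 0 = control
X[y == 1, 50:60] += 1.0                       # synthetic marker band for the disease class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = (pls.predict(X_te).ravel() > 0.5).astype(float)
sens = pred[y_te == 1].mean()                 # sensitivity
spec = 1 - pred[y_te == 0].mean()             # specificity
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}")
```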
Procedia PDF Downloads 45
6042 Plot Scale Estimation of Crop Biophysical Parameters from High Resolution Satellite Imagery
Authors: Shreedevi Moharana, Subashisa Dutta
Abstract:
The present study focuses on the estimation of crop biophysical parameters, such as crop chlorophyll, nitrogen, and water stress, at plot scale in crop fields. To achieve this, we have used high-resolution LISS IV satellite imagery. A new methodology has been proposed in this research work: the spectral shape function of the paddy crop is employed to identify the significant wavelengths sensitive to paddy crop parameters. From the shape functions, regression index models were established for the critical wavelength with the minimum and maximum wavelengths of the multi-spectral high-resolution LISS IV data. Moreover, the functional relationships were utilized to develop the index models. From these index models, crop biophysical parameters were estimated and mapped from LISS IV imagery at plot scale at the crop field level. The results showed that the nitrogen content of the paddy crop varied from 2-8%, chlorophyll from 1.5-9%, and water content from 40-90%. It was observed that the variability in the rice agriculture system in India was purely a function of field topography.
Keywords: crop parameters, index model, LISS IV imagery, plot scale, shape function
Procedia PDF Downloads 168
6041 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods
Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie
Abstract:
The early warning of high stream flow events (HSF) and floods is an important aspect of the management of surface water and river systems. This process can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting the real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation in the modelling process. A total of 8753 patterns of hourly Q, water level, and rainfall records representing a one-year period (2011) were utilized in the modelling process. Six hydrological scenarios were arranged through hypothetical cases of input variables to investigate how changes in rainfall (RF) intensity at upstream stations can lead to the formation of floods. The initial stream flow was changed for each scenario in order to include a wide range of hydrological situations in this study. The performance evaluation of the developed AI-based model shows that a high correlation coefficient (R) between the observed and predicted Q is achieved. The AI-based model has been successfully employed in early warning through the advance detection of hydrological conditions that could lead to the formation of floods and HSF, represented by three levels of severity (i.e., alert, warning, and danger). Based on the results of the scenarios, reaching the danger level in the downstream area required high RF intensity in at least two upstream areas. From these applications, it can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence
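A minimal sketch of the three-level early-warning step described above; the streamflow thresholds are hypothetical placeholders, not the study's calibrated values:

```python
# Hedged sketch: mapping a predicted hourly streamflow onto the alert/warning/
# danger levels named in the abstract. All threshold values are hypothetical.
def warning_level(q_pred: float) -> str:
    """Classify predicted streamflow Q (m^3/s) into a severity level."""
    if q_pred >= 300:      # hypothetical danger threshold
        return "danger"
    if q_pred >= 200:      # hypothetical warning threshold
        return "warning"
    if q_pred >= 120:      # hypothetical alert threshold
        return "alert"
    return "normal"

for q in (90.0, 150.0, 250.0, 340.0):
    print(q, "->", warning_level(q))
```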
Procedia PDF Downloads 248
6040 The Prediction of Effective Equation on Drivers' Behavioral Characteristics of Lane Changing
Authors: Khashayar Kazemzadeh, Mohammad Hanif Dasoomi
Abstract:
Given the increasing volume of traffic, lane changing plays a crucial role in traffic flow. Lane changing in traffic depends on several factors, including road geometric design, speed, and drivers' behavioral characteristics. A great deal of research has been carried out in these fields. Among these significant factors, drivers' behavioral characteristics of lane changing are emphasized in this paper. An effective equation based on the personal characteristics of lane changing has been predicted using regression models.
Keywords: effective equation, lane changing, drivers' behavioral characteristics, regression models
Procedia PDF Downloads 450
6039 Developing and Evaluating Clinical Risk Prediction Models for Coronary Artery Bypass Graft Surgery
Authors: Mohammadreza Mohebbi, Masoumeh Sanagou
Abstract:
The ability to predict clinical outcomes is of great importance to physicians and clinicians. A number of different methods have been used in an effort to accurately predict these outcomes. These methods include the development of scoring systems based on multivariate statistical modelling and models involving the use of classification and regression trees. The process usually consists of two consecutive phases, namely model development and external validation. The model development phase consists of building a multivariate model and evaluating its predictive performance by examining calibration and discrimination, followed by internal validation. External validation tests the predictive performance of a model by assessing its calibration and discrimination in different but plausibly related patients. A motivating example focusing on prediction modelling using a sample of patients who underwent coronary artery bypass graft (CABG) surgery is used for illustrative purposes, and a set of primary considerations for evaluating prediction model studies, using specific quality indicators as criteria to help stakeholders evaluate the quality of a prediction model study, is proposed.
Keywords: clinical prediction models, clinical decision rule, prognosis, external validation, model calibration, biostatistics
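A brief sketch of the two performance checks the abstract names, discrimination (ROC area) and calibration (observed versus predicted risk by decile), on synthetic CABG-style data:

```python
# Hedged sketch: discrimination and calibration checks for a risk model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
X = rng.standard_normal((1000, 4))                       # hypothetical predictors
y = (rng.random(1000) < 1 / (1 + np.exp(-X[:, 0]))).astype(int)

p = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
print("discrimination (ROC AUC):", round(roc_auc_score(y, p), 3))

bins = np.digitize(p, np.quantile(p, np.linspace(0.1, 0.9, 9)))
for d in range(10):                                      # calibration by risk decile
    m = bins == d
    if m.any():
        print(f"decile {d}: predicted {p[m].mean():.2f}, observed {y[m].mean():.2f}")
```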
Procedia PDF Downloads 297
6038 An Approach for Pattern Recognition and Prediction of Information Diffusion Model on Twitter
Authors: Amartya Hatua, Trung Nguyen, Andrew Sung
Abstract:
In this paper, we study the information diffusion process on Twitter as a multivariate time series problem. Our model concerns three measures (volume, network influence, and sentiment of tweets) based on 10 features, and we collected 27 million tweets to build our information diffusion time series dataset for analysis. Then, different time series clustering techniques with Dynamic Time Warping (DTW) distance were used to identify different patterns of information diffusion. Finally, we built information diffusion prediction models for new hashtags, which comprise two phases: the first phase is recognizing the pattern using k-NN with DTW distance; the second phase is building the forecasting model using the traditional Autoregressive Integrated Moving Average (ARIMA) model and the non-linear recurrent neural network of Long Short-Term Memory (LSTM). Preliminary results of the performance evaluation between different forecasting models show that LSTM with clustering information notably outperforms the other models. Therefore, our approach can be applied in real-world applications to analyze and predict the information diffusion characteristics of selected topics or memes (hashtags) on Twitter.
Keywords: ARIMA, DTW, information diffusion, LSTM, RNN, time series clustering, time series forecasting, Twitter
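A plain dynamic-programming implementation of the DTW distance that underlies both the clustering and the k-NN phase; the series values are illustrative:

```python
# Hedged sketch: classic O(n*m) dynamic time warping distance between two series.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

volume_a = np.array([1.0, 3.0, 7.0, 4.0, 2.0])       # e.g. hourly tweet volume, series A
volume_b = np.array([1.0, 2.0, 3.0, 8.0, 5.0, 2.0])  # series B, different length is fine
print("DTW distance:", dtw_distance(volume_a, volume_b))
```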
Procedia PDF Downloads 391
6037 Construction of QSAR Models to Predict Potency on a Series of substituted Imidazole Derivatives as Anti-fungal Agents
Authors: Sara El Mansouria Beghdadi
Abstract:
Quantitative structure–activity relationship (QSAR) modelling is one of the main computational tools used in medicinal chemistry. Over the past two decades, the incidence of fungal infections has increased due to the development of resistance. In this study, QSAR modelling was performed on a series of esters of 2-carboxamido-3-(1H-imidazol-1-yl)propanoic acid derivatives. These compounds have shown moderate to very good antifungal activity. Multiple linear regression (MLR) was used to generate linear 2D-QSAR models. The dataset consists of 115 compounds with their antifungal activity (log MIC) against Candida albicans (ATCC SC5314). Descriptors were calculated, and different models were generated using the ChemOffice, Avogadro, and GaussView software. The selected model was validated. The study suggests that an increase in lipophilicity and a reduction in the electronic character of the substituent in R1, as well as a reduction in the steric hindrance of the substituent in R2 and its aromatic character, support the potentiation of the antifungal effect. The results of the QSAR study could help scientists propose new compounds with higher antifungal activities intended for immunocompromised patients susceptible to multi-resistant nosocomial infections.
Keywords: quantitative structure–activity relationship, imidazole, antifungal, Candida albicans (ATCC SC5314)
Procedia PDF Downloads 84
6036 The Design of the Questionnaire of Attitudes in Physics Teaching
Authors: Ricardo Merlo
Abstract:
Attitude is a hypothetical construct that can be meaningfully measured to determine the favorable or unfavorable predisposition that students have towards the teaching of sciences such as Physics. Although state-of-the-art attitude tests used in Physics teaching have followed different design and validation models for different groups of students, the weight given to each dimension that supports the attitude has scarcely been evaluated. In this work, a methodology for the attitude questionnaire construction process is proposed that allows the teacher to design and validate the measurement instrument for different subjects of Physics at the university level developed in the classroom, according to the weight given to the affective, knowledge, and behavioural dimensions. Finally, questionnaire models were tested for the case of incoming university students, achieving significant results in the improvement of Physics teaching.
Keywords: attitude, physics teaching, motivation, academic performance
Procedia PDF Downloads 69
6035 Testing and Validation Stochastic Models in Epidemiology
Authors: Snigdha Sahai, Devaki Chikkavenkatappa Yellappa
Abstract:
This study outlines approaches for testing and validating stochastic models used in epidemiology, focusing on the integration and functional testing of simulation code. It details methods for combining simple functions into comprehensive simulations, distinguishing between deterministic and stochastic components, and applying tests to ensure robustness. Techniques include isolating stochastic elements, utilizing large sample sizes for validation, and handling special cases. Practical examples are provided using R code to demonstrate integration testing, handling of incorrect inputs, and special cases. The study emphasizes the importance of both functional and defensive programming to enhance code reliability and user-friendliness.
Keywords: computational epidemiology, epidemiology, public health, infectious disease modeling, statistical analysis, health data analysis, disease transmission dynamics, predictive modeling in health, population health modeling, quantitative public health, random sampling simulations, randomized numerical analysis, simulation-based analysis, variance-based simulations, algorithmic disease simulation, computational public health strategies, epidemiological surveillance, disease pattern analysis, epidemic risk assessment, population-based health strategies, preventive healthcare models, infection dynamics in populations, contagion spread prediction models, survival analysis techniques, epidemiological data mining, host-pathogen interaction models, risk assessment algorithms for disease spread, decision-support systems in epidemiology, macro-level health impact simulations, socioeconomic determinants in disease spread, data-driven decision making in public health, quantitative impact assessment of health policies, biostatistical methods in population health, probability-driven health outcome predictions
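A hedged sketch in Python (the paper's examples are in R) of the test styles described: a seeded regression test for the stochastic component, a large-sample statistical check, and a defensive check on invalid input:

```python
# Hedged sketch: testing a stochastic simulation component. The Poisson model
# of secondary infections here is a hypothetical illustration.
import numpy as np

def draw_secondary_infections(r0: float, n: int, rng: np.random.Generator) -> np.ndarray:
    """Poisson draws of secondary infections; defensive on invalid R0."""
    if r0 < 0:
        raise ValueError("R0 must be non-negative")
    return rng.poisson(r0, size=n)

# Deterministic regression test: a seeded generator must reproduce exactly.
assert np.array_equal(draw_secondary_infections(2.0, 5, np.random.default_rng(42)),
                      draw_secondary_infections(2.0, 5, np.random.default_rng(42)))

# Large-sample statistical test: the sample mean should approach R0.
sample = draw_secondary_infections(2.0, 100_000, np.random.default_rng(0))
assert abs(sample.mean() - 2.0) < 0.05
print("stochastic checks passed")
```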
Procedia PDF Downloads 6
6034 From the Sharing Economy to Social Manufacturing: Analyzing Collaborative Service Networks in the Manufacturing Domain
Authors: Babak Mohajeri
Abstract:
In recent years, the conventional business model of ownership has shifted towards accessibility in a variety of markets. Two trends can be observed in the evolution of this rental-like business model. The first is the technological development that enables the emergence of new business models, which increasingly become agile and flexible. For example, Spotify, an online music streaming company, provides consumers access to millions of music tracks, conveniently through a smartphone, tablet, or computer. Similarly, Car2Go, the car sharing company, provides its members with flexible access to nearby shared cars. The second trend is the increase in communication and connections via social networks, which enables a shift to peer-to-peer accessibility-based business models. Conventionally, companies provide their customers access to the company's own products or services. In the peer-to-peer model, by contrast, companies facilitate access and connections across their customers so that they can use other customers' property, skills, competencies, or services. This is the so-called sharing economy business model. The aim of this study is to investigate a new and emerging type of sharing economy model in which the roles of customers and service providers may change dramatically. This new model is called Collaborative Service Networks. We propose a mechanism for the Collaborative Service Networks business model. Uber and Airbnb, two successful and growing companies, have been selected for our case studies, and their business models are analyzed. Finally, we study the emergence of collaborative service networks in the manufacturing domain. Our findings lead to a new manufacturing paradigm called social manufacturing.
Keywords: sharing economy, collaborative service networks, social manufacturing, manufacturing development
Procedia PDF Downloads 317
6033 Comparative Operating Speed and Speed Differential Day and Night Time Models for Two Lane Rural Highways
Authors: Vinayak Malaghan, Digvijay Pawar
Abstract:
Speed is the independent parameter that plays a vital role in highway design. The design consistency of highways is checked based on the variation in operating speed. Often, design consistency fails to meet drivers' expectations, which results in a difference between the operating and design speeds. Literature reviews have shown that a significant number of crashes take place on horizontal curves due to a lack of design consistency. This paper focuses on a continuous speed profile study of the tangent-to-curve transition for both daytime and nighttime. Data were collected using a GPS device, which gives a continuous speed profile along with other parameters, such as acceleration and deceleration, which were analyzed along the tangent-to-curve transition. In the present study, models were developed to predict operating speed on tangents and horizontal curves, as well as a model indicating the speed reduction from tangent to curve, based on continuous speed profile data. It was observed from the study that vehicles tend to decelerate from the approach tangent up to a point between the beginning and the midpoint of the curve, and then accelerate from the curve to the tangent transition. The models generated were compared for day and night and can be used in road safety improvement by evaluating geometric design consistency.
Keywords: operating speed, design consistency, continuous speed profile data, day and night time
Procedia PDF Downloads 157
6032 A Basic Modeling Approach for the 3D Protein Structure of Insulin
Authors: Daniel Zarzo Montes, Manuel Zarzo Castelló
Abstract:
Proteins play a fundamental role in biology, but their structure is complex, and it is a challenge for teachers to conceptually explain the differences between their primary, secondary, tertiary, and quaternary structures. On the other hand, there are currently many computer programs for visualizing the 3D structure of proteins, but they require advanced training and knowledge. Moreover, it is difficult to visualize the sequence of amino acids in these models and how the protein conformation is reached. Given this drawback, a simple and instructive procedure is proposed for teaching protein structure to undergraduate and graduate students. For this purpose, insulin has been chosen because it is a protein that consists of 51 amino acids, a relatively small number. The methodology consists of the use of plastic atom models, which are frequently used in organic chemistry and biochemistry to explain the chirality of biomolecules. For didactic purposes, when the aim is to teach the biochemical foundations of proteins, a manipulative system seems convenient, starting from the chemical structure of amino acids. It has the advantage that the bonds between amino acids can be conveniently rotated, following the pattern marked by the 3D models. First, the 51 amino acids were modeled, and then they were linked according to the sequence of this protein. Next, the three disulfide bonds that characterize the stability of insulin were established, and then the alpha-helix structure was formed. In order to reach the tertiary 3D conformation of this protein, different interactive models available on the Internet were visualized. In conclusion, the proposed methodology seems very suitable for biology and biochemistry students because they can learn the fundamentals of protein modeling by means of a manipulative procedure as a basis for understanding the functionality of proteins. This methodology would be particularly useful for a biology or biochemistry laboratory practice, either at the pre-graduate or university level.
Keywords: protein structure, 3D model, insulin, biomolecule
Procedia PDF Downloads 55
6031 Numerical Model Validation Using Durbin Method
Authors: H. Al-Hajeri
Abstract:
The computation of the effectiveness of turbulence enhancement surface features, such as ribs as a means of promoting mixing and hence heat transfer, has attracted the continued attention of the engineering community. In this study, the simulation of a three-dimensional cooling passage is carried out employing a number of turbulence models, including the Durbin model. The cooling passage consists of a square-section duct whose upper and lower surfaces feature staggered cuboid ribs. The main objective of this paper is to provide comparisons of the performance of the v2-f model against other established turbulence models as implemented in the commercial CFD code Ansys Fluent. The present study demonstrates that the v2-f model can successfully capture the isothermal air flow phenomena in flow over obstacles.
Keywords: CFD, cooling passage, Durbin model, turbulence model
Procedia PDF Downloads 503
6030 A Sliding Mesh Technique and Compressibility Correction Effects of Two-Equation Turbulence Models for a Pintle-Perturbed Flow Analysis
Authors: J. Y. Heo, H. G. Sung
Abstract:
Numerical simulations have been performed to assess the compressibility correction of two-equation turbulence models suitable for large-scale separated flows perturbed by pintle strokes. In order to take pintle movement into account, a sliding mesh method was applied. The chamber pressure, mass flow rate, and thrust were analyzed, and the response lag and sensitivity at the chamber and nozzle were estimated for a movable pintle. The nozzle performance during pintle reciprocation, i.e., the insertion and extraction processes, was analyzed to better understand the dynamic performance of the pintle nozzle.
Keywords: pintle, sliding mesh, turbulence model, compressibility correction
Procedia PDF Downloads 489
6029 Hate Speech Detection Using Deep Learning and Machine Learning Models
Authors: Nabil Shawkat, Jamil Saquer
Abstract:
Social media has accelerated our ability to engage with others and eliminated many communication barriers. On the other hand, the widespread use of social media has resulted in an increase in online hate speech, which has drastic impacts on vulnerable individuals and societies. Therefore, it is critical to detect hate speech to prevent innocent users and vulnerable communities from becoming victims of it. We investigate the performance of different deep learning and machine learning algorithms on three different datasets. Our results show that the BERT model gives the best performance among all the models, achieving an F1-score of 90.6% on one of the datasets and F1-scores of 89.7% and 88.2% on the other two datasets.
Keywords: hate speech, machine learning, deep learning, abusive words, social media, text classification
Procedia PDF Downloads 136
6028 Enhancing Sell-In and Sell-Out Forecasting Using Ensemble Machine Learning Method
Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang
Abstract:
Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry and an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location in which it operates. Each product has its own sell-in and sell-out time series data, which are forecasted on a weekly and monthly scale for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with the Amazon Machine Learning Solutions Lab, has developed an in-house solution using machine learning models for forecasting. Similar products are combined such that there is one model for each product category. In this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is flexible enough to include any new product or eliminate any existing product in a product category based on requirements. We show how the machine learning development environment on Amazon Web Services (AWS) can be used to explore a set of forecasting models and create business intelligence dashboards that can be used with the existing demand planning tools in Nestlé. We explored recent deep neural network (DNN) architectures, which show promising results for a variety of time series forecasting problems. Specifically, we used a DeepAR autoregressive model that can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and include domain-specific knowledge, we designed an ensemble approach using DeepAR and an XGBoost regression model. As part of the ensemble approach, we interlinked the sell-out and sell-in information to ensure that a future sell-out influences the current sell-in predictions. Our approach outperforms the benchmark statistical models by more than 50%. The machine learning (ML) pipeline implemented in the cloud is currently being extended to other product categories and is being adopted by other geomarkets.
Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series
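A minimal sketch of the ensemble step only, combining a DeepAR-style forecast with an XGBoost-style forecast via a weighted average; both forecasts are mocked arrays here (in practice they would come from the respective trained models), and the weight would be tuned on a validation window:

```python
# Hedged sketch: the aggregation step of a two-member forecast ensemble.
# The forecast arrays are hypothetical placeholders, not model output.
import numpy as np

deepar_forecast = np.array([120.0, 135.0, 150.0, 160.0])  # weekly sell-in units
xgb_forecast = np.array([110.0, 140.0, 145.0, 170.0])

w = 0.6  # ensemble weight; would be tuned on held-out validation weeks
ensemble = w * deepar_forecast + (1 - w) * xgb_forecast
print("ensemble forecast:", ensemble)
```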
Procedia PDF Downloads 273
6027 Forecasting 24-Hour Ahead Electricity Load Using Time Series Models
Authors: Ramin Vafadary, Maryam Khanbaghi
Abstract:
Forecasting electricity load is important for various purposes, such as planning, operation, and control. Forecasts can save operating and maintenance costs, increase the reliability of power supply and delivery systems, and support correct decisions for future development. This paper compares various time series methods for forecasting electricity load 24 hours ahead. The methods considered are Holt-Winters smoothing, SARIMA modeling, an LSTM network, Fbprophet, and TensorFlow Probability. The performance of each method is evaluated using the forecasting accuracy criteria, namely the mean absolute error and root mean square error. The National Renewable Energy Laboratory (NREL) residential energy consumption data are used to train the models. The results of this study show that the SARIMA model is superior to the others for 24-hour-ahead forecasts. Furthermore, a bagging technique is used to make the predictions more robust. The obtained results show that by bagging multiple time-series forecasts, we can improve the robustness of the models for 24-hour-ahead electricity load forecasting.
Keywords: bagging, Fbprophet, Holt-Winters, LSTM, load forecast, SARIMA, TensorFlow probability, time series
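A hedged sketch of a 24-hour-ahead SARIMA forecast on synthetic hourly load, with a simple average over candidate orders standing in for the paper's bagging step (statsmodels assumed):

```python
# Hedged sketch: SARIMA forecasting of synthetic hourly load with a daily
# cycle; averaging member forecasts is a stand-in for the bagging described.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(5)
hours = np.arange(24 * 30)                      # 30 days of hourly observations
load = 50 + 10 * np.sin(2 * np.pi * hours / 24) + rng.standard_normal(hours.size)

forecasts = []
for order in [(1, 0, 1), (2, 0, 1), (1, 0, 2)]:  # candidate non-seasonal orders
    model = SARIMAX(load, order=order, seasonal_order=(1, 0, 1, 24)).fit(disp=False)
    forecasts.append(model.forecast(steps=24))   # 24-hour-ahead forecast

bagged = np.mean(forecasts, axis=0)              # aggregate the member forecasts
print("24h-ahead forecast, first 6 hours:", np.round(bagged[:6], 1))
```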
Procedia PDF Downloads 95
6026 Hidden Markov Movement Modelling with Irregular Data
Authors: Victoria Goodall, Paul Fatti, Norman Owen-Smith
Abstract:
Hidden Markov Models have become popular for the analysis of animal tracking data. These models are being used to model the movements of a variety of species in many areas around the world. A common assumption of the model is that the observations need to have regular time steps. In many ecological studies, this will not be the case. The objective of the research is to modify the movement model to allow for irregularly spaced locations and to investigate the effect on the inferences that can be made about the latent states. A modification of the likelihood function to allow for these irregularly spaced locations is investigated, without using interpolation or averaging the movement rate. The suitability of the modification is investigated using GPS tracking data for lion (Panthera leo) in South Africa, with many observations obtained during the night and few observations during the day. Many nocturnal predator tracking studies are set up in this way, to obtain many locations at night when the animal is most active and is difficult to observe. Few observations are obtained during the day, when the animal is expected to rest and is potentially easier to observe. Modifying the likelihood function allows the popular Hidden Markov Model framework to be used to model these irregularly spaced locations, making use of all the observed data.
Keywords: hidden Markov models, irregular observations, animal movement modelling, nocturnal predator
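One plausible way to adapt the forward likelihood to irregular gaps, not necessarily the authors' exact modification, is to raise the hourly transition matrix to the number of hours elapsed between observations:

```python
# Hedged sketch: HMM forward log-likelihood with gap-adjusted transitions
# (P^dt for an integer gap of dt hours). All values are toy illustrations.
import numpy as np

P = np.array([[0.9, 0.1],          # hourly transitions: resting vs. moving
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])          # initial state distribution

def forward_loglik(emission_probs: np.ndarray, gaps_hours: np.ndarray) -> float:
    """emission_probs[t, s] = P(obs_t | state s); gaps_hours[t] = hours since obs t-1."""
    alpha = pi * emission_probs[0]
    loglik = np.log(alpha.sum()); alpha /= alpha.sum()   # scaled forward recursion
    for t in range(1, len(emission_probs)):
        P_dt = np.linalg.matrix_power(P, int(gaps_hours[t]))  # gap-adjusted step
        alpha = (alpha @ P_dt) * emission_probs[t]
        loglik += np.log(alpha.sum()); alpha /= alpha.sum()
    return loglik

obs = np.array([[0.8, 0.1], [0.2, 0.7], [0.7, 0.2]])  # toy emission likelihoods
print("log-likelihood:", forward_loglik(obs, np.array([0, 1, 6])))
```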
Procedia PDF Downloads 244
6025 Comparison of Adsorbents for Ammonia Removal from Mining Wastewater
Authors: F. Al-Sheikh, C. Moralejo, M. Pritzker, W. A. Anderson, A. Elkamel
Abstract:
Ammonia in mining wastewater is a significant problem, and treatment can be especially difficult in cold climates where biological treatment is not feasible. An adsorption process is one of the alternative processes that can be used to reduce ammonia concentrations to acceptable limits, and therefore a LEWATIT strongly acidic H+-form ion exchange resin and a Bowie chabazite Na-form AZLB-Na zeolite were tested to assess their effectiveness. For these adsorption tests, two packed bed columns (a mini-column constructed from a 32-cm long x 1-cm diameter piece of glass tubing, and a 60-cm long x 2.5-cm diameter Ace Glass chromatography column) were used containing varying quantities of the adsorbents. A mining wastewater with an ammonia concentration of 22.7 mg/L was fed through the columns at controlled flowrates. In the experimental work, the maximum capacities of the LEWATIT ion exchange resin were 0.438, 0.448, and 1.472 mg/g for 3, 6, and 9 g, respectively, in the mini-column, and 1.739 mg/g for 141.5 g in the larger Ace column, while the capacities of the AZLB-Na zeolite were 0.424 and 0.784 mg/g for 3 and 6 g, respectively, in the mini-column, and 1.1636 mg/g for 38.5 g in the Ace column. In the theoretical work, Thomas, Adams-Bohart, and Yoon-Nelson models were constructed to describe the breakthrough curve of the adsorption process and to find the constants of the above-mentioned models. In the regeneration tests, 5% hydrochloric acid, HCl (v/v), and 10% sodium hydroxide, NaOH (w/v), were used to regenerate the LEWATIT resin and AZLB-Na zeolite, with 44 and 63.8% recovery, respectively. In conclusion, continuous flow adsorption using a LEWATIT ion exchange resin and an AZLB-Na zeolite is efficient when using a co-flow technique for removal of ammonia from wastewater. The Thomas, Adams-Bohart, and Yoon-Nelson models satisfactorily fit the data, with R² close to 1 in all cases.
Keywords: AZLB-Na zeolite, continuous adsorption, Lewatit resin, models, regeneration
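A minimal sketch of fitting the Yoon-Nelson breakthrough model, C/C0 = 1/(1 + exp(k(tau - t))), with scipy; the column data are synthetic, not the reported measurements:

```python
# Hedged sketch: least-squares fit of the Yoon-Nelson breakthrough model
# to synthetic column data (illustrative values only).
import numpy as np
from scipy.optimize import curve_fit

def yoon_nelson(t, k, tau):
    """C/C0 as a function of time; tau is the time to 50% breakthrough."""
    return 1.0 / (1.0 + np.exp(k * (tau - t)))

t = np.linspace(0, 600, 30)                          # minutes of column operation
c_ratio = yoon_nelson(t, 0.02, 300) + np.random.default_rng(6).normal(0, 0.02, t.size)

(k_fit, tau_fit), _ = curve_fit(yoon_nelson, t, c_ratio, p0=[0.01, 250])
print(f"k = {k_fit:.4f} 1/min, tau (50% breakthrough) = {tau_fit:.1f} min")
```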
Procedia PDF Downloads 389
6024 Towards a Standardization in Scheduling Models: Assessing the Variety of Homonyms
Authors: Marcel Rojahn, Edzard Weber, Norbert Gronau
Abstract:
Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities, and through this inconsistency, many synergistic effects are lost. Theories and models would be more understandable and reusable if a common terminology were applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling through a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept. The comparability, reusability, and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The acceptance within the community of the variables and notation forms used is shown by means of a compliance quotient. This is demonstrated through the evaluation of 240 scientific publications on planning methods.
Keywords: job-shop scheduling, terminology, notation, standardization
Procedia PDF Downloads 109
6023 Assessment of Material Type, Diameter, Orientation and Closeness of Fibers in Vulcanized Reinforced Rubbers
Authors: Ali Osman Güney, Bahattin Kanber
Abstract:
In this work, the effect of the material type, diameter, orientation, and closeness of fibers on the general performance of reinforced vulcanized rubbers is investigated using the finite element method with experimental verification. Various fiber materials, such as hemp, nylon, and polyester, are used for different fiber diameters, orientations, and closeness values. 3D finite element models are developed by considering bonded contact elements at the fiber-rubber sheet interfaces. The fibers are assumed to be linear elastic, while the vulcanized rubber is considered hyper-elastic. After an experimental verification of the finite element results, the developed models are analyzed under a prescribed displacement that causes tension. The normal stresses in the fibers and the shear stresses between the fibers and the rubber sheet are investigated in all models. The large deformation of the reinforced rubber sheet is also represented for various fiber conditions under incremental loading. A general assessment is achieved regarding the best fiber properties of reinforced rubber sheets under tension-load conditions.
Keywords: reinforced vulcanized rubbers, fiber properties, out of plane loading, finite element method
Procedia PDF Downloads 346