Search results for: price prediction
2780 On the Creep of Concrete Structures
Authors: A. Brahma
Abstract:
Analysis of deferred deformations of concrete under sustained load shows that creep plays a leading role in the deferred deformations of concrete structures. Knowledge of the creep characteristics of concrete is a necessary starting point in the design of structures for crack control. Such knowledge will enable the designer to estimate the probable deformation in reinforced or pre-stressed concrete, and the appropriate steps can be taken in design to accommodate this movement. In this study, we propose a prediction model that involves the principal parameters acting on the deferred behaviour of concrete structures. For the estimation of the model parameters, the Levenberg-Marquardt method has proven very satisfactory. A comparison between the experimental results and the predictions of the designed model shows that it is well suited to describe the evolution of the creep of concrete structures.
Keywords: concrete structure, creep, modelling, prediction
Procedia PDF Downloads 291
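As a rough illustration of the kind of Levenberg-Marquardt parameter estimation the abstract mentions, the sketch below fits a generic hyperbolic creep law to invented data points; the model form, coefficients, and data are assumptions for illustration and do not reproduce the paper's model.

```python
# Minimal sketch of Levenberg-Marquardt parameter estimation for a creep model.
# The hyperbolic model form and the synthetic data are assumptions for illustration.
import numpy as np
from scipy.optimize import curve_fit

def creep_model(t, eps_ult, t_half):
    """Generic hyperbolic creep law: strain grows toward eps_ult with time t (days)."""
    return eps_ult * t / (t_half + t)

# Hypothetical measured creep strains (microstrain) at ages in days
t_obs = np.array([7, 14, 28, 90, 180, 365, 730], dtype=float)
eps_obs = np.array([210, 310, 420, 610, 690, 760, 800], dtype=float)

# method='lm' selects the Levenberg-Marquardt algorithm (unconstrained least squares)
params, cov = curve_fit(creep_model, t_obs, eps_obs, p0=[800.0, 30.0], method="lm")
eps_ult, t_half = params
print(f"fitted ultimate creep strain: {eps_ult:.1f} microstrain, half-time: {t_half:.1f} days")
```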
2779 Improved 3D Structure Prediction of Beta-Barrel Membrane Proteins by Using Evolutionary Coupling Constraints, Reduced State Space and an Empirical Potential Function
Authors: Wei Tian, Jie Liang, Hammad Naveed
Abstract:
Beta-barrel membrane proteins are found in the outer membrane of gram-negative bacteria, mitochondria, and chloroplasts. They carry out diverse biological functions, including pore formation, membrane anchoring, enzyme activity, and bacterial virulence. In addition, beta-barrel membrane proteins increasingly serve as scaffolds for bacterial surface display and nanopore-based DNA sequencing. Due to difficulties in experimental structure determination, they are sparsely represented in the protein structure databank and computational methods can help to understand their biophysical principles. We have developed a novel computational method to predict the 3D structure of beta-barrel membrane proteins using evolutionary coupling (EC) constraints and a reduced state space. Combined with an empirical potential function, we can successfully predict strand register at > 80% accuracy for a set of 49 non-homologous proteins with known structures. This is a significant improvement from previous results using EC alone (44%) and using empirical potential function alone (73%). Our method is general and can be applied to genome-wide structural prediction.
Keywords: beta-barrel membrane proteins, structure prediction, evolutionary constraints, reduced state space
Procedia PDF Downloads 618
2778 Project Time Prediction Model: A Case Study of Construction Projects in Sindh, Pakistan
Authors: Tauha Hussain Ali, Shabir Hussain Khahro, Nafees Ahmed Memon
Abstract:
Accurate prediction of project time at the planning and bid preparation stage should be based on realistic dates. Constructors use their experience to estimate the duration of new projects, which is based on intuition. It has been a constant concern to both researchers and constructors to analyze the accurate prediction of project duration at the bid preparation stage. In Pakistan, studies of the time-cost relationship for predicting duration performance of construction projects have been lacking. This study is an attempt to explore the time-cost relationship and concludes with a mathematical model to predict the time for drainage rehabilitation projects in the province of Sindh, Pakistan. The data have been collected from National Engineering Services Pakistan (NESPAK), and regression analysis has been carried out for the analysis of results. A significant relationship has been found between time and cost of construction projects in Sindh, and the generated mathematical model can be used by constructors to predict the project duration for upcoming projects of a similar nature. This study also provides professionals with the requisite knowledge to make decisions regarding project duration, which is significantly important to win projects at the bid stage.
Keywords: BTC Model, project time, relationship of time cost, regression
Procedia PDF Downloads 382
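The keywords mention a BTC (Bromilow time-cost) type model, T = K * C^B, which is typically fitted by log-log regression. The sketch below shows that fitting procedure on invented cost/duration figures; the actual NESPAK drainage-project data and fitted coefficients are not reproduced here.

```python
# Sketch of a Bromilow-type time-cost (BTC) model, T = K * C^B, fitted by
# log-log linear regression. The cost/duration figures are invented placeholders.
import numpy as np

cost = np.array([12, 25, 40, 75, 120, 200], dtype=float)      # e.g. project cost (million PKR)
duration = np.array([8, 11, 14, 19, 24, 30], dtype=float)     # project duration (months)

# ln(T) = ln(K) + B * ln(C)  ->  ordinary least squares on the log-transformed data
B, lnK = np.polyfit(np.log(cost), np.log(duration), deg=1)
K = np.exp(lnK)
print(f"T = {K:.2f} * C^{B:.2f}")

# Predict the duration of a new project from its estimated cost
new_cost = 90.0
print(f"predicted duration for cost {new_cost}: {K * new_cost**B:.1f} months")
```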
2777 A Review of Current Knowledge on Assessment of Precast Structures Using Fragility Curves
Authors: E. Akpinar, A. Erol, M.F. Cakir
Abstract:
Precast reinforced concrete (RC) structures are excellent alternatives in the construction world all over the globe, thanks to their rapid erection, easy mounting process, better quality, and reasonable prices. Such structures are rather popular for industrial buildings. Given the economic importance of such industrial buildings as well as the significance of safety, performance assessment and structural risk analysis are important, as for every other type of structure. Fragility curves are powerful tools for damage projection and assessment for any sort of building as well as precast structures. In this study, a comparative review of current knowledge on fragility analysis of industrial precast RC structures is presented and findings from previous studies are compiled. Effects of different structural variables, parameters, and building geometries as well as soil conditions on fragility analysis of precast structures are reviewed. The aim is to briefly present the information in the literature about the procedure of damage probability prediction, including fragility curves, for such industrial facilities. It is found that determination of the aforementioned structural parameters as well as selection of the analysis procedure are critically important for damage prediction of industrial precast RC structures using fragility curves.
Keywords: damage prediction, fragility curve, industrial buildings, precast reinforced concrete structures
Procedia PDF Downloads 190
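For readers unfamiliar with fragility curves, the sketch below evaluates the standard lognormal form of a fragility function, P(damage state reached | intensity measure); the median and dispersion values are illustrative assumptions, not results from the reviewed studies.

```python
# Sketch of a standard lognormal fragility curve; median and dispersion are assumed values.
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """Probability of reaching a damage state given intensity measure im (e.g. PGA in g)."""
    return norm.cdf(np.log(im / median) / beta)

pga = np.linspace(0.05, 1.5, 30)                     # peak ground acceleration (g)
p_collapse = fragility(pga, median=0.6, beta=0.45)   # hypothetical collapse fragility
for g, p in zip(pga[::6], p_collapse[::6]):
    print(f"PGA = {g:.2f} g -> P(collapse) = {p:.2f}")
```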
2776 Spatially Distributed Rainfall Prediction Based on Automated Kriging for Landslide Early Warning Systems
Authors: Ekrem Canli, Thomas Glade
Abstract:
The precise prediction of rainfall in space and time is a key element of most landslide early warning systems. Unfortunately, the spatial variability of rainfall in many early warning applications is often disregarded. A common simplification is to use uniformly distributed rainfall to characterize areal rainfall intensity. With spatially differentiated rainfall information, real-time comparison with rainfall thresholds or the implementation in process-based approaches might form the basis for improved landslide warnings. This study suggests an automated workflow from the hourly, web-based collection of rain gauge data to the generation of spatially differentiated rainfall predictions based on kriging. Because the application of kriging is usually a labor-intensive task, a simplified and consequently automated variogram modeling procedure was applied to up-to-date rainfall data. The entire workflow was carried out purely with open source technology. Validation results, albeit promising, pointed out the challenges that are involved in purely distance-based, automated geostatistical interpolation techniques for ever-changing environmental phenomena over short temporal and spatial extents.
Keywords: kriging, landslide early warning system, spatial rainfall prediction, variogram modelling, web scraping
Procedia PDF Downloads 280
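A minimal sketch of automated ordinary kriging of rain-gauge data is shown below. The pykrige package is an assumed open-source choice (the abstract only says the workflow is open source), and the gauge coordinates and rainfall values are invented placeholders.

```python
# Sketch of automated ordinary kriging of rain-gauge data onto a regular grid.
# pykrige and the gauge data are assumptions; the study's own workflow is not reproduced.
import numpy as np
from pykrige.ok import OrdinaryKriging

# Hypothetical hourly rainfall totals (mm) at gauge locations (projected coordinates, km)
x = np.array([2.1, 5.4, 7.8, 3.3, 9.0, 6.1])
y = np.array([1.0, 2.5, 4.2, 6.8, 7.5, 3.9])
rain = np.array([4.0, 6.5, 2.0, 8.2, 5.1, 3.3])

# Variogram parameters are fitted automatically from the data (simplified procedure)
ok = OrdinaryKriging(x, y, rain, variogram_model="spherical", verbose=False)
gridx = np.arange(0.0, 10.0, 1.0)
gridy = np.arange(0.0, 10.0, 1.0)
z_pred, ss = ok.execute("grid", gridx, gridy)   # kriged rainfall field and kriging variance
print(z_pred.shape)                             # (len(gridy), len(gridx))
```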
2775 An Intelligent Prediction Method for Annular Pressure Driven by Mechanism and Data
Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li, Shuo Zhu, Shiming Duan, Xuezhe Yao
Abstract:
Accurate calculation of wellbore pressure is of great significance for preventing wellbore risk during drilling. The traditional mechanism model requires many iterative solving procedures in the calculation process, which reduces the calculation efficiency and makes it difficult to meet the demand for dynamic control of wellbore pressure. In recent years, many scholars have introduced artificial intelligence algorithms into wellbore pressure calculation, which significantly improves the calculation efficiency and accuracy of wellbore pressure. However, due to the 'black box' property of intelligent algorithms, existing intelligent calculation models of wellbore pressure perform poorly outside the scope of the training data and overreact to data noise, often resulting in abnormal calculation results. In this study, the multi-phase flow mechanism is embedded into the objective function of the neural network model as a constraint condition, and an intelligent prediction model of wellbore pressure under this constraint is established based on more than 400,000 sets of pressure measurement while drilling (MPD) data. The constraint of the multi-phase flow mechanism makes the prediction results of the neural network model more consistent with the distribution law of wellbore pressure, which overcomes the black-box attribute of the neural network model to some extent. In particular, the accuracy on the independent test data set is further improved, and abnormal calculated values basically disappear. This method is a prediction method driven by MPD data and the multi-phase flow mechanism, and it is the main way to predict wellbore pressure accurately and efficiently in the future.
Keywords: multiphase flow mechanism, pressure while drilling data, wellbore pressure, mechanism constraints, combined drive
Procedia PDF Downloads 175
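A conceptual sketch of embedding a mechanism term in a neural network objective is given below. The tiny network, the synthetic data, and the placeholder "hydrostatic" penalty are assumptions for illustration; the real multi-phase flow equations and MPD data set are not reproduced here.

```python
# Sketch of a mechanism-constrained loss: data misfit + penalty for violating a physical bound.
# All values and the placeholder constraint are assumptions for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.rand(256, 4)                    # e.g. depth, flow rate, mud density, ROP (scaled)
y = 2.0 * X[:, :1] + 0.5 * X[:, 1:2] + 0.05 * torch.randn(256, 1)   # synthetic pressure

model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 0.1                                 # weight of the mechanism penalty

def mechanism_residual(x, p_pred):
    # Placeholder constraint: predicted pressure should not fall below a
    # "hydrostatic" term proportional to depth (first input column).
    hydrostatic = 1.5 * x[:, :1]
    return torch.relu(hydrostatic - p_pred).mean()

for epoch in range(200):
    opt.zero_grad()
    p_pred = model(X)
    loss = nn.functional.mse_loss(p_pred, y) + lam * mechanism_residual(X, p_pred)
    loss.backward()
    opt.step()
print(f"final combined loss: {loss.item():.4f}")
```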
2774 Preference for Housing Services and Rational House Price Bubbles
Authors: Stefanie Jeanette Huber
Abstract:
This paper explores the relevance and implications of preferences for housing services on house price fluctuations through the lens of an overlapping generations model. The model implies that an economy whose agents have lower preferences for housing services is characterized by lower expenditure shares on housing services and will tend to experience more frequent and more volatile housing bubbles. These model predictions are tested empirically in the companion paper Housing Booms and Busts - Convergences and Divergences across OECD countries. Between 1970 and 2013, countries that spend less on housing services as a share of total income experienced significantly more housing cycles, and the associated housing boom-bust cycles were more violent. Finally, the model is used to study the impact of rental subsidies and help-to-buy schemes on rational housing bubbles. Rental subsidies are found to contribute to the control of housing bubbles, whereas help-to-buy schemes make the economy more bubble-prone.
Keywords: housing bubbles, housing booms and busts, preference for housing services, expenditure shares for housing services, rental and purchase subsidies
Procedia PDF Downloads 299
2773 Development of Geo-computational Model for Analysis of Lassa Fever Dynamics and Lassa Fever Outbreak Prediction
Authors: Adekunle Taiwo Adenike, I. K. Ogundoyin
Abstract:
Lassa fever is a neglected tropical disease that has become a significant public health issue in Nigeria, the country with the greatest burden in Africa. This paper presents a geo-computational model for the analysis and prediction of Lassa fever dynamics and outbreaks in Nigeria. The model investigates the dynamics of the virus with respect to environmental factors and human populations. It confirms the role of the rodent host in virus transmission and identifies how it is affected by climate and the human population. The proposed methodology is carried out on a Linux operating system using the OSGeoLive virtual machine for geographical computing, which serves as a base for spatial ecology computing. The model design uses the Unified Modeling Language (UML), and the performance evaluation uses machine learning algorithms such as random forest, fuzzy logic, and neural networks. The study aims to contribute to the control of Lassa fever, which is achievable through the combined efforts of public health professionals and geocomputational and machine learning tools. The research findings will potentially be more readily accepted and utilized by decision-makers for the attainment of Lassa fever elimination.
Keywords: geo-computational model, lassa fever dynamics, lassa fever, outbreak prediction, Nigeria
Procedia PDF Downloads 95
2772 Motives and Barriers of Using Airbnb: Findings from Mixed Method Approach
Authors: Ghada Mohammed, Mohamed Abdel Salam, Passent Tantawi
Abstract:
The study aimed to investigate the impact of motives and barriers for Egyptian users to use Airbnb as a platform of peer-to-peer accommodation, instead of hotels, on the overall attitude towards Airbnb. A sequential mixed-methods approach was adopted in this study, and a comprehensive research model was proposed, adapted from both the literature and the results of the qualitative phase, and then tested via an online questionnaire. The findings revealed that motives such as price, home benefits, privacy, and online reviews significantly explained the overall attitude towards Airbnb, while the main barriers were, respectively, perceived risk and distrust, which also predict the overall attitude. Among the subjective norms, only social influence predicted behavioral intention to use Airbnb. The study may serve as a practical reference for practitioners as well as researchers when developing programs and strategies to manage Airbnb consumers' needs and decision process. Some of the main conclusions drawn from this study are that variety was one of the major things that users like about Airbnb and that the most important motives are the functional ones, like price, rather than the experiential ones, like authenticity.
Keywords: airbnb, barriers, disruptive innovation, motives, sharing economy
Procedia PDF Downloads 147
2771 A Multilayer Perceptron Neural Network Model Optimized by Genetic Algorithm for Significant Wave Height Prediction
Authors: Luis C. Parra
Abstract:
Significant wave height prediction is an issue of great interest in the field of coastal activities because of the non-linear behavior of the wave height and the complexity of its prediction. This study aims to present a machine learning model to forecast the significant wave height at the oceanographic wave-measuring buoys anchored at Mooloolaba, from the Queensland Government Data. Modeling was performed with a multilayer perceptron neural network-genetic algorithm (GA-MLP), considering ReLU as the activation function of the MLPNN. The GA is in charge of optimizing the MLPNN hyperparameters (learning rate, hidden layers, neurons, and activation functions) and of wrapper feature selection for the window width size. Results are assessed using Mean Square Error (MSE), Root Mean Square Error (RMSE), and Mean Absolute Error (MAE). The GA-MLPNN algorithm was run with a population size of thirty individuals for eight generations for the prediction optimization of 5 steps forward, obtaining a performance evaluation of 0.00104 MSE, 0.03222 RMSE, 0.02338 MAE, and 0.71163% MAPE. The results of the analysis suggest that the MLPNN-GA model is effective in predicting significant wave height in a one-step forecast with distant time windows, presenting 0.00014 MSE, 0.01180 RMSE, 0.00912 MAE, and 0.52500% MAPE, with a correlation factor of 0.99940. The GA-MLP algorithm was compared with the ARIMA forecasting model and presented better results in all performance criteria, validating the potential of this algorithm.
Keywords: significant wave height, machine learning optimization, multilayer perceptron neural networks, evolutionary algorithms
Procedia PDF Downloads 108
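The sketch below illustrates the general GA-over-MLP-hyperparameters idea on a synthetic "wave height" series with a much smaller population and fewer generations than the study's 30 individuals and 8 generations; the data, search space, and GA operators are simplified assumptions, not the Mooloolaba setup.

```python
# Minimal sketch of a genetic-algorithm search over MLP hyperparameters
# (hidden-layer size and learning rate). Synthetic series and GA settings are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
t = np.arange(600, dtype=float)
series = 1.5 + 0.5 * np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(600)

def make_xy(s, window=12):
    X = np.array([s[i:i + window] for i in range(len(s) - window)])
    return X, s[window:]

X, y = make_xy(series)
X_tr, X_va, y_tr, y_va = X[:400], X[400:], y[:400], y[400:]

def fitness(genes):
    hidden, lr = int(genes[0]), float(genes[1])
    mlp = MLPRegressor(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                       activation="relu", max_iter=500, random_state=0)
    mlp.fit(X_tr, y_tr)
    return mean_squared_error(y_va, mlp.predict(X_va))   # lower is fitter

pop = [[rng.integers(5, 60), 10 ** rng.uniform(-4, -1)] for _ in range(8)]
for gen in range(5):                                     # tiny GA: select + mutate
    parents = sorted(pop, key=fitness)[:4]
    children = [[max(5, int(p[0] + rng.integers(-5, 6))),
                 float(np.clip(p[1] * 10 ** rng.uniform(-0.3, 0.3), 1e-4, 1e-1))]
                for p in parents]
    pop = parents + children
best = min(pop, key=fitness)
print(f"best hidden units: {int(best[0])}, learning rate: {best[1]:.4f}, MSE: {fitness(best):.5f}")
```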
2770 Prediction of Compressive Strength in Geopolymer Composites by Adaptive Neuro Fuzzy Inference System
Authors: Mehrzad Mohabbi Yadollahi, Ramazan Demirboğa, Majid Atashafrazeh
Abstract:
Geopolymers are highly complex materials involving many variables, which makes modeling their properties very difficult. There is no systematic approach to mix design for geopolymers. Since the silica modulus, Na2O content, w/b ratio, and curing time have a great influence on the compressive strength, an ANFIS (adaptive neuro-fuzzy inference system) method has been established for predicting the compressive strength of ground-pumice-based geopolymers, and the possibilities of ANFIS for predicting the compressive strength have been studied. Consequently, ANFIS can be used for geopolymer compressive strength prediction with acceptable accuracy.
Keywords: geopolymer, ANFIS, compressive strength, mix design
Procedia PDF Downloads 856
2769 Forecasting Electricity Spot Price with Generalized Long Memory Modeling: Wavelet and Neural Network
Authors: Souhir Ben Amor, Heni Boubaker, Lotfi Belkacem
Abstract:
The aim of this paper is to forecast electricity spot prices. First, we focus on modeling the conditional mean of the series, so we adopt a generalized fractional k-factor Gegenbauer process (k-factor GARMA). Secondly, the residual from the k-factor GARMA model is used as a proxy for the conditional variance; these residuals were predicted using two different approaches. In the first approach, a local linear wavelet neural network model (LLWNN) was developed to predict the conditional variance using the back-propagation learning algorithm. In the second approach, the Gegenbauer generalized autoregressive conditional heteroscedasticity process (G-GARCH) was adopted, and the parameters of the k-factor GARMA-G-GARCH model were estimated using the wavelet methodology based on the discrete wavelet packet transform (DWPT) approach. The empirical results show that the k-factor GARMA-G-GARCH model outperforms the hybrid k-factor GARMA-LLWNN model and is more appropriate for forecasting.
Keywords: electricity price, k-factor GARMA, LLWNN, G-GARCH, forecasting
Procedia PDF Downloads 232
2768 Prediction of Deformations of Concrete Structures
Authors: A. Brahma
Abstract:
Drying is a phenomenon that accompanies the hardening of hydraulic materials. If it is not prevented, it can lead to significant spontaneous dimensional variations, of which cracking is one of the consequences. In this context, cracking promotes the transport of aggressive agents into the material, which can affect the durability of concrete structures. Drying shrinkage develops over a long period, almost 30 years, although most of it occurs during the first three years. Drying shrinkage stabilizes when the material is in water balance with the external environment. The drying shrinkage of cementitious materials is due to the formation of capillary tensions in the pores of the material, which has the consequence of bringing the solid walls closer to each other. Knowledge of the shrinkage characteristics of concrete is a necessary starting point in the design of structures for crack control. Such knowledge will enable the designer to estimate the probable shrinkage movement in reinforced or prestressed concrete, and the appropriate steps can be taken in design to accommodate this movement. This study is concerned with the modelling of drying shrinkage of hydraulic materials and the prediction of the rate of spontaneous deformations of hydraulic materials during hardening. The model developed takes into consideration the main factors affecting drying shrinkage. There was agreement between the drying shrinkage predicted by the developed model and experimental results. Finally, we show that the developed model correctly describes the evolution of the drying shrinkage of high-performance concretes.
Keywords: drying, hydraulic concretes, shrinkage, modeling, prediction
Procedia PDF Downloads 337
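To make the time evolution described above concrete, the short sketch below evaluates a generic hyperbolic shrinkage-time law that grows quickly at first and stabilizes at an ultimate value; the coefficient values are illustrative assumptions, not the model developed in the paper.

```python
# Generic hyperbolic drying-shrinkage law; coefficients are illustrative assumptions.
def drying_shrinkage(t_days, eps_ultimate=600e-6, t_half=35.0):
    """Shrinkage strain after t_days of drying; eps_ultimate is the long-term value."""
    return eps_ultimate * t_days / (t_half + t_days)

for t in (7, 28, 90, 365, 3 * 365, 30 * 365):
    print(f"t = {t:5d} d -> shrinkage = {drying_shrinkage(t) * 1e6:.0f} microstrain")
```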
2767 The Impact of Dispatching with Rolling Horizon Control in Sizing Thermal Storage for Solar Tower Plant Participating in Wholesale Spot Electricity Market
Authors: Navid Mohammadzadeh, Huy Truong-Ba, Michael Cholette
Abstract:
The solar tower (ST) plant is a promising technology for exploiting large-scale solar irradiation. With thermal energy storage, an ST plant has the potential to shift generation to high electricity price periods. However, the size of the storage limits the dispatchability of the plant, particularly when it must cope with uncertainty in forecasts of solar irradiation and electricity prices. The purpose of this study is to explore the size of storage when Rolling Horizon Control (RHC) is employed for dispatch scheduling. To this end, RHC is benchmarked against a perfect knowledge (PK) forecast and two day-ahead dispatching policies. With optimisation of dispatch planning using the PK policy, the optimal achievable profit for a specific size of storage is determined. A sensitivity analysis using Monte Carlo simulation is conducted, and the size of storage for the RHC and day-ahead policies is determined with the objective of reaching the profit obtained from the PK policy. A case study is conducted for a hypothetical ST plant with thermal storage located in South Australia that is intended to dispatch under two market scenarios: 1) fixed price and 2) wholesale spot price. The impact of each individual source of uncertainty on storage size is examined for January and August. The results show that dispatching with the RH controller reaches the optimal achievable profit with ~15% smaller storage compared to the day-ahead policies. The results of this study may be applied to the CSP plant design procedure.
Keywords: solar tower plant, spot market, thermal storage system, optimized dispatch planning, sensitivity analysis, Monte Carlo simulation
Procedia PDF Downloads 125
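A heavily simplified sketch of the rolling-horizon idea is shown below: at each hour the plant looks ahead over a short price window and discharges storage only in the highest-price hours of that window. The prices, solar input, plant parameters, and the greedy rule itself are invented placeholders, not the optimisation model or South Australian market data used in the study.

```python
# Simplified rolling-horizon dispatch rule for a solar plant with thermal storage.
# All numbers and the dispatch heuristic are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)
hours = 72
price = 60 + 40 * np.sin(2 * np.pi * (np.arange(hours) - 18) / 24) + 10 * rng.standard_normal(hours)
solar_heat = np.clip(np.sin(2 * np.pi * (np.arange(hours) % 24 - 6) / 24), 0, None) * 100  # MWh_th

capacity, horizon, max_discharge = 400.0, 6, 80.0   # MWh_th, hours, MWh_th per hour
storage, revenue = 0.0, 0.0

for h in range(hours):
    storage = min(capacity, storage + solar_heat[h])          # collect solar heat
    window = price[h:h + horizon]                             # rolling price forecast window
    if price[h] >= np.quantile(window, 0.7) and storage > 0:  # dispatch only in high-price hours
        sent = min(max_discharge, storage)
        storage -= sent
        revenue += sent * 0.4 * price[h]                      # assumed 40% thermal-to-electric efficiency
print(f"revenue over {hours} h: {revenue:,.0f} (currency units), final storage: {storage:.0f} MWh_th")
```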
2766 Landslide Susceptibility Mapping: A Comparison between Logistic Regression and Multivariate Adaptive Regression Spline Models in the Municipality of Oudka, Northern of Morocco
Authors: S. Benchelha, H. C. Aoudjehane, M. Hakdaoui, R. El Hamdouni, H. Mansouri, T. Benchelha, M. Layelmam, M. Alaoui
Abstract:
The logistic regression (LR) and multivariate adaptive regression spline (MarSpline) methods are applied and verified for analysis of landslide susceptibility mapping in Oudka, Morocco, using a geographical information system. From a spatial database containing data such as landslide mapping, topography, soil, hydrology, and lithology, the eight factors related to landslides, namely elevation, slope, aspect, distance to streams, distance to roads, distance to faults, lithology map, and Normalized Difference Vegetation Index (NDVI), were calculated or extracted. Using these factors, landslide susceptibility indexes were calculated by the two mentioned methods. Before the calculation, the database was divided into two parts, the first for the formation of the model and the second for the validation. The results of the landslide susceptibility analysis were verified using success and prediction rates to evaluate the quality of these probabilistic models. The result of this verification was that the MarSpline model is the best model, with a success rate (AUC = 0.963) and a prediction rate (AUC = 0.951) higher than those of the LR model (success rate AUC = 0.918, prediction rate AUC = 0.901).
Keywords: landslide susceptibility mapping, regression logistic, multivariate adaptive regression spline, Oudka, Taounate
Procedia PDF Downloads 188
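The sketch below shows the logistic-regression half of such a comparison, with the success rate computed on the training portion and the prediction rate on the validation portion (both as AUC). The random features stand in for the eight real conditioning factors; a MarSpline counterpart would need an additional package (e.g. py-earth), so only LR is shown.

```python
# Sketch of fitting LR on landslide conditioning factors and reporting success/prediction AUC.
# The synthetic features stand in for elevation, slope, aspect, NDVI, distances, lithology.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 8))                        # 8 conditioning factors (standardized)
logits = 1.2 * X[:, 0] - 0.8 * X[:, 1] + 0.5 * X[:, 2]
y = (logits + rng.standard_normal(1000) > 0).astype(int)  # 1 = landslide, 0 = no landslide

X_fit, X_val, y_fit, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
lr = LogisticRegression(max_iter=1000).fit(X_fit, y_fit)

success_auc = roc_auc_score(y_fit, lr.predict_proba(X_fit)[:, 1])     # success rate
prediction_auc = roc_auc_score(y_val, lr.predict_proba(X_val)[:, 1])  # prediction rate
print(f"success rate AUC = {success_auc:.3f}, prediction rate AUC = {prediction_auc:.3f}")
```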
2765 Scour Depth Prediction around Bridge Piers Using Neuro-Fuzzy and Neural Network Approaches
Authors: H. Bonakdari, I. Ebtehaj
Abstract:
The prediction of scour depth around bridge piers is frequently considered in river engineering. One of the key aspects in efficient and optimum bridge structure design is considered to be scour depth estimation around bridge piers. In this study, scour depth around bridge piers is estimated using two methods, namely the Adaptive Neuro-Fuzzy Inference System (ANFIS) and Artificial Neural Network (ANN). Therefore, the effective parameters in scour depth prediction are determined using the ANN and ANFIS methods via dimensional analysis, and subsequently, the parameters are predicted. In the current study, the methods’ performances are compared with the nonlinear regression (NLR) method. The results show that both methods presented in this study outperform existing methods. Moreover, using the ratio of pier length to flow depth, ratio of median diameter of particles to flow depth, ratio of pier width to flow depth, the Froude number and standard deviation of bed grain size parameters leads to optimal performance in scour depth estimation.
Keywords: adaptive neuro-fuzzy inference system (ANFIS), artificial neural network (ANN), bridge pier, scour depth, nonlinear regression (NLR)
Procedia PDF Downloads 220
2764 An Application for Risk of Crime Prediction Using Machine Learning
Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento
Abstract:
The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly with the control and optimization of public safety. Thus, in the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical data of incidents and demographic information. The entire research and implementation are presented, starting with the data collection from its original source, the treatment and transformations applied to the data, and the choice, evaluation, and implementation of the Machine Learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors, and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformation used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, which contributes to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API to enable other entities to make requests for predictions in real time. An application is also presented where it is possible to show criminal predictions visually.
Keywords: crime prediction, machine learning, public safety, smart city
Procedia PDF Downloads 113
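The sketch below illustrates the model-comparison step with the four classifier families named in the abstract; the random "incident" features are placeholders for the historical and demographic data used in the study, and the accuracy figures are not the paper's results.

```python
# Sketch: train four classifiers on the same features and compare test accuracy.
# Features and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 6))                    # e.g. hour, weekday, grid-cell features, ...
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.standard_normal(2000) > 1).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "K-Nearest Neighbors": KNeighborsClassifier(n_neighbors=15),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Neural Network": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:20s} accuracy: {accuracy_score(y_te, model.predict(X_te)):.3f}")
```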
2763 Analysis of Brain Signals Using Neural Networks Optimized by Co-Evolution Algorithms
Authors: Zahra Abdolkarimi, Naser Zourikalatehsamad
Abstract:
Until about 40 years ago, after the recognition of epilepsy, it was generally believed that these attacks occurred randomly and suddenly. However, thanks to advances in mathematics and engineering, such attacks can now be predicted within a few minutes or hours. In this way, various algorithms for long-term prediction of the time and frequency of the first attack have been presented. In this paper, by considering the nonlinear nature of brain signals and dynamically recorded brain signals, an ANFIS model is presented to predict the brain signals, since, according to the physiologic structure of the onset of attacks, more complex neural structures can better model the signal during attacks. The contribution of this work is the co-evolution algorithm for optimization of the ANFIS network parameters. Our objective is to predict brain signals based on time series obtained from the brain signals of people suffering from epilepsy using ANFIS. Results reveal that, compared to other methods, this method has less sensitivity to uncertainties such as the presence of noise and interruptions in the recorded brain signals, as well as higher accuracy. The long-term prediction capacity of the model illustrates the use of implanted systems for medication warnings and the prevention of seizures.
Keywords: co-evolution algorithms, brain signals, time series, neural networks, ANFIS model, physiologic structure, time prediction, epilepsy suffering, illustrates model
Procedia PDF Downloads 284
2762 Rainfall-Runoff Forecasting Utilizing Genetic Programming Technique
Authors: Ahmed Najah Ahmed Al-Mahfoodh, Ali Najah Ahmed Al-Mahfoodh, Ahmed Al-Shafie
Abstract:
In this study, the genetic programming (GP) technique has been investigated for the prediction of a set of rainfall-runoff data. To assess the effect of input parameters on the model, sensitivity analysis was adopted. To evaluate the performance of the proposed model, three statistical indexes were used, namely the Correlation Coefficient (CC), Mean Square Error (MSE), and Coefficient of Efficiency (CE). The principal aim of this study is to develop a computationally efficient and robust approach for the prediction of rainfall-runoff, which could reduce the cost and labour of measuring these parameters. This research concentrates on the Johor River in Johor State, Malaysia.
Keywords: genetic programming, prediction, rainfall-runoff, Malaysia
Procedia PDF Downloads 482
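A minimal sketch of GP-based rainfall-runoff modelling is given below using the gplearn package, which is an assumed tool choice (the paper does not name its GP implementation); the rainfall-runoff records are synthetic placeholders, and the CC, MSE, and CE indices are computed as in the abstract.

```python
# Sketch of symbolic (GP) rainfall-runoff regression with CC, MSE, and CE indices.
# gplearn and the synthetic records are assumptions for illustration.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 5.0, 300)                        # daily rainfall (mm)
rain_lag = np.roll(rain, 1); rain_lag[0] = 0.0
runoff = 0.3 * rain + 0.15 * rain_lag + rng.normal(0, 1.0, 300)   # synthetic runoff

X = np.column_stack([rain, rain_lag])
gp = SymbolicRegressor(population_size=500, generations=10,
                       function_set=("add", "sub", "mul", "div"), random_state=0)
gp.fit(X[:200], runoff[:200])
pred = gp.predict(X[200:])
obs = runoff[200:]

cc = np.corrcoef(obs, pred)[0, 1]
mse = np.mean((obs - pred) ** 2)
ce = 1 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe type CE
print(f"CC = {cc:.3f}, MSE = {mse:.3f}, CE = {ce:.3f}")
print(gp._program)                                     # evolved symbolic expression
```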
2761 A Study for Area-level Mosquito Abundance Prediction by Using Supervised Machine Learning Point-level Predictor
Authors: Theoktisti Makridou, Konstantinos Tsaprailis, George Arvanitakis, Charalampos Kontoes
Abstract:
In the literature, data-driven approaches for mosquito abundance prediction rely on supervised machine learning models that are trained with historical in-situ measurements. The drawback of this approach is that once the model is trained on point-level (specific x, y coordinates) measurements, the predictions of the model again refer to the point level. These point-level predictions reduce the applicability of such solutions, since many early warning and mitigation applications need predictions for an area level, such as a municipality, village, etc. In this study, we apply a data-driven predictive model, which relies on public, open satellite Earth Observation and geospatial data and is trained with historical point-level in-situ measurements of mosquito abundance. We then propose a methodology to extend a point-level predictive model to a broader area-level prediction. Our methodology relies on random spatial sampling of the area of interest (similar to the Poisson hard-core process), obtaining the EO and geomorphological information for each sample, making the point-wise prediction for each sample, and aggregating the predictions to represent the average mosquito abundance of the area. We quantify the performance of the transformation from point-level to area-level predictions and analyze it in order to understand which parameters have a positive or negative impact on it. The goal of this study is to propose a methodology that predicts the mosquito abundance of a given area by relying on point-level prediction and to provide qualitative insights regarding the expected performance of the area-level prediction. We applied our methodology to historical data (of Culex pipiens) for two areas of interest (the Veneto region of Italy and Central Macedonia in Greece). In both cases, the results were consistent. The mean mosquito abundance of a given area can be estimated with similar accuracy to the point-level predictor, sometimes even better. The density of the samples that we use to represent one area has a positive effect on the performance, in contrast to the actual number of sampling points, which is not informative at all regarding the performance without the size of the area. Additionally, we saw that the distance between the sampling points and the real in-situ measurements that were used for training did not strongly affect the performance.
Keywords: mosquito abundance, supervised machine learning, culex pipiens, spatial sampling, west nile virus, earth observation data
Procedia PDF Downloads 149
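The point-to-area aggregation idea can be sketched as follows: sample random locations inside an area polygon, run a point-level predictor at each sample, and average the results. The polygon and the dummy predictor below are assumptions standing in for the real municipality boundaries and the trained abundance model (and the simple rejection sampling is not a Poisson hard-core process).

```python
# Sketch of area-level prediction by random spatial sampling of point-level predictions.
# Polygon and predictor are illustrative placeholders.
import numpy as np
from shapely.geometry import Point, Polygon

rng = np.random.default_rng(0)
area = Polygon([(0, 0), (4, 0), (5, 3), (2, 4), (0, 2)])   # area of interest

def point_predictor(x, y):
    """Dummy point-level abundance model (would use EO/geomorphological features)."""
    return 50 + 10 * np.sin(x) + 5 * np.cos(y)

def sample_in_polygon(poly, n, rng):
    minx, miny, maxx, maxy = poly.bounds
    pts = []
    while len(pts) < n:                                     # simple rejection sampling
        p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
        if poly.contains(p):
            pts.append(p)
    return pts

samples = sample_in_polygon(area, n=200, rng=rng)
area_estimate = np.mean([point_predictor(p.x, p.y) for p in samples])
print(f"estimated mean mosquito abundance over the area: {area_estimate:.1f}")
```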
2760 Time and Cost Efficiency Analysis of Quick Die Change System on Metal Stamping Industry
Authors: Rudi Kurniawan Arief
Abstract:
Manufacturing cost and setup time are hot topics for improvement in the metal stamping industry because material and component prices keep rising while customers require the component price to be cut down year by year. The Single Minute Exchange of Die (SMED) is one of many methods to reduce waste in the stamping industry. The Japanese Quick Die Change (QDC) die system is one of the SMED systems that can reduce both setup time and manufacturing cost. However, this system is rarely used in stamping industries. This paper analyzes how much the QDC die system can reduce setup time and manufacturing cost. The research was conducted by direct observation, simulation, and comparison of the QDC die system with a conventional die system. In this research, we found that the QDC die system can save up to 35% of manufacturing cost and reduce setup times by 70%. This simulation proved that the QDC die system is effective for cost reduction but must be applied in several parallel production processes.
Keywords: press die, metal stamping, QDC system, single minute exchange die, manufacturing cost saving, SMED
Procedia PDF Downloads 171
2759 Application of Latent Class Analysis and Self-Organizing Maps for the Prediction of Treatment Outcomes for Chronic Fatigue Syndrome
Authors: Ben Clapperton, Daniel Stahl, Kimberley Goldsmith, Trudie Chalder
Abstract:
Chronic fatigue syndrome (CFS) is a condition characterised by chronic disabling fatigue and other symptoms that currently cannot be explained by any underlying medical condition. Although clinical trials support the effectiveness of cognitive behaviour therapy (CBT), the success rate for individual patients is modest. Patients vary in their response, and little is known about which factors predict or moderate treatment outcomes. The aim of the project is to develop a prediction model from baseline characteristics of patients, such as demographic, clinical, and psychological variables, which may predict the likely treatment outcome, provide guidance for clinical decision making, and help clinicians to recommend the best treatment. The project is aimed at identifying subgroups of patients with similar baseline characteristics that are predictive of treatment effects, using modern cluster analyses and data mining machine learning algorithms. The characteristics of these groups will then be used to inform the types of individuals who benefit from a specific treatment. In addition, results will provide a better understanding of for whom the treatment works. The suitability of different clustering methods to identify subgroups of CFS patients and their response to different treatments is compared.
Keywords: chronic fatigue syndrome, latent class analysis, prediction modelling, self-organizing maps
Procedia PDF Downloads 226
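The self-organizing-map side of the subgrouping idea can be sketched as below, using the minisom package (an assumed tool choice) and a random baseline-characteristic matrix in place of the CFS trial data; each map cell acts as a candidate subgroup.

```python
# Sketch of clustering baseline characteristics with a self-organizing map.
# minisom and the random data are assumptions for illustration.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(0)
baseline = rng.standard_normal((150, 6))        # e.g. age, fatigue score, mood, beliefs, ...

som = MiniSom(3, 3, input_len=6, sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(baseline)
som.train_random(baseline, 1000)

# Each patient is assigned to the best-matching unit (a 3x3 grid cell = subgroup)
subgroups = [som.winner(row) for row in baseline]
cells, counts = np.unique(subgroups, axis=0, return_counts=True)
for cell, n in zip(cells, counts):
    print(f"SOM cell ({cell[0]}, {cell[1]}): {n} patients")
```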
2758 The Combination of the Mel Frequency Cepstral Coefficients, Perceptual Linear Prediction, Jitter and Shimmer Coefficients for the Improvement of Automatic Recognition System for Dysarthric Speech
Authors: Brahim Fares Zaidi
Abstract:
Our work aims to improve our automatic recognition system for dysarthric speech, based on hidden Markov models and the Hidden Markov Model Toolkit (HTK), in order to help people with pronunciation problems. We applied two speech parameterization techniques, based on Mel frequency cepstral coefficients (MFCC) and perceptual linear prediction (PLP), and concatenated them with jitter and shimmer coefficients in order to increase the recognition rate for dysarthric speech. For our tests, we used the NEMOURS database, which contains speakers with dysarthria and normal speakers.
Keywords: ARSDS, HTK, HMM, MFCC, PLP
Procedia PDF Downloads 110
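The feature-concatenation idea can be sketched as below: extract MFCCs with librosa and append jitter/shimmer values to each utterance's feature vector. The sine "speech" signal and the fixed jitter/shimmer numbers are placeholders; in practice PLP features and jitter/shimmer would come from dedicated tools (e.g. HTK and Praat), which are not shown here.

```python
# Sketch of concatenating MFCC features with jitter/shimmer values for one utterance.
# The signal and voice-quality values are placeholders.
import numpy as np
import librosa

sr = 16000
t = np.linspace(0, 1.0, sr, endpoint=False)
signal = 0.5 * np.sin(2 * np.pi * 180 * t).astype(np.float32)   # stand-in for a speech recording

mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13)          # shape (13, n_frames)
utterance_vector = mfcc.mean(axis=1)                             # summarize frames per utterance

jitter, shimmer = 0.012, 0.045                                   # hypothetical voice-quality values
combined = np.concatenate([utterance_vector, [jitter, shimmer]])
print(f"MFCC-only features: {utterance_vector.shape[0]}, combined features: {combined.shape[0]}")
```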
2757 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models
Authors: Jay L. Fu
Abstract:
Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow down the disease progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were utilized to build models to predict dementia. Various machine learning models, including logistic regression, k-nearest neighbor, support vector machine, random forest, and neural network, were developed. Data were divided into training and testing sets, where training sets were used to build the predictive model and testing sets were used to assess the accuracy of prediction. Key risk factors were identified, and the various models were compared to arrive at the best prediction model. Among these models, the random forest model appeared to be the best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important contributing factors to the detection of Alzheimer’s. Among all the models used, the percentage of testing inputs for which at least 4 of the 5 models shared the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer’s with good accuracy, which ultimately leads to early treatment of these patients.
Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction
Procedia PDF Downloads 143
2756 Is More Inclusive More Effective? The 'New Style' Public Distribution System in India
Authors: Avinash Kishore, Suman Chakrabarti
Abstract:
In September 2013, the parliament of India enacted the National Food Security Act (NFSA) which entitles two-thirds of India’s population to five kilograms of rice, wheat or coarse cereals per person per month at one to three rupees per kilogram. Five states in India—Andhra Pradesh, Chhattisgarh, Tamil Nadu, Odisha and West Bengal—had already implemented somewhat similar changes in the TPDS a few years earlier using their own budgetary resources. They made rice—coincidentally, all five states are predominantly rice-eating—available in fair price shops to a majority of their population at very low prices (less than Rs.3/kg). This paper tries to account for the changes in household consumption patterns associated with the change in TPDS policy in these states using data from household consumption surveys by the National Sample Survey Organization (NSSO). NSS data show improvement in the coverage of TPDS and average off-take of grains from fair price shops between 2004-05 and 2009-10 across all states of India. However, the increase in coverage and off-take was significantly higher in four out of these five states than in the rest of India. An average household in these states purchased three kilos more rice per month from fair price shops than its counterpart in non-treated states as a result of more generous TPDS policies backed by administrative reforms. The increase in consumption of PDS rice was the highest in Chhattisgarh, the poster state of PDS reforms. Households in Chhattisgarh used money saved on rice to spend more on pulses, edible oil, vegetables and sugar and other non-food items. We also find evidence that making TPDS more inclusive and more generous is not enough unless it is supported by administrative reforms to improve grain delivery and control diversion to open markets.
Keywords: public distribution system, social safety-net, national food security act, diet quality, Chhattisgarh
Procedia PDF Downloads 375
2755 Shedding Light on the Black Box: Explaining Deep Neural Network Prediction of Clinical Outcome
Authors: Yijun Shao, Yan Cheng, Rashmee U. Shah, Charlene R. Weir, Bruce E. Bray, Qing Zeng-Treitler
Abstract:
Deep neural network (DNN) models are being explored in the clinical domain, following their recent success in other domains such as image recognition. For clinical adoption, outcome prediction models require explanation, but due to the multiple non-linear inner transformations, DNN models are viewed by many as a black box. In this study, we developed a deep neural network model for predicting 1-year mortality of patients who underwent major cardiovascular procedures (MCVPs), using a temporal image representation of past medical history as input. The dataset was obtained from the electronic medical data warehouse administered by the Veterans Affairs Informatics and Computing Infrastructure (VINCI). We identified 21,355 veterans who had their first MCVP in 2014. Features for prediction included demographics, diagnoses, procedures, medication orders, hospitalizations, and frailty measures extracted from clinical notes. Temporal variables were created based on the patient history data in the 2-year window prior to the index MCVP. A temporal image was created based on these variables for each individual patient. To generate the explanation for the DNN model, we defined a new concept called the impact score, based on the impact of the presence/value of clinical conditions on the predicted outcome. Like the (log) odds ratios reported by a logistic regression (LR) model, impact scores are continuous variables intended to shed light on the black box model. For comparison, a logistic regression model was fitted on the same dataset. In our cohort, about 6.8% of patients died within one year. The DNN model achieved an area under the curve (AUC) of 78.5%, while the LR model achieved an AUC of 74.6%. A strong but not perfect correlation was found between the aggregated impact scores and the log odds ratios (Spearman's rho = 0.74), which helped validate our explanation.
Keywords: deep neural network, temporal data, prediction, frailty, logistic regression model
Procedia PDF Downloads 153
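A conceptual sketch of the comparison between a black-box model's feature "impact" and LR log odds is given below. Here the impact of a binary feature is approximated by toggling it on/off and averaging the change in predicted risk; the data, the small MLP, and this particular perturbation scheme are assumptions standing in for the VA cohort, the DNN, and the paper's own impact-score definition.

```python
# Sketch: perturbation-based impact scores for a black-box model, compared with LR coefficients.
# Data, model, and the toggling scheme are illustrative assumptions.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n, p = 3000, 8
X = rng.integers(0, 2, size=(n, p)).astype(float)        # binary clinical conditions
true_w = np.array([1.5, -1.0, 0.8, 0.0, 0.5, -0.3, 0.0, 1.0])
y = (X @ true_w + rng.logistic(size=n) > 0.75).astype(int)

mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, y)
lr = LogisticRegression(max_iter=1000).fit(X, y)

impact_scores = []
for j in range(p):                                        # toggle feature j on vs off
    X_on, X_off = X.copy(), X.copy()
    X_on[:, j], X_off[:, j] = 1.0, 0.0
    impact_scores.append(np.mean(mlp.predict_proba(X_on)[:, 1] - mlp.predict_proba(X_off)[:, 1]))

rho, _ = spearmanr(impact_scores, lr.coef_[0])            # compare with LR log odds
print(f"Spearman's rho between impact scores and LR coefficients: {rho:.2f}")
```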
2754 Prediction of Rotating Machines with Rolling Element Bearings and Its Components Deterioration
Authors: Marimuthu Gurusamy
Abstract:
In vibration analysis (with accelerometers) of rotating machines with rolling element bearings, customers are interested in knowing about the failure of the machine well in advance in order to plan spare inventory and maintenance. But in the real world, most machines fail before the prediction of the vibration analyst or expert analysis software. Presently, the prediction of failure is based on ISO 10816 vibration limits only. But this is not enough to monitor the failure of machines well in advance, because more than 50% of machines will fail even when the vibration readings are within the acceptable zone as per ISO 10816. Hence, further detailed analysis and different techniques are required to predict the failure well in advance. In vibration analysis, the velocity spectrum is used to analyse the root cause of mechanical problems like unbalance, misalignment, looseness, etc. The envelope spectrum is used to analyse the bearing frequency components; hence, failures in the inner race, outer race, and rolling elements are identified. But so far, no correlation has been made between these two concepts. The author used both the velocity spectrum and the envelope spectrum to analyse the machine behaviour and bearing condition, and to correlate the changes in dynamic load (due to unbalance, misalignment, looseness, etc.) and the effect of impacts on the bearing. Hence, we were able to predict the expected life of the machine and bearings in rotating equipment (with rolling element bearings). We also used process parameters like temperature, flow, and pressure to correlate with flow-induced vibration and load variations when abnormal vibration occurs due to changes in process parameters. Hence, by correlating the velocity spectrum, envelope spectrum, and process data, together with 20 years of experience in vibration analysis, the author was able to predict the deterioration of rotating equipment and its components and the expected time to maintenance.
Keywords: vibration analysis, velocity spectrum, envelope spectrum, prediction of deterioration
Procedia PDF Downloads 451
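The two spectra mentioned above can be illustrated on a simulated bearing signal: the ordinary spectrum of the raw signal is dominated by the unbalance component, while the envelope spectrum (via the Hilbert transform) exposes the bearing fault frequency. The shaft speed, fault frequency, and signal model below are illustrative assumptions, not measurements from the paper.

```python
# Sketch: raw spectrum vs. Hilbert-envelope spectrum for a simulated bearing fault signal.
# Frequencies and signal model are assumptions for illustration.
import numpy as np
from scipy.signal import hilbert

fs = 20_000                                        # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
shaft, bpfo, resonance = 25.0, 107.0, 3_000.0      # Hz: shaft speed, outer-race fault rate, resonance

impacts = (np.sin(2 * np.pi * bpfo * t) > 0.999).astype(float)     # brief impacts at the fault rate
signal = (0.5 * np.sin(2 * np.pi * shaft * t)                      # unbalance component
          + impacts * np.sin(2 * np.pi * resonance * t)            # impacts exciting a resonance
          + 0.1 * np.random.default_rng(0).standard_normal(t.size))

envelope = np.abs(hilbert(signal))                 # Hilbert-transform envelope (demodulation)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
raw_spec = np.abs(np.fft.rfft(signal))
env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))

idx = np.argmin(np.abs(freqs - bpfo))              # bin closest to the bearing fault frequency
print(f"amplitude at {bpfo:.0f} Hz -> raw spectrum: {raw_spec[idx]:.1f}, "
      f"envelope spectrum: {env_spec[idx]:.1f}")
```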
2753 Profitability Assessment of Granite Aggregate Production and the Development of a Profit Assessment Model
Authors: Melodi Mbuyi Mata, Blessing Olamide Taiwo, Afolabi Ayodele David
Abstract:
The purpose of this research is to create empirical models for assessing the profitability of granite aggregate production in aggregate quarries in Akure, Ondo State. In addition, an artificial neural network (ANN) model and multivariate prediction models for granite profitability were developed in the study. A formal survey questionnaire was used to collect data for the study. The data extracted from the case study mine for this study include granite marketing operations, royalty, production costs, and mine production information. The following methods were used to achieve the goal of this study: descriptive statistics, and MATLAB 2017 and SPSS 16.0 software for analyzing and modeling the data collected from granite traders in the study areas. The prediction accuracy of the ANN and multivariate regression models was compared using the coefficient of determination (R²), root mean square error (RMSE), and mean square error (MSE). Based on the prediction errors, the model evaluation indices revealed that the ANN model was suitable for predicting the generated profit in a typical quarry. More quarries in Nigeria's southwest region and other geopolitical zones should be considered to improve ANN prediction accuracy.
Keywords: national development, granite, profitability assessment, ANN models
Procedia PDF Downloads 101
2752 Impact of Ethnomedicinal Plants on Toothpaste Improvement
Authors: Muna Jalal Ali, Essam A. Makky, Mashitah M. Yusoff
Abstract:
Objectives: The aim of this study is to evaluate the antimicrobial susceptibility of toothpaste combined with medicinal plants, and the relationship between commercial toothpaste, its price, and patient age as well. Materials and Methods: Oral isolates from different patients aged 3 to 60 years were obtained, purified, and tested against four different ethnomedicinal plant extracts for antimicrobial activity. A total of 10 different commercial toothpastes (different brands and prices) were collected from the market, and the combined action of the medicinal plants and toothpaste was studied. Results: We found a higher bacterial population in the age group of 3–40 years than in the group of 40–60 years, with approximately 44% and 32%, respectively. The combined action of the ethanolic extract (alone) against oral isolates showed a synergistic effect, with 32.20, 30.50, and 25.42% for combinations A (Ci/Ca), B (Ci/Ca/P), and C (Ci/Ca/P/N), respectively. By contrast, the combined action of the ethnomedicinal plants with 10 different toothpastes improved the antimicrobial sensitivity by 60, 100, and 0% for combinations A, B, and C, respectively. Clinical relevance: Only the ethanolic extracts of combinations A and B with commercial toothpaste showed high antibacterial activity against oral isolates, and the effectiveness of toothpaste is not related to its price.
Keywords: microbial evolution, oral isolates, ethnomedicinal plants, antimicrobial activity, toothpaste
Procedia PDF Downloads 315
2751 Prediction of Coronary Heart Disease Using Fuzzy Logic
Authors: Elda Maraj, Shkelqim Kuka
Abstract:
Coronary heart disease causes many deaths in the world, and unfortunately, this problem will continue to increase in the future. In this paper, a fuzzy logic model to predict coronary heart disease is presented. The model has been developed with seven input variables and one output variable and was implemented for 30 patients in Albania. The Fuzzy Logic Toolbox of MATLAB is used here. The fuzzy model inputs are cholesterol, blood pressure, physical activity, age, BMI, smoking, and diabetes, whereas the output is the disease classification. The fuzzy sets and membership functions are chosen in an appropriate manner. The centroid method is used for defuzzification. The database is taken from University Hospital Center "Mother Teresa" in Tirana, Albania.
Keywords: coronary heart disease, fuzzy logic toolbox, membership function, prediction model
Procedia PDF Downloads 162
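A minimal two-input sketch of a Mamdani-style fuzzy model with centroid defuzzification is given below, using scikit-fuzzy in place of the MATLAB Fuzzy Logic Toolbox. The membership functions, rules, and inputs are illustrative assumptions, not the seven-input model built from the Tirana patient data.

```python
# Two-input fuzzy risk model with centroid defuzzification (scikit-fuzzy).
# Membership functions and rules are illustrative assumptions.
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

cholesterol = ctrl.Antecedent(np.arange(100, 401, 1), "cholesterol")   # mg/dL
pressure = ctrl.Antecedent(np.arange(80, 201, 1), "blood_pressure")    # systolic mmHg
risk = ctrl.Consequent(np.arange(0, 101, 1), "risk")                   # 0-100 scale

cholesterol["normal"] = fuzz.trimf(cholesterol.universe, [100, 100, 240])
cholesterol["high"] = fuzz.trimf(cholesterol.universe, [200, 400, 400])
pressure["normal"] = fuzz.trimf(pressure.universe, [80, 80, 140])
pressure["high"] = fuzz.trimf(pressure.universe, [120, 200, 200])
risk["low"] = fuzz.trimf(risk.universe, [0, 0, 50])
risk["high"] = fuzz.trimf(risk.universe, [50, 100, 100])

rules = [
    ctrl.Rule(cholesterol["high"] | pressure["high"], risk["high"]),
    ctrl.Rule(cholesterol["normal"] & pressure["normal"], risk["low"]),
]
system = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
system.input["cholesterol"] = 260
system.input["blood_pressure"] = 150
system.compute()                                   # centroid defuzzification by default
print(f"predicted risk score: {system.output['risk']:.1f}")
```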