Search results for: predictive model.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7485

7365 Using the Transtheoretical Model to Investigate Stages of Change in Regular Volunteer Service among Seniors in Community

Authors: Pei-Ti Hsu, I-Ju Chen, Jeu-Jung Chen, Cheng-Fen Chang, Shiu-Yan Yang

Abstract:

Background: Taiwan is now an aging society. Research on the elderly should not be confined to caring for seniors, but should also focus on ways to improve their health and quality of life. Senior citizens who participate in volunteer services can become less lonely, gain new growth opportunities, and regain a sense of accomplishment. Thus, the question of how to get the elderly to participate in volunteer service is worth exploring. Objective: To apply the Transtheoretical Model to understand the stages of change in regular volunteer service and voluntary service behaviour among seniors. Methods: 1,525 adults over the age of 65 from the Renai district of Keelung City were interviewed. The research tool was a self-constructed questionnaire, and individual interviews were conducted to collect data. The data were then processed and analyzed using the IBM SPSS Statistics 20 (Windows version) statistical software. Results: In the past six months, research subjects averaged 9.92 days of volunteer service. A majority of these elderly individuals had no intention to change their regular volunteer services. We found that self-efficacy for volunteer services was higher during the maintenance stage than during all other stages, while self-perceived barriers were lower during the preparation and action stages. Self-perceived benefits had important predictive power for regular volunteer service behaviors in the earlier stages, and self-efficacy had important predictive power for regular volunteer service behaviors in the later stages. Conclusions/Implications for Practice: The results support the conclusion that community nursing staff should group elders based on their stage of change in regular volunteer service and design appropriate behavioral change strategies.

Keywords: Seniors, stages of change in regular volunteer services, volunteer service behavior, self-efficacy, self-perceived benefits.

7364 The Story of Mergers and Acquisitions: Using Narrative Theory to Understand the Uncertainty of Organizational Change

Authors: Philip T. Roundy

Abstract:

This paper examines the influence of communication form on employee uncertainty during mergers and acquisitions (M&As). Specifically, the author uses narrative theory to analyze how narrative organizational communication affects the three components of uncertainty – decreased predictive, explanatory, and descriptive ability. It is hypothesized that employees whose organizations use narrative M&A communication will have greater predictive, explanatory, and descriptive abilities than employees of organizations using non-narrative M&A communication. This paper contributes to the stream of research examining uncertainty during mergers and acquisitions and argues that narratives are an effective means of managing uncertainty in the mergers and acquisitions context.

Keywords: Narrative Theory, Mergers and Acquisitions, Employee Uncertainty.

7363 Development of Predictive Model for Surface Roughness in End Milling of Al-SiCp Metal Matrix Composites using Fuzzy Logic

Authors: M. Chandrasekaran, D. Devarasiddappa

Abstract:

Metal matrix composites have been increasingly used as materials for components in the automotive and aerospace industries because of their improved properties compared with non-reinforced alloys. During machining, the selection of appropriate machining parameters to produce a job with the desired surface roughness is of great concern, considering the economy of the manufacturing process. In this study, a surface roughness prediction model using fuzzy logic is developed for the end milling of an Al-SiCp metal matrix composite component using a carbide end mill cutter. The surface roughness is modeled as a function of spindle speed (N), feed rate (f), depth of cut (d) and the SiCp percentage (S). The predicted values of surface roughness are compared with experimental results. The model yields an average percentage error of 4.56% and a mean square error of 0.0729. It is observed that surface roughness is most influenced by feed rate, spindle speed and SiCp percentage, while depth of cut has the least influence.
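
As a rough illustration of how a fuzzy-logic surface roughness predictor of this kind can be structured, the following minimal Python sketch uses hand-picked triangular membership functions and a few illustrative rules over spindle speed and feed rate only; all membership ranges, rules and output values are hypothetical and are not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

# Hypothetical linguistic levels (low, medium, high) for two of the four inputs
speed_levels = {"low": (500, 1000, 2000), "med": (1500, 2500, 3500), "high": (3000, 4000, 5000)}
feed_levels = {"low": (0.01, 0.03, 0.05), "med": (0.04, 0.06, 0.08), "high": (0.07, 0.09, 0.11)}

# Illustrative rules mapping firing strength to a crisp roughness value (Sugeno-style)
rules = [({"speed": "high", "feed": "low"}, 0.8),   # high speed, low feed -> smooth surface
         ({"speed": "med", "feed": "med"}, 1.6),
         ({"speed": "low", "feed": "high"}, 3.2)]   # low speed, high feed -> rough surface

def predict_ra(speed, feed):
    mu_s = {name: tri(speed, *abc) for name, abc in speed_levels.items()}
    mu_f = {name: tri(feed, *abc) for name, abc in feed_levels.items()}
    num = den = 0.0
    for cond, ra in rules:
        w = min(mu_s[cond["speed"]], mu_f[cond["feed"]])   # rule firing strength
        num, den = num + w * ra, den + w
    return num / den if den else float("nan")              # weighted-average defuzzification

print(predict_ra(speed=3500, feed=0.03))   # leans towards the "smooth" rule
```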

Keywords: End milling, fuzzy logic, metal matrix composites, surface roughness

7362 A Posterior Predictive Model-Based Control Chart for Monitoring Healthcare

Authors: Yi-Fan Lin, Peter P. Howley, Frank A. Tuyl

Abstract:

Quality measurement and reporting systems are used in healthcare internationally. In Australia, the Australian Council on Healthcare Standards records and reports hundreds of clinical indicators (CIs) nationally across the healthcare system. These CIs are measures of performance in the clinical setting, and are used as a screening tool to help assess whether a standard of care is being met. Existing analysis and reporting of these CIs incorporate Bayesian methods to address sampling variation; however, such assessments are retrospective in nature, reporting upon the previous six or twelve months of data. The use of Bayesian methods within statistical process control for monitoring systems is an important pursuit to support more timely decision-making. Our research has developed and assessed a new graphical monitoring tool, similar to a control chart, based on the beta-binomial posterior predictive (BBPP) distribution to facilitate the real-time assessment of health care organizational performance via CIs. The BBPP charts have been compared with the traditional Bernoulli CUSUM (BC) chart by simulation. The more traditional “central” and “highest posterior density” (HPD) interval approaches were each considered to define the limits, and the multiple charts were compared via in-control and out-of-control average run lengths (ARLs), assuming that the parameter representing the underlying CI rate (proportion of cases with an event of interest) required estimation. Preliminary results have identified that the BBPP chart with HPD-based control limits provides better out-of-control run length performance than the central interval-based and BC charts. Further, the BC chart’s performance may be improved by using Bayesian parameter estimation of the underlying CI rate.
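
A minimal sketch of the underlying computation (not the authors' implementation): the Beta posterior is updated from the observed CI count, and the beta-binomial posterior predictive distribution for the next batch of cases yields HPD-style control limits. The prior parameters, batch size and false-alarm rate below are assumptions.

```python
import numpy as np
from scipy.stats import betabinom  # requires scipy >= 1.4

def bbpp_hpd_limits(successes, trials, n_future, alpha=0.0027, a0=1.0, b0=1.0):
    """HPD-style limits of the beta-binomial posterior predictive distribution for
    the number of events among the next n_future cases. a0, b0 are Beta prior
    parameters (uniform by default); alpha ~ a 3-sigma false-alarm rate."""
    a, b = a0 + successes, b0 + trials - successes        # Beta posterior
    k = np.arange(n_future + 1)
    pmf = betabinom.pmf(k, n_future, a, b)                # posterior predictive pmf
    order = np.argsort(pmf)[::-1]                         # most probable outcomes first
    keep, mass = [], 0.0
    for i in order:
        keep.append(k[i]); mass += pmf[i]
        if mass >= 1 - alpha:
            break
    return min(keep), max(keep)                           # reported as chart limits

# e.g. 12 events in 400 past cases; limits for a future batch of 100 cases
print(bbpp_hpd_limits(successes=12, trials=400, n_future=100))
```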

Keywords: Average run length, Bernoulli CUSUM chart, beta binomial posterior predictive distribution, clinical indicator, health care organization, highest posterior density interval.

7361 Correlation of Viscosity in Nanofluids using Genetic Algorithm-neural Network (GA-NN)

Authors: Hajir Karimi, Fakheri Yousefi, Mahmood Reza Rahimi

Abstract:

An accurate and efficient artificial neural network (ANN) optimized by a genetic algorithm (GA) is developed for predicting the viscosity of nanofluids. The genetic algorithm is used to optimize the neural network parameters in order to minimize the error between the predicted viscosity and the experimental values. Experimental viscosity data for two nanofluids, Al2O3-H2O and CuO-H2O, covering temperatures from 278.15 to 343.15 K and volume fractions up to 15%, were taken from the literature. The results of this study reveal that the GA-NN model outperforms conventional neural networks in predicting the viscosity of nanofluids, with mean absolute relative errors of 1.22% and 1.77% for Al2O3-H2O and CuO-H2O, respectively. Furthermore, the results of this work have been compared with other models. The findings demonstrate that the GA-NN model is an effective method for predicting the viscosity of nanofluids, with better accuracy and simplicity than the other models.
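
To make the GA-NN idea concrete, here is a minimal Python sketch in which a tiny genetic algorithm searches over two neural-network hyperparameters (hidden layer size and L2 penalty) to minimize cross-validated prediction error on synthetic viscosity-like data. The authors optimize the network parameters themselves; none of the data or settings below come from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic stand-in data: [temperature (K), volume fraction] -> relative viscosity
X = np.column_stack([rng.uniform(278.15, 343.15, 200), rng.uniform(0.0, 0.15, 200)])
y = 1 + 2.5 * X[:, 1] + 6.2 * X[:, 1] ** 2 - 0.002 * (X[:, 0] - 278.15)

def fitness(genes):
    """Negative cross-validated error of an MLP whose hyperparameters are encoded in genes."""
    hidden, alpha = int(genes[0]), 10.0 ** genes[1]
    net = MLPRegressor(hidden_layer_sizes=(hidden,), alpha=alpha, max_iter=500, random_state=0)
    return cross_val_score(net, X, y, cv=3, scoring="neg_mean_absolute_error").mean()

# Minimal real-coded GA: keep the best half, clone and mutate it each generation
pop = np.column_stack([rng.integers(2, 30, 8).astype(float), rng.uniform(-5.0, -1.0, 8)])
for _ in range(6):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-4:]]
    children = parents[rng.integers(0, 4, 4)].copy()
    children[:, 0] = np.clip(children[:, 0] + rng.normal(0, 2, 4), 2, 30)      # mutate hidden units
    children[:, 1] = np.clip(children[:, 1] + rng.normal(0, 0.3, 4), -5, -1)   # mutate log10(alpha)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(g) for g in pop])]
print("best hidden units:", int(best[0]), " best alpha:", 10.0 ** best[1])
```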

Keywords: genetic algorithm, nanofluids, neural network, viscosity

7360 Decision Support System for Hospital Selection in Emergency Medical Services: A Discrete Event Simulation Approach

Authors: D. Tedesco, G. Feletti, P. Trucco

Abstract:

The present study aims to develop a Decision Support System (DSS) to support operational decisions in Emergency Medical Service (EMS) systems regarding the assignment of medical emergency requests to Emergency Departments (EDs). This problem is called “hospital selection” and concerns the definition of policies for the selection of the ED to which patients who require further treatment are transported by ambulance. The research methodology consists of a first phase of review of the technical-scientific literature concerning DSSs that support EMS management and, in particular, the hospital selection decision. From the literature analysis, it emerged that current studies mainly focus on the EMS phases related to the ambulance service and consider a process that ends when the ambulance becomes available after completing a mission. Therefore, all ED-related issues are excluded and treated as part of a separate process. Indeed, the most studied hospital selection policy turned out to be proximity, which minimizes the travel time and frees up the ambulance in the shortest possible time. The purpose of the present study is to develop an optimization model for assigning medical emergency requests to EDs that also considers the expected time performance in the subsequent phases of the process, such as the case mix, the expected service throughput times, and the operational capacity of the different EDs in hospitals. To this end, a Discrete Event Simulation (DES) model was created to compare different hospital selection policies. The model was implemented with the AnyLogic software and validated on a realistic case. The hospital selection policy that returned the best results was the minimization of the Time To Provider (TTP), defined as the time from the beginning of the ambulance journey to the ED until the beginning of the clinical evaluation by the doctor. Finally, two approaches were compared: a static approach, based on a retrospective estimation of the TTP, and a dynamic approach, based on a predictive estimation of the TTP determined with a constantly updated Winters forecasting model. Findings reveal that minimizing the TTP is the best hospital selection policy: it significantly reduces service throughput times in the ED with a negligible increase in travel time. Furthermore, it provides an immediate view of the saturation state of the ED and takes into account the case mix present in the ED structures (i.e., the different triage codes), as different severity codes correspond to different service throughput times. Moreover, the predictive approach is more reliable for TTP estimation than the retrospective approach. These considerations can support decision-makers in introducing different hospital selection policies to enhance EMS performance.
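
The core of the dynamic policy can be sketched in a few lines: for each candidate ED, predict TTP as ambulance travel time plus a Winters-type (Holt-Winters) forecast of the current door-to-provider waiting time, then dispatch to the ED with the smallest predicted TTP. The waiting-time histories, travel times and model settings below are invented for illustration; the paper's DES model is far richer.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
# Hypothetical hourly history of ED door-to-provider waiting times (minutes), 10 days per ED
history = {ed: pd.Series(base + 10 * np.sin(np.arange(240) * 2 * np.pi / 24) + rng.normal(0, 3, 240))
           for ed, base in [("ED_A", 35.0), ("ED_B", 50.0), ("ED_C", 25.0)]}
travel_min = {"ED_A": 12.0, "ED_B": 7.0, "ED_C": 20.0}   # assumed ambulance travel times

def predicted_ttp(ed):
    """Time To Provider = travel time + one-step Holt-Winters forecast of the waiting time."""
    fit = ExponentialSmoothing(history[ed], trend="add", seasonal="add", seasonal_periods=24).fit()
    return travel_min[ed] + float(np.asarray(fit.forecast(1))[0])

ttp = {ed: round(predicted_ttp(ed), 1) for ed in travel_min}
print(ttp, "-> dispatch to", min(ttp, key=ttp.get))
```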

Keywords: Emergency medical services, hospital selection, discrete event simulation, forecast model.

7359 A New Approach to ECG Biometric Systems: A Comparative Study between LPC and WPD Systems

Authors: Justin Leo Cheang Loong, Khazaimatol S Subari, Rosli Besar, Muhammad Kamil Abdullah

Abstract:

In this paper, a novel method for a biometric system based on the ECG signal is proposed, using spectral coefficients computed through linear predictive coding (LPC). ECG biometric systems have traditionally incorporated characteristics of fiducial points of the ECG signal as the feature set. These systems have been shown to contain loopholes, and thus a non-fiducial system allows for tighter security. In the proposed system, incorporating non-fiducial features from the LPC spectrum produced segment and subject recognition rates of 99.52% and 100%, respectively. The proposed system outperformed a biometric system based on the wavelet packet decomposition (WPD) algorithm in terms of both recognition rate and computation time. This allows LPC to be used in a practical ECG biometric system that requires fast, stringent and accurate recognition.
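
A compact sketch of the non-fiducial feature extraction step: LPC coefficients computed with the autocorrelation (Levinson-Durbin) method become the feature vector for each ECG segment, and segments are then matched by distance to enrolled templates. The toy signals and the LPC order below are assumptions, not the paper's settings.

```python
import numpy as np

def lpc(signal, order):
    """LPC coefficients via the autocorrelation method (Levinson-Durbin recursion)."""
    x = np.asarray(signal, dtype=float) - np.mean(signal)
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]   # autocorrelation lags 0..order
    a = np.zeros(order + 1)
    a[0], err = 1.0, r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])       # sum_j a[j] * r[i - j]
        k = -acc / err                                    # reflection coefficient
        a[1:i + 1] = a[1:i + 1] + k * a[i - 1::-1][:i]    # update; a[i] becomes k
        err *= 1.0 - k * k
    return a[1:]                                          # prediction coefficients as features

rng = np.random.default_rng(0)
template = lpc(rng.standard_normal(500), order=12)   # enrolled subject (toy signal)
probe = lpc(rng.standard_normal(500), order=12)      # unknown segment to identify
print(np.linalg.norm(template - probe))              # smaller distance -> better match
```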

Keywords: biometric, ecg, linear predictive coding, wavelet packet decomposition

7358 A Study of Relationship between Mountaineering Participation Motivation and Risk Perception

Authors: Yen-Chieh Wen, Ching-Hui Lin

Abstract:

The main purpose of this study is to analyze mountaineers' participation motivation and risk perception, and to examine the ability of participation motivation to predict risk perception in mountaineering. A questionnaire survey was carried out, using non-random sampling, among climbers who had climbed mountains above 3,000 m in Taiwan, yielding a total of 231 valid questionnaires. After statistical analysis, the study found that: 1. The strongest participation motivation among climbers was to enjoy the beauty of nature. 2. The highest-rated dimension of climbers' risk perception was risk from the natural environment. 3. The "seeking adventure and stimulation" and "competence achievement" motivations were highly predictive of risk perception. Based on these findings, this study offers practical recommendations for the outdoor leisure industry as well as proposals for future research.

Keywords: Mountaineering, motivation, risk perception, decision-making.

7357 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis are collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-stage Liver Disease (MELD) score is used as a comparator for mortality prediction. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM combined with ensemble machine learning techniques further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

7356 An ANN-Based Predictive Model for Diagnosis and Forecasting of Hypertension

Authors: O. O. Obe, V. Balanica, E. Neagoe

Abstract:

The effects of hypertension are often lethal; thus, its early detection and prevention are very important. In this paper, a neural network (NN) model was developed and trained on a dataset of hypertension causative parameters in order to forecast the likelihood of hypertension occurring in patients. Our research goal was to analyze the potential of the presented NN to predict, over a period of time, the risk of hypertension or of developing the disease for patients who are or are not currently hypertensive. The results of the analysis for a given patient can support doctors in taking proactive measures to avert the occurrence of hypertension, such as recommendations regarding the patient's behavior in order to lower the hypertension risk. Moreover, the paper envisages a set of three example scenarios: determining the age at which the patient becomes hypertensive (i.e., the threshold hypertensive age); analyzing what happens if the threshold hypertensive age is set to a certain age while the weight of the patient is varied; and setting the ideal weight for the patient and analyzing what happens to the threshold hypertensive age.
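
A minimal sketch of the scenario idea on synthetic data: train a small neural network on made-up causative parameters (age, weight, smoking), then sweep age to locate the threshold hypertensive age for a given patient profile. Nothing below reflects the paper's dataset or trained model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Synthetic stand-in for causative parameters: [age (years), weight (kg), smoker (0/1)]
X = np.column_stack([rng.uniform(20, 80, 500), rng.uniform(50, 120, 500), rng.integers(0, 2, 500)])
risk = 0.03 * X[:, 0] + 0.02 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.3, 500)
y = (risk > np.median(risk)).astype(int)                 # 1 = hypertensive

net = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
net.fit(X, y)

def threshold_hypertensive_age(weight, smoker, ages=np.arange(20, 81)):
    """Scenario analysis: earliest age at which the predicted hypertension risk crosses 50%."""
    grid = np.column_stack([ages, np.full(ages.shape, weight, dtype=float),
                            np.full(ages.shape, smoker, dtype=float)])
    probs = net.predict_proba(grid)[:, 1]
    hits = ages[probs >= 0.5]
    return int(hits[0]) if hits.size else None

print(threshold_hypertensive_age(weight=95, smoker=1))   # heavier smoker: earlier threshold
print(threshold_hypertensive_age(weight=70, smoker=0))   # lighter non-smoker: later threshold
```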

Keywords: Neural Network, hypertension, data set, training set, supervised learning.

7355 Aggregate Angularity on the Permanent Deformation Zones of Hot Mix Asphalt

Authors: Lee P. Leon, Raymond Charles

Abstract:

This paper presents a method of evaluating the effect of aggregate angularity on hot mix asphalt (HMA) properties and its relationship to permanent deformation resistance. The research concluded that aggregate particle angularity had a significant effect on permanent deformation performance, and that an increase in coarse aggregate angularity increased the resistance of mixes to permanent deformation. A comparison between the measured data and the predictions of permanent deformation models showed the limits of existing prediction models. The numerical analysis described the permanent deformation zones and concluded that angularity has an effect on the onset of these zones. Prediction of permanent deformation helps road agencies, and by extension economists and engineers, determine the best approach for maintenance, rehabilitation, and new construction works of the road infrastructure.

Keywords: Aggregate angularity, asphalt concrete, permanent deformation, rutting prediction.

7354 Improved Computational Efficiency of Machine Learning Algorithms Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK

Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick

Abstract:

The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerably high. The purpose of this research is to develop a predictive machine learning (ML) archetype that can forecast COVID-19 cases within the UK. This study concentrates on the statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID-19 cases registered, new cases encountered on a daily basis, total deaths registered, and deaths per day due to Coronavirus was collected from the World Health Organization (WHO). Data preprocessing is carried out to identify any missing values, outliers, or anomalies in the dataset. The data are split in an 8:2 ratio for training and testing purposes to forecast future new COVID-19 cases. Support Vector Machine (SVM), Random Forest (RF), and Linear Regression (LR) algorithms are chosen to study model performance in the prediction of new COVID-19 cases. The statistical performance of the models in predicting new COVID-19 cases is evaluated using metrics such as the R-squared value and the mean squared error. RF outperformed the other two ML algorithms with a training accuracy of 99.47% and a testing accuracy of 98.26% when n = 30. The mean squared error obtained for RF is 4.05e11, which is lower than that of the other predictive models used in this study. From the experimental analysis, the RF algorithm performs more effectively and efficiently in predicting new COVID-19 cases, which could help the health sector take relevant control measures against the spread of the virus.
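
A hedged sketch of the modelling pipeline described above, using a synthetic daily-case series in place of the WHO data: an 8:2 chronological split, the three algorithms, and R-squared/MSE evaluation. Interpreting n = 30 as the number of Random Forest trees is an assumption, and feature/target scaling is omitted for brevity.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
# Synthetic daily-case series standing in for the WHO UK data used in the paper
days = np.arange(425)
cases = 2000 + 1500 * np.sin(days / 30) + days * 20 + rng.normal(0, 300, days.size)
X, y = days.reshape(-1, 1), cases

# 8:2 split, without shuffling so the test set is the most recent period
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)

models = {"LR": LinearRegression(),
          "SVM": SVR(kernel="rbf", C=100.0),
          "RF": RandomForestRegressor(n_estimators=30, random_state=0)}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(name, "R2:", round(r2_score(y_te, pred), 3),
          "MSE:", round(mean_squared_error(y_te, pred), 1))
```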

Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest.

7353 Predictive Maintenance of Industrial Shredders: Efficient Operation through Real-Time Monitoring Using Statistical Machine Learning

Authors: Federico Pittino, Dominik Holzmann, Krithika Sayar-Chand, Stefan Moser, Sebastian Pliessnig, Thomas Arnold

Abstract:

The shredding of waste materials is a key step in the recycling process towards a circular economy. Industrial shredders for waste processing operate in very harsh conditions, leading to the need for frequent maintenance of critical components. Maintenance optimization is also particularly important to increase the machine's efficiency, thereby reducing operational costs. In this work, a monitoring system has been developed and deployed on an industrial shredder located at a waste recycling plant in Austria. The machine has been monitored for several months, and methods for predictive maintenance have been developed for two key components: the cutting knives and the drive belt. The large amount of collected data is leveraged by statistical machine learning techniques, which do not require very detailed knowledge of the machine or its live operating conditions. The results show that, despite the wide range of operating conditions, a reliable estimate of the optimal time for maintenance can be derived. Moreover, the trade-off between the cost of maintenance and the increase in power consumption due to the wear state of the monitored components is investigated. This work demonstrates the benefits of a real-time monitoring system for the efficient operation of industrial shredders.

Keywords: predictive maintenance, circular economy, industrial shredder, cost optimization, statistical machine learning

7352 Assessing Habitat-Suitability Models with a Virtual Species at Khao Nan National Park, Thailand

Authors: W. Srisang, K. Jaroensutasinee, M. Jaroensutasinee

Abstract:

This study examined a habitat-suitability assessment method, namely the Ecological Niche Factor Analysis (ENFA). A virtual species was created and then dispatched in a geographic information system model of a real landscape under three historic scenarios: (1) spreading, (2) equilibrium, and (3) overabundance. In each scenario, the virtual species was sampled and these simulated data sets were used as inputs for the ENFA to reconstruct the habitat suitability model. The 'equilibrium' scenario gave the highest quantity and quality among the three scenarios. ENFA was sensitive to the distribution scenarios but not to sample size. The use of a virtual species proved to be a very efficient method, allowing one to fully control the quality of the input data as well as to accurately evaluate the predictive power of the analyses.

Keywords: Habitat-Suitability Models, Ecological niche factor analysis, Climatic factors, Geographic information system.

7351 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed for customers to find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) who are in need of a data-based and analytical approach to stock proper movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers’ demographic, behavioral and social information to predict their movie genre preference. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from sample data, and their in-sample test errors were compared. Out-of-sample errors were also compared under different Vapnik–Chervonenkis (VC) dimensions in the machine learning algorithm to find and prevent overfitting. The Gaussian kernel SVM prediction model correctly predicts movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers’ preferences with a small data set and to design prediction tools for these enterprises.
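
A minimal sketch of the comparison described above, on synthetic data standing in for the customers' demographic, behavioral and social attributes: a Gaussian-kernel SVM versus logistic regression, with in-sample and out-of-sample accuracy reported for each.

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic customer features and a binary genre-preference label
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)        # Gaussian kernel SVM
logreg = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("SVM in/out-of-sample accuracy:", svm.score(X_tr, y_tr), svm.score(X_te, y_te))
print("LR  in/out-of-sample accuracy:", logreg.score(X_tr, y_tr), logreg.score(X_te, y_te))
```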

Keywords: Computational social science, movie preference, machine learning, SVM.

7350 A Survey of Model Comparison Strategies and Techniques in Model Driven Engineering

Authors: Junaid Rashid, Waqar Mehmood, Muhammad Wasif Nisar

Abstract:

This survey paper presents the recent state of model comparison as it applies to Model Driven Engineering. In Model Driven Engineering, calculating the difference between models is a very important and challenging task. Model differencing involves a number of tasks, starting with identifying and matching the elements of the models. In this paper, we discuss how model matching is accomplished, the strategies and techniques used, and the types of models involved. We also discuss future directions. We found that many of the latest model comparison strategies are geared towards enabling metamodel-based and similarity-based matching, and that model versioning is the most dominant application of model comparison. Recently, work on comparison for versioning has begun to decline, giving way to other applications. Finally, there is wide variation among the tools in the amount of user effort needed to perform model comparisons, as some require more effort to achieve greater generality and expressive power.

Keywords: Model comparison, model clone detection, model versioning, EMF Model, model diff.

7349 Modeling of Compaction Curves for Corn Cob Ash-Cement Stabilized Lateritic Soils

Authors: O. A. Apampa, Y. A. Jimoh, K. A. Olonade

Abstract:

The need to save time and cost of soil testing at the planning stage of road work has necessitated the development of predictive models. This study proposes a model for predicting the dry density of lateritic soils stabilized with corn cob ash (CCA) and with blended cement-CCA. Lateritic soil was first stabilized with CCA at 1.5, 3.0, 4.5 and 6% of the weight of soil, and then stabilized with the same proportions of CCA as a replacement for cement. Dry density, specific gravity, maximum degree of saturation and moisture content were determined for each stabilized soil specimen, following standard procedures. Polynomial equations containing alpha and beta parameters for CCA and blended CCA-cement were developed. Experimental values were correlated with the values predicted from the Matlab curve fitting tool and the Solver function of Microsoft Excel 2010. A correlation coefficient (R2) of 0.86 was obtained, indicating that the model can be accepted for predicting the maximum dry density of CCA-stabilized soils to facilitate quick decision making in roadworks.
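
As a sketch of the curve-fitting step, the snippet below fits a quadratic compaction curve to hypothetical moisture content / dry density pairs and computes R2 in the same way one would compare predicted and experimental values; the paper's actual model uses alpha and beta parameters tied to the CCA content, which are not reproduced here.

```python
import numpy as np

# Hypothetical (moisture content %, dry density Mg/m3) pairs for one stabilized soil mix
w = np.array([8.0, 10.0, 12.0, 14.0, 16.0, 18.0])
rho = np.array([1.78, 1.85, 1.91, 1.93, 1.89, 1.82])

coeffs = np.polyfit(w, rho, deg=2)          # quadratic compaction curve
rho_hat = np.polyval(coeffs, w)

ss_res = np.sum((rho - rho_hat) ** 2)
ss_tot = np.sum((rho - np.mean(rho)) ** 2)
r2 = 1 - ss_res / ss_tot                    # goodness of fit, cf. the reported R2 = 0.86
print(coeffs, round(r2, 3))
```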

Keywords: Corn cob ash, lateritic soil, stabilization, maximum dry density, moisture content.

7348 The Entrepreneur's General Personality Traits and Technological Developments

Authors: Bostjan Antoncic

Abstract:

Technological newness and innovativeness are important aspects of small firm development, growth and wealth creation. The contribution of this study to entrepreneurship personality research and to technology-related research in entrepreneurship is the development and empirical testing of a model of general-personality-driven technological development. Hypotheses relating the big five personality factors (OCEAN: openness, conscientiousness, extraversion, agreeableness, and neuroticism) to technological developments were tested using multiple regression analysis on survey data from a sample of 160 entrepreneurs from Slovenia. The model reveals two personality factors that are predictive of technological developments: openness (positive impact) and neuroticism (negative impact). In addition, a positive impact of firm age on technological developments was found. The other personality factors (conscientiousness, extraversion and agreeableness) of entrepreneurs may not be considered important for their firms' technological developments.
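
For readers unfamiliar with the analysis, a multiple regression of technological development on the big five factors plus firm age can be sketched as below; the data are simulated with the reported signs (positive openness and firm age, negative neuroticism) purely for illustration and carry no relation to the survey sample.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 160  # sample size reported in the abstract
big5 = pd.DataFrame(rng.normal(size=(n, 5)),
                    columns=["openness", "conscientiousness", "extraversion",
                             "agreeableness", "neuroticism"])
big5["firm_age"] = rng.integers(1, 40, n)
# Simulated outcome wired to mimic the reported effects: +openness, -neuroticism, +firm age
tech_dev = (0.5 * big5["openness"] - 0.4 * big5["neuroticism"]
            + 0.03 * big5["firm_age"] + rng.normal(0, 1, n))

model = sm.OLS(tech_dev, sm.add_constant(big5)).fit()
print(model.summary())   # coefficient signs and p-values for each predictor
```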

Keywords: Big five factors, entrepreneur, personality, technology development.

7347 Tools for Analysis and Optimization of Standalone Green Microgrids

Authors: William Anderson, Kyle Kobold, Oleg Yakimenko

Abstract:

Green microgrids, which use mostly renewable energy (RE) for generation, are complex systems with inherent nonlinear dynamics. Among a variety of different optimization tools, there are only a few that adequately consider this complexity. This paper evaluates the applicability of two somewhat similar optimization tools tailored for standalone RE microgrids and also assesses a machine learning tool for performance prediction that can enhance the reliability of any chosen optimization tool. It shows that one of these microgrid optimization tools has certain advantages over the other and presents a detailed routine for preparing input data to simulate RE microgrid behavior. The paper also shows how neural-network-based predictive modeling can be used to validate and forecast solar power generation based on weather time series data, which improves the overall quality of standalone RE microgrid analysis.
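
A minimal sketch of the forecasting element described above: a small neural network maps synthetic weather features to a toy PV output series and is evaluated on a held-out (most recent) period. All data and network settings below are invented for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 2000
# Synthetic hourly weather series: [irradiance proxy, temperature, cloud cover]
irr = np.clip(np.sin(np.arange(n) * 2 * np.pi / 24), 0, None)
temp = 15 + 10 * irr + rng.normal(0, 1, n)
cloud = rng.uniform(0, 1, n)
X = np.column_stack([irr, temp, cloud])
solar_kw = 50 * irr * (1 - 0.7 * cloud) + rng.normal(0, 1, n)   # toy PV output

X_tr, X_te, y_tr, y_te = train_test_split(X, solar_kw, test_size=0.25, shuffle=False)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0))
model.fit(X_tr, y_tr)
print("R2 on held-out period:", round(model.score(X_te, y_te), 3))
```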

Keywords: Microgrid, renewable energy, complex systems, optimization, predictive modeling, neural networks.

7346 Concepts Extraction from Discharge Notes using Association Rule Mining

Authors: Basak Oguz Yolcular

Abstract:

A large amount of valuable information is available in plain-text clinical reports, and new techniques and technologies are being applied to extract information from these reports. In this study, we developed a domain-based software system to transform 600 otorhinolaryngology discharge notes into a structured form for extracting clinical data from the discharge notes. In order to decrease the system processing time, the discharge notes were transformed into a data table after preprocessing. Several word lists were constructed to identify common sections in the discharge notes, including patient history, age, problems, diagnosis, etc. An n-gram method was used for discovering term co-occurrences within each section. Using this method, a dataset of concept candidates was generated for the validation step, and then the Predictive Apriori algorithm for Association Rule Mining (ARM) was applied to validate the candidate concepts.
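
A minimal sketch of the two steps named above on a toy corpus: bigrams are counted within a section to propose candidate concepts, and simple support/confidence association rules are then derived. The Predictive Apriori algorithm used in the study ranks rules by expected predictive accuracy instead, so this is only a simplified stand-in on invented text.

```python
from collections import Counter
from itertools import combinations

# Toy "diagnosis" sections from hypothetical discharge notes
sections = ["chronic otitis media with effusion", "acute otitis media", "nasal septum deviation",
            "chronic otitis media", "deviation of nasal septum with hypertrophy"]

# Step 1: n-gram (here bigram) counting to propose candidate concepts
bigrams = Counter()
for text in sections:
    tokens = text.split()
    bigrams.update(zip(tokens, tokens[1:]))
candidates = [" ".join(bg) for bg, count in bigrams.items() if count >= 2]
print("candidate concepts:", candidates)             # e.g. "otitis media", "nasal septum"

# Step 2: simple association rules (support / confidence) between tokens per section
token_sets = [set(t.split()) for t in sections]
n = len(token_sets)
vocabulary = {w for s in token_sets for w in s}
for a, b in combinations(sorted(vocabulary), 2):
    both = sum(1 for s in token_sets if a in s and b in s)
    has_a = sum(1 for s in token_sets if a in s)
    if both / n >= 0.4 and has_a:                     # minimum support of 40%
        print(f"{a} -> {b}: support={both / n:.2f}, confidence={both / has_a:.2f}")
```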

Keywords: association rule mining, otorhinolaryngology, predictive apriori, text mining

7345 Design of a Stable GPC for Nonminimum Phase LTI Systems

Authors: Mahdi Yaghobi, Mohammad Haeri

Abstract:

The current methods of predictive control are utilized for those processes in which the rate of output variation is not high. For such processes, therefore, stability can be achieved by implementing a constrained predictive controller or applying an infinite prediction horizon. When the rate of output growth is high (e.g., for an unstable nonminimum phase process), stabilization becomes problematic. In order to avoid this, it is suggested to change the method so that, first, the growth of the prediction error is decreased at the early stage of the prediction horizon, and second, the rate of the error variation is penalized. The growth of the error is decreased by adjusting its weighting coefficients in the cost function. Reduction in the error variation is achieved by adding the first-order derivative of the error to the cost function. By studying different examples, it is shown that by using these two remedies together, closed-loop stability of an unstable nonminimum phase process can be achieved.
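
In standard GPC notation, the modification described above can be written as a cost function of the following form (a sketch in our own notation, with gamma(j) denoting the weights on the error-difference term; the exact weighting sequences used in the paper are not reproduced here):

```latex
J = \sum_{j=N_1}^{N_2} \delta(j)\,\big[\hat{y}(t+j\mid t) - w(t+j)\big]^2
  + \sum_{j=N_1+1}^{N_2} \gamma(j)\,\big[e(t+j) - e(t+j-1)\big]^2
  + \sum_{j=1}^{N_u} \lambda(j)\,\big[\Delta u(t+j-1)\big]^2,
\qquad e(t+j) = \hat{y}(t+j\mid t) - w(t+j),
```

where delta(j) is chosen larger over the early part of the prediction horizon to suppress early error growth, and the gamma(j) term penalizes the first-order difference (rate of variation) of the predicted error.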

Keywords: GPC, Stability, Varying Weighting Coefficients.

7344 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin D. Leiby, Darryl K. Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the search to impute missing values well over a common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight toward choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The methodology evaluation includes observing computation time, model fit, and the comparison of known values to replaced values created through imputation. Overall, the country conflict dataset illustrates promise with modeling first-order interactions, while presenting a need for further refinement that mimics predictive mean matching.
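
The core of stochastic regression imputation with residual-based uncertainty can be sketched as follows: regress the column with missing entries on correlated complete columns, then fill each gap with the model prediction plus a residual drawn from the fitted model's residuals. The data and the linear form below are illustrative assumptions; the study also considers nonlinear (first-order interaction) terms and tailorable tolerances.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 1000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 2.0 * x1 - 1.0 * x2 + rng.normal(0, 0.5, n)          # column to be imputed
mask = rng.random(n) < 0.3                                # ~30% missingness (above 20%)
y_obs = np.where(mask, np.nan, y)

X = np.column_stack([x1, x2])
obs = ~np.isnan(y_obs)
model = LinearRegression().fit(X[obs], y_obs[obs])
residuals = y_obs[obs] - model.predict(X[obs])            # empirical uncertainty source

# Stochastic imputation: prediction + a randomly drawn residual for each missing cell
imputed = y_obs.copy()
imputed[~obs] = model.predict(X[~obs]) + rng.choice(residuals, size=(~obs).sum())

print("RMSE of imputed vs. true values:",
      round(float(np.sqrt(np.mean((imputed[~obs] - y[~obs]) ** 2))), 3))
```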

Keywords: Correlation, country conflict, imputation, stochastic regression.

7343 A Comparison of Artificial Neural Networks for Prediction of Suspended Sediment Discharge in River: A Case Study in Malaysia

Authors: M.R. Mustafa, M.H. Isa, R.B. Rezaur

Abstract:

Prediction of the highly nonlinear behavior of suspended sediment flow in rivers is of prime importance in the field of water resources engineering. In this study, the predictive performance of two Artificial Neural Networks (ANNs), namely the Radial Basis Function (RBF) network and the Multi Layer Feed Forward (MLFF) network, has been compared. Time series data of daily suspended sediment discharge and water discharge at the Pari River were used for training and testing the networks. A number of statistical parameters, i.e. root mean square error (RMSE), mean absolute error (MAE), coefficient of efficiency (CE) and coefficient of determination (R2), were used for performance evaluation of the models. Both models produced satisfactory results and showed a good agreement between the predicted and observed data. The RBF network model provided slightly better results than the MLFF network model in predicting suspended sediment discharge.
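
The four evaluation statistics named above are easy to state explicitly; a small sketch (with made-up observed/predicted series) is given below, where CE is the Nash-Sutcliffe coefficient of efficiency and R2 is taken as the squared correlation coefficient.

```python
import numpy as np

def rmse(obs, pred):
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def mae(obs, pred):
    return float(np.mean(np.abs(obs - pred)))

def ce(obs, pred):
    """Nash-Sutcliffe coefficient of efficiency."""
    return float(1 - np.sum((obs - pred) ** 2) / np.sum((obs - np.mean(obs)) ** 2))

def r2(obs, pred):
    """Squared Pearson correlation (coefficient of determination)."""
    return float(np.corrcoef(obs, pred)[0, 1] ** 2)

# Made-up daily suspended sediment discharges (observed vs. model output)
obs = np.array([120.0, 150.0, 90.0, 300.0, 210.0, 180.0])
pred = np.array([110.0, 160.0, 100.0, 280.0, 230.0, 170.0])
print(rmse(obs, pred), mae(obs, pred), round(ce(obs, pred), 3), round(r2(obs, pred), 3))
```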

Keywords: ANN, discharge, modeling, prediction, suspended sediment

7342 COVID_ICU_BERT: A Fine-tuned Language Model for COVID-19 Intensive Care Unit Clinical Notes

Authors: Shahad Nagoor, Lucy Hederman, Kevin Koidl, Annalina Caputo

Abstract:

Doctors’ notes reflect their impressions, attitudes, clinical sense, and opinions about patients’ conditions and progress, and other information that is essential for doctors’ daily clinical decisions. Despite their value, clinical notes are insufficiently researched within the language processing community. Automatically extracting information from unstructured text data is known to be a difficult task as opposed to dealing with structured information such as physiological vital signs, images and laboratory results. The aim of this research is to investigate how Natural Language Processing (NLP) techniques and machine learning techniques applied to clinician notes can assist in doctors’ decision making in Intensive Care Unit (ICU) for coronavirus disease 2019 (COVID-19) patients. The hypothesis is that clinical outcomes like survival or mortality can be useful to influence the judgement of clinical sentiment in ICU clinical notes. This paper presents two contributions: first, we introduce COVID_ICU_BERT, a fine-tuned version of a clinical transformer model that can reliably predict clinical sentiment for notes of COVID patients in ICU. We train the model on clinical notes for COVID-19 patients, ones not previously seen by Bio_ClinicalBERT or Bio_Discharge_Summary_BERT. The model which was based on Bio_ClinicalBERT achieves higher predictive accuracy than the one based on Bio_Discharge_Summary_BERT (Acc 93.33%, AUC 0.98, and Precision 0.96). Second, we perform data augmentation using clinical contextual word embedding that is based on a pre-trained clinical model to balance the samples in each class in the data (survived vs. deceased patients). Data augmentation improves the accuracy of prediction slightly (Acc 96.67%, AUC 0.98, and Precision 0.92).
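
As an illustration of the fine-tuning step described above, the sketch below fine-tunes a clinical BERT checkpoint for binary outcome classification with the Hugging Face Trainer. The two toy notes, the label convention and all training settings are placeholders; the checkpoint name is the publicly published Bio_ClinicalBERT identifier, not the authors' fine-tuned COVID_ICU_BERT model.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Toy stand-ins for ICU notes labelled by outcome (1 = deceased, 0 = survived)
data = Dataset.from_dict({"text": ["pt stable on low-flow oxygen, improving",
                                   "worsening hypoxia despite ventilation"],
                          "label": [0, 1]})

name = "emilyalsentzer/Bio_ClinicalBERT"        # base checkpoint assumed here
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

def encode(batch):
    return tok(batch["text"], truncation=True, padding="max_length", max_length=128)

data = data.map(encode, batched=True)

trainer = Trainer(model=model,
                  args=TrainingArguments(output_dir="covid_icu_bert", num_train_epochs=1,
                                         per_device_train_batch_size=2),
                  train_dataset=data)
trainer.train()
```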

Keywords: BERT fine-tuning, clinical sentiment, COVID-19, data augmentation.

7341 Ensemble Approach for Predicting Student's Academic Performance

Authors: L. A. Muhammad, M. S. Argungu

Abstract:

Educational data mining (EDM) has received substantial attention. Data mining techniques have been proposed, in one way or another, to uncover hidden knowledge in educational data. The results of such studies assist academic institutions in further enhancing their learning processes and methods of passing knowledge to students. Consequently, student performance is boosted and educational products are undoubtedly enhanced. This study adopted a student performance prediction model premised on data mining techniques with Students' Essential Features (SEF). SEF are linked to the learner's interactivity with the e-learning management system. The performance of the student predictive model is assessed by a set of classifiers, viz. Bayes Network, Logistic Regression, and Reduced Error Pruning (REP) Tree. Ensemble methods of Bagging, Boosting, and Random Forest (RF) are then applied to improve the performance of these single classifiers. The study reveals a robust affinity between learners' behaviors and their academic attainment. The results show that the REP Tree and its ensembles record the highest accuracy of 83.33% using SEF. In terms of the Receiver Operating Characteristic (ROC) curve, the boosting method applied to the REP Tree records 0.903, which is the best. This result further demonstrates the dependability of the proposed model.
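
The comparison of a single classifier against its bagged, boosted and Random Forest counterparts can be sketched as below. REP Tree is a Weka learner; a scikit-learn decision tree is used here as a rough stand-in, and the synthetic features are only placeholders for SEF.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for Students' Essential Features (SEF)
X, y = make_classification(n_samples=600, n_features=12, n_informative=6, random_state=0)

tree = DecisionTreeClassifier(max_depth=5, random_state=0)          # stand-in for REP Tree
models = {"single tree": tree,
          "bagging": BaggingClassifier(tree, n_estimators=50, random_state=0),
          "boosting": AdaBoostClassifier(tree, n_estimators=50, random_state=0),
          "random forest": RandomForestClassifier(n_estimators=50, random_state=0)}

for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name:13s} accuracy={acc:.3f} ROC-AUC={auc:.3f}")
```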

Keywords: Ensemble, bagging, Random Forest, boosting, data mining, classifiers, machine learning.

7340 Distributed Detection and Optimal Traffic-blocking of Network Worms

Authors: Zoran Nikoloski, Narsingh Deo, Ludek Kucera

Abstract:

Despite the recent surge of research on the control of worm propagation, there is currently no effective defense system against such cyber attacks. We first design a distributed detection architecture called Detection via Distributed Blackholes (DDBH). Our novel detection mechanism can be implemented via virtual honeypots or honeynets. Simulation results show that a worm can be detected with virtual honeypots on only 3% of the nodes. Moreover, the worm is detected when less than 1.5% of the nodes are infected. We then develop two control strategies: (1) optimal dynamic traffic-blocking, for which we determine the condition that guarantees a minimum number of removed nodes when the worm is contained, and (2) predictive dynamic traffic-blocking, a realistic deployment of the optimal strategy on scale-free graphs. The predictive dynamic traffic-blocking, coupled with the DDBH, ensures that more than 40% of the network is unaffected by the propagation at the time the worm is contained.

Keywords: Network worms, distributed detection, optimal traffic-blocking, individual-based simulation.

7339 The Design of a Vehicle Traffic Flow Prediction Model for a Gauteng Freeway Based on an Ensemble of Multi-Layer Perceptron

Authors: Tebogo Emma Makaba, Barnabas Ndlovu Gatsheni

Abstract:

The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway, which connects these two cities, can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and on commuters’ quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model which will predict the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used were collected by Mikro’s Traffic Monitoring (MTM). A Multi-Layer Perceptron (MLP) was used individually to construct the model, and the MLP was also combined with the Bagging ensemble method to train on the data. The cross-validation method was used for evaluating the models. The results obtained from the techniques were compared using predictive performance and prediction costs. The cost was computed using a combination of the loss matrix and the confusion matrix. The prediction models designed show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume and day of month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families. The logistics industry will save more than twice what it is currently spending.
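
A hedged sketch of the evaluation described above: an MLP and a bagged ensemble of MLPs are compared with cross-validated predictions, and a prediction cost is obtained by combining an assumed loss matrix with the confusion matrix. The features, class definitions and loss values are invented for illustration and do not come from the MTM data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for [travel time, average speed, traffic volume, day of month]
X, y = make_classification(n_samples=1000, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)   # y: 0 = free flow, 1 = congested

mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0))
bagged = BaggingClassifier(mlp, n_estimators=10, random_state=0)

loss = np.array([[0, 1],    # cost of predicting congestion when traffic is free-flowing
                 [5, 0]])   # higher (assumed) cost of missing real congestion

for name, model in [("MLP", mlp), ("bagged MLP", bagged)]:
    pred = cross_val_predict(model, X, y, cv=5)              # cross-validated predictions
    cm = confusion_matrix(y, pred)
    print(name, "accuracy:", round((y == pred).mean(), 3),
          "total cost:", int((cm * loss).sum()))             # loss matrix x confusion matrix
```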

Keywords: Bagging ensemble methods, confusion matrix, multi-layer perceptron, vehicle traffic flow.

7338 What the Future Holds for Social Media Data Analysis

Authors: P. Wlodarczak, J. Soar, M. Ally

Abstract:

The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provides access to an unprecedented amount of user data. Users may post reviews of products and services they have bought, write about their interests, share ideas or give their opinions and views on political issues. There is growing interest from organisations in the analysis of SM data for detecting new trends, obtaining user opinions on their products and services, or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often, indicators of historic SM data are represented as time series and correlated with a variety of real-world phenomena such as the outcome of elections, the development of financial indicators, box office revenue and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.

Keywords: Social Media, text mining, knowledge discovery, predictive analysis, machine learning.

7337 Influence of the Granular Mixture Properties on the Rheological Properties of Concrete: Yield Stress Determination Using Modified Chateau et al. Model

Authors: Rachid Zentar, Mokrane Bala, Pascal Boustingorry

Abstract:

The prediction of the rheological behavior of concrete is at the center of current concerns of the concrete industry, for different reasons. The shortage of good quality standard materials, combined with the variable properties of available materials, makes it necessary to improve existing models to take these variations into account at the design stage of concrete. The main reasons for improving the predictive models are, of course, to save time and cost at the design stage as well as to optimize concrete performance. In this study, we highlight the different properties of granular mixtures that affect the rheological properties of concrete. Our objective is to identify the intrinsic parameters of the aggregates which make it possible to predict the yield stress of concrete. The work was done using two typologies of grains: crushed and rolled aggregates. The experimental results have shown that the rheology of concrete is improved by increasing the packing density of the granular mixture using rolled aggregates. The experimental program allowed the yield stress of concrete to be modeled by a modified Chateau et al. model through a dimensionless parameter following the Krieger-Dougherty law. The modelling confirms that the yield stress of concrete depends not only on the properties of the cement paste but also on the packing density of the granular skeleton and the shape of the grains.
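
For reference, the homogenization relation usually attributed to Chateau et al., with the Krieger-Dougherty law supplying the dimensionless packing term, takes the form below. This is our sketch of the commonly published form, not the modified model of the paper, which adjusts this expression and identifies the intrinsic viscosity and packing density from the granular mixture properties.

```latex
\frac{\tau_c(\phi)}{\tau_c(0)} = \sqrt{(1-\phi)\,\left(1-\frac{\phi}{\phi_{\max}}\right)^{-[\eta]\,\phi_{\max}}},
\qquad [\eta] = 2.5 \text{ for spherical grains (Krieger-Dougherty)},
```

where tau_c(0) is the yield stress of the cement paste, phi the solid volume fraction of aggregates, and phi_max the packing density of the granular skeleton.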

Keywords: Crushed aggregates, intrinsic viscosity, packing density, rolled aggregates, slump, yield stress of concrete.

7336 Concurrent Approach to Data Parallel Model using Java

Authors: Bala Dhandayuthapani Veerasamy

Abstract:

Parallel programming models exist as an abstraction of hardware and memory architectures. Several parallel programming models are in common use: the shared memory model, the thread model, the message passing model, the data parallel model, the hybrid model, Flynn's models, the embarrassingly parallel computations model, and the pipelined computations model. These models are not specific to a particular type of machine or memory architecture. This paper presents a model program for a concurrent approach to the data parallel model using Java programming.

Keywords: Concurrent, Data Parallel, JDK, Parallel, Thread
