Search results for: predicting models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7245

6465 Parametric Study for Obtaining the Structural Response of Segmental Tunnels in Soft Soil by Using Non-Linear Numerical Models

Authors: Arturo Galván, Jatziri Y. Moreno-Martínez, Israel Enrique Herrera Díaz, José Ramón Gasca Tirado

Abstract:

In recent years, one of the most widely used methods for constructing tunnels in soft soil has been shield-driven tunneling. The advantage of this construction technique is that it allows the tunnel to be excavated while a primary lining, consisting of precast segments, is placed at the same time. There are joints between segments (longitudinal joints) and joints between rings (circumferential joints), which is why this type of construction cannot be considered a continuous structure. These joints influence the rigidity of the segmental lining and therefore its structural response. A parametric study was performed to account for the effect of different parameters on the structural response of typical segmental tunnels built in soft soil, using non-linear numerical models based on the Finite Element Method and implemented in the software package ANSYS v. 11.0. In the first part of this study, two types of numerical models were developed. In the first, the segments were modeled with beam elements based on Timoshenko beam theory, while the segment joints were modeled with inelastic rotational springs following the constitutive moment-rotation relation proposed by Gladwell; in this way, the mechanical behavior of the longitudinal joints was simulated. The mechanical behavior of the circumferential joints was represented with elastic springs, and the support provided by the soil was modeled with linear-elastic springs. In the second type of model, the segments were modeled with three-dimensional solid elements and the joints with contact elements. In these models the joint zone is modeled as a discontinuity (increasing the computational effort), so a discrete model is obtained. The contact elements simulate the mechanical behavior of the joints: when a joint is closed, compressive and shear stresses are transmitted but tensile stresses are not, and when a joint is open, no stresses are transmitted. These models can also detect changes in geometry due to the relative movement of the elements that form the joints. The numerical results of the two types of models were compared, and in this way the hypotheses adopted in the simplified models were validated. In addition, the numerical models were calibrated with laboratory experimental results from the literature for a typical tunnel built in Europe. In the second part of this work, a parametric study was performed with the simplified models, since they require less computational effort than the complex models. The parametric study examined the effect of material properties, the geometry of the tunnel, the arrangement of the longitudinal joints and the coupling of the rings. It was concluded that the mechanical behavior of segment and ring joints and the arrangement of the segment joints affect the global behavior of the lining, and that the coupling between rings modifies the structural capacity of the lining.

Keywords: numerical models, parametric study, segmental tunnels, structural response

Procedia PDF Downloads 213
6464 Bridging the Gap between M and E, and KM: Towards the Integration of Evidence-Based Information and Policy Decision-Making

Authors: Xueqing Ivy Chen, Christo De Coning

Abstract:

It is clear from practice that a gap exists between Result-Based Monitoring and Evaluation (RBME) as a discipline on the one hand, and Knowledge Management (KM) on the other. Whereas various government departments have institutionalised these functions, KM and M&E have, in practice, functioned in isolation from each other in the public sector. It is therefore necessary to explore the relationship between KM and M&E and the necessity for integration, so that a convergence of these disciplines can be established. An integration of KM and M&E will lead to the integration and improvement of evidence-based information and policy decision-making. M&E and KM process models are available, but the complementarity between specific process steps of these models is not exploited. A need exists to clarify the relationships between these functions in order to ensure evidence-based information and policy decision-making. This paper departs from well-known policy process models, such as the generic model, and considers recent work on the interface between policy, M&E and KM.

Keywords: result-based monitoring and evaluation, RBME, knowledge management, KM, evidence-based decision making, public policy, information systems, institutional arrangement

Procedia PDF Downloads 132
6463 Prediction of Mechanical Strength of Multiscale Hybrid Reinforced Cementitious Composite

Authors: Salam Alrekabi, A. B. Cundy, Mohammed Haloob Al-Majidi

Abstract:

Novel multiscale hybrid reinforced cementitious composites based on carbon nanotubes (MHRCC-CNT) and carbon nanofibers (MHRCC-CNF) are new types of cement-based material fabricated with micro steel fibers and nanofilaments, featuring superior strain hardening, ductility, and energy absorption. This study focused on establishing models to predict the compressive strength and the direct and splitting tensile strengths of the produced cementitious composites. The analysis was carried out based on the experimental data presented in the authors' previous study, regression analysis, and established models available in the literature. The obtained models showed small differences between predictions and target values, and experimental verification indicated that the mechanical properties could be estimated with good accuracy.
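
To make the regression-style prediction concrete, the sketch below fits a multiple linear regression of compressive strength against two hypothetical reinforcement dosages. The dosages, strengths and feature choice are invented placeholders, not the authors' experimental data or model.

```python
# Minimal sketch: regression-based strength prediction (illustrative only).
# The feature values and strengths below are hypothetical, not the authors' measurements.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical mix designs: [CNT wt%, steel fiber vol%] and measured compressive strength (MPa)
X = np.array([[0.00, 1.0], [0.05, 1.0], [0.10, 1.5], [0.20, 1.5], [0.30, 2.0], [0.40, 2.0]])
y = np.array([48.0, 52.5, 58.0, 61.0, 66.5, 68.0])

model = LinearRegression().fit(X, y)
pred = model.predict(X)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R^2 on the fitted data:", round(r2_score(y, pred), 3))
```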

Keywords: multiscale hybrid reinforced cementitious composites, carbon nanotubes, carbon nanofibers, mechanical strength prediction

Procedia PDF Downloads 146
6462 Urine Neutrophil Gelatinase-Associated Lipocalin as an Early Marker of Acute Kidney Injury in Hematopoietic Stem Cell Transplantation Patients

Authors: Sara Ataei, Maryam Taghizadeh-Ghehi, Amir Sarayani, Asieh Ashouri, Amirhossein Moslehi, Molouk Hadjibabaie, Kheirollah Gholami

Abstract:

Background: Acute kidney injury (AKI) is common in hematopoietic stem cell transplantation (HSCT) patients, with an incidence of 21-73%. Prevention and early diagnosis reduce the frequency and severity of this complication, so predictive biomarkers are of major importance for timely diagnosis. Neutrophil gelatinase-associated lipocalin (NGAL) is a widely investigated novel biomarker for early diagnosis of AKI; however, no study has assessed NGAL for AKI diagnosis in HSCT patients. Methods: We performed further analyses on data gathered in our recent trial to evaluate the performance of urine NGAL (uNGAL) as an indicator of AKI in 72 allogeneic HSCT patients. AKI diagnosis and severity were assessed using the Risk-Injury-Failure-Loss-End-stage renal disease and AKI Network criteria. We assessed uNGAL on days -6, -3, +3, +9 and +15. Results: Time-dependent Cox regression analysis revealed a statistically significant relationship between uNGAL and AKI occurrence (HR = 1.04 (1.008-1.07), P = 0.01). There was also a relation between the uNGAL day +9 to baseline ratio and the incidence of AKI (unadjusted HR = 1.047 (1.012-1.083), P < 0.01). The area under the receiver-operating characteristic curve for the day +9 to baseline ratio was 0.86 (0.74-0.99, P < 0.01), and a cut-off value of 2.62 was 85% sensitive and 83% specific in predicting AKI. Conclusions: Our results indicated that an increase in uNGAL augmented the risk of AKI and that the change of day +9 uNGAL concentration from baseline could be of value for predicting AKI in HSCT patients. Additionally, uNGAL changes preceded serum creatinine rises by nearly 2 days.
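
The ROC/cut-off analysis reported above can be illustrated with the following minimal sketch; the uNGAL day +9/baseline ratios and AKI labels are synthetic, and only the 2.62 cut-off is taken from the abstract.

```python
# Minimal sketch of the ROC analysis described above, using synthetic data
# (the uNGAL ratios and AKI labels below are invented, not the study data).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
ratio_no_aki = rng.lognormal(mean=0.5, sigma=0.4, size=40)   # day +9 / baseline uNGAL ratio
ratio_aki = rng.lognormal(mean=1.3, sigma=0.4, size=20)
scores = np.concatenate([ratio_no_aki, ratio_aki])
labels = np.concatenate([np.zeros(40), np.ones(20)])          # 1 = AKI

print("AUC:", round(roc_auc_score(labels, scores), 2))

cutoff = 2.62                                                 # cut-off reported in the abstract
pred = scores >= cutoff
sens = (pred & (labels == 1)).sum() / (labels == 1).sum()
spec = (~pred & (labels == 0)).sum() / (labels == 0).sum()
print(f"sensitivity={sens:.2f}, specificity={spec:.2f} at cutoff {cutoff}")
```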

Keywords: acute kidney injury, hematopoietic stem cell transplantation, neutrophil gelatinase-associated lipocalin, receiver-operating characteristic curve

Procedia PDF Downloads 390
6461 Assisting Dating of Greek Papyri Images with Deep Learning

Authors: Asimina Paparrigopoulou, John Pavlopoulos, Maria Konstantinidou

Abstract:

Dating papyri accurately is crucial not only to editing their texts but also for our understanding of palaeography and the history of writing, ancient scholarship, material culture, networks in antiquity, etc. Most ancient manuscripts offer little evidence regarding the time of their production, forcing papyrologists to date them on palaeographical grounds, a method often criticized for its subjectivity. By experimenting with data obtained from the Collaborative Database of Dateable Greek Bookhands and the PapPal online collections of objectively dated Greek papyri, this study shows that deep learning dating models, pre-trained on generic images, can achieve accurate chronological estimates for a test subset (67.97% accuracy for book hands and 55.25% for documents). To compare the estimates of these models with those of humans, experts were asked to complete a questionnaire with samples of literary and documentary hands that had to be sorted chronologically by century. The same samples were dated by the models in question. The results are presented and analysed.
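
A minimal sketch of the transfer-learning setup described above (a network pre-trained on generic images, fine-tuned to classify papyri by century) is shown below. The directory layout, backbone, and hyperparameters are assumptions for illustration, not the authors' pipeline.

```python
# Sketch of transfer learning for century classification of papyri images.
# Folder layout, class count and hyperparameters are assumptions, not the authors' setup.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
# Hypothetical folder structure: papyri/train/<century_label>/<image>.jpg
train_ds = datasets.ImageFolder("papyri/train", transform=tfm)
loader = torch.utils.data.DataLoader(train_ds, batch_size=16, shuffle=True)

model = models.resnet18(weights="DEFAULT")            # pre-trained on generic (ImageNet) images
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))  # one output per century class
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                                # short illustrative run
    for x, y in loader:
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
```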

Keywords: image classification, papyri images, dating

Procedia PDF Downloads 62
6460 A Systemic Maturity Model

Authors: Emir H. Pernet, Jeimy J. Cano

Abstract:

Maturity models, used descriptively to explain changes in reality or normatively to guide managers in making interventions that render organizations more effective and efficient, are based on the principles of statistical quality control promulgated by Shewhart in the 1930s and on the principles of PDCA continuous improvement (Plan, Do, Check, Act) developed by Deming and Juran. Some frameworks developed around the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them drawn from points of reflection and analysis raised by other authors. Almost all of these limitations are related to the mechanistic and reductionist approach of the principles on which those models are built. As Systems Theory helps the understanding of the dynamics of organizations and organizational change, the development of a systemic maturity model can help to overcome some of those limitations. This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. The concept of maturity is defined, from the systems theory perspective, as an emergent property of the organization, which arises as a result of the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of an organization, and finally validated by measuring maturity in organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational Governance, Risk and Compliance (GRC) processes.

Keywords: GRC, maturity model, systems theory, viable system model

Procedia PDF Downloads 294
6459 The Evolution of Domestic Terrorism: Global Contemporary Models

Authors: Bret Brooks

Abstract:

As the international community has focused its attention in recent times on international and transnational terrorism, many nations have ignored their own domestic terrorist groups. Domestic terrorism has evolved significantly over the last 15 years, and as such nation states must adequately understand their own individual issues as well as the broader worldwide perspective. Contemporary models show that obtaining peace with domestic groups is not only the end goal but is also attainable. By evaluating modern examples and incorporating successful strategies, countries around the world have the ability to bring about a diplomatic resolution to domestic extremism and domestic terrorism.

Keywords: domestic, evolution, peace, terrorism

Procedia PDF Downloads 489
6458 Optimizing Privacy, Accuracy and Calibration in Deep Learning Models

Authors: Rizwan Rizwan

Abstract:

Differentially private (DP) training preserves data privacy but often leads to slower convergence and lower accuracy, along with notable mis-calibration compared to non-private training. We analyze DP training through a continuous-time approach based on the neural tangent kernel (NTK). The NTK helps characterize per-sample (PS) gradient clipping and the incorporation of noise during DP training across arbitrary network architectures and loss functions. Our analysis reveals that noise addition affects privacy risk exclusively, leaving convergence and calibration unaffected. In contrast, PS gradient clipping (in flat or layerwise styles) influences convergence and calibration but not privacy risk. Models with a small clipping norm generally achieve optimal accuracy but exhibit poor calibration, making them less reliable. Conversely, DP models trained with a large clipping norm maintain similar accuracy and the same privacy guarantee, yet demonstrate notably improved calibration.
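
A minimal sketch of the two DP training ingredients analysed above, per-sample gradient clipping and Gaussian noise addition, applied to a simple logistic-regression objective, is given below. The data, clipping norm and noise multiplier are illustrative assumptions.

```python
# Sketch of DP-SGD steps: per-sample (PS) gradient clipping + Gaussian noise.
# Data, clipping norm C and noise multiplier sigma are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 256, 10
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) + 0.1 * rng.normal(size=n) > 0).astype(float)

w = np.zeros(d)
C, sigma, lr, batch = 1.0, 1.0, 0.1, 64        # clip norm, noise multiplier, step size, lot size

def per_sample_grads(w, Xb, yb):
    p = 1.0 / (1.0 + np.exp(-(Xb @ w)))        # logistic-loss gradients, one row per example
    return (p - yb)[:, None] * Xb

for step in range(200):
    idx = rng.choice(n, size=batch, replace=False)
    g = per_sample_grads(w, X[idx], y[idx])
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / C)          # clip each example's gradient to norm <= C
    noise = rng.normal(scale=sigma * C, size=d) # Gaussian noise scaled to the clip norm
    w -= lr * (g.sum(axis=0) + noise) / batch

acc = ((1 / (1 + np.exp(-(X @ w))) > 0.5) == y).mean()
print("training accuracy of the noisily trained model:", round(acc, 3))
```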

Keywords: deep learning, convergence, differential privacy, calibration

Procedia PDF Downloads 19
6457 Fuzzy Time Series Forecasting Based on Fuzzy Logical Relationships, PSO Technique, and Automatic Clustering Algorithm

Authors: A. K. M. Kamrul Islam, Abdelhamid Bouchachia, Suang Cang, Hongnian Yu

Abstract:

Forecasting models have a great impact in terms of prediction and will continue to do so in the future. Although many forecasting models have been studied in recent years, most researchers focus on different forecasting methods based on fuzzy time series to solve forecasting problems. The accuracy of a fuzzy time series model depends largely on two factors: the length of the intervals in the universe of discourse and the content of the forecast rules. Moreover, a hybrid forecasting method can be a more effective and efficient way to improve forecasts than an individual forecasting model. Different hybrid forecasting models have combined fuzzy time series with evolutionary algorithms, but their performance is not entirely satisfactory. In this paper, we propose a hybrid forecasting model that combines first-order as well as high-order fuzzy time series with particle swarm optimization to improve forecasting accuracy. The proposed method uses the historical enrollments of the University of Alabama as the dataset in the forecasting process. First, an automatic clustering algorithm is applied to calculate appropriate intervals for the historical enrollments. Then particle swarm optimization and fuzzy time series are combined, which yields better forecasting accuracy than other existing forecasting models.
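
The sketch below illustrates the first-order fuzzy time series step of such a hybrid; for brevity it uses equal-width intervals and placeholder enrollment-style data, and it omits the automatic clustering and PSO interval tuning described above.

```python
# Minimal first-order fuzzy time series forecast (equal-width intervals; the automatic
# clustering and PSO tuning described in the abstract are omitted for brevity).
import numpy as np

series = np.array([13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807, 16919],
                  dtype=float)   # placeholder enrollment-style data, not the Alabama dataset

k = 5                                              # number of fuzzy sets / intervals
lo, hi = series.min() - 100, series.max() + 100
edges = np.linspace(lo, hi, k + 1)
fuzzify = lambda v: int(np.clip(np.searchsorted(edges, v, side="right") - 1, 0, k - 1))
states = [fuzzify(v) for v in series]

# Build fuzzy logical relationship groups: A_i -> {A_j, ...}
flrg = {}
for a, b in zip(states[:-1], states[1:]):
    flrg.setdefault(a, []).append(b)

mid = (edges[:-1] + edges[1:]) / 2                 # defuzzify with interval midpoints
forecasts = [np.mean([mid[j] for j in flrg.get(s, [s])]) for s in states[:-1]]
actual = series[1:]
mape = np.mean(np.abs((actual - forecasts) / actual)) * 100
print("in-sample MAPE: %.2f%%" % mape)
```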

Keywords: fuzzy time series (fts), particle swarm optimization, clustering algorithm, hybrid forecasting model

Procedia PDF Downloads 230
6456 A Review on 3D Smart City Platforms Using Remotely Sensed Data to Aid Simulation and Urban Analysis

Authors: Slim Namouchi, Bruno Vallet, Imed Riadh Farah

Abstract:

3D urban models provide powerful tools for decision making, urban planning, and smart city services. The accuracy of these 3D-based systems is directly related to the quality of the models. Since manual large-scale modeling of cities or countries is a highly time-intensive and very expensive process, fully automatic 3D building generation is needed. However, the result of the 3D modeling process depends on the input data, the properties of the captured objects, and the required characteristics of the reconstructed 3D model. Nowadays, producing a 3D real-world model is no longer a problem. Remotely sensed data have increased remarkably in recent years, especially data acquired using unmanned aerial vehicles (UAV). As scanning techniques develop, the amount of captured data grows and its resolution becomes more precise. This paper presents a literature review which aims to identify different methods of automatic 3D building extraction, either from LiDAR alone or from the combination of LiDAR and satellite or aerial images. We then present open-source technologies and data models (e.g., CityGML, PostGIS, Cesiumjs) used to integrate these models into geospatial base layers for smart city services.

Keywords: CityGML, LiDAR, remote sensing, SIG, Smart City, 3D urban modeling

Procedia PDF Downloads 109
6455 The Museum of Museums: A Mobile Augmented Reality Application

Authors: Qian Jin

Abstract:

Museums have been using interactive technology to spark visitor interest and improve understanding. These technologies can play a crucial role in helping visitors understand more about an exhibition site by using multimedia to provide information. Google Arts and Culture and Smartify are two very successful digital heritage products. They use mobile augmented reality to visualise museums' 3D models and heritage images but do not include 3D models of the collections or audio information. In this research, a service-oriented mobile augmented reality application was developed for users to access collections from multiple museums (including the V&A, the British Museum, and the British Library). Third-party APIs (Application Programming Interfaces) are queried to collect metadata (including images, 3D models, videos, and text) of the three museums' collections. The acquired content is then visualized in AR environments. This product will help users who cannot visit museums in person for various reasons (inconvenient transportation, physical disability, scheduling constraints).

Keywords: digital heritage, augmented reality, museum, Flutter, ARCore

Procedia PDF Downloads 58
6454 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder

Authors: Dua Hişam, Serhat İkizoğlu

Abstract:

Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study contributes to the development of artificial intelligence (AI) algorithms by applying multiple machine learning (ML) models to gait sensory data collected from humans in order to classify between normal people and those suffering from vestibular system (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not been used to perform feature extraction and identify VS disorders through training on raw data. In this study, three machine learning (ML) models, the Random Forest classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbors (KNN), were trained to detect VS disorder, and the performance of the algorithms was compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest classifier (RF) was the most accurate model.
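
A sketch of the three-model comparison is given below on synthetic stand-in features; the real study used gait sensor recordings, and the xgboost package is assumed to be installed.

```python
# Sketch of the three-model comparison (RF, XGB, KNN) on synthetic stand-in data;
# the real study used gait sensor recordings, which are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from xgboost import XGBClassifier   # assumes the xgboost package is installed

X, y = make_classification(n_samples=600, n_features=20, n_informative=8, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=42),
    "XGB": XGBClassifier(n_estimators=200, eval_metric="logloss", random_state=42),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    print(name)
    print(classification_report(y_te, clf.predict(X_te), digits=3))  # accuracy, precision, recall, F1
```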

Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting

Procedia PDF Downloads 50
6453 On Four Models of a Three Server Queue with Optional Server Vacations

Authors: Kailash C. Madan

Abstract:

We study four models of a three-server queueing system with Bernoulli-schedule optional server vacations. Customers arriving at the system one by one in a Poisson process are provided identical exponential service by three parallel servers according to a first-come, first-served queue discipline. In model A, all three servers may be allowed a vacation at one time; in model B, at most two of the three servers may be allowed a vacation at one time; in model C, at most one server is allowed a vacation; and in model D, no server is allowed a vacation. We study the steady-state behavior of the four models and obtain steady-state probability generating functions for the queue size at a random point of time for all states of the system. In model D, a known result for a three-server queueing system without server vacations is derived.
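
A way to sanity-check such results numerically is a discrete-event simulation; the sketch below simulates a three-server Poisson-arrival, exponential-service queue in which a server takes an exponential vacation with a fixed probability after each service completion (closest in spirit to model A, where all three servers may be on vacation at once). All rates and the vacation probability are illustrative.

```python
# Discrete-event sketch of an M/M/3 queue with Bernoulli-scheduled server vacations.
# Arrival, service and vacation rates and the vacation probability are illustrative.
import heapq
import random

random.seed(1)
LAM, MU, NU, P_VAC = 2.0, 1.0, 0.5, 0.3     # arrival, service, vacation rates; vacation prob.
T_END = 200_000.0

t, last_t, queue, free_servers, area = 0.0, 0.0, 0, 3, 0.0
events = [(random.expovariate(LAM), "arrival")]

while events and t < T_END:
    t, kind = heapq.heappop(events)
    area += queue * (t - last_t)            # accumulate time-average of the queue size
    last_t = t
    if kind == "arrival":
        heapq.heappush(events, (t + random.expovariate(LAM), "arrival"))
        if free_servers > 0:
            free_servers -= 1
            heapq.heappush(events, (t + random.expovariate(MU), "departure"))
        else:
            queue += 1
    else:                                   # "departure" or "return" from vacation
        if kind == "departure" and random.random() < P_VAC:
            heapq.heappush(events, (t + random.expovariate(NU), "return"))
            continue                        # server leaves on a Bernoulli-scheduled vacation
        if queue > 0:
            queue -= 1
            heapq.heappush(events, (t + random.expovariate(MU), "departure"))
        else:
            free_servers += 1

print("estimated mean queue size:", round(area / last_t, 3))
```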

Keywords: a three server queue, Bernoulli schedule server vacations, queue size distribution at a random epoch, steady state

Procedia PDF Downloads 284
6452 Time Series Regression with Meta-Clusters

Authors: Monika Chuchro

Abstract:

This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. In this case, clustering was performed to obtain subgroups of time series data with a normal distribution from wastewater treatment plant inflow data, which is composed of several groups differing in mean value. Two simple algorithms, K-means and EM, were chosen as the clustering methods, and the Rand index was used to measure similarity. After simple meta-clustering, a regression model was built for each subgroup; the final model was the sum of the subgroup models. The quality of the obtained model was compared with that of a regression model built with the same explanatory variables but without clustering of the data. The results were compared using the coefficient of determination (R2), the mean absolute percentage error (MAPE) as a measure of prediction accuracy, and comparison on a line chart. Preliminary results indicate the potential of the presented technique.
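
The cluster-then-regress idea can be sketched as follows: split the data into subgroups with K-means, fit a regression per subgroup, and compare the error against a single global model. The inflow-like data are synthetic, not the treatment-plant records.

```python
# Sketch of the cluster-then-regress approach with MAPE comparison against a single model.
# The inflow-like data below are synthetic, not the treatment-plant records.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 300
x = rng.uniform(0, 10, size=(n, 1))                       # explanatory variable
group = rng.integers(0, 2, size=n)                        # two latent regimes with different means
y = 5 * x[:, 0] + 40 * group + rng.normal(0, 2, size=n)   # inflow-like response

mape = lambda a, p: np.mean(np.abs((a - p) / a)) * 100

global_pred = LinearRegression().fit(x, y).predict(x)     # single model, no clustering

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(y.reshape(-1, 1))
cluster_pred = np.empty(n)
for c in np.unique(labels):                                # one regression per subgroup
    m = labels == c
    cluster_pred[m] = LinearRegression().fit(x[m], y[m]).predict(x[m])

print("MAPE, single model   : %.2f%%" % mape(y, global_pred))
print("MAPE, per-cluster sum: %.2f%%" % mape(y, cluster_pred))
```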

Keywords: clustering, data analysis, data mining, predictive models

Procedia PDF Downloads 448
6451 Creation and Management of Knowledge for Organization Sustainability and Learning

Authors: Deepa Kapoor, Rajshree Singh

Abstract:

This paper appreciates that the emergence and growing importance of knowledge as a new production factor make the development of technologies, methodologies and strategies for its measurement, creation, and diffusion one of the main priorities of organizations in the knowledge society. There are many models for the creation and management of knowledge, and diverse and varied perspectives for their study, analysis, and understanding. In this article, we take a theoretical approach to the types of models for the creation and management of knowledge; we discuss some of them and examine some of the difficulties and the key factors that determine the success of the processes for the creation and management of knowledge.

Keywords: knowledge creation, knowledge management, organizational development, organization learning

Procedia PDF Downloads 320
6450 Removal of Toxic Ni++ Ions from Wastewater by Nano-Bentonite

Authors: A. M. Ahmed, Mona A. Darwish

Abstract:

Removal of Ni++ ions from aqueous solution by sorption onto nano-bentonite was investigated. Experiments were carried out as a function of the amount of nano-bentonite, pH, metal concentration, contact time, agitation speed and temperature. The adsorption process followed pseudo-second-order kinetics. The Langmuir and Freundlich adsorption isotherm models were applied to analyze the adsorption data, and both were found to be applicable to the adsorption process. Thermodynamic parameters, e.g., ΔG°, ΔS° and ΔH°, of the ongoing adsorption process were also calculated, and the sorption process was found to be endothermic. Finally, nano-bentonite was found to be effective for the removal of Ni(II) under the studied experimental conditions.
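
A sketch of fitting the Langmuir and Freundlich isotherms by non-linear least squares is given below; the equilibrium concentrations and uptakes are invented for illustration, not the measured data.

```python
# Sketch of fitting the Langmuir and Freundlich isotherms to equilibrium data with
# non-linear least squares; the Ce/qe values below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 120.0])    # equilibrium concentration (mg/L), hypothetical
qe = np.array([8.1, 13.2, 19.5, 25.0, 29.3, 30.8])     # amount adsorbed (mg/g), hypothetical

langmuir = lambda C, qmax, KL: qmax * KL * C / (1 + KL * C)
freundlich = lambda C, KF, n: KF * C ** (1.0 / n)

pL, _ = curve_fit(langmuir, Ce, qe, p0=[30, 0.1])
pF, _ = curve_fit(freundlich, Ce, qe, p0=[5, 2])
r2 = lambda y, f: 1 - np.sum((y - f) ** 2) / np.sum((y - y.mean()) ** 2)
print("Langmuir   qmax=%.2f, KL=%.3f, R2=%.3f" % (pL[0], pL[1], r2(qe, langmuir(Ce, *pL))))
print("Freundlich KF=%.2f,  n=%.2f,  R2=%.3f" % (pF[0], pF[1], r2(qe, freundlich(Ce, *pF))))
```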

Keywords: waste water, nickel, bentonite, adsorption

Procedia PDF Downloads 236
6449 Analyzing the Performance of Machine Learning Models to Predict Alzheimer's Disease and its Stages Addressing Missing Value Problem

Authors: Carlos Theran, Yohn Parra Bautista, Victor Adankai, Richard Alo, Jimwi Liu, Clement G. Yedjou

Abstract:

Alzheimer's disease (AD) is a neurodegenerative disorder primarily characterized by deteriorating cognitive functions, and it has gained considerable attention in the last decade. An estimated 24 million people worldwide suffered from this disease by 2011, an estimated 40 million were diagnosed with AD in 2016, and by 2050 the number of people affected by AD is expected to reach 131 million. Therefore, detecting and confirming AD at its different stages is a priority for medical practice in order to provide adequate and accurate treatments. Recently, Machine Learning (ML) models have been used to study AD stages in a multiclass setting while handling missing values, focusing on the delineation of Early Mild Cognitive Impairment (EMCI), Late Mild Cognitive Impairment (LMCI), and cognitively normal (CN) subjects. However, to the best of our knowledge, robust performance information for these models and an analysis of the missing data have not been presented in the literature. In this paper, we study the performance of five different machine learning models for multiclass prediction of AD stages in terms of accuracy, precision, and F1-score. We also analyze three imputation methods for handling the missing value problem. A framework that integrates ML models for multiclass prediction of AD stages is proposed, achieving an average accuracy of 84%.
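
A sketch of the imputation-plus-multiclass-classification workflow is shown below; the feature matrix is synthetic with values removed at random, and the imputers and classifier are stand-ins, not necessarily those used in the study.

```python
# Sketch of the imputation + multiclass classification workflow (e.g., CN vs. EMCI vs. LMCI);
# the feature matrix is synthetic with values knocked out at random, not the clinical dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.impute import SimpleImputer, KNNImputer
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=600, n_features=15, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.15] = np.nan          # remove ~15% of values to mimic missing data

imputers = {"mean": SimpleImputer(strategy="mean"),
            "median": SimpleImputer(strategy="median"),
            "knn": KNNImputer(n_neighbors=5)}
for name, imp in imputers.items():
    pipe = make_pipeline(imp, RandomForestClassifier(n_estimators=200, random_state=0))
    acc = cross_val_score(pipe, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name:>6} imputation: mean CV accuracy = {acc:.3f}")
```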

Keywords: alzheimer's disease, missing value, machine learning, performance evaluation

Procedia PDF Downloads 215
6448 Persian Pistachio Nut (Pistacia vera L.) Dehydration in Natural and Industrial Conditions

Authors: Hamid Tavakolipour, Mohsen Mokhtarian, Ahmad Kalbasi Ashtari

Abstract:

In this study, the effect of various drying methods (sun drying, shade drying and industrial drying) on final moisture content, degree of shell splitting, shrinkage and color change was studied. Sun drying resulted in a higher degree of shell splitting of pistachio nuts relative to the other drying methods. The ANOVA results showed that the different drying methods had no significant effect on the color change of dried pistachio nuts. The results illustrated that pistachio nuts dried by industrial drying had the lowest moisture content. After drying, the experimental drying data were fitted with five well-known drying models, namely Newton, Page, Silva et al., Peleg, and Henderson and Pabis. The results indicated that the Peleg and Page models gave better results than the other models for describing the moisture ratio of pistachio nuts in the industrial drying and open sun (or shade) drying methods, respectively.
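
As an illustration, the sketch below fits the Page thin-layer drying model, MR = exp(-k t^n), to moisture-ratio data by non-linear least squares; the times and moisture ratios are invented placeholders.

```python
# Sketch of fitting the Page thin-layer drying model, MR = exp(-k * t^n), to moisture-ratio
# data; the drying times and moisture ratios below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 1, 2, 4, 6, 8, 12, 16], dtype=float)            # drying time (h), hypothetical
MR = np.array([1.00, 0.78, 0.62, 0.41, 0.28, 0.20, 0.10, 0.05])  # moisture ratio, hypothetical

page = lambda t, k, n: np.exp(-k * t ** n)
params, _ = curve_fit(page, t, MR, p0=[0.2, 1.0])
pred = page(t, *params)
r2 = 1 - np.sum((MR - pred) ** 2) / np.sum((MR - MR.mean()) ** 2)
print("Page model: k=%.3f, n=%.3f, R2=%.4f" % (params[0], params[1], r2))
```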

Keywords: industrial drying, pistachio, quality properties, traditional drying

Procedia PDF Downloads 315
6447 Credit Risk Evaluation Using Genetic Programming

Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira

Abstract:

Credit risk is considered one of the most important issues for financial institutions, as it provokes great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many evaluation methods are black-box models that cannot adequately reveal information hidden in the data, although several works have focused on building transparent rule-based models. For credit risk assessment, generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build an accurate and transparent credit risk evaluation model that produces a set of classification rules. We formulate credit risk evaluation as an optimization problem solved with a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate our proposed approach on the German and Australian credit datasets and compare our findings with existing works; the results show that the proposed GP approach outperforms the other models.

Keywords: credit risk assessment, rule generation, genetic programming, feature selection

Procedia PDF Downloads 327
6446 An Infinite Mixture Model for Modelling Stutter Ratio in Forensic Data Analysis

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Forensic DNA analysis has received much attention over the last three decades, due to its incredible usefulness in human identification. The statistical interpretation of DNA evidence is recognised as one of the most mature fields in forensic science. Peak heights in an electropherogram (EPG) are approximately proportional to the amount of template DNA in the original sample being tested. A stutter is a minor peak in an EPG, which is not masking an allele of a potential contributor, and is considered an artefact presumed to arise from miscopying or slippage during the PCR. Stutter peaks are mostly analysed in terms of the stutter ratio, which is calculated relative to the corresponding parent allele height. The analysis of mixture profiles has always been problematic in evidence interpretation, especially with the presence of PCR artefacts like stutters. Unlike binary and semi-continuous models, continuous models assign a probability (as a continuous weight) to each possible genotype combination and significantly enhance the use of continuous peak height information, resulting in more efficient and reliable interpretations. Therefore, a sound methodology to distinguish between stutters and real alleles is essential for the accuracy of the interpretation, and any such method has to be able to model stutter peaks. Bayesian nonparametric methods provide increased flexibility in applied statistical modelling. Mixture models are frequently employed as fundamental data analysis tools in clustering and classification and assume unidentified heterogeneous sources for the data. In model-based clustering, each unknown source is represented by a cluster, and the clusters are modelled using parametric models. Specifying the number of components in finite mixture models, however, is practically difficult even though the calculations are relatively simple. Infinite mixture models, in contrast, do not require the user to specify the number of components. Instead, a Dirichlet process, which is an infinite-dimensional generalization of the Dirichlet distribution, is used to deal with the problem of the number of components. The Chinese restaurant process (CRP), the stick-breaking process and the Pólya urn scheme are frequently used as Dirichlet priors in Bayesian mixture models. In this study, we illustrate an infinite mixture of simple linear regression models for modelling the stutter ratio and introduce some modifications to overcome weaknesses associated with the CRP.
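
The Chinese restaurant process mentioned above can be sketched in a few lines; the snippet below draws cluster assignments from a CRP with an illustrative concentration parameter, showing how the number of components grows with the data rather than being fixed in advance.

```python
# Sketch of the Chinese restaurant process (CRP), a Dirichlet-process prior that lets the
# number of mixture components grow with the data; alpha is an illustrative value.
import random

def crp(n_customers, alpha, seed=0):
    random.seed(seed)
    tables = []                                  # tables[k] = number of customers at table k
    assignments = []
    for i in range(n_customers):
        # open a new table with probability alpha / (i + alpha),
        # otherwise join an existing table proportionally to its occupancy
        weights = tables + [alpha]
        k = random.choices(range(len(weights)), weights=weights)[0]
        if k == len(tables):
            tables.append(1)
        else:
            tables[k] += 1
        assignments.append(k)
    return tables, assignments

tables, _ = crp(n_customers=200, alpha=2.0)
print("number of clusters discovered:", len(tables))
print("cluster sizes:", sorted(tables, reverse=True))
```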

Keywords: Chinese restaurant process, Dirichlet prior, infinite mixture model, PCR stutter

Procedia PDF Downloads 309
6445 Adaptive Neuro Fuzzy Inference System Model Based on Support Vector Regression for Stock Time Series Forecasting

Authors: Anita Setianingrum, Oki S. Jaya, Zuherman Rustam

Abstract:

Forecasting stock prices is a challenging task due to the complex time series nature of the data, with the complexity arising from the many variables that affect the stock market. Many time series models have been proposed before, but those models still have some problems: 1) they involve subjectivity in choosing the technical indicators, and 2) they rely on assumptions about the variables, which limits their applicability across datasets. Therefore, this paper studies a novel Adaptive Neuro-Fuzzy Inference System (ANFIS) time series model based on Support Vector Regression (SVR) for forecasting the stock market. In order to evaluate the performance of the proposed models, stock market transaction data of the TAIEX and HSI from January to December 2015 were collected as experimental datasets. As a result, the method outperformed its counterparts in terms of accuracy.
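
A reduced sketch of the SVR half of such a model is given below: one-step-ahead forecasting with lag features on a synthetic price series (the ANFIS layer and the TAIEX/HSI data are not reproduced).

```python
# Sketch of SVR-based one-step-ahead forecasting with lag features; the ANFIS stage of the
# proposed model is omitted and the price series is synthetic, not TAIEX or HSI data.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
price = 100 + np.cumsum(rng.normal(0, 1, size=300))      # synthetic random-walk "index"

lags = 5
X = np.column_stack([price[i:len(price) - lags + i] for i in range(lags)])  # lagged features
y = price[lags:]                                                            # next-step target
split = int(0.8 * len(y))

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((y[split:] - pred) ** 2))
print("test RMSE of one-step-ahead forecasts:", round(rmse, 3))
```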

Keywords: ANFIS, fuzzy time series, stock forecasting, SVR

Procedia PDF Downloads 225
6444 Comparison of Fundamental Frequency Model and PWM Based Model for UPFC

Authors: S. A. Al-Qallaf, S. A. Al-Mawsawi, A. Haider

Abstract:

Among all FACTS devices, the unified power flow controller (UPFC) is considered to be the most versatile device, owing to its capability to control all the transmission system parameters (impedance, voltage magnitude, and phase angle). With the growing interest in the UPFC, attention to developing mathematical models has increased, and several models have been introduced for the UPFC in the literature for different types of power system studies. In this paper, a novel comparison study between two dynamic models of the UPFC and their proposed control strategies is presented.

Keywords: FACTS, UPFC, dynamic modeling, PWM, fundamental frequency

Procedia PDF Downloads 329
6443 Comparison of Deep Learning and Machine Learning Algorithms to Diagnose and Predict Breast Cancer

Authors: F. Ghazalnaz Sharifonnasabi, Iman Makhdoom

Abstract:

Breast cancer is a serious health concern that affects many people around the world. According to a study published in the Breast journal, the global burden of breast cancer is expected to increase significantly over the next few decades. The number of deaths from breast cancer has been increasing over the years, although the age-standardized mortality rate has decreased in some countries. It is important to be aware of the risk factors for breast cancer and to get regular check-ups to catch it early if it does occur. Machine learning techniques have been used to aid in the early detection and diagnosis of breast cancer. These techniques, which have been shown to be effective in predicting and diagnosing the disease, have become a research hotspot. In this study, we consider two deep learning approaches, the Multi-Layer Perceptron (MLP) and the Convolutional Neural Network (CNN), as well as five machine learning algorithms, Decision Tree (C4.5), Naïve Bayes (NB), Support Vector Machine (SVM), K-Nearest Neighbors (KNN) and XGBoost (eXtreme Gradient Boosting), on the Breast Cancer Wisconsin Diagnostic dataset. We carried out the process of evaluating and comparing the classifiers, which involves selecting appropriate metrics to evaluate classifier performance and an appropriate tool to quantify this performance. The main purpose of the study is to predict and diagnose breast cancer by applying the mentioned algorithms and to discover the most effective one with respect to the confusion matrix, accuracy and precision. It was found that CNN outperformed all other classifiers and achieved the highest accuracy (0.982456). The work was implemented in the Anaconda environment using the Python programming language.
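
Part of this comparison can be reproduced directly, since the Breast Cancer Wisconsin Diagnostic dataset ships with scikit-learn; the sketch below evaluates three of the seven models with the confusion matrix, accuracy and precision. It is an illustration, not the study's exact configuration.

```python
# Sketch of part of the comparison above on the Breast Cancer Wisconsin Diagnostic dataset
# (bundled with scikit-learn); three of the seven models are shown for brevity.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix, accuracy_score, precision_score

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(hidden_layer_sizes=(64, 32),
                                                         max_iter=1000, random_state=0)),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    p = clf.predict(X_te)
    print(name, "accuracy=%.4f precision=%.4f" % (accuracy_score(y_te, p), precision_score(y_te, p)))
    print(confusion_matrix(y_te, p))
```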

Keywords: breast cancer, multi-layer perceptron, Naïve Bayesian, SVM, decision tree, convolutional neural network, XGBoost, KNN

Procedia PDF Downloads 54
6442 Investigation of the Effect of Lecturers' Attributes on Students' Interest in Learning Statistics in Ghanaian Tertiary Institutions

Authors: Samuel Asiedu-Addo, Jonathan Annan, Yarhands Dissou Arthur

Abstract:

The study aims to explore the effect of lecturers' personal attributes on students' interest in statistics. In this study, lecturers' personal attributes such as dynamism, communication strategies, rapport in the classroom, and applied knowledge during lectures were examined. An exploratory research design was used to establish the effect of lecturers' personal attributes on students' interest. Data were analyzed by means of confirmatory factor analysis and structural equation modeling (SEM) using the SmartPLS 3 program. The study recruited 376 students from the Faculty of Technical and Vocational Education of the University of Education, Winneba (Kumasi campus), Ghana Technology University College, and Kwame Nkrumah University of Science and Technology. The results revealed that the personal attributes of an effective lecturer (dynamism, rapport, communication and applied knowledge) contribute 52.9% in explaining students' interest in statistics. Our regression analysis and structural equation modeling confirm that lecturers' personal attributes predict students' interest to the extent of 52.9% and 53.7%, respectively. The paper concludes that the total effect of a lecturer's attributes on students' interest is moderate and significant. We further show that a lecturer's personal attributes such as applied knowledge and rapport have a positive and significant effect on tertiary students' interest in statistics, whilst a lecturer's communication and dynamism, though positively related, do not significantly affect student interest in statistics.

Keywords: student interest, effective teacher, personal attributes, regression and SEM

Procedia PDF Downloads 342
6441 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is if the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on the data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
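
The simulation idea can be sketched as follows: corrupt a fraction of the training labels, train a linear SVM on the noisy labels, and score it against the clean "gold standard" labels. The data and error rates below are illustrative.

```python
# Sketch of the simulation idea above: train a linear SVM on labels corrupted at a given
# error rate, then evaluate against the clean ("gold standard") labels. Data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=4000, n_features=30, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
for error_rate in (0.0, 0.2, 0.4):
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < error_rate           # randomly flip a fraction of labels
    y_noisy[flip] = 1 - y_noisy[flip]
    clf = LinearSVC(C=1.0, max_iter=10000).fit(X_tr, y_noisy)
    acc = accuracy_score(y_te, clf.predict(X_te))           # scored against clean labels
    print(f"label error rate {error_rate:.0%}: accuracy vs. gold standard = {acc:.3f}")
```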

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 175
6440 Development of an Optimised, Automated Multidimensional Model for Supply Chains

Authors: Safaa H. Sindi, Michael Roe

Abstract:

This project divides supply chain (SC) models into seven Eras, according to the evolution of the market's needs over time. The five earliest Eras describe the emergence of supply chains, while the last two Eras are to be created. Research objectives: the aim is to generate the two latest Eras with their respective models, focusing on consumable goods. Era Six contains the Optimal Multidimensional Matrix (OMM), which incorporates most characteristics of the SC and allocates them into four quarters (Agile, Lean, Leagile, and Basic SC). This will help companies, especially SMEs, plan their optimal SC route. Era Seven creates an Automated Multidimensional Model (AMM), which upgrades the matrix of Era Six, as it accounts for all the supply chain factors (i.e., offshoring, sourcing, risk) in an interactive system with heuristic learning that helps larger companies and industries to select the best SC model for their market. Methodologies: data collection is based on a Fuzzy-Delphi study that analyses statements using fuzzy logic. The first round of the Delphi study contains statements (fuzzy rules) about the matrix of Era Six; the second round contains the feedback given from the first round, and so on. Preliminary findings: both models are applicable. The matrix of Era Six reduces the complexity of choosing the best SC model for SMEs by helping them identify the best strategy among Basic SC, Lean, Agile and Leagile SC, tailored to their needs. The interactive heuristic learning in the AMM of Era Seven will help mitigate error and aid large companies in identifying and re-strategizing the best SC model and distribution system for their market and commodity, hence increasing efficiency. Potential contributions to the literature: the problematic issue facing many companies is deciding which SC model or strategy to incorporate, due to the many models and definitions developed over the years. This research simplifies this by putting most definitions in a template and most models in the matrix of Era Six. This research is original in that the division of the SC into Eras, the matrix of Era Six (OMM) with Fuzzy-Delphi, and the heuristic learning in the AMM of Era Seven provide a synergy of tools that have not been combined before in the area of SC. Additionally, the OMM of Era Six is unique as it combines most characteristics of the SC, which is an original concept in itself.

Keywords: Leagile, automation, heuristic learning, supply chain models

Procedia PDF Downloads 373
6439 Kinetics, Equilibrium and Thermodynamics of the Adsorption of Triphenyltin onto NanoSiO₂/Fly Ash/Activated Carbon Composite

Authors: Olushola S. Ayanda, Olalekan S. Fatoki, Folahan A. Adekola, Bhekumusa J. Ximba, Cecilia O. Akintayo

Abstract:

In the present study, the kinetics, equilibrium and thermodynamics of the adsorption of triphenyltin (TPT) from TPT-contaminated water onto a nanoSiO2/fly ash/activated carbon composite were investigated in a batch adsorption system. Equilibrium adsorption data were analyzed using the Langmuir, Freundlich, Temkin and Dubinin-Radushkevich (D-R) isotherm models. Pseudo-first- and second-order, Elovich and fractional power models were applied to test the kinetic data, and in order to understand the mechanism of adsorption, thermodynamic parameters such as ΔG°, ΔS° and ΔH° were also calculated. The results showed very good compliance with the pseudo-second-order equation, while the Freundlich and D-R models fit the experimental data. Approximately 99.999% of TPT was removed from an initial concentration of 100 mg/L TPT at 80°C, a contact time of 60 min, pH 8 and a stirring speed of 200 rpm. Thus, the nanoSiO2/fly ash/activated carbon composite could be used as an effective adsorbent for the removal of TPT from contaminated water and wastewater.
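
As an illustration of the kinetic modelling, the sketch below fits the pseudo-second-order model qt = k2·qe²·t / (1 + k2·qe·t) to uptake-versus-time data; the contact times and uptakes are invented, not the measured values.

```python
# Sketch of fitting the pseudo-second-order kinetic model q_t = (k2*qe^2*t)/(1 + k2*qe*t)
# to uptake-vs-time data; the contact times and uptakes below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([5, 10, 20, 30, 45, 60, 90], dtype=float)        # contact time (min), hypothetical
qt = np.array([12.0, 18.5, 24.0, 26.5, 28.2, 29.0, 29.6])     # uptake (mg/g), hypothetical

pso = lambda t, qe, k2: (k2 * qe**2 * t) / (1 + k2 * qe * t)
(qe_fit, k2_fit), _ = curve_fit(pso, t, qt, p0=[30, 0.01])
pred = pso(t, qe_fit, k2_fit)
r2 = 1 - np.sum((qt - pred) ** 2) / np.sum((qt - qt.mean()) ** 2)
print("pseudo-second-order: qe=%.2f mg/g, k2=%.4f g/(mg*min), R2=%.4f" % (qe_fit, k2_fit, r2))
```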

Keywords: isotherm, kinetics, nanoSiO₂/fly ash/activated carbon composite, triphenyltin

Procedia PDF Downloads 280
6438 Clinical Evaluation of Neutrophil to Lymphocytes Ratio and Platelets to Lymphocytes Ratio in Immune Thrombocytopenic Purpura

Authors: Aisha Arshad, Samina Naz Mukry, Tahir Shamsi

Abstract:

Background: Immune thrombocytopenia (ITP) is an autoimmune disorder. Besides platelet counts, the immature platelet fraction (IPF) can be used as a tool to predict megakaryocytic activity in ITP patients. Clinical biomarkers such as the neutrophil-to-lymphocyte ratio (NLR) and the platelet-to-lymphocyte ratio (PLR) reflect inflammation and can be used as prognostic markers. The present study was planned to assess these ratios in ITP and their utility in predicting prognosis after treatment. Methods: A total of 111 ITP patients and the same number of healthy individuals were included in this case-control study during the period January 2015 to December 2017. All ITP patients were grouped according to the guidelines of the International Working Group on ITP. A 3 cc blood sample was collected in an EDTA tube, and blood parameters were evaluated using a Sysmex 1000 analyzer. The ratios were calculated using the absolute counts of neutrophils, lymphocytes and platelets. Significant (p ≤ 0.05) differences between ITP patients and the healthy control group were determined by the Kruskal-Wallis test and Dunn's test, and Spearman's correlation test was performed using SPSS version 23. Results: Significantly raised total leucocyte counts (TLC) and IPF along with low platelet counts were observed in ITP patients compared to healthy controls. Among the ITP groups, a very low platelet count, with a median (IQR) of 2 (3.83) x 10⁹/L, together with the highest mean (IQR) IPF of 25.4 (19.8)%, was observed in the newly diagnosed ITP group. The NLR rose with disease progression, as higher levels were observed in P-ITP. The PLR was significantly lower in ND-ITP, P-ITP, C-ITP and R-ITP compared to controls (p ≤ 0.001), as platelets were fewer in number in all ITP patients. Conclusion: The IPF can be used in the evaluation of the bone marrow response in ITP. The simple and reliable NLR and PLR ratios can be used in predicting prognosis and response to treatment in ITP, and to some extent the severity of disease.
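
For clarity, the ratios are derived from absolute counts as in the short sketch below; the counts shown are illustrative values, not patient data from the study.

```python
# Sketch of how NLR and PLR are derived from a complete blood count; the counts below
# are illustrative values, not patient data from the study.
def nlr_plr(neutrophils_abs, lymphocytes_abs, platelets_abs):
    """All inputs as absolute counts (e.g., x10^9/L). Returns (NLR, PLR)."""
    nlr = neutrophils_abs / lymphocytes_abs
    plr = platelets_abs / lymphocytes_abs
    return nlr, plr

# hypothetical example: neutrophils 4.2, lymphocytes 2.1, platelets 12 (all x10^9/L)
nlr, plr = nlr_plr(4.2, 2.1, 12.0)
print(f"NLR = {nlr:.2f}, PLR = {plr:.2f}")
```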

Keywords: neutrophils, platelets, lymphocytes, infection

Procedia PDF Downloads 76
6437 The Use of Respiratory Index of Severity in Children (RISC) for Predicting Clinical Outcomes for 3 Months-59 Months Old Patients Hospitalized with Community-Acquired Pneumonia in Visayas Community Medical Center, Cebu City from January 2013 - June 2013

Authors: Karl Owen L. Suan, Juliet Marie S. Lambayan, Floramay P. Salo-Curato

Abstract:

Objective: To predict the outcome among patients aged 3 months to 59 months old admitted with community-acquired pneumonia in Visayas Community Medical Center using the Respiratory Index of Severity in Children (RISC). Design: A cross-sectional study design was used. Setting: The study was done in Visayas Community Medical Center, a private tertiary-level hospital in Cebu City, from January to June 2013. Patients/Participants: A total of 72 patients were initially enrolled in the study; however, 1 patient transferred to another institution, thus 71 patients were included in this study. Within 24 hours from admission, patients were assigned a RISC score. Statistical Analysis: Cohen's kappa coefficient was used for inter-rater agreement for categorical data. This study used frequency and percentage distributions for qualitative data, and mean, standard deviation and range for quantitative data. To determine the relationship of each RISC score parameter and the total RISC score with the outcome, the Mann-Whitney U test and the 2x2 Fisher exact test for associations were used. A p-value of less than 0.05 was considered significant. Results: There was a statistically significant association between the RISC score and the clinical outcome; a RISC score greater than 4 was correlated with intubation and/or mortality. Conclusion: The RISC scoring system is a simple combination of clinical parameters and a reliable tool that will help stratify patients aged 3 months to 59 months in predicting clinical outcome.

Keywords: RISC, clinical outcome, community-acquired pneumonia, patients

Procedia PDF Downloads 279
6436 Strategy Management of Soybean (Glycine max L.) for Dealing with Extreme Climate through the Use of Cropsyst Model

Authors: Aminah Muchdar, Nuraeni, Eddy

Abstract:

The aims of the research are: (1) to verify the CropSyst crop model against experimental field data for soybean, and (2) to predict the planting time and potential yield of soybean using the CropSyst model. The research is divided into several stages: (1) a calibration stage, conducted in the field from June until September 2015, and (2) a model application stage, in which the data obtained from the field calibration are entered into the CropSyst model. The data required by the model are climate data, soil data, and crop genetic data. The relationship between the results obtained in the field and the CropSyst simulation is indicated by an Efficiency Index (EF) of 0.939, showing that the CropSyst model is well suited for use. The calculated RRMSE of 1.922% shows that the prediction error of the simulation relative to the results obtained in the field is about 1.92%. It is concluded that the CropSyst-based prediction of soybean planting time is valid for use, and that the appropriate planting time for soybeans, mainly on rain-fed land, is at the end of the rainy season; in this study the first planting time (June 2, 2015) gave the highest production, because at that time there was still some rain. The Tanggamus variety is more resistant to delayed planting, as its percentage yield decrease in each decade is lower than the average of all varieties.
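
The two goodness-of-fit statistics quoted above can be computed as in the sketch below, using the common definitions of the (Nash-Sutcliffe-style) Efficiency Index and the RRMSE normalised by the observed mean; the observed and simulated yields are invented placeholders.

```python
# Sketch of the two goodness-of-fit statistics used above: the Efficiency Index (EF) and
# the relative RMSE (RRMSE); the observed/simulated yields below are invented placeholders.
import numpy as np

obs = np.array([2.10, 2.35, 2.05, 1.90, 2.50])    # hypothetical observed soybean yields (t/ha)
sim = np.array([2.05, 2.40, 2.00, 1.95, 2.45])    # hypothetical CropSyst simulated yields (t/ha)

ef = 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)   # Nash-Sutcliffe-style EF
rrmse = np.sqrt(np.mean((obs - sim) ** 2)) / obs.mean() * 100         # RMSE relative to obs mean

print("Efficiency Index (EF): %.3f" % ef)
print("RRMSE: %.3f%%" % rrmse)
```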

Keywords: soybean, Cropsyst, calibration, efficiency Index, RRMSE

Procedia PDF Downloads 162