Search results for: model based clustering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38025

37635 A Novel Approach of NPSO on Flexible Logistic (S-Shaped) Model for Software Reliability Prediction

Authors: Pooja Rani, G. S. Mahapatra, S. K. Pandey

Abstract:

In this paper, we propose a novel approach that combines neural networks and particle swarm optimization for software reliability prediction. We first explain how to apply a compound function in a neural network to derive a Flexible Logistic (S-shaped) Growth Curve (FLGC) model. This model mathematically represents software failure as a random process and can be used to evaluate software development status during testing. To avoid trapping in local minima, we apply the particle swarm optimization method to train the proposed model on failure test data sets. Because the model is driven by computational intelligence, it becomes a Neuro-Particle Swarm Optimization (NPSO) model. We test the model with different inertia weights in the particle velocity and position updates, and compare the results for the best inertia weight against personal-best-oriented PSO (pPSO), which chooses the local best within a network neighborhood. The applicability of the proposed model is demonstrated on a real failure test data set. The experimental results show that the proposed model has a fairly accurate prediction capability for software reliability.
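
As a rough illustration of the training scheme described above, the sketch below shows a generic PSO loop with an inertia-weight velocity update; the loss function (network prediction error on failure data), swarm size, and coefficient values are assumptions, not the paper's settings.

```python
# Minimal PSO sketch: inertia weight w scales the velocity update, as compared
# across several values in the paper. The loss and hyperparameters are assumed.
import numpy as np

def pso_train(loss, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, (n_particles, dim))   # particle positions = NN weights
    v = np.zeros_like(x)                         # particle velocities
    pbest, pbest_val = x.copy(), np.array([loss(p) for p in x])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # velocity update with inertia weight w, cognitive (c1) and social (c2) pulls
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([loss(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()]
    return gbest

print(pso_train(lambda p: np.sum(p ** 2), dim=5))   # toy loss for demonstration
```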

Keywords: software reliability, flexible logistic growth curve model, software cumulative failure prediction, neural network, particle swarm optimization

Procedia PDF Downloads 344
37634 Genomic Prediction Reliability Using Haplotypes Defined by Different Methods

Authors: Sohyoung Won, Heebal Kim, Dajeong Lim

Abstract:

Genomic prediction is an effective way to measure the abilities of livestock for breeding based on genomic estimated breeding values, which are statistically predicted from genotype data using best linear unbiased prediction (BLUP). Using haplotypes, clusters of linked single nucleotide polymorphisms (SNPs), as markers instead of individual SNPs can improve the reliability of genomic prediction, since the probability that a quantitative trait locus is in strong linkage disequilibrium (LD) with the markers is higher. To use haplotypes in genomic prediction efficiently, optimal ways of defining haplotypes need to be found. In this study, 770K SNP chip data were collected from a Hanwoo (Korean cattle) population consisting of 2,506 cattle. Haplotypes were defined in three different ways: based on 1) the length of haplotypes (bp), 2) the number of SNPs, and 3) k-medoids clustering by LD. To compare the methods in parallel, the haplotypes defined by all methods were set to comparable sizes; in each method, haplotypes defined to contain an average of 5, 10, 20, or 50 SNPs were tested respectively. A modified GBLUP method using haplotype alleles as predictor variables was implemented to test the prediction reliability of each haplotype set. The conventional genomic BLUP (GBLUP) method, which uses individual SNPs, was also tested to benchmark the performance of the haplotype sets in genomic prediction. Carcass weight was used as the phenotype for testing. As a result, haplotypes defined by all three methods showed increased reliability compared to conventional GBLUP, with little difference in reliability between the haplotype-defining methods. The reliability of genomic prediction was highest when the average number of SNPs per haplotype was 20 in all three methods, implying that haplotypes including around 20 SNPs can be optimal markers for genomic prediction. When the number of alleles generated by each haplotype-defining method was compared, clustering by LD generated the fewest alleles. Using haplotype alleles for genomic prediction showed better performance, suggesting improved accuracy in genomic selection. The number of predictor variables decreased when the LD-based method was used, while all three haplotype-defining methods showed similar performance. This suggests that defining haplotypes based on LD can reduce computational costs and allows efficient prediction. Finding optimal ways to define haplotypes and using haplotype alleles as markers can thus provide improved performance and efficiency in genomic prediction.
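
For the third haplotype-defining method, the sketch below illustrates k-medoids clustering of SNPs on an LD-based distance (1 - r²); the toy genotype matrix and the choice of k are assumptions, and the implementation is a minimal PAM-style loop rather than the study's actual pipeline.

```python
import numpy as np

def k_medoids(D, k, iters=100, seed=0):
    """PAM-style k-medoids on a precomputed distance matrix D."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)   # nearest-medoid assignment
        # new medoid of each cluster = member minimizing within-cluster distance
        # (clusters are never empty: each medoid is nearest to itself)
        new = np.array([np.where(labels == j)[0][
            D[np.ix_(labels == j, labels == j)].sum(0).argmin()] for j in range(k)])
        if np.array_equal(np.sort(new), np.sort(medoids)):
            break
        medoids = new
    return labels, medoids

rng = np.random.default_rng(0)
G = rng.integers(0, 2, (100, 60))      # toy genotypes: 100 animals x 60 SNPs
D = 1.0 - np.corrcoef(G.T) ** 2        # LD distance between SNPs = 1 - r^2
labels, medoids = k_medoids(D, k=3)    # ~20 SNPs per cluster, as in the study
print(np.bincount(labels))
```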

Keywords: best linear unbiased predictor, genomic prediction, haplotype, linkage disequilibrium

Procedia PDF Downloads 141
37633 A Combined AHP-GP Model for Selecting Knowledge Management Tool

Authors: Ahmad Sarfaraz, Raiyad Herwies

Abstract:

In this paper, a multi-criteria decision-making analysis is used to help an organization select the KM tool that best fits and serves its needs. The AHP model is used, based on a previous study, to highlight and identify the main criteria and sub-criteria incorporated in the selection process. Different KM tool alternatives are compared and weighted accurately against these criteria for incorporation in the GP model. The main goal is to combine the GP model with the AHP model to ensure that the selection of the KM tool respects the resource constraints. Two important issues are discussed in this paper: how different factors can be taken into consideration in forming the AHP model, and how to incorporate the AHP results into the GP model for better results.
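
As a minimal illustration of the AHP step, the snippet below derives criteria weights from a pairwise comparison matrix via the principal eigenvector; the 3x3 judgment values are hypothetical, and the resulting weights would then feed the GP model.

```python
import numpy as np

# Saaty-scale pairwise judgments over three hypothetical criteria
A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, eigvals.real.argmax()])   # principal eigenvector
w = w / w.sum()                                  # normalized criteria weights
CI = (eigvals.real.max() - len(A)) / (len(A) - 1)  # consistency index
print(w, CI)
```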

Keywords: knowledge management, analytical hierarchy process, goal programming, multi-criteria decision making

Procedia PDF Downloads 385
37632 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Model updating methods have received increasing attention in the damage detection of structures based on measured modal parameters. In this paper, a probability-based damage detection (PBDD) procedure built on model updating is presented, in which a one-stage, model-based damage identification technique using the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multi-agent algorithm inspired by the behavior of ideal gas molecules, which disperse rapidly in all directions and fill the available space owing to their high speed and to collisions with each other and with the surrounding barriers. In the IGMM algorithm, an initial population of gas molecules is randomly generated, and the governing equations for molecular velocity and collisions are used to search for optimal solutions. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in the PBDD of structures.
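
The sketch below illustrates only the enhancement described above, freezing variables whose best value has not changed for a set number of iterations; the move rule is a simple stand-in, since the paper's actual gas-molecule velocity and collision equations are not reproduced here.

```python
import numpy as np

def eigmm_like(loss, dim, n=30, iters=300, patience=25, tol=1e-6, seed=0):
    """Population search with the EIGMM-style removal of unchanged variables."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n, dim))
    best = x[np.argmin([loss(p) for p in x])].copy()
    stale = np.zeros(dim, int)       # iterations each variable stayed unchanged
    active = np.ones(dim, bool)      # frozen variables drop out of the search
    for _ in range(iters):
        # stand-in move rule: random perturbation plus pull toward the best point
        x = x + (rng.normal(0, 0.1, (n, dim)) + 0.5 * (best - x)) * active
        cand = x[np.argmin([loss(p) for p in x])]
        moved = np.abs(cand - best) > tol
        stale = np.where(moved, 0, stale + 1)
        if loss(cand) < loss(best):
            best = cand.copy()
        active &= stale < patience   # remove variables unchanged for `patience` steps
    return best

print(eigmm_like(lambda p: np.sum((p - 0.3) ** 2), dim=4))   # toy objective
```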

Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification

Procedia PDF Downloads 277
37631 Metabolic Predictive Model for PMV Control Based on Deep Learning

Authors: Eunji Choi, Borang Park, Youngjae Choi, Jinwoo Moon

Abstract:

In this study, a predictive model for estimating the metabolic rate (MET) of the human body was developed for the optimal control of the indoor thermal environment. Human body images of indoor activities and body-joint coordinate values were collected as the data sets used in the predictive model. A deep learning algorithm was used in an initial model, and its numbers of hidden layers and hidden neurons were optimized. Lastly, the model's prediction performance was analyzed after the model was trained on the collected data. In conclusion, the feasibility of MET prediction was confirmed, and developing more diverse data sets and refining the predictive model were proposed as directions for future study.
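
A minimal sketch of the architecture-optimization step, assuming a scikit-learn MLP as a stand-in for the study's deep learning framework; the feature matrix (body-joint coordinates) and label vector (measured MET) are synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))    # stand-in: flattened body-joint coordinates
y = np.abs(X).sum(axis=1) / 10    # stand-in MET values

# grid over hidden layer counts and neuron counts, as optimized in the study
search = GridSearchCV(
    MLPRegressor(max_iter=3000, random_state=0),
    param_grid={"hidden_layer_sizes": [(32,), (64,), (32, 32), (64, 64, 64)]},
    scoring="neg_mean_squared_error", cv=3)
search.fit(X, y)
print(search.best_params_)        # selected layer/neuron configuration
```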

Keywords: deep learning, indoor quality, metabolism, predictive model

Procedia PDF Downloads 257
37630 Developed Text-Independent Speaker Verification System

Authors: Mohammed Arif, Abdessalam Kifouche

Abstract:

Speech is a very convenient way of communication between people and machines, and it conveys information about the identity of the talker. Since speaker recognition technology increasingly secures our everyday lives, the objective of this paper is to develop two automatic text-independent speaker verification (TI SV) systems using low-level spectral features and machine learning methods. (i) The first system is based on a support vector machine (SVM), widely used in voice signal processing for speaker recognition, i.e., verifying the identity of a speaker from voice characteristics; and (ii) the second is based on a Gaussian Mixture Model (GMM) with a Universal Background Model (UBM), which combines information from different resources to complement the SVM-based system.
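
A minimal sketch of GMM-UBM verification scoring under the usual log-likelihood-ratio formulation; the cepstral feature arrays, mixture size, and threshold are assumptions, and MAP adaptation of the UBM, common in practice, is omitted.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
background = rng.normal(0, 1, (2000, 13))   # pooled "cepstral" features, many speakers
speaker = rng.normal(0.5, 1, (300, 13))     # claimed speaker's enrollment features
test = rng.normal(0.5, 1, (200, 13))        # test utterance features

ubm = GaussianMixture(n_components=16, covariance_type="diag",
                      random_state=0).fit(background)
spk = GaussianMixture(n_components=16, covariance_type="diag",
                      random_state=0).fit(speaker)   # MAP adaptation omitted

llr = spk.score(test) - ubm.score(test)     # average log-likelihood ratio
print(llr > 0.0, llr)                       # accept if above the threshold
```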

Keywords: speaker verification, text-independent, support vector machine, Gaussian mixture model, cepstral analysis

Procedia PDF Downloads 58
37629 Comparison of Sourcing Process in Supply Chain Operation References Model and Business Information Systems

Authors: Batuhan Kocaoglu

Abstract:

Although they use powerful systems like ERP (Enterprise Resource Planning), companies still cannot easily benchmark their processes and measure process performance in terms of the predefined SCOR (Supply Chain Operation References) model. The purpose of this research is to identify common and corresponding processes and to present a conceptual model for modeling and measuring the purchasing process of an organization. The main steps of the study are: a literature review of the 'procure to pay' process in ERP systems; a literature review of the 'sourcing' process in the SCOR model; and the development of a conceptual model integrating the 'sourcing' process of the SCOR model with the 'procure to pay' process of the ERP model. In this study, we examined the similarities and differences between these two models. The proposed framework is based on assumptions drawn from (1) the body of literature and (2) the authors' experience working in the field of enterprise and logistics information systems. The modeling framework provides a structured and systematic way to model and decompose the necessary information from conceptual representation to process element specification. This conceptual model will help organizations build quality measurement instruments and tools for their purchasing systems, and the adaptation issues raised for ERP systems and the SCOR model will yield a purchasing process that is more benchmarkable against worldwide standards.

Keywords: SCOR, ERP, procure to pay, sourcing, reference model

Procedia PDF Downloads 362
37628 Media Richness Perspective on Web 2.0 Usage for Knowledge Creation: The Case of the Cocoa Industry in Ghana

Authors: Albert Gyamfi

Abstract:

Cocoa plays a critical role in the socio-economic development of Ghana. Meanwhile, smallholder farmers, most of whom are illiterate, dominate the industry. According to the cocoa-based agricultural knowledge and information system (AKIS) model, knowledge is created and transferred within the industry between three key actors: cocoa researchers, extension experts, and cocoa farmers. Drawing on the SECI model, the media richness theory (MRT), and the AKIS model, a conceptual model of a web 2.0-based AKIS (AKIS 2.0) is developed and used to assess the possible effects of social media usage on knowledge creation in the Ghanaian cocoa industry. A mixed-method approach with a survey questionnaire was employed, and a second-order multi-group structural equation model (SEM) was used to analyze the data. The study concludes that the use of web 2.0 applications for knowledge creation would lead to sustainable interactions among the key knowledge actors for effective knowledge creation in the cocoa industry in Ghana.

Keywords: agriculture, cocoa, knowledge, media, web 2.0

Procedia PDF Downloads 333
37627 Hydro-Gravimetric Ann Model for Prediction of Groundwater Level

Authors: Jayanta Kumar Ghosh, Swastik Sunil Goriwale, Himangshu Sarkar

Abstract:

Groundwater is one of the most valuable natural resources that society consumes for its domestic, industrial, and agricultural water supply. Its bulk and indiscriminate consumption, however, strains the resource; the groundwater recharge rate is often found to be much lower than the demand. Thus, to maintain water and food security, it is necessary to monitor and manage groundwater storage. However, it is challenging to estimate groundwater storage (GWS) using existing hydrological models. To overcome these difficulties, machine learning (ML) models are being introduced for the evaluation of groundwater level (GWL). The objective of this research work is therefore to develop an ML-based model for the prediction of GWL. This objective has been realized through the development of an artificial neural network (ANN) model based on hydro-gravimetry. The model was developed using training samples from field observations spread over 8 months and has been tested for the prediction of GWL in an observation well. The root mean square error (RMSE) for the test samples was found to be 0.390 meters. Thus, it can be concluded that the hydro-gravimetric ANN model can be used for the prediction of GWL. To improve the accuracy, however, more hydro-gravimetric parameters may be considered and tested in the future.
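
A minimal sketch, assuming a scikit-learn MLP as a stand-in for the study's ANN: gravimetric observations in, groundwater level out, judged by RMSE as in the study; the data arrays and network size are assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(240, 3))      # stand-in hydro-gravimetric readings
y = 2.0 + X @ [0.8, -0.5, 0.3] + 0.1 * rng.normal(size=240)   # stand-in GWL (m)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                     random_state=0).fit(X_tr, y_tr)
rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))    # study reports 0.390 m
print(rmse)
```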

Keywords: machine learning, hydro-gravimetry, ground water level, predictive model

Procedia PDF Downloads 127
37626 Character Education Model for Early Childhood Based Javanese Culture

Authors: Rafika Bayu Kusumandari, Istyarini, Ispen Safrel

Abstract:

Character education is more meaningful if carried out from early childhood, because early childhood education is the foundation of character formation. This study intends to find a model of character education for early childhood based on Javanese culture. In keeping with the focus of the study, the long-term goal of this research is to develop and describe a model of Javanese culture-based character education for early childhood in Semarang, which can then be applied across early childhood education institutions in Semarang City. The specific objective of the study is to describe the model and management of Javanese culture-based character education in early childhood in Semarang City, and the benefit of this research is to provide an overview of that model and its management. To meet these objectives, the program was designed as 'Research and Development', meaning a research program followed by development programs for improvement or refinement. To produce a prototype model of Javanese culture-based character education in early childhood in the city, systematic measures were taken in the form of action, reflection, evaluation, and innovation, applying qualitative, descriptive, developmental, experimental, and evaluative research methods. The study aims to gain an in-depth description of the model; qualitative methods were used because, to the researchers' knowledge, there are no prior empirical studies specifically on models of Javanese culture-based character education in early childhood in the city of Semarang. In implementation, character education for early childhood is adapted to the characteristics of each school and the emphases of each early childhood education institution. Javanese culture should be introduced early so that it is not eroded and lost as outside cultures enter in the era of globalization. Moreover, Javanese culture promotes courtesy and manners, which is very appropriate for the character formation of children at an early age.

Keywords: education character, Javanese culture, childhood, character

Procedia PDF Downloads 391
37625 Data Model to Predict Customize Skin Care Product Using Biosensor

Authors: Ashi Gautam, Isha Shukla, Akhil Seghal

Abstract:

Biosensors are analytical devices that use a biological sensing element to detect and measure a specific chemical substance or biomolecule in a sample. These devices are widely used in various fields, including medical diagnostics, environmental monitoring, and food analysis, due to their high specificity, sensitivity, and selectivity. In this research paper, a machine learning model is proposed for predicting the suitability of skin care products based on biosensor readings. The proposed model takes in features extracted from biosensor readings, such as biomarker concentration, skin hydration level, inflammation presence, sensitivity, and free radicals, and outputs the most appropriate skin care product for an individual. The model is based on supervised learning: it is trained on a labeled dataset of biosensor readings and corresponding skin care product information, from which it learns the patterns and relationships between readings and products; once trained, it can predict the most suitable skin care product for an individual from their biosensor readings. The model's performance is evaluated using several metrics, including accuracy, precision, recall, and F1 score. The aim of this research is to develop a personalised skin care product recommendation system using biosensor data. By leveraging the power of machine learning, the proposed model can accurately predict the most suitable skin care product for an individual, which is particularly useful in the skin care industry, where personalised recommendations can lead to better outcomes for consumers. The results of this study show that the proposed model can accurately predict the most appropriate skin care product for an individual based on their biosensor readings, and the evaluation metrics demonstrate the model's effectiveness. The model therefore has significant potential for practical use in personalised skin care product recommendation, and further research can be done to improve its accuracy and effectiveness.
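
A minimal sketch of the supervised recommender, with feature names taken from the abstract; the classifier choice (a random forest) and the synthetic data are assumptions, and the report covers the accuracy/precision/recall/F1 metrics the study uses.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

FEATURES = ["biomarker_concentration", "hydration_level",
            "inflammation", "sensitivity", "free_radicals"]

rng = np.random.default_rng(0)
X = rng.normal(size=(300, len(FEATURES)))              # stand-in biosensor readings
y = np.where(X[:, 1] < 0, "moisturizer", "cleanser")   # stand-in product labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))  # accuracy/precision/recall/F1
```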

Keywords: biosensors, data model, machine learning, skin care

Procedia PDF Downloads 97
37624 Towards a Distributed Computation Platform Tailored for Educational Process Discovery and Analysis

Authors: Awatef Hicheur Cairns, Billel Gueni, Hind Hafdi, Christian Joubert, Nasser Khelifa

Abstract:

Given the ever-changing needs of the job market, education and training centers are increasingly held accountable for student success. Therefore, education and training centers have to focus on ways to streamline their offers and educational processes in order to achieve the highest level of quality in curriculum contents and managerial decisions. Educational process mining is an emerging field in the educational data mining (EDM) discipline, concerned with developing methods to discover, analyze, and provide a visual representation of complete educational processes. In this paper, we present our distributed computation platform, which allows different education centers and institutions to load their data and access advanced data mining and process mining services. To this end, we also present a comparative study of the different clustering techniques developed in the context of process mining to efficiently partition educational traces. Our goal is to find the best strategy for distributing heavy analysis computations over the many processing nodes of our platform.
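
As a minimal illustration of one trace-clustering strategy the platform could distribute, the sketch below encodes traces as activity-frequency vectors and applies k-means; the example traces are hypothetical, and the paper's comparative study covers more techniques than this.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import CountVectorizer

traces = ["enroll lecture lab exam",            # one student process trace each
          "enroll lecture exam",
          "enroll lab lab exam retake"]
X = CountVectorizer().fit_transform(traces)     # activity-frequency profiles
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)                                   # trace partition to distribute
```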

Keywords: educational process mining, distributed process mining, clustering, distributed platform, educational data mining, ProM

Procedia PDF Downloads 454
37623 Numerical Simulation of Fracturing Behaviour of Pre-Cracked Crystalline Rock Using a Cohesive Grain-Based Distinct Element Model

Authors: Mahdi Saadat, Abbas Taheri

Abstract:

Understanding the cracking response of crystalline rocks at the mineralogical scale is of great importance during the design of mining structures. A grain-based distinct element model (GBM) is employed to numerically study the cracking response of Barre granite at micro- and macro-scales. The GBM framework is augmented with a proposed distinct element-based cohesive model to reproduce the micro-cracking response of the inter- and intra-grain contacts. The cohesive GBM framework is implemented in the PFC2D distinct element code. The microstructural properties of Barre granite are imported into PFC2D to generate synthetic specimens. The microproperties of the model are calibrated against laboratory uniaxial compressive and Brazilian split tensile tests. The calibrated model is then used to simulate the fracturing behaviour of pre-cracked Barre granite with different flaw configurations. The numerical results of the proposed model demonstrate good agreement with their experimental counterparts. The proposed GBM framework thus appears promising for further investigation of the influence of grain microstructure and mineralogical properties on the cracking behaviour of crystalline rocks.

Keywords: discrete element modelling, cohesive grain-based model, crystalline rock, fracturing behavior

Procedia PDF Downloads 129
37622 A Deep Learning Based Integrated Model For Spatial Flood Prediction

Authors: Vinayaka Gude Divya Sampath

Abstract:

The research introduces an integrated prediction model to assess the susceptibility of roads in a future flooding event. The model consists of a deep learning algorithm for forecasting gauge-height data and the Flood Inundation Mapper (FIM) for spatial flooding. An optimal long short-term memory (LSTM) network architecture was identified for the gauge located on the Tangipahoa River at Robert, LA. Dropout was applied to the model to evaluate the uncertainty associated with the predictions. The estimates are then used along with FIM to identify the spatial flooding, and further geoprocessing in ArcGIS provides the susceptibility values for different roads. The model was validated against the devastating flood of August 2016. The paper discusses the challenges of generalizing the methodology to other locations and to various types of flooding. The developed model can be used by transportation departments and other emergency response organizations for effective disaster management.
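
A minimal sketch of the gauge-height forecaster, assuming a Keras implementation: an LSTM with dropout, which can also be kept active at inference for Monte Carlo uncertainty estimates; the window length and layer sizes are assumptions.

```python
import tensorflow as tf

window = 24                                  # past gauge readings per sample
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dropout(0.2),            # dropout used to probe uncertainty
    tf.keras.layers.Dense(1),                # next gauge height
])
model.compile(optimizer="adam", loss="mse")
# model.fit(X, y, epochs=50)   # X: (n, window, 1) windows of gauge-height data
```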

Keywords: deep learning, disaster management, flood prediction, urban flooding

Procedia PDF Downloads 146
37621 Study for an Optimal Cable Connection within an Inner Grid of an Offshore Wind Farm

Authors: Je-Seok Shin, Wook-Won Kim, Jin-O Kim

Abstract:

An offshore wind farm needs to be designed carefully, considering both economic and reliability aspects. Among the many decision-making problems in designing an entire offshore wind farm, this paper focuses on the inner grid layout, i.e., the connections between wind turbines and between wind turbines and an offshore substation. The proposed methodology determines the connections and the cable type for each connection section using k-clustering, minimum spanning tree, and cable selection algorithms. A cost evaluation is then performed in terms of investment, power loss, and reliability, and through this evaluation an optimal inner grid layout with the lowest total cost is determined. In order to demonstrate the validity of the methodology, a case study is conducted on a 240 MW offshore wind farm, and the results show that the methodology helps design an offshore wind farm optimally.
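
A minimal sketch of the spanning-tree step only, over hypothetical turbine coordinates; the k-clustering, cable-type selection, and cost-evaluation stages of the methodology are omitted.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

# hypothetical 2D positions: turbines plus an offshore substation
pos = np.array([[0, 0], [1, 0], [0.5, 1], [2, 1], [2.5, 0.2]])
mst = minimum_spanning_tree(cdist(pos, pos)).toarray()
edges = np.argwhere(mst > 0)                 # cable sections to lay
print(edges, mst[mst > 0].sum())             # connections and total cable length
```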

Keywords: offshore wind farm, optimal layout, k-clustering algorithm, minimum spanning algorithm, cable type selection, power loss cost, reliability cost

Procedia PDF Downloads 385
37620 SiC Merged PiN and Schottky (MPS) Power Diodes Electrothermal Modeling in SPICE

Authors: A. Lakrim, D. Tahri

Abstract:

This paper sets out a behavioral macro-model of a Merged PiN and Schottky (MPS) diode based on silicon carbide (SiC). The model holds for both static and dynamic electrothermal simulations for industrial applications. Its parameters were extracted from datasheet curves of commercially available SiC MPS diodes using the simulated annealing (SA) optimization method. The model is implemented using the Analog Behavioral Modeling (ABM) facilities of PSPICE. The thermal behavior of the devices is taken into consideration by means of a Foster canonical network, derived from electro-thermal measurements provided by the device manufacturer.
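
As a rough illustration of the parameter-extraction step, the sketch below fits a simplified static diode equation to sampled "datasheet" points with simulated annealing; the equation, data, and bounds are placeholders, not the paper's electrothermal macro-model.

```python
import numpy as np
from scipy.optimize import dual_annealing

v = np.linspace(0.8, 2.0, 30)                      # forward voltages (V)
i_meas = 1e-9 * (np.exp(v / (2 * 0.0259)) - 1)     # placeholder "datasheet" currents

def err(p):
    i_s, n = p                                     # saturation current, ideality factor
    i_model = i_s * (np.exp(v / (n * 0.0259)) - 1)
    return np.sum((np.log(i_model + 1e-30) - np.log(i_meas + 1e-30)) ** 2)

res = dual_annealing(err, bounds=[(1e-12, 1e-6), (1.0, 3.0)], maxiter=200, seed=0)
print(res.x)                                       # fitted model parameters
```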

Keywords: SiC MPS diode, electro-thermal, SPICE model, behavioral macro-model

Procedia PDF Downloads 407
37619 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies

Authors: Yuanjin Liu

Abstract:

Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate, and is obtained by simulation based on the 2019 Social Security period life table, the IRS Required Minimum Distribution (RMD) worksheets, and US historical bond and equity returns and inflation rates. Several popular machine learning models are built: the generalized additive model, random forest, support vector machine, extreme gradient boosting, and artificial neural network. Model validation and selection are based on test errors, using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained from the optimal predictive model.
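
A minimal sketch of the model-selection step, with scikit-learn estimators standing in for the paper's algorithms (a gradient-boosting regressor in place of extreme gradient boosting, and no generalized additive model); the simulated feature/target arrays are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# stand-in design matrix: age, gender, equity share, inflation, withdrawal rate
X = rng.uniform(size=(500, 5))
y = np.clip(0.2 + 0.5 * X[:, 4] - 0.3 * X[:, 2] + 0.05 * rng.normal(size=500), 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
candidates = {
    "random_forest": RandomForestRegressor(random_state=0),
    "boosting": GradientBoostingRegressor(random_state=0),
    "svm": SVR(),
    "neural_net": MLPRegressor(max_iter=5000, random_state=0),
}
errors = {name: mean_squared_error(y_te, m.fit(X_tr, y_tr).predict(X_te))
          for name, m in candidates.items()}
print(min(errors, key=errors.get), errors)   # model with the lowest test error
```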

Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model

Procedia PDF Downloads 73
37618 Intelligent Control Design of Car Following Behavior Using Fuzzy Logic

Authors: Abdelkader Merah, Kada Hartani

Abstract:

A reference-model-based control approach for improving car-following behavior is proposed in this paper. The reference model is nonlinear and provides dynamic solutions consistent with safety constraints and comfort specifications. A robust fuzzy logic based control strategy is then proposed, and a set of simulation results showing the suitability of the proposed technique for various demanding scenarios is included.
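
As a minimal illustration of fuzzy car-following control, the sketch below implements a single-input Mamdani rule base mapping spacing error to an acceleration command; the membership shapes and rules are hypothetical, not the paper's design.

```python
import numpy as np

def tri(x, a, b, c):                     # triangular membership function
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def accel_command(spacing_err):
    u = np.linspace(-3, 3, 301)          # candidate accelerations (m/s^2)
    # rules: too close -> brake, about right -> hold, too far -> accelerate
    fire = {"brake": tri(spacing_err, -20, -10, 0),
            "hold":  tri(spacing_err, -10, 0, 10),
            "accel": tri(spacing_err, 0, 10, 20)}
    out = np.maximum.reduce([
        np.minimum(fire["brake"], tri(u, -3, -2, 0)),
        np.minimum(fire["hold"],  tri(u, -1, 0, 1)),
        np.minimum(fire["accel"], tri(u, 0, 2, 3)),
    ])
    return np.sum(u * out) / (np.sum(out) + 1e-12)   # centroid defuzzification

print(accel_command(-6.0))   # closing too fast -> negative (braking) command
```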

Keywords: reference model, longitudinal control, fuzzy logic, design of car

Procedia PDF Downloads 430
37617 Semantic Based Analysis in Complaint Management System with Analytics

Authors: Francis Alterado, Jennifer Enriquez

Abstract:

Semantic Based Analysis in Complaint Management System with Analytics is an enhanced tool for clients to submit complaints and a mechanism for Palawan Polytechnic College to gather, process, and monitor the status of these complaints. The study includes a mobile application that serves as a remote communication facility between the students and the school management regarding the issues students encounter and the resolution of every complaint received. Text mining and clustering algorithms were utilized in processing the complaints. Every module of the system was tested and, based on the results, was 100% free from error before integration; a system test checking the expected functionality likewise found the system 100% functional. The system was then tested by 10 students forwarding complaints to 10 departments. Based on the results, the students were able to submit complaints, the system processed them by identifying the department each complaint was intended for, and the concerned departments were able to give the students feedback on the complaints received. With this, the system gained a 4.7 rating, which corresponds to Excellent.
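
A minimal sketch of the text-mining step, assuming TF-IDF vectorization with k-means: complaints are grouped so each cluster can be routed to a department; the sample complaints and cluster count are hypothetical.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

complaints = ["broken chair in room 204",
              "wifi keeps disconnecting in library",
              "projector not working",
              "no internet at computer lab"]
X = TfidfVectorizer(stop_words="english").fit_transform(complaints)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # e.g., facilities issues vs. network issues, routed accordingly
```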

Keywords: technology adoption, emerging technology, issues challenges, algorithm, text mining, mobile technology

Procedia PDF Downloads 199
37616 Probability-Based Damage Detection of Structures Using Kriging Surrogates and Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Surrogate models have received increasing attention for use in detecting damage of structures based on vibration modal parameters. However, uncertainties in the measured vibration data may lead to false or unreliable output from such models. In this study, an efficient approach based on Monte Carlo simulation is proposed to take the effect of these uncertainties into account when developing a surrogate model. The probability of damage existence (PDE) is calculated from the probability density functions of the undamaged and damaged states. The kriging technique allows one to genuinely quantify the surrogate error; therefore, it is chosen as the metamodeling technique. An enhanced version of the ideal gas molecular movement (EIGMM) algorithm is used as the main algorithm for model updating. The developed approach is applied to detect simulated damage in numerical models of a 72-bar space truss and a 120-bar dome truss. The simulation results show that the proposed method performs well in the probability-based damage detection of structures, with less computational effort than the direct finite element model.
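
A toy sketch of the kriging-plus-Monte-Carlo idea: a Gaussian process surrogate maps a damage parameter to a modal feature, and noise samples on the measurement yield a probability of damage existence; the forward model, noise level, and threshold are all assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# toy forward model: damage severity d in [0, 1] lowers a natural frequency
d_train = np.linspace(0, 1, 20)[:, None]
f_train = 10.0 * np.sqrt(1 - 0.4 * d_train.ravel())          # "FE model" outputs
gp = GaussianProcessRegressor(kernel=RBF(0.2), normalize_y=True).fit(d_train, f_train)

# Monte Carlo over measurement noise: invert each noisy sample through the surrogate
rng = np.random.default_rng(0)
measured = 9.6 + rng.normal(0, 0.05, 5000)                   # noisy measured frequency
grid = np.linspace(0, 1, 201)[:, None]
pred = gp.predict(grid)
d_hat = grid.ravel()[np.abs(pred[None, :] - measured[:, None]).argmin(axis=1)]
pde = (d_hat > 0.05).mean()        # probability of damage existence
print(pde)
```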

Keywords: probability-based damage detection (PBDD), Kriging, surrogate modeling, uncertainty quantification, artificial intelligence, enhanced ideal gas molecular movement (EIGMM)

Procedia PDF Downloads 239
37615 Delivery Service and Online-and-Offline Purchasing for Collaborative Recommendations on Retail Cross-Channels

Authors: S. H. Liao, J. M. Huang

Abstract:

The delivery service business model is the final link in logistics for both online and offline businesses. The online-and-offline business model focuses on the entire customer purchasing process, online and offline, placing greater emphasis on the importance of data for optimizing overall retail operations. For the retail industry, an important information-management task is therefore to strengthen the collection and analysis of consumers' online and offline purchasing data in order to better understand customers and then recommend products. This study implements two-stage data mining analytics, clustering followed by association rules analysis, to investigate Taiwanese consumers' (n=2,209) preferences for delivery service. This process clarifies online-and-offline purchasing behaviors and preferences, yielding knowledge profiles/patterns/rules for cross-channel collaborative recommendations. Finally, theoretical and practical implications for methodology and enterprise are presented.
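
A minimal sketch of the two-stage analytics, assuming a one-hot survey encoding: k-means segments the consumers, then association rules are mined within a segment (here via mlxtend's apriori/association_rules); the toy data and thresholds are hypothetical.

```python
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules
from sklearn.cluster import KMeans

df = pd.DataFrame({                    # one-hot survey answers (toy sample)
    "buys_online":          [1, 1, 0, 1, 0, 1],
    "buys_offline":         [0, 1, 1, 0, 1, 1],
    "wants_home_delivery":  [1, 1, 0, 1, 0, 1],
    "wants_store_pickup":   [0, 0, 1, 0, 1, 1],
})
seg = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(df)  # stage 1
itemsets = apriori(df[seg == 0].astype(bool), min_support=0.5,
                   use_colnames=True)                                  # stage 2
rules = association_rules(itemsets, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```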

Keywords: delivery service, online-and-offline purchasing, retail cross-channel, collaborative recommendations, data mining analytics

Procedia PDF Downloads 31
37614 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) process are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties, and the forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights, and roundabouts. The BINARMA(1,1) model is applied to the accident data, and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
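
A minimal sketch of binomial thinning and a BINAR(1)-type recursion with correlated Poisson innovations (a common shock correlates the two series); the moving-average part and the CML estimation of the full BINARMA(1,1) are beyond this illustration, and the parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def thin(alpha, x):
    return rng.binomial(x, alpha)        # alpha ∘ x: binomial thinning operator

def simulate(T=500, a1=0.4, a2=0.3, lam=(2.0, 3.0, 1.0)):
    l1, l2, l12 = lam                    # shared rate l12 correlates the innovations
    x = np.zeros((T, 2), int)
    for t in range(1, T):
        z = rng.poisson(l12)             # common shock component
        e1, e2 = rng.poisson(l1) + z, rng.poisson(l2) + z
        x[t, 0] = thin(a1, x[t - 1, 0]) + e1
        x[t, 1] = thin(a2, x[t - 1, 1]) + e2
    return x

series = simulate()
print(np.corrcoef(series.T))             # cross-correlation induced by innovations
```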

Keywords: non-stationary, BINARMA(1,1) model, Poisson innovations, conditional maximum likelihood (CML)

Procedia PDF Downloads 129
37613 Logistic Regression Model versus Additive Model for Recurrent Event Data

Authors: Entisar A. Elgmati

Abstract:

Recurrent infant diarrhea is studied using daily data collected in Salvador, Brazil, over one year and three months. A logistic regression model is fitted instead of Aalen's additive model, using the same covariates that were used in the analysis with the additive model. The logistic model gives reasonably similar results to those of the additive regression model. In addition, the problem of the estimated conditional probabilities not being constrained between zero and one in the additive model is solved here. Martingale residuals, which have been used to judge the goodness of fit of the additive model, are also shown to be useful for judging the goodness of fit of the logistic model.

Keywords: additive model, cumulative probabilities, infant diarrhoea, recurrent event

Procedia PDF Downloads 635
37612 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet process mixture of generalized linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. Its influence on the free-flow and congested states was estimated to be higher than on the transitional state in both morning and evening peak periods. Estimation of the critical speed threshold using CR revealed that 47 mph and 48 mph are the speed thresholds for congested and transitional traffic conditions during the morning and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
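
As a rough illustration, the sketch below uses scikit-learn's BayesianGaussianMixture with a Dirichlet process prior as a stand-in for the paper's Dirichlet process mixture of generalized linear models; the speed/occupancy samples are synthetic.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
free = np.column_stack([rng.normal(65, 2, 400), rng.normal(8, 2, 400)])
trans = np.column_stack([rng.normal(55, 3, 200), rng.normal(15, 3, 200)])
cong = np.column_stack([rng.normal(40, 5, 200), rng.normal(30, 5, 200)])
X = np.vstack([free, trans, cong])       # [speed mph, occupancy %] samples

dpm = BayesianGaussianMixture(
    n_components=8, weight_concentration_prior_type="dirichlet_process",
    random_state=0).fit(X)
print(np.round(dpm.weights_, 2))   # shrinkage leaves ~3 effective traffic states
```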

Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection

Procedia PDF Downloads 294
37611 Linear MIMO Model Identification Using an Extended Kalman Filter

Authors: Matthew C. Best

Abstract:

Linear Multi-Input Multi-Output (MIMO) dynamic models can be identified, with no a priori knowledge of model structure or order, using a new Generalised Identifying Filter (GIF). Based on an Extended Kalman Filter, the new filter identifies the model iteratively, in a continuous modal canonical form, using only input and output time histories. The filter’s self-propagating state error covariance matrix allows easy determination of convergence and conditioning, and by progressively increasing model order, the best fitting reduced-order model can be identified. The method is shown to be resistant to noise and can easily be extended to identification of smoothly nonlinear systems.
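
A minimal scalar analogue of the idea, assuming a fixed first-order structure for brevity (unlike the full method, which needs no structural assumptions): the unknown parameters are appended to the state and identified by an EKF from input/output records only.

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, b_true = 0.9, 0.5
u = rng.normal(size=400)
x = np.zeros(401)
for k in range(400):                          # simulate the "unknown" system
    x[k + 1] = a_true * x[k] + b_true * u[k] + 0.01 * rng.normal()
y = x[1:]                                     # measured output time history

z = np.array([0.0, 0.5, 0.0])                 # augmented state guess: [x, a, b]
P = np.eye(3)
Q, R = np.diag([1e-4, 1e-6, 1e-6]), 1e-4
for k in range(400):
    xk, a, b = z
    z_pred = np.array([a * xk + b * u[k], a, b])
    F = np.array([[a, xk, u[k]], [0, 1, 0], [0, 0, 1]])   # prediction Jacobian
    P = F @ P @ F.T + Q                       # propagate state error covariance
    H = np.array([1.0, 0.0, 0.0])
    K = P @ H / (H @ P @ H + R)               # Kalman gain
    z = z_pred + K * (y[k] - z_pred[0])
    P = P - np.outer(K, H @ P)
print(z[1:])   # parameter estimates converge near the true (0.9, 0.5)
```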

Keywords: system identification, Kalman filter, linear model, MIMO, model order reduction

Procedia PDF Downloads 594
37610 Functional Instruction Set Simulator (ISS) of a Neural Network (NN) IP with Native BF-16 Generator

Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula

Abstract:

A functional model that mimics the functional correctness of a Neural Network Compute Accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of gcc compilers with the BF-16 data type, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (General Matrix Multiplication) or SpMM (Sparse Matrix Multiplication) workloads (dense or sparse) and debugging data-integrity failures is highly painstaking. In this paper, we address the quality challenge of such a complex Neural Network Accelerator design by proposing a functional-model-based scoreboard, or software model, written in SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as the DUT is expected to do. The model gives considerable visibility and debug capability during DUT bring-up by exposing the micro-steps of execution.
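
A minimal sketch of a native BF-16 generator in the spirit described above: BF-16 keeps float32's sign and exponent and truncates the mantissa to 7 bits, so the top 16 bits of a float32 form the BF-16 pattern; round-to-nearest-even is shown here, while the IP's actual rounding mode is an assumption.

```python
import numpy as np

def float32_to_bf16_bits(x):
    bits = np.asarray(x, np.float32).view(np.uint32)
    rounding = ((bits >> 16) & 1) + 0x7FFF        # round to nearest, ties to even
    return ((bits + rounding) >> 16).astype(np.uint16)

def bf16_bits_to_float32(b):
    return (b.astype(np.uint32) << 16).view(np.float32)

vals = np.array([1.0, 3.14159, 1e-3], np.float32)
print(bf16_bits_to_float32(float32_to_bf16_bits(vals)))   # BF-16-quantized values
```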

Keywords: ISA (instruction set architecture), NN (neural network), TLM (transaction-level modeling), GEMM (general matrix multiplication)

Procedia PDF Downloads 86
37609 Two-Phase Sampling for Estimating a Finite Population Total in Presence of Missing Values

Authors: Daniel Fundi Murithi

Abstract:

Missing data is a real bane in many surveys. To overcome the problems it causes, methods such as partial deletion and single imputation, among others, have been proposed. However, problems such as discarding usable data and inaccuracy in reproducing known population parameters and standard errors are associated with them. For regression and stochastic imputation, it is assumed that there is a variable with complete cases to be used as a predictor in estimating missing values in the other variable, and that the relationship between the two variables is linear, which might not be realistic in practice. In this project, we estimate the population total in the presence of missing values under two-phase sampling. Instead of regression or stochastic models, a non-parametric model-based regression is used to impute the missing values. The empirical study showed that nonparametric model-based regression imputation reproduces the variance of the population total estimate obtained without missing values better than mean, median, regression, and stochastic imputation methods. Although regression and stochastic imputation were better than nonparametric model-based imputation at reproducing the population total estimates obtained without missing values for one of the sample sizes considered, nonparametric model-based imputation may be preferred when the relationship between the outcome and predictor variables is not linear.
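
As a minimal illustration of non-parametric model-based imputation, the sketch below uses k-nearest-neighbour regression (no linearity assumption) to fill missing outcome values before estimating the total; the simulated data and neighbour count are assumptions, not the project's estimator.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)                        # fully observed predictor
y = np.sin(x) * 5 + x + rng.normal(0, 0.5, 200)    # nonlinear outcome relationship
miss = rng.random(200) < 0.3                       # 30% of y missing

knn = KNeighborsRegressor(n_neighbors=7).fit(x[~miss, None], y[~miss])
y_imp = y.copy()
y_imp[miss] = knn.predict(x[miss, None])           # model-based imputation
print(y_imp.sum())                                 # estimated population total
```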

Keywords: finite population total, missing data, model-based imputation, two-phase sampling

Procedia PDF Downloads 130
37608 The Grand Unified Theory of Everything as a Generalization to the Standard Model Called as the General Standard Model

Authors: Amir Deljoo

Abstract:

The endeavor to comprehend existence has been at the center of human thought, taking the form of different disciplines and now, in physics, the form of the theory of everything. Here, after a brief review of the basic frameworks of thought and a history of thought from ancient times to the present, a logical methodology is presented based on a core axiom, after which a function, a proto-field, and then coordinates are explained. Afterwards, a generalization of the Standard Model is proposed, called the General Standard Model, which is believed to be the basis of the unified theory of everything.

Keywords: general relativity, grand unified theory, quantum mechanics, standard model, theory of everything

Procedia PDF Downloads 100
37607 A TgCNN-Based Surrogate Model for Subsurface Oil-Water Phase Flow under Multi-Well Conditions

Authors: Jian Li

Abstract:

The uncertainty quantification and inversion problems of subsurface oil-water phase flow usually require extensive repeated forward calculations for new runs with changed conditions. To reduce the computational time, various forms of surrogate models have been built. Related research shows that deep learning has emerged as an effective surrogate model, yet most deep learning surrogates are purely data-driven, which often leads to poor robustness and abnormal results. To make the model more consistent with physical laws, a coupled theory-guided convolutional neural network (TgCNN) based surrogate model is built, improving computational efficiency while maintaining satisfactory accuracy. The model is a convolutional neural network based on multi-well reservoir simulation. The core notion of the proposed method is to bridge two separate network blocks within an overall architecture; coupled together, they underlie the TgCNN model and reflect the coupled nature of pressure and water saturation in the two-phase flow equations. The model is driven by not only labeled data but also scientific theories, including governing equations, stochastic parameterization, boundary and initial conditions, well conditions, and expert knowledge. The results show that the TgCNN-based surrogate model exhibits satisfactory accuracy and efficiency in subsurface oil-water phase flow under multi-well conditions.
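
A minimal sketch of the theory-guided idea: the training loss combines a data-mismatch term with a physics-residual term at collocation points; the residual shown is a generic placeholder PDE, not the paper's two-phase flow equations, and the weighting is an assumption.

```python
import torch

def tgcnn_loss(model, x_data, y_data, x_colloc, w_phys=1.0):
    """Data-mismatch loss plus a physics residual at collocation points."""
    data_loss = torch.mean((model(x_data) - y_data) ** 2)   # labeled-data term
    x_colloc = x_colloc.clone().requires_grad_(True)
    u = model(x_colloc)
    du = torch.autograd.grad(u.sum(), x_colloc, create_graph=True)[0]
    residual = du[:, 0] + du[:, 1]       # placeholder residual, e.g. u_t + u_x = 0
    return data_loss + w_phys * torch.mean(residual ** 2)   # theory-guided term

# toy usage with a small network standing in for the CNN
net = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(),
                          torch.nn.Linear(16, 1))
print(tgcnn_loss(net, torch.rand(8, 2), torch.rand(8, 1), torch.rand(32, 2)))
```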

Keywords: coupled theory-guided convolutional neural network, multi-well conditions, surrogate model, subsurface oil-water phase

Procedia PDF Downloads 86
37606 Recommendations for Data Quality Filtering of Opportunistic Species Occurrence Data

Authors: Camille Van Eupen, Dirk Maes, Marc Herremans, Kristijn R. R. Swinnen, Ben Somers, Stijn Luca

Abstract:

In ecology, species distribution models are commonly implemented to study species-environment relationships. These models increasingly rely on opportunistic citizen science data when high-quality species records collected through standardized recording protocols are unavailable. While these opportunistic data are abundant, their uncertainty is usually high, e.g., due to observer effects or a lack of metadata. Data quality filtering is often used to reduce these types of uncertainty in an attempt to increase the value of studies relying on opportunistic data. However, filtering should not be performed blindly. In this study, recommendations are developed for the data quality filtering of opportunistic species occurrence data used as input for species distribution models. Using an extensive database of 5.7 million citizen science records from 255 species in Flanders, the impact on model performance was quantified by applying three data quality filters, and these results were linked to species traits. More specifically, presence records were filtered based on record attributes that provide information on the observation process or post-entry data validation, and changes in the area under the receiver operating characteristic curve (AUC), sensitivity, and specificity were analyzed using the Maxent algorithm with and without filtering. Controlling for sample size enabled us to study the combined impact of data quality filtering, i.e., the simultaneous impact of an increase in data quality and a decrease in sample size. Further, the variation among species in their response to data quality filtering was explored by clustering species based on four traits often related to data quality: commonness, popularity, difficulty, and body size. Findings show that model performance is affected by i) the quality of the filtered data, ii) the proportional reduction in sample size caused by filtering and the remaining absolute sample size, and iii) a species 'quality profile' resulting from a classification of species based on the four data-quality-related traits. The findings resulted in recommendations on when and how to filter volunteer-generated and opportunistically collected data. This study confirms that correctly processed citizen science data can make a valuable contribution to ecological research and species conservation.

Keywords: citizen science, data quality filtering, species distribution models, trait profiles

Procedia PDF Downloads 202