Search results for: Markov Model
16687 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference
Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira
Abstract:
Operational risk losses are heavy-tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions, and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixed distribution (the Lognormal for the body of losses and the Generalized Pareto Distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2; we implement the procedures PROC SEVERITY and PROC NLMIXED, and this paper focuses on describing that implementation.
Keywords: operational risk, loss distribution approach, extreme value theory, copulas
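The body-tail severity mixture described in (i) can be sketched outside SAS. The following Python example is a minimal sketch assuming synthetic losses and a 95th-percentile threshold (both assumptions, not the paper's choices): it fits a Lognormal to the body and a Generalized Pareto to the tail exceedances.

```python
# Sketch: Lognormal body + GPD tail for loss severities (illustrative, not the paper's SAS code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=10, sigma=2, size=5000)  # synthetic stand-in for SAS OpRisk data

u = np.quantile(losses, 0.95)            # tail threshold (assumed; EVT practice uses mean-excess plots)
body, tail = losses[losses <= u], losses[losses > u]

# Body: Lognormal fit (location fixed at 0)
shape, loc, scale = stats.lognorm.fit(body, floc=0)

# Tail: GPD fit to exceedances over u
xi, gpd_loc, beta = stats.genpareto.fit(tail - u, floc=0)

print(f"threshold u = {u:.0f}")
print(f"lognormal sigma = {shape:.3f}, GPD shape xi = {xi:.3f}, scale beta = {beta:.0f}")
```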
Procedia PDF Downloads 601
16686 Cloud Computing in Data Mining: A Technical Survey
Authors: Ghaemi Reza, Abdollahi Hamid, Dashti Elham
Abstract:
Cloud computing poses a diversity of challenges for data mining operations, arising from the dynamic structure of data distribution as against the typical database scenarios of conventional architectures. Due to the immense number of users seeking data on a daily basis, there are serious security concerns for cloud providers as well as for the data providers who place their data in the cloud computing environment. Big data analytics uses compute-intensive data mining algorithms (Hidden Markov models, MapReduce parallel programming, the Mahout project, the Hadoop distributed file system, k-means and k-medoids, Apriori) that require efficient, high-performance processors to produce timely results, with the data mining algorithms run to solve for or optimize the model parameters. The challenges the operation has to encounter are establishing successful transactions with the existing virtual machine environment and keeping the databases under control. Several factors have led to the move from normal or centralized mining to distributed data mining. One approach is SaaS, which uses multi-agent systems for implementing the different tasks of the system. There are still open problems in data mining based on cloud computing, including the design and selection of data mining algorithms.
Keywords: cloud computing, data mining, computing models, cloud services
Procedia PDF Downloads 479
16685 Mathematical Model to Quantify the Phenomenon of Democracy
Authors: Mechlouch Ridha Fethi
Abstract:
This paper presents a recent mathematical model in political science concerning democracy. The model is represented by a logarithmic equation linking the Relative Index of Democracy (RID) to the Participation Ratio (PR). Firstly, the meanings of the different parameters of the model are presented, and the variation curve of the RID according to PR, with its different critical areas, is discussed. Secondly, the model is applied to a virtual group, where we show that the model can be applied depending on gender. Thirdly, it is observed that the model can be extended to different models of democracy and can be of use in assessing the state of democracy for international organizations like the UNO.
Keywords: democracy, mathematics, modelization, quantification
Procedia PDF Downloads 368
16684 The Achievement Model of University Social Responsibility
Authors: Le Kang
Abstract:
On the research question of 'how to achieve USR', this contribution reflects on the concept of university social responsibility and identifies three achievement models of USR: the society-diversified model, the university-cooperation model, and the government-compound model. It also conducts a case study to explore the characteristics of the Chinese achievement model of USR. The contribution concludes with a discussion of how the university, government, and society balance demands and roles, and make the necessary strategic adjustments and innovative approaches to repair the shortcomings of each achievement model.
Keywords: modern university, USR, achievement model, compound model
Procedia PDF Downloads 756
16683 The Capacity of Mel Frequency Cepstral Coefficients for Speech Recognition
Authors: Fawaz S. Al-Anzi, Dia AbuZeina
Abstract:
Speech recognition makes an important contribution to promoting new technologies in human-computer interaction. Today, there is a growing need to employ speech technology in daily life and business activities. However, speech recognition is a challenging task that requires different stages before obtaining the desired output. Among the components of automatic speech recognition (ASR) is the feature extraction process, which parameterizes the speech signal to produce the corresponding feature vectors. The feature extraction process aims at approximating the linguistic content conveyed by the input speech signal. In the speech processing field, there are several methods to extract speech features; however, Mel Frequency Cepstral Coefficients (MFCC) are the most popular technique. It has long been observed that MFCC is dominantly used in well-known recognizers such as the Carnegie Mellon University (CMU) Sphinx and the Hidden Markov Model Toolkit (HTK). Hence, this paper focuses on the MFCC method as the standard choice to identify the different speech segments in order to obtain the language phonemes for further training and decoding steps. Owing to MFCC's good performance, previous studies show that MFCC dominates Arabic ASR research. In this paper, we demonstrate MFCC as well as the intermediate steps performed to obtain these coefficients using the HTK toolkit.
Keywords: speech recognition, acoustic features, mel frequency, cepstral coefficients
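Since the abstract centres on MFCC extraction, a minimal Python sketch may help. It uses librosa rather than the HTK toolkit the paper employs (an assumption for brevity), and a bundled audio clip stands in for speech.

```python
# Sketch: extracting 13 MFCCs from a waveform (illustrative; the paper uses HTK, librosa is assumed here).
import librosa

y, sr = librosa.load(librosa.ex("trumpet"))          # bundled example clip stands in for speech
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # shape (13, n_frames): one coefficient vector per frame

print(mfcc.shape)
```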
Procedia PDF Downloads 259
16682 Use of Multistage Transition Regression Models for Credit Card Income Prediction
Authors: Denys Osipenko, Jonathan Crook
Abstract:
Because of the variety of card holders' behaviour types and income sources, each consumer account can be transferred to a variety of states: inactive, transactor, revolver, delinquent, or defaulted, each requiring an individual model for income prediction. Estimating transition probabilities between statuses at the account level helps to avoid the memorylessness of the Markov chains approach. This paper investigates transition probability estimation approaches to credit card income prediction at the account level. The key question of the empirical research is which approach gives more accurate results: multinomial logistic regression or multistage conditional logistic regression with a binary target. Both models have shown moderate predictive power. Prediction accuracy for conditional logistic regression depends on the order of stages of the conditional binary logistic regressions. On the other hand, multinomial logistic regression is easier to use and gives integrated estimates for all states without priorities. Thus further investigations can be concentrated on alternative modelling approaches such as discrete choice models.
Keywords: multinomial regression, conditional logistic regression, credit account state, transition probability
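As an illustration of estimating state-transition probabilities with multinomial logistic regression, the following sketch uses synthetic covariates and state labels (all assumptions; the paper's credit card data are not public).

```python
# Sketch: account-level transition probabilities via multinomial logistic regression
# (illustrative; synthetic features, not the paper's data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
states = ["inactive", "transactor", "revolver", "delinquent", "defaulted"]

X = rng.normal(size=(2000, 4))                       # account covariates (utilisation, payments, ...)
y = rng.integers(0, len(states), size=2000)          # next-month state labels

clf = LogisticRegression(multi_class="multinomial", max_iter=1000).fit(X, y)

# Transition probabilities for one account: P(next state | covariates)
p = clf.predict_proba(X[:1])[0]
print(dict(zip(states, p.round(3))))
```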
Procedia PDF Downloads 486
16681 Model Averaging for Poisson Regression
Authors: Zhou Jianhong
Abstract:
Model averaging is a desirable approach for dealing with model uncertainty, which, however, has rarely been explored for Poisson regression. In this paper, we propose a model averaging procedure based on an unbiased estimator of the expected Kullback-Leibler distance for the Poisson regression. A simulation study shows that the proposed model average estimator outperforms some other commonly used model selection and model average estimators in some situations. Our proposed methods are further applied to a real data example, and the advantage of this method is demonstrated again.
Keywords: model averaging, Poisson regression, Kullback-Leibler distance, statistics
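The paper's unbiased estimator of the expected Kullback-Leibler distance is not reproduced here; as a hedged stand-in, this sketch averages candidate Poisson regressions with AIC-based (KL-motivated) weights on synthetic data.

```python
# Sketch: averaging candidate Poisson regressions with KL-motivated (Akaike) weights.
# The paper's unbiased KL estimator is not reproduced; AIC weights are a standard stand-in.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x1, x2 = rng.normal(size=(2, 500))
y = rng.poisson(np.exp(0.3 + 0.5 * x1))             # x2 is a spurious covariate

candidates = [sm.add_constant(np.column_stack(c))
              for c in [(x1,), (x2,), (x1, x2)]]
fits = [sm.GLM(y, X, family=sm.families.Poisson()).fit() for X in candidates]

aic = np.array([f.aic for f in fits])
w = np.exp(-0.5 * (aic - aic.min()))
w /= w.sum()                                        # model weights

mu_avg = sum(wi * f.fittedvalues for wi, f in zip(w, fits))  # averaged mean prediction
print("weights:", w.round(3))
```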
Procedia PDF Downloads 520
16680 Implementation and Validation of a Damage-Friction Constitutive Model for Concrete
Authors: L. Madouni, M. Ould Ouali, N. E. Hannachi
Abstract:
Two constitutive models for concrete are available in ABAQUS/Explicit, the Brittle Cracking Model and the Concrete Damaged Plasticity Model, and their suitability and limitations are well known. The aim of the present paper is to implement a damage-friction concrete constitutive model and to evaluate its performance by comparing the predicted response with experimental data. The constitutive formulation of this material model is reviewed. In order to obtain consistent results, parameter identification and calibration for the model have been performed. Several numerical simulations are presented, whose results validate the capability of the proposed model to reproduce the typical nonlinear behaviour of concrete structures under different monotonic and cyclic load conditions. The results of the evaluation will be used for recommendations concerning the application and further improvement of the investigated model.
Keywords: Abaqus, concrete, constitutive model, numerical simulation
Procedia PDF Downloads 364
16679 Development and Validation of a Coronary Heart Disease Risk Score in Indian Type 2 Diabetes Mellitus Patients
Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad
Abstract:
Diabetes in India is growing at an alarming rate, and the complications it causes need to be controlled. Coronary heart disease (CHD) is the complication whose prediction is discussed in this study. India has the second largest number of diabetes patients in the world, and to the best of our knowledge there is no CHD risk score for Indian type 2 diabetes patients. Any form of CHD has been taken as the event of interest. A sample of 750 was determined and randomly collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. Collected variables include patients' data such as sex, age, height, weight, body mass index (BMI), blood sugar fasting (BSF), post prandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking, alcohol habits, total cholesterol (TC), triglycerides (TG), high density lipoprotein (HDL), low density lipoprotein (LDL), very low density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity and history of CHD. Predictive risk scores for CHD events are designed by Cox proportional hazard regression. Model calibration and discrimination are assessed with the Hosmer-Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization techniques, and the best method is selected among ridge, lasso and elastic net regression. Youden's index is used to choose the optimal cut-off point for the scores. The five-year probability of CHD is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The risk scores for CHD can be calculated by doctors and patients for self-management of diabetes; furthermore, the five-year probabilities can be used to forecast and monitor the condition of patients.
Keywords: coronary heart disease, Cox proportional hazard regression, ROC curve, type 2 diabetes mellitus
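As a hedged sketch of the Cox-regression step and the five-year probability it feeds, the following Python example uses the lifelines package, three stand-in covariates, and synthetic follow-up data (all assumptions; the clinical data are not reproduced).

```python
# Sketch: Cox PH risk score and five-year CHD probability (illustrative; synthetic data, lifelines assumed).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 750
df = pd.DataFrame({
    "age": rng.normal(55, 10, n),
    "hba1c": rng.normal(8, 1.5, n),
    "sbp": rng.normal(130, 15, n),
    "time": rng.exponential(8, n),                  # years of follow-up
    "chd": rng.integers(0, 2, n),                   # event indicator
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="chd")

# Five-year CHD probability = 1 - S(5 | covariates), here for the first three patients
surv = cph.predict_survival_function(df.iloc[:3], times=[5.0])
print((1 - surv).T)
```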
Procedia PDF Downloads 219
16678 Effect of Atmospheric Turbulence on Hybrid FSO/RF Link Availability under Qatar's Harsh Climate
Authors: Abir Touati, Syed Jawad Hussain, Farid Touati, Ammar Bouallegue
Abstract:
Although there has been growing interest in hybrid free-space optical/radio frequency (FSO/RF) communication systems, the current literature is limited to results obtained in moderate or cold environments. In this paper, using a soft switching approach, we investigate the effect of weather inhomogeneities on the strength of turbulence, and hence on the channel refractive index, under Qatar's harsh environment, and their influence on hybrid FSO/RF availability. In this approach, the FSO link, the RF link, both simultaneously, or neither can be active. Based on the soft switching approach and a finite state Markov chain (FSMC) process, we model the channel fading for the two links and derive a mathematical expression for the outage probability of the hybrid system. We then evaluate the behaviour of the hybrid FSO/RF system under hazy and harsh weather. Results show that FSO/RF soft switching renders the system outage probability lower than that of each link individually. A soft switching algorithm is being implemented on FPGAs using a Raptor code interfaced to the two terminals of a 1 Gbps/100 Mbps FSO/RF hybrid system, the first implemented in the region. Experimental results are compared to the above simulation results.
Keywords: atmospheric turbulence, haze, hybrid FSO/RF, outage probability, refractive index
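The FSMC view of the hybrid link can be illustrated with a small sketch: given an assumed four-state transition matrix (the paper's fading-derived probabilities are not reproduced), the stationary distribution yields the outage probability as the mass on the 'neither link' state.

```python
# Sketch: outage probability of a two-link system from a finite-state Markov chain
# (illustrative; transition probabilities and state labels are assumed, not the paper's values).
import numpy as np

# States: 0 = FSO active, 1 = RF active, 2 = both active, 3 = neither (outage)
P = np.array([[0.80, 0.05, 0.10, 0.05],
              [0.10, 0.70, 0.15, 0.05],
              [0.15, 0.10, 0.70, 0.05],
              [0.20, 0.20, 0.20, 0.40]])

# Stationary distribution: left eigenvector of P for eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

print("stationary distribution:", pi.round(4))
print("outage probability P(neither link):", pi[3].round(4))
```

Repeated matrix powers of P converge to the same stationary distribution and can replace the eigenvector step if preferred.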
Procedia PDF Downloads 418
16677 Model Driven Architecture Methodologies: A Review
Authors: Arslan Murtaza
Abstract:
Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development in which different models are proposed and then converted into code. The main plan is to specify the task using a Platform Independent Model (PIM), transform it into a Platform Specific Model (PSM), and then convert it into code. This review paper describes some challenges and issues faced in MDA, the types and transformations of models (e.g. CIM, PIM and PSM), and an evaluation of MDA-based methodologies.
Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies
Procedia PDF Downloads 458
16676 The Influence of the Concentration and Temperature on the Rheological Behavior of Carbonyl-Methylcellulose
Authors: Mohamed Rabhi, Kouider Halim Benrahou
Abstract:
The rheological properties of carbonyl-methylcellulose (CMC) at different concentrations (25000, 50000, 60000, 80000 and 100000 ppm) and different temperatures were studied. We found that the rheological behaviour of all CMC solutions is pseudo-plastic, following the Ostwald-de Waele model. The objective of this work is the modelling of the flow of CMC with the Cross model, which gives the variation of viscosity with shear rate and allowed us to fit the rheological characteristics of the CMC solutions more precisely. A comparison between the Cross model and the Ostwald model was made: the fitting parameters of the Cross model were determined by numerical simulation so as to bring the experimental curve and those given by the two models into agreement. Our study has shown that the Cross model describes the flow of CMC well at low concentrations.
Keywords: CMC, rheological modeling, Ostwald model, Cross model, viscosity
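The Cross model referred to here is the standard four-parameter equation eta(gamma) = eta_inf + (eta_0 - eta_inf) / (1 + (lambda*gamma)^m). A minimal sketch of the numerical fit, assuming synthetic viscosity data in place of the CMC measurements, follows.

```python
# Sketch: fitting the Cross viscosity model eta(g) = eta_inf + (eta0 - eta_inf)/(1 + (lam*g)**m)
# to shear-rate/viscosity data (synthetic here, standing in for the CMC measurements).
import numpy as np
from scipy.optimize import curve_fit

def cross(gamma_dot, eta0, eta_inf, lam, m):
    return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * gamma_dot) ** m)

gamma_dot = np.logspace(-1, 3, 40)                       # shear rates, 1/s
eta = cross(gamma_dot, 5.0, 0.01, 0.8, 0.7)
eta *= 1 + 0.03 * np.random.default_rng(4).normal(size=eta.size)  # measurement noise

popt, _ = curve_fit(cross, gamma_dot, eta, p0=[1, 0.001, 1, 1])
print("eta0, eta_inf, lambda, m =", np.round(popt, 4))
```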
Procedia PDF Downloads 405
16675 3D Model of Rain-Wind Induced Vibration of Inclined Cable
Authors: Viet-Hung Truong, Seung-Eock Kim
Abstract:
Rain–wind induced vibration of inclined cables is a special aerodynamic phenomenon because it is easily influenced by many factors, especially the distribution of the rivulet and the wind velocity. This paper proposes a new 3D model of an inclined cable based on a single degree-of-freedom model. The aerodynamic forces are first established and verified against existing results from a 2D model. The 3D model of the inclined cable is then developed and applied to assess the effects of the wind velocity distribution and the continuity of rivulets on the cable. Finally, an inclined cable model with small sag is investigated.
Keywords: 3D model, rain - wind induced vibration, rivulet, analytical model
Procedia PDF Downloads 489
16674 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis
Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee
Abstract:
In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented previously behave when the data are mixed with noise, and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original papers, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in the papers and analyse the effect of noise. From this, we can assess the robustness and applicability in Korea of each model.
Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences
Procedia PDF Downloads 743
16673 Evaluating Machine Learning Techniques for Activity Classification in Smart Home Environments
Authors: Talal Alshammari, Nasser Alshammari, Mohamed Sedky, Chris Howard
Abstract:
With the widespread adoption of Internet-connected devices and the prevalence of Internet of Things (IoT) applications, there is increased interest in machine learning techniques that can provide useful and interesting services in the smart home domain. The areas that machine learning techniques can help advance are varied and ever-evolving; classifying smart home inhabitants' Activities of Daily Living (ADLs) is one prominent example. The ability of a machine learning technique to find meaningful spatio-temporal relations in high-dimensional data is an important requirement as well. This paper presents a comparative evaluation of state-of-the-art machine learning techniques for classifying ADLs in the smart home domain. Forty-two synthetic datasets and two real-world datasets with multiple inhabitants are used to evaluate and compare the performance of the identified machine learning techniques: AdaBoost, the Cortical Learning Algorithm (CLA), decision trees, Hidden Markov Models (HMM), Multi-layer Perceptrons (MLP), structured perceptrons, and Support Vector Machines (SVM). Our results show significant performance differences between the evaluated techniques; overall, neural network based techniques have shown superiority over the other tested techniques.
Keywords: activities of daily living, classification, internet of things, machine learning, prediction, smart home
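A hedged sketch of such a comparative evaluation, using a few of the listed techniques as implemented in scikit-learn and a synthetic dataset standing in for the smart-home sensor data (both assumptions), follows.

```python
# Sketch: comparing several of the listed classifiers via cross-validation
# (illustrative; synthetic data stands in for the smart-home ADL datasets).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, n_classes=5,
                           n_informative=10, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(),
    "DecisionTree": DecisionTreeClassifier(),
    "MLP": MLPClassifier(max_iter=1000),
    "SVM": SVC(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:12s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```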
Procedia PDF Downloads 357
16672 Identifying Diabetic Retinopathy Complication by Predictive Techniques in Indian Type 2 Diabetes Mellitus Patients
Authors: Faiz N. K. Yusufi, Aquil Ahmed, Jamal Ahmad
Abstract:
Predicting the risk of diabetic retinopathy (DR) in Indian type 2 diabetes patients is immensely necessary. India being the second largest country after China in terms of the number of diabetic patients, to the best of our knowledge not a single risk score for complications has ever been investigated. Diabetic retinopathy is a serious complication and is the topmost reason for visual impairment across countries. Any type or form of DR has been taken as the event of interest, be it mild, background, or grade I, II, III or IV DR. A sample was determined and randomly collected from the Rajiv Gandhi Centre for Diabetes and Endocrinology, J.N.M.C., A.M.U., Aligarh, India. Collected variables include patients' data such as sex, age, height, weight, body mass index (BMI), blood sugar fasting (BSF), post prandial sugar (PP), glycosylated haemoglobin (HbA1c), diastolic blood pressure (DBP), systolic blood pressure (SBP), smoking, alcohol habits, total cholesterol (TC), triglycerides (TG), high density lipoprotein (HDL), low density lipoprotein (LDL), very low density lipoprotein (VLDL), physical activity, duration of diabetes, diet control, history of antihypertensive drug treatment, family history of diabetes, waist circumference, hip circumference, medications, central obesity and history of DR. Cox proportional hazard regression is used to design risk scores for the prediction of retinopathy. Model calibration and discrimination are assessed with the Hosmer-Lemeshow test and the area under the receiver operating characteristic (ROC) curve. Overfitting and underfitting of the model are checked by applying regularization techniques, and the best method is selected among ridge, lasso and elastic net regression. The optimal cut-off point is chosen by Youden's index. The five-year probability of DR is predicted by both the survival function and a two-state Markov chain model, and the better technique is identified. The risk scores developed can be applied by doctors and patients themselves for self-evaluation; furthermore, the five-year probabilities can be used to forecast and monitor the condition of patients. This provides immense benefit for the real-world application of DR prediction in T2DM.
Keywords: Cox proportional hazard regression, diabetic retinopathy, ROC curve, type 2 diabetes mellitus
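The Cox step is sketched under the CHD abstract above; the two-state Markov chain alternative for the five-year probability can be sketched as follows, with assumed annual transition probabilities (not estimates from the clinical data).

```python
# Sketch: five-year complication probability from a two-state Markov chain
# (states: 0 = no DR, 1 = DR; annual transition probabilities are assumed for illustration).
import numpy as np

P = np.array([[0.93, 0.07],    # no DR -> {no DR, DR}
              [0.00, 1.00]])   # DR is treated as absorbing here

P5 = np.linalg.matrix_power(P, 5)       # five annual steps
print("P(DR within 5 years | no DR now) =", P5[0, 1].round(4))
```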
Procedia PDF Downloads 186
16671 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution
Authors: Najrullah Khan, Athar Ali Khan
Abstract:
The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made; R and JAGS codes have been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modelling of the data and the interpretation of the results; crucial to this is an understanding of the nature of the incomplete, or 'censored', data encountered. Analytic approximation and simulation tools are both covered, but most of the emphasis is on Markov chain based Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust region method is found to be the best. The TPGE model is also used to analyse the lifetime data in the Bayesian paradigm, with results evaluated for the above-mentioned real survival data set; the analytic approximation and simulation methods are implemented using software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.
Keywords: Bayesian inference, JAGS, Laplace approximation, LaplacesDemon, posterior, R software, simulation
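The paper's R/JAGS code is not reproduced here; as a hedged illustration of the independent Metropolis step it mentions, the following Python sketch samples the posterior of a simple exponential lifetime model with a gamma proposal (the model, prior, and data are all assumptions, not the TPGE model itself).

```python
# Sketch: independence Metropolis sampling of a lifetime-model posterior
# (toy exponential likelihood with gamma prior/proposal; the TPGE model is not reproduced).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
t = rng.exponential(scale=2.0, size=50)              # observed lifetimes

def log_post(lam):                                   # Exponential(lam) likelihood, Gamma(1,1) prior
    return len(t) * np.log(lam) - lam * t.sum() - lam

prop = stats.gamma(a=len(t), scale=1.0 / t.sum())    # proposal independent of current state
lam, draws = 1.0, []
for _ in range(5000):
    cand = prop.rvs(random_state=rng)
    # acceptance ratio of the independence sampler: pi(c) q(x) / (pi(x) q(c))
    log_a = (log_post(cand) - log_post(lam)) + (prop.logpdf(lam) - prop.logpdf(cand))
    if np.log(rng.uniform()) < log_a:
        lam = cand
    draws.append(lam)

print("posterior mean of rate:", np.mean(draws[1000:]).round(3))
```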
Procedia PDF Downloads 535
16670 Bayesian Semiparametric Geoadditive Modelling of Underweight Malnutrition of Children under 5 Years in Ethiopia
Authors: Endeshaw Assefa Derso, Maria Gabriella Campolo, Angela Alibrandi
Abstract:
Objectives: Early childhood malnutrition can have long-term and irreversible effects on a child's health and development. This study uses a Bayesian method with spatial variation to investigate the flexible trends of metrical covariates and to identify communities at high risk. Methods: Cross-sectional data on underweight are taken from the 2016 Ethiopian Demographic and Health Survey (EDHS), and a Bayesian geo-additive model is fitted. Appropriate prior distributions were provided for the scale parameters in the models, and the inference is entirely Bayesian, using Markov chain Monte Carlo (MCMC) simulation. Results: The results show that metrical covariates such as child age, maternal body mass index (BMI), and maternal age affect a child's underweight status non-linearly. Both lower and higher maternal BMI appear to have a significant impact on high child underweight. There was also significant spatial heterogeneity, and based on IDW interpolation of predicted values, the western, central, and eastern parts of the country are hotspot areas. Conclusion: Socio-demographic and community-based programme development should be considered comprehensively in Ethiopian policy to combat childhood underweight malnutrition.
Keywords: BayesX, Ethiopia, malnutrition, MCMC, semi-parametric Bayesian analysis, spatial distribution, P-splines
Procedia PDF Downloads 87
16669 Jointly Optimal Statistical Process Control and Maintenance Policy for Deteriorating Processes
Authors: Lucas Paganin, Viliam Makis
Abstract:
With the advent of globalization, market competition has become a major issue for most companies. One of the main strategies to overcome this situation is to improve the quality of the product at a lower cost to meet customers' expectations. In order to achieve the desired quality of products, it is important to control the process to meet the specifications and to implement the optimal maintenance policy for the machines and the production lines. Thus, the overall objective is to reduce process variation and the production and maintenance costs. In this paper, an integrated model involving statistical process control (SPC) and maintenance is developed to achieve this goal; the main focus is to develop the jointly optimal maintenance and statistical process control policy minimizing the total long-run expected average cost per unit time. In our model, the production process can go out of control due to either the deterioration of equipment or other assignable causes, and the equipment is also subject to failures in any of the operating states due to deterioration and aging. The process mean is controlled by an Xbar control chart using equidistant sampling epochs. We assume that the machine inspection epochs are the times when the control chart signals an out-of-control condition, considering both true and false alarms. At these times, the production process will be stopped, and an investigation will be conducted not only to determine whether it is a true or false alarm, but also to identify the causes of a true alarm: whether it was caused by a change in the machine setting, by other assignable causes, or by both. If the system is out of control, the proper actions will be taken to bring it back to the in-control state. At these epochs, a maintenance action can be taken, which can be no action or preventive replacement of the unit. When the equipment is in the failure state, a corrective maintenance action is performed, which can be minimal repair or replacement of the machine, and the process is brought back to the in-control state. A semi-Markov decision process (SMDP) framework is used to formulate and solve the joint control problem, and a numerical example is developed to demonstrate the effectiveness of the control policy.
Keywords: maintenance, semi-Markov decision process, statistical process control, Xbar control chart
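The Xbar-chart monitoring the paper builds on can be sketched in a few lines; the subgroup size, 3-sigma limits, and injected mean shift below are assumptions for illustration, not the SMDP policy itself.

```python
# Sketch: an Xbar control chart with 3-sigma limits flagging out-of-control subgroups
# (illustrative; subgroup size and shift are assumed).
import numpy as np

rng = np.random.default_rng(6)
mu0, sigma, n = 10.0, 1.0, 5                    # in-control mean, sd, subgroup size

samples = rng.normal(mu0, sigma, size=(30, n))
samples[20:] += 1.5                             # process mean shifts at subgroup 20

xbar = samples.mean(axis=1)
ucl = mu0 + 3 * sigma / np.sqrt(n)
lcl = mu0 - 3 * sigma / np.sqrt(n)

signals = np.where((xbar > ucl) | (xbar < lcl))[0]
print("control limits:", (round(lcl, 3), round(ucl, 3)))
print("out-of-control signals at subgroups:", signals)
```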
Procedia PDF Downloads 91
16668 Modelling Volatility Spillovers and Cross Hedging among Major Agricultural Commodity Futures
Authors: Roengchai Tansuchat, Woraphon Yamaka, Paravee Maneejuk
Abstract:
In the recent past, the global financial crisis, economic instability, and large fluctuations in agricultural commodity prices have led to increased concern about volatility transmission among them. The problem is further exacerbated by commodity volatility caused by other commodity price fluctuations, so that deciding on a hedging strategy has become both costly and difficult. This paper therefore analyses the volatility spillover effects among major agricultural commodities, including corn, soybeans, wheat and rice, to help commodity suppliers hedge their portfolios and manage their risk and co-volatility. We provide a switching-regime approach to analysing volatility spillovers in different economic conditions, namely economic upturns and downturns; in particular, we investigate the relationships and volatility transmissions between these commodities in different economic conditions. We propose a copula-based multivariate Markov switching GARCH model with two regimes that depend on the economic conditions, and we perform a simulation study to check the accuracy of the proposed model. In this study, the correlation term in the cross-hedge ratio is obtained from six copula families: two elliptical copulas (Gaussian and Student-t) and four Archimedean copulas (Clayton, Gumbel, Frank, and Joe). We use one-step maximum likelihood estimation to estimate our models and compare the performance of these copulas using the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). In the application to agricultural commodities, weekly data from 4 January 2005 to 1 September 2016 are used, covering 612 observations. The empirical results indicate that the volatility spillover effects among cereal futures differ in response to different economic conditions. In addition, the results on hedge effectiveness suggest the optimal cross-hedge strategies in different economic conditions, especially economic upturns and downturns.
Keywords: agricultural commodity futures, cereal, cross-hedge, spillover effect, switching regime approach
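The cross-hedge ratio whose correlation term the copulas supply reduces, in the minimum-variance case, to h* = rho * sigma_spot / sigma_futures. A minimal sketch with an assumed covariance (in place of the Markov switching GARCH estimates) follows.

```python
# Sketch: the minimum-variance cross-hedge ratio h* = rho * sigma_spot / sigma_futures
# (illustrative; synthetic returns stand in for the weekly cereal futures data).
import numpy as np

rng = np.random.default_rng(7)
cov = [[1.0, 0.7], [0.7, 1.2]]                       # assumed spot/futures covariance
spot, fut = rng.multivariate_normal([0, 0], cov, size=612).T

rho = np.corrcoef(spot, fut)[0, 1]
h_star = rho * spot.std() / fut.std()                # equivalently Cov(s, f) / Var(f)

print("correlation:", round(rho, 3), " optimal hedge ratio:", round(h_star, 3))
```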
Procedia PDF Downloads 202
16667 Risk Assessment of Flood Defences by Utilising Condition Grade Based Probabilistic Approach
Authors: M. Bahari Mehrabani, Hua-Peng Chen
Abstract:
Management and maintenance of coastal defence structures during their expected life cycle have become a real challenge for decision makers and engineers. Accurate evaluation of the current condition and future performance of flood defence structures is essential for effective practical maintenance strategies on the basis of available field inspection data. Moreover, as coastal defence structures age, it becomes more challenging to implement maintenance and management plans that avoid structural failure. Condition inspection data are therefore essential for assessing damage and forecasting the deterioration of ageing flood defence structures in order to keep them in an acceptable condition. Inspection data for flood defence structures are often collected using discrete visual condition rating schemes, and evaluating the future condition of a structure requires a probabilistic deterioration model. However, existing deterioration models may not provide a reliable prediction of performance deterioration over a long period, due to uncertainties. To tackle this limitation, a time-dependent, condition-based model with transition probabilities needs to be developed on the basis of the condition grade scheme for flood defences. This paper presents a probabilistic method for predicting the future performance deterioration of coastal flood defence structures based on condition grading inspection data and deterioration curves estimated by expert judgement. In condition-based deterioration modelling, the main task is to estimate the transition probability matrices. The deterioration process of the structure is modelled as a Markov chain over the transition states, and a reliability-based approach is used to estimate the probability of structural failure. Visual inspection data according to the United Kingdom Condition Assessment Manual are used to obtain the initial condition grade curve of the coastal flood defences. The initial curves are then modified in order to develop transition probabilities through non-linear regression based optimisation algorithms. Monte Carlo simulations are then used to evaluate the future performance of the structure on the basis of the estimated transition probabilities. Finally, a case study is given to demonstrate the applicability of the proposed method under no-maintenance and medium-maintenance scenarios. Results show that the proposed method provides an effective predictive model for various situations in terms of available condition grading data, and it also provides useful information on the time-dependent probability of failure in coastal flood defences.
Keywords: condition grading, flood defence, performance assessment, stochastic deterioration modelling
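The Monte Carlo step over a condition-grade Markov chain can be sketched as follows; the five grades and the yearly transition matrix are assumptions for illustration, not values fitted from the UK Condition Assessment Manual data.

```python
# Sketch: Monte Carlo simulation of condition-grade deterioration under a Markov chain
# (illustrative; grades and yearly transition probabilities are assumed, not the paper's values).
import numpy as np

rng = np.random.default_rng(8)
# Grades 1 (as new) .. 5 (failed); upper-triangular: condition only worsens without maintenance
P = np.array([[0.90, 0.08, 0.02, 0.00, 0.00],
              [0.00, 0.88, 0.09, 0.03, 0.00],
              [0.00, 0.00, 0.85, 0.11, 0.04],
              [0.00, 0.00, 0.00, 0.82, 0.18],
              [0.00, 0.00, 0.00, 0.00, 1.00]])

years, n_sims = 30, 10000
state = np.zeros(n_sims, dtype=int)                  # all structures start at grade 1
for _ in range(years):
    state = np.array([rng.choice(5, p=P[s]) for s in state])

print("P(failed within 30 years):", (state == 4).mean())
```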
Procedia PDF Downloads 233
16666 Equivalent Circuit Model for the Eddy Current Damping with Frequency-Dependence
Authors: Zhiguo Shi, Cheng Ning Loong, Jiazeng Shan, Weichao Wu
Abstract:
This study proposes an equivalent circuit model to simulate the eddy current damping force, validated with shaking table tests and finite element modelling. The model is first proposed and applied to a simple eddy current damper modelled in ANSYS, indicating that it can simulate the eddy current damping force under different types of excitation. Then, a non-contact, friction-free eddy current damper is designed and tested, and the proposed model reproduces the experimental observations. The excellent agreement between the simulated results and the experimental data validates the accuracy and reliability of the equivalent circuit model. Furthermore, a more complicated model is built in ANSYS to verify the feasibility of the equivalent circuit model for a complex eddy current damper, and a higher-order fractional model and a viscous model are adopted for comparison.
Keywords: equivalent circuit model, eddy current damping, finite element model, shake table test
Procedia PDF Downloads 191
16665 The Extended Skew Gaussian Process for Regression
Authors: M. T. Alodat
Abstract:
In this paper, we propose a generalization of the Gaussian process regression (GPR) model called the extended skew Gaussian process for regression (ESGPr) model. The ESGPR model works better than the GPR model when the errors are skewed. We derive the predictive distribution for the ESGPR model at a new input. We also apply the ESGPR model to FOREX data and find that it fits the Forex data better than the GPR model.
Keywords: extended skew normal distribution, Gaussian process for regression, predictive distribution, ESGPr model
Procedia PDF Downloads 553
16664 Camera Model Identification for Mi Pad 4, Oppo A37f, Samsung M20, and Oppo f9
Authors: Ulrich Wake, Eniman Syamsuddin
Abstract:
The model for camera model identification is trained using the pretrained models ResNet34 and ResNet50. The dataset consists of 500 photos from each phone, divided into 1280 photos for training, 320 photos for validation, and 400 photos for testing. The model is trained using the one cycle policy method and tested using test-time augmentation. Furthermore, the model is trained for 50 epochs using regularization such as dropout and early stopping. The result is 90% accuracy on the validation set and above 85% with test-time augmentation using ResNet50. Every model is also trained by slightly updating the pretrained model's weights.
Keywords: one cycle policy, ResNet34, ResNet50, test-time augmentation
Procedia PDF Downloads 208
16663 A Theoretical Hypothesis on Ferris Wheel Model of University Social Responsibility
Authors: Le Kang
Abstract:
According to the nature of the university as a free and responsible academic community, USR rests on a different foundation, academic responsibility, so the Pyramid and IC models of CSR cannot fully explain the most distinctive feature of USR. This paper seeks to put forward a new model, the Ferris Wheel Model, to illustrate the nature of USR and the process of its achievement. The Ferris Wheel Model of USR shows that the university creates a balanced, fair, and neutral systemic structure to discharge its social responsibilities; this allows the organization to obtain a synergistic effect and achieve the more extensive interests of stakeholders and wider social responsibilities.
Keywords: USR, achievement model, ferris wheel model, social responsibilities
Procedia PDF Downloads 724
16662 Model Predictive Control of Three Phase Inverter for PV Systems
Authors: Irtaza M. Syed, Kaamran Raahemifar
Abstract:
This paper presents model predictive control (MPC) of a utility-interactive three-phase inverter (TPI) for a photovoltaic (PV) system at commercial scale. The proposed model uses a phase-locked loop (PLL) to synchronize the TPI with the power electric grid (PEG) and performs MPC in the dq reference frame. The TPI model consists of a boost converter (BC), maximum power point tracking (MPPT) control, and a three-leg voltage source inverter (VSI). An operational model of the VSI is used to synthesize the sinusoidal current and track the reference. The model is validated using a 35.7 kW PV system in Matlab/Simulink. Implementation and results show the simplicity, accuracy, and reliability of the model.
Keywords: model predictive control, three phase voltage source inverter, PV system, Matlab/Simulink
Procedia PDF Downloads 594
16661 Model Observability – A Monitoring Solution for Machine Learning Models
Authors: Amreth Chandrasehar
Abstract:
Machine learning (ML) models are developed and run in production to solve various use cases that help organizations be more efficient and drive the business. But this comes at a massive development cost and in lost business opportunities. According to a Gartner report, 85% of data science projects fail, and one of the contributing factors is not paying attention to model observability. Model observability helps developers and operators pinpoint model performance issues and data drift, and helps identify the root cause of issues. This paper focuses on providing insights into incorporating model observability into model development and operationalizing it in production.
Keywords: model observability, monitoring, drift detection, ML observability platform
Procedia PDF Downloads 112
16660 Consensus Reaching Process and False Consensus Effect in a Problem of Portfolio Selection
Authors: Viviana Ventre, Giacomo Di Tollo, Roberta Martino
Abstract:
The portfolio selection problem includes the evaluation of many criteria that are difficult to compare directly and is characterized by uncertain elements. It can be modelled as a group decision problem in which several experts are invited to present their assessments. In this context, it is important to study and analyse the process of reaching a consensus among group members; indeed, owing to the various diversities among experts, reaching consensus is not always simple or easily achievable. Moreover, the concept of consensus is accompanied by the concept of false consensus, which is particularly interesting in the dynamics of group decision-making processes. False consensus can alter the evaluation and selection phase of the alternatives and is the consequence of the decision maker's inability to recognize that his preferences are conditioned by subjective structures. The present work aims to investigate the dynamics of consensus attainment in a group decision problem in which equivalent portfolios are proposed; in particular, it analyses the impact of the subjective structure of the decision maker during the evaluation and selection of the alternatives. The experimental framework is divided into three phases. In the first phase, the experts evaluate the characteristics of all portfolios individually, without peer comparison, arriving independently at the selection of the preferred portfolio. The experts' evaluations are used to obtain individual Analytic Hierarchy Processes that define the weight each expert gives to all criteria with respect to the proposed alternatives. This step provides insight into how the decision maker's decision process develops, step by step, from goal analysis to alternative selection. The second phase describes the decision maker's state through Markov chains: the individual weights obtained in the first phase can be reinterpreted as transition weights from one state to another, so that, with the construction of the individual transition matrices, the possible next state of each expert is determined from the individual weights at the end of the first phase. Finally, the experts meet, and the process of reaching consensus is analysed by considering the individual states obtained in the previous stage and the false consensus bias. The work contributes to the study of the impact of subjective structures, quantified through the Analytic Hierarchy Process, and of how they combine with the false consensus bias in group decision-making dynamics and the consensus reaching process in problems involving the selection of equivalent portfolios.
Keywords: analytic hierarchy process, consensus building, false consensus effect, Markov chains, portfolio selection problem
Procedia PDF Downloads 93
16659 All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model
Authors: S. A. Sadegh Zadeh, C. Kambhampati
Abstract:
Mathematical and computational modelling are necessary tools for reviewing, analysing, and predicting processes and events across a wide spectrum of scientific fields. Therefore, in a field as rapidly developing as neuroscience, the combination of these two kinds of modelling can play a significant role in helping to guide the direction the field takes. This paper combines mathematical and computational modelling to expose a weakness in a highly valued model in neuroscience: it analyses the all-or-none principle in the Hodgkin-Huxley mathematical model. By implementing a computational version of the Hodgkin-Huxley model and applying the concept of the all-or-none principle, an investigation of this mathematical model has been performed. The results clearly showed that the Hodgkin-Huxley mathematical model does not observe this fundamental law of neurophysiology when generating action potentials. This study shows that further mathematical work on the Hodgkin-Huxley model is needed in order to create a model without this weakness.
Keywords: all-or-none, computational modelling, mathematical model, transmembrane voltage, action potential
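A computational implementation of the kind the abstract describes can be sketched compactly. The following Python example integrates the standard Hodgkin-Huxley equations with forward Euler (textbook parameter values and a 1 ms pulse are assumed) and sweeps the stimulus amplitude to probe the all-or-none behaviour of the peak response.

```python
# Sketch: forward-Euler integration of the Hodgkin-Huxley equations, probing the
# all-or-none principle by sweeping the stimulus amplitude (standard textbook parameters assumed).
import numpy as np

C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3          # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.387              # mV

def a_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * np.exp(-(V + 65) / 80)
def a_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * np.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + np.exp(-(V + 35) / 10))

def peak_response(I_amp, T=50.0, dt=0.01):
    V, n, m, h = -65.0, 0.317, 0.053, 0.596     # resting state
    peak = V
    for step in range(int(T / dt)):
        I = I_amp if step * dt < 1.0 else 0.0   # 1 ms current pulse
        INa = gNa * m**3 * h * (V - ENa)
        IK = gK * n**4 * (V - EK)
        IL = gL * (V - EL)
        V += dt * (I - INa - IK - IL) / C
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        peak = max(peak, V)
    return peak

for amp in [2, 4, 6, 8, 10, 20]:                # uA/cm^2
    print(f"I = {amp:5.1f} -> peak V = {peak_response(amp):7.2f} mV")
```

Subthreshold pulses yield small graded peaks while suprathreshold pulses yield full spikes, which is the behaviour the all-or-none test examines.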
Procedia PDF Downloads 617
16658 Decision Support: How Explainable A.I. Can Improve Transparency and Trust with Human Users
Authors: Devon Brown, Liu Chunmei
Abstract:
This paper presents an analysis, as part of the researchers' dissertation topic, focusing on the intersection of affective and analytical directed acyclic graphs (DAGs) in the context of decision support systems (DSS). The researchers' work involves analysing decision theory models, such as affective and Bayesian decision theory models, and how they could be implemented under an affective computing framework using information fusion and human-centred design. Additionally, the researchers are beginning research on an Affective-Analytic Decision Framework (AADF) model for their dissertation and are looking to merge logic and analytic models with empathetic insights into affective DAGs. Data-collection efforts begin in Fall 2024; in preparation, this paper analyses previous research in this area, introduces the AADF framework, and proposes conceptual models for consideration. For this paper, the research emphasis is placed on analysing Bayesian networks and Markov models, which offer probabilistic techniques for decision-making under uncertainty. Ideally, including affect in analytic models will ensure that algorithms can increase user trust by incorporating emotional states and the user's experience, with the goal of developing emotionally intelligent AI systems that can begin to navigate the complex fabric of human emotion during decision-making.
Keywords: decision support systems, explainable AI, HCAI techniques, affective-analytical decision framework
Procedia PDF Downloads 20