Search results for: random intercepts model
17981 A Machine Learning Approach for Intelligent Transportation System Management on Urban Roads
Authors: Ashish Dhamaniya, Vineet Jain, Rajesh Chouhan
Abstract:
Traffic management is one of the biggest issues on urban roads in almost all metropolitan cities in India. Speed is one of the critical traffic parameters for effective Intelligent Transportation System (ITS) implementation, as it decides the arrival rate of vehicles at intersections, which are the major points of congestion. The study aimed to leverage Machine Learning (ML) models to produce precise predictions of speed on urban roadway links. The research objective was to assess how categorized traffic volume and road width, serving as variables, influence speed prediction. Four tree-based regression models, namely Decision Tree (DT), Random Forest (RF), Extra Tree (ET), and Extreme Gradient Boost (XGB), are employed for this purpose. The models' performances were validated using test data, and the results demonstrate that Random Forest surpasses the other machine learning techniques and a conventional utility theory-based model in speed prediction. The study is useful for managing urban roadway network performance under mixed traffic conditions and for effective implementation of ITS.
Keywords: stream speed, urban roads, machine learning, traffic flow
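As a rough illustration of the modelling step described above, the sketch below fits a Random Forest regressor to synthetic link-speed data; the feature encoding, data values, and hyperparameters are assumptions for demonstration, not the study's.

```python
# Minimal sketch (not the authors' code): Random Forest speed prediction from
# categorized traffic volume and road width. All data are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(0, 4, n),               # categorized traffic volume class
    rng.choice([7.0, 11.0, 14.5], n),    # road width in metres (assumed levels)
])
speed = 55 - 6 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 3, n)  # synthetic link speed

X_tr, X_te, y_tr, y_te = train_test_split(X, speed, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on test data:", r2_score(y_te, model.predict(X_te)))
```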
Procedia PDF Downloads 70
17980 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: anomaly detection, autoencoder, data centers, deep learning
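The sketch below illustrates the reconstruction idea loosely following this description: one LSTM autoencoder trained on normal windows of a single sensor, with summary statistics of the reconstruction error fed to a random forest classifier. Layer sizes, window length, and all data are placeholder assumptions.

```python
# Illustrative sketch of one per-sensor LSTM autoencoder plus a residual-based
# random forest classifier; architecture sizes are assumed, not the authors' values.
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense
from sklearn.ensemble import RandomForestClassifier

T, d = 32, 1                               # window length, one sensor per autoencoder
normal = np.random.randn(1000, T, d)       # stand-in for normal sensor windows

ae = Sequential([
    LSTM(16, input_shape=(T, d)),          # encoder
    RepeatVector(T),
    LSTM(16, return_sequences=True),       # decoder
    TimeDistributed(Dense(d)),
])
ae.compile(optimizer="adam", loss="mse")
ae.fit(normal, normal, epochs=5, batch_size=64, verbose=0)

def residual_features(x):
    """Summary statistics of the reconstruction error signal."""
    err = x - ae.predict(x, verbose=0)
    return np.column_stack([err.mean((1, 2)), err.std((1, 2)),
                            np.abs(err).max((1, 2))])

# Labeled windows (0 = normal, 1 = anomaly) would come from the data center's
# history; here they are random placeholders.
X_feat = residual_features(np.random.randn(200, T, d))
y = np.random.randint(0, 2, 200)
clf = RandomForestClassifier(random_state=0).fit(X_feat, y)
```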
Procedia PDF Downloads 194
17979 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods
Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun
Abstract:
Traditionally, the three important manufacturing functions of process planning, scheduling, and due-date assignment are performed separately and sequentially. Over the past couple of decades, hundreds of studies have addressed integrated process planning and scheduling problems, and numerous works have examined scheduling with due-date assignment, but unfortunately the integration of all three functions has not been adequately addressed. Here, the integration of these three important functions is studied using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid, and random search techniques. The importance of integrating these three functions and the power of meta-heuristics and hybrid heuristics are also studied.
Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics
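A minimal simulated-annealing sketch of the kind of search the paper applies is given below, here minimizing weighted tardiness over a job sequence; the swap move, cooling schedule, and job data are generic assumptions rather than the authors' implementation.

```python
# Generic simulated annealing over job sequences; all job data are made up.
import math, random

random.seed(1)
# (processing time, due date, weight) for each job
jobs = [(random.uniform(1, 10), random.uniform(10, 60), random.uniform(1, 5))
        for _ in range(20)]

def weighted_tardiness(seq):
    t = cost = 0.0
    for j in seq:
        p, d, w = jobs[j]
        t += p                               # completion time of job j
        cost += w * max(0.0, t - d)          # weighted tardiness penalty
    return cost

seq = list(range(len(jobs)))
cur = best = weighted_tardiness(seq)
best_seq, temp = seq[:], 50.0
while temp > 0.01:
    i, k = random.sample(range(len(seq)), 2)
    seq[i], seq[k] = seq[k], seq[i]          # random swap (neighbourhood move)
    new = weighted_tardiness(seq)
    if new < cur or random.random() < math.exp((cur - new) / temp):
        cur = new                            # accept (possibly uphill) move
        if new < best:
            best, best_seq = new, seq[:]
    else:
        seq[i], seq[k] = seq[k], seq[i]      # reject: undo the swap
    temp *= 0.995                            # geometric cooling schedule
print("best weighted tardiness found:", round(best, 2))
```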
Procedia PDF Downloads 469
17978 Progressive Type-I Interval Censoring with Binomial Removal-Estimation and Its Properties
Authors: Sonal Budhiraja, Biswabrata Pradhan
Abstract:
This work considers statistical inference based on progressive Type-I interval censored data with random removal. The scheme of progressive Type-I interval censoring with random removal can be described as follows. Suppose n identical items are placed on test at time T0 = 0, with k pre-fixed inspection times T1 < T2 < . . . < Tk, where Tk is the scheduled termination time of the experiment. At inspection time Ti, Ri of the remaining surviving units Si are randomly removed from the experiment. The removal follows a binomial distribution with parameters Si and pi for i = 1, . . . , k, with pk = 1. In this censoring scheme, the number of failures in different inspection intervals and the number of randomly removed items at pre-specified inspection times are observed. Asymptotic properties of the maximum likelihood estimators (MLEs) are established under some regularity conditions. A β-content γ-level tolerance interval (TI) is determined for the two-parameter Weibull lifetime model using the asymptotic properties of the MLEs. The minimum sample size required to achieve the desired β-content γ-level TI is determined. The performance of the MLEs and TI is studied via simulation.
Keywords: asymptotic normality, consistency, regularity conditions, simulation study, tolerance interval
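The censoring scheme itself is easy to simulate; the sketch below generates one realization under assumed Weibull parameters, inspection times, and removal probabilities.

```python
# One simulated realization of progressive Type-I interval censoring with
# binomial removals; all numeric settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 5
T = np.array([0.5, 1.0, 1.5, 2.0, 2.5])    # pre-fixed inspection times T1..Tk
p = np.array([0.2, 0.2, 0.2, 0.2, 1.0])    # binomial removal probabilities, pk = 1
life = 2.0 * rng.weibull(1.5, n)           # Weibull lifetimes (shape 1.5, scale 2)

surviving = life.copy()
failures, removals = [], []
for i in range(k):
    failed = surviving <= T[i]
    failures.append(int(failed.sum()))      # failures observed in (T_{i-1}, T_i]
    surviving = surviving[~failed]          # S_i units still on test at T_i
    R = rng.binomial(surviving.size, p[i])  # R_i ~ Binomial(S_i, p_i)
    removals.append(int(R))
    surviving = rng.permutation(surviving)[R:]  # remove R_i units at random
print("failures per interval:", failures)
print("removals at inspections:", removals)
```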
Procedia PDF Downloads 249
17977 Bayesian Structural Identification with Systematic Uncertainty Using Multiple Responses
Authors: André Jesus, Yanjie Zhu, Irwanda Laory
Abstract:
Structural health monitoring is one of the most promising technologies for averting structural risk and achieving economic savings. Analysts often have to deal with a considerable variety of uncertainties that arise during a monitoring process. Namely, the widespread application of numerical models (model-based approaches) is accompanied by widespread concern about quantifying the uncertainties prevailing in their use. Some of these uncertainties are related to the deterministic nature of the model (code uncertainty), others to the variability of its inputs (parameter uncertainty) and to the discrepancy between model and experiment (systematic uncertainty). The actual process always exhibits random behaviour (observation error), even when conditions are set identically (residual variation). Bayesian inference assumes that the parameters of a model are random variables with an associated PDF, which can be inferred from experimental data. However, in many Bayesian methods the determination of systematic uncertainty can be problematic. In this work, systematic uncertainty is associated with a discrepancy function. The numerical model and discrepancy function are approximated by Gaussian processes (surrogate model). Finally, to avoid the computational burden of a fully Bayesian approach, the parameters that characterise the Gaussian processes were estimated in a four-stage process (modular Bayesian approach). The proposed methodology has been successfully applied in fields such as geoscience, biomedicine, and particle physics, but never in the SHM context. This approach considerably reduces the computational burden, although the extent of the considered uncertainties is lower (second-order effects are neglected). To successfully identify the considered uncertainties, this formulation was extended to consider multiple responses. The efficiency of the algorithm has been tested on a small-scale aluminium bridge structure subjected to thermal expansion due to infrared heaters. A comparison of its performance with responses measured at different points of the structure and the associated degrees of identifiability is also carried out. A numerical FEM model of the structure was developed, and the stiffness of its supports is considered as a parameter to calibrate. Results show that the modular Bayesian approach performed best when responses of the same type had the lowest spatial correlation. Based on previous literature, using different types of responses (strain, acceleration, and displacement) should also improve the identifiability problem. Uncertainties due to parametric variability, observation error, residual variability, code variability and systematic uncertainty were all recovered. For this example, the algorithm performance was stable and considerably quicker than Bayesian methods that account for the full extent of uncertainties. Future research with real-life examples is required to fully assess the advantages and limitations of the proposed methodology.
Keywords: Bayesian, calibration, numerical model, system identification, systematic uncertainty, Gaussian process
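A minimal sketch of the discrepancy-function idea follows: a Gaussian process is fit to the residuals between a stand-in numerical model and noisy observations. The toy model, data, and kernel choice are assumptions for illustration only.

```python
# Toy discrepancy-function sketch: GP on residuals y_obs - model(x).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 40)[:, None]           # e.g., normalized sensor locations
model_pred = np.sin(4 * x).ravel()           # stand-in for the FEM prediction
truth = model_pred + 0.3 * x.ravel() ** 2    # systematic model/experiment gap
y_obs = truth + rng.normal(0, 0.02, 40)      # observation error

# GP surrogate for the discrepancy delta(x); WhiteKernel absorbs the noise.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(x, y_obs - model_pred)
delta, sd = gp.predict(x, return_std=True)
print("max |estimated discrepancy|:", float(np.abs(delta).max()))
```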
Procedia PDF Downloads 326
17976 Analysis of Overall Thermo-Elastic Properties of Random Particulate Nanocomposites with Various Interphase Models
Authors: Lidiia Nazarenko, Henryk Stolarski, Holm Altenbach
Abstract:
In the paper, a (hierarchical) approach to the analysis of the thermo-elastic properties of random composites with interphases is outlined and illustrated. It is based on the statistical homogenization method (the method of conditional moments) combined with the recently introduced notion of the energy-equivalent inhomogeneity which, in this paper, is extended to include thermal effects. After an exposition of the general principles, the approach is applied in the investigation of the effective thermo-elastic properties of a material with randomly distributed nanoparticles. The basic idea of the equivalent inhomogeneity is to replace the inhomogeneity and its surrounding interphase by a single equivalent inhomogeneity of constant stiffness tensor and coefficient of thermal expansion, combining the thermal and elastic properties of both. The equivalent inhomogeneity is then perfectly bonded to the matrix, which makes it possible to analyze composites with interphases using techniques devised for problems without interphases. From the mechanical viewpoint, the definition of the equivalent inhomogeneity is based on Hill’s energy equivalence principle, applied to the problem consisting only of the original inhomogeneity and its interphase. It is more general than the definitions proposed in the past in that, conceptually and practically, it allows one to consider inhomogeneities of various shapes and various models of interphases. This is illustrated by considering spherical particles with two models of interphases, the Gurtin-Murdoch material surface model and the spring layer model. The resulting equivalent inhomogeneities are subsequently used to determine the effective thermo-elastic properties of randomly distributed particulate composites. The effective stiffness tensor and coefficient of thermal expansion of the material with the so-defined equivalent inhomogeneities are determined by the method of conditional moments. Closed-form expressions for the effective thermo-elastic parameters of a composite consisting of a matrix and randomly distributed spherical inhomogeneities are derived for the bulk and shear moduli as well as for the coefficient of thermal expansion. The dependence of the effective parameters on the interphase properties is included in the resulting expressions, exhibiting analytically the nature of the size effects in nanomaterials. As a numerical example, an epoxy matrix with randomly distributed spherical glass particles is investigated. The dependence of the effective bulk and shear moduli, as well as of the effective thermal expansion coefficient, on the particle volume fraction (for different radii of nanoparticles) and on the nanoparticle radius (for a fixed volume fraction of nanoparticles) for the different interphase models is compared to and discussed in the context of other theoretical predictions. Possible applications of the proposed approach to short-fiber composites with various types of interphases are discussed.
Keywords: effective properties, energy equivalence, Gurtin-Murdoch surface model, interphase, random composites, spherical equivalent inhomogeneity, spring layer model
Procedia PDF Downloads 185
17975 Consumers’ Willingness to Pay for Organic Vegetables in Oyo State
Authors: Olanrewaju Kafayat, O., Salman Kabir, K.
Abstract:
The role of organic agriculture in providing food and income is now gaining wider recognition (Van Elzakker et al., 2007). Increasing public concern about food safety issues related to the use of fertilizers, pesticide residues, growth hormones and GM organisms, together with increasing awareness of environmental quality issues, has led to an expanding demand for environmentally friendly products (Thompson, 1998; Rimal et al., 2005). As a result, national governments are concerned about diet and health, and there has been renewed recognition of the role of public policy in promoting healthy diets, to provide healthier, safer, more confident citizens (Poole et al., 2007). Given these benefits, a study of organic vegetables is very vital to all the major stakeholders. This study analyzed the willingness of consumers to pay for organic vegetables in Oyo State, Nigeria. Primary data were collected with the aid of a structured questionnaire administered to 168 respondents, selected using multistage random sampling. The first stage involved the selection of two (2) ADP zones out of the three (3) ADP zones in Oyo State. The second stage involved the random selection of two (2) local government areas each from the two (2) ADP zones, namely Ibadan South West and Ogbomoso North, and the random selection of 4 wards each from the local government areas. The third stage involved the random selection of 42 households each from the local government areas. Descriptive statistics, principal component analysis, and logistic regression were used to analyze the data. Results showed 55 percent of the respondents were female while 80 percent were 50 years. 74 percent of the respondents agreed that organic vegetables are of better quality. 31 percent of the respondents were aware of organic vegetables, as against 69 percent who were not aware. From the logistic model, educational attainment, the amount spent on organic vegetables monthly, the better quality of organic vegetables and accessibility to organic vegetables were significant and had a positive relationship with willingness to pay for organic vegetables. The variables that were significant and had a negative relationship with WTP were the lower attractiveness of organic vegetables and the household size of the respondents. This study concludes that consumers with a higher level of education were more likely to be aware of and willing to pay for organic vegetables than those with low levels of education. The study therefore recommends the creation of awareness of the relevance of consuming organic vegetables through effective marketing and educational campaigns.
Keywords: consumers awareness, willingness to pay, organic vegetables, Oyo State
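A hedged sketch of the kind of logit estimation described is shown below, using statsmodels on synthetic survey data; the variable names and data-generating values are invented placeholders, not the study's survey variables.

```python
# Illustrative willingness-to-pay logit on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 168
education = rng.integers(0, 16, n)        # years of schooling (assumed scale)
household_size = rng.integers(1, 10, n)
accessibility = rng.integers(0, 2, n)     # access to organic vegetables (0/1)
latent = 0.15 * education - 0.2 * household_size + 0.8 * accessibility
wtp = (latent + rng.logistic(size=n) > 0).astype(int)

X = sm.add_constant(np.column_stack([education, household_size, accessibility]))
fit = sm.Logit(wtp, X).fit(disp=0)
print(fit.summary(xname=["const", "education", "hh_size", "access"]))
```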
Procedia PDF Downloads 271
17974 Seismic Response Mitigation of Structures Using Base Isolation System Considering Uncertain Parameters
Authors: Rama Debbarma
Abstract:
The present study deals with the performance of a linear base isolation system to mitigate the seismic response of structures characterized by random system parameters. This involves optimization of the tuning ratio and damping properties of the base isolation system considering uncertain system parameters. However, the efficiency of a base isolator may reduce if it is not tuned to the vibrating mode it is designed to suppress, due to the unavoidable presence of system parameter uncertainty. With the aid of matrix perturbation theory and first-order Taylor series expansion, the total probability concept is used to evaluate the unconditional response of the primary structures considering random system parameters. For this, the conditional second-order information of the response quantities is obtained in a random vibration framework using a state space formulation. Subsequently, the maximum unconditional root mean square displacement of the primary structures is used as the objective function to obtain the optimum damping parameters. A numerical study is performed to elucidate the effect of parameter uncertainties on the optimization of the linear base isolator parameters and on system performance.
Keywords: linear base isolator, earthquake, optimization, uncertain parameters
Procedia PDF Downloads 433
17973 Performance Comparison of Cooperative Banks in the EU, USA and Canada
Authors: Matěj Kuc
Abstract:
This paper compares different types of profitability measures of cooperative banks from two developed regions: the European Union and the United States of America together with Canada. We created a balanced dataset of more than 200 cooperative banks covering the 2011-2016 period. We ran a series of tests and a Random Effects estimation on the panel data. We found that American and Canadian cooperatives are more profitable in terms of return on assets (ROA) and return on equity (ROE). There is no significant difference in net interest margin (NIM). Our results show that the North American cooperative banks have accommodated better to the current market environment.
Keywords: cooperative banking, panel data, profitability measures, random effects
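A sketch of a random-effects (random-intercept) panel regression of the kind described, using statsmodels MixedLM with a bank-level random intercept, is shown below; the data frame and regressors are placeholders, not the paper's dataset.

```python
# Random-intercept profitability regression on a synthetic bank panel.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
banks, years = 50, 6
df = pd.DataFrame({
    "bank": np.repeat(np.arange(banks), years),
    "us_canada": np.repeat(rng.integers(0, 2, banks), years),  # region dummy
    "size": rng.normal(0, 1, banks * years),                   # e.g., log assets
})
u = np.repeat(rng.normal(0, 0.5, banks), years)                # bank random intercepts
df["roa"] = (0.4 + 0.3 * df["us_canada"] + 0.1 * df["size"]
             + u + rng.normal(0, 0.2, banks * years))

re_fit = smf.mixedlm("roa ~ us_canada + size", df, groups=df["bank"]).fit()
print(re_fit.summary())
```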
Procedia PDF Downloads 113
17972 Alcohol-Containing versus Aqueous-Based Solutions for Skin Preparation in Abdominal Surgery: A Systematic Review and Meta-Analysis
Authors: Dimitra V. Peristeri, Hussameldin M. Nour, Amiya Ahsan, Sameh Abogabal, Krishna K. Singh, Muhammad Shafique Sajid
Abstract:
Introduction: The use of optimal skin antiseptic agents for the prevention of surgical site infection (SSI) is of critical importance, especially during abdominal surgical procedures. Alcohol-based chlorhexidine gluconate (CHG) and aqueous-based povidone-iodine (PVI) are the two most common skin antiseptics used nowadays. The objective of this article is to evaluate the effectiveness of alcohol-based CHG versus aqueous-based PVI used for skin preparation before abdominal surgery to reduce SSIs. Methods: Standard medical databases such as MEDLINE, Embase, PubMed, and the Cochrane Library were searched to find randomised controlled trials (RCTs) comparing alcohol-based CHG skin preparation versus aqueous-based PVI in patients undergoing abdominal surgery. The combined outcomes of SSIs were calculated using an odds ratio (OR) with 95% confidence intervals (95% CI). All data were analysed using Review Manager (RevMan) Software 5.4, and the meta-analysis was performed with a random-effects model. Results: A total of 11 studies, all RCTs, were included (n = 12072 participants), recruiting adult patients undergoing abdominal surgery. In the random-effects model analysis, the use of alcohol-based CHG in patients undergoing abdominal surgery was associated with a reduced risk of SSI compared to aqueous-based PVI (OR: 0.84; 95% CI [0.74, 0.96], z = 2.61, p = 0.009). Conclusion: Alcohol-based CHG may be more effective for preventing the risk of SSI compared to aqueous-based PVI agents in abdominal surgery. The conclusion of this meta-analysis may add guiding value to reinforce current clinical practice guidelines.
Keywords: skin preparation, surgical site infection, chlorhexidine, skin antiseptics
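For concreteness, the block below implements the standard DerSimonian-Laird random-effects pooling of log odds ratios that underlies this kind of pooled OR; the per-study ORs and standard errors are made-up illustration data, not the trials analysed here.

```python
# DerSimonian-Laird random-effects pooling of log odds ratios (illustration data).
import numpy as np
from scipy import stats

log_or = np.log(np.array([0.78, 0.91, 0.69, 0.88, 0.83]))   # hypothetical studies
se = np.array([0.12, 0.10, 0.15, 0.09, 0.11])

w = 1 / se**2                                    # fixed-effect weights
theta_fe = np.sum(w * log_or) / np.sum(w)
Q = np.sum(w * (log_or - theta_fe) ** 2)         # Cochran's Q heterogeneity statistic
dfree = len(log_or) - 1
tau2 = max(0.0, (Q - dfree) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL tau^2

w_re = 1 / (se**2 + tau2)                        # random-effects weights
theta = np.sum(w_re * log_or) / np.sum(w_re)
se_theta = np.sqrt(1 / np.sum(w_re))
z = theta / se_theta
lo, hi = theta - 1.96 * se_theta, theta + 1.96 * se_theta
print(f"pooled OR {np.exp(theta):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), "
      f"p = {2 * stats.norm.sf(abs(z)):.3f}")
```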
Procedia PDF Downloads 110
17971 Logistic Regression Model versus Additive Model for Recurrent Event Data
Authors: Entisar A. Elgmati
Abstract:
Recurrent infant diarrhoea is studied using daily data collected in Salvador, Brazil, over one year and three months. A logistic regression model is fitted instead of Aalen's additive model, using the same covariates that were used in the analysis with the additive model. The model gives reasonably similar results to those of the additive regression model. In addition, the problem of the estimated conditional probabilities not being constrained between zero and one in the additive model is solved here. Also, martingale residuals, which have been used to judge the goodness of fit of the additive model, are shown to be useful for judging the goodness of fit of the logistic model.
Keywords: additive model, cumulative probabilities, infant diarrhoea, recurrent event
Procedia PDF Downloads 635
17970 Evaluation of Spatial Correlation Length and Karhunen-Loeve Expansion Terms for Predicting Reliability Level of Long-Term Settlement in Soft Soils
Authors: Mehrnaz Alibeikloo, Hadi Khabbaz, Behzad Fatahi
Abstract:
The spectral random field method is one of the widely used methods to obtain more reliable and accurate results in geotechnical problems involving material variability. The Karhunen-Loeve (K-L) expansion method was applied to perform random field discretization of cross-correlated creep parameters. The Karhunen-Loeve expansion method is based on the eigenfunctions and eigenvalues of the covariance function, adopting a kernel integral solution. In this paper, the accuracy of the Karhunen-Loeve expansion was investigated for predicting the long-term settlement of soft soils, adopting an elastic visco-plastic creep model. For this purpose, a parametric study was carried out to evaluate the effect of the number of K-L expansion terms and the spatial correlation length on the reliability of the results. The results indicate that small values of spatial correlation length require more K-L expansion terms. Moreover, by increasing the spatial correlation length, the coefficient of variation (COV) of creep settlement increases, confirming a more conservative and safer prediction.
Keywords: Karhunen-Loeve expansion, long-term settlement, reliability analysis, spatial correlation length
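A compact numerical K-L sketch follows: discretize an (assumed) exponential covariance kernel, take its leading eigenpairs, and synthesize random-field realizations. The kernel form, correlation length, and truncation order are illustrative assumptions.

```python
# Discrete Karhunen-Loeve expansion of a 1-D random field.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)                 # 1-D spatial grid (e.g., depth)
ell, sigma2 = 2.0, 1.0                      # correlation length, variance
C = sigma2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

lam, phi = np.linalg.eigh(C)                # eigen-decomposition of covariance
order = np.argsort(lam)[::-1]               # sort eigenpairs in descending order
lam, phi = lam[order], phi[:, order]

M = 20                                      # number of K-L expansion terms
xi = rng.standard_normal(M)                 # independent standard normal xi_m
field = phi[:, :M] @ (np.sqrt(lam[:M]) * xi)   # one realization of the field
print("variance captured by", M, "terms:", lam[:M].sum() / lam.sum())
```

Rerunning with a larger `ell` concentrates the variance in fewer eigenvalues, consistent with the paper's observation that small correlation lengths require more expansion terms.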
Procedia PDF Downloads 159
17969 Motion Detection Method for Clutter Rejection in the Bio-Radar Signal Processing
Authors: Carolina Gouveia, José Vieira, Pedro Pinho
Abstract:
The monitoring of cardiopulmonary signals without contact electrodes or any type of in-body sensors has several applications, such as sleep monitoring and the continuous monitoring of vital signs in bedridden patients. Such a system also has applications in the vehicular environment to monitor the driver, in order to avoid accidents in case of cardiac failure. Thus, the bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect principle, which relates the received signal properties to the change in distance between the radar antennas and the person’s chest wall. Since the bio-radar aims to monitor subjects in real time and over long periods, it is impossible to guarantee patient immobilization, hence their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, as well as its simulation in MATLAB. The algorithm used for breath rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted with a view to proving that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
Keywords: bio-signals, DC component, Doppler effect, ellipse fitting, radar, SDR
Procedia PDF Downloads 140
17968 Optimizing a Hybrid Inventory System with Random Demand and Lead Time
Authors: Benga Ebouele, Thomas Tengen
Abstract:
Implementing either a periodic or a continuous inventory review model within most manufacturing-company supply chains as a management tool may incur higher costs. These high costs affect the system flexibility, which in turn affects the level of service required to satisfy customers. However, these effects are not clearly understood because the parameters of both inventory review policies (protection demand interval, order quantity, etc.) are not designed to be fully utilized under different and uncertain conditions such as poor manufacturing, supply and delivery performance. Coming up with a hybrid model which combines, in some sense, the features of both the continuous and the periodic inventory review models should therefore be useful. There is thus a need to build and evaluate such a hybrid model with respect to the annual total cost, stock-out probability and system flexibility in order to search for the most cost-effective inventory review model. This work also seeks to find the optimal sets of parameters of inventory management under stochastic conditions so as to optimise each policy independently. The results reveal that a continuous inventory system always incurs a lower cost than a periodic (R, S) inventory system, but this difference tends to decrease over time. Although the hybrid inventory system is the only one that can yield a lower cost over time, it is not always desirable, but it is natural to use it in order to help the system meet high performance specifications.
Keywords: demand and lead time randomness, hybrid inventory model, optimization, supply chain
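A Monte Carlo comparison in the spirit described, pitting a continuous (s, Q) policy against a periodic (R, S) policy under random demand and lead time, is sketched below; the cost rates, distributions, and policy parameters are arbitrary assumptions chosen only to make the comparison concrete.

```python
# Simulated cost of continuous vs. periodic inventory review (all values assumed).
import numpy as np

rng = np.random.default_rng(0)
HOLD, SHORT, ORDER = 1.0, 20.0, 50.0          # unit holding/shortage cost, order cost

def simulate(policy, days=365):
    inv, pipeline, cost = 100.0, [], 0.0
    for t in range(days):
        pipeline = [(d - 1, q) for d, q in pipeline]        # lead times tick down
        inv += sum(q for d, q in pipeline if d <= 0)        # receive arrivals
        pipeline = [(d, q) for d, q in pipeline if d > 0]
        inv -= rng.poisson(10)                              # random daily demand
        q = policy(t, inv + sum(q for _, q in pipeline))    # inventory position
        if q > 0:
            pipeline.append((rng.integers(2, 6), q))        # random lead time (days)
            cost += ORDER
        cost += HOLD * max(inv, 0) + SHORT * max(-inv, 0)
    return cost

continuous = lambda t, pos: 80 if pos < 60 else 0                 # (s=60, Q=80)
periodic = lambda t, pos: max(0, 150 - pos) if t % 7 == 0 else 0  # (R=7, S=150)
print("continuous review cost:", simulate(continuous))
print("periodic review cost:  ", simulate(periodic))
```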
Procedia PDF Downloads 313
17967 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation
Authors: Constantin Z. Leshan
Abstract:
Hole Vacuum theory is based on a discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics and allow teleportation of matter. All massive bodies emit a flux of holes which curve the spacetime; if we increase the concentration of holes, it leads to length contraction and time dilation, because the holes do not have the properties of extension and duration. In the limiting case when space consists of holes only, the distance between every two points is equal to zero and time stops; outside of the Universe, the extension and duration properties do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation using its own properties only. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of quantum behavior. Vacuum holes can explain entanglement, non-locality, the wave properties of matter, tunneling, the uncertainty principle and so on. Particles do not have trajectories because spacetime is discontinuous and has impassable microscopic 'walls'; simple mechanical motion is impossible at small-scale distances, and it is impossible to 'trace' a straight line in the discontinuous spacetime because it contains impassable holes. Spacetime 'boils' continually due to the appearance of vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously at a random point of the Universe. Since a body disappears in one volume and reappears in another random volume without traversing the physical space between them, such a transportation method can be called teleportation (or Hole Teleportation). It is shown that Hole Teleportation does not violate causality and special relativity due to its random nature and other properties. Although Hole Teleportation has a random nature, it can be used for the colonization of extrasolar planets with the help of a method called 'random jumps': after a large number of random teleportation jumps, there is a probability that the spaceship may appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: we must remove a body from the vessel without permitting another body to occupy this volume.
Keywords: border of the Universe, causality violation, perfect isolation, quantum jumps
Procedia PDF Downloads 425
17966 Distributed Acoustic Sensing Signal Model under Static Fiber Conditions
Authors: G. Punithavathy
Abstract:
The research proposes a statistical model for distributed acoustic sensor interrogation units that broadcast a laser pulse into an optical fiber, where interactions within the fiber determine the localized acoustic energy that causes light reflections known as backscatter. The backscattered signal's amplitude and phase can be calculated using explicit equations. The created model makes amplitude signal spectrum and autocorrelation predictions that are confirmed by experimental findings. Phase signal characteristics that are useful for researching optical time domain reflectometry (OTDR) sensing applications are provided and examined, showing good agreement with the experiment. The experiment was successfully carried out with the use of Python coding. In this research, we can analyze all the distributed acoustic sensing (DAS) component parts separately. The model assumes that the fiber is in a static condition, meaning that no external force or vibration is applied to the cable, i.e., no external acoustic disturbances are present. The backscattered signal consists of a random noise component, which is caused by the intrinsic imperfections of the fiber, and a coherent component, which is due to the laser pulse interacting with the fiber.
Keywords: distributed acoustic sensing, optical fiber devices, optical time domain reflectometry, Rayleigh scattering
Procedia PDF Downloads 70
17965 A Multi-Criteria Model for Scheduling of Stochastic Single Machine Problem with Outsourcing and Solving It through Application of Chance Constrained
Authors: Homa Ghave, Parmis Shahmaleki
Abstract:
This paper presents a new multi-criteria stochastic mathematical model for single machine scheduling with outsourcing allowed. Multiple jobs are processed in batches. For each batch, the whole job or a portion of it can be outsourced. The jobs have stochastic processing times and lead times, deterministic due dates, and arrive randomly. Because of the stochastic nature of the processing and lead times, we use chance constrained programming to model the problem. First, the problem is formulated as a stochastic program and then converted into a deterministic mixed integer linear program. The objectives considered in the model are to minimize the maximum tardiness and the outsourcing cost simultaneously. Several procedures have been developed to deal with the multi-criteria problem. In this paper, we utilize the concept of satisfaction functions to incorporate the manager’s preferences. The proposed approach is tested on instances where the random variables are normally distributed.
Keywords: single machine scheduling, multi-criteria mathematical model, outsourcing strategy, uncertain lead times and processing times, chance constrained programming, satisfaction function
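The core conversion step can be illustrated with the textbook deterministic equivalent of an individual chance constraint under normality; the numbers below are purely illustrative.

```python
# Deterministic equivalent of a chance constraint: if completion time
# C ~ Normal(mu, sigma^2), then P(C <= d) >= alpha becomes mu + z_alpha*sigma <= d.
from scipy.stats import norm

mu, sigma = 42.0, 4.0        # mean and std dev of a batch completion time (assumed)
d, alpha = 50.0, 0.95        # due date, required service level (assumed)
z = norm.ppf(alpha)
feasible = mu + z * sigma <= d
print(f"deterministic equivalent: {mu} + {z:.3f}*{sigma} <= {d} -> {feasible}")
```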
Procedia PDF Downloads 264
17964 A Comparative Study on Sampling Techniques of Polynomial Regression Model Based Stochastic Free Vibration of Composite Plates
Authors: S. Dey, T. Mukhopadhyay, S. Adhikari
Abstract:
This paper presents an exhaustive comparative investigation of sampling techniques for polynomial regression model based stochastic natural frequency analysis of composite plates. Both individual and combined variations of the input parameters are considered to map the computational time and accuracy of each modelling technique. The finite element formulation of composites is capable of dealing with both correlated and uncorrelated random input variables such as fibre parameters and material properties. The results obtained by polynomial regression (PR) using different sampling techniques are compared, and the suitability of sampling techniques such as 2k factorial designs, central composite design, A-optimal design, I-optimal design, D-optimal design, Taguchi’s orthogonal array design, Box-Behnken design, Latin hypercube sampling and the Sobol sequence is illustrated. A statistical analysis of the first three natural frequencies is presented to compare the results and their performance.
Keywords: composite plate, natural frequency, polynomial regression model, sampling technique, uncertainty quantification
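Two of the sampling designs named above can be compared directly with SciPy's QMC module (a sketch assuming scipy >= 1.7); the 3-D unit cube stands in for normalized fibre and material parameters.

```python
# Space-filling comparison of Latin hypercube sampling vs. the Sobol sequence.
from scipy.stats import qmc

n, dim = 128, 3
lhs = qmc.LatinHypercube(d=dim, seed=0).random(n)
sobol = qmc.Sobol(d=dim, seed=0).random(n)

# Discrepancy measures space-filling uniformity (lower is better).
print("LHS discrepancy:  ", qmc.discrepancy(lhs))
print("Sobol discrepancy:", qmc.discrepancy(sobol))
```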
Procedia PDF Downloads 513
17963 E-Consumers’ Attribute Non-Attendance Switching Behavior: Effect of Providing Information on Attributes
Authors: Leonard Maaya, Michel Meulders, Martina Vandebroek
Abstract:
Discrete Choice Experiments (DCE) are used to investigate how product attributes affect decision-makers’ choices. In DCEs, choice situations consisting of several alternatives are presented from which choice-makers select the preferred alternative. Standard multinomial logit models based on random utility theory can be used to estimate the utilities for the attributes. The overarching principle in these models is that respondents understand and use all the attributes when making choices. However, studies suggest that respondents sometimes ignore some attributes (commonly referred to as Attribute Non-Attendance/ANA). The choice modeling literature presents ANA as a static process, i.e., respondents’ ANA behavior does not change throughout the experiment. However, respondents may ignore attributes due to changing factors like availability of information on attributes, learning/fatigue in experiments, etc. We develop a dynamic mixture latent Markov model to model changes in ANA when information on attributes is provided. The model is illustrated on e-consumers’ webshop choices. The results indicate that the dynamic ANA model describes the behavioral changes better than modeling the impact of information using changes in parameters. Further, we find that providing information on attributes leads to an increase in the attendance probabilities for the investigated attributes.
Keywords: choice models, discrete choice experiments, dynamic models, e-commerce, statistical modeling
Procedia PDF Downloads 140
17962 Combining a Continuum of Hidden Regimes and a Heteroskedastic Three-Factor Model in Option Pricing
Authors: Rachid Belhachemi, Pierre Rostan, Alexandra Rostan
Abstract:
This paper develops a discrete-time option pricing model for index options. The model consists of two key ingredients. First, daily stock return innovations are driven by a continuous hidden threshold mixed skew-normal (HTSN) distribution, which generates the conditional non-normality needed to fit daily index returns. The most important feature of the HTSN is the inclusion of a latent state variable with a continuum of states, unlike traditional mixture distributions where the state variable is discrete with a small number of states. The HTSN distribution belongs to the class of univariate probability distributions where the parameters of the distribution capture the dependence between the variable of interest and the continuous latent state variable (the regime). The distribution has an interpretation in terms of a mixture distribution with time-varying mixing probabilities. It has been shown empirically that this distribution outperforms its main competitor, the mixed normal (MN) distribution, in terms of capturing the stylized facts known for stock returns, namely volatility clustering, leverage effect, skewness, kurtosis and regime dependence. Second, heteroscedasticity in the model is captured by a three-exogenous-factor GARCH model (GARCHX), where the factors are taken from a principal component analysis (PCA) of various world indices; an application to option pricing is presented. The factors of the GARCHX model are extracted from a matrix of world indices by applying PCA. The empirically determined factors are uncorrelated and represent truly different common components driving the returns. Both the factors and the eight parameters inherent to the HTSN distribution aim at capturing the impact of the state of the economy on price levels, since the distribution parameters have economic interpretations in terms of conditional volatilities and correlations of the returns with the hidden continuous state. The PCA identifies statistically independent factors affecting the random evolution of a given pool of assets (in our paper, a pool of international stock indices) and sorts them by order of relative importance. The PCA computes a historical cross-asset covariance matrix and identifies principal components representing independent factors. In our paper, the factors are used to calibrate the HTSN-GARCHX model and are ultimately responsible for the nature of the distribution of the random variables being generated. We benchmark our model against the MN-GARCHX model, following the same PCA methodology, and the standard Black-Scholes model. We show that our model outperforms the benchmark in terms of RMSE in dollar losses for put and call options, which in turn outperforms the analytical Black-Scholes model by capturing the stylized facts known for index returns, namely volatility clustering, leverage effect, skewness, kurtosis and regime dependence.
Keywords: continuous hidden threshold, factor models, GARCHX models, option pricing, risk-premium
Procedia PDF Downloads 297
17961 Secure Watermarking not at the Cost of Low Robustness
Authors: Jian Cao
Abstract:
This paper describes a novel watermarking technique which we call random direction embedding (RDE) watermarking. Unlike traditional watermarking techniques, the watermark energy after RDE embedding does not focus on a fixed direction, providing security against the traditional unauthorized watermark removal attack. In addition, the experimental results show that, when compared with the existing secure watermarking scheme, namely natural watermarking (NW), RDE watermarking gains a significant improvement in terms of robustness. In fact, the security of RDE watermarking does not come at the cost of low robustness, and it can even be more robust than traditional spread spectrum watermarking, which has been shown to be very insecure.
Keywords: robustness, spread spectrum watermarking, watermarking security, random direction embedding (RDE)
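A toy numpy illustration of the random-direction idea, as we read it from this abstract, is given below: each embedding spreads the watermark energy along a freshly drawn random unit vector rather than a fixed spreading direction. The vector length, strength, and detector are assumptions, not the paper's exact scheme.

```python
# Toy random-direction embedding: the spreading direction changes per embedding.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0, 1, 1024)            # host feature vector (e.g., a transform block)
alpha = 0.5                           # embedding strength (assumed)

u = rng.normal(0, 1, x.size)          # random direction, drawn anew per embedding
u /= np.linalg.norm(u)                # normalize to a unit vector
watermarked = x + alpha * u           # energy does not focus on a fixed axis

detect = watermarked @ u              # correlation detector (requires the key u)
print("correlation with key:", detect.round(2))
```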
Procedia PDF Downloads 384
17960 A Study of Non Linear Partial Differential Equation with Random Initial Condition
Authors: Ayaz Ahmad
Abstract:
In this work, we present the effect of noise on the solution of a partial differential equation (PDE) in three different settings. We first consider random initial conditions for two nonlinear dispersive PDEs, the nonlinear Schrödinger equation and the Korteweg-de Vries equation, and analyse their effect on some special solutions, the soliton solutions. The second case considered is a linear partial differential equation, the wave equation, where random initial conditions allow us to substantially decrease the computational and data storage costs of an algorithm to solve the inverse problem based on boundary measurements of the solution of this equation. Finally, the third example considered is that of the linear transport equation with a singular drift term, where we show that the addition of a multiplicative noise term forbids the blow-up of solutions under a very weak hypothesis for which we have finite time blow-up of a solution in the deterministic case. Here we consider the problem of wave propagation, which is modelled by a nonlinear dispersive equation with a noisy initial condition. As observed, noise can also be introduced directly in the equations.
Keywords: drift term, finite time blow up, inverse problem, soliton solution
Procedia PDF Downloads 215
17959 An Adaptive Controller Method Based on Full-State Linear Model of Variable Cycle Engine
Authors: Jia Li, Huacong Li, Xiaobao Han
Abstract:
Due to the larger number of variable geometry parameters of the VCE (variable cycle aircraft engine), this paper presents an adaptive controller method based on a full-state linear model of the VCE, simulated to solve the multivariable controller design problem over the whole flight envelope. First, we analyze the static and dynamic performance of the bypass ratio and other state parameters affected by the variable geometric components, and develop a nonlinear component model of the VCE. Then, based on the component model, through small-deviation linearization of the main fuel flow (Wf), the tail nozzle throat area (A8) and the rear bypass ejector angle (A163), we set up multiple linear models in which the variable geometric parameters can be inputs. Second, we design adaptive controllers for the VCE linear models at different nominal points. Considering modeling uncertainties and external disturbances, the adaptive law is derived using a Lyapunov function. The simulation results showed that the adaptive controller method based on the full-state linear model, using the rear bypass ejector angle as an input, effectively solved the multivariable control problems of the VCE. At all nominal points, the performance could track the desired closed-loop reference instructions. The settling time was less than 1.2 s, and the system overshoot was less than 1%; at the same time, the steady-state errors were less than 0.5% and the dynamic tracking errors were less than 1%. In addition, the designed controller could effectively suppress interference and reach the desired commands under different external random noise signals.
Keywords: variable cycle engine (VCE), full-state linear model, adaptive control, bypass ratio
Procedia PDF Downloads 317
17958 Using Nonhomogeneous Poisson Process with Compound Distribution to Price Catastrophe Options
Authors: Rong-Tsorng Wang
Abstract:
In this paper, we derive a pricing formula for catastrophe equity put options (CatEPuts) with non-homogeneous loss arrivals and approximated compound distributions. We assume that the loss claims arrival process is a nonhomogeneous Poisson process (NHPP) representing the clustered occurrences of loss claims, that the sizes of loss claims form a sequence of independent and identically distributed random variables, and that the accumulated loss distribution forms a compound distribution approximated by a heavy-tailed distribution. A numerical example is given to calibrate the parameters, and we discuss how the value of the CatEPut is affected by changes in the parameters of the pricing model we provide.
Keywords: catastrophe equity put options, compound distributions, nonhomogeneous Poisson process, pricing model
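A Monte Carlo cross-check sketch of such a payoff is shown below: loss arrivals follow an NHPP (simulated by thinning) with lognormal claim sizes, and the put pays off only if accumulated losses exceed a trigger. The intensity function, payoff form, and all parameters are illustrative assumptions, not the paper's calibration.

```python
# Monte Carlo valuation sketch of a CatEPut-style payoff under an NHPP.
import numpy as np

rng = np.random.default_rng(0)
T, lam_max = 1.0, 60.0
intensity = lambda t: 30.0 + 25.0 * np.sin(2 * np.pi * t) ** 2   # clustered arrivals

def nhpp_times():
    """Thinning: candidates at rate lam_max, kept with probability lam(t)/lam_max."""
    t, times = 0.0, []
    while True:
        t += rng.exponential(1 / lam_max)
        if t > T:
            return np.array(times)
        if rng.random() < intensity(t) / lam_max:
            times.append(t)

K, S0, L_trigger = 90.0, 100.0, 800.0     # strike, stock price, loss trigger (assumed)
payoffs = []
for _ in range(20_000):
    loss = rng.lognormal(2.0, 1.0, size=nhpp_times().size).sum()  # compound loss
    S_T = S0 * np.exp(rng.normal(-0.02, 0.2))                     # terminal price
    payoffs.append(max(K - S_T, 0.0) if loss > L_trigger else 0.0)
print("MC CatEPut value estimate:", np.mean(payoffs).round(3))
```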
Procedia PDF Downloads 167
17957 Enhanced Test Scheme based on Programmable Write Time for Future Computer Memories
Authors: Nor Zaidi Haron, Fauziyah Salehuddin, Norsuhaidah Arshad, Sani Irwan Salim
Abstract:
Resistive random access memories (RRAMs) are one of the main candidates for future computer memories. However, due to their tiny size and immature device technology, the quality of the outgoing RRAM chips is seen as a serious issue. Defective RRAM cells might behave differently from existing semiconductor memories (Dynamic RAM, Static RAM, and Flash), meaning that they are difficult to detect using existing test schemes. This paper presents an enhanced test scheme, referred to as Programmable Short Write Time (PSWT), that is able to improve the detection of faulty RRAM cells. It is developed by applying multiple weak write operations, each with a different time duration. The test circuit embedded in the RRAM chip is made programmable in order to supply different weak write times during testing. The RRAM electrical model is described using the Verilog-AMS language and is simulated using HSPICE simulation tools. Simulation results show that the proposed test scheme offers better open-resistive fault detection compared to existing test schemes.
Keywords: memory fault, memory test, design-for-testability, resistive random access memory
Procedia PDF Downloads 387
17956 Evidence of Conditional and Unconditional Cooperation in a Public Goods Game: Experimental Evidence from Mali
Authors: Maria Laura Alzua, Maria Adelaida Lopera
Abstract:
This paper measures the relative importance of conditional cooperation and unconditional cooperation in a large public goods experiment conducted in Mali. We use expectations about total public goods provision to estimate a structural choice model with heterogeneous preferences. While unconditional cooperation can be captured by common preferences shared by all participants, conditional cooperation is much more heterogeneous and depends on unobserved individual factors. This structural model, in combination with two experimental treatments, suggests that leadership and group communication incentivize public goods provision through different channels. First, we find that the participation of local leaders effectively changes individual choices through unconditional cooperation. A simulation exercise predicts that even in the most pessimistic scenario, in which all participants expect zero public good provision, 60% would still choose to cooperate. Second, allowing participants to communicate fosters conditional cooperation. The simulations suggest that expectations are responsible for around 24% of the observed public good provision and that group communication does not necessarily improve public good provision. In fact, communication may even worsen the outcome when expectations are low.
Keywords: conditional cooperation, discrete choice model, expectations, public goods game, random coefficients model
Procedia PDF Downloads 306
17955 Detection Characteristics of the Random and Deterministic Signals in Antenna Arrays
Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev
Abstract:
In this paper, approaches to incoherent signal detection in multi-element antenna arrays are researched and modeled. Two types of useful signals with unknown wavefronts were considered. The first is deterministic (a Barker code); the second is random (Gaussian distribution). The derivation of the sufficient statistics takes into account the linearity of the antenna array. The performance characteristics and detection curves are modeled and compared for different useful signal parameters and for different numbers of elements of the antenna array. Under some additional conditions, the results of this research can be applied to digital communications systems.
Keywords: antenna array, detection curves, performance characteristics, quadrature processing, signal detection
Procedia PDF Downloads 405
17954 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation
Authors: Sopheak Sorn, Kwok Yip Szeto
Abstract:
Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem that is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction with each cell containing exactly one point of the point pattern and compute the reliability of the Voronoi diagram's dual, i.e. the Delaunay graph. We further investigate the communicability of the Delaunay network. We find a positive correlation between the homogeneity of a Delaunay network's degree distribution and its reliability, and a negative correlation with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network towards the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi
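The pipeline described can be sketched as follows: Delaunay-triangulate a random point set, estimate all-terminal reliability by Monte Carlo edge failures, and total the network communicability via networkx; the edge-failure probability and problem sizes are assumptions.

```python
# Delaunay graph of a random point pattern, with Monte Carlo reliability
# and Estrada-Hatano communicability.
import numpy as np
import networkx as nx
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
pts = rng.random((30, 2))                       # random 2-D point pattern
tri = Delaunay(pts)

G = nx.Graph()
for simplex in tri.simplices:                   # each triangle contributes 3 edges
    for i in range(3):
        G.add_edge(simplex[i], simplex[(i + 1) % 3])

def reliability(G, p_fail=0.1, trials=2000):
    """Probability the network stays connected when edges fail independently."""
    edges = list(G.edges)
    ok = 0
    for _ in range(trials):
        H = nx.Graph()
        H.add_nodes_from(G.nodes)
        H.add_edges_from(e for e in edges if rng.random() > p_fail)
        ok += nx.is_connected(H)
    return ok / trials

comm = nx.communicability(G)                    # dict-of-dicts of pairwise values
total_comm = sum(sum(row.values()) for row in comm.values())
print("estimated reliability:", reliability(G))
print("total communicability:", round(total_comm, 1))
```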
Procedia PDF Downloads 419
17953 Quantum Graph Approach for Energy and Information Transfer through Networks of Cables
Authors: Mubarack Ahmed, Gabriele Gradoni, Stephen C. Creagh, Gregor Tanner
Abstract:
High-frequency cables commonly connect modern devices and sensors. Interestingly, the proportion of electric components is rising fast in an attempt to achieve lighter and greener devices. Modelling the propagation of signals through these cable networks in the presence of parameter uncertainty is a daunting task. In this work, we study the response of high-frequency cable networks using both Transmission Line (TL) and Quantum Graph (QG) theories. We have successfully compared the two theories in terms of reflection spectra using measurements on real, lossy cables. We have derived a generalisation of the vertex scattering matrix to include non-uniform networks, i.e. networks of cables with different characteristic impedances and propagation constants. The QG model implicitly takes into account the pseudo-chaotic behavior, at the vertices, of the propagating electric signal. We have successfully compared the asymptotic growth of the eigenvalues of the Laplacian with the predictions of Weyl's law. We investigate the nearest-neighbour level-spacing distribution of the resonances and compare our results with the predictions of Random Matrix Theory (RMT). To achieve this, we compare our graphs with the generalisation of the Wigner distribution for open systems. The problem of scattering from networks of cables can also provide an analogue model for wireless communication in highly reverberant environments. In this context, we provide a preliminary analysis of the statistics of communication capacity for communication across cable networks, whose eventual aim is to enable detailed laboratory testing of information transfer rates using software-defined radio. We specialise this analysis in particular to the case of MIMO (Multiple-Input Multiple-Output) protocols. We have successfully validated our QG model with both the TL model and laboratory measurements. The growth of the eigenvalues compares well with Weyl's law, and the level-spacing distribution agrees well with the RMT predictions. The results we achieved in the MIMO application compare favourably with the predictions of a parallel ongoing research effort (sponsored by NEMF21).
Keywords: eigenvalues, multiple-input multiple-output, quantum graph, random matrix theory, transmission line
Procedia PDF Downloads 173
17952 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform
Authors: Reza Mohammadzadeh
Abstract:
The major challenges in geotechnical engineering in underground spaces arise from uncertainties and different probabilities. The collection, collation, and collaboration of existing data, incorporated into analysis and design for a given prospect evaluation, would be a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence within statistical science which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) is proposed in this model. Subsequently, the stages of the model workflow methodology are described. For training the data and deploying the LoK models, an ML platform has been implemented. IBM Watson Studio, a leading data science tool and data-driven cloud-integration ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
Keywords: data model, geotechnical risks, machine learning, underground coal mining
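A hedged sketch of the proposed three-dimensional risk matrix idea is given below: likelihood and consequence give a conventional score, which the level of knowledge (LoK) then scales. The scoring rules and thresholds are invented placeholders, since the abstract does not spell them out.

```python
# Hypothetical 3-D risk rating combining likelihood, consequence, and LoK.
def risk_rating(likelihood: int, consequence: int, lok: int) -> str:
    """likelihood/consequence on 1-5 scales, lok on 1-3 (3 = well understood)."""
    score = likelihood * consequence * (4 - lok)   # low knowledge inflates risk
    if score >= 40:
        return "high"
    if score >= 15:
        return "medium"
    return "low"

# e.g., a roof-fall hazard: likely (4), severe (4), poorly characterized (LoK 1)
print(risk_rating(4, 4, 1))   # -> "high"
```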
Procedia PDF Downloads 274