Search results for: probability distributions
1696 Parameter Interactions in the Cumulative Prospect Theory: Fitting the Binary Choice Experiment Data
Authors: Elzbieta Babula, Juhyun Park
Abstract:
Tversky and Kahneman’s cumulative prospect theory assumes symmetric probability cumulation with regard to the reference point within decision weights. Theoretically, this model should be invariant under a change in the direction of probability cumulation. In the present study, this phenomenon is investigated by creating a reference model that allows verification of the parameter interactions in cumulative prospect theory specifications. Simultaneous parametric fitting of the utility and weighting functions is applied to binary choice data from the experiment. The results show that the flexibility of the probability weighting function is a crucial characteristic for preventing parameter interactions when estimating cumulative prospect theory.
Keywords: binary choice experiment, cumulative prospect theory, decision weights, parameter interactions
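A minimal sketch of the kind of specification being fitted here, using the Tversky-Kahneman (1992) power value function and single-parameter weighting function for a two-outcome gain prospect; the parameter values and the example prospect are illustrative placeholders, not estimates from this study:

```python
import numpy as np

def tk_weight(p, gamma):
    """Tversky-Kahneman (1992) probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def value(x, alpha, lam):
    """Power value function with loss aversion coefficient lambda."""
    return x**alpha if x >= 0 else -lam * (-x)**alpha

def cpt_binary(x1, p1, x2, alpha=0.88, lam=2.25, gamma=0.61):
    """CPT value of a binary gain prospect (x1 with prob p1, else x2), x1 >= x2 >= 0.
    Decision weights are obtained by cumulating probabilities from the best outcome."""
    w1 = tk_weight(p1, gamma)   # decision weight of the better outcome
    w2 = 1.0 - w1               # residual weight of the worse outcome
    return w1 * value(x1, alpha, lam) + w2 * value(x2, alpha, lam)

# Hypothetical prospect: win 100 with probability 0.1, otherwise 0
print(cpt_binary(100.0, 0.1, 0.0))
```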
Procedia PDF Downloads 214
1695 Numerical Analysis of Liquid Metal Magnetohydrodynamic Flows in a Manifold with Three Sub-Channels
Authors: Meimei Wen, Chang Nyung Kim
Abstract:
In the current study, three-dimensional liquid metal (LM) magnetohydrodynamic (MHD) flows in a manifold with three sub-channels under a uniform magnetic field are numerically investigated. In the manifold, the electrical current can cross channel walls, thus influencing the flow distribution in each sub-channel. A case with various arrangements of electric conductivity for different parts of the channel walls is considered, yielding different current distributions as well as flow distributions in each sub-channel. Here, the imbalance of mass flow rates in the three sub-channels is addressed. Detailed behaviors of the flow velocity, pressure, current, and electric potential of LM MHD flows with three sub-channels are also predicted. The commercial software CFX is used for the numerical simulation of the LM MHD flows.
Keywords: CFX, liquid metal, manifold, MHD flow
Procedia PDF Downloads 342
1694 Optimal Mitigation of Slopes by Probabilistic Methods
Authors: D. De-León-Escobedo, D. J. Delgado-Hernández, S. Pérez
Abstract:
A probabilistic formulation to assess the safety of slopes under the hazard of strong storms is presented and illustrated through a slope in Mexico. The formulation is based on the classical safety factor (SF) used in practice to appraise slope stability, but the treatment of uncertainties is introduced, and the slope failure probability is calculated as the probability that SF < 1. As the main hazard is rainfall on the area, statistics of rainfall intensity and duration are considered and modeled with an exponential distribution. The expected life-cycle cost is assessed by assigning a monetary value to the consequences of slope failure. Alternative mitigation measures are simulated, and the formulation is used to identify the optimal one (minimum life-cycle cost). For the example, the optimal mitigation measure is a reduction in the slope inclination angle.
Keywords: expected life-cycle cost, failure probability, slopes failure, storms
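A rough Monte Carlo sketch of the failure-probability and expected-cost calculation outlined above; the exponential rainfall scale, the soil-strength distribution, the simplified safety-factor model, and the cost figures are invented placeholders rather than the values used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical inputs: rainfall intensity (mm/h) ~ Exponential, soil strength factor ~ Lognormal
intensity = rng.exponential(scale=20.0, size=n)
strength = rng.lognormal(mean=np.log(1.8), sigma=0.15, size=n)

# Illustrative safety factor: strength reduced as rainfall raises pore pressure
sf = strength / (1.0 + 0.01 * intensity)

pf = np.mean(sf < 1.0)                 # failure probability P(SF < 1)
expected_cost = 5e6 * pf + 2e5         # assumed failure-consequence cost times Pf, plus mitigation cost
print(f"Pf = {pf:.4f}, expected life-cycle cost = {expected_cost:,.0f}")
```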
Procedia PDF Downloads 158
1693 Design and Analysis of Adaptive Type-I Progressive Hybrid Censoring Plan under Step Stress Partially Accelerated Life Testing Using Competing Risk
Authors: Ariful Islam, Showkat Ahmad Lone
Abstract:
Statistical distributions have long been employed in the assessment of semiconductor devices and product reliability. The power function distribution is one of the most important distributions in modern reliability practice and is frequently preferred over mathematically more complex distributions, such as the Weibull and the lognormal, because of its simplicity. Moreover, it may exhibit a better fit for failure data and provide more appropriate information about reliability and hazard rates in some circumstances. This study deals with estimating information about failure times of items under step-stress partially accelerated life tests for competing risks based on adaptive type-I progressive hybrid censoring criteria. The life data of the units under test are assumed to follow the Mukherjee-Islam distribution. Point and interval maximum likelihood estimates are obtained for the distribution parameters and the tampering coefficient. The performance of the resulting estimators of the developed model parameters is evaluated through a simulation study.
Keywords: adaptive progressive hybrid censoring, competing risk, Mukherjee-Islam distribution, partially accelerated life testing, simulation study
Procedia PDF Downloads 346
1692 3D-Mesh Robust Watermarking Technique for Ownership Protection and Authentication
Authors: Farhan A. Alenizi
Abstract:
Digital watermarking has evolved in the past years as an important means for data authentication and ownership protection. Image and video watermarking is well known in the field of multimedia processing; however, watermarking techniques for 3D objects have emerged as an important means for the same purposes, as 3D mesh models are in increasing use in different areas of scientific, industrial, and medical applications. Like image watermarking techniques, 3D watermarking can take place in either the spatial or the transform domain. Unlike images and video, where the frames have regular structures in both the spatial and temporal domains, 3D objects are represented as meshes that are basically irregular samplings of surfaces; moreover, meshes can undergo a large variety of alterations which may be hard to tackle. This makes the watermarking process more challenging. While transform domain watermarking is preferable for images and videos, it is still difficult to implement for 3D meshes due to the huge number of vertices involved and the complicated topology and geometry, and hence the difficulty of performing the spectral decomposition, even though significant work has been done in the field. Spatial domain watermarking has attracted significant attention in the past years; it can act either on the topology or on the geometry of the model. Exploiting the statistical characteristics of 3D mesh models from both geometrical and topological aspects has proved useful for hiding data. However, doing so with minimal surface distortion to the mesh has attracted significant research in the field. A blind 3D mesh watermarking technique is proposed in this research. The watermarking method depends on modifying the vertices' positions with respect to the center of the object. An optimal method will be developed to reduce the errors, minimizing the distortions that the 3D object may experience due to the watermarking process and reducing the computational complexity due to the iterations and other factors. The technique relies on displacing the vertices' locations according to the modification of the variances of the vertices’ norms. Statistical analyses were performed to establish the distributions that best fit each mesh, and hence to establish the bin sizes. Several optimizing approaches were introduced in the realms of mesh local roughness, the statistical distributions of the norms, and the displacements of the mesh centers. To evaluate the algorithm's robustness against common geometry and connectivity attacks, the watermarked objects were subjected to uniform noise, Laplacian smoothing, vertex quantization, simplification, and cropping. Experimental results showed that the approach is robust in terms of both perceptual and quantitative qualities, and against both geometry and connectivity attacks. Moreover, the probability of true positive detection versus the probability of false positive detection was evaluated. To validate the accuracy of the test cases, receiver operating characteristic (ROC) curves were drawn, and they showed robustness from this aspect as well. 3D watermarking is still a new field, but a promising one.
Keywords: watermarking, mesh objects, local roughness, Laplacian smoothing
Procedia PDF Downloads 159
1691 Statistical Analysis of Cables in Long-Span Cable-Stayed Bridges
Authors: Ceshi Sun, Yueyu Zhao, Yaobing Zhao, Zhiqiang Wang, Jian Peng, Pengxin Guo
Abstract:
With the rapid development of transportation, there are more than 100 cable-stayed bridges with main spans larger than 300 m in China. In order to ascertain the statistical relationships among the design parameters of stay cables and their distribution characteristics, 1500 cables were selected from 25 practical long-span cable-stayed bridges. A new relationship between the first-order frequency and the length of the cable was found by curve fitting. Then, based on this relationship, other interesting relationships were deduced. Several probability density functions (PDFs) were used to investigate the distributions of the first-order frequency, the stress level, and the Irvine parameter. It was found that these parameters follow the lognormal distribution, the Weibull distribution, and the generalized Pareto distribution, respectively. Scatter diagrams of the three parameters were plotted, and their 95% confidence intervals were also investigated.
Keywords: cable, cable-stayed bridge, long-span, statistical analysis
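A short sketch of the distribution-fitting and goodness-of-fit step involved, applied to a synthetic stand-in sample for one cable parameter; the lognormal placeholder data, the candidate set, and the 95% interval are assumptions for illustration only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
freq = rng.lognormal(mean=0.3, sigma=0.4, size=1500)   # placeholder first-order frequencies (Hz)

candidates = {"lognorm": stats.lognorm, "weibull_min": stats.weibull_min, "genpareto": stats.genpareto}
for name, dist in candidates.items():
    params = dist.fit(freq)                            # maximum likelihood fit
    ks = stats.kstest(freq, dist.cdf, args=params)     # goodness-of-fit check
    lo, hi = dist.interval(0.95, *params)              # 95% interval of the fitted distribution
    print(f"{name:11s} KS p-value = {ks.pvalue:.3f}  95% interval = ({lo:.2f}, {hi:.2f})")
```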
Procedia PDF Downloads 631
1690 Co-Movement between Financial Assets: An Empirical Study on Effects of the Depreciation of Yen on Asia Markets
Authors: Yih-Wenn Laih
Abstract:
In recent times, the dependence and co-movement among international financial markets have become stronger than in the past, as evidenced by commentaries in the news media and the financial sections of newspapers. Studying the co-movement between returns in financial markets is an important issue for portfolio management and risk management. Understanding co-movement helps investors to identify opportunities for international portfolio management in terms of asset allocation and pricing. Since the election of the new Prime Minister, Shinzo Abe, in November 2012, the yen has weakened against the US dollar from the 80 to the 120 level. The policies, known as “Abenomics,” are intended to encourage private investment through a more aggressive mix of monetary and fiscal policy. Given the close economic relations and competition among Asian markets, it is interesting to discover the co-movement relations, affected by the depreciation of the yen, between the stock market of Japan and five major Asian stock markets: China, Hong Kong, Korea, Singapore, and Taiwan. Specifically, we measure the co-movement of the stock markets between Japan and each of the five Asian stock markets in terms of rank correlation coefficients. To compute the coefficients, the return series of each stock market is first fitted by a skewed-t GARCH (generalized autoregressive conditional heteroscedasticity) model. Secondly, to measure the dependence structure between matched stock markets, we employ the symmetrized Joe-Clayton (SJC) copula to calculate the probability density function of the paired skewed-t distributions. The joint probability density function is then utilized as the scoring scheme to optimize the sequence alignment by a dynamic programming method. Finally, we compute the rank correlation coefficients (Kendall’s τ and Spearman’s ρ) between matched stock markets based on their aligned sequences. We collect empirical data for six stock indexes from the Taiwan Economic Journal. The data are sampled at a daily frequency covering the period from January 1, 2013 to July 31, 2015. The empirical distributions of returns indicate fatter tails than the normal distribution. Therefore, the skewed-t distribution and SJC copula are appropriate for characterizing the data. According to the computed Kendall’s τ, Korea has the strongest co-movement relation with Japan, followed by Taiwan, China, and Singapore; the weakest is Hong Kong. On the other hand, the Spearman’s ρ reveals that the strength of co-movement with Japan in decreasing order is Korea, China, Taiwan, Singapore, and Hong Kong. We explore the effects of “Abenomics” on Asian stock markets by measuring the co-movement relation between Japan and the five major Asian stock markets in terms of rank correlation coefficients. The matched markets are aligned by a hybrid method consisting of GARCH, copula, and sequence alignment. Empirical experiments indicate that Korea has the strongest co-movement relation with Japan. The strengths for China and Taiwan are higher than that for Singapore. The Hong Kong market has the weakest co-movement relation with Japan.
Keywords: co-movement, depreciation of Yen, rank correlation, stock market
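The full pipeline (skewed-t GARCH filtering, SJC copula, dynamic-programming alignment) is not reproduced here; the sketch below only shows the final step of computing Kendall's τ and Spearman's ρ between two paired return series, with synthetic data standing in for the Japanese and partner-market returns:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Placeholder daily log-returns for the Japanese index and one partner market (correlated heavy-tailed noise)
japan = rng.standard_t(df=5, size=640) * 0.01
partner = 0.6 * japan + rng.standard_t(df=5, size=640) * 0.008

tau, p_tau = stats.kendalltau(japan, partner)
rho, p_rho = stats.spearmanr(japan, partner)
print(f"Kendall's tau = {tau:.3f} (p={p_tau:.3g}), Spearman's rho = {rho:.3f} (p={p_rho:.3g})")
```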
Procedia PDF Downloads 229
1689 A Statistical Model for the Dynamics of Single Cathode Spot in Vacuum Cylindrical Cathode
Authors: Po-Wen Chen, Jin-Yu Wu, Md. Manirul Ali, Yang Peng, Chen-Te Chang, Der-Jun Jan
Abstract:
The dynamics of cathode spots has become a major topic in vacuum arc discharge research, with high academic interest and wide application potential. In this article, using a three-dimensional statistical model, we simulate the distribution of the ignition probability of a new cathode spot occurring under different magnetic pressures on the old cathode spot surface and at different arcing times. This model for the ignition probability of a new cathode spot was proposed for two typical situations: one governed by a pure isotropic random walk in the absence of an external magnetic field, the other by retrograde motion in an external magnetic field parallel to the cathode surface. We mainly focus on the relationship between the ignition probability density distribution of a new cathode spot and the external magnetic field.
Keywords: cathode spot, vacuum arc discharge, transverse magnetic field, random walk
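As a toy illustration of the two situations described, the following Monte Carlo sketch compares the displacement distribution of a new spot under a pure isotropic random walk and under a biased walk mimicking retrograde motion; the step length, step count, and drift values are arbitrary placeholders, not physical parameters of the model:

```python
import numpy as np

rng = np.random.default_rng(3)
n_spots, n_steps, step = 20_000, 100, 1.0

def final_positions(drift):
    """Random-walk displacement of a new cathode spot relative to the old one.
    drift = (dx, dy) per step imitates retrograde motion in a transverse field."""
    angles = rng.uniform(0, 2 * np.pi, size=(n_spots, n_steps))
    dx = step * np.cos(angles) + drift[0]
    dy = step * np.sin(angles) + drift[1]
    return dx.sum(axis=1), dy.sum(axis=1)

x_iso, y_iso = final_positions(drift=(0.0, 0.0))   # no external field: isotropic walk
x_ret, y_ret = final_positions(drift=(0.2, 0.0))   # transverse field: biased (retrograde-like) walk

# Empirical ignition-probability density along x from normalized histograms
hist_iso, edges = np.histogram(x_iso, bins=100, density=True)
hist_ret, _ = np.histogram(x_ret, bins=edges, density=True)
print("mean displacement (isotropic, retrograde):", x_iso.mean(), x_ret.mean())
```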
Procedia PDF Downloads 431
1688 Validation of a Reloading Vehicle Design by Finite Element Analysis
Authors: Tuğrul Aksoy, Hüseyin Karabıyık
Abstract:
Reloading vehicles are generally equipped with a crane and are used to pick up a load from a point and place it onto the vehicle, or vice versa. In this study, structural analysis of a reloading vehicle was performed via the finite element method under the loads it is expected to experience in operating conditions. Among the finite element analysis results, the stress and displacement distributions of the vehicle and the contact pressure distributions of the guide rings within the stabilization legs were examined. The vehicle design was improved by strengthening certain parts according to the analysis results. The analyses performed for the final design were verified by experiments involving strain gauge measurements.
Keywords: structural analysis, reloading vehicle, crane, strain gauge
Procedia PDF Downloads 67
1687 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model
Authors: Ariful Islam, Showkat Ahmad Lone
Abstract:
The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit for failure data and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant, and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation for the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small-sample cases. The maximum likelihood estimation method is also applied in this study.
Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability
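A minimal sketch of the generic unbiased sample probability weighted moments (the b_r estimators) on which this kind of estimation is built; the Weibull placeholder sample stands in for real failure data, and the mapping from these moments to the three Mukherjee-Islam parameters is not shown:

```python
import numpy as np

def sample_pwm(x, r):
    """Unbiased sample probability weighted moment b_r, estimating E[X * F(X)^r]."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    w = np.ones(n)
    for k in range(1, r + 1):
        w *= (i - k) / (n - k)          # weight ((i-1)...(i-r)) / ((n-1)...(n-r))
    return np.mean(w * x)

data = np.random.default_rng(4).weibull(1.5, size=40) * 100   # small placeholder failure sample
b0, b1, b2 = (sample_pwm(data, r) for r in range(3))

# First two L-moments, commonly used to back out distribution parameters
l1, l2 = b0, 2 * b1 - b0
print(b0, b1, b2, l1, l2)
```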
Procedia PDF Downloads 271
1686 Potential Distribution and Electric Field Analysis around a Polluted Outdoor Polymeric Insulator with Broken Sheds
Authors: Adel Kara, Abdelhafid Bayadi, Hocine Terrab
Abstract:
This paper presents a study of the electric field distribution along 72 kV polymeric outdoor insulators with broken sheds. Different cases of damaged insulators are modeled, in both clean and polluted conditions, by 3D finite element analysis using the software package COMSOL Multiphysics 4.3b. The obtained potential and electric field distributions around the insulators from the 3D simulation prove that finite element computation is a useful tool for studying the electric field distribution of insulation.
Keywords: electric field distributions, insulator, broken sheds, potential distributions
Procedia PDF Downloads 509
1685 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs
Authors: Lokesh Varshney, R. K. Saket
Abstract:
This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. During the experimental study, parallel capacitors with the calculated minimum capacitance were connected across the terminals of the induction motor operating as an SEIG with unregulated shaft speed. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machine laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine operating as an SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success, and probability of failure for the reliability evaluation of the three-phase induction motor operating as an SEIG are presented graphically in this paper.
Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation
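The reliability indices listed above follow directly from whichever lifetime distribution is adopted; the sketch below evaluates them for an assumed Weibull time-to-failure model with invented shape and scale parameters, purely to show how the quantities relate to one another:

```python
import numpy as np
from scipy import stats

# Assumed: time-to-failure of rotor-core magnetization retention modeled as Weibull (illustrative parameters)
shape, scale = 1.8, 1200.0                # hypothetical shape and scale (hours)
dist = stats.weibull_min(shape, scale=scale)

t = np.linspace(1, 5000, 6)
f = dist.pdf(t)                           # failure density function
F = dist.cdf(t)                           # cumulative failure distribution (probability of failure)
R = dist.sf(t)                            # survivor function = probability of success = 1 - F
h = f / R                                 # hazard rate

for ti, fi, Fi, Ri, hi in zip(t, f, F, R, h):
    print(f"t={ti:7.0f} h  f={fi:.2e}  F={Fi:.3f}  R={Ri:.3f}  hazard={hi:.2e}")
```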
Procedia PDF Downloads 556
1684 The Use of the Matlab Software as the Best Way to Recognize Penumbra Region in Radiotherapy
Authors: Alireza Shayegan, Morteza Amirabadi
Abstract:
The γ tool was developed to quantitatively compare dose distributions, either measured or calculated. Before computing γ, the dose and distance scales of the two distributions, referred to as evaluated and reference, are re-normalized by dose and distance criteria, respectively. The re-normalization allows the dose distribution comparison to be conducted simultaneously along the dose and distance axes. Several two-dimensional images were acquired using a scanning liquid ionization chamber EPID and Extended Dose Range (EDR2) films for regular and irregular radiation fields. The raw images were then converted into two-dimensional dose maps. Translational and rotational manipulations were performed on the images using Matlab software. These were then compared, as evaluated dose distribution maps, with the corresponding original dose maps serving as the reference dose maps.
Keywords: energetic electron, gamma function, penumbra, Matlab software
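A simplified one-dimensional sketch of the γ comparison across a penumbra-like profile; the sigmoid dose profiles, the 3%/3 mm criteria, and the 1 mm shift are illustrative assumptions, and the study's own processing is done in Matlab on two-dimensional maps rather than in this reduced form:

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, x, dose_crit=0.03, dist_crit=3.0):
    """1D gamma index: for every evaluated point, take the minimum of the combined
    dose/distance metric over all reference points (3%/3 mm criteria by default)."""
    dd = (eval_dose[None, :] - ref_dose[:, None]) / (dose_crit * ref_dose.max())
    dr = (x[None, :] - x[:, None]) / dist_crit
    return np.min(np.sqrt(dd**2 + dr**2), axis=0)

x = np.linspace(-30, 30, 121)                      # mm, across the penumbra
ref = 1.0 / (1.0 + np.exp((x - 10) / 2.5))         # idealised reference dose profile
evl = 1.0 / (1.0 + np.exp((x - 11) / 2.5))         # evaluated profile, shifted by 1 mm

g = gamma_1d(ref, evl, x)
print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1.0):.1%}")
```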
Procedia PDF Downloads 298
1683 Statistical Channel Modeling for Multiple-Input-Multiple-Output Communication System
Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany
Abstract:
The performance of wireless communication systems is affected mainly by the environment of the associated channel, which is characterized by dynamic and unpredictable behavior. In this paper, different statistical earth-satellite channel models are studied, with emphasis on two main models: the first is the Rice-lognormal model, chosen because it represents an environment with shadowing and multipath components that affect the propagated signal along its path, and the second is a three-state model that takes into account different fading conditions (clear area, moderate shadowing, and heavy shadowing). The provided models are based on AWGN, Rician, Rayleigh, and lognormal distributions, whose probability density functions (PDFs) are presented. The transmission system bit error rate (BER), peak-to-average power ratio (PAPR), and channel capacity are measured and analyzed against the fading models. The simulations are implemented using the MATLAB tool, and the results show the performance of the transmission system over the different channel models.
Keywords: fading channels, MIMO communication, RNS scheme, statistical modeling
Procedia PDF Downloads 146
1682 Graphical Modeling of High Dimension Processes with an Environmental Application
Authors: Ali S. Gargoum
Abstract:
Graphical modeling plays an important role in providing efficient probability calculations in high-dimensional problems (computational efficiency). In this paper, we address one such problem, discussing fragmenting puff models and some distributional assumptions concerning models for the instantaneous emission readings and for the fragmenting process. A graphical representation, in terms of a junction tree, of the conditional probability breakdown of puffs and puff fragments is proposed.
Keywords: graphical models, influence diagrams, junction trees, Bayesian nets
Procedia PDF Downloads 395
1681 Computational Fluid Dynamic Investigation into the Relationship between Pressure and Velocity Distributions within a Microfluidic Feedback Oscillator
Authors: Zara L. Sheady
Abstract:
Fluidic oscillators are being utilised in an increasing number of applications in a wide variety of areas, including on-board vehicle cleaning systems, flow separation control on aircraft, and fluidic circuitry. With this increased use, a better understanding is required of the mechanics of fluidic oscillators and why they work in the manner that they do. ANSYS CFX has been utilised to visualise the pressure and velocity within a microfluidic feedback oscillator. The images demonstrate how pressure vortices build within the oscillator at the points where the velocity is diverted from linear motion through the oscillator. With an enhanced understanding of the pressure and velocity distributions within a fluidic oscillator, users of microfluidics will be able to better tailor fluidic nozzles to their specifications.
Keywords: ANSYS CFX, control, fluidic oscillators, mechanics, pressure, relationship, velocity
Procedia PDF Downloads 335
1680 On Virtual Coordination Protocol towards 5G Interference Mitigation: Modelling and Performance Analysis
Authors: Bohli Afef
Abstract:
Fifth-generation (5G) wireless systems feature extreme densities of cell stations to meet higher future demand. Hence, interference management is a crucial challenge in 5G ultra-dense cellular networks. In contrast to the classical inter-cell interference coordination approach, which is no longer suited to the high density of cell tiers, this paper proposes a novel virtual coordination scheme based on a dynamic common cognitive monitor channel protocol to deal with the inter-cell interference issue. A tractable and flexible model for the coverage probability of a typical user is developed through the use of a stochastic geometry model. The performance of the suggested protocol is analyzed both analytically and numerically in terms of coverage probability.
Keywords: ultra dense heterogeneous networks, dynamic common channel protocol, cognitive radio, stochastic geometry, coverage probability
Procedia PDF Downloads 325
1679 Exact Solutions for Steady Response of Nonlinear Systems under Non-White Excitation
Authors: Yaping Zhao
Abstract:
In the present study, the exact solutions for the steady response of quasi-linear systems under non-white wide-band random excitation are considered by means of the stochastic averaging method. The nonlinearity of the systems contains power-law damping and the cross-product term of the power-law damping and displacement. The drift and diffusion coefficients of the Fokker-Planck-Kolmogorov (FPK) equation after averaging are obtained by a succinct approach. After solving the averaged FPK equation, the joint probability density function and the marginal probability density function in the steady state are obtained. In the solution process, the eigenvalue problem of the ordinary differential equation is handled by an integral equation method. Some new results are obtained, and a novel method for dealing with problems in nonlinear random vibration is proposed.
Keywords: random vibration, stochastic averaging method, FPK equation, transition probability density
Procedia PDF Downloads 502
1678 A Bayesian Model with Improved Prior in Extreme Value Problems
Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro
Abstract:
In extreme value theory, inference for the parameters of the distribution is made using only a small part of the observed values. When block maxima are taken, many data are discarded. We developed a new Bayesian inference model to exploit all the information provided by the data, introducing informative priors and using the relations between the baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: exponential, normal, and Gumbel. Secondly, we considered mixtures of normal variables, to simulate practical situations in which the data do not fit pure distributions because of perturbations (noise).
Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior
Procedia PDF Downloads 196
1677 Frequency Analysis of Minimum Ecological Flow and Gage Height in Indus River Using Maximum Likelihood Estimation
Authors: Tasir Khan, Yejuan Wan, Kalim Ullah
Abstract:
Hydrological frequency analysis has been conducted to estimate the minimum flow elevation of the Indus River in Pakistan in order to protect the ecosystem. The maximum likelihood estimation (MLE) technique is used to determine the best-fitted distribution for minimum ecological flows at nine stations of the Indus River in Pakistan. Four distributions commonly used in hydrological frequency analysis, the generalized extreme value (GEV), generalized logistic (GLO), generalized Pareto (GPA), and Pearson type 3 (PE3) distributions, are fitted at all sites. The performance of these distributions is compared using goodness-of-fit tests, namely the Kolmogorov-Smirnov test, the Anderson-Darling test, and the chi-square test. The study concludes that, under the maximum likelihood estimation method, GEV and GPA are the most suitable distributions and can be effectively applied at all the proposed sites. Quantiles are estimated for return periods from 5 to 1000 years using the MLE estimates. MLE is a robust method for larger sample sizes. The results of these analyses can be used for water resources research, including water quality management, designing irrigation systems, determining downstream flow requirements for hydropower, and assessing the impact of long-term drought on the country's aquatic systems.
Keywords: minimum ecological flow, frequency distribution, Indus River, maximum likelihood estimation
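A rough sketch of the MLE fit, goodness-of-fit check, and return-period quantiles for two of the candidate distributions; the gamma-generated annual minimum flows are placeholders for the gauge records, and taking the quantile at non-exceedance probability 1/T for return period T is a low-flow convention assumed for this illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Placeholder annual minimum flows at one gauge (m^3/s); real station data would replace this
min_flows = rng.gamma(shape=8.0, scale=30.0, size=60)

# Maximum likelihood fits for two of the candidate distributions
gev_params = stats.genextreme.fit(min_flows)
gpa_params = stats.genpareto.fit(min_flows)

# Goodness of fit via the Kolmogorov-Smirnov test
for name, dist, params in [("GEV", stats.genextreme, gev_params),
                           ("GPA", stats.genpareto, gpa_params)]:
    ks = stats.kstest(min_flows, dist.cdf, args=params)
    print(f"{name}: params={np.round(params, 3)}, KS p={ks.pvalue:.3f}")

# Low-flow quantiles: non-exceedance probability 1/T for return period T (minima convention)
for T in (5, 10, 50, 100, 1000):
    q = stats.genextreme.ppf(1.0 / T, *gev_params)
    print(f"T={T:5d} yr  estimated minimum flow = {q:.1f} m^3/s")
```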
Procedia PDF Downloads 76
1676 Wind Fragility for Honeycomb Roof Cladding Panels Using Screw Pull-Out Capacity
Authors: Viriyavudh Sim, Woo Young Jung
Abstract:
The failure of roof cladding mostly occurs due to failure of the connection between the cladding and the purlins, i.e., pull-out of the connecting screw when the pull-out load induced by a typhoon exceeds the resistance of the connection screw. As typhoon disasters in Korea are constantly on the rise, probabilistic risk assessment (PRA) has become a vital tool to evaluate the performance of civil structures. In this study, we attempted to determine the fragility of roof cladding with the screw connection. An experimental study was performed to evaluate the pull-out resistance of the screw joints between honeycomb panels and back frames. Subsequently, by means of the Monte Carlo simulation method, the probability of failure for these types of roof cladding was determined. The results show that the failure of roof cladding depends on its location on the roof; for example, the edge-most panel has the highest probability of failure.
Keywords: Monte Carlo Simulation, roof cladding, screw pull-out strength, wind fragility
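A simplified Monte Carlo sketch of the fragility calculation: the lognormal pull-out resistance, the uplift-load expression, the pressure coefficient, and the tributary area are all invented placeholders rather than the tested honeycomb-panel values:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 50_000

# Assumed: screw pull-out resistance (kN), lognormally distributed around a placeholder test mean
resistance = rng.lognormal(mean=np.log(1.2), sigma=0.2, size=n)

def failure_probability(v, cpe=-1.5, area=0.5):
    """Fragility at mean wind speed v (m/s): P(uplift load > pull-out resistance).
    The load model 0.5*rho*v^2*|Cpe|*area and all numbers are illustrative only."""
    qz = 0.5 * 1.225 * v**2 / 1000.0                                   # velocity pressure, kPa
    load = qz * abs(cpe) * area * rng.lognormal(0.0, 0.1, size=n)      # kN, with model uncertainty
    return np.mean(load > resistance)

for v in range(20, 75, 10):
    print(f"v = {v:2d} m/s  P(failure) = {failure_probability(v):.4f}")
```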
Procedia PDF Downloads 251
1675 Neural Network and Support Vector Machine for Prediction of Foot Disorders Based on Foot Analysis
Authors: Monireh Ahmadi Bani, Adel Khorramrouz, Lalenoor Morvarid, Bagheri Mahtab
Abstract:
Background: Foot disorders are common musculoskeletal problems. Plantar pressure distribution measurement is one of the most important parts of quantitative foot disorder diagnosis. However, the association between plantar pressure and foot disorders is not clear. With the growth of datasets and machine learning methods, the relationship between foot disorders and plantar pressures can be detected. Significance of the study: The purpose of this study was to predict the probability of common foot disorders based on the peak plantar pressure distribution and center of pressure during walking. Methodologies: 2323 participants were assessed in a foot therapy clinic between 2015 and 2021. Foot disorders were diagnosed by an experienced physician, and the participants were then asked to walk on a force plate scanner. After data preprocessing, we normalized the samples based on time and foot size to account for differences in walking time and foot size. Some of the force plate variables were selected as input to a deep neural network (DNN), and the probability of each foot disorder was measured. In the next step, we used a support vector machine (SVM) and ran the dataset for each foot disorder (yes/no classification). We compared the DNN and SVM for foot disorder prediction based on plantar pressure distributions and center of pressure. Findings: The results demonstrated that the accuracy of the deep learning architecture is sufficient for most clinical and research applications in the study population. In addition, the SVM approach has higher accuracy for prediction, enabling applications for foot disorder diagnosis. The detection accuracy was 71% for the deep learning algorithm and 78% for the SVM algorithm. Moreover, the peak plantar pressure distribution yielded more accurate results than the center of pressure dataset. Conclusion: Both algorithms, deep learning and SVM, will help therapists and patients to improve the data pool and enhance foot disorder prediction with less expense and error once some restrictions are properly removed.
Keywords: deep neural network, foot disorder, plantar pressure, support vector machine
Procedia PDF Downloads 350
1674 Using Nonhomogeneous Poisson Process with Compound Distribution to Price Catastrophe Options
Authors: Rong-Tsorng Wang
Abstract:
In this paper, we derive a pricing formula for catastrophe equity put options (CatEPuts) with non-homogeneous losses and approximated compound distributions. We assume that the loss claims arrival process is a nonhomogeneous Poisson process (NHPP) representing the clustering of loss claim occurrences, that the sizes of loss claims form a sequence of independent and identically distributed random variables, and that the accumulated loss forms a compound distribution approximated by a heavy-tailed distribution. A numerical example is given to calibrate the parameters, and we discuss how the value of the CatEPut is affected by changes in the parameters of the proposed pricing model.
Keywords: catastrophe equity put options, compound distributions, nonhomogeneous Poisson process, pricing model
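The closed-form pricing formula itself is not reproduced here, but the accumulated-loss model it rests on can be checked by simulation; the sketch below draws NHPP claim arrivals by thinning and Pareto claim sizes, then estimates the probability that the accumulated loss exceeds a trigger level (the intensity function, claim-size law, and trigger are assumptions, not the paper's calibration):

```python
import numpy as np

rng = np.random.default_rng(7)

def nhpp_times(rate_fn, T, rate_max):
    """Simulate arrival times of a nonhomogeneous Poisson process on [0, T] by thinning."""
    t, times = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > T:
            return np.array(times)
        if rng.uniform() < rate_fn(t) / rate_max:
            times.append(t)

rate_fn = lambda t: 2.0 + 1.5 * np.sin(2 * np.pi * t)   # assumed seasonal claim intensity (per year)
T, trigger, n_paths = 1.0, 30.0, 20_000

exceed = 0
for _ in range(n_paths):
    k = len(nhpp_times(rate_fn, T, rate_max=3.5))       # number of claims in one path
    losses = rng.pareto(2.5, size=k) * 5.0              # heavy-tailed individual claim sizes (assumed)
    exceed += losses.sum() > trigger
print(f"P(accumulated loss > trigger) ≈ {exceed / n_paths:.4f}")
```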
Procedia PDF Downloads 165
1673 A Two Tailed Secretary Problem with Multiple Criteria
Authors: Alaka Padhye, S. P. Kane
Abstract:
The following study considers some variations of the secretary problem (SP). In a multiple criteria secretary problem (MCSP), the selection of a unit is based on two independent characteristics. The number of units that appear before the observer is known, say N, with the best rank of a unit being N. A unit is selected if it is better with respect to either the first or the second characteristic, or both. When the number of units is large, and due to constraints like time and cost, the observer might want to stop earlier instead of inspecting all the available units. Let the process terminate at the r2-th unit, where r1
1672 A Packet Loss Probability Estimation Filter Using Most Recent Finite Traffic Measurements
Authors: Pyung Soo Kim, Eung Hyuk Lee, Mun Suck Jang
Abstract:
A packet loss probability (PLP) estimation filter with a finite memory structure is proposed to estimate the packet rate mean and variance of the input traffic process in real time while removing undesired system and measurement noises. The proposed PLP estimation filter is developed under a weighted least squares criterion using only the finite traffic measurements on the most recent window. The proposed PLP estimation filter is shown to have several inherent properties such as unbiasedness, deadbeat response, and robustness. A guideline for choosing an appropriate window length is described, since it can significantly affect the estimation performance. Using computer simulations, the proposed PLP estimation filter is shown to be superior to the Kalman filter for a temporarily uncertain system. One possible explanation for this is that the proposed PLP estimation filter achieves faster convergence of the filtered estimate as the window length M decreases.
Keywords: packet loss probability estimation, finite memory filter, infinite memory filter, Kalman filter
Procedia PDF Downloads 670
1671 The Control of Wall Thickness Tolerance during Pipe Purchase Stage Based on Reliability Approach
Authors: Weichao Yu, Kai Wen, Weihe Huang, Yang Yang, Jing Gong
Abstract:
Metal-loss corrosion is a major threat to the safety and integrity of gas pipelines, as it may result in burst failures which can cause severe consequences, including enormous economic losses as well as personnel casualties. Therefore, it is important to ensure the integrity and efficiency of corroding pipelines, taking into account the wall thickness, which plays an important role in the failure probability of a corroding pipeline. In practice, the wall thickness is controlled during the pipe purchase stage. For example, the API_SPEC_5L standard regulates the allowable tolerance of the wall thickness from the specified value during pipe purchase. The allowable wall thickness tolerance is used to determine the wall thickness distribution characteristics, such as the mean value, standard deviation, and distribution type. Taking the uncertainties of the input variables in the burst limit-state function into account, the reliability approach rather than the deterministic approach is used to evaluate the failure probability. Moreover, the cost of pipe purchase is influenced by the allowable wall thickness tolerance: stricter control of the wall thickness usually corresponds to a higher pipe purchase cost. Therefore, changing the wall thickness tolerance varies both the probability of a burst failure and the cost of the pipe. This paper describes an approach to optimize the wall thickness tolerance considering both the safety and the economy of corroding pipelines. The corrosion burst limit-state function in Annex O of CSA Z662-7 is employed to evaluate the failure probability using the Monte Carlo simulation technique. By changing the allowable wall thickness tolerance, the parameters of the wall thickness distribution in the limit-state function are changed, and the corresponding variations in the burst failure probability are obtained using the reliability approach. On the other hand, changing the wall thickness tolerance leads to a change in the pipe purchase cost. Using the variations in the failure probability and pipe cost caused by changing the wall thickness tolerance specification, the optimal allowable tolerance can be obtained and used to define pipe purchase specifications.
Keywords: allowable tolerance, corroding pipeline segment, operation cost, production cost, reliability approach
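A rough Monte Carlo sketch of the trade-off described above; the CSA Z662 Annex O limit state is not reproduced, and the simplified burst expression, pipe dimensions, pressures, and the mapping of the tolerance to a thickness standard deviation are all assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200_000

D, p_op, sigma_u = 914.0, 10.0, 550.0     # diameter (mm), operating pressure (MPa), tensile strength (MPa)
t_nom, defect_depth = 12.7, 3.0           # nominal wall (mm), assumed corrosion defect depth (mm)

def burst_failure_probability(tolerance):
    """P(burst) for a given +/- wall-thickness tolerance (mm); the tolerance is
    mapped to the thickness standard deviation as tolerance/3 (an assumption)."""
    t = rng.normal(t_nom, tolerance / 3.0, size=n)
    d = rng.normal(defect_depth, 0.3, size=n)
    p_burst = (2.0 * sigma_u * t / D) * (1.0 - d / t)   # simplified burst expression, not Annex O
    return np.mean(p_burst < p_op)

for tol in (0.3, 0.6, 0.9, 1.2):
    print(f"tolerance ±{tol:.1f} mm  P(burst) = {burst_failure_probability(tol):.2e}")
```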
Procedia PDF Downloads 394
1670 Influence of Maximum Fatigue Load on Probabilistic Aspect of Fatigue Crack Propagation Life at Specified Grown Crack in Magnesium Alloys
Authors: Seon Soon Choi
Abstract:
The principal purpose of this paper is to determine the influence of the maximum fatigue load on the probabilistic aspect of fatigue crack propagation life at a specified grown crack in magnesium alloys. Fatigue crack propagation experiments are carried out in laboratory air under different maximum fatigue loads to obtain fatigue crack propagation data for the statistical analysis. In order to analyze the probabilistic aspect of the fatigue crack propagation life, the goodness-of-fit test for the probability distribution of the fatigue crack propagation life at a specified grown crack is implemented through the Anderson-Darling test. The appropriate probability distribution of the fatigue crack propagation life is also verified under the various maximum fatigue load conditions.
Keywords: fatigue crack propagation life, magnesium alloys, maximum fatigue load, probability
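A brief sketch of the Anderson-Darling goodness-of-fit step, here testing a lognormal hypothesis by applying the test to the log-transformed lives; the synthetic sample of 30 lives is a placeholder for the measured crack-propagation data, and the lognormal choice is an assumption of this illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
# Placeholder fatigue crack propagation lives (cycles) at a specified grown crack length
lives = rng.lognormal(mean=np.log(2.0e5), sigma=0.25, size=30)

# Anderson-Darling test for a lognormal fit: test the log-lives against normality
res = stats.anderson(np.log(lives), dist='norm')
print(f"A^2 = {res.statistic:.3f}")
for cv, sl in zip(res.critical_values, res.significance_level):
    verdict = "reject" if res.statistic > cv else "do not reject"
    print(f"  {sl:4.1f}% level: critical value {cv:.3f} -> {verdict}")
```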
Procedia PDF Downloads 386
1669 Urban-Rural Inequality in Mexico after Nafta: A Quantile Regression Analysis
Authors: Rene Valdiviezo-Issa
Abstract:
In this paper, we use Mexico’s Household Income and Expenditure (ENIGH) survey to explain the behaviour of the urban-rural expenditure gap since Mexico’s incorporation into the North American Free Trade Agreement (NAFTA) in 1994, and we compare it with the latest available survey, which took place in 2014. We use real trimestral expenditure per capita (RTEPC) as the measure of welfare. We use quantile regressions and a quantile regression decomposition to describe the gap between the urban and rural distributions of log RTEPC. We find that the decrease in the difference between the urban and rural distributions of log RTEPC, or inequality, is driven by a deprivation of the urban areas in very specific characteristics, rather than by an improvement of the rural areas. When using the decomposition, we observe that the gap is primarily brought about by differences in returns to covariates between the urban and rural areas.
Keywords: quantile regression, urban-rural inequality, inequality in Mexico, income decomposition
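A small sketch of a quantile regression of log expenditure on an urban indicator, showing how the urban-rural gap can be read off at different quantiles; the synthetic data, the schooling covariate, and the coefficient values are placeholders for the ENIGH variables:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
n = 5000

# Synthetic stand-in for ENIGH: log real trimestral expenditure per capita (RTEPC)
urban = rng.integers(0, 2, n)
schooling = rng.normal(9 + 2 * urban, 3, n).clip(0)
log_rtepc = 7.5 + 0.35 * urban + 0.08 * schooling + rng.normal(0, 0.6, n)
df = pd.DataFrame({"log_rtepc": log_rtepc, "urban": urban, "schooling": schooling})

# Urban-rural gap at different points of the expenditure distribution
for q in (0.10, 0.25, 0.50, 0.75, 0.90):
    fit = smf.quantreg("log_rtepc ~ urban + schooling", df).fit(q=q)
    print(f"q={q:.2f}  urban gap = {fit.params['urban']:.3f}")
```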
Procedia PDF Downloads 279
1668 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates
Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim
Abstract:
The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via a simulation. Monte Carlo simulation in the R programming language was employed to generate data based on the theoretical model with one endogenous and four exogenous variables. Each latent variable has three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could produce the path estimates. Meanwhile, for larger sample sizes, CB-SEM estimates have lower variability compared to PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample sizes. However, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM
Procedia PDF Downloads 409
1667 Prioritized Processor-Sharing with a Maximum Permissible Sojourn Time
Authors: Yoshiaki Shikata
Abstract:
A prioritized processor-sharing (PS) system with a maximum permissible sojourn time (MPST) is proposed. In this PS system, a higher-priority request is allocated a larger service ratio than a lower-priority request. Moreover, each request receiving service is guaranteed the maximum permissible sojourn time determined by its priority class, regardless of its service time. Arriving requests that cannot receive service due to this guarantee are rejected. We further propose a guarantee method for implementing such a system and discuss performance evaluation procedures for the resulting system. Practical performance measures, such as the relationships between the loss probability or the mean sojourn time of each class of request and the maximum permissible sojourn time, are evaluated via simulation. At the arrival of each request, its acceptance or rejection is judged using the extended sojourn times of all requests currently receiving service in the server. As the MPST increases, the mean sojourn time increases almost linearly, whereas the logarithm of the loss probability decreases almost linearly. Moreover, the difference in the mean sojourn time for different MPSTs increases with the traffic rate, whereas the difference in the loss probability for different MPSTs decreases as the traffic rate increases.
Keywords: prioritized processor sharing, priority ratio, permissible sojourn time, loss probability, mean sojourn time, simulation
Procedia PDF Downloads 190