Search results for: continuous right skewed distributions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2815

2785 Frequency Analysis Using Multiple Parameter Probability Distributions for Rainfall to Determine Suitable Probability Distribution in Pakistan

Authors: Tasir Khan, Yejuan Wang

Abstract:

The study of extreme rainfall events is very important for flood management in river basins and the design of water conservancy infrastructure. Evaluation of quantiles of annual maximum rainfall (AMRF) is required in different environmental fields, agriculture operations, renewable energy sources, climatology, and the design of different structures. Therefore, frequency analysis of AMRF was performed at different stations in Pakistan. Multiple probability distributions, namely log normal (LN), generalized extreme value (GEV), Gumbel (max), and Pearson Type 3 (P3), were used to find the most appropriate distribution at each station. The L-moments method was used to estimate the distribution parameters. The Anderson-Darling, Kolmogorov-Smirnov, and chi-square tests showed that two distributions, namely Gumbel (max) and LN, were the most appropriate. The quantile estimate of a multi-parameter probability distribution describes extreme rainfall at a specific location and is therefore important for decision-makers and planners who design and construct different structures. This result indicates the consequences of these multi-parameter distributions for site studies, peak flow prediction, and the design of hydrological maps, and can therefore support hydraulic structure design and flood management.
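
As an illustration of the workflow this abstract describes, the sketch below fits several candidate distributions to an annual-maximum series and compares their goodness of fit. It is a minimal sketch only: the record is synthetic, and scipy's fit() uses maximum likelihood rather than the L-moments method the authors employ.

# Sketch: compare candidate distributions for annual maximum rainfall (AMRF).
# Note: the paper estimates parameters by L-moments; scipy's .fit() uses
# maximum likelihood, so this only illustrates the overall workflow.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
amrf = rng.gumbel(loc=120.0, scale=35.0, size=50)  # hypothetical 50-year record (mm)

candidates = {
    "Gumbel (max)": stats.gumbel_r,
    "Log normal": stats.lognorm,
    "GEV": stats.genextreme,
    "Pearson Type 3": stats.pearson3,
}

for name, dist in candidates.items():
    params = dist.fit(amrf)
    ks_stat, p_value = stats.kstest(amrf, dist.cdf, args=params)
    # 100-year return level: quantile at non-exceedance probability 1 - 1/100
    q100 = dist.ppf(1 - 1 / 100, *params)
    print(f"{name:15s} KS={ks_stat:.3f} p={p_value:.3f} 100-yr={q100:.1f} mm")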

Keywords: RAMSE, multiple frequency analysis, annual maximum rainfall, L-moments

Procedia PDF Downloads 52
2784 An Application of Modified M-out-of-N Bootstrap Method to Heavy-Tailed Distributions

Authors: Hannah F. Opayinka, Adedayo A. Adepoju

Abstract:

This study is an extension of a prior study on the modification of the existing m-out-of-n (moon) bootstrap method for heavy-tailed distributions, in which the modified m-out-of-n (mmoon) bootstrap was proposed as an alternative to the existing moon technique. In this study, both the moon and mmoon techniques were applied to two real income datasets, which followed lognormal and Pareto distributions, respectively, with finite variances. The performances of the two techniques were compared using the Standard Error (SE) and Root Mean Square Error (RMSE). The findings showed that mmoon outperformed the moon bootstrap, yielding smaller SEs and RMSEs for all sample sizes considered in the two datasets.
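
For readers unfamiliar with the baseline technique, the following is a minimal sketch of the standard m-out-of-n (moon) bootstrap on a hypothetical heavy-tailed income sample; the authors' mmoon modification is not specified in the abstract and is therefore not reproduced here.

# Sketch: standard m-out-of-n bootstrap for a heavy-tailed sample.
# The abstract's "mmoon" modification is not detailed here, so this only
# shows the baseline moon resampling it builds on; data are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
income = rng.pareto(a=2.5, size=500) + 1.0  # hypothetical Pareto-like incomes

def moon_bootstrap(data, stat, m, reps=2000, rng=rng):
    """Resample m < n observations with replacement; return bootstrap SE."""
    estimates = np.array([
        stat(rng.choice(data, size=m, replace=True)) for _ in range(reps)
    ])
    # Rescale by sqrt(m/n) so the SE refers to the full-sample estimator.
    return estimates.std(ddof=1) * np.sqrt(m / len(data))

n = len(income)
m = int(n ** 0.7)  # a common choice: m grows, but m/n -> 0
print("moon SE of the mean:", moon_bootstrap(income, np.mean, m))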

Keywords: Bootstrap, income data, lognormal distribution, Pareto distribution

Procedia PDF Downloads 155
2783 A Copula-Based Approach for the Assessment of Severity of Illness and Probability of Mortality: An Exploratory Study Applied to Intensive Care Patients

Authors: Ainura Tursunalieva, Irene Hudson

Abstract:

Continuous improvement of both the quality and safety of health care is an important goal in Australia and internationally. The intensive care unit (ICU) receives patients with a wide variety and severity of illnesses. Accurately identifying patients at risk of developing complications or dying is crucial to increasing healthcare efficiency. Thus, it is essential for clinicians and researchers to have a robust framework capable of evaluating the risk profile of a patient. ICU scoring systems provide such a framework. The Acute Physiology and Chronic Health Evaluation III (APACHE III) and the Simplified Acute Physiology Score II (SAPS II) are ICU scoring systems frequently used for assessing the severity of acute illness. These scoring systems collect multiple risk factors for each patient, including physiological measurements, and then render the assessment outcomes of the individual risk factors into a single numerical value. A higher score is related to a more severe patient condition. Furthermore, the Mortality Probability Model II (MPM II) uses logistic regression based on independent risk factors to predict a patient's probability of mortality. An important overlooked limitation of SAPS II and MPM II is that they do not, to date, include interaction terms between a patient's vital signs. This is a prominent oversight, as it is likely there is an interplay among vital signs: the co-existence of certain conditions may pose a greater health risk than when these conditions exist independently. One barrier to including such interaction terms in predictive models is the dimensionality issue, as it becomes difficult to use variable selection. We propose an innovative scoring system which takes into account the dependence structure among a patient's vital signs, such as systolic and diastolic blood pressures, heart rate, pulse interval, and peripheral oxygen saturation. Copulas will capture the dependence among normally distributed and skewed variables, as some of the vital sign distributions are skewed. The estimated dependence parameter will then be incorporated into the traditional scoring systems to adjust the points allocated for the individual vital sign measurements. The same dependence parameter will also be used to create an alternative copula-based model for predicting a patient's probability of mortality. The new copula-based approach will accommodate not only a patient's trajectories of vital signs but also the joint dependence probabilities among the vital signs. We hypothesise that this approach will produce more stable assessments and lead to more time-efficient and accurate predictions. We will use two data sets: (1) 250 ICU patients admitted once to the Chui Regional Hospital (Kyrgyzstan) and (2) 37 ICU patients' agitation-sedation profiles collected by the Hunter Medical Research Institute (Australia). Both the traditional scoring approach and our copula-based approach will be evaluated using the Brier score to indicate overall model performance, the concordance (or c) statistic to indicate discriminative ability (the area under the receiver operating characteristic (ROC) curve), and goodness-of-fit statistics for calibration. We will also report discrimination and calibration values and establish visualization of the copulas and of high-dimensional regions of risk interrelating two or three vital signs in so-called higher-dimensional ROCs.
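
A minimal sketch of the copula-calibration step described above, assuming a Gaussian copula and synthetic vital-sign data (the study itself treats the choice of copula family as part of the modelling):

# Sketch: capture the dependence between two skewed vital signs with a
# Gaussian copula. Data are hypothetical; Kendall's tau is rank-based,
# so the skewed marginals do not distort the dependence estimate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
heart_rate = rng.gamma(shape=9.0, scale=10.0, size=250)        # skewed marginal
spo2 = 100 - rng.lognormal(mean=0.5, sigma=0.6, size=250)      # skewed marginal

tau, _ = stats.kendalltau(heart_rate, spo2)

# For elliptical copulas, tau maps to the copula correlation parameter:
rho = np.sin(np.pi * tau / 2.0)
print(f"Kendall tau = {tau:.3f}, Gaussian-copula rho = {rho:.3f}")

# The dependence parameter rho could then adjust the points a scoring
# system allocates when both vital signs are simultaneously abnormal.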

Keywords: copula, intensive care unit scoring system, ROC curves, vital sign dependence

Procedia PDF Downloads 127
2782 Evaluating Forecasts Through Stochastic Loss Order

Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio

Abstract:

We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is the way it is customarily performed in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. Although loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests and are robust to the correlation, autocorrelation, and heteroskedasticity settings those tests consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution instead of just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
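
The core idea, comparing whole loss distributions rather than expected losses, can be illustrated with a small sketch. The losses below are simulated, and the check shown is a naive empirical first-order stochastic dominance comparison, not the authors' test:

# Sketch: compare two forecast procedures by the stochastic order of their
# loss distributions instead of by expected loss alone. Procedure A is
# stochastically smaller if its loss CDF lies above B's everywhere.
import numpy as np

rng = np.random.default_rng(3)
loss_a = np.abs(rng.normal(0.0, 1.0, size=300))   # hypothetical |errors|, model A
loss_b = np.abs(rng.normal(0.2, 1.3, size=240))   # samples need not be same size

grid = np.sort(np.concatenate([loss_a, loss_b]))
cdf_a = np.searchsorted(np.sort(loss_a), grid, side="right") / len(loss_a)
cdf_b = np.searchsorted(np.sort(loss_b), grid, side="right") / len(loss_b)

if np.all(cdf_a >= cdf_b):
    print("A's losses are (empirically) stochastically smaller than B's")
else:
    # How far the empirical CDFs deviate from the hypothesized order:
    print("order violated; max violation =", np.max(cdf_b - cdf_a))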

Keywords: forecast evaluation, stochastic order, multiple comparison, non-parametric test

Procedia PDF Downloads 54
2781 Continuous Improvement in Emerging Economies: Insights from a Multi-Case Analysis

Authors: Luis A. Paipa-Galeano, Yavar Jarrah-Nezhad, César A. Bernal-Torres

Abstract:

This paper presents a case study of four companies in an emerging economy to identify the key success factors and barriers to sustaining continuous improvement practices. The study analyzes the empirical evidence and compares it to the literature review to provide insights for companies looking to increase their maturity level in this area. The five success factors identified are the availability of resources, commitment and support from management, participation of employees in identifying tasks to improve, clear and realistic objectives for continuous improvement, and the existence of a leader or person responsible for continuous improvement. The major barriers to success are a lack of alignment between the organization's strategic objectives and continuous improvement objectives, a lack of motivation in the team, and resistance to change. The paper concludes with recommendations for companies to reduce the risk of improvement failure and increase their maturity level in continuous improvement.

Keywords: emerging economies, Kaizen, continuous improvement sustainability, maturity model

Procedia PDF Downloads 27
2780 Comparative Study of Continuous Versus Pulsed Ultrasound in Knee Osteoarthritis

Authors: Karim Mohamed Fawzy Ghuiba, Alaa Aldeen Abd Al Hakeem Balbaa, Shams Elbaz

Abstract:

Objectives: To compare the effects of continuous and pulsed ultrasound on pain and function in patients with knee osteoarthritis. Design: Randomized, single-blinded study. Participants: 6 patients with knee osteoarthritis, mean age 53.66 ± 3.61 years, Altman Grade II or III. Interventions: Subjects were randomly assigned into two groups; Group A received continuous ultrasound and Group B received pulsed ultrasound. Outcome measures: The effects of pulsed and continuous ultrasound were evaluated by pain threshold, assessed by visual analogue scale (VAS) scores, and function, assessed by the Western Ontario and McMaster Universities osteoarthritis index (WOMAC) scores. Results: There was no significant decrease in VAS and WOMAC scores in patients treated with pulsed or continuous ultrasound, and there were no significant differences between the two groups. Conclusion: There is no difference between the effects of pulsed and continuous ultrasound in pain relief or functional outcome in patients with knee osteoarthritis.

Keywords: knee osteoarthritis, pulsed ultrasound, ultrasound therapy, continuous ultrasound

Procedia PDF Downloads 245
2779 Hybrid EMPCA-Scott Approach for Estimating Probability Distributions of Mutual Information

Authors: Thuvanan Borvornvitchotikarn, Werasak Kurutach

Abstract:

Mutual information (MI) is widely used in medical image registration. In the analysis of different medical images, it is difficult to choose an optimal number of bins for calculating the probability distributions in MI. As a result, this paper presents a new adaptive bin-number selection approach, named the hybrid EMPCA-Scott approach. This work combines expectation maximization principal component analysis (EMPCA) and a modified Scott's rule. The proposed approach solves the binning problem arising from the various intensity values in medical images. Experimental results show lower registration errors compared to other adaptive binning approaches.
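
As a rough illustration of the binning problem the abstract addresses, the sketch below computes histogram-based mutual information with the bin count chosen by the plain Scott's rule; the paper's EMPCA-based modification is omitted, and the images are synthetic:

# Sketch: histogram-based mutual information between two images, with the
# bin count chosen by Scott's rule (the paper's EMPCA refinement is omitted).
import numpy as np

def scott_bins(x):
    """Number of bins from Scott's rule: width h = 3.5 * sigma * n^(-1/3)."""
    h = 3.5 * x.std() * len(x) ** (-1 / 3)
    return max(1, int(np.ceil((x.max() - x.min()) / h)))

def mutual_information(img1, img2):
    a, b = img1.ravel(), img2.ravel()
    bins = (scott_bins(a), scott_bins(b))
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(4)
fixed = rng.normal(size=(64, 64))
moving = fixed + 0.3 * rng.normal(size=(64, 64))  # hypothetical noisy copy
print("MI (nats):", mutual_information(fixed, moving))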

Keywords: mutual information, EMPCA, Scott, probability distributions

Procedia PDF Downloads 220
2778 Evaluating Performance of Value at Risk Models for the MENA Islamic Stock Market Portfolios

Authors: Abderrazek Ben Maatoug, Ibrahim Fatnassi, Wassim Ben Ayed

Abstract:

In this paper, we investigate the issue of market risk quantification for Middle East and North Africa (MENA) Islamic equity markets. We use Value-at-Risk (VaR) as a measure of potential risk in the Islamic stock market, for long and short positions, based on the RiskMetrics model and conditional parametric ARCH-class volatility models with normal, Student, and skewed-Student distributions. The sample consists of daily data over 2006-2014 for 11 Islamic stock market indices. We conduct Kupiec's test and Engle and Manganelli's test to evaluate the performance of each model. Our main empirical results show (i) the superior performance of VaR models based on the Student and skewed-Student distributions at the significance level of α = 1%, for all Islamic stock market indices and for both long and short trading positions, and (ii) that the RiskMetrics model and the VaR model based on conditional volatility with a normal distribution provide the most accurate VaR estimations for both long and short trading positions at a significance level of α = 5%.
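
A minimal sketch of one of the ingredients above, the RiskMetrics (EWMA) VaR combined with Kupiec's unconditional-coverage backtest, on simulated returns rather than the MENA index data:

# Sketch: RiskMetrics (EWMA) VaR for a long position plus Kupiec's
# unconditional-coverage backtest. Returns are simulated, not MENA data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
r = rng.standard_t(df=5, size=2000) * 0.01       # hypothetical daily returns

lam, alpha = 0.94, 0.01                          # RiskMetrics decay, 1% VaR
var_t = np.empty_like(r)
sigma2 = r[:50].var()                            # warm-up variance
for t in range(len(r)):
    var_t[t] = stats.norm.ppf(alpha) * np.sqrt(sigma2)  # loss threshold (< 0)
    sigma2 = lam * sigma2 + (1 - lam) * r[t] ** 2       # EWMA update

violations = (r < var_t).sum()
n = len(r)
p_hat = violations / n

# Kupiec LR test of H0: true violation rate equals alpha, LR ~ chi2(1).
ll = lambda p: (n - violations) * np.log(1 - p) + violations * np.log(p)
lr = -2 * (ll(alpha) - ll(p_hat))
print(f"violations={violations}, LR={lr:.2f}, p={1 - stats.chi2.cdf(lr, 1):.3f}")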

Keywords: value-at-risk, risk management, Islamic finance, GARCH models

Procedia PDF Downloads 556
2777 A Strategy for the Application of Second-Order Monte Carlo Algorithms to Petroleum Exploration and Production Projects

Authors: Obioma Uche

Abstract:

Due to the recent volatility in oil & gas prices as well as increased development of non-conventional resources, it has become even more essential to critically evaluate the profitability of petroleum prospects prior to making any investment decisions. Traditionally, simple Monte Carlo (MC) algorithms have been used to randomly sample probability distributions of economic and geological factors (e.g. price, OPEX, CAPEX, reserves, productive life, etc.) in order to obtain probability distributions for profitability metrics such as Net Present Value (NPV). In recent years, second-order MC algorithms have been shown to offer an advantage over simple MC techniques due to the added consideration of uncertainties associated with the probability distributions of the relevant variables. Here, a strategy for the application of the second-order MC technique to a case study is demonstrated to analyze its effectiveness as a tool for portfolio management.
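
The second-order idea can be sketched as a two-level simulation: an outer loop draws uncertain parameters of the input distributions, and an inner loop runs an ordinary first-order MC given those parameters. All figures below are hypothetical:

# Sketch of the second-order idea: the outer loop samples uncertain
# *parameters* of the input distributions, the inner loop samples the
# inputs themselves, yielding a family of NPV distributions rather than one.
import numpy as np

rng = np.random.default_rng(6)
outer, inner, discount = 200, 2000, 0.10

p5_p95 = []
for _ in range(outer):
    # Outer: parameter uncertainty about the price and reserves distributions.
    mean_price = rng.normal(70, 10)                 # $/bbl, uncertain mean
    reserve_mu = rng.normal(np.log(5e6), 0.2)       # log-mean of reserves (bbl)
    # Inner: ordinary first-order MC given those parameters.
    price = rng.normal(mean_price, 8, size=inner)
    reserves = rng.lognormal(reserve_mu, 0.5, size=inner)
    capex, opex = 120e6, 15 * reserves              # fixed CAPEX, $15/bbl OPEX
    npv = (price * reserves - opex) / (1 + discount) - capex
    p5_p95.append(np.percentile(npv, [5, 95]))

p5_p95 = np.array(p5_p95)
print("P5 NPV ranges over parameter uncertainty:",
      p5_p95[:, 0].min() / 1e6, "to", p5_p95[:, 0].max() / 1e6, "MM$")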

Keywords: Monte Carlo algorithms, portfolio management, profitability, risk analysis

Procedia PDF Downloads 299
2776 A Continuous Boundary Value Method of Order 8 for Solving the General Second Order Multipoint Boundary Value Problems

Authors: T. A. Biala

Abstract:

This paper deals with the numerical integration of general second order multipoint boundary value problems. This has been achieved by the development of a continuous linear multistep method (LMM). The continuous LMM is used to construct a main discrete method to be used with some initial and final methods (also obtained from the continuous LMM) so that together they form a discrete analogue of the continuous second order boundary value problems. These methods are used as boundary value methods and adapted to cope with the integration of general second order multipoint boundary value problems. The convergence, the use, and the region of absolute stability of the methods are discussed. Several numerical examples are implemented to elucidate our solution process.

Keywords: linear multistep methods, boundary value methods, second order multipoint boundary value problems, convergence

Procedia PDF Downloads 352
2775 A Family of Distributions on Learnable Problems without Uniform Convergence

Authors: César Garza

Abstract:

In supervised binary classification and regression problems, it is well-known that learnability is equivalent to a uniform convergence of the hypothesis class, and if a problem is learnable, it is learnable by empirical risk minimization. For the general learning setting of unsupervised learning tasks, there are non-trivial learning problems where uniform convergence does not hold. We present here the task of learning centers of mass with an extra feature that “activates” some of the coordinates over the unit ball in a Hilbert space. We show that the learning problem is learnable under a stable RLM rule. We introduce a family of distributions over the domain space with some mild restrictions for which the sample complexity of uniform convergence for these problems must grow logarithmically with the dimension of the Hilbert space. If we take this dimension to infinity, we obtain a learnable problem for which the uniform convergence property fails for a vast family of distributions.

Keywords: statistical learning theory, learnability, uniform convergence, stability, regularized loss minimization

Procedia PDF Downloads 92
2774 Parameters Estimation of Multidimensional Possibility Distributions

Authors: Sergey Sorokin, Irina Sorokina, Alexander Yazenin

Abstract:

We present a solution to the Maxmin u/E parameter estimation problem for possibility distributions in the m-dimensional case. Our method is based on a geometrical approach in which a minimal-area enclosing ellipsoid is constructed around the sample. We also demonstrate that one can improve the results of well-known algorithms in the fuzzy model identification task by using Maxmin u/E parameter estimation.

Keywords: possibility distribution, parameters estimation, Maxmin u/E estimator, fuzzy model identification

Procedia PDF Downloads 436
2773 Hybrid Inventory Model Optimization under Uncertainties: A Case Study in a Manufacturing Plant

Authors: E. Benga, T. Tengen, A. Alugongo

Abstract:

Periodic and continuous inventory models are the two classical management tools used to handle inventories, and each has advantages and disadvantages. The implementation of both the continuous (r, Q) and periodic (R, S) inventory models in most manufacturing plants comes with higher costs. Such high inventory costs are due to the fact that most manufacturing plants are not flexible enough. Since demand and lead time are two important variables of every inventory model, their effect on the flexibility of the manufacturing plant matters most. Unfortunately, these effects are not clearly understood by managers. The reason is that the decision parameters of the continuous (r, Q) and periodic (R, S) inventory models are not designed to deal effectively with uncertainties such as poor manufacturing, delivery, and supply performances. There is, therefore, a need to come up with a predictive, hybrid inventory model that combines, in some sense, the features of the aforementioned inventory models. A linear combination technique is used to hybridize the continuous (r, Q) and periodic (R, S) inventory models. The behavior of this hybrid inventory model is described by a differential equation and then optimized. The simulation results show that the continuous (r, Q) inventory model is more effective than the periodic (R, S) model in the short run, but this difference changes as time goes by. Because the hybrid inventory model is more cost-effective than either the continuous (r, Q) or the periodic (R, S) model in the long run, it should be implemented for strategic decisions.

Keywords: periodic inventory, continuous inventory, hybrid inventory, optimization, manufacturing plant

Procedia PDF Downloads 357
2772 Tumour Radionuclides Therapy: in vitro and in vivo Dose Distribution Study

Authors: Rekaya A. Shabbir, Marco Mingarelli, Glenn Flux, Ananya Choudhury, Tim A. D. Smith

Abstract:

Introduction: Heterogeneity of dose distributions across a tumour is problematic for targeted radiotherapy. Gold nanoparticles (AuNPs) enhance the dose distributions of targeted radionuclides. The aim of this study is to determine whether targeted AuNPs radiolabelled with either of two radioisotopes (¹⁷⁷Lu and ⁹⁰Y) produce homogeneous dose distributions in breast cancer tumours. Moreover, in vitro and in vivo studies were conducted to study the importance of receptor level on the cytotoxicity of EGFR-targeted AuNPs in breast and colorectal cancer cells. Methods: AuNPs were functionalised with DOTA and OPPS-PEG-SVA to optimise labelling with radionuclide tracers and targeting with Erbitux. Radionuclides were chelated with DOTA, and the uptake of the radiolabelled AuNPs and targeted activity in vitro in both cell lines was measured using liquid scintillation counting. Cells with medium (HCT8) and high (MDA-MB-468) EGFR expression were incubated with targeted ¹⁷⁷Lu-AuNPs for 4 h, then washed and allowed to form colonies. Nude mice bearing tumours were used to study the biodistribution by injecting ¹⁷⁷Lu-AuNPs or ⁹⁰Y-AuNPs via the tail vein. Heterogeneity of dose distribution in tumours was determined using autoradiography. Results: Colony formation (% control) was 81 ± 4.7% (HCT8) and 32 ± 9% (MDA-MB-468). High uptake was observed in the liver and spleen, indicating hepatobiliary excretion. Imaging showed heterogeneity in dose distributions for both radionuclides across the tumours. Conclusion: The cytotoxic effect of EGFR-targeted AuNPs is greater in cells with higher EGFR expression. Dose distributions for individual radiolabelled nanoparticles were heterogeneous across tumours. Further strategies are required to improve the uniformity of dose distribution prior to clinical trials.

Keywords: cancer cells, dose distributions, radionuclide therapy, targeted gold nanoparticles

Procedia PDF Downloads 96
2771 Continuous Synthesis of Nickel Nanoparticles by Hydrazine Reduction

Authors: Yong-Su Jo, Seung-Min Yang, Seok Hong Min, Tae Kwon Ha

Abstract:

Nickel nanoparticles were synthesized by the reduction of nickel chloride with hydrazine in an aqueous solution. The effect of hydrazine concentration on batch-processed particle characteristics was investigated using Field Emission Scanning Electron Microscopy (FESEM). Both the average particle size and the geometric standard deviation (GSD) decreased with increasing hydrazine concentration. The continuous synthesis of nickel nanoparticles by the microemulsion method was also studied using FESEM and X-ray Diffraction (XRD). The average size and geometric standard deviation of the continuous-processed particles were 87.4 nm and 1.16, respectively. X-ray diffraction revealed that the continuous-processed particles were pure crystalline nickel with a face-centered cubic (fcc) structure.

Keywords: nanoparticle, hydrazine reduction, continuous process, microemulsion method

Procedia PDF Downloads 426
2770 Adaptation of Projection Profile Algorithm for Skewed Handwritten Text Line Detection

Authors: Kayode A. Olaniyi, Tola. M. Osifeko, Adeola A. Ogunleye

Abstract:

Text line segmentation is an important step in document image processing. It is a labeling process that assigns the same label to spatially aligned units using a distance-metric probability. Text line detection techniques have been implemented successfully mainly for printed documents. However, processing handwritten text, especially in unconstrained documents, has remained a key problem. This is because unconstrained handwritten text lines are often not uniformly skewed, and the spaces between text lines may not be obvious, complicated by the nature of handwriting and overlapping ascenders and/or descenders of some characters. Hence, text line detection and segmentation remain a leading challenge in handwritten document image processing. Text line detection methods that rely on the traditional global projection profile of the text document cannot efficiently cope with the problem of variable skew angles between different text lines, so formulating a horizontal line as a separator is often not effective. This paper presents a technique to segment a handwritten document into distinct lines of text. The proposed algorithm starts by partitioning the initial text image across its width into vertical strips of about 5% each. For each vertical strip, the histogram of horizontal runs is projected, working with the assumption that text appearing within a single strip is almost parallel. The algorithm provides a sliding window through the first vertical strip on the left side of the page and runs through it to identify each new minimum corresponding to a valley in the projection profile. Each valley represents the starting point of an orientation line, and the ending point is the minimum point on the projection profile of the next vertical strip. The derived text lines traverse around any obstructing connected components by associating each with either the line above or the line below; this association decision is made using a probability obtained from a distance metric. The technique outperforms the global projection profile for text line segmentation and is robust enough to handle skewed documents and those with lines running into each other.
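
A minimal sketch of the strip-wise projection profile step described above, on a synthetic binary image; the chaining of valleys across strips and the connected-component association are only indicated in comments:

# Sketch of the strip-wise projection profile: split a binarized page into
# 5%-wide vertical strips, project horizontal ink counts in each strip,
# and take profile valleys as candidate line separators.
import numpy as np

def strip_separators(binary_img, strip_frac=0.05, smooth=9):
    """binary_img: 2-D array, 1 = ink. Returns valley rows per strip."""
    h, w = binary_img.shape
    strip_w = max(1, int(w * strip_frac))
    separators = []
    for x0 in range(0, w, strip_w):
        profile = binary_img[:, x0:x0 + strip_w].sum(axis=1).astype(float)
        kernel = np.ones(smooth) / smooth
        profile = np.convolve(profile, kernel, mode="same")  # reduce noise
        # Valleys: local minima of the smoothed profile (candidate gaps).
        valleys = [y for y in range(1, h - 1)
                   if profile[y] <= profile[y - 1] and profile[y] < profile[y + 1]]
        separators.append(valleys)
    return separators  # to be chained strip-to-strip into skewed text lines

rng = np.random.default_rng(7)
page = (rng.random((200, 400)) < 0.05).astype(int)  # hypothetical noise image
print(len(strip_separators(page)), "strips processed")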

Keywords: connected-component, projection-profile, segmentation, text-line

Procedia PDF Downloads 93
2769 Effect of Variable Fluxes on Optimal Flux Distribution in a Metabolic Network

Authors: Ehsan Motamedian

Abstract:

Finding all optimal flux distributions of a metabolic model is an important challenge in systems biology. In this paper, a new algorithm is introduced to identify all alternate optimal solutions of a large-scale metabolic network. The algorithm reduces the model to decrease the computations needed for finding optimal solutions. The algorithm was implemented on the Escherichia coli metabolic model to find all optimal solutions for lactate and acetate production. There were more optimal flux distributions when acetate production was optimized. The model was reduced from 1076 to 80 variable fluxes for lactate, while it was reduced to 91 variable fluxes for acetate. These 11 additional variable fluxes resulted in about three times more optimal flux distributions. The variable fluxes came from 12 different metabolic pathways, and most of them belonged to the nucleotide salvage and extracellular transport pathways.

Keywords: flux variability, metabolic network, mixed-integer linear programming, multiple optimal solutions

Procedia PDF Downloads 396
2768 Determination of the Best Fit Probability Distribution for Annual Rainfall in the Karkheh River Basin, Iran

Authors: Karim Hamidi Machekposhti, Hossein Sedghi

Abstract:

This study was designed to find the best-fit probability distribution of annual rainfall based on a 50-year sample (1966-2015) in the Karkheh river basin, Iran, using six probability distributions: Normal, 2-Parameter Log Normal, 3-Parameter Log Normal, Pearson Type 3, Log Pearson Type 3, and Gumbel. The best-fit probability distribution was selected using Stormwater Management and Design Aid (SMADA) software, based on the Residual Sum of Squares (RSS) between observed and estimated values. Based on the RSS values of the fit tests, the Log Pearson Type 3 and then the Pearson Type 3 distributions were found to be the best-fit probability distributions at the Jelogir Majin and Pole Zal rainfall gauging stations. The annual values of expected rainfall were calculated using the best-fit probability distributions and can be used by hydrologists and design engineers in future research in the studied region and other regions of the world.
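
A minimal sketch of RSS-based selection, comparing each fitted CDF with empirical plotting-position probabilities; the rainfall series, the plotting-position formula, and the use of scipy's maximum-likelihood fits (SMADA's internals may differ) are all assumptions:

# Sketch: rank candidate distributions by the residual sum of squares (RSS)
# between fitted CDF values and empirical plotting-position probabilities.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
annual_rain = rng.lognormal(mean=6.0, sigma=0.3, size=50)  # hypothetical, mm

x = np.sort(annual_rain)
emp = (np.arange(1, len(x) + 1) - 0.375) / (len(x) + 0.25)  # Blom positions

fits = {
    "Normal": stats.norm,
    "2-param Log Normal": stats.lognorm,
    "Pearson Type 3": stats.pearson3,
    "Gumbel": stats.gumbel_r,
}
for name, dist in fits.items():
    params = dist.fit(x)
    rss = np.sum((dist.cdf(x, *params) - emp) ** 2)
    print(f"{name:20s} RSS = {rss:.4f}")
# Log Pearson Type 3 would be handled the same way on log-transformed data.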

Keywords: Log Pearson Type 3, SMADA, rainfall, Karkheh River

Procedia PDF Downloads 167
2767 Evaluation of Carbon Dioxide Pressure through Radial Velocity Difference in Arterial Blood Modeled by Drift Flux Model

Authors: Aicha Rima Cheniti, Hatem Besbes, Joseph Haggege, Christophe Sintes

Abstract:

In this paper, we are interested in determining the carbon dioxide pressure in arterial blood through the radial velocity difference. The blood was modeled as a two-phase mixture (an aqueous carbon dioxide solution with carbon dioxide gas) using the drift flux model and the Young-Laplace equation. The distributions of mixture velocities determined from the considered model permitted the calculation of the radial velocity distributions for different values of the mean mixture pressure, and the calculation of the mean carbon dioxide pressure given the mean mixture pressure. The radial velocity distributions are used to deduce a method for calculating the mean mixture pressure from the radial velocity difference between two positions, which is measured by ultrasound. The mean carbon dioxide pressure is then deduced from the mean mixture pressure.

Keywords: mean carbon dioxide pressure, mean mixture pressure, mixture velocity, radial velocity difference

Procedia PDF Downloads 391
2766 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis

Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro

Abstract:

If risk management of company-owned assets, risk assessment of real estate portfolios, and risk identification for an entire region are to be implemented, it is necessary to consider simultaneous damage to multiple buildings. This research focuses on the Sagami Trough earthquake tsunami, which could have a significant effect on the Japanese capital region, and proposes a method for simultaneous damage assessment using copulas that can take into consideration the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at two sites were simulated by using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depths (five cases) for each of 10 sources of the Sagami Trough. For each source, the frequency distributions of the tsunami inundation depth were evaluated by using the response surface method. Then, Monte-Carlo simulation was conducted, and frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough; these are the marginal distributions. Kendall's tau for the tsunami inundation simulation at the two sites was 0.83. Based on this value, the Gaussian copula, t-copula, Clayton copula, and Gumbel copula (n = 10,000) were generated. Then, the simultaneous distributions of the damage rate were evaluated using the marginal distributions and the copulas. With the correlation of the tsunami inundation depth at the two sites, the expected value hardly changed compared with the case of no correlation, but the ninety-ninth percentile value of the damage rate was approximately 2%, and the maximum value was approximately 6%, when using the Gumbel copula.
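
As a sketch of the Monte-Carlo step, the fragment below draws from a Gaussian copula calibrated to the reported Kendall's tau of 0.83 and pushes the sample through hypothetical marginal depth distributions and a toy fragility curve; the study itself also evaluates t, Clayton, and Gumbel copulas:

# Sketch: joint simulation of damage at two sites via a Gaussian copula
# calibrated to Kendall's tau = 0.83. Marginals and fragility are toys.
import numpy as np
from scipy import stats

tau = 0.83
rho = np.sin(np.pi * tau / 2)                     # tau -> Gaussian-copula rho
cov = [[1.0, rho], [rho, 1.0]]

rng = np.random.default_rng(9)
z = rng.multivariate_normal([0, 0], cov, size=10_000)
u = stats.norm.cdf(z)                             # copula sample in [0,1]^2

# Map uniforms through (hypothetical) marginal inundation-depth CDFs.
depth_site1 = stats.lognorm.ppf(u[:, 0], s=0.8, scale=2.0)   # metres
depth_site2 = stats.lognorm.ppf(u[:, 1], s=0.7, scale=1.5)

# A toy fragility curve converts depth to a building damage rate.
damage = lambda d: 1 - np.exp(-0.5 * d)
joint_damage = (damage(depth_site1) + damage(depth_site2)) / 2
print("99th percentile of portfolio damage rate:",
      np.percentile(joint_damage, 99).round(3))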

Keywords: copulas, Monte-Carlo simulation, probabilistic risk assessment, tsunamis

Procedia PDF Downloads 114
2765 Combining a Continuum of Hidden Regimes and a Heteroskedastic Three-Factor Model in Option Pricing

Authors: Rachid Belhachemi, Pierre Rostan, Alexandra Rostan

Abstract:

This paper develops a discrete-time option pricing model for index options. The model consists of two key ingredients. First, daily stock return innovations are driven by a continuous hidden threshold mixed skew-normal (HTSN) distribution, which generates the conditional non-normality needed to fit daily index returns. The most important feature of the HTSN is the inclusion of a latent state variable with a continuum of states, unlike traditional mixture distributions where the state variable is discrete with a small number of states. The HTSN distribution belongs to the class of univariate probability distributions whose parameters capture the dependence between the variable of interest and the continuous latent state variable (the regime). The distribution has an interpretation in terms of a mixture distribution with time-varying mixing probabilities. It has been shown empirically that this distribution outperforms its main competitor, the mixed normal (MN) distribution, in terms of capturing the stylized facts known for stock returns, namely volatility clustering, leverage effect, skewness, kurtosis, and regime dependence. Second, heteroscedasticity in the model is captured by a three-exogenous-factor GARCH model (GARCHX), where the factors are extracted from a matrix of world indices by principal component analysis (PCA); an application to option pricing is then presented. The empirically determined factors are uncorrelated and represent truly different common components driving the returns. Both the factors and the eight parameters inherent to the HTSN distribution aim at capturing the impact of the state of the economy on price levels, since the distribution parameters have economic interpretations in terms of conditional volatilities and correlations of the returns with the hidden continuous state. The PCA identifies statistically independent factors affecting the random evolution of a given pool of assets (in our paper, a pool of international stock indices) and sorts them by order of relative importance: it computes a historical cross-asset covariance matrix and identifies principal components representing independent factors. In our paper, these factors are used to calibrate the HTSN-GARCHX model and are ultimately responsible for the nature of the distribution of the random variables being generated. We benchmark our model against the MN-GARCHX model following the same PCA methodology and against the standard Black-Scholes model. We show that our model outperforms the MN-GARCHX benchmark in terms of RMSE in dollar losses for put and call options, which in turn outperforms the analytical Black-Scholes model, by capturing the stylized facts known for index returns, namely volatility clustering, leverage effect, skewness, kurtosis, and regime dependence.

Keywords: continuous hidden threshold, factor models, GARCHX models, option pricing, risk-premium

Procedia PDF Downloads 276
2764 Continuous Improvement as an Organizational Capability in the Industry 4.0 Era

Authors: Lodgaard Eirin, Myklebust Odd, Eleftheriadis Ragnhild

Abstract:

Continuous improvement is increasingly becoming a prerequisite for manufacturing companies to remain competitive in a global market. In addition, future survival and success will depend on the ability to manage the forthcoming digitalization transformation in the industry 4.0 era. Industry 4.0 promises substantially increased operational effectiveness, where all equipment is fitted with integrated processing and communication capabilities. Subsequently, the interplay of human and technology will evolve and influence the range of worker tasks and demands. Taking these changes into account, the concept of continuous improvement must evolve accordingly. Based on a case study from the manufacturing industry, the purpose of this paper is to point out what the concept of continuous improvement will encounter and must take into consideration when entering the fourth industrial revolution. In the past, continuous improvement focused on a culture of sustained improvement targeting the elimination of waste in all systems and processes of an organization by involving everyone. Today, it has to evolve with the forthcoming digital transformation and the increased interplay of human and digital communication systems to reach its full potential. One main finding of this study is how digital communication systems will act as an enabler to strengthen the continuous improvement process, by moving from collaboration within individual teams to interconnection of teams along the product value chain. For academics and practitioners, this will help them identify and prioritize their steps towards an industry 4.0 implementation integrated with a focus on continuous improvement.

Keywords: continuous improvement, digital communication system, human-machine-interaction, industry 4.0, team performance

Procedia PDF Downloads 170
2763 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood

Authors: Randa Alharbi, Vladislav Vyshemirsky

Abstract:

Systems biology is an important field of science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their function, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining its essential components, and representing an appropriate law for the interactions between those components. Complex biological systems exhibit stochastic behaviour; thus, probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model: it describes the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time is governed by the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires the evaluation of the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation (ABC) is a common approach for tackling inference which relies on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper, we discuss the efficiency and possible practical issues of each method, taking their computational time into account. We demonstrate likelihood-free inference by analysing a model of the repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
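
A minimal sketch of the ABC-rejection mechanics on a much smaller CTMC than the repressilator, an immigration-death process simulated with the Gillespie algorithm; the prior, tolerance, and summary statistic are illustrative choices:

# Sketch: likelihood-free (ABC rejection) inference for a simple CTMC,
# an immigration-death process simulated with the Gillespie algorithm.
import numpy as np

rng = np.random.default_rng(10)

def gillespie(birth, death, x0=10, t_end=10.0):
    t, x, traj = 0.0, x0, []
    while t < t_end:
        rates = np.array([birth, death * x])  # immigration, death
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1 / total)       # exponential waiting time
        x += 1 if rng.random() < rates[0] / total else -1
        traj.append(x)
    return np.array(traj)

obs = gillespie(birth=5.0, death=0.5)         # pretend "data" (true birth = 5)
s_obs = obs.mean()                            # summary statistic

accepted = []
for _ in range(5000):
    theta = rng.uniform(0.1, 20.0)            # prior on the birth rate
    sim = gillespie(birth=theta, death=0.5)
    if sim.size and abs(sim.mean() - s_obs) < 1.0:   # ABC tolerance
        accepted.append(theta)
print("posterior mean of birth rate ~", np.mean(accepted).round(2))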

Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)

Procedia PDF Downloads 179
2762 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes

Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar

Abstract:

Although most existing skyline query algorithms have focused on querying static points in static databases, the growing number of sensors, wireless communications, and mobile applications has increased the demand for continuous skyline queries. Unlike traditional skyline queries, which consider only static attributes, continuous skyline queries include dynamic attributes as well as static ones. However, as skyline query computation is based on checking the domination of skyline points over all dimensions, considering both the static and dynamic attributes without separation is required. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows. First, it excludes the points that cannot be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of the data points is examined; this phase indicates where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than recomputing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance, and efficiency of our algorithm over other existing approaches.
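
The domination check at the heart of any skyline computation can be sketched as follows, with static and dynamic attributes handled uniformly in one vector (smaller is better on every dimension); points and values are hypothetical:

# Sketch of the core domination test a skyline algorithm relies on.
from typing import List, Tuple

Point = Tuple[float, ...]  # e.g. (price, rating_rank, distance_to_user)

def dominates(p: Point, q: Point) -> bool:
    """p dominates q if p <= q on all dimensions and < on at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points: List[Point]) -> List[Point]:
    result = []
    for p in points:
        if not any(dominates(q, p) for q in points if q is not p):
            result.append(p)
    return result

# The dynamic attribute (distance) changes as objects move; on an update,
# only points near the affected region need re-checking rather than a
# full recomputation, which is the continuous-update idea above.
hotels = [(120.0, 2.0, 0.8), (95.0, 3.0, 1.4), (150.0, 1.0, 0.5)]
print(skyline(hotels))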

Keywords: continuous query processing, dynamic database, moving object, skyline queries

Procedia PDF Downloads 191
2761 Continuous Manufacturing of Ultra Fine Grained Materials by Severe Plastic Deformation Methods

Authors: Aslı Günay Bulutsuz, Mehmet Emin Yurci

Abstract:

Severe plastic deformation techniques are top-down deformation methods which achieve superior mechanical properties by decreasing grain size. Different kinds of severe plastic deformation methods have been widely used at various process temperatures and geometries. Despite the manufacturing advantages of severe plastic deformation techniques, most of them are used only at the laboratory level, as they cannot be adapted to industrial usage because of limited continuous manufacturability and high manufacturing costs. In order to overcome these manufacturing difficulties and enable widespread usage, different kinds of methods have been developed. In this review, a comprehensive literature survey was carried out in order to highlight continuous severe plastic deformation methods.

Keywords: continuous manufacturing, severe plastic deformation, ultrafine grains, grain size refinement

Procedia PDF Downloads 212
2760 Essentiality of Core Strategic Vision in Continuous Cost Reduction Management

Authors: Lai Ving Kam

Abstract:

Many markets are maturing, consumer buying power is weakening, and customer preferences change rapidly. To survive, many firms adopt fast-paced continuous cost reduction and competitive pricing to remain relevant. Marketers' desire to push for more sales to increase revenues has intensified competition and at times cannibalized the product and the market. Sweeping technological change has created both hope and despair for industries. The pressure to constantly reduce costs on the one hand, and to create and market new products at cheaper prices and with shorter life cycles on the other, has become a continuous endeavour. The twin trends appear irreconcilable. Can a core strategic vision provide and adapt new directions in continuous cost reduction? This study investigates whether core strategic vision is able to meet this need, so that firms can survive and stay profitable. In the current uncertain market, are firms falling back on their core strategic visions to take them out of unfavourable positions?

Keywords: core strategic vision, continuous cost reduction, fashionable products industry, competitive pricing

Procedia PDF Downloads 294
2759 Methods of Improving Production Processes Based on Deming Cycle

Authors: Daniel Tochwin

Abstract:

Continuous improvement is an essential part of effective process performance management. In order to achieve continuous quality improvement, each organization must use an appropriate selection of tools and techniques. The basic condition for success is a proper understanding of the business need faced by the company and the selection of appropriate methods to improve a given production process. The main aim of this article is to analyze the methods of conduct that are popular in practice when implementing process improvements, and then to determine whether the tested methods share a repetitive systematic approach, i.e., a similar sequence of the same or similar actions. Based on an extensive literature review, four methods of continuous improvement of production processes were selected: the A3 report, Gemba Kaizen, the PDCA cycle, and the Deming cycle. The research shows that all the frequently used improvement methods are generally based on the PDCA cycle, and the differences are due to "(re)interpretation" and the need to adapt the continuous improvement approach to the specific business process.

Keywords: continuous improvement, lean methods, process improvement, PDCA

Procedia PDF Downloads 45
2758 Finite Sample Inferences for Weak Instrument Models

Authors: Gubhinder Kundhi, Paul Rilstone

Abstract:

It is well established that Instrumental Variable (IV) estimators can be poorly behaved in the presence of weak instruments and, in particular, can be quite biased in finite samples. Finite sample approximations to the distributions of these estimators are obtained using Edgeworth and saddlepoint expansions. Departures from normality of the distributions of these estimators are analyzed using higher-order analytical corrections in these expansions. In a Monte-Carlo experiment, the performance of these expansions is compared to the first-order approximation and to other methods commonly used in finite samples, such as the bootstrap.
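
To illustrate the type of higher-order refinement involved, the sketch below applies a one-term Edgeworth correction to the normal approximation for a standardized sample mean, where an exact answer is available for comparison; this is a textbook example, not the paper's IV setting:

# Sketch: one-term Edgeworth correction to the normal approximation for a
# standardized sample mean, with an exact gamma benchmark.
import numpy as np
from scipy import stats

def edgeworth_cdf(x, skew, n):
    """P(sqrt(n)*(mean - mu)/sigma <= x) with an O(n^-1/2) skewness term."""
    phi, Phi = stats.norm.pdf(x), stats.norm.cdf(x)
    return Phi - phi * (skew / (6 * np.sqrt(n))) * (x ** 2 - 1)

# Exponential(1) sample mean: skewness 2; the sum of n Exp(1) is Gamma(n, 1).
n, x = 10, 1.5
exact = stats.gamma.cdf(n + np.sqrt(n) * x, a=n)
print("normal approx :", stats.norm.cdf(x).round(4))
print("edgeworth     :", edgeworth_cdf(x, skew=2.0, n=n).round(4))
print("exact         :", exact.round(4))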

Keywords: bootstrap, Instrumental Variable, Edgeworth expansions, Saddlepoint expansions

Procedia PDF Downloads 283
2757 Exponentiated Transmuted Weibull Distribution: A Generalization of the Weibull Probability Distribution

Authors: Abd El Hady N. Ebraheim

Abstract:

This paper introduces a new generalization of the two-parameter Weibull distribution. To this end, the quadratic rank transmutation map has been used. The new distribution is named the exponentiated transmuted Weibull (ETW) distribution. The ETW distribution has the advantage of being capable of modeling various shapes of aging and failure criteria. Furthermore, eleven lifetime distributions, such as the Weibull, exponentiated Weibull, Rayleigh, and exponential distributions, among others, follow as special cases. The properties of the new model are discussed, and maximum likelihood estimation is used to estimate the parameters. Explicit expressions are derived for the quantiles, the moments of the distribution are derived, and the order statistics are examined.
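
Since the keywords mention the inversion method, a minimal sampling sketch follows. With Weibull CDF G(x) = 1 - exp(-(x/beta)^k), the ETW CDF is F(x) = [(1 + lam)G(x) - lam*G(x)^2]^a, and inversion amounts to undoing the exponentiation and solving the quadratic transmutation map; parameter values are illustrative:

# Sketch: sampling from the exponentiated transmuted Weibull (ETW) by
# inversion, assuming the CDF F(x) = [(1+lam)*G(x) - lam*G(x)**2]**a
# with lam in [-1, 1], a > 0, and G the Weibull CDF.
import numpy as np

def etw_rvs(a, lam, k, beta, size, rng=np.random.default_rng(11)):
    u = rng.random(size)
    v = u ** (1.0 / a)                      # undo the exponentiation
    if lam == 0:
        g = v
    else:
        # Invert the quadratic transmutation map, keeping the root in [0, 1].
        g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * v)) / (2 * lam)
    return beta * (-np.log1p(-g)) ** (1.0 / k)   # Weibull quantile of g

x = etw_rvs(a=2.0, lam=0.5, k=1.5, beta=1.0, size=100_000)
print("sample mean/std:", x.mean().round(3), x.std().round(3))
# Special cases follow directly: lam = 0, a = 1 recovers the Weibull.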

Keywords: exponentiated, inversion method, maximum likelihood estimation, transmutation map

Procedia PDF Downloads 539
2756 The Beta-Fisher Snedecor Distribution with Applications to Cancer Remission Data

Authors: K. A. Adepoju, O. I. Shittu, A. U. Chukwu

Abstract:

In this paper, a new four-parameter generalized version of the Fisher-Snedecor distribution, called the Beta-F distribution, is introduced. A comprehensive account of the statistical properties of the new distribution is considered. Formal expressions for the cumulative distribution function, moments, moment generating function, and maximum likelihood estimation, as well as its Fisher information, were obtained. The flexibility and robustness of this distribution were demonstrated using cancer remission time data. The new distribution can be used in most applications where the assumptions underlying the use of other lifetime distributions are violated.
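
A minimal sketch of the beta-generated construction, under the usual definition F_BF(x) = I_{G(x)}(a, b) with G the Fisher-Snedecor CDF (the regularized incomplete beta function evaluated at the baseline CDF); parameter values are illustrative:

# Sketch: the beta-generated Beta-F distribution, assuming the standard
# construction F_BF(x) = I_{G(x)}(a, b) with G the F(d1, d2) CDF.
import numpy as np
from scipy import stats

def beta_f_cdf(x, a, b, d1, d2):
    return stats.beta.cdf(stats.f.cdf(x, d1, d2), a, b)

def beta_f_pdf(x, a, b, d1, d2):
    g = stats.f.pdf(x, d1, d2)              # baseline F density
    G = stats.f.cdf(x, d1, d2)              # baseline F CDF
    return g * stats.beta.pdf(G, a, b)      # beta-generated density

x = np.linspace(0.01, 8, 5)
print(beta_f_cdf(x, a=2.0, b=1.5, d1=5, d2=10).round(4))
print(beta_f_pdf(x, a=2.0, b=1.5, d1=5, d2=10).round(4))
# a = b = 1 recovers the ordinary Fisher-Snedecor distribution.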

Keywords: Fisher-Snedecor distribution, Beta-F distribution, outlier, maximum likelihood method

Procedia PDF Downloads 314