Search results for: skew distributions
664 Evaluating Forecasts Through Stochastic Loss Order
Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio
Abstract:
We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is how it has customarily been done in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of alternative procedures. Although loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests and are robust to the correlation, autocorrelation and heteroskedasticity settings those tests consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution instead of just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
Keywords: forecast evaluation, stochastic order, multiple comparison, non-parametric test
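As a rough illustration of comparing whole loss distributions rather than expected losses, the sketch below (Python, synthetic data; not the authors' test) checks a hypothesized first-order stochastic order between two empirical loss CDFs:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical absolute-error losses from two forecast procedures
# (unequal sample sizes are deliberate -- the abstract notes this is allowed).
loss_a = np.abs(rng.normal(0.0, 1.0, size=300))
loss_b = np.abs(rng.normal(0.2, 1.3, size=250))

# "A is stochastically smaller than B" (better) means F_A(x) >= F_B(x) for all x.
grid = np.sort(np.concatenate([loss_a, loss_b]))
F_a = np.searchsorted(np.sort(loss_a), grid, side="right") / loss_a.size
F_b = np.searchsorted(np.sort(loss_b), grid, side="right") / loss_b.size

max_violation = np.max(F_b - F_a)  # large positive values contradict "A dominates B"
print(f"max violation of F_A >= F_B: {max_violation:.4f}")

# Plain two-sample KS test for any difference between the loss distributions.
print(stats.ks_2samp(loss_a, loss_b))
```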
Procedia PDF Downloads 89
663 Hybrid EMPCA-Scott Approach for Estimating Probability Distributions of Mutual Information
Authors: Thuvanan Borvornvitchotikarn, Werasak Kurutach
Abstract:
Mutual information (MI) is widely used in medical image registration. In the analysis of different medical images, it is difficult to choose an optimal number of bins for calculating the probability distributions entering MI. This paper therefore presents a new adaptive bin-number selection approach, named the hybrid EMPCA-Scott approach, which combines expectation maximization principal component analysis (EMPCA) with a modified Scott's rule. The proposed approach solves the binning problem arising from the varied intensity values in medical images. Experimental results show lower registration errors compared to other adaptive binning approaches.
Keywords: mutual information, EMPCA, Scott, probability distributions
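A minimal sketch of the two ingredients named here, assuming a histogram-based MI estimate and numpy's built-in Scott rule (the EMPCA step and the paper's modification of Scott's rule are not reproduced):

```python
import numpy as np

def mutual_information(x, y, bins):
    """Histogram-based mutual information estimate (in nats)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)   # correlated toy "intensities"

# Scott's rule, h ~ 3.49 * sigma * n**(-1/3), picks the bin width from the
# data; numpy exposes it directly via bins="scott".
n_bins = len(np.histogram_bin_edges(x, bins="scott")) - 1
print(f"Scott bins: {n_bins}, MI ~ {mutual_information(x, y, n_bins):.3f} nats")
```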
Procedia PDF Downloads 249
662 Robust Inference with a Skew T Distribution
Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici
Abstract:
There is a growing body of evidence that non-normal data are more prevalent in nature than normal data. Examples can be quoted from, but are not restricted to, the areas of Economics, Finance and Actuarial Science. The non-normality considered here is expressed in terms of fat-tailedness and asymmetry of the relevant distribution. In this study, a skew t distribution that can be used to model data that exhibit inherently non-normal behavior is considered. This distribution has tails fatter than a normal distribution, and it also exhibits skewness. Although maximum likelihood estimates can be obtained by iteratively solving the likelihood equations, which are non-linear in form, this can be problematic in terms of convergence and in many other respects. Therefore, it is preferable to use the method of modified maximum likelihood, in which the estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact, the modified maximum likelihood estimates are asymptotically equivalent to the maximum likelihood estimates. Even in small samples, the modified maximum likelihood estimates are found to be approximately the same as the maximum likelihood estimates obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or the least squares estimates, which are known to be biased and inefficient in such cases. Furthermore, in conventional regression analysis, it is assumed that the error terms are distributed normally, and hence the well-known least squares method is considered a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent, and even transforming and/or filtering techniques may not produce normally distributed residuals. Here, a study is done for multiple linear regression models with random errors following a non-normal pattern. Through an extensive simulation, it is shown that the modified maximum likelihood estimates of the regression parameters are robust to the distributional assumptions and to various data anomalies, as compared to the widely used least squares estimates. Relevant hypothesis tests are developed and explored for desirable properties in terms of size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least squares estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement, capital allocation, etc.
Keywords: least squares estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness
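For contrast with the closed-form modified maximum likelihood estimates described above, a direct numerical MLE for the Azzalini-type skew-t density (a standard parameterization; the paper's may differ) can be sketched as follows:

```python
import numpy as np
from scipy import optimize, stats

def skew_t_nll(params, x):
    """Negative log-likelihood of the Azzalini-Capitanio skew-t density."""
    mu, log_sigma, alpha, log_nu = params
    sigma, nu = np.exp(log_sigma), np.exp(log_nu)
    z = (x - mu) / sigma
    w = alpha * z * np.sqrt((nu + 1.0) / (nu + z**2))
    logpdf = (np.log(2.0) - np.log(sigma)
              + stats.t.logpdf(z, df=nu)
              + stats.t.logcdf(w, df=nu + 1.0))
    return -np.sum(logpdf)

rng = np.random.default_rng(1)
x = stats.skewnorm.rvs(a=4.0, size=1000, random_state=rng)  # skewed toy data

# Direct iterative MLE (Nelder-Mead); the abstract's closed-form modified
# maximum likelihood estimates avoid exactly this kind of iteration.
start = np.array([x.mean(), np.log(x.std()), 1.0, np.log(5.0)])
fit = optimize.minimize(skew_t_nll, start, args=(x,), method="Nelder-Mead")
mu, sigma, alpha, nu = fit.x[0], np.exp(fit.x[1]), fit.x[2], np.exp(fit.x[3])
print(f"mu={mu:.3f}, sigma={sigma:.3f}, alpha={alpha:.3f}, nu={nu:.1f}")
```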
Procedia PDF Downloads 397
661 A Strategy for the Application of Second-Order Monte Carlo Algorithms to Petroleum Exploration and Production Projects
Authors: Obioma Uche
Abstract:
Due to the recent volatility in oil & gas prices as well as increased development of non-conventional resources, it has become even more essential to critically evaluate the profitability of petroleum prospects prior to making any investment decisions. Traditionally, simple Monte Carlo (MC) algorithms have been used to randomly sample probability distributions of economic and geological factors (e.g. price, OPEX, CAPEX, reserves, productive life, etc.) in order to obtain probability distributions for profitability metrics such as Net Present Value (NPV). In recent years, second-order MC algorithms have been shown to offer an advantage over simple MC techniques due to the added consideration of uncertainties associated with the probability distributions of the relevant variables. Here, a strategy for the application of the second-order MC technique to a case study is demonstrated to analyze its effectiveness as a tool for portfolio management.
Keywords: Monte Carlo algorithms, portfolio management, profitability, risk analysis
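A minimal sketch of the second-order idea, with every distribution, parameter value, and the one-period NPV proxy below as illustrative placeholders:

```python
import numpy as np

rng = np.random.default_rng(7)
n_outer, n_inner = 200, 2000   # parameter uncertainty x variable uncertainty

p10_npv = np.empty(n_outer)
mean_npv = np.empty(n_outer)
for i in range(n_outer):
    # Outer loop: sample the *parameters* of the input distributions.
    price_mu = rng.normal(70.0, 5.0)          # $/bbl, uncertain mean price
    reserves_sigma = rng.uniform(0.2, 0.4)    # lognormal sigma for reserves

    # Inner loop: ordinary (first-order) Monte Carlo given those parameters.
    price = rng.normal(price_mu, 8.0, n_inner)
    reserves = rng.lognormal(np.log(10.0), reserves_sigma, n_inner)  # MMbbl
    capex, opex = 250.0, 12.0                  # $MM and $/bbl, fixed here
    npv = reserves * (price - opex) - capex    # crude one-period NPV proxy
    mean_npv[i] = npv.mean()
    p10_npv[i] = np.percentile(npv, 10)

# Second-order output: a *distribution* of each risk metric, not a point.
print(f"mean NPV: {mean_npv.mean():.0f} +/- {mean_npv.std():.0f} $MM across parameter draws")
print(f"P10 NPV ranges over [{p10_npv.min():.0f}, {p10_npv.max():.0f}] $MM")
```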
Procedia PDF Downloads 332
660 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions
Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen
Abstract:
Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data, and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes many of the most frequently used distributions, such as the gamma, exponential, Weibull, and log-normal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal distributions.
Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma
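A cross-sectional sketch of the two-part idea with a generalized gamma positive part, assuming scipy's gengamma (the joint longitudinal-survival machinery of the MTJM is not reproduced):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Semi-continuous toy costs: ~30% structural zeros, right-skewed positives.
n = 2000
is_positive = rng.random(n) < 0.7
cost = np.where(is_positive, rng.gamma(shape=1.5, scale=400.0, size=n), 0.0)

# Part 1: probability of any cost (a simple proportion here; a logit model
# would replace this when covariates are available).
p_pos = (cost > 0).mean()

# Part 2: generalized gamma fit to the positive costs only (loc pinned at 0).
a, c, loc, scale = stats.gengamma.fit(cost[cost > 0], floc=0)

# The marginal mean combines both parts: E[Y] = P(Y>0) * E[Y | Y>0].
overall_mean = p_pos * stats.gengamma.mean(a, c, loc=loc, scale=scale)
print(f"P(cost>0)={p_pos:.2f}, GG shapes a={a:.2f}, c={c:.2f}, "
      f"marginal mean ~ {overall_mean:.0f}")
```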
Procedia PDF Downloads 176
659 A Family of Distributions on Learnable Problems without Uniform Convergence
Authors: César Garza
Abstract:
In supervised binary classification and regression problems, it is well known that learnability is equivalent to uniform convergence of the hypothesis class, and that if a problem is learnable, it is learnable by empirical risk minimization. In the general setting of unsupervised learning tasks, there are non-trivial learning problems where uniform convergence does not hold. We present here the task of learning centers of mass with an extra feature that “activates” some of the coordinates over the unit ball in a Hilbert space. We show that the learning problem is learnable under a stable regularized loss minimization (RLM) rule. We introduce a family of distributions over the domain space, with some mild restrictions, for which the sample complexity of uniform convergence for these problems must grow logarithmically with the dimension of the Hilbert space. Taking this dimension to infinity, we obtain a learnable problem for which the uniform convergence property fails for a vast family of distributions.
Keywords: statistical learning theory, learnability, uniform convergence, stability, regularized loss minimization
Procedia PDF Downloads 129
658 Parameters Estimation of Multidimensional Possibility Distributions
Authors: Sergey Sorokin, Irina Sorokina, Alexander Yazenin
Abstract:
We present a solution to the Maxmin u/E parameter estimation problem for possibility distributions in the m-dimensional case. Our method is based on a geometrical approach in which a minimal-area enclosing ellipsoid is constructed around the sample. We also demonstrate that the results of well-known algorithms for the fuzzy model identification task can be improved using Maxmin u/E parameter estimation.
Keywords: possibility distribution, parameters estimation, Maxmin u/E estimator, fuzzy model identification
Procedia PDF Downloads 470
657 Point Estimation for the Type II Generalized Logistic Distribution Based on Progressively Censored Data
Authors: Rana Rimawi, Ayman Baklizi
Abstract:
Skewed distributions are important models that are frequently used in applications. Generalized distributions form a class of skewed distributions and have gained widespread use in applications because of their flexibility in data analysis. More specifically, the generalized logistic distribution, with its different types, has received considerable attention recently. In this study, based on progressively type-II censored data, we consider point estimation for the type II generalized logistic distribution (Type II GLD). We develop several estimators for its unknown parameters, including maximum likelihood estimators (MLE), Bayes estimators and best linear unbiased estimators (BLUE). The estimators are compared by simulation using the criteria of bias and mean square error (MSE). An illustrative example based on a real data set is given.
Keywords: point estimation, type II generalized logistic distribution, progressive censoring, maximum likelihood estimation
Procedia PDF Downloads 198
656 Tumour Radionuclides Therapy: in vitro and in vivo Dose Distribution Study
Authors: Rekaya A. Shabbir, Marco Mingarelli, Glenn Flux, Ananya Choudhury, Tim A. D. Smith
Abstract:
Introduction: Heterogeneity of dose distributions across a tumour is problematic for targeted radiotherapy. Gold nanoparticles (AuNPs) enhance the dose distributions of targeted radionuclides. The aim of this study is to determine whether targeted AuNPs radiolabelled with either of two radioisotopes (¹⁷⁷Lu and ⁹⁰Y) produce homogeneous dose distributions in breast cancer cells. Moreover, in vitro and in vivo studies were conducted to assess the influence of receptor level on the cytotoxicity of EGFR-targeted AuNPs in breast and colorectal cancer cells. Methods: AuNPs were functionalised with DOTA and OPPS-PEG-SVA to optimise labelling with radionuclide tracers and targeting with Erbitux. Radionuclides were chelated with DOTA, and the uptake of the radiolabelled AuNPs and the targeted activity in vitro in both cell lines were measured using liquid scintillation counting. Cells with medium (HCT8) and high (MDA-MB-468) EGFR expression were incubated with targeted ¹⁷⁷Lu-AuNPs for 4 h, then washed and allowed to form colonies. Nude mice bearing tumours were used to study the biodistribution after injecting ¹⁷⁷Lu-AuNPs or ⁹⁰Y-AuNPs via the tail vein. Heterogeneity of dose distribution in tumours was determined using autoradiography. Results: Colony formation (% control) was 81 ± 4.7% (HCT8) and 32 ± 9% (MDA-MB-468). High uptake was observed in the liver and spleen, indicating hepatobiliary excretion. Imaging showed heterogeneity in dose distributions for both radionuclides across the tumours. Conclusion: The cytotoxic effect of EGFR-targeted AuNPs is greater in cells with higher EGFR expression. Dose distributions for individual radiolabelled nanoparticles were heterogeneous across tumours. Further strategies are required to improve the uniformity of dose distribution prior to clinical trials.
Keywords: cancer cells, dose distributions, radionuclide therapy, targeted gold nanoparticles
Procedia PDF Downloads 114
655 Effect of Variable Fluxes on Optimal Flux Distribution in a Metabolic Network
Authors: Ehsan Motamedian
Abstract:
Finding all optimal flux distributions of a metabolic model is an important challenge in systems biology. In this paper, a new algorithm is introduced to identify all alternate optimal solutions of a large-scale metabolic network. The algorithm reduces the model to decrease the computation required for finding optimal solutions. The algorithm was implemented on the Escherichia coli metabolic model to find all optimal solutions for lactate and acetate production. There were more optimal flux distributions when acetate production was optimized. The model was reduced from 1076 to 80 variable fluxes for lactate, while it was reduced to 91 variable fluxes for acetate. These 11 additional variable fluxes resulted in about three times more optimal flux distributions. The variable fluxes came from 12 different metabolic pathways, and most of them belonged to the nucleotide salvage and extracellular transport pathways.
Keywords: flux variability, metabolic network, mixed-integer linear programming, multiple optimal solutions
Procedia PDF Downloads 434
654 An Empirical Study of the Best Fitting Probability Distributions for Stock Returns Modeling
Authors: Jayanta Pokharel, Gokarna Aryal, Netra Kanaal, Chris Tsokos
Abstract:
Investment in stocks and shares aims to seek potential gains while weighing the risk of future needs, such as retirement, children's education, etc. Analysis of the behavior of stock market returns and making predictions are important for investors who wish to mitigate investment risk. Historically, normal variance models have been used to describe the behavior of stock market returns. However, the returns of financial assets are actually skewed, with higher kurtosis, heavier tails, and a higher peak than the normal distribution. The Laplace distribution and its family are natural candidates for modeling stock returns. The Variance-Gamma (VG) distribution is among the most sought-after distributions for modeling asset returns and has been extensively discussed in the financial literature. In this paper, we explore other members of the Laplace family, such as the asymmetric Laplace, skewed Laplace, and Kumaraswamy Laplace (KS) distributions, together with the Variance-Gamma, to model the weekly returns of the S&P 500 Index and its eleven business sector indices. The method of maximum likelihood is employed to estimate the parameters of the distributions, and our empirical inquiry shows that the Kumaraswamy Laplace distribution performs much better for stock returns modeling among the distributions considered in this study; in practice, KS can be used as a strong alternative to the VG distribution.
Keywords: stock returns, variance-gamma, Kumaraswamy Laplace, maximum likelihood
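A sketch of the maximum likelihood comparison on synthetic returns, limited to distributions a recent scipy ships (the Kumaraswamy Laplace and Variance-Gamma would need custom densities):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Toy weekly "returns": heavy-tailed, mildly skewed stand-ins for index data.
returns = stats.t.rvs(df=4, loc=0.001, scale=0.02, size=1500, random_state=rng)

# Fit candidates by maximum likelihood and compare with AIC = 2k - 2 logL.
candidates = {
    "normal": stats.norm,
    "laplace": stats.laplace,
    "asymmetric laplace": stats.laplace_asymmetric,
}
for name, dist in candidates.items():
    params = dist.fit(returns)
    loglik = np.sum(dist.logpdf(returns, *params))
    aic = 2 * len(params) - 2 * loglik
    print(f"{name:>18}: AIC = {aic:.1f}")
```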
Procedia PDF Downloads 70
653 Determination of the Best Fit Probability Distribution for Annual Rainfall in Karkheh River at Iran
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best-fit probability distribution of annual rainfall based on a 50-year sample (1966-2015) in the Karkheh river basin in Iran, using six probability distributions: Normal, 2-Parameter Log Normal, 3-Parameter Log Normal, Pearson Type 3, Log Pearson Type 3 and Gumbel. The best-fit probability distribution was selected using Stormwater Management and Design Aid (SMADA) software, based on the Residual Sum of Squares (R.S.S.) between observed and estimated values. Based on the R.S.S. values of the fit tests, the Log Pearson Type 3 and then the Pearson Type 3 distributions were found to be the best-fit probability distributions at the Jelogir Majin and Pole Zal rainfall gauging stations. The annual values of expected rainfall were calculated using the best-fit probability distributions and can be used by hydrologists and design engineers in future research in the studied region and in other regions of the world.
Keywords: Log Pearson Type 3, SMADA, rainfall, Karkheh River
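A sketch of the R.S.S. selection idea using scipy fits and Weibull plotting positions (SMADA itself is not reproduced; a Log Pearson Type 3 fit would apply pearson3 to the logarithms of the data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
rainfall = rng.gamma(shape=8.0, scale=55.0, size=50)   # toy 50-year annual series

# Rank the observations, assign Weibull plotting positions p = i/(n+1),
# then compare observed values with each fitted distribution's quantiles.
obs = np.sort(rainfall)
p = np.arange(1, obs.size + 1) / (obs.size + 1)

candidates = {
    "normal": stats.norm,
    "log-normal (3p)": stats.lognorm,
    "Pearson type 3": stats.pearson3,
    "Gumbel": stats.gumbel_r,
}
for name, dist in candidates.items():
    params = dist.fit(rainfall)
    rss = np.sum((obs - dist.ppf(p, *params)) ** 2)  # residual sum of squares
    print(f"{name:>16}: RSS = {rss:.0f}")
```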
Procedia PDF Downloads 191
652 Evaluation of Carbon Dioxide Pressure through Radial Velocity Difference in Arterial Blood Modeled by Drift Flux Model
Authors: Aicha Rima Cheniti, Hatem Besbes, Joseph Haggege, Christophe Sintes
Abstract:
In this paper, we aim to determine the carbon dioxide pressure in arterial blood from the radial velocity difference. The blood was modeled as a two-phase mixture (an aqueous carbon dioxide solution with carbon dioxide gas) using the drift flux model and the Young-Laplace equation. The distributions of mixture velocities determined from the considered model permitted the calculation of the radial velocity distributions for different values of the mean mixture pressure, and the calculation of the mean carbon dioxide pressure from a known mean mixture pressure. The radial velocity distributions are used to derive a method for calculating the mean mixture pressure from the radial velocity difference between two positions, which is measured by ultrasound. The mean carbon dioxide pressure is then deduced from the mean mixture pressure.
Keywords: mean carbon dioxide pressure, mean mixture pressure, mixture velocity, radial velocity difference
Procedia PDF Downloads 421
651 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis
Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro
Abstract:
If risk management of the assets owned by companies, risk assessment of real estate portfolios, and risk identification for entire regions are to be implemented, it is necessary to consider simultaneous damage to multiple buildings. In this research, the Sagami Trough earthquake tsunami, which could have a significant effect on the Japanese capital region, is considered, and a method is proposed for simultaneous damage assessment using copulas that can take into account the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at two sites were simulated using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depths (five cases) for each of 10 sources of the Sagami Trough. For each source, the frequency distributions of the tsunami inundation depth were evaluated using the response surface method. Then, Monte Carlo simulation was conducted, and frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough; these are the marginal distributions. Kendall's tau for the tsunami inundation simulation at the two sites was 0.83. Based on this value, the Gaussian copula, t-copula, Clayton copula, and Gumbel copula (n = 10,000) were generated. Then, the simultaneous distributions of the damage rate were evaluated using the marginal distributions and the copulas. With the correlation of the tsunami inundation depth at the two sites, the expected value hardly changed compared with the uncorrelated case, but the damage rate at the ninety-ninth percentile was approximately 2%, and the maximum value was approximately 6%, when using the Gumbel copula.
Keywords: copulas, Monte-Carlo simulation, probabilistic risk assessment, tsunamis
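A sketch of the copula step for the Gaussian case, calibrated from the reported Kendall's tau; the marginals below are toy lognormals, and Gumbel-copula sampling (which needs a positive-stable generator) is omitted:

```python
import numpy as np
from scipy import stats

# Gaussian copula calibrated from Kendall's tau via rho = sin(pi * tau / 2);
# tau = 0.83 is the value reported in the abstract.
tau = 0.83
rho = np.sin(np.pi * tau / 2.0)

rng = np.random.default_rng(9)
n = 10_000
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = stats.norm.cdf(z)                      # correlated uniforms (the copula)

depth_site1 = stats.lognorm.ppf(u[:, 0], s=0.6, scale=2.0)   # metres, toy marginal
depth_site2 = stats.lognorm.ppf(u[:, 1], s=0.5, scale=1.5)

# Joint tail behaviour drives simultaneous damage: check how often both
# sites exceed their own 99th-percentile depth at the same time.
q1, q2 = np.quantile(depth_site1, 0.99), np.quantile(depth_site2, 0.99)
joint_exceed = np.mean((depth_site1 > q1) & (depth_site2 > q2))
print(f"rho = {rho:.3f}, joint 99th-percentile exceedance ~ {joint_exceed:.4f}")
```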
Procedia PDF Downloads 143
650 Assessing Effects of an Intervention on Bottle-Weaning and Reducing Daily Milk Intake from Bottles in Toddlers Using Two-Part Random Effects Models
Authors: Yungtai Lo
Abstract:
Two-part random effects models have been used to fit semi-continuous longitudinal data, where the response variable has a point mass at 0 and a continuous right-skewed distribution for positive values. We review methods proposed in the literature for analyzing data with excess zeros. A two-part logit-log-normal random effects model, a two-part logit-truncated-normal random effects model, a two-part logit-gamma random effects model, and a two-part logit-skew-normal random effects model were used to examine the effects of a bottle-weaning intervention on reducing bottle use and daily milk intake from bottles in toddlers aged 11 to 13 months in a randomized controlled trial. We show, in all four two-part models, that the intervention promoted bottle-weaning and reduced daily milk intake from bottles in toddlers drinking from a bottle. We also show that there are no differences in model fit when using either the logit link function or the probit link function for modeling the probability of bottle-weaning in all four models. Furthermore, the prediction accuracy of the logit or probit link function is not sensitive to the distributional assumption on daily milk intake from bottles in toddlers still drinking from bottles.
Keywords: two-part model, semi-continuous variable, truncated normal, gamma regression, skew normal, Pearson residual, receiver operating characteristic curve
Procedia PDF Downloads 349
649 Modelling Volatility of Cryptocurrencies: Evidence from GARCH Family of Models with Skewed Error Innovation Distributions
Authors: Timothy Kayode Samson, Adedoyin Isola Lawal
Abstract:
The past five years have shown a sharp increase in public interest in the crypto market, with its market capitalization growing from $100 billion in June 2017 to $2158.42 billion on April 5, 2022. Despite the extreme volatility of cryptocurrencies, the use of skewed error innovation distributions in modelling the volatility behaviour of these digital currencies has not received much research attention. Hence, this study models the volatility of the 5 largest cryptocurrencies by market capitalization (Bitcoin, Ethereum, Tether, Binance coin, and USD coin) using four variants of GARCH models (GJR-GARCH, sGARCH, EGARCH, and APARCH), each estimated under three skewed error innovation distributions (skewed normal, skewed Student-t and skewed generalized error distributions). Daily closing prices of these currencies were obtained from the Yahoo Finance website. Findings reveal that Binance coin reported higher mean returns compared to the other digital currencies, while the skewness indicates that Binance coin, Tether, and USD coin increased more than they decreased in value within the period of study. For both Bitcoin and Ethereum, negative skewness was obtained, meaning that within the period of study the returns of these currencies decreased more than they increased in value. Returns from these cryptocurrencies were found to be stationary but not normally distributed, with evidence of the ARCH effect. The skewness parameters in all best forecasting models were significant (p<.05), justifying the use of skewed error innovation distributions with fatter tails than the normal, Student-t, and generalized error innovation distributions. For Binance coin, EGARCH-sstd outperformed the other volatility models, while for Bitcoin, Ethereum, Tether, and USD coin, the best forecasting models were EGARCH-sstd, APARCH-sstd, EGARCH-sged, and GJR-GARCH-sstd, respectively. This suggests the superiority of the skewed Student-t distribution and the skewed generalized error distribution over the skewed normal distribution.
Keywords: skewed generalized error distribution, skewed normal distribution, skewed Student-t distribution, APARCH, EGARCH, sGARCH, GJR-GARCH
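A sketch of the selected EGARCH-sstd specification, assuming the Python `arch` package (the abstract does not name its software):

```python
# EGARCH(1,1) with Hansen's skewed Student-t innovations, i.e. the
# EGARCH-sstd specification the abstract selects for Binance coin.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(21)
# Toy log-returns in percent; replace with actual crypto closing-price returns.
returns = pd.Series(rng.standard_t(df=5, size=1500) * 0.8)

model = arch_model(returns, vol="EGARCH", p=1, o=1, q=1,
                   mean="Constant", dist="skewt")
result = model.fit(disp="off")
print(result.summary())

# A skew parameter estimate significantly different from zero supports
# the skewed-innovation choice the abstract argues for.
```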
Procedia PDF Downloads 119
648 Finite Sample Inferences for Weak Instrument Models
Authors: Gubhinder Kundhi, Paul Rilstone
Abstract:
It is well established that instrumental variable (IV) estimators can be poorly behaved in the presence of weak instruments; in particular, they can be quite biased in finite samples. Finite-sample approximations to the distributions of these estimators are obtained using Edgeworth and saddlepoint expansions. Departures from normality of the distributions of these estimators are analyzed using higher-order analytical corrections in these expansions. In a Monte Carlo experiment, the performance of these expansions is compared to the first-order approximation and to other methods commonly used in finite samples, such as the bootstrap.
Keywords: bootstrap, instrumental variable, Edgeworth expansions, saddlepoint expansions
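A small Monte Carlo illustrating why finite-sample corrections matter under weak instruments (synthetic data; not the Edgeworth or saddlepoint machinery itself):

```python
import numpy as np

rng = np.random.default_rng(13)

def two_sls(y, x, z):
    """Just-identified 2SLS (IV) slope estimate: cov(z, y) / cov(z, x)."""
    return np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

beta, pi, n, reps = 1.0, 0.05, 200, 5000   # small pi => weak instrument
estimates = np.empty(reps)
for r in range(reps):
    z = rng.normal(size=n)
    u = rng.normal(size=n)
    v = 0.8 * u + 0.6 * rng.normal(size=n)   # endogeneity: corr(u, v) > 0
    x = pi * z + v
    y = beta * x + u
    estimates[r] = two_sls(y, x, z)

# With a weak first stage the IV estimator is biased toward the inconsistent
# OLS value and far from normal -- the setting the higher-order expansions
# in the abstract are designed to capture. (The median is used because the
# just-identified IV estimator has no finite moments.)
print(f"median IV estimate = {np.median(estimates):.3f} (true beta = {beta})")
```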
Procedia PDF Downloads 310
647 Development of an Implicit Physical Influence Upwind Scheme for Cell-Centered Finite Volume Method
Authors: Shidvash Vakilipour, Masoud Mohammadi, Rouzbeh Riazi, Scott Ormiston, Kimia Amiri, Sahar Barati
Abstract:
An essential component of a finite volume method (FVM) is the advection scheme, which estimates values on the cell faces from the calculated values at the nodes or cell centers. The most widely used advection schemes are upwind schemes, which have been developed in FVM on various kinds of structured and unstructured grids. In this research, the physical influence scheme (PIS) is developed for a cell-centered FVM that uses an implicit coupled solver. Results are compared with the exponential differencing scheme (EDS) and the skew upwind differencing scheme (SUDS). The accuracy of these schemes is evaluated for a lid-driven cavity flow at Re = 1000, 3200, and 5000 and a backward-facing step flow at Re = 800. Simulations show considerable differences between the results of the EDS scheme and the benchmarks, especially for the lid-driven cavity flow at high Reynolds numbers; these differences occur due to false diffusion. Comparing the SUDS and PIS schemes shows relatively close results for the backward-facing step flow but different results for the lid-driven cavity flow. The poor results of SUDS in the lid-driven cavity flow can be related to its lack of sensitivity to the pressure difference between the cell face and upwind points, which is critical for the prediction of such vortex-dominated flows.
Keywords: cell-centered finite volume method, coupled solver, exponential differencing scheme (EDS), physical influence scheme (PIS), pressure weighted interpolation method (PWIM), skew upwind differencing scheme (SUDS)
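For orientation, the simplest instance of the face-value estimation problem described above is a first-order upwind FVM in 1D; PIS, SUDS and EDS are more elaborate multidimensional answers to the same question, and the numerical smearing visible here is the "false diffusion" the abstract mentions:

```python
import numpy as np

# First-order upwind advection in 1D: each cell-face value is taken from
# the upstream cell. All grid and flow parameters are illustrative.
nx, L, u, cfl = 100, 1.0, 1.0, 0.5
dx = L / nx
dt = cfl * dx / abs(u)

x = (np.arange(nx) + 0.5) * dx
phi = np.exp(-200.0 * (x - 0.3) ** 2)        # initial scalar profile

for _ in range(int(0.4 / dt)):               # advect for t = 0.4
    # Face fluxes: with u > 0 the upwind (left) cell supplies the face value.
    flux = u * phi
    phi[1:] -= dt / dx * (flux[1:] - flux[:-1])
    phi[0] = 0.0                             # inflow boundary

# The exact solution is the initial peak translated to x = 0.7 with height 1;
# upwinding moves it correctly but flattens it (numerical diffusion).
print(f"peak after advection: {phi.max():.3f} at x = {x[np.argmax(phi)]:.2f}")
```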
Procedia PDF Downloads 284
646 Exponentiated Transmuted Weibull Distribution: A Generalization of the Weibull Probability Distribution
Authors: Abd El Hady N. Ebraheim
Abstract:
This paper introduces a new generalization of the two-parameter Weibull distribution. To this end, the quadratic rank transmutation map is used. The new distribution is named the exponentiated transmuted Weibull (ETW) distribution. The ETW distribution has the advantage of being capable of modeling various shapes of aging and failure criteria. Furthermore, eleven lifetime distributions, such as the Weibull, exponentiated Weibull, Rayleigh and exponential distributions, among others, follow as special cases. The properties of the new model are discussed, and maximum likelihood estimation is used to estimate the parameters. Explicit expressions are derived for the quantiles, the moments of the distribution are derived, and the order statistics are examined.
Keywords: exponentiated, inversion method, maximum likelihood estimation, transmutation map
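For reference, the quadratic rank transmutation map turns a baseline CDF G(x) into (1 + λ)G(x) − λG(x)², and exponentiating the result gives an exponentiated transmuted family. A sketch under one common parameterization (the paper's may differ), with W(x) the two-parameter Weibull CDF:

```latex
\[
W(x) = 1 - e^{-(x/\beta)^{\gamma}}, \qquad
F_{\mathrm{ETW}}(x) = \Bigl[(1 + \lambda)\,W(x) - \lambda\,W(x)^{2}\Bigr]^{a},
\qquad |\lambda| \le 1,\; a > 0.
\]
```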
Procedia PDF Downloads 565
645 The Beta-Fisher Snedecor Distribution with Applications to Cancer Remission Data
Authors: K. A. Adepoju, O. I. Shittu, A. U. Chukwu
Abstract:
In this paper, a new four-parameter generalized version of the Fisher-Snedecor distribution, called the beta-F distribution, is introduced. A comprehensive account of the statistical properties of the new distribution is given. Formal expressions for the cumulative distribution function, moments, moment generating function and maximum likelihood estimation, as well as its Fisher information, are obtained. The flexibility of this distribution, as well as its robustness, is demonstrated using cancer remission time data. The new distribution can be used in most applications where the assumptions underlying the use of other lifetime distributions are violated.
Keywords: Fisher-Snedecor distribution, beta-F distribution, outlier, maximum likelihood method
Procedia PDF Downloads 347
644 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the normal, exponential, and inverse Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as the Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small-to-moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method, and they confirmed that the proposed method provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset whenever goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica code uses the Fisher scoring algorithm as an iteration method to obtain the maximum likelihood estimates (MLE) of the regression parameters.
Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
Procedia PDF Downloads 33
643 Effects Induced by Dispersion-Promoting Cylinder on Fiber-Concentration Distributions in Pulp Suspension Flows
Authors: M. Sumida, T. Fujimoto
Abstract:
Fiber-concentration distributions in pulp liquid flows behind dispersion promoters were experimentally investigated to explore the feasibility of improving the operational performance of hydraulic headboxes in papermaking machines. The research took the form of a basic test conducted on a screen-type model comprising a circular cylinder inserted within a channel. Tests were performed using pulp liquid with fiber concentrations ranging from 0.3-1.0 wt% under flow velocities of 0.016-0.74 m/s. Fiber-concentration distributions were measured using the transmitted light attenuation method. The test results were analyzed, and the influence of the flow velocities on the wake characteristics behind the cylinder was investigated with reference to the findings of our preceding studies on pulp liquid flows in straight channels. Changes in the fiber-concentration distribution along the flow direction were observed to be substantially large in the section from the cylinder to four times its diameter downstream of its centerline. The findings of this study provide useful information for the development of hydraulic headboxes.
Keywords: dispersion promoter, fiber-concentration distribution, hydraulic headbox, pulp liquid flow
Procedia PDF Downloads 346
642 Investigation on Fischer-Tropsch Synthesis over Cobalt-Gadolinium Catalyst
Authors: Jian Huang, Weixin Qian, Haitao Zhang, Weiyong Ying
Abstract:
A cobalt-gadolinium catalyst for Fischer-Tropsch synthesis was prepared by the impregnation method with commercial silica gel, and its texture properties were characterized by BET, XRD, and TPR. The catalytic performance of the catalyst was tested in a fixed bed reactor. The results showed that the addition of gadolinium to the cobalt catalyst might decrease the size of the cobalt particles and increase the dispersion of the catalytically active cobalt phases. The carbon number distributions for the catalysts were calculated using the ASF equation.
Keywords: Fischer-Tropsch synthesis, cobalt-based catalysts, gadolinium, carbon number distributions
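For reference, the ASF (Anderson-Schulz-Flory) equation referred to here gives the carbon-number distribution in terms of a single chain-growth probability α:

```latex
\[
x_n = (1 - \alpha)\,\alpha^{\,n-1}, \qquad
W_n = n\,(1 - \alpha)^{2}\,\alpha^{\,n-1},
\]
```

where xₙ and Wₙ are the mole and mass fractions of hydrocarbons with n carbon atoms.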
Procedia PDF Downloads 379
641 Confidence Intervals for Quantiles in the Two-Parameter Exponential Distributions with Type II Censored Data
Authors: Ayman Baklizi
Abstract:
Based on type II censored data, we consider interval estimation of the quantiles of the two-parameter exponential distribution, and of the difference between the quantiles of two independent two-parameter exponential distributions. We derive asymptotic intervals and Bayesian intervals, as well as intervals based on the generalized pivot variable. We also include some bootstrap intervals in our comparisons. The performance of these intervals is investigated in terms of their coverage probabilities and expected lengths.
Keywords: asymptotic intervals, Bayes intervals, bootstrap, generalized pivot variables, two-parameter exponential distribution, quantiles
Procedia PDF Downloads 414
640 Design Challenges for Severely Skewed Steel Bridges
Authors: Muna Mitchell, Akshay Parchure, Krishna Singaraju
Abstract:
There is an increasing need for medium- to long-span steel bridges with complex geometry due to site restrictions in developed areas. One solution to grade separations in congested areas is to use longer spans on skewed supports that avoid at-grade obstructions, limiting impacts to the foundation. Where vertical clearances are also a constraint, continuous steel girders can be used to reduce superstructure depths. Combining continuous long steel spans with severe skews can resolve these constraints, at a cost: the behavior of skewed girders is challenging to analyze and design, with subsequent complexity during fabrication and construction. As part of a corridor improvement project, Walter P Moore designed two 1700-foot side-by-side bridges carrying four lanes of traffic in each direction over a railroad track. The bridges consist of prestressed concrete girder approach spans and three-span continuous steel plate girder units. The roadway design added complex geometry to the bridges, with horizontal and vertical curves combined with superelevation transitions within the plate girder units. The substructure at the steel units was skewed approximately 56 degrees to satisfy the existing railroad right-of-way requirements. A horizontal point of curvature (PC) near the end of the steel units required the use of flared girders and chorded slab edges. Due to the flared girder geometry, the cross-frame spacing in each bay is unique. Staggered cross frames were provided based on AASHTO LRFD and NCHRP guidelines for high-skew steel bridges. Skewed steel bridges develop significant forces in the cross frames and rotation in the girder webs due to differential displacements along the girders under dead and live loads. In addition, under thermal loads, skewed steel bridges expand and contract not along the alignment parallel to the girders but along the diagonal connecting the acute corners, resulting in horizontal displacement both along and perpendicular to the girders. AASHTO LRFD recommends a 95 degree Fahrenheit temperature differential for the design of joints and bearings. The live load and the thermal loads resulted in significant horizontal forces and rotations in the bearings that necessitated the use of HLMR bearings. A unique bearing layout was selected to minimize the effect of thermal forces. The span length, width, skew, and roadway geometry of the bridges also required modular bridge joint systems (MBJS) with inverted-T bent caps to accommodate movement in the steel units. 2D and 3D finite element analysis models were developed to accurately determine the forces and rotations in the girders, cross frames, and bearings and to estimate thermal displacements at the joints. This paper covers the decision-making process for developing the framing plan, bearing configurations, joint type, and analysis models involved in the design of these high-skew three-span continuous steel plate girder bridges.
Keywords: complex geometry, continuous steel plate girders, finite element structural analysis, high skew, HLMR bearings, modular joint
Procedia PDF Downloads 193
639 A Brief Study about Nonparametric Adherence Tests
Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim
Abstract:
Statistical study has become indispensable in various fields of knowledge. Geotechnics is no different: probabilistic and statistical methods have gained prominence there, given their use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that significantly represents the sampled data. To be able to discard bad distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set to test their fit to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
Keywords: Kolmogorov-Smirnov test, Anderson-Darling test, Cramer-Von-Mises test, nonparametric adherence tests
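The three tests named in the keywords are available directly in scipy; a minimal sketch on a skewed synthetic sample (note that estimating the normal parameters from the sample makes the stated p-values approximate):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
sample = rng.lognormal(mean=0.0, sigma=0.5, size=300)   # skewed "soil property"

# Kolmogorov-Smirnov against a normal fitted to the data.
mu, sd = sample.mean(), sample.std(ddof=1)
print(stats.kstest(sample, "norm", args=(mu, sd)))

# Anderson-Darling for normality (reports critical values, not a p-value).
print(stats.anderson(sample, dist="norm"))

# Cramer-von Mises against the same fitted normal.
print(stats.cramervonmises(sample, "norm", args=(mu, sd)))
```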
Procedia PDF Downloads 445
638 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data
Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa
Abstract:
A generalized log-logistic distribution with variable shapes of the hazard rate is introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution and leading to greater flexibility in analysing and modeling various data types. The proposed distribution has a large number of well-known lifetime special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties are derived. The method of maximum likelihood is adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study is carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal and bathtub-shaped, or reversed-bathtub-shaped) hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set and compared to its sub-models (the Weibull, log-logistic, and Burr XII distributions) and to other three-parameter parametric survival distributions, such as the exponentiated Weibull distribution, the 3-parameter log-normal distribution, the 3-parameter gamma distribution, the 3-parameter Weibull distribution, and the 3-parameter log-logistic (also known as shifted log-logistic) distribution. The proposed distribution provided a better fit than all of the competing distributions based on the goodness-of-fit tests, log-likelihood, and information criterion values. Finally, a Bayesian analysis and an assessment of the performance of Gibbs sampling for the data set are also carried out.
Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation
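For orientation, the classical two-parameter log-logistic hazard that the proposal generalizes (the abstract does not spell out the role of the added parameter) is

```latex
\[
h(t) = \frac{(\beta/\alpha)\,(t/\alpha)^{\beta - 1}}{1 + (t/\alpha)^{\beta}},
\qquad \alpha, \beta > 0,
\]
```

which is monotonically decreasing for β ≤ 1 and unimodal for β > 1; the extra parameter is what is meant to deliver the additional increasing and bathtub shapes.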
Procedia PDF Downloads 202
637 Regionalization of IDF Curves with L-Moments for Storm Events
Authors: Noratiqah Mohd Ariff, Abdul Aziz Jemain, Mohd Aftar Abu Bakar
Abstract:
The construction of Intensity-Duration-Frequency (IDF) curves is one of the most common and useful tools for designing hydraulic structures and providing a mathematical relationship between rainfall characteristics. IDF curves, especially those in Peninsular Malaysia, are often built using moving windows of rainfall. However, these windows do not represent actual rainfall events, since the duration of rainfall is usually prefixed. Hence, instead of using moving windows, this study aims to find regionalized distributions for IDF curves of extreme rainfall based on storm events. A homogeneity test is performed on the annual maxima of storm intensities to identify homogeneous storm regions in Peninsular Malaysia. The L-moment method is then used to regionalize the Generalized Extreme Value (GEV) distribution of these annual maxima, and IDF curves are subsequently constructed using the regional distributions. The differences between the IDF curves obtained and those found using at-site GEV distributions are examined through the coefficient of variation of the root mean square error, the mean percentage difference, and the coefficient of determination. The small differences imply that the construction of IDF curves can be simplified by finding a general probability distribution for each region. This will also help in constructing IDF curves for sites with no rainfall station.
Keywords: IDF curves, L-moments, regionalization, storm events
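A sketch of the sample L-moment computations that underlie this kind of regionalization, via probability-weighted moments (the homogeneity testing and GEV fitting steps are not reproduced):

```python
import numpy as np

def sample_l_moments(x):
    """First three sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                      # L-location (mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0    # third L-moment
    return l1, l2, l3 / l2       # (l1, l2, L-skewness t3)

rng = np.random.default_rng(23)
annual_max = rng.gumbel(loc=50.0, scale=12.0, size=40)   # toy annual maxima
l1, l2, t3 = sample_l_moments(annual_max)
print(f"l1={l1:.2f}, l2={l2:.2f}, t3={t3:.3f}  (Gumbel has t3 ~ 0.1699)")
```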
Procedia PDF Downloads 528
636 Extension and Closure of a Field for Engineering Purpose
Authors: Shouji Yujiro, Memei Dukovic, Mist Yakubu
Abstract:
Fields are important objects of study in algebra since they provide a useful generalization of many number systems, such as the rational numbers, real numbers, and complex numbers. In particular, the usual rules of associativity, commutativity and distributivity hold. Fields also appear in many other areas of mathematics. When abstract algebra was first being developed, the definition of a field usually did not include commutativity of multiplication, and what we today call a field would have been called either a commutative field or a rational domain. In contemporary usage, a field is always commutative. A structure that satisfies all the properties of a field except possibly commutativity is today called a division ring, a division algebra, or sometimes a skew field; the term non-commutative field is also still widely used. In French, fields are called corps (literally, body), generally regardless of their commutativity; when necessary, a (commutative) field is called a corps commutatif and a skew field a corps gauche. The German word for body is Körper, and this word is also used to denote fields; hence the use of the blackboard bold K to denote a field. The concept of fields was first (implicitly) used to prove that there is no general formula expressing the roots of a polynomial of degree 5 or higher with rational coefficients in terms of radicals. An extension of a field k is just a field K containing k as a subfield. One distinguishes between extensions having various qualities. For example, an extension K of a field k is called algebraic if every element of K is a root of some polynomial with coefficients in k; otherwise, the extension is called transcendental. The aim of Galois theory is the study of algebraic extensions of a field. Given a field k, various kinds of closures of k may be introduced, for example the algebraic closure, the separable closure, the cyclic closure, et cetera. The idea is always the same: if P is a property of fields, then a P-closure of k is a field K containing k, having property P, and which is minimal in the sense that no proper subfield of K that contains k has property P. For example, if we take P(K) to be the property 'every non-constant polynomial f in K[t] has a root in K', then a P-closure of k is just an algebraic closure of k. In general, if P-closures exist for some property P and field k, they are all isomorphic; however, there is in general no preferred isomorphism between two closures.
Keywords: field theory, mechanic maths, supertech, rolltech
Procedia PDF Downloads 373
635 A Methodology for Characterising the Tail Behaviour of a Distribution
Authors: Serge Provost, Yishan Zang
Abstract:
Following a review of various approaches used to classify the tail behavior of a distribution, an easily implementable methodology relying on an arctangent transformation is presented. The classification criterion is based on the difference between two specific quantiles of the transformed distribution. The resulting categories enable one to classify distributional tails as distinctly short, short, nearly medium, medium, extended medium and somewhat long, provided that at least two moments exist. Distributions possessing a single moment are said to be long-tailed, while those failing to have any finite moments are classified as having an extremely long tail. Several illustrative examples are presented.
Keywords: arctangent transformation, tail classification, heavy-tailed distributions, distributional moments
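A sketch of the transformation step on synthetic draws; the quantile levels below are illustrative placeholders, not the paper's criterion, and the resulting score is only meant to show that the arctan-transformed quantile gap varies with tail weight:

```python
import numpy as np
from scipy import stats

def tail_score(dist, q_lo=0.95, q_hi=0.995, size=200_000, seed=29):
    """Difference between two upper quantiles of arctan-transformed draws.

    The quantile levels here are hypothetical choices; the paper's criterion
    uses specific quantiles of the transformed distribution not stated in
    the abstract.
    """
    x = dist.rvs(size=size, random_state=np.random.default_rng(seed))
    y = np.arctan(x)       # maps the whole real line into (-pi/2, pi/2)
    return np.quantile(y, q_hi) - np.quantile(y, q_lo)

# Heavier tails push mass toward the arctan boundary at pi/2, changing the
# gap between transformed quantiles -- the feature the classification exploits.
for name, dist in [("normal", stats.norm()),
                   ("t (df=3)", stats.t(3)),
                   ("Cauchy", stats.cauchy())]:
    print(f"{name:>9}: score = {tail_score(dist):.4f}")
```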
Procedia PDF Downloads 120