Search results for: probability distribution function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8373

8343 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage

Authors: Oh Hyeon Jeon, WooYoung Jung

Abstract:

In this study, seepage analysis driven by the water-level difference between the upstream and downstream sides of a weir structure was performed to evaluate the safety of the structure against flooding. The Monte Carlo simulation method was employed, considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir structure. The weir structure was modeled using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, accounting for the uncertainty of the corresponding permeability coefficient. Subsequently, fragility functions were constructed from the numerical responses; these fragility results can be used to identify the weakness of a weir structure subjected to flood disasters. They can also serve as reference data for comprehensively predicting the probability of failure and the degree of damage of a weir structure.

Keywords: weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo simulation, permeability coefficient

Procedia PDF Downloads 321
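
A rough sketch of the Monte Carlo fragility procedure summarized in 8343 above is given below: an uncertain permeability coefficient is sampled, a seepage response is evaluated at each water level, and the conditional failure probability is estimated. The response function `seepage_gradient`, the failure criterion, and all numerical values are hypothetical stand-ins for the ABAQUS model and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def seepage_gradient(head_diff, k):
    """Hypothetical seepage response (hydraulic gradient) as a function of the
    upstream-downstream head difference and the permeability coefficient k.
    In the study this role is played by the ABAQUS finite element model."""
    return 0.15 * head_diff * (k / 1e-5) ** 0.5

critical_gradient = 1.0                     # assumed failure criterion
water_levels = np.linspace(1.0, 6.0, 11)    # head differences in metres (assumed)

# Permeability modelled as lognormal (assumed median 1e-5 m/s, log-std 0.5)
k_samples = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=10_000)

fragility = []
for h in water_levels:
    response = seepage_gradient(h, k_samples)                 # vectorised Monte Carlo runs
    fragility.append(np.mean(response > critical_gradient))   # P(failure | water level)

for h, pf in zip(water_levels, fragility):
    print(f"head difference {h:4.1f} m -> failure probability {pf:.3f}")
```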
8342 A Simplified Distribution for Nonlinear Seas

Authors: M. A. Tayfun, M. A. Alkhalidi

Abstract:

The exact theoretical expression describing the probability distribution of nonlinear sea-surface elevations derived from the second-order narrowband model has a cumbersome form that requires numerical computation and is not well suited to theoretical or practical applications. Here, the same narrowband model is re-examined to develop a simpler closed-form approximation suitable for theoretical and practical applications. The salient features of the approximate form are explored, and its relative validity is verified through comparisons with other readily available approximations and with oceanic data.

Keywords: ocean waves, probability distributions, second-order nonlinearities, skewness coefficient, wave steepness

Procedia PDF Downloads 408
8341 Kinetic Model to Interpret Whistler Waves in Multicomponent Non-Maxwellian Space Plasmas

Authors: Warda Nasir, M. N. S. Qureshi

Abstract:

Whistler waves are right-hand circularly polarized waves that are frequently observed in space plasmas. The low-frequency branch of whistler waves, with frequencies of around 100 Hz and known as lion roars, is frequently observed in the magnetosheath. Another feature of the magnetosheath is the observation of flat-top electron distributions with single as well as two electron populations. In the past, lion roars were studied by employing a kinetic model using the classical bi-Maxwellian distribution function; however, the observations could not be justified on either quantitative or qualitative grounds. We studied whistler waves by employing a kinetic model using a non-Maxwellian distribution function, namely the generalized (r,q) distribution function, which is the generalized form of the kappa and Maxwellian distribution functions, with single or two electron populations. We compared our results with Cluster observations and found good quantitative and qualitative agreement between them. At times when lion roars are observed (or not observed) in the data and the bi-Maxwellian could not provide sufficient growth (damping) rates, we showed that when the generalized (r,q) distribution function is employed, the resulting growth (damping) rates match the observations.

Keywords: kinetic model, whistler waves, non-maxwellian distribution function, space plasmas

Procedia PDF Downloads 282
8340 Predicting the Uniaxial Strength Distribution of Brittle Materials Based on a Uniaxial Test

Authors: Benjamin Sonnenreich

Abstract:

Brittle fracture failure probability is best described using a stochastic approach based on the 'weakest link concept' and the connection between the microstructure and the macroscopic fracture scale. A general theoretical and experimental framework is presented to predict the uniaxial strength distribution from independent uniaxial test data. The framework takes as input the applied stresses, the geometry, the materials, the defect distributions, and the relevant random variables from uniaxial test results, and gives as output an overall failure probability that can be used to improve the reliability of practical designs. Additionally, the method facilitates comparisons of strength data from several sources, uniaxial tests, and sample geometries.

Keywords: brittle fracture, strength distribution, uniaxial, weakest link concept

Procedia PDF Downloads 297
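
The 'weakest link concept' mentioned in 8340 above is commonly formalised with a two-parameter Weibull strength model, in which a specimen of volume V under uniform uniaxial stress sigma fails with probability P_f = 1 - exp(-(V/V0)(sigma/sigma0)^m). The sketch below uses assumed parameter values, not anything from the paper, and shows how strength data for one specimen size rescale to another.

```python
import numpy as np

def weakest_link_pf(sigma, m, sigma0, volume, v0=1.0):
    """Weibull weakest-link failure probability under uniform uniaxial stress.
    m: Weibull modulus, sigma0: characteristic strength of the reference volume v0."""
    return 1.0 - np.exp(-(volume / v0) * (sigma / sigma0) ** m)

m, sigma0 = 10.0, 400.0                 # assumed Weibull modulus and characteristic strength (MPa)
sigma = np.linspace(200.0, 600.0, 9)

pf_small = weakest_link_pf(sigma, m, sigma0, volume=1.0)    # reference specimen
pf_large = weakest_link_pf(sigma, m, sigma0, volume=10.0)   # 10x larger effective volume

for s, a, b in zip(sigma, pf_small, pf_large):
    print(f"sigma = {s:5.0f} MPa: Pf(V=1) = {a:.3f}, Pf(V=10) = {b:.3f}")
```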
8339 Comparison of Wind Fragility for Window System in the Simplified 10 and 15-Story Building Considering Exposure Category

Authors: Viriyavudh Sim, WooYoung Jung

Abstract:

Window systems in high-rise buildings are occasionally subjected to excessive wind intensity, particularly during typhoons. The failure of a window system does not affect the overall safety of the structural performance; however, it can endanger the safety of the residents. In this paper, fragility curves for the window systems of two residential buildings were compared. The probability of failure for each individual window was determined with the Monte Carlo simulation method. Then, a lognormal cumulative distribution function was used to represent the fragility. The results showed that windows located on the edge of the leeward wall were more susceptible to wind load and that the probability of failure for each window panel increased at higher floors.

Keywords: wind fragility, window system, high rise building, wind disaster

Procedia PDF Downloads 292
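
A minimal version of the workflow in 8339 above, Monte Carlo failure probabilities per wind speed followed by a lognormal CDF fit, might look like the following. The capacity and demand models are invented placeholders, not the paper's window-panel model.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

wind_speeds = np.arange(20, 71, 5, dtype=float)     # m/s (assumed range)
capacity = rng.normal(2.4, 0.4, size=20_000)        # kPa, assumed panel resistance

def demand(v):
    """Hypothetical wind pressure demand on a panel (kPa) for wind speed v (m/s)."""
    return 0.5 * 1.225 * v**2 * 1e-3 * 1.2          # 1/2*rho*v^2*Cp, converted to kPa

# Monte Carlo estimate of P(failure | wind speed)
pf = np.array([np.mean(demand(v) > capacity) for v in wind_speeds])

# Fit a lognormal CDF, Pf(v) = Phi(ln(v/median)/beta), to the Monte Carlo points
def lognormal_cdf(v, median, beta):
    return norm.cdf(np.log(v / median) / beta)

(median, beta), _ = curve_fit(lognormal_cdf, wind_speeds, pf, p0=[50.0, 0.2])
print(f"fragility median = {median:.1f} m/s, dispersion beta = {beta:.3f}")
```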
8338 The Normal-Generalized Hyperbolic Secant Distribution: Properties and Applications

Authors: Hazem M. Al-Mofleh

Abstract:

In this paper, a new four-parameter univariate continuous distribution called the Normal-Generalized Hyperbolic Secant Distribution (NGHS) is defined and studied. Some general and structural distributional properties are investigated and discussed, including central and non-central n-th moments and incomplete moments, quantile and generating functions, the hazard function, Rényi and Shannon entropies, shapes (skewed right, skewed left, and symmetric), modality regions (unimodal and bimodal), and maximum likelihood (ML) estimators for the parameters. Finally, two real data sets are used to demonstrate empirically the flexibility and strength of the new distribution.

Keywords: bimodality, estimation, hazard function, moments, Shannon’s entropy

Procedia PDF Downloads 310
8337 New Variational Approach for Contrast Enhancement of Color Image

Authors: Wanhyun Cho, Seongchae Seo, Soonja Kang

Abstract:

In this work, we propose a variational technique for image contrast enhancement that utilizes global and local information around each pixel. The energy functional is defined by a weighted linear combination of three terms: a local contrast term, a global contrast term, and a dispersion term. The local contrast term improves the contrast of the input image by increasing the grey-level differences between each pixel and its neighbors, thus exploiting contextual information around each pixel. The global contrast term enhances the contrast of the image by minimizing the difference between its empirical distribution function and a cumulative distribution function, so that the probability distribution of pixel values becomes symmetric about the median. The dispersion term controls the departure of the new pixel values from the pixel values of the original image while preserving the original image characteristics as far as possible. We then derive the Euler-Lagrange equation for the true image that achieves the minimum of the proposed functional by using the fundamental lemma of the calculus of variations, and we solve this equation with a gradient descent method, one of the dynamic approximation techniques. Finally, through various experiments, we demonstrate that the proposed method can enhance the contrast of colour images better than existing techniques.

Keywords: color image, contrast enhancement technique, variational approach, Euler-Lagrange equation, dynamic approximation method, EME measure

Procedia PDF Downloads 421
8336 A Statistical Model for the Dynamics of Single Cathode Spot in Vacuum Cylindrical Cathode

Authors: Po-Wen Chen, Jin-Yu Wu, Md. Manirul Ali, Yang Peng, Chen-Te Chang, Der-Jun Jan

Abstract:

The dynamics of cathode spots has become a major topic in vacuum arc discharge, with high academic interest and wide application potential. In this article, using a three-dimensional statistical model, we simulate the distribution of the ignition probability of a new cathode spot occurring under different magnetic pressures on the old cathode spot surface and at different arcing times. The model for the ignition probability of a new cathode spot is considered in two typical situations: one with pure isotropic random walk in the absence of an external magnetic field, the other with retrograde motion in an external magnetic field parallel to the cathode surface. We mainly focus on the relationship between the ignition probability density distribution of a new cathode spot and the external magnetic field.

Keywords: cathode spot, vacuum arc discharge, transverse magnetic field, random walk

Procedia PDF Downloads 408
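
The two situations described in 8336 above, an isotropic random walk without a magnetic field and a retrograde drift with a transverse field, can be mimicked with a simple two-dimensional Monte Carlo, as in the sketch below. The step statistics and drift magnitude are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def spot_positions(n_spots, n_steps, step=1.0, drift=0.0):
    """Simulate the displacement of n_spots cathode spots after n_steps jumps.
    drift > 0 adds a constant bias along -x, mimicking retrograde motion
    in an external transverse magnetic field (purely illustrative)."""
    angles = rng.uniform(0.0, 2.0 * np.pi, size=(n_spots, n_steps))
    dx = step * np.cos(angles) - drift          # biased jumps along -x
    dy = step * np.sin(angles)
    return dx.sum(axis=1), dy.sum(axis=1)

# Ignition-probability (position) histograms with and without the field
x0, y0 = spot_positions(20_000, 100, drift=0.0)     # isotropic random walk
x1, y1 = spot_positions(20_000, 100, drift=0.2)     # retrograde drift

hist_iso, xedges, yedges = np.histogram2d(x0, y0, bins=60, density=True)
hist_ret, _, _ = np.histogram2d(x1, y1, bins=[xedges, yedges], density=True)
print("mean displacement without field:", x0.mean(), "with field:", x1.mean())
```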
8335 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia that compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L-moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in flood quantiles for the annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.

Keywords: floods, FLIKE, probability distributions, flood frequency, outlier

Procedia PDF Downloads 411
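
For readers unfamiliar with the outlier screening compared in 8335 above, the sketch below applies a simplified single Grubbs-Beck low-outlier test to log-transformed annual peak flows, using what is commonly quoted as the Bulletin 17B 10%-significance critical-value approximation; the multiple Grubbs-Beck test used in FLIKE sweeps this idea over increasing numbers of candidate low flows. The flow series is invented and the critical-value formula should be treated as an assumption to verify against the guidelines.

```python
import numpy as np

def grubbs_beck_low_threshold(flows):
    """Simplified single Grubbs-Beck low-outlier threshold for annual peak flows.
    Uses the 10%-significance critical value approximation
    K_N = -0.9043 + 3.345*sqrt(log10(N)) - 0.4046*log10(N)."""
    x = np.log10(np.asarray(flows, dtype=float))
    n = x.size
    k = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    return 10.0 ** (x.mean() - k * x.std(ddof=1))

# Invented annual maximum flow series (m^3/s) with a few unusually low years
flows = [850, 920, 40, 1300, 760, 25, 990, 1100, 680, 870,
         1500, 720, 60, 940, 1020, 810, 1250, 900, 770, 1080]
threshold = grubbs_beck_low_threshold(flows)
low_flows = [q for q in flows if q < threshold]
print(f"low-outlier threshold = {threshold:.1f} m^3/s, flagged flows: {low_flows}")
```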
8334 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test

Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea

Abstract:

In this research, we conduct a Monte Carlo analysis on a two-dimensional χ2 test, which is used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact of transforming initial data sets from any probability distribution to new signals with a uniform distribution using the Spearman rank correlation on the χ2 test. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ2 test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.

Keywords: chaotic signals, logistic map, Pearson’s test, Chi Square test, bivariate distribution, statistical independence

Procedia PDF Downloads 57
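
The two-dimensional chi-square independence test at the heart of 8334 above can be set up, for a pair of lagged samples of a signal, roughly as follows; the logistic-map data and bin counts are illustrative choices, not those of the paper.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(3)

# Illustrative chaotic signal: logistic map in the fully chaotic regime
x = np.empty(20_000)
x[0] = rng.uniform(0.01, 0.99)
for i in range(1, x.size):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

def chi2_independence(signal, lag, bins=10):
    """Bin the pairs (x_t, x_{t+lag}) into a contingency table and apply
    the chi-square test of independence."""
    a, b = signal[:-lag], signal[lag:]
    table, _, _ = np.histogram2d(a, b, bins=bins)
    stat, p_value, dof, _ = chi2_contingency(table)
    return stat, p_value

# Small lags show strong dependence (tiny p-values); larger lags approach independence
for lag in (1, 5, 20):
    stat, p = chi2_independence(x, lag)
    print(f"lag {lag:2d}: chi2 = {stat:8.1f}, p = {p:.3g}")
```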
8333 The Modality of Multivariate Skew Normal Mixture

Authors: Bader Alruwaili, Surajit Ray

Abstract:

Finite mixtures are a flexible and powerful tool that can be used for univariate and multivariate distributions, and a wide range of research has been conducted on the multivariate normal mixture and the multivariate t-mixture. Determining the number of modes is an important activity that, in turn, allows one to determine the number of homogeneous groups in a population. Our current work relates to the study of the modality of the skew normal distribution in the univariate and multivariate cases. For the skew normal distribution, the aims are to study its modality and to provide the ridgeline, the ridgeline elevation function, the $\Pi$ function, and the curvature function; this is conducive to exploring the number and location of modes when mixing two components of the skew normal distribution. The subsequent objective is to apply these results to real-world data sets, such as flow cytometry data.

Keywords: mode, modality, multivariate skew normal, finite mixture, number of modes

Procedia PDF Downloads 461
8332 A Multi-Objective Programming Model to Supplier Selection and Order Allocation Problem in Stochastic Environment

Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh

Abstract:

This paper aims at developing a multi-objective model for the supplier selection and order allocation problem in a stochastic environment, where the purchasing cost, the percentage of items delivered late, and the percentage of rejected items provided by each supplier are supposed to be stochastic parameters following any arbitrary probability distribution. In this regard, dependent chance programming is used, which maximizes the probability of the event that the total purchasing cost, total late deliveries, and total rejected items are less than or equal to pre-determined values given by the decision maker. The above stochastic multi-objective programming problem is then transformed into a stochastic single-objective programming problem using the minimum deviation method. In the next step, the resulting problem is solved by applying a genetic algorithm, which performs a simulation process in order to calculate the stochastic objective function as its fitness function. Finally, the impact of the stochastic parameters on the obtained solution is examined via a sensitivity analysis exploiting the coefficient of variation. The results show that the greater the coefficients of variation of the stochastic parameters, the worse the value of the objective function in the stochastic single-objective programming problem.

Keywords: supplier selection, order allocation, dependent chance programming, genetic algorithm

Procedia PDF Downloads 287
8331 The Linear Combination of Kernels in the Estimation of the Cumulative Distribution Functions

Authors: Abdel-Razzaq Mugdadi, Ruqayyah Sani

Abstract:

The Kernel Distribution Function Estimator (KDFE) method is the most popular method for nonparametric estimation of the cumulative distribution function. The kernel and the bandwidth are the most important components of this estimator. In this investigation, we replace the kernel in the KDFE with a linear combination of kernels to obtain a new estimator. The mean integrated squared error (MISE), the asymptotic mean integrated squared error (AMISE), and the asymptotically optimal bandwidth for the new estimator are derived. We propose a new data-based method to select the bandwidth for the new estimator; the technique is based on the plug-in technique used in density estimation. We evaluate the new estimator and the new technique using simulations and real-life data.

Keywords: estimation, bandwidth, mean square error, cumulative distribution function

Procedia PDF Downloads 544
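
A bare-bones version of a kernel distribution function estimator whose kernel is a linear combination of two kernels (here Gaussian and Epanechnikov, with an arbitrarily chosen mixing weight) is sketched below; the MISE-optimal combination and the plug-in bandwidth of 8331 are not reproduced.

```python
import numpy as np
from scipy.stats import norm

def epanechnikov_cdf(u):
    """Integrated (cumulative) Epanechnikov kernel."""
    u = np.clip(u, -1.0, 1.0)
    return 0.5 + 0.75 * u - 0.25 * u ** 3

def kdfe(x, data, h, lam=0.5):
    """Kernel distribution function estimate at points x using the kernel
    lam * Gaussian + (1 - lam) * Epanechnikov (lam is an arbitrary weight here)."""
    u = (np.asarray(x)[:, None] - np.asarray(data)[None, :]) / h
    w = lam * norm.cdf(u) + (1.0 - lam) * epanechnikov_cdf(u)
    return w.mean(axis=1)

rng = np.random.default_rng(4)
sample = rng.normal(0.0, 1.0, size=300)
grid = np.linspace(-3, 3, 7)
print(np.round(kdfe(grid, sample, h=0.4), 3))   # compare with norm.cdf(grid)
```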
8330 Exact Solutions for Steady Response of Nonlinear Systems under Non-White Excitation

Authors: Yaping Zhao

Abstract:

In the present study, the exact solutions for the steady response of quasi-linear systems under non-white wide-band random excitation are considered by means of the stochastic averaging method. The nonlinearity of the systems contains power-law damping and the cross-product term of the power-law damping and displacement. The drift and diffusion coefficients of the Fokker-Planck-Kolmogorov (FPK) equation after averaging are obtained by a succinct approach. After solving the averaged FPK equation, the joint probability density function and the marginal probability density function in the steady state are attained. In the solution process, the eigenvalue problem of the ordinary differential equation is handled by the integral equation method. Some new results are obtained, and a novel method for dealing with problems in nonlinear random vibration is proposed.

Keywords: random vibration, stochastic averaging method, FPK equation, transition probability density

Procedia PDF Downloads 473
8329 Base Change for Fisher Metrics: Case of the q-Gaussian Inverse Distribution

Authors: Gabriel I. Loaiza Ossa, Carlos A. Cadavid Moreno, Juan C. Arango Parra

Abstract:

It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has negative constant curvature κ= -1/2, as does the family of usual Gaussian distributions. In the present paper, firstly, we arrive at this result by following a different path, much simpler than the previous ones. We first put the family in exponential form, thus endowing the family with a new set of parameters, or coordinates, θ₁, θ₂; then we determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the inverse q-Gaussian distribution family (q < 3) as the family obtained by replacing the usual exponential function with the Tsallis q-exponential function in the expression for the inverse Gaussian distribution and observe that it supports two possible geometries, the Fisher and the q-Fisher geometry. And finally, we apply our strategy to obtain results about the Fisher and q-Fisher geometry of the inverse q-Gaussian distribution family, similar to the ones obtained in the case of the inverse Gaussian distribution family.

Keywords: base of changes, information geometry, inverse Gaussian distribution, inverse q-Gaussian distribution, statistical manifolds

Procedia PDF Downloads 213
8328 Supplier Selection and Order Allocation Using a Stochastic Multi-Objective Programming Model and Genetic Algorithm

Authors: Rouhallah Bagheri, Morteza Mahmoudi, Hadi Moheb-Alizadeh

Abstract:

In this paper, we develop a multi-objective supplier selection and order allocation model in a stochastic environment in which the purchasing cost, the percentage of items delivered late, and the percentage of rejected items provided by each supplier are supposed to be stochastic parameters following any arbitrary probability distribution. To do so, we use dependent chance programming (DCP), which maximizes the probability of the event that the total purchasing cost, total late deliveries, and total rejected items are less than or equal to pre-determined values given by the decision maker. After transforming the above-mentioned stochastic multi-objective programming problem into a stochastic single-objective problem using the minimum deviation method, we apply a genetic algorithm to solve the latter single-objective problem. The employed genetic algorithm performs a simulation process in order to calculate the stochastic objective function as its fitness function. In the end, we explore the impact of the stochastic parameters on the given solution via a sensitivity analysis exploiting the coefficient of variation. The results show that as the stochastic parameters have greater coefficients of variation, the value of the objective function in the stochastic single-objective programming problem worsens.

Keywords: dependent chance programming, genetic algorithm, minimum deviation method, order allocation, supplier selection

Procedia PDF Downloads 224
8327 Gaussian Probability Density for Forest Fire Detection Using Satellite Imagery

Authors: S. Benkraouda, Z. Djelloul-Khedda, B. Yagoubi

Abstract:

We present a method for the early detection of forest fires from a thermal infrared satellite image, using the image matrix of the probability of belonging. The principle of the method is to compare a theoretical mathematical model with an experimental model. We consider each line of the image matrix as a realization of a non-stationary random process. Since the distribution of pixels in the satellite image is statistically dependent, we divide these lines into small stationary and ergodic intervals in order to characterize the image by an adequate mathematical model. A standard deviation was chosen to generate random variables, so that each interval behaves naturally like white Gaussian noise. The latter was selected as the mathematical model representing the large majority of pixels, which can be considered the image background. Before modeling the image, a few pre-treatments were applied; then the parameters of the theoretical Gaussian model were extracted from the modeled image. These parameters are used to calculate the probability that each interval of the modeled image belongs to the theoretical Gaussian model. High-intensity pixels are regarded as foreign elements to this model, so they have a low probability, while pixels that belong to the image background have a high probability. Finally, we present the inverse of the matrix of probabilities of these intervals for better fire detection.

Keywords: forest fire, forest fire detection, satellite image, normal distribution, theoretical gaussian model, thermal infrared matrix image

Procedia PDF Downloads 113
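
The per-interval probability computation described in 8327 above can be imitated on synthetic data as follows: a background Gaussian model is fitted robustly, each interval of a row is scored by the probability of its mean under that model, and low-probability intervals are flagged as candidate hot spots. All numbers are illustrative; no satellite data are used.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Synthetic thermal-infrared row: Gaussian background plus a small hot segment
row = rng.normal(300.0, 2.0, size=512)           # background brightness temperature (K)
row[340:360] += 25.0                             # injected "fire" pixels

interval_len = 16
intervals = row.reshape(-1, interval_len)        # stationary sub-intervals of the row

# Robust background fit (median and MAD-based standard deviation)
mu = np.median(row)
sigma = 1.4826 * np.median(np.abs(row - mu))

# Probability that each interval mean belongs to the background model
z = (intervals.mean(axis=1) - mu) / (sigma / np.sqrt(interval_len))
p_belong = 2.0 * (1.0 - norm.cdf(np.abs(z)))

suspect = np.where(p_belong < 1e-3)[0]
print("start pixels of intervals flagged as possible fire:", suspect * interval_len)
```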
8326 Informality, Trade Facilitation, and Trade: Evidence from Guinea-Bissau

Authors: Julio Vicente Cateia

Abstract:

This paper aims to assess the role of informality and trade facilitation in the export probability of Guinea-Bissau. We include informality in the Fréchet function, which gives the expression for the country's supply probability. We find that Guinea-Bissau is about 7.2% less likely to export due to a 1% increase in informality. The export probability increases by about 1.7%, 4%, and 1.1% due to a 1% increase in trade facilitation, R&D stock, and years of education, respectively. These results are significant at the usual levels. We suggest a development agenda aimed at reducing the level of informality in this country.

Keywords: development, trade, informality, trade facilitation, economy of Guinea-Bissau

Procedia PDF Downloads 132
8325 Influence of Maximum Fatigue Load on Probabilistic Aspect of Fatigue Crack Propagation Life at Specified Grown Crack in Magnesium Alloys

Authors: Seon Soon Choi

Abstract:

The principal purpose of this paper is to determine the influence of the maximum fatigue load on the probabilistic aspect of the fatigue crack propagation life at a specified grown crack in magnesium alloys. Fatigue crack propagation experiments were carried out in laboratory air under different maximum fatigue load conditions to obtain fatigue crack propagation data for the statistical analysis. In order to analyze the probabilistic aspect of the fatigue crack propagation life, a goodness-of-fit test for the probability distribution of the fatigue crack propagation life at a specified grown crack was implemented through the Anderson-Darling test. The best-fitting probability distribution of the fatigue crack propagation life was also verified under the different maximum fatigue load conditions.

Keywords: fatigue crack propagation life, magnesium alloys, maximum fatigue load, probability

Procedia PDF Downloads 358
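
The goodness-of-fit step mentioned in 8325 above can be reproduced in outline with scipy's Anderson-Darling routine; here a lognormal hypothesis is checked by testing the log of simulated (not experimental) crack propagation lives against normality. The distribution choice and sample values are assumptions for illustration.

```python
import numpy as np
from scipy.stats import anderson

rng = np.random.default_rng(6)

# Simulated fatigue crack propagation lives (cycles) at a specified crack length,
# drawn from a lognormal distribution purely for illustration
lives = rng.lognormal(mean=np.log(2.0e5), sigma=0.15, size=40)

result = anderson(np.log(lives), dist="norm")    # lognormal check via log-transform
print(f"A^2 statistic = {result.statistic:.3f}")
for crit, sig in zip(result.critical_values, result.significance_level):
    verdict = "reject" if result.statistic > crit else "do not reject"
    print(f"  {sig:4.1f}% level: critical value {crit:.3f} -> {verdict} lognormality")
```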
8324 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited to realistic situations (small contact tracing probability, or small probability of detection of index cases). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 49
8323 The Unsteady Non-Equilibrium Distribution Function and Exact Equilibrium Time for a Dilute Gas Affected by Thermal Radiation Field

Authors: Taha Zakaraia Abdel Wahid

Abstract:

The behavior of the unsteady non-equilibrium distribution function for a dilute gas under the effect of a non-linear thermal radiation field is presented. To the best of our knowledge, this is done here for the first time. The distinction and comparisons between the unsteady perturbed and the unsteady equilibrium velocity distribution functions are illustrated. The equilibrium time for the dilute gas is determined for the first time. The non-equilibrium thermodynamic properties of the system (the gas and the heated plate) are investigated. The results are applied to argon gas for various values of the radiation field intensity. 3D graphics illustrating the calculated variables are drawn to predict their behavior, and the results are discussed.

Keywords: dilute gas, radiation field, exact solutions, travelling wave method, unsteady BGK model, irreversible thermodynamics, unsteady non-equilibrium distribution functions

Procedia PDF Downloads 471
8322 Frequency Analysis Using Multiple Parameter Probability Distributions for Rainfall to Determine Suitable Probability Distribution in Pakistan

Authors: Tasir Khan, Yejuan Wang

Abstract:

The study of extreme rainfall events is very important for flood management in river basins and the design of water conservancy infrastructure. Evaluation of the quantiles of annual maximum rainfall (AMRF) is required in different environmental fields, agricultural operations, renewable energy sources, climatology, and the design of different structures. Therefore, AMRF analysis was performed at different stations in Pakistan. Multiple probability distributions, log-normal (LN), generalized extreme value (GEV), Gumbel (max), and Pearson type 3 (P3), were used to find the most appropriate distributions at different stations. The L-moments method was used to evaluate the distribution parameters. The Anderson-Darling test, the Kolmogorov-Smirnov test, and the chi-square test showed that two distributions, namely Gumbel (max) and LN, were the most appropriate. The quantile estimate of a multi-parameter probability distribution gives the extreme rainfall at a specific location and is therefore important for decision-makers and planners who design and construct different structures. This result provides an indication of the consequences of these multi-parameter distributions for the study sites, peak flow prediction, and the design of hydrological maps. Therefore, this finding can support hydraulic structure design and flood management.

Keywords: RAMSE, multiple frequency analysis, annual maximum rainfall, L-moments

Procedia PDF Downloads 53
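
A compressed version of the distribution-comparison step in 8322 above, fitting several candidate models to annual maximum rainfall and screening them with the Kolmogorov-Smirnov test, is sketched below. The series is synthetic and the fits use scipy's default maximum likelihood rather than L-moments, so this only mirrors the shape of the analysis, not its results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic annual maximum rainfall series (mm), standing in for a station record
amrf = stats.gumbel_r.rvs(loc=80.0, scale=25.0, size=60, random_state=rng)

candidates = {
    "Gumbel (max)": stats.gumbel_r,
    "GEV": stats.genextreme,
    "log-normal": stats.lognorm,
    "Pearson type 3": stats.pearson3,
}

for name, dist in candidates.items():
    params = dist.fit(amrf)                        # MLE here; the paper uses L-moments
    ks_stat, p_value = stats.kstest(amrf, dist.cdf, args=params)
    q100 = dist.ppf(0.99, *params)                 # 1-in-100 AEP quantile
    print(f"{name:15s} KS p = {p_value:.3f}, 1% AEP quantile = {q100:6.1f} mm")
```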
8321 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme data in an observation can occur due to unusual circumstances during the observation. Such data can provide important information that cannot be provided by other data, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method. The distribution of extreme data sets taken with the block maxima method is called the extreme value distribution; here it is the Gumbel distribution with two parameters. The exact parameter estimates of the Gumbel distribution with the maximum likelihood (ML) method are difficult to determine, so an approximate approach is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear function optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has the form of a double exponential function. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the parameter value changes in each iteration; it is then modified with the addition of a step length to provide a guarantee of convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by updating both derivatives in each iteration. The parameter estimation of the Gumbel distribution by a numerical approach using the quasi-Newton BFGS method is done by calculating the parameter values that maximize the likelihood function. In this method, we need the gradient vector and the Hessian matrix. This research is both theoretical and applied, based on the study of several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and estimates of the Gumbel distribution parameters. The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The results indicate that the high rainfall that occurred in Purworejo District decreased in intensity and that the range of rainfall also decreased.

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 297
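
A minimal numerical realisation of the estimation described in 8321 above, minimising the negative Gumbel log-likelihood with the BFGS quasi-Newton routine, is shown below on simulated data; the gradient is left to scipy's numerical differencing rather than the analytic gradient vector and Hessian discussed in the abstract.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

rng = np.random.default_rng(8)
data = gumbel_r.rvs(loc=50.0, scale=12.0, size=200, random_state=rng)  # simulated block maxima

def neg_log_likelihood(theta):
    mu, log_beta = theta
    beta = np.exp(log_beta)                       # keep the scale positive
    z = (data - mu) / beta
    return np.sum(np.log(beta) + z + np.exp(-z))  # -log L for the Gumbel (max) distribution

res = minimize(neg_log_likelihood, x0=[data.mean(), np.log(data.std())], method="BFGS")
mu_hat, beta_hat = res.x[0], np.exp(res.x[1])
print(f"BFGS estimates: mu = {mu_hat:.2f}, beta = {beta_hat:.2f}")
print("scipy's built-in Gumbel fit for comparison:", gumbel_r.fit(data))
```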
8320 Survival and Hazard Maximum Likelihood Estimator with Covariate Based on Right Censored Data of Weibull Distribution

Authors: Al Omari Mohammed Ahmed

Abstract:

This paper focuses on the maximum likelihood estimator with covariates. Covariates are incorporated into the Weibull model. Under this regression model, the maximum likelihood estimates of the covariate parameters, the shape parameter, the survival function, and the hazard rate of the Weibull regression distribution with right-censored data are obtained. The mean square error (MSE) and absolute bias are used to compare the performance of the Weibull regression distribution. For the simulation comparison, the study used various sample sizes and several specific values of the Weibull shape parameter.

Keywords: weibull regression distribution, maximum likelihood estimator, survival function, hazard rate, right censoring

Procedia PDF Downloads 413
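
The sketch below writes down a right-censored Weibull regression log-likelihood with a single covariate acting on the scale parameter through a log link and maximises it numerically. The covariate structure, the simulated data, and the link choice are assumptions meant only to illustrate the kind of estimator 8320 describes.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(9)
n = 400
x = rng.normal(size=n)                                   # single covariate (assumed)
shape_true, b0_true, b1_true = 1.8, 1.0, 0.5
scale = np.exp(b0_true + b1_true * x)                    # log-link on the Weibull scale
t = weibull_min.rvs(shape_true, scale=scale, random_state=rng)
c = rng.exponential(scale.mean() * 1.5, size=n)          # censoring times
time, event = np.minimum(t, c), (t <= c).astype(float)   # right-censored observations

def neg_log_lik(theta):
    log_k, b0, b1 = theta
    k = np.exp(log_k)
    lam = np.exp(b0 + b1 * x)
    # events contribute log f(t); censored observations contribute log S(t)
    ll = event * weibull_min.logpdf(time, k, scale=lam) \
         + (1.0 - event) * weibull_min.logsf(time, k, scale=lam)
    return -np.sum(ll)

res = minimize(neg_log_lik, x0=[0.0, 0.0, 0.0], method="BFGS")
print("shape, b0, b1 estimates:", np.exp(res.x[0]), res.x[1], res.x[2])
```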
8319 Quantum Mechanics Approach for Ruin Probability

Authors: Ahmet Kaya

Abstract:

Incoming cash flows and outgoing claims play an important role in determining a company's profit or loss. In this context, the ruin probability describes the vulnerability of a company to ruin. Quantum mechanics is one of the significant approaches to modelling the ruin probability stochastically. Using the Hamiltonian method, we formalise the quantum-mechanical kernel ⟨x|e⁻ᵗᴴ|x'⟩ and obtain the transition probability for 2x2 and 3x3 matrices in both the traditional and the eigenvector basis, where A is a ruin operator and H|x'⟩ satisfies a Schrödinger equation. The operator A and the Schrödinger equation are defined by a Hamiltonian matrix H. As a result, the probability of not being in ruin can be simulated and calculated stochastically.

Keywords: ruin probability, quantum mechanics, Hamiltonian technique, operator approach

Procedia PDF Downloads 304
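
For a finite-state toy version of the construction in 8319 above, the propagator e^(-tH) can be evaluated directly with a matrix exponential and cross-checked in the eigenvector basis; the 2x2 Hamiltonian below is an arbitrary symmetric example, not the operator used in the paper.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative 2x2 Hamiltonian (arbitrary symmetric matrix, chosen for the example)
H = np.array([[1.0, -0.4],
              [-0.4, 0.7]])

t = 2.0
U = expm(-t * H)                          # kernel <x| e^{-tH} |x'> as a matrix

# Same object via the eigenvector basis: H = V diag(w) V^T
w, V = np.linalg.eigh(H)
U_eig = V @ np.diag(np.exp(-t * w)) @ V.T

print("matrix-exponential propagator:\n", np.round(U, 6))
print("eigenbasis reconstruction matches:", np.allclose(U, U_eig))
```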
8318 Parameter Estimation for the Mixture of Generalized Gamma Model

Authors: Wikanda Phaphan

Abstract:

The mixture generalized gamma distribution is a combination of two distributions: the generalized gamma distribution and the length-biased generalized gamma distribution. These two distributions were presented by Suksaengrakcharoen and Bodhisuwan in 2014. Their findings showed that the probability density function (pdf) is fairly complex, which creates problems in estimating the parameters. The problem encountered in parameter estimation was that the estimators could not be calculated in closed form, so numerical estimation is used to find them. In this study, we present a new method of parameter estimation using the expectation-maximization (EM) algorithm, the conjugate gradient method, and the quasi-Newton method. The data were generated by the acceptance-rejection method and used for estimating α, β, λ, and p, where λ is the scale parameter, p is the weight parameter, and α and β are the shape parameters. We use the Monte Carlo technique to assess the estimators' performance. Sample sizes of 10, 30, and 100 were considered, and the simulations were repeated 20 times in each case. We evaluated the effectiveness of the estimators by considering the mean squared errors and the bias. The findings revealed that the EM-algorithm estimates were close to the actual values. Also, the maximum likelihood estimators obtained via the conjugate gradient and quasi-Newton methods were less precise than those obtained via the EM-algorithm.

Keywords: conjugate gradient method, quasi-Newton method, EM-algorithm, generalized gamma distribution, length biased generalized gamma distribution, maximum likelihood method

Procedia PDF Downloads 198
8317 Daily Probability Model of Storm Events in Peninsular Malaysia

Authors: Mohd Aftar Abu Bakar, Noratiqah Mohd Ariff, Abdul Aziz Jemain

Abstract:

Storm Event Analysis (SEA) provides a method to define rainfall events as storms, where each storm has its own amount and duration. By modelling the daily probability of different types of storms, the onset, offset, and cycle of rainfall seasons can be determined and investigated. Furthermore, researchers in the field of meteorology will be able to study the dynamical characteristics of rainfall and make predictions for future reference. In this study, four categories of storms, short, intermediate, long, and very long, are introduced based on the length of the storm duration. Daily probability models of storms are built for these four categories in Peninsular Malaysia. The models are constructed by using the Bernoulli distribution and by applying linear regression on the first Fourier harmonic equation. From the models obtained, it is found that the daily probability of storms in the eastern part of Peninsular Malaysia shows a unimodal pattern, with a high probability of rain beginning at the end of the year and lasting until early the next year. This is very likely due to the Northeast monsoon season, which occurs from November to March every year. Meanwhile, short and intermediate storms in other regions of Peninsular Malaysia experience a bimodal cycle due to the two inter-monsoon seasons. Overall, these models indicate that Peninsular Malaysia can be divided into four distinct regions based on the daily pattern of the probability of various storm events.

Keywords: daily probability model, monsoon seasons, regions, storm events

Procedia PDF Downloads 319
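
One way to realise a "Bernoulli distribution with linear regression on the first Fourier harmonic", as described in 8317 above, is to regress the empirical daily storm occurrence probability on sine and cosine terms of the day of year; the synthetic occurrence data below simply mimic a monsoon-like seasonal cycle and are not the study's records.

```python
import numpy as np

rng = np.random.default_rng(10)
days = np.arange(1, 366)

# Synthetic daily storm indicator over 30 years with a seasonal (monsoon-like) cycle
true_p = 0.3 + 0.2 * np.cos(2 * np.pi * (days - 330) / 365)
occurrence = rng.binomial(1, true_p, size=(30, days.size))
p_hat = occurrence.mean(axis=0)                      # empirical daily storm probability

# Linear regression on the first Fourier harmonic: p(d) = a0 + a1*cos + b1*sin
X = np.column_stack([np.ones_like(days, dtype=float),
                     np.cos(2 * np.pi * days / 365),
                     np.sin(2 * np.pi * days / 365)])
coef, *_ = np.linalg.lstsq(X, p_hat, rcond=None)
p_model = X @ coef

peak_day = days[np.argmax(p_model)]
print("fitted coefficients (a0, a1, b1):", np.round(coef, 3))
print("day of year with highest modelled storm probability:", peak_day)
```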
8316 Assessing Effects of an Intervention on Bottle-Weaning and Reducing Daily Milk Intake from Bottles in Toddlers Using Two-Part Random Effects Models

Authors: Yungtai Lo

Abstract:

Two-part random effects models have been used to fit semi-continuous longitudinal data where the response variable has a point mass at 0 and a continuous right-skewed distribution for positive values. We review methods proposed in the literature for analyzing data with excess zeros. A two-part logit-log-normal random effects model, a two-part logit-truncated normal random effects model, a two-part logit-gamma random effects model, and a two-part logit-skew normal random effects model were used to examine effects of a bottle-weaning intervention on reducing bottle use and daily milk intake from bottles in toddlers aged 11 to 13 months in a randomized controlled trial. We show in all four two-part models that the intervention promoted bottle-weaning and reduced daily milk intake from bottles in toddlers drinking from a bottle. We also show that there are no differences in model fit using either the logit link function or the probit link function for modeling the probability of bottle-weaning in all four models. Furthermore, prediction accuracy of the logit or probit link function is not sensitive to the distribution assumption on daily milk intake from bottles in toddlers not off bottles.

Keywords: two-part model, semi-continuous variable, truncated normal, gamma regression, skew normal, Pearson residual, receiver operating characteristic curve

Procedia PDF Downloads 324
8315 Modeling Binomial Dependent Distribution of the Values: Synthesis Tables of Probabilities of Errors of the First and Second Kind of Biometrics-Neural Network Authentication System

Authors: B. S.Akhmetov, S. T. Akhmetova, D. N. Nadeyev, V. Yu. Yegorov, V. V. Smogoonov

Abstract:

We estimate the probabilities of errors of the first and second kind for non-ideal biometrics-neural transducers with 256 outputs and construct nomograms of the 'own' and 'alien' error probabilities based on the mathematical expectation and standard deviation of the normalized Hamming measures.

Keywords: modeling, errors, probability, biometrics, neural network, authentication

Procedia PDF Downloads 462
8314 Analytical Downlink Effective SINR Evaluation in LTE Networks

Authors: Marwane Ben Hcine, Ridha Bouallegue

Abstract:

The aim of this work is to provide an original analytical framework for downlink effective SINR evaluation in LTE networks. The classical single-carrier SINR performance evaluation is extended to multi-carrier systems operating over frequency-selective channels. The extension is achieved by expressing the link outage probability in terms of the statistics of the effective SINR. For effective SINR computation, the exponential effective SINR mapping (EESM) method is used in this work. A closed-form expression for the link outage probability is obtained by assuming a log skew normal approximation for the single-carrier case. We then rely on the lognormal approximation to express the exponential effective SINR distribution as a function of the mean and standard deviation of the SINR of a generic subcarrier. The achieved formulas are easily computable and can be obtained for a user equipment (UE) located at any distance from its serving eNodeB. Simulations show that the proposed framework provides results with accuracy within 0.5 dB.

Keywords: LTE, OFDMA, effective SINR, log skew normal approximation

Procedia PDF Downloads 332
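
The EESM step referred to in 8314 above compresses the per-subcarrier SINRs gamma_k into a single effective value, SINR_eff = -beta * ln((1/N) * sum_k exp(-gamma_k / beta)). The sketch below evaluates this mapping for a randomly generated frequency-selective channel; the beta value and channel model are assumptions for illustration, with beta normally calibrated per modulation and coding scheme.

```python
import numpy as np

def eesm_effective_sinr(sinr_db, beta):
    """Exponential effective SINR mapping.
    sinr_db: per-subcarrier SINRs in dB; beta: MCS-dependent calibration factor."""
    gamma = 10.0 ** (np.asarray(sinr_db) / 10.0)            # dB -> linear
    gamma_eff = -beta * np.log(np.mean(np.exp(-gamma / beta)))
    return 10.0 * np.log10(gamma_eff)                       # back to dB

rng = np.random.default_rng(11)

# Per-subcarrier SINRs of a frequency-selective Rayleigh channel (illustrative)
mean_snr_lin = 10.0 ** (12.0 / 10.0)                        # 12 dB average SNR
per_subcarrier = 10.0 * np.log10(rng.exponential(mean_snr_lin, size=600))

beta = 5.0                                                  # assumed calibration value
print(f"mean subcarrier SINR = {per_subcarrier.mean():.2f} dB")
print(f"EESM effective SINR  = {eesm_effective_sinr(per_subcarrier, beta):.2f} dB")
```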