Search results for: skewed distributions

633 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates

Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim

Abstract:

The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via simulation. Monte Carlo simulation in the R programming language was employed to generate data based on a theoretical model with one endogenous and four exogenous variables. Each latent variable has three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could still produce the path estimates. For larger sample sizes, CB-SEM estimates have lower variability than PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample sizes; however, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
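
As a rough illustration of the simulation design described above (the study itself was implemented in R), the following Python sketch generates data from a hypothetical model with four exogenous and one endogenous latent variable, three indicators each, using either normal or skewed (standardized chi-square) indicator errors, and tracks the bias and variability of a crude composite-based path estimate across sample sizes. The loadings, path coefficients and estimator are illustrative assumptions, not the authors' setup.

```python
# Simplified Monte Carlo sketch (hypothetical loadings and path values, not the
# authors' R code): one endogenous and four exogenous latent variables, three
# indicators each, with optionally skewed (standardized chi-square) errors.
import numpy as np

rng = np.random.default_rng(0)
BETA = np.array([0.3, 0.3, 0.2, 0.2])   # assumed structural paths
LOADING = 0.8                            # assumed indicator loading

def draw(n, skewed):
    # latent exogenous variables and structural disturbance
    xi = rng.standard_normal((n, 4))
    eta = xi @ BETA + rng.standard_normal(n) * 0.5
    def err(size):
        if skewed:                       # standardized chi-square(1): strong skew
            e = rng.chisquare(1, size)
            return (e - 1.0) / np.sqrt(2.0)
        return rng.standard_normal(size)
    x = LOADING * np.repeat(xi, 3, axis=1) + 0.6 * err((n, 12))
    y = LOADING * np.repeat(eta[:, None], 3, axis=1) + 0.6 * err((n, 3))
    return x, y

def sum_score_paths(x, y):
    # crude composite-based (PLS-like) estimate: regress the sum score of the
    # endogenous block on the four exogenous sum scores
    comps = np.column_stack([x[:, 3*i:3*i+3].sum(axis=1) for i in range(4)])
    target = y.sum(axis=1)
    Z = (comps - comps.mean(0)) / comps.std(0)
    t = (target - target.mean()) / target.std()
    coef, *_ = np.linalg.lstsq(Z, t, rcond=None)
    return coef

for n in (20, 50, 200):
    for skewed in (False, True):
        est = np.array([sum_score_paths(*draw(n, skewed)) for _ in range(500)])
        print(f"n={n:4d} skewed={skewed}: mean={est.mean(0).round(2)}, sd={est.std(0).round(2)}")
```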

Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM

Procedia PDF Downloads 371
632 Micromechanical Modeling of Fiber-Matrix Debonding in Unidirectional Composites

Authors: M. Palizvan, M. T. Abadi, M. H. Sadr

Abstract:

Due to variations in damage mechanisms at the microscale, the behavior of fiber-reinforced composites is nonlinear and difficult to model. To exploit computational advantages, a homogenization method is applied to the micro-scale model in order to reduce cost at the expense of detail in local microscale phenomena. In this paper, the effective stiffness is calculated by homogenizing the nonlinear behavior of a composite representative volume element (RVE) containing fiber-matrix debonding. Damage in the RVE is modeled using cohesive elements and contact definitions to capture the cohesive behavior of the fiber-matrix interface. To predict more realistic responses of composite materials, different random distributions of fibers are considered in addition to square and hexagonal arrays. It is shown that, in some cases, the damage behavior differs markedly between fiber distributions. A comprehensive comparison of the resulting responses for the different fiber arrangements is presented.

Keywords: homogenization, cohesive zone model, fiber-matrix debonding, RVE

Procedia PDF Downloads 144
631 Modeling of System Availability and Bayesian Analysis of Bivariate Distribution

Authors: Muhammad Farooq, Ahtasham Gul

Abstract:

To meet a desired standard, it is important to monitor and analyze different engineering processes to obtain the desired output. Bivariate distributions have received considerable attention in recent years for describing the randomness of natural as well as artificial mechanisms. In this article, a bivariate model is constructed from two independent models developed by the nesting approach, in order to study the effect of each component on reliability. Further, a Bayesian analysis of system availability is carried out by considering prior parametric variations in the failure-time and repair-time distributions. Basic statistical characteristics of the marginal distributions, such as the mean, median, and quantile function, are discussed. An inverse gamma prior is used, and its frequentist properties are studied by means of a Markov chain Monte Carlo (MCMC) sampling scheme.
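
The following sketch illustrates the general idea of a Bayesian availability analysis in simplified form. It is not the paper's Weibull/inverse-Lomax nested model: exponential failure and repair times with conjugate inverse-gamma priors on their means are assumed so that the posterior is available in closed form, and posterior draws are propagated to the steady-state availability MTTF/(MTTF + MTTR).

```python
# Hedged, simplified stand-in for the paper's model: exponential failure and
# repair times with conjugate inverse-gamma priors on their means, so posterior
# draws of availability can be simulated directly without MCMC.
import numpy as np

rng = np.random.default_rng(1)

def posterior_mean_draws(times, a0, b0, n_draws=10_000):
    # Exponential likelihood with mean theta; prior theta ~ InvGamma(a0, b0)
    # => posterior theta | data ~ InvGamma(a0 + n, b0 + sum(times)).
    a_post, b_post = a0 + len(times), b0 + times.sum()
    return 1.0 / rng.gamma(a_post, 1.0 / b_post, size=n_draws)

failure_times = rng.exponential(120.0, size=40)   # hypothetical operating hours
repair_times = rng.exponential(8.0, size=40)      # hypothetical repair hours

mttf = posterior_mean_draws(failure_times, a0=2.0, b0=100.0)
mttr = posterior_mean_draws(repair_times, a0=2.0, b0=10.0)
availability = mttf / (mttf + mttr)               # steady-state availability draws

print("posterior mean availability:", availability.mean().round(4))
print("95% credible interval:", np.percentile(availability, [2.5, 97.5]).round(4))
```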

Keywords: reliability, system availability, Weibull, inverse Lomax, Markov chain Monte Carlo, Bayesian

Procedia PDF Downloads 48
630 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
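
The sketch below illustrates the low-outlier screening idea in simplified form; it is not FLIKE's implementation of the multiple Grubbs and Beck test. A one-sided Grubbs-type test, with the standard critical value derived from the t distribution, is applied repeatedly to the smallest remaining log-transformed flow, and the flow data are hypothetical.

```python
# Illustrative low-outlier screen (not FLIKE's implementation): a one-sided Grubbs
# test on log10 annual maximum flows, applied repeatedly to the smallest remaining
# value to mimic the idea of detecting multiple potentially influential low flows.
import numpy as np
from scipy import stats

def grubbs_low_outliers(flows, alpha=0.10):
    x = np.sort(np.log10(np.asarray(flows, dtype=float)))
    flagged = []
    while len(x) > 3:
        n = len(x)
        g = (x.mean() - x[0]) / x.std(ddof=1)          # statistic for the smallest value
        t = stats.t.ppf(1.0 - alpha / n, n - 2)         # one-sided critical t value
        g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t * t / (n - 2 + t * t))
        if g <= g_crit:
            break
        flagged.append(10 ** x[0])                      # back-transform the flagged flow
        x = x[1:]
    return flagged

annual_maxima = [3.2, 5.1, 0.4, 12.7, 8.9, 7.5, 6.1, 15.3, 9.8, 0.6,
                 11.2, 4.4, 7.9, 10.5, 6.8]             # hypothetical flows (m^3/s)
print("potentially influential low flows:", grubbs_low_outliers(annual_maxima))
```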

Keywords: floods, FLIKE, probability distributions, flood frequency, outlier

Procedia PDF Downloads 409
629 Mixtures of Length-Biased Weibull Distributions for Loss Severity Modelling

Authors: Taehan Bae

Abstract:

In this paper, a class of length-biased Weibull mixtures is presented to model loss severity data. The proposed model generalizes the Erlang mixtures with a common scale parameter, and it shares many important modelling features with the Erlang mixtures, such as the flexibility to fit various data distribution shapes and weak denseness in the class of positive continuous distributions. We show that the asymptotic tail estimate of the length-biased Weibull mixture is Weibull-type, which makes the model effective for fitting loss severity data with heavy-tailed observations. A method of statistical estimation is discussed, with applications to real catastrophic loss data sets.

Keywords: Erlang mixture, length-biased distribution, transformed gamma distribution, asymptotic tail estimate, EM algorithm, expectation-maximization algorithm

Procedia PDF Downloads 196
628 Time Series Simulation by Conditional Generative Adversarial Net

Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto

Abstract:

The Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use the Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as the dependence structures of different time series. It is also able to generate conditional predictive distributions consistent with the training data distributions. We also provide an in-depth discussion of the rationale behind GANs and of neural networks as hierarchical splines, to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of market risk factors. We present a real data analysis, including backtesting, to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis for calculating VaR. CGAN can also be applied to economic time series modeling and forecasting. In this regard, an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN is included at the end of the paper.
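
A minimal conditional GAN can be sketched as follows (PyTorch is assumed here; the paper's architecture, conditioning variables and training schedule are not specified in this sketch). The condition vector is concatenated to the generator's noise input and to the discriminator's input, which is the essential mechanism that lets the model learn conditional predictive distributions.

```python
# Minimal conditional GAN sketch in PyTorch (illustrative only). The condition
# vector is concatenated to the generator noise and to the discriminator input.
import torch
import torch.nn as nn

SEQ_LEN, NOISE_DIM, COND_DIM = 30, 16, 4

gen = nn.Sequential(
    nn.Linear(NOISE_DIM + COND_DIM, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, SEQ_LEN),
)
disc = nn.Sequential(
    nn.Linear(SEQ_LEN + COND_DIM, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1), nn.Sigmoid(),
)
opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
bce = nn.BCELoss()

def train_step(real_seq, cond):
    batch = real_seq.size(0)
    noise = torch.randn(batch, NOISE_DIM)
    fake_seq = gen(torch.cat([noise, cond], dim=1))

    # discriminator update: real vs. generated sequences under the same condition
    opt_d.zero_grad()
    d_real = disc(torch.cat([real_seq, cond], dim=1))
    d_fake = disc(torch.cat([fake_seq.detach(), cond], dim=1))
    loss_d = bce(d_real, torch.ones(batch, 1)) + bce(d_fake, torch.zeros(batch, 1))
    loss_d.backward()
    opt_d.step()

    # generator update: try to fool the discriminator
    opt_g.zero_grad()
    loss_g = bce(disc(torch.cat([fake_seq, cond], dim=1)), torch.ones(batch, 1))
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

# toy usage: heavy-tailed "returns" conditioned on an arbitrary condition vector
real = torch.distributions.StudentT(3.0).sample((64, SEQ_LEN))
cond = torch.randn(64, COND_DIM)
print(train_step(real, cond))
```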

Keywords: conditional generative adversarial net, market and credit risk management, neural network, time series

Procedia PDF Downloads 110
627 A Molecular Dynamics Study on Intermittent Plasticity and Dislocation Avalanche Emissions in FCC and BCC Crystals

Authors: Javier Varillas, Jorge Alcalá

Abstract:

We investigate dislocation avalanche phenomena in face-centered cubic (FCC) and body-centered cubic (BCC) crystals using large-scale molecular dynamics (MD) simulations. The analysis is focused on the intermittent development of dense dislocation arrangements subjected to uniaxial tensile straining under displacement control. We employ a novel computational scheme that allows us to inject an entangled dislocation structure into periodic MD domains. We assess the emission of plastic bursts (or dislocation avalanches) in terms of the sharp stress drops detected in the stress-strain curve. The plastic activity corresponds to the sporadic operation of specific dislocation glide processes exhibiting quiescent periods between successive avalanche events. We find that the plastic intermittences in our simulations do not overlap in time under sufficiently low strain rates, as dissipation operates faster than driving; the dense dislocation networks evolve through the emission of dislocation avalanche events whose carried slip adheres to self-organized power-law distributions. These findings enable the extension of the slip distributions obtained from strict displacement-controlled micropillar compression experiments towards smaller values of slip size. Our results furnish further understanding of the development of entangled dislocation networks in metal plasticity, including specific mechanisms of dislocation propagation and annihilation, along with the evolution of specific dislocation populations through dislocation density analyses.
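
The power-law character of the avalanche slip distributions can be checked, for example, with the standard continuous maximum-likelihood estimator of the exponent for a fixed lower cutoff (in the spirit of Clauset, Shalizi and Newman). The sketch below applies it to synthetic avalanche sizes; it is not the authors' post-processing pipeline.

```python
# Illustrative power-law check for avalanche slip sizes: continuous maximum-
# likelihood exponent estimate with a fixed lower cutoff s_min, applied to
# synthetic data generated by inverse-CDF sampling.
import numpy as np

rng = np.random.default_rng(2)

def powerlaw_mle(sizes, s_min):
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    alpha = 1.0 + len(s) / np.sum(np.log(s / s_min))     # MLE exponent
    sigma = (alpha - 1.0) / np.sqrt(len(s))               # standard error
    return alpha, sigma

# synthetic avalanche sizes from a pure power law P(s) ~ s^(-1.5), s >= 1
true_alpha, s_min = 1.5, 1.0
u = rng.random(5000)
sizes = s_min * (1.0 - u) ** (-1.0 / (true_alpha - 1.0))   # inverse-CDF sampling

alpha_hat, se = powerlaw_mle(sizes, s_min)
print(f"estimated exponent: {alpha_hat:.3f} +/- {se:.3f} (true {true_alpha})")
```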

Keywords: dislocations, intermittent plasticity, molecular dynamics, slip distributions

Procedia PDF Downloads 111
626 Stock Market Developments, Income Inequality, Wealth Inequality

Authors: Quang Dong Dang

Abstract:

This paper examines the possible effects of stock market developments, by channel, on income and wealth inequality. We use a Bayesian multilevel model with explanatory variables for the market's channels, such as accessibility, efficiency, and market health, in six selected countries: the US, UK, Japan, Vietnam, Thailand, and Malaysia. We find that, in general, improvements in the stock market alleviate income inequality; however, stock market expansions in higher-income countries are likely to trigger income inequality. We also find that while enhancing the quality of the stock market's channels has counteracting effects on wealth equality, open accessibility helps reduce wealth inequality within the scope of the study. In addition, the inverted U-shaped hypothesis does not appear to hold in the six selected countries over the period from 2006 to 2020.

Keywords: Bayesian multilevel model, income inequality, inverted u-shaped hypothesis, stock market development, wealth inequality

Procedia PDF Downloads 68
625 A Deterministic Large Deviation Model Based on Complex N-Body Systems

Authors: David C. Ni

Abstract:

In previous efforts, we constructed N-body systems by an extended Blaschke product (EBP), which represents a non-temporal and nonlinear extension of the Lorentz transformation. In this construction, we rely on only two parameters, the nonlinear degree and the relative momentum, to characterize the systems. We further explored root computation via iteration with an algorithm extended from the Jenkins-Traub method. The solution sets take the form σ + i[-t, t], where σ and t are real numbers and [-t, t] exhibits various canonical distributions. In this paper, we correlate the convergent sets in the original domain with the solution sets, which demonstrate large-deviation distributions in the codomain. We proceed to compare our approach with established formulations and principles, such as the Donsker-Varadhan and Wentzell-Freidlin theories. The deterministic model based on this construction allows us to explore applications in the areas of finance and statistical mechanics.

Keywords: nonlinear Lorentz transformation, Blaschke equation, iteration solutions, root computation, large deviation distribution, deterministic model

Procedia PDF Downloads 365
624 Numerical Simulation of the Coal Spontaneous Combustion Dangerous Area in Composite Long-Wall Gobs

Authors: Changshan Zhang, Zhijin Yu, Shixing Fan

Abstract:

A comprehensive hazard evaluation for coal self-heating in composite long-wall gobs is heavily dependent on computational simulation. In this study, the spatial distributions of cracks which caused significant air leakage were simulated by universal distinct element code (UDEC) simulation. Based on the main routes of air leakage and the characteristics of coal self-heating, computational fluid dynamics (CFD) modeling was conducted to model the coal spontaneous combustion dangerous area in composite long-wall gobs. The results, including the oxygen concentration distributions and temperature profiles, showed that the numerical approach is valid by comparison with the test data. Furthermore, under specific engineering conditions, the major locations where techniques for extinguishing and preventing long-wall gob fires need to be put into practice were also identified.

Keywords: computational simulation, UDEC simulation, coal self-heating, CFD modeling, long-wall gobs

Procedia PDF Downloads 278
623 Modeling Depth Averaged Velocity and Boundary Shear Stress Distributions

Authors: Ebissa Gadissa Kedir, C. S. P. Ojha, K. S. Hari Prasad

Abstract:

In the present study, the depth-averaged velocity and boundary shear stress in non-prismatic compound channels with three different converging floodplain angles, ranging from 1.43° to 7.59°, have been studied. Analytical solutions were derived by considering the forces acting on the channel bed and walls. Five key parameters, i.e., the non-dimensional coefficient, secondary flow term, secondary flow coefficient, friction factor, and dimensionless eddy viscosity, were considered and discussed. An expression for the non-dimensional coefficient and the integration constants was derived based on the boundary conditions. The model was applied to different data sets from the present experiments and from experiments reported by other sources to examine and analyse the influence of floodplain converging angles on the depth-averaged velocity and boundary shear stress distributions. The results show that the non-dimensional parameter plays an important role in portraying the variation of the depth-averaged velocity and boundary shear stress distributions with different floodplain converging angles. Thus, the variation of the non-dimensional coefficient needs attention, since it affects the secondary flow term and secondary flow coefficient in both the main channel and the floodplains. The analysis shows that the depth-averaged velocities are sensitive to the shear-stress-dependent non-dimensional model coefficient, and the analytical solutions agree well with the experimental data when all five parameters are included. The developed model may therefore be of interest for complex flow modeling.

Keywords: depth-average velocity, converging floodplain angles, non-dimensional coefficient, non-prismatic compound channels

Procedia PDF Downloads 50
622 Wind Power Density and Energy Conversion in Al-Adwas Ras-Huwirah Area, Hadhramout, Yemen

Authors: Bawadi M. A., Abbad J. A., Baras E. A.

Abstract:

This study was conducted to assess wind energy resources in the Al-Adwas Ras-Huwirah area of Hadhramout Governorate, Yemen. Statistical calculations, the Weibull model, and the SPSS program were used in monthly and annual analyses of the wind energy resource, the convergence of wind energy, and turbine efficiency in the selected area. Wind speed data were obtained from NASA over a period of ten years (2010-2019) at a height of 50 m above ground level. Probability distributions were derived from the wind data and their distribution parameters determined. The probability density function was fitted to the measured probability distributions on an annual basis. This study also involves locating preliminary sites for wind farms using Geographic Information System (GIS) technology, which further leads to maximizing the output energy from the most suitable wind turbines at the proposed site.
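
The core of the workflow can be illustrated as follows (the study itself used SPSS): a two-parameter Weibull distribution is fitted to wind speed data and the mean wind power density is computed as 0.5·ρ·c³·Γ(1 + 3/k). The wind speeds in the sketch are synthetic, and the site-specific parameters are assumptions.

```python
# Illustrative version of the workflow: fit a two-parameter Weibull distribution
# to wind speed data and compute the mean wind power density
# 0.5 * rho * c^3 * Gamma(1 + 3/k). The wind speeds below are synthetic.
import math
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
wind_speeds = stats.weibull_min.rvs(2.0, scale=6.5, size=365 * 24,
                                    random_state=rng)   # hypothetical hourly data, m/s

# fit shape k and scale c with the location fixed at zero
k, _, c = stats.weibull_min.fit(wind_speeds, floc=0)

rho = 1.225                                              # air density, kg/m^3
wpd = 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)     # mean power density, W/m^2

print(f"Weibull shape k = {k:.2f}, scale c = {c:.2f} m/s")
print(f"mean wind power density = {wpd:.1f} W/m^2")
```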

Keywords: wind speed analysis, Yemen wind energy, wind power density, Weibull distribution model

Procedia PDF Downloads 55
621 A Parametric Investigation into the Free Vibration and Flutter Characteristics of High Aspect Ratio Aircraft Wings Using Polynomial Distributions of Stiffness and Mass Properties

Authors: Ranjan Banerjee, W. D. Gunawardana

Abstract:

Free vibration and flutter analysis plays a major part in aircraft design and is indeed a mandatory requirement. In particular, high aspect ratio transport airliner wings are prone to free vibration and flutter problems that must be addressed during the design process, as demanded by the airworthiness authorities. The purpose of this paper is to carry out a detailed free vibration and flutter analysis for a wide range of high aspect ratio aircraft wings and to generate design curves that provide useful insights into and understanding of aircraft design from an aeroelastic perspective. In the initial stage of the investigation, the bending and torsional stiffnesses of a number of transport aircraft wings are critically examined to see whether it is possible to express the stiffness distributions in polynomial form in a sufficiently accurate manner. A similar attempt is made for the mass and mass moment of inertia distributions of the wing. Once the choice of stiffness and mass distributions in polynomial form is made, the high aspect ratio wing is idealised by a series of bending-torsion coupled beams from a structural standpoint. Then the dynamic stiffness method is applied to compute the natural frequencies and mode shapes of the wing. Next the wing is idealised aerodynamically and, to this end, unsteady aerodynamics of the Theodorsen type is employed to represent the harmonically oscillating wing. Following this step, a normal mode method using generalised coordinates is applied to formulate the flutter problem. In essence, the generalised mass, stiffness and aerodynamic matrices are combined to obtain the flutter matrix, which is subsequently solved in the complex domain to determine the flutter speed and flutter frequency. In the final stage of the investigation, an exhaustive parametric study is carried out by varying significant wing parameters to generate design curves which help to predict the free vibration and flutter behaviour of high aspect ratio transport aircraft wings in a generic manner. It is in the aeroelastic context of aircraft design where the results are expected to be most useful.

Keywords: high-aspect ratio wing, flutter, dynamic stiffness method, free vibration, aeroelasticity

Procedia PDF Downloads 260
620 Refined Procedures for Second Order Asymptotic Theory

Authors: Gubhinder Kundhi, Paul Rilstone

Abstract:

Refined procedures for higher-order asymptotic theory for non-linear models are developed. These include a new method for deriving stochastic expansions of arbitrary order, new methods for evaluating the moments of polynomials of sample averages, and a new method for deriving the approximate moments of the stochastic expansions; an application of these techniques to obtain improved inferences for the weak instruments problem is considered. It is well established that Instrumental Variable (IV) estimators in the presence of weak instruments can be poorly behaved and, in particular, quite biased in finite samples. In our application, finite sample approximations to the distributions of these estimators are obtained using Edgeworth and saddlepoint expansions. Departures from normality of the distributions of these estimators are analyzed using higher-order analytical corrections in these expansions. In a Monte Carlo experiment, the performance of these expansions is compared to the first-order approximation and to other methods commonly used in finite samples, such as the bootstrap.
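
For readers unfamiliar with Edgeworth corrections, the sketch below shows the generic one-term Edgeworth expansion for the distribution of a standardized sample mean, F(x) ≈ Φ(x) − φ(x)·γ(x² − 1)/(6√n), checked against simulation. It is a toy example of the idea, not the paper's higher-order expansions for IV estimators.

```python
# Generic one-term Edgeworth correction for the CDF of a standardized sample mean:
# the normal approximation is corrected using the skewness of the underlying law.
import numpy as np
from scipy import stats

def edgeworth_cdf(x, n, skew):
    # F_n(x) ~= Phi(x) - phi(x) * skew/(6*sqrt(n)) * (x^2 - 1)
    return stats.norm.cdf(x) - stats.norm.pdf(x) * skew / (6.0 * np.sqrt(n)) * (x ** 2 - 1.0)

# check against simulation for standardized means of exponential(1) samples
rng = np.random.default_rng(4)
n, reps = 10, 200_000
samples = rng.exponential(1.0, size=(reps, n))
z = (samples.mean(axis=1) - 1.0) / (1.0 / np.sqrt(n))    # Exp(1): mean 1, sd 1, skew 2

for x in (-1.5, 0.0, 1.5):
    print(f"x={x:+.1f}  empirical={np.mean(z <= x):.4f}  "
          f"normal={stats.norm.cdf(x):.4f}  edgeworth={edgeworth_cdf(x, n, skew=2.0):.4f}")
```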

Keywords: edgeworth expansions, higher order asymptotics, saddlepoint expansions, weak instruments

Procedia PDF Downloads 254
619 Base Change for Fisher Metrics: Case of the q-Gaussian Inverse Distribution

Authors: Gabriel I. Loaiza Ossa, Carlos A. Cadavid Moreno, Juan C. Arango Parra

Abstract:

It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has negative constant curvature κ= -1/2, as does the family of usual Gaussian distributions. In the present paper, firstly, we arrive at this result by following a different path, much simpler than the previous ones. We first put the family in exponential form, thus endowing the family with a new set of parameters, or coordinates, θ₁, θ₂; then we determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the inverse q-Gaussian distribution family (q < 3) as the family obtained by replacing the usual exponential function with the Tsallis q-exponential function in the expression for the inverse Gaussian distribution and observe that it supports two possible geometries, the Fisher and the q-Fisher geometry. And finally, we apply our strategy to obtain results about the Fisher and q-Fisher geometry of the inverse q-Gaussian distribution family, similar to the ones obtained in the case of the inverse Gaussian distribution family.

Keywords: base of changes, information geometry, inverse Gaussian distribution, inverse q-Gaussian distribution, statistical manifolds

Procedia PDF Downloads 211
618 Industrial Rock Characterization using Nuclear Magnetic Resonance (NMR): A Case Study of Ewekoro Quarry

Authors: Olawale Babatunde Olatinsu, Deborah Oluwaseun Olorode

Abstract:

Industrial rocks were collected from a quarry site at Ewekoro in south-western Nigeria and analysed using the Nuclear Magnetic Resonance (NMR) technique. NMR measurements were conducted on the samples in partially water-saturated and fully brine-saturated conditions. Raw NMR data were analysed with the aid of T2 curves and T2 spectra generated by inversion of the raw data using a conventional regularized least-squares inversion routine. Results show that NMR transverse relaxation (T2) signatures fairly adequately distinguish between the rock types. Similar T2 curve trends and rates at partial saturation suggest that the relaxation is mainly due to adsorption of water on micropores of similar sizes, while T2 curves at full saturation show the relaxation decay rates as 1/T2(shale) > 1/T2(glauconite) > 1/T2(limestone) and 1/T2(sandstone). NMR T2 distributions at full brine saturation show a unimodal distribution in shale, a bimodal distribution in sandstone and glauconite, and a trimodal distribution in limestone. The full-saturation T2 distributions revealed the presence of well-developed and more abundant micropores in all the samples, with T2 in the range 402-504 μs. Mesopores with amplitudes much lower than those of the micropores are present in limestone, sandstone and glauconite, with T2 ranges of 8.45-26.10 ms, 6.02-10.55 ms, and 9.45-13.26 ms, respectively. Very low amplitude macropores with T2 values of 90.26-312.16 ms are only recognizable in the limestone samples. Samples with multiple peaks showed well-connected pore systems, with sandstone having the highest degree of connectivity. The difference in T2 curves and distributions for the rocks at full saturation can be utilised as a potent diagnostic tool for discrimination of these rock types found at Ewekoro.
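
The inversion step mentioned above can be illustrated with a small sketch: a multi-exponential kernel maps a T2 amplitude spectrum to the measured decay, and the spectrum is recovered by Tikhonov-regularized non-negative least squares. The decay data here are synthetic, not the Ewekoro measurements, and the grid and regularization weight are assumptions.

```python
# Sketch of a regularized least-squares T2 inversion: a multi-exponential kernel
# maps the T2 amplitude spectrum to the measured decay; the spectrum is recovered
# with Tikhonov-regularized non-negative least squares on synthetic data.
import numpy as np
from scipy.optimize import nnls

t = np.linspace(1e-4, 1.0, 400)                     # echo times, s
t2_grid = np.logspace(-4, 0, 100)                   # candidate T2 values, s
K = np.exp(-t[:, None] / t2_grid[None, :])          # kernel matrix

# synthetic bimodal spectrum: micropores (~0.5 ms) + mesopores (~10 ms)
true_spec = (np.exp(-0.5 * ((np.log10(t2_grid) + 3.3) / 0.1) ** 2)
             + 0.4 * np.exp(-0.5 * ((np.log10(t2_grid) + 2.0) / 0.15) ** 2))
decay = K @ true_spec + 0.01 * np.random.default_rng(5).standard_normal(t.size)

# Tikhonov regularization implemented by stacking an identity block under the kernel
lam = 0.1
A = np.vstack([K, lam * np.eye(t2_grid.size)])
b = np.concatenate([decay, np.zeros(t2_grid.size)])
spectrum, _ = nnls(A, b)

peaks = t2_grid[spectrum > 0.5 * spectrum.max()]
print("dominant T2 components (s):", np.round(peaks, 5))
```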

Keywords: Ewekoro, NMR techniques, industrial rocks, characterization, relaxation

Procedia PDF Downloads 267
617 Regression for Doubly Inflated Multivariate Poisson Distributions

Authors: Ishapathik Das, Sumen Sen, N. Rao Chaganty, Pooja Sengupta

Abstract:

Dependent multivariate count data occur in several research studies. These data can be modeled by a multivariate Poisson or negative binomial distribution constructed using copulas. However, when some of the counts are inflated, that is, the number of observations in some cells is much larger than in other cells, the copula-based multivariate Poisson (or negative binomial) distribution may not fit well and is not an appropriate statistical model for the data. There is a need to modify or adjust the multivariate distribution to account for the inflated frequencies. In this article, we consider the situation where the frequencies of two cells are higher than those of the other cells, and develop a doubly inflated multivariate Poisson distribution function using a multivariate Gaussian copula. We also discuss procedures for regression on covariates for the doubly inflated multivariate count data. To illustrate the proposed methodologies, we present a real data set containing bivariate count observations with inflation in two cells. Several models and linear predictors with log link functions are considered, and we discuss maximum likelihood estimation of the unknown parameters of the models.
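
The data-generating idea behind the model can be illustrated by simulation (the paper's regression structure and maximum likelihood estimation are not reproduced here): dependent bivariate Poisson counts are built through a Gaussian copula, and extra probability mass is then placed on two inflated cells.

```python
# Simulation sketch of the data-generating idea: dependent bivariate Poisson
# counts built from a Gaussian copula, with extra probability mass on two cells.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def doubly_inflated_bipoisson(n, lam1, lam2, rho, p1, p2, cell1=(0, 0), cell2=(1, 1)):
    # copula step: correlated normals -> uniforms -> Poisson counts
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)
    counts = np.column_stack([stats.poisson.ppf(u[:, 0], lam1),
                              stats.poisson.ppf(u[:, 1], lam2)]).astype(int)
    # inflation step: with probability p1 / p2 replace an observation by a fixed cell
    pick = rng.random(n)
    counts[pick < p1] = cell1
    counts[(pick >= p1) & (pick < p1 + p2)] = cell2
    return counts

data = doubly_inflated_bipoisson(5000, lam1=2.0, lam2=3.0, rho=0.5, p1=0.15, p2=0.10)
cells, freq = np.unique(data, axis=0, return_counts=True)
top = np.argsort(freq)[::-1][:5]
print("most frequent cells:", list(zip(map(tuple, cells[top]), freq[top])))
```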

Keywords: copula, Gaussian copula, multivariate distributions, inflated distributions

Procedia PDF Downloads 131
616 Air Flow Characteristics and Pressure Distributions for Staggered Wing Shaped Tubes Bundle

Authors: Sayed A. Elsayed, Emad Z. Ibrahim, Osama M. Mesalhy, Mohamed A. Abdelatief

Abstract:

An experimental and numerical study has been conducted to clarify the fluid flow characteristics and pressure drop distributions of a cross-flow heat exchanger employing staggered wing-shaped tubes at different angles of attack. The water-side Reynolds number Rew was 5 × 10² and the air-side Reynolds number Rea ranged from 1.8 × 10³ to 9.7 × 10³. Three tube arrangements with various angles of attack, row angles of attack and 90° cone angles were employed over the considered Rea range. A correlation for the pressure drop coefficient Pdc in terms of Rea and the design parameters for the studied cases is presented. The flow pattern around the staggered wing-shaped tube bundle was predicted using the commercial CFD software package FLUENT 6.3.26. Results indicated that the values of Pdc increased as the angle of attack increased from 0° to 45°, while the opposite was true for angles of attack from 135° to 180°. Comparisons between the experimental and numerical results of the present study and those previously obtained for similar available studies showed good agreement.

Keywords: wing-shaped tubes, cross-flow cooling, staggered arrangement, CFD

Procedia PDF Downloads 334
615 The Role of Law Corruption and Culture in Investment Fund Manager Fees

Authors: Samir Assal

Abstract:

This paper considers an international sample of venture capital and private equity funds to assess the role of law, corruption and culture in setting fund manager fees in terms of their fixed management fees, carried interest performance fees, clawbacks of fees and cash versus share distributions of fees. The data highlight a role of legal conditions in shaping fees paid to fund managers. In countries with better legal conditions, fixed fees are lower, carried interest fees are higher, clawbacks are less likely, and share distributions are more likely. These findings suggest legal conditions help to align the interests of managers and shareholders. More specifically, we examine which element of legal conditions matter most, and discover that corruption levels play a pronounced role in shaping fund manager fee contracts. We also show that cultural forces such as Hofstede’s measures of power distance and uncertainty avoidance likewise play a role in influencing fees.

Keywords: managerial compensation, incentive contracts, private equity, law and finance

Procedia PDF Downloads 279
614 Asymptotic Confidence Intervals for the Difference of Coefficients of Variation in Gamma Distributions

Authors: Patarawan Sangnawakij, Sa-Aat Niwitpong

Abstract:

In this paper, we propose two new confidence intervals, CIw and CIs, for the difference of coefficients of variation in two independent gamma distributions. These confidence intervals use the closed-form method of variance estimation presented by Donner and Zou (2010), based on the concepts of the Wald and score confidence intervals, respectively. A Monte Carlo simulation study is used to evaluate the performance, in terms of coverage probability and expected length, of these confidence intervals. The results indicate that the coverage probabilities of the new Wald- and score-based confidence intervals satisfy the nominal coverage and are close to the nominal level of 0.95 in various situations; in particular, the former confidence interval is better when sample sizes are small. Moreover, the expected lengths of the proposed confidence intervals show little difference when sample sizes are moderate to large. Therefore, in this study, the Wald-based confidence interval for the difference of coefficients of variation is preferable to the other confidence interval.
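
A generic way to check such intervals by simulation is sketched below. Note that it uses a simple percentile-bootstrap interval as a stand-in, not the closed-form Wald and score intervals of Donner and Zou (2010) evaluated in the paper; the gamma parameters and sample sizes are illustrative.

```python
# Monte Carlo coverage check for an interval for the difference of coefficients of
# variation in two gamma samples, using a generic percentile bootstrap interval as
# a stand-in for the paper's closed-form Wald/score intervals.
import numpy as np

rng = np.random.default_rng(7)

def cv(x):
    return x.std(ddof=1) / x.mean()

def bootstrap_ci(x, y, n_boot=400, level=0.95):
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=x.size, replace=True)
        yb = rng.choice(y, size=y.size, replace=True)
        diffs[b] = cv(xb) - cv(yb)
    lo, hi = np.percentile(diffs, [100 * (1 - level) / 2, 100 * (1 + level) / 2])
    return lo, hi

# a gamma distribution with shape a has CV = 1/sqrt(a), independent of scale
a1, a2, n = 4.0, 9.0, 30
true_diff = 1 / np.sqrt(a1) - 1 / np.sqrt(a2)

hits, n_sim = 0, 300
for _ in range(n_sim):
    x = rng.gamma(a1, scale=2.0, size=n)
    y = rng.gamma(a2, scale=5.0, size=n)
    lo, hi = bootstrap_ci(x, y)
    hits += lo <= true_diff <= hi
print(f"true difference = {true_diff:.3f}, empirical coverage = {hits / n_sim:.3f}")
```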

Keywords: confidence interval, score’s interval, wald’s interval, coefficient of variation, gamma distribution, simulation study

Procedia PDF Downloads 390
613 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance. In this way, statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Therefore, based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
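
A much-simplified version of the comparison can be sketched as follows: the factor of safety of an infinite, cohesionless slope, FS = tan φ / tan β, is simulated with the friction angle modeled first as normal and then as a right-skewed distribution with similar mean and standard deviation. A skew-normal is used here purely for illustration (the paper's best fit was a Dagum distribution), and the slope geometry is hypothetical.

```python
# Simplified probability-of-failure comparison for an infinite, cohesionless slope
# with FS = tan(phi)/tan(beta): normal versus right-skewed friction angle model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n = 200_000
beta = np.radians(25.0)                              # hypothetical slope angle

# two candidate models for the friction angle (degrees), with similar mean and sd
phi_normal = stats.norm(loc=30.0, scale=3.0).rvs(n, random_state=rng)
skew_model = stats.skewnorm(a=5.0, loc=26.3, scale=4.7)   # right-skewed alternative
phi_skewed = skew_model.rvs(n, random_state=rng)

for name, phi in [("normal", phi_normal), ("skewed", phi_skewed)]:
    fs = np.tan(np.radians(phi)) / np.tan(beta)
    print(f"{name:7s}: mean phi = {phi.mean():.1f} deg, P(FS < 1) = {np.mean(fs < 1.0):.4f}")
```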

Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables

Procedia PDF Downloads 307
612 Communication of Expected Survival Time to Cancer Patients: How It Is Done and How It Should Be Done

Authors: Geir Kirkebøen

Abstract:

Most patients with serious diagnoses want to know their prognosis, in particular their expected survival time. As part of the informed consent process, physicians are legally obligated to communicate such information to patients. However, there is no established (evidence-based) 'best practice' for how to do this. The two questions explored in this study are: How do physicians communicate expected survival time to patients, and how should it be done? We explored the first, descriptive question in a study with Norwegian oncologists as participants. The study had a scenario part and a survey part. In the scenario part, the doctors were asked to imagine that a patient, recently diagnosed with a serious cancer diagnosis, has asked them: 'How long can I expect to live with such a diagnosis? I want an honest answer from you!' The doctors were to assume that the diagnosis is certain, and that from an extensive recent study they had optimal statistical knowledge, described in detail as a right-skewed survival curve, about how long patients with this kind of diagnosis could be expected to live. The main finding was that very few of the oncologists would explain to the patient the variation in survival time as described by the survival curve. The majority would not give the patient an answer at all. Of those who gave an answer, the typical answer was that survival time varies a lot, that it is hard to say in a specific case, or that they would come back to it later, etc. The survey part of the study clearly indicates that the main reason why the oncologists would not deliver the mortality prognosis was discomfort with its uncertainty. The scenario part of the study confirmed this finding. The majority of the oncologists explicitly used the uncertainty, the variation in survival time, as a reason not to give the patient an answer. Many studies show that patients want realistic information about their mortality prognosis, and that they should be given hope. The question then is how to communicate the uncertainty of the prognosis in a realistic and optimistic (hopeful) way. Based on psychological research, our hypothesis is that the best way to do this is by explicitly describing the variation in survival time, the (usually) right-skewed survival curve of the prognosis, and emphasizing to the patient the (small) possibility of being a 'lucky outlier'. We tested this hypothesis in two scenario studies with lay people as participants. The data clearly show that people prefer to receive expected survival time as a median value together with explicit information about the survival curve's right-skewness (e.g., concrete examples of 'positive outliers'), and that communicating expected survival time this way not only provides people with hope, but also gives them a more realistic understanding compared with the typical way expected survival time is communicated. Our data indicate that it is not the existence of the uncertainty regarding the mortality prognosis that is the problem for patients, but how this uncertainty is, or is not, communicated and explained.

Keywords: cancer patients, decision psychology, doctor-patient communication, mortality prognosis

Procedia PDF Downloads 298
611 On the Optimality Assessment of Nano-Particle Size Spectrometry and Its Association to the Entropy Concept

Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani

Abstract:

Particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nano-particles under the influence of the electric field in an electrical mobility spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by the flow conditions, geometry, electric field and particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed, and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with a non-uniform charge distribution, was introduced to the EMS. The flow pattern and electric field in the EMS were simulated using computational fluid dynamics (CFD) to obtain particle trajectories in the device and therefore to calculate the signal reported by each electrometer. Based on the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to transfer information content about the size distribution of the injected particles, we proposed a benchmark for assessing the optimality of the design. This method applies the concept of von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality. Entropy, in the Shannon sense, is the 'average amount of information contained in an event, sample or character extracted from a data stream'. Evaluating the responses (signals) obtained via various configurations of detecting rings, the configuration that gave the best predictions of the size distributions of the injected particles was the modified configuration. It was also the one that had the maximum amount of entropy. A reasonable consistency was also observed between the accuracy of the predictions and the entropy content of each configuration. In this method, entropy is extracted from the transfer matrix of the instrument for each configuration. Ultimately, various clouds of particles were introduced to the simulations and the predicted size distributions were compared to the exact size distributions.
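
One simple way to attach an entropy score to a candidate configuration is sketched below: the singular values of a (here hypothetical) transfer matrix are normalized to a probability vector and a von Neumann-style Shannon entropy −Σ p·log p is computed, so configurations whose rings carry more independent information score higher. This is an illustrative reading of the benchmark, not the actual EMS transfer matrices.

```python
# Toy illustration (hypothetical transfer matrices, not the simulated EMS) of
# scoring detector-ring configurations by entropy: singular values of each
# transfer matrix are normalized to a probability vector and the Shannon entropy
# -sum(p log p) is used as the figure of merit.
import numpy as np

def transfer_matrix_entropy(T):
    s = np.linalg.svd(T, compute_uv=False)
    p = s / s.sum()                            # normalized "spectrum"
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(9)
size_bins, n_rings = 40, 10

# configuration A: rings respond to narrow, distinct size ranges
A = np.zeros((n_rings, size_bins))
for i in range(n_rings):
    A[i, 4 * i: 4 * i + 4] = 1.0

# configuration B: rings respond to broad, strongly overlapping ranges
B = np.exp(-0.5 * ((np.arange(size_bins) - 20.0) / 15.0) ** 2) * np.ones((n_rings, 1))
B += 0.05 * rng.random((n_rings, size_bins))

print("entropy, distinct rings:   ", round(transfer_matrix_entropy(A), 3))
print("entropy, overlapping rings:", round(transfer_matrix_entropy(B), 3))
```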

Keywords: aerosol nano-particle, CFD, electrical mobility spectrometer, von neumann entropy

Procedia PDF Downloads 312
610 Fem Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli

Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha

Abstract:

Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This database served to create the same number of ensembles as the number of segments in the tested beam. Statistics of these ensembles were then assigned to the given segments of the beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. Here, the detailed geometrical arrangement of individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending, to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit full-scale measurements of the timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change of the Young's modulus distribution takes place in the laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of the moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
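
The sampling step can be illustrated with a minimal sketch (this is not the 2-D FEM or the Mindlin beam model of the paper): segment moduli are drawn by Latin Hypercube Sampling from lognormal marginals, a crude effective modulus is formed, and the midspan deflection of a simply supported beam in four-point bending, δ = Pa(3L² − 4a²)/(24EI), is evaluated for each sample. All dimensions and statistics are assumptions.

```python
# Minimal Latin Hypercube Sampling sketch: segment moduli from lognormal marginals,
# a crude effective modulus, and the classical four-point-bending midspan deflection
# delta = P*a*(3L^2 - 4a^2) / (24*E*I). All values are hypothetical.
import numpy as np
from scipy.stats import qmc, lognorm

n_segments, n_sim = 12, 100
L, a = 4.0, 1.3                       # span and load position, m
P = 10_000.0                          # each point load, N
b, h = 0.12, 0.36                     # cross-section, m
I = b * h ** 3 / 12.0                 # second moment of area, m^4

# LHS in the unit hypercube, then mapped through lognormal marginals (median 11 GPa)
sampler = qmc.LatinHypercube(d=n_segments, seed=10)
u = sampler.random(n=n_sim)
E_segments = lognorm.ppf(u, s=0.15, scale=11e9)      # Pa

E_eff = E_segments.mean(axis=1)                       # crude stand-in for the FEM model
deflection = P * a * (3 * L ** 2 - 4 * a ** 2) / (24 * E_eff * I)

print(f"mean deflection = {deflection.mean() * 1000:.2f} mm, "
      f"std = {deflection.std(ddof=1) * 1000:.2f} mm")
```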

Keywords: Bayesian inference, FEM, four point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young’s modulus

Procedia PDF Downloads 254
609 Investigations of Flow Field with Different Turbulence Models on NREL Phase VI Blade

Authors: T. Y. Liu, C. H. Lin, Y. M. Ferng

Abstract:

Wind energy is one of the clean renewable energy sources. However, the low-frequency (20-200 Hz) noise generated by wind turbine blades, which disturbs nearby residents, has become a major problem to be addressed. Analysis of the flow field and pressure distribution on the blades is useful for predicting this aerodynamic noise. Therefore, the main objective of this study is to use different turbulence models to analyse the flow field and pressure distributions on the turbine blades. Three-dimensional Computational Fluid Dynamics (CFD) simulation of the flow field was used to calculate the flow phenomena for the National Renewable Energy Laboratory (NREL) Phase VI horizontal axis wind turbine rotor. Two flow cases with different wind speeds were investigated: 7 m/s and 15 m/s, both at 72 rpm. Four RANS-based turbulence models, standard k-ε, realizable k-ε, SST k-ω, and v2f, were used to predict and analyse the results in the present work. The results show that the pressure distributions predicted with the SST k-ω and v2f turbulence models agree well with the experimental data.

Keywords: horizontal axis wind turbine, turbulence model, noise, fluid dynamics

Procedia PDF Downloads 236
608 The Extended Skew Gaussian Process for Regression

Authors: M. T. Alodat

Abstract:

In this paper, we propose a generalization of the Gaussian process regression (GPR) model called the extended skew Gaussian process for regression (ESGPr) model. The ESGPr model works better than the GPR model when the errors are skewed. We derive the predictive distribution for the ESGPr model at a new input. We also apply the ESGPr model to FOREX data and find that it fits the FOREX data better than the GPR model.

Keywords: extended skew normal distribution, Gaussian process for regression, predictive distribution, ESGPr model

Procedia PDF Downloads 520
607 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference

Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira

Abstract:

Operational risk losses are heavy-tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions, and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixed distribution (the lognormal for the body of losses and the Generalized Pareto Distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a recent operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2, using the procedures PROC SEVERITY and PROC NLMIXED. This paper focuses on describing this implementation.
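
The spliced severity model can be sketched in a few lines (using scipy on synthetic losses rather than SAS PROC SEVERITY, and omitting the copula and Bayesian steps): a lognormal is fitted to the body below a threshold, a generalized Pareto distribution is fitted to the exceedances, and a high quantile is read off with the peaks-over-threshold VaR formula.

```python
# Sketch of the spliced severity model (lognormal body, generalized Pareto tail
# above a threshold), fitted to synthetic losses with scipy; copula and Bayesian
# steps from the paper are omitted.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
losses = stats.lognorm.rvs(s=1.2, scale=20_000, size=5000, random_state=rng)

threshold = np.quantile(losses, 0.90)                 # body/tail splice point
body, tail = losses[losses <= threshold], losses[losses > threshold]

# body: lognormal fitted to losses below the threshold (location fixed at zero)
s_hat, _, scale_hat = stats.lognorm.fit(body, floc=0)

# tail: generalized Pareto fitted to exceedances over the threshold
xi_hat, _, beta_hat = stats.genpareto.fit(tail - threshold, floc=0)

def tail_var(p, n_total=len(losses)):
    # high-quantile (VaR) estimate from the GPD tail, peaks-over-threshold formula
    p_exceed = len(tail) / n_total
    return threshold + beta_hat / xi_hat * (((1 - p) / p_exceed) ** (-xi_hat) - 1)

print(f"lognormal body: sigma = {s_hat:.2f}, scale = {scale_hat:,.0f}")
print(f"GPD tail: xi = {xi_hat:.2f}, beta = {beta_hat:,.0f}")
print(f"99.9% VaR estimate = {tail_var(0.999):,.0f}")
```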

Keywords: operational risk, loss distribution approach, extreme value theory, copulas

Procedia PDF Downloads 561
606 Sniff-Camera for Imaging of Ethanol Vapor in Human Body Gases after Drinking

Authors: Toshiyuki Sato, Kenta Iitani, Koji Toma, Takahiro Arakawa, Kohji Mitsubayashi

Abstract:

A two-dimensional imaging system (Sniff-camera) for gaseous ethanol emissions from human palm skin was constructed and demonstrated. This imaging system measures gaseous ethanol concentrations as intensities of chemiluminescence (CL) from the luminol reaction induced by alcohol oxidase and the luminol-hydrogen peroxide system. The conversion of ethanol distributions and concentrations to two-dimensional CL was conducted on an enzyme-immobilized mesh substrate in a dark box containing a luminol solution. In order to visualize ethanol emissions from human palm skin, we developed a highly sensitive and selective imaging system for transpired gaseous ethanol at sub-ppm levels. High-sensitivity imaging allows us to successfully visualize the emission dynamics of transdermal gaseous ethanol. The intensity of each pixel on the palm reflects the ethanol concentration distribution arising from the metabolism of orally administered alcohol. This imaging system is significant and useful for the assessment of ethanol emission from the palmar skin.

Keywords: sniff-camera, gas-imaging, ethanol vapor, human body gas

Procedia PDF Downloads 338
605 Modelling of Creep in a Thick-Walled Cylindrical Vessel Subjected to Internal Pressure

Authors: Tejeet Singh, Ishvneet Singh, Vinay Gupta

Abstract:

The present study focussed on carrying out a creep analysis of an isotropic thick-walled composite cylindrical pressure vessel composed of an aluminium matrix reinforced with silicon carbide in particulate form. The creep behaviour of the composite material has been described by a threshold-stress-based creep law. The value of the stress exponent appearing in the creep law was selected as 3, 5 and 8. The constitutive equations were developed using the well-known von Mises yield criterion. Models were developed to find the distributions of creep stresses and strain rate in thick-walled composite cylindrical pressure vessels under internal pressure. In order to obtain the stress distributions in the cylinder, the equilibrium equation of continuum mechanics and the constitutive equations are solved together. It was observed that the radial, tangential and axial stresses increase with the radial distance. A cross-over was also obtained almost at the middle region of the cylindrical vessel for the tangential and axial stresses for different values of the stress exponent. The strain rates decreased along the entire radius.

Keywords: creep, composite, cylindrical vessel, internal pressure

Procedia PDF Downloads 540
604 Simulation Study of Multiple-Thick Gas Electron Multiplier-Based Microdosimeters for Fast Neutron Measurements

Authors: Amir Moslehi, Gholamreza Raisali

Abstract:

Microdosimetric detectors based on multiple-thick gas electron multiplier (multiple-THGEM) configurations are being used in various fields of radiation protection and dosimetry. In the present work, the microdosimetric response of these detectors to fast neutrons has been investigated by the Monte Carlo method. Three similar microdosimeters with A-150 and rexolite as the wall materials were designed: the first based on a single THGEM, the second on a double THGEM and the third on a triple THGEM. The sensitive volume of each of the three microdosimeters is a right cylinder of 5 mm height and diameter, filled with propane-based tissue-equivalent (TE) gas. The TE gas at a pressure of 0.11 atm and room temperature simulates 1 µm of tissue. Lineal energy distributions for several neutron energies from 10 keV to 14 MeV, including 241Am-Be neutrons, were calculated with the Geant4 simulation toolkit. The mean quality factor and dose-equivalent value for each neutron energy were also determined from these distributions. The data obtained from the three microdosimeters are in agreement. Therefore, we conclude that the multiple-THGEM structures present similar microdosimetric responses to fast neutrons.
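
The final step, turning a lineal energy distribution into a mean quality factor, can be illustrated as follows. The ICRP 60 Q(L) relation is applied with lineal energy used as a surrogate for LET, which is a common approximation rather than necessarily the authors' exact procedure, and the dose distribution below is synthetic rather than a Geant4 result.

```python
# Illustration of computing a mean quality factor from a dose distribution in
# lineal energy d(y), using the ICRP 60 Q(L) relation with y as a surrogate for
# LET (an approximation); the distribution below is synthetic.
import numpy as np

def q_icrp60(L):
    # ICRP 60 quality factor as a function of LET (keV/um)
    L = np.asarray(L, dtype=float)
    return np.where(L < 10.0, 1.0,
                    np.where(L <= 100.0, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

# synthetic dose distribution d(y) on a logarithmic lineal-energy grid (keV/um)
y = np.logspace(-1, 3, 400)
d = np.exp(-0.5 * ((np.log10(y) - 1.5) / 0.4) ** 2)     # peak near ~30 keV/um
d /= np.trapz(d, y)                                      # normalize so integral of d(y) dy = 1

q_mean = np.trapz(q_icrp60(y) * d, y)                    # dose-mean quality factor
print(f"mean quality factor: {q_mean:.2f}")
```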

Keywords: fast neutrons, geant4, multiple-thick gas electron multiplier, microdosimeter

Procedia PDF Downloads 324