Search results for: cumulative normal distribution function

11675 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength

Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos

Abstract:

Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance; statistical tools have thus shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to produce incorrect results when skewed data are present, since normal distributions are symmetric about their means. In order to visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modelled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modelled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb failure criterion. Based on this analysis, it is possible to explicitly derive the failure probability with the friction angle treated as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
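
To illustrate the distribution-fitting step described above, the following minimal Python sketch fits several candidate distributions to friction-angle data and ranks them by a Kolmogorov-Smirnov statistic. The data are synthetic placeholders (not the Brasilia measurements), and the Dagum distribution is represented by SciPy's burr family (Burr Type III), which is equivalent to it; this is an illustration, not the authors' code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical friction-angle sample in degrees (synthetic, mildly skewed)
phi = rng.gamma(shape=30.0, scale=1.0, size=60)

candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "dagum (burr III)": stats.burr,   # Burr Type III == Dagum distribution
}

for name, dist in candidates.items():
    params = dist.fit(phi)                         # maximum-likelihood fit
    ks = stats.kstest(phi, dist.cdf, args=params)  # goodness-of-fit statistic
    print(f"{name:18s} KS statistic = {ks.statistic:.4f}")
```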

Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables

Procedia PDF Downloads 307
11674 Reliability Analysis of Construction Schedule Plan Based on Building Information Modelling

Authors: Lu Ren, You-Liang Fang, Yan-Gang Zhao

Abstract:

In recent years, the application of BIM (Building Information Modelling) to construction schedule plans has been the focus of more and more researchers. In order to assess whether a BIM-based construction schedule plan is reasonable, that is, whether the schedule can be completed on time, some researchers have introduced reliability theory for the evaluation. In this evaluation, the uncertain factors affecting the construction schedule plan are regarded as random variables, and their probability distributions are assumed to be normal, determined by two parameters estimated from the mean and standard deviation of statistical data. However, in practical engineering, most of the uncertain influence factors are not normal random variables, so the evaluation results will be unreasonable under the assumption that the random variables follow a normal distribution. Therefore, in order to obtain a more reasonable evaluation, it is necessary to describe the distributions of the random variables more comprehensively. For this purpose, the cubic normal distribution is introduced in this paper to describe the distribution of arbitrary random variables; it is determined by the first four moments (mean, standard deviation, skewness and kurtosis). In this paper, the BIM model is first built according to the design information of the structure and the construction schedule plan is made based on BIM; the cubic normal distribution is then used to describe the distributions of the random variables, based on statistical data collected for the random factors influencing the construction schedule plan. Next, the reliability analysis of the BIM-based construction schedule plan can be carried out more reasonably. Finally, more accurate evaluation results can be given, providing a reference for the implementation of the actual construction schedule plan. In the last part of this paper, the efficiency and accuracy of the proposed methodology for the reliability analysis of the BIM-based construction schedule plan are demonstrated through a practical engineering case.
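
The cubic normal distribution mentioned above represents a random variable as a third-order polynomial of a standard normal variable, so that its first four moments can be matched to data. The sketch below, with purely illustrative coefficients (not calibrated to any schedule data), samples such a variable and checks its moments.

```python
import numpy as np
from scipy import stats

# X = a + b*Z + c*Z**2 + d*Z**3 with Z ~ N(0, 1); the four coefficients
# control the first four moments (mean, std dev, skewness, kurtosis).
a, b, c, d = 0.0, 0.95, 0.10, 0.02   # hypothetical coefficients
z = np.random.default_rng(1).standard_normal(200_000)
x = a + b * z + c * z**2 + d * z**3  # samples from the cubic normal

print("mean    :", round(x.mean(), 4))
print("std dev :", round(x.std(), 4))
print("skewness:", round(stats.skew(x), 4))
print("kurtosis:", round(stats.kurtosis(x, fisher=False), 4))
```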

Keywords: BIM, construction schedule plan, cubic normal distribution, reliability analysis

Procedia PDF Downloads 107
11673 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution

Authors: Masomeh Jamshid Nejad

Abstract:

Nowadays, statistical literacy is no longer merely a desirable skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. As such, learning and a deep understanding of statistical concepts are essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students. This requires undergraduate students in economics and business management to visualize and work with data following a normal distribution. Since technology is interconnected with education these days, it is important to teach statistics topics to undergraduate students in the context of Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners' knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students (from the Business Management program) were compared in this study: one group underwent Excel-based instruction, and the other relied only on traditional teaching methods. We analyzed experiential data and BBA participants' responses to statistics-related questions focusing on the normal distribution, including its key attributes, such as the mean and standard deviation. The results of our study indicate that exposing students to Excel-based learning supports them in comprehending statistical concepts more effectively than learners taught with the traditional method. In addition, students receiving Excel-based instruction showed greater ability in visualizing and interpreting data following a normal distribution.

Keywords: statistics, Excel-based instruction, data visualization, pedagogy

Procedia PDF Downloads 29
11672 Kinetic Model to Interpret Whistler Waves in Multicomponent Non-Maxwellian Space Plasmas

Authors: Warda Nasir, M. N. S. Qureshi

Abstract:

Whistler waves are right-hand circularly polarized waves frequently observed in space plasmas. The low-frequency branch of whistler waves, with frequencies of around 100 Hz and known as lion roars, is frequently observed in the magnetosheath. Another feature of the magnetosheath is the observation of flat-top electron distributions with single as well as two electron populations. In the past, lion roars were studied by employing a kinetic model with the classical bi-Maxwellian distribution function; however, the observations could not be justified on either quantitative or qualitative grounds. We studied whistler waves by employing a kinetic model with a non-Maxwellian distribution function, namely the generalized (r,q) distribution function, which is the generalized form of the kappa and Maxwellian distribution functions, for single as well as two electron populations. We compared our results with Cluster observations and found good quantitative and qualitative agreement between them. At times when lion roars are observed (or not observed) in the data and the bi-Maxwellian could not provide sufficient growth (damping) rates, we showed that when the generalized (r,q) distribution function is employed, the resulting growth (damping) rates match the observations.

Keywords: kinetic model, whistler waves, non-Maxwellian distribution function, space plasmas

Procedia PDF Downloads 282
11671 Nonparametric Specification Testing for the Drift of the Short Rate Diffusion Process Using a Panel of Yields

Authors: John Knight, Fuchun Li, Yan Xu

Abstract:

Based on a new nonparametric estimator of the drift function, we propose a consistent test for the parametric specification of the drift function in the short rate diffusion process using observations from a panel of yields. The test statistic is shown to follow an asymptotic normal distribution under the null hypothesis that the parametric drift function is correctly specified, and converges to infinity under the alternative. Taking the daily 7-day European rates as a proxy for the short rate, we use our test to examine whether the drift of the short rate diffusion process is linear or nonlinear, an important unresolved issue in the short rate modeling literature. The testing results indicate that none of the drift functions in this literature adequately captures the dynamics of the drift, but nonlinear specifications perform better than linear ones.
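
The paper's test builds on a nonparametric drift estimator; a basic single-path version of such an estimator is the kernel-weighted average of increments shown below. This sketch simulates a Vasicek-type linear drift for illustration and is far simpler than the authors' panel-of-yields construction; the bandwidth and model constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n = 1 / 250, 5000
r = np.empty(n)
r[0] = 0.05
for t in range(n - 1):   # Euler scheme: dr = kappa*(theta - r)*dt + sigma*dW
    r[t + 1] = r[t] + 0.5 * (0.05 - r[t]) * dt \
               + 0.02 * np.sqrt(dt) * rng.standard_normal()

def drift_hat(x, h=0.005):
    """Nadaraya-Watson estimate of the drift at level x."""
    w = np.exp(-0.5 * ((r[:-1] - x) / h) ** 2)   # Gaussian kernel weights
    return np.sum(w * np.diff(r) / dt) / np.sum(w)

for x in (0.03, 0.05, 0.07):
    print(f"r = {x:.2f}: estimated drift {drift_hat(x):+.4f}, "
          f"true drift {0.5 * (0.05 - x):+.4f}")
```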

Keywords: diffusion process, nonparametric estimation, derivative security price, drift function and volatility function

Procedia PDF Downloads 341
11670 Temperature Distribution Control for Baby Incubator System Using Arduino AT Mega 2560

Authors: W. Widhiada, D. N. K. P. Negara, P. A. Suryawan

Abstract:

Technological advances in the field of health care are very important, especially for the safety of babies. Many premature infant deaths are caused by poorly managed health facilities; most often, premature babies die from bacterial infection because the temperature around the baby is not kept normal. Incubator equipment is therefore important, in particular the control of the temperature inside the incubator. On/off control is used to regulate the temperature distribution in the incubator so that the desired temperature of 36 °C remains stable. The authors observed and analyzed the data to determine the temperature distribution in the incubator using MATLAB/Simulink. The output temperature reaches 36 °C in 400 seconds using an Arduino ATmega 2560. This incubator is able to maintain the ambient temperature and the baby's body temperature within normal limits and to keep the air humidity within the limit values required for an infant incubator.
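
The on/off strategy described above can be pictured with a small simulation: the heater switches around the 36 °C set point with a hysteresis band. The thermal constants below are illustrative assumptions, not the incubator's identified parameters.

```python
setpoint, band = 36.0, 0.2      # set point in °C and hypothetical hysteresis band
T, heater = 25.0, False         # initial air temperature and heater state

for second in range(600):
    if T < setpoint - band:
        heater = True           # too cold: switch the heater on
    elif T > setpoint + band:
        heater = False          # too warm: switch the heater off
    heating = 0.05 if heater else 0.0   # heater input in °C/s (illustrative)
    T += heating - 0.002 * (T - 25.0)   # simple heat loss to 25 °C ambient
    if second % 100 == 0:
        print(f"t = {second:3d} s  T = {T:5.2f} °C  heater {'on' if heater else 'off'}")
```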

Keywords: on/off control, temperature distribution, Arduino AT 2560, baby incubator

Procedia PDF Downloads 460
11669 Parameter Estimation of Power Function Distribution Based on Selective Order Statistics

Authors: Moh'd Alodat

Abstract:

In this paper, we discuss the power function distribution and derive the maximum likelihood estimator of its parameter as well as the reliability parameter. We derive the large sample properties of the estimators based on the selective order statistics scheme. We conduct simulation studies to investigate the significance of the selective order statistics scheme in our setup and to compare the efficiency of the newly proposed estimators.
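
For a complete sample (rather than the paper's selective order statistics scheme), the power function distribution f(x; a) = a·x^(a-1) on (0, 1) admits the closed-form MLE a_hat = -n / sum(log x_i). The sketch below, with synthetic data, is only a baseline illustration of that estimator.

```python
import numpy as np

rng = np.random.default_rng(3)
a_true = 2.5
# Inverse-CDF sampling: F(x) = x**a, so F^{-1}(u) = u**(1/a)
x = rng.uniform(size=1000) ** (1 / a_true)

# Closed-form complete-sample MLE: a_hat = -n / sum(log(x_i))
a_hat = -len(x) / np.sum(np.log(x))
print(f"true a = {a_true}, MLE a_hat = {a_hat:.3f}")
```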

Keywords: Fisher information, maximum likelihood estimator, power function distribution, ranked set sampling, selective order statistics sampling

Procedia PDF Downloads 433
11668 The Market Structure Simulation of Heterogeneous Firms

Authors: Arunas Burinskas, Manuela Tvaronavičienė

Abstract:

Although the new trade theories, unlike industrial organisation theories, view market structure and competition between enterprises through firms' heterogeneity in various parameters, they do not pay particular attention to the analysis of market structure and its development. In this article, although we rely mainly on models developed by scholars of new trade theory, we propose a different approach. In our simulation model, market demand is modelled with a normal distribution function, while on the supply side (as in the new trade theory models) productivity is modelled with a Pareto distribution function. The results of the simulation show that companies with higher productivity (lower marginal costs) do not pass all the benefits of such economies on to buyers. However, even with higher marginal costs, firms can choose to offer higher value-added goods to stay in the market. In general, the structure of the market forms quickly enough and depends on the skills available to firms.

Keywords: market, structure, simulation, heterogeneous firms

Procedia PDF Downloads 114
11667 New Estimation in Autoregressive Models with Exponential White Noise by Using Reversible Jump MCMC Algorithm

Authors: Suparman Suparman

Abstract:

The white noise in an autoregressive (AR) model is often assumed to be normally distributed. In applications, however, the white noise frequently does not follow a normal distribution. This paper aims to estimate the parameters of an AR model with exponential white noise. A Bayesian method is adopted: a prior distribution for the parameters of the AR model is selected and combined with the likelihood function of the data to obtain a posterior distribution, from which a Bayesian estimator for the parameters of the AR model is derived. Because the order of the AR model is itself treated as a parameter, this Bayesian estimator cannot be calculated explicitly. To resolve this problem, the reversible jump Markov Chain Monte Carlo (MCMC) method is adopted. As a result, the order and the parameters of the AR model can be estimated simultaneously.

Keywords: autoregressive (AR) model, exponential white noise, Bayesian, reversible jump Markov Chain Monte Carlo (MCMC)

Procedia PDF Downloads 328
11666 Lee-Carter Mortality Forecasting Method with Dynamic Normal Inverse Gaussian Mortality Index

Authors: Funda Kul, İsmail Gür

Abstract:

Pension scheme providers have to price mortality risk with accurate mortality forecasting methods. Many mortality forecasting methods have been constructed and used in the literature. The Lee-Carter model was the first to consider stochastic improvement trends in life expectancy and is still widely used. In the Lee-Carter model, forecasting is carried out through the mortality index, which is conventionally assumed to follow an ARIMA time series model. In this paper, we propose a dynamic normal inverse Gaussian distribution for modeling the mortality index in the Lee-Carter model. Using population mortality data for Italy, France, and Turkey, the model's forecasting capability is investigated, and a comparative analysis with other models is carried out using some well-known benchmark criteria.
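
As a sketch of the distributional idea (not the authors' dynamic specification), the increments of a Lee-Carter mortality index k_t can be fitted with a normal inverse Gaussian law and compared against the Gaussian random-walk-with-drift innovation; the k_t series below is synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Toy mortality index: downward drift with heavy-tailed shocks
kappa = np.cumsum(-1.0 + rng.standard_t(df=4, size=60))
dk = np.diff(kappa)

a, b, loc, scale = stats.norminvgauss.fit(dk)   # ML fit of the NIG parameters
ll_nig = stats.norminvgauss.logpdf(dk, a, b, loc, scale).sum()
ll_norm = stats.norm.logpdf(dk, dk.mean(), dk.std()).sum()
print(f"NIG fit: a={a:.3f} b={b:.3f} loc={loc:.3f} scale={scale:.3f}")
print(f"log-likelihood: NIG {ll_nig:.1f} vs normal {ll_norm:.1f}")
```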

Keywords: mortality, forecasting, Lee-Carter model, normal inverse Gaussian distribution

Procedia PDF Downloads 330
11665 Competing Risk Analyses in Survival Trials During COVID-19 Pandemic

Authors: Ping Xu, Gregory T. Golm, Guanghan (Frank) Liu

Abstract:

In the presence of competing events, traditional survival analysis may not be appropriate and can result in biased estimates, as it assumes independence between competing events and the event of interest. Instead, competing risk analysis should be considered to correctly estimate the survival probability of the event of interest and the hazard ratio between treatment groups. The COVID-19 pandemic has provided a potential source of competing risks in clinical trials, as participants may experience COVID-related competing events before the occurrence of the event of interest, for instance, death due to COVID-19, which can affect the incidence rate of the event of interest. We have performed simulation studies to compare multiple competing risk analysis models, including the cumulative incidence function, the sub-distribution hazard function, and the cause-specific hazard function, with the traditional survival analysis model under various scenarios. We also provide a general recommendation on conducting competing risk analysis in randomized clinical trials during the era of the COVID-19 pandemic based on the extensive simulation results.
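
The cumulative incidence function named above can be computed nonparametrically (in Aalen-Johansen form): at each event time, the CIF of the event of interest grows by the overall survival just before that time multiplied by the hazard increment. A minimal sketch with made-up data follows.

```python
import numpy as np

# Illustrative data: follow-up time and cause
# (0 = censored, 1 = event of interest, 2 = competing event, e.g. COVID-19 death)
time = np.array([2, 3, 4, 5, 6, 7, 8, 9, 11, 12], dtype=float)
cause = np.array([1, 0, 2, 1, 1, 2, 0, 1, 2, 0])

order = np.argsort(time)
time, cause = time[order], cause[order]

n_at_risk = len(time)
surv, cif = 1.0, 0.0       # overall Kaplan-Meier survival and CIF of cause 1
for t, c in zip(time, cause):
    if c == 1:
        cif += surv / n_at_risk        # CIF step uses pre-event overall survival
    if c in (1, 2):
        surv *= 1 - 1 / n_at_risk      # overall survival drops at any event
    n_at_risk -= 1
    print(f"t = {t:4.0f}  cause = {c}  CIF = {cif:.3f}")
```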

Keywords: competing risk, survival analysis, simulations, randomized clinical trial, COVID-19 pandemic

Procedia PDF Downloads 161
11664 Pure Scalar Equilibria for Normal-Form Games

Authors: Herbert W. Corley

Abstract:

A scalar equilibrium (SE) is an alternative type of equilibrium in pure strategies for an n-person normal-form game G. It is defined using optimization techniques to obtain a pure strategy for each player of G by maximizing an appropriate utility function over the acceptable joint actions. The players’ actions are determined by the choice of the utility function. Such a utility function could be agreed upon by the players or chosen by an arbitrator. An SE is an equilibrium since no players of G can increase the value of this utility function by changing their strategies. SEs are formally defined, and examples are given. In a greedy SE, the goal is to assign actions to the players giving them the largest individual payoffs jointly possible. In a weighted SE, each player is assigned weights modeling the degree to which he helps every player, including himself, achieve as large a payoff as jointly possible. In a compromise SE, each player wants a fair payoff for a reasonable interpretation of fairness. In a parity SE, the players want their payoffs to be as nearly equal as jointly possible. Finally, a satisficing SE achieves a personal target payoff value for each player. The vector payoffs associated with each of these SEs are shown to be Pareto optimal among all such acceptable vectors, as well as computationally tractable.
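
As a concrete illustration of the greedy scalar equilibrium described above, the sketch below maximizes one possible scalar utility, the sum of individual payoffs, over all joint actions of a hypothetical 2x2 bimatrix game; the game and the choice of utility are assumptions, not examples from the paper.

```python
import itertools
import numpy as np

# payoff[i, a1, a2] = payoff to player i under joint action (a1, a2)
payoff = np.array([[[3, 0], [5, 1]],    # player 1 (prisoner's-dilemma shape)
                   [[3, 5], [0, 1]]])   # player 2

def greedy_se(payoff):
    """Joint action maximizing the sum of the players' payoffs."""
    actions = itertools.product(*(range(s) for s in payoff.shape[1:]))
    return max(actions, key=lambda a: payoff[(slice(None), *a)].sum())

a_star = greedy_se(payoff)
print("greedy scalar equilibrium:", a_star,
      "payoffs:", payoff[(slice(None), *a_star)])
```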

Keywords: compromise equilibrium, greedy equilibrium, normal-form game, parity equilibrium, pure strategies, satisficing equilibrium, scalar equilibria, utility function, weighted equilibrium

Procedia PDF Downloads 88
11663 Base Change for Fisher Metrics: Case of the q-Gaussian Inverse Distribution

Authors: Gabriel I. Loaiza Ossa, Carlos A. Cadavid Moreno, Juan C. Arango Parra

Abstract:

It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has constant negative curvature κ = -1/2, as does the family of usual Gaussian distributions. In the present paper, we first arrive at this result by following a different path, much simpler than the previous ones: we put the family in exponential form, thus endowing it with a new set of parameters, or coordinates, θ₁, θ₂; we then determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the inverse q-Gaussian distribution family (q < 3) as the family obtained by replacing the usual exponential function with the Tsallis q-exponential function in the expression for the inverse Gaussian distribution, and we observe that it supports two possible geometries, the Fisher and the q-Fisher geometry. Finally, we apply our strategy to obtain results about the Fisher and q-Fisher geometry of the inverse q-Gaussian distribution family, similar to the ones obtained in the case of the inverse Gaussian distribution family.
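
For reference, the standard textbook forms of the inverse Gaussian density and its Fisher metric in the usual parameters (μ, λ), consistent with the constant curvature κ = -1/2 mentioned above, are the following; this is background, not the paper's derivation in the exponential-form coordinates θ₁, θ₂.

```latex
f(x;\mu,\lambda)=\sqrt{\frac{\lambda}{2\pi x^{3}}}
  \exp\!\left(-\frac{\lambda(x-\mu)^{2}}{2\mu^{2}x}\right),\quad x>0,
\qquad
g(\mu,\lambda)=
\begin{pmatrix}
\dfrac{\lambda}{\mu^{3}} & 0\\[4pt]
0 & \dfrac{1}{2\lambda^{2}}
\end{pmatrix}.
```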

Keywords: base change, information geometry, inverse Gaussian distribution, inverse q-Gaussian distribution, statistical manifolds

Procedia PDF Downloads 211
11662 Risk Management of Natural Disasters on Insurance Stock Market

Authors: Tarah Bouaricha

Abstract:

The impact of the worst natural disasters that occurred between 2010 and 2014, in terms of insured losses, is analysed on the S&P insurance index. Event study analysis is used to test whether natural disasters affect the insurance index stock market price. No negative impact on the insurance stock market price is found around the disaster events. To analyse the reaction of the insurance stock market, normal returns (NR), abnormal returns (AR), cumulative abnormal returns (CAR), cumulative average abnormal returns (CAAR), and a parametric test on AR and CAR are used.
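
A minimal sketch of the event-study quantities listed above: abnormal returns from a market-model fit on an estimation window and their cumulative sum over the event window, with a simple parametric test on the CAR. Returns are simulated placeholders, not S&P insurance index data.

```python
import numpy as np

rng = np.random.default_rng(5)
rm = rng.normal(0.0004, 0.010, 280)                    # market returns
ri = 0.0001 + 1.2 * rm + rng.normal(0, 0.008, 280)     # insurance index returns

est, evt = slice(0, 250), slice(250, 280)              # estimation / event window
beta, alpha = np.polyfit(rm[est], ri[est], 1)          # market-model OLS fit

ar = ri[evt] - (alpha + beta * rm[evt])                # abnormal returns (AR)
car = np.cumsum(ar)                                    # cumulative abnormal returns
t_stat = car[-1] / (ar.std(ddof=1) * np.sqrt(len(ar))) # simple test on the CAR
print(f"CAR over event window = {car[-1]:+.4f}, t = {t_stat:+.2f}")
```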

Keywords: event study, natural disasters, insurance, reinsurance, stock market

Procedia PDF Downloads 363
11661 Existence of Rational Primitive Normal Pairs with Prescribed Norm and Trace

Authors: Soniya Takshak, R. K. Sharma

Abstract:

Let q and n be positive integers. Then Fq denotes the finite field of q elements, and Fqⁿ denotes the extension of Fq of degree n. Also, Fq* represents the multiplicative group of non-zero elements of Fq, and the generators of Fq* are called primitive elements. A normal element α of the finite field Fqⁿ is one such that {α, α^q, . . . , α^(q^(n-1))} forms a basis for Fqⁿ over Fq. Primitive normal elements have several applications in coding theory and cryptography, so establishing the existence of primitive normal elements under certain conditions is both theoretically important and a natural question. In this article, we provide a sufficient condition for the existence of a primitive normal element α in Fqⁿ with a prescribed primitive norm and non-zero trace over Fq such that f(α) is also primitive, where f(x) ∈ Fqⁿ(x) is a rational function of degree sum m. In particular, we investigated rational functions of degree sum 4 over Fqⁿ, where q = 11^k, and demonstrated that there are only 3 exceptional pairs (q, n), n ≥ 7, for which such primitive normal elements may not exist. In general, we show that such elements always exist except for finitely many choices of (q, n). To arrive at our conclusion, we used additive and multiplicative character sums.

Keywords: finite field, primitive element, normal element, norm, trace, character

Procedia PDF Downloads 65
11660 A Bayesian Model with Improved Prior in Extreme Value Problems

Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro

Abstract:

In Extreme Value Theory, inference for the parameters of the distribution is made employing only a small part of the observed values. When block maxima are taken, much of the data is discarded. We developed a new Bayesian inference model to exploit all the information provided by the data, introducing informative priors and using the relations between baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: exponential, normal and Gumbel. Secondly, we considered mixtures of normal variables, to simulate practical situations when data do not fit pure distributions because of perturbations (noise).
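
The block maxima step can be illustrated with the classical ML fit below: maxima of blocks drawn from an exponential baseline (one of the three baselines studied) are fitted with a Gumbel distribution. This is the frequentist baseline, not the paper's informative-prior Bayesian model; the block count and size are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
baseline = rng.exponential(scale=1.0, size=(200, 50))  # 200 blocks of size 50
maxima = baseline.max(axis=1)                          # block maxima -> Gumbel limit

loc, scale = stats.gumbel_r.fit(maxima)
print(f"Gumbel fit: location = {loc:.3f}, scale = {scale:.3f}")
# For exponential(1) blocks of size m, theory gives loc ~ log(m) = 3.91, scale ~ 1
```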

Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior

Procedia PDF Downloads 166
11659 The Unsteady Non-Equilibrium Distribution Function and Exact Equilibrium Time for a Dilute Gas Affected by Thermal Radiation Field

Authors: Taha Zakaraia Abdel Wahid

Abstract:

The behavior of the unsteady non-equilibrium distribution function for a dilute gas under the effect of a non-linear thermal radiation field is presented. To the best of our knowledge, this is done here for the first time. The distinctions and comparisons between the unsteady perturbed and the unsteady equilibrium velocity distribution functions are illustrated. The equilibrium time for the dilute gas is determined for the first time. The non-equilibrium thermodynamic properties of the system (the gas and the heated plate) are investigated. The results are applied to argon gas for various values of radiation field intensity. 3D graphics illustrating the calculated variables are drawn to predict their behavior. The results are discussed.

Keywords: dilute gas, radiation field, exact solutions, travelling wave method, unsteady BGK model, irreversible thermodynamics, unsteady non-equilibrium distribution functions

Procedia PDF Downloads 468
11658 DISGAN: Efficient Generative Adversarial Network-Based Method for Cyber-Intrusion Detection

Authors: Hongyu Chen, Li Jiang

Abstract:

Ubiquitous anomalies endanger the security of our systems constantly. They may bring irreversible damage to the system and cause leakage of privacy. Thus, it is of vital importance to detect these anomalies promptly. Traditional supervised methods such as Decision Trees and Support Vector Machines (SVM) are used to classify normality and abnormality. However, in some cases, abnormal statuses are far rarer than normal ones, which leads to decision bias in these methods. The generative adversarial network (GAN) has been proposed to handle this case. With its strong generative ability, it only needs to learn the distribution of normal statuses and identifies abnormal statuses through the gap between them and the learned distribution. Nevertheless, existing GAN-based models are not suitable for processing data with discrete values, leading to immense degradation of detection performance. To cope with discrete features, in this paper, we propose an efficient GAN-based model with a specifically designed loss function. Experimental results show that our model outperforms state-of-the-art models on discrete datasets and remarkably reduces the overhead.

Keywords: GAN, discrete feature, Wasserstein distance, multiple intermediate layers

Procedia PDF Downloads 94
11657 Comparison of Receiver Operating Characteristic Curve Smoothing Methods

Authors: D. Sigirli

Abstract:

The Receiver Operating Characteristic (ROC) curve is a commonly used statistical tool for evaluating the diagnostic performance of screening and diagnostic tests with continuous or ordinal scale results, which aim to predict the probability of the presence or absence of a condition, usually a disease. When the test results are measured as numeric values, sensitivity and specificity can be computed across all possible threshold values which discriminate the subjects as diseased or non-diseased. There are infinitely many possible decision thresholds along the continuum of the test results. The ROC curve presents the trade-off between sensitivity and 1-specificity as the threshold changes. The empirical ROC curve, a non-parametric estimator of the ROC curve, is robust and represents the data accurately. However, especially for small sample sizes, it suffers from variability, and since it is a step function, there can be different false positive rates for a single true positive rate value and vice versa. Moreover, since the true ROC curve is smooth while the empirical estimate is jagged, the empirical estimate underestimates the true ROC curve. Because the true ROC curve is assumed to be smooth, several smoothing methods have been explored. These include using kernel estimates, using log-concave densities, fitting a specified density function to the data by maximum likelihood, and using smooth versions of the empirical distribution functions. In the present paper, we propose a smooth ROC curve estimate based on a boundary-corrected kernel function and compare the performance of ROC curve smoothing methods for diagnostic test results coming from different distributions and different sample sizes. We performed a simulation study to compare the performance of the different methods for different scenarios with 1000 repetitions. The performance of the proposed method was typically better than that of the empirical ROC curve and only slightly worse than that of the binormal model when the underlying samples were in fact generated from the normal distribution.
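
A minimal sketch of plain kernel smoothing of a ROC curve (without the boundary correction proposed in the paper): the empirical CDFs of the two groups are replaced by Gaussian-kernel-smoothed CDFs, and sensitivity versus 1-specificity is swept over thresholds; the data and bandwidth are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
healthy = rng.normal(0.0, 1.0, 50)    # non-diseased test results
diseased = rng.normal(1.2, 1.0, 50)   # diseased test results

def kernel_cdf(sample, c, h=0.3):
    """Smoothed CDF estimate: average of normal CDFs centred at the data."""
    return norm.cdf((c - sample[:, None]) / h).mean(axis=0)

thresholds = np.linspace(-4, 5, 400)
fpr = 1 - kernel_cdf(healthy, thresholds)    # 1 - specificity
tpr = 1 - kernel_cdf(diseased, thresholds)   # sensitivity
auc = np.trapz(tpr[::-1], fpr[::-1])         # area under the smoothed curve
print(f"smoothed AUC = {auc:.3f}")
```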

Keywords: empirical estimator, kernel function, smoothing, receiver operating characteristic curve

Procedia PDF Downloads 123
11656 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme data in an observation record can occur due to unusual circumstances. Such data can provide important information that cannot be provided by other data, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method, and the distribution of extreme data sets taken with this method is called the extreme value distribution, here the Gumbel distribution with two parameters. Exact maximum likelihood (ML) estimates of the Gumbel parameters are difficult to determine, so a numerical approach is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the parameter value changes on each iteration; it is modified with the addition of a step length to guarantee convergence, since the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is further modified by replacing the exact second derivative with an approximation that is updated on each iteration. The parameter estimation of the Gumbel distribution by this numerical approach is done by calculating the parameter values that maximize the likelihood function; for this, the gradient vector and Hessian matrix are needed. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and estimates of the Gumbel distribution parameters. The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The estimates indicate that the high rainfall that occurred in Purworejo District decreased in intensity and that the range of rainfall decreased.
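
The estimation procedure described above reduces to minimizing the Gumbel negative log-likelihood with the BFGS quasi-Newton method, as in the sketch below; synthetic maxima stand in for the Purworejo rainfall data, and the scale is log-parameterized to keep it positive.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(8)
data = stats.gumbel_r.rvs(loc=120.0, scale=35.0, size=80, random_state=rng)

def neg_log_lik(theta):
    mu, log_sigma = theta
    return -stats.gumbel_r.logpdf(data, loc=mu,
                                  scale=np.exp(log_sigma)).sum()

res = optimize.minimize(neg_log_lik,
                        x0=[data.mean(), np.log(data.std())],
                        method="BFGS")   # gradient approximated numerically
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"BFGS MLE: location = {mu_hat:.1f}, scale = {sigma_hat:.1f}")
```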

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 297
11655 Survival and Hazard Maximum Likelihood Estimator with Covariate Based on Right Censored Data of Weibull Distribution

Authors: Al Omari Mohammed Ahmed

Abstract:

This paper focuses on the maximum likelihood estimator with covariates, where covariates are incorporated into the Weibull model. Under this regression model, the covariate parameters, the shape parameter, the survival function, and the hazard rate of the Weibull regression distribution with right-censored data are estimated by maximum likelihood. The mean square error (MSE) and absolute bias are used to compare the performance of the Weibull regression distribution. For the simulation comparison, the study used various sample sizes and several specific values of the Weibull shape parameter.
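
A minimal sketch of the censored likelihood behind such a model: events contribute the Weibull log-density and right-censored observations the log-survival, with the scale tied to a covariate through exp(b0 + b1*z). The simulation design is an assumption, not the authors'.

```python
import numpy as np
from scipy import optimize

rng = np.random.default_rng(9)
n, k_true, b0, b1 = 300, 1.5, 0.5, 0.8
z = rng.normal(size=n)
t = np.exp(b0 + b1 * z) * rng.weibull(k_true, size=n)  # true event times
c = rng.exponential(3.0, size=n)                       # censoring times
obs, event = np.minimum(t, c), t <= c                  # observed time, indicator

def neg_log_lik(theta):
    log_k, g0, g1 = theta
    k, lam = np.exp(log_k), np.exp(g0 + g1 * z)
    u = (obs / lam) ** k
    log_f = np.log(k / obs) + k * np.log(obs / lam) - u  # Weibull log-density
    log_s = -u                                           # Weibull log-survival
    return -np.where(event, log_f, log_s).sum()

res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0, 0.0])
print("shape =", np.exp(res.x[0]), " b0, b1 =", res.x[1], res.x[2])
```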

Keywords: Weibull regression distribution, maximum likelihood estimator, survival function, hazard rate, right censoring

Procedia PDF Downloads 412
11654 Improved Imaging and Tracking Algorithm for Maneuvering Extended UAVs Using High-Resolution ISAR Radar System

Authors: Mohamed Barbary, Mohamed H. Abd El-Azeem

Abstract:

Maneuvering extended object tracking (M-EOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the multiple model (MM) multi-Bernoulli (MB) filter for M-EOT, where the ISAR observations are characterized using a skewed (SK), non-symmetric normal distribution. To cope with possible abrupt changes of kinematic state, extension, and observation distribution over an extended object when a target maneuvers, a multiple model technique is presented based on an MB track-before-detect (TBD) filter supported by an SK sub-random matrix model (RMM) or sub-ellipses framework. Simulation results demonstrate the improvement achieved by the proposed approach.

Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, MM-MB-TBD filter

Procedia PDF Downloads 46
11653 The Analysis of Personalized Low-Dose Computed Tomography Protocol Based on Cumulative Effective Radiation Dose and Cumulative Organ Dose for Patients with Breast Cancer with Regular Chest Computed Tomography Follow up

Authors: Okhee Woo

Abstract:

Purpose: The aim of this study is to evaluate the 2-year cumulative effective radiation dose and cumulative organ dose from regular follow-up computed tomography (CT) scans in patients with breast cancer and to establish a personalized low-dose CT protocol. Methods and Materials: A retrospective study was performed on patients with breast cancer who were diagnosed and managed consistently on the basis of the routine breast cancer follow-up protocol between January 2012 and June 2016. Based on ICRP (International Commission on Radiological Protection) Publication 103, the cumulative effective radiation doses of each patient over the 2-year follow-up were analyzed using commercial radiation management software (Radimetrics, Bayer HealthCare). The personalized effective doses for each organ were analyzed in detail by the Monte Carlo simulation provided by the software. Results: A total of 3822 CT scans on 490 patients were evaluated (mean age: 52.32 ± 10.69 years). The mean scan number for each patient was 7.8 ± 4.54. Each patient was exposed to 95.54 ± 63.24 mSv of radiation over 2 years. The cumulative CT radiation dose was significantly higher in patients with lymph node metastasis (p = 0.00). HER-2 positive patients were exposed to more radiation than estrogen or progesterone receptor positive patients (p = 0.00). There was no difference in the cumulative effective radiation dose across age groups. Conclusion: Knowing how much radiation a patient has been exposed to is the starting point for managing radiation exposure in patients with long-term CT follow-up. A precise and personalized protocol, as well as iterative reconstruction, may reduce the hazard from unnecessary radiation exposure.

Keywords: computed tomography, breast cancer, effective radiation dose, cumulative organ dose

Procedia PDF Downloads 161
11652 A Comparative Study of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) and Extreme Value Theory (EVT) Model in Modeling Value-at-Risk (VaR)

Authors: Longqing Li

Abstract:

The paper addresses the inefficiency of the classical model in measuring Value-at-Risk (VaR) using a normal distribution or a Student's t distribution. Specifically, the paper focuses on the one-day-ahead Value-at-Risk (VaR) of major stock markets' daily returns in the US, UK, China, and Hong Kong over the most recent ten years at the 95% confidence level. To improve the predictive power and search for the best performing model, the paper proposes two leading alternatives, Extreme Value Theory (EVT) and a family of GARCH models, and compares their relative performance. The main contribution can be summarized in two aspects. First, the paper extends the GARCH family by incorporating EGARCH and TGARCH to shed light on the differences between them in estimating one-day-ahead Value-at-Risk (VaR). Second, to account for the non-normality in the distribution of financial markets, the paper applies the Generalized Error Distribution (GED), instead of the normal distribution, to govern the innovation term. A dynamic back-testing procedure is employed to assess the performance of each model, the GARCH family and the conditional EVT. The conclusion is that exponential GARCH yields the best estimate in out-of-sample one-day-ahead Value-at-Risk (VaR) forecasting. Moreover, the performance difference between the GARCH models and the conditional EVT is indistinguishable.
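
The EVT side can be sketched with an unconditional peaks-over-threshold fit: losses above a high threshold are fitted with a generalized Pareto distribution and the 95% VaR is read from the tail formula. The paper's conditional EVT first filters returns through a GARCH model, which this deliberately omits; the threshold choice and data are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
losses = -stats.t.rvs(df=4, size=2500, random_state=rng) * 0.01  # toy daily losses

u = np.quantile(losses, 0.90)          # high threshold (90th percentile)
exc = losses[losses > u] - u           # threshold exceedances
xi, _, sigma = stats.genpareto.fit(exc, floc=0)

p = 0.95
n, n_u = len(losses), len(exc)
var_p = u + sigma / xi * (((n / n_u) * (1 - p)) ** (-xi) - 1)  # POT VaR formula
print(f"EVT 95% VaR = {var_p:.4f}, empirical quantile = {np.quantile(losses, p):.4f}")
```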

Keywords: Value-at-Risk, Extreme Value Theory, conditional EVT, backtesting

Procedia PDF Downloads 298
11651 Occupational Cumulative Effective Doses of Radiation Workers in Hamad Medical Corporation in Qatar

Authors: Omar Bobes, Abeer Al-Attar, Mohammad Hassan Kharita, Huda Al-Naemi

Abstract:

The number of radiological examinations has increased steadily in recent years. As a result, the risk of possible radiation-induced damage also increases through continuous, lifelong, and growing exposure to ionizing radiation. Radiation dose monitoring in medicine has therefore become an essential element of medical practice. In this study, the occupational cumulative doses for radiation workers in Hamad Medical Corporation in Qatar were assessed over a period of five years. The number of monitored workers selected for this study was 555 (out of a total of 1250 monitored workers) who had been working continuously, with no interruption, with ionizing radiation over the five years from 2015 to 2019. The aim of this work is to examine the occupational groups and the activities where higher radiation exposure occurred and its order of magnitude. The most exposed group was the nuclear medicine technologist staff, with an average cumulative dose of 8.4 mSv. The highest individual cumulative dose was 9.8 mSv, recorded for the PET-CT technologist category.

Keywords: cumulative dose, effective dose, monitoring, occupational exposure, dosimetry

Procedia PDF Downloads 207
11650 Modelling Rainfall-Induced Shallow Landslides in the Northern New South Wales

Authors: S. Ravindran, Y. Liu, I. Gratchev, D. Jeng

Abstract:

Rainfall-induced shallow landslides are common in northern New South Wales (NSW), Australia. From 2009 to 2017, around 105 rainfall-induced landslides occurred along the road corridors and caused temporary road closures in northern NSW. The rainfall causing shallow landslides follows different temporal distributions, varying from uniform, normal, and decreasing to increasing rainfall intensity, and its duration varied from one day to 18 days according to historical data. The objective of this research is to analyse the slope instability of some of the sites in northern NSW under varying cumulative rainfall using SLOPE/W and SEEP/W and to compare the results with field data on rainfall causing shallow landslides. Rainfall and topographical data from public authorities and soil data obtained from laboratory tests will be used for this modelling. According to the field data, shallow landslides are likely when the cumulative rainfall is between 100 mm and 400 mm.

Keywords: landslides, modelling, rainfall, suction

Procedia PDF Downloads 134
11649 Exploring the Energy Model of Cumulative Grief

Authors: Masica Jordan Alston, Angela N. Bullock, Angela S. Henderson, Stephanie Strianse, Sade Dunn, Joseph Hackett, Alaysia Black Hackett, Marcus Mason

Abstract:

The Energy Model of Cumulative Grief, created in 2018, draws on historic stage theories of grief. The model is additionally unique in its focus on cultural responsiveness, and it helps to train practitioners who work with clients dealing with grief and loss. This paper introduces the model and explores how it positively impacted a convenience sample of 140 practitioners and individuals experiencing grief and loss. Respondents participated in webinars provided by the National Grief and Loss Center of America (NGLCA). Participants in this cross-sectional study completed one of three Grief and Loss Surveys created by the Grief and Loss Centers of America. Data analysis was conducted via SPSS and Survey Hero to examine the survey results. The results indicate that the Energy Model of Cumulative Grief was an effective resource for participants in addressing grief and loss: the majority of participants found the webinars helpful and a conduit to higher levels of hope. The findings suggest that using the Energy Model of Cumulative Grief is effective in providing culturally responsive grief and loss resources to practitioners and clients. There are far-reaching implications in the use of technology to provide hope to those suffering from grief and loss worldwide through the Energy Model of Cumulative Grief.

Keywords: grief, loss, grief energy, grieving brain

Procedia PDF Downloads 53
11648 Implementation of the Recursive Formula for Evaluation of the Strength of Daniels' Bundle

Authors: Vaclav Sadilek, Miroslav Vorechovsky

Abstract:

The paper deals with the classical fiber bundle model of equal load sharing, sometimes referred to as the Daniels bundle or the democratic bundle. Daniels formulated a multidimensional integral and also a recursive formula for evaluating the cumulative distribution function of bundle strength. This paper describes three algorithms for evaluating the recursive formula, together with their implementations, with source code in the high-level programming language Python. A comparison of the algorithms with respect to execution time is provided, and the orders of magnitude of the addends in the recursion are also analysed.
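
One common statement of Daniels' recursion for the strength CDF of an equal-load-sharing bundle of n fibers, with G_0 = 1, x the load per fiber, and F the fiber strength CDF, is G_n(x) = sum over k = 1..n of (-1)^(k+1) C(n,k) F(x)^k G_{n-k}(nx/(n-k)). The naive float implementation below only illustrates the formula; it loses precision as n grows because of the alternating signs, which is exactly why arbitrary-precision arithmetic (mpmath) comes up in the paper. The Weibull fiber CDF is an illustrative assumption.

```python
from functools import lru_cache
from math import comb, exp

def F(x):
    """Fiber strength CDF: Weibull, shape 2, scale 1 (illustrative choice)."""
    return 1.0 - exp(-x * x) if x > 0 else 0.0

@lru_cache(maxsize=None)
def G(n, x):
    """Daniels' recursion; the k = n term uses G_0 = 1."""
    if n == 0:
        return 1.0
    total = 0.0
    for k in range(1, n + 1):
        tail = 1.0 if k == n else G(n - k, n * x / (n - k))
        total += (-1) ** (k + 1) * comb(n, k) * F(x) ** k * tail
    return total

for n in (1, 2, 5, 10):
    print(f"n = {n:2d}  P(strength per fiber <= 0.5) = {G(n, 0.5):.6f}")
```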

Keywords: equal load sharing, mpmath, python, strength of Daniels' bundle

Procedia PDF Downloads 375
11647 ISAR Imaging and Tracking Algorithm for Maneuvering Non-ellipsoidal Extended Objects Using Jump Markov Systems

Authors: Mohamed Barbary, Mohamed H. Abd El-azeem

Abstract:

Maneuvering non-ellipsoidal extended object tracking (M-NEOT) using high-resolution inverse synthetic aperture radar (ISAR) observations has been gaining momentum recently. This work presents a new robust implementation of the Jump Markov (JM) multi-Bernoulli (MB) filter for M-NEOT, where the ISAR observations are characterized using a skewed (SK), non-symmetric normal distribution. To cope with possible abrupt changes of kinematic state, extension, and observation distribution over an extended object when a target maneuvers, a multiple model technique is presented based on an MB track-before-detect (TBD) filter supported by an SK sub-random matrix model (RMM) or sub-ellipses framework. Simulation results demonstrate the improvement achieved by the proposed approach.

Keywords: maneuvering extended objects, ISAR, skewed normal distribution, sub-RMM, JM-MB-TBD filter

Procedia PDF Downloads 31
11646 The Study of Rapeseed Characteristics by Factor Analysis under Normal and Drought Stress Conditions

Authors: Ali Bakhtiari Gharibdosti, Mohammad Hosein Bijeh Keshavarzi, Samira Alijani

Abstract:

To understand the relationships among traits and to determine the factors that explain the traits under consideration in rapeseed varieties, 10 rapeseed genotypes were grown in a completely randomized design with three replications under drought stress in 2009-2010 in the research field of the agriculture college, Islamic Azad University, Karaj branch. In this research, 11 characteristics related to the growth, production, and yield stages were considered. Analysis of variance showed that there is a significant difference among the rapeseed varieties for these characteristics. Simple correlation coefficients calculated under both conditions, normal and drought stress, indicate that the number of seeds per plant and the number of pods have a positive and significant correlation, at the 1% probability level, with seed yield, so selection based on these characteristics is effective for improving yield. Under both normal and drought stress conditions, factor analysis showed that five factors had eigenvalues greater than one under normal conditions, accounting for 82.72% of the total variance, while under drought stress four factors were identified, accounting for 76.78% of the total variance. Considering the overall results of this research, the effective characteristics identified by factor analysis and their components can be used in breeding programs to select suitable, drought-tolerant genotypes under drought stress conditions.

Keywords: correlation, drought stress, factor analysis, rapeseed

Procedia PDF Downloads 151