Search results for: beta binomial posterior predictive distribution
2367 Estimation of Bayesian Sample Size for Binomial Proportions Using Areas P-tolerance with Lowest Posterior Loss
Authors: H. Bevrani, N. Najafi
Abstract:
This paper uses p-tolerance with the lowest posterior loss, the quadratic loss function, average length criteria, average coverage criteria, and the worst outcome criterion to compute the sample size needed to estimate a proportion under the binomial model with a Beta prior distribution. The proposed methodology is examined, and its effectiveness is demonstrated.
Keywords: Bayesian inference, beta-binomial distribution, LPL criteria, quadratic loss function.
2366 A Posterior Predictive Model-Based Control Chart for Monitoring Healthcare
Authors: Yi-Fan Lin, Peter P. Howley, Frank A. Tuyl
Abstract:
Quality measurement and reporting systems are used in healthcare internationally. In Australia, the Australian Council on Healthcare Standards records and reports hundreds of clinical indicators (CIs) nationally across the healthcare system. These CIs are measures of performance in the clinical setting, and are used as a screening tool to help assess whether a standard of care is being met. Existing analysis and reporting of these CIs incorporate Bayesian methods to address sampling variation; however, such assessments are retrospective in nature, reporting upon the previous six or twelve months of data. The use of Bayesian methods within statistical process control for monitoring systems is an important pursuit to support more timely decision-making. Our research has developed and assessed a new graphical monitoring tool, similar to a control chart, based on the beta-binomial posterior predictive (BBPP) distribution to facilitate the real-time assessment of health care organizational performance via CIs. The BBPP charts have been compared with the traditional Bernoulli CUSUM (BC) chart by simulation. The more traditional “central” and “highest posterior density” (HPD) interval approaches were each considered to define the limits, and the multiple charts were compared via in-control and out-of-control average run lengths (ARLs), assuming that the parameter representing the underlying CI rate (proportion of cases with an event of interest) required estimation. Preliminary results have identified that the BBPP chart with HPD-based control limits provides better out-of-control run length performance than the central interval-based and BC charts. Further, the BC chart’s performance may be improved by using Bayesian parameter estimation of the underlying CI rate.
Keywords: Average run length, Bernoulli CUSUM chart, beta binomial posterior predictive distribution, clinical indicator, health care organization, highest posterior density interval.
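As a rough illustration of the beta-binomial posterior predictive (BBPP) distribution named in this abstract, the sketch below computes the predictive pmf for the number of events among the next m cases under a Beta(a, b) prior updated with past data; the greedy HPD-style limit search is an illustrative assumption, not the authors' charting procedure.

```python
from math import comb, exp, lgamma

def log_beta(a, b):
    # log of the Beta function via log-gamma
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def bbpp_pmf(k, m, a, b):
    # Beta-binomial pmf: probability of k events among the next m cases
    # when the event rate has a Beta(a, b) posterior
    return comb(m, k) * exp(log_beta(k + a, m - k + b) - log_beta(a, b))

def hpd_set(m, a, b, coverage=0.99):
    # Highest-density predictive set: keep outcomes in order of
    # decreasing probability until the target coverage is reached
    probs = sorted(((bbpp_pmf(k, m, a, b), k) for k in range(m + 1)), reverse=True)
    total, kept = 0.0, []
    for p, k in probs:
        kept.append(k)
        total += p
        if total >= coverage:
            break
    return sorted(kept)

# Example: Beta(1, 1) prior, 8 events in 200 past cases, chart the next 50 cases
a_post, b_post = 1 + 8, 1 + 200 - 8
print(hpd_set(50, a_post, b_post))
```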
2365 On the Parameter of the Burr Type X under Bayesian Principles
Authors: T. N. Sindhu, M. Aslam
Abstract:
A comprehensive Bayesian analysis has been carried out in the context of informative and non-informative priors for the shape parameter of the Burr type X distribution under different symmetric and asymmetric loss functions. Elicitation of the hyperparameters through a prior predictive approach is also discussed. We also derive expressions for the posterior predictive distributions, predictive intervals, and credible intervals. As an illustration, these estimators are compared through a simulation study.
Keywords: Credible intervals, loss functions, posterior predictive distributions, predictive intervals.
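For context, the Burr type X likelihood in the shape parameter is gamma-like, so a Gamma prior yields a closed-form posterior; the sketch below assumes a Gamma(alpha, beta) prior and squared error loss, which is only one of the prior/loss combinations the paper considers.

```python
import math

def burr_x_posterior(data, alpha=1.0, beta=1.0):
    # Burr type X: F(x; theta) = (1 - exp(-x^2))^theta for x > 0, so the
    # likelihood in theta is proportional to theta^n * exp(-theta * T),
    # and a Gamma(alpha, beta) prior gives a Gamma(alpha + n, beta + T) posterior.
    n = len(data)
    T = -sum(math.log(1.0 - math.exp(-x * x)) for x in data)
    return alpha + n, beta + T  # posterior shape and rate

shape, rate = burr_x_posterior([0.4, 0.9, 1.3, 0.7, 1.1])
print("Bayes estimate under squared error loss:", shape / rate)
```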
2364 Moment Estimators of the Parameters of Zero-One Inflated Negative Binomial Distribution
Authors: Rafid Saeed Abdulrazak Alshkaki
Abstract:
In this paper, the zero-one inflated negative binomial distribution is considered, along with some of its structural properties, and its parameters are estimated using the method of moments. It is found that the method of moments is not a suitable way to estimate the parameters of zero-one inflated negative binomial models and may lead to incorrect conclusions.
Keywords: Zero-one inflated models, negative binomial distribution, moment estimators, non-negative integer sampling.
2363 On Bayesian Analysis of Failure Rate under Topp Leone Distribution using Complete and Censored Samples
Abstract:
The article is concerned with the analysis of the failure rate (shape parameter) under the Topp Leone distribution using a Bayesian framework. Different loss functions and a couple of noninformative priors have been assumed for posterior estimation. The posterior predictive distributions have also been derived. A simulation study has been carried out to compare the performance of the different estimators. A real-life example has been used to illustrate the applicability of the results obtained. The findings of the study suggest that the precautionary loss function based on the Jeffreys prior and singly type II censored samples can effectively be employed to obtain the Bayes estimate of the failure rate under the Topp Leone distribution.
Keywords: loss functions, type II censoring, posterior distribution, Bayes estimators.
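As a sketch of the headline result for complete samples: under the Topp Leone model the likelihood in the shape parameter is gamma-like, the Jeffreys prior yields a Gamma posterior, and the precautionary-loss Bayes estimator has a closed form; censored samples, which the paper also treats, are not handled here.

```python
import math

def topp_leone_precautionary_estimate(data):
    # Topp Leone on (0, 1): F(x; v) = (2x - x^2)^v, so the likelihood in v is
    # proportional to v^n * exp(-v * T) with T = -sum(log(2x - x^2)).
    # Jeffreys prior pi(v) ~ 1/v gives a Gamma(n, T) posterior; under the
    # precautionary loss the Bayes estimator is sqrt(E[v^2]) = sqrt(n(n+1)) / T.
    n = len(data)
    T = -sum(math.log(2.0 * x - x * x) for x in data)
    return math.sqrt(n * (n + 1)) / T

print(topp_leone_precautionary_estimate([0.2, 0.5, 0.35, 0.6, 0.45]))
```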
2362 The Performance of Predictive Classification Using Empirical Bayes
Authors: N. Deetae, S. Sukparungsee, Y. Areepong, K. Jampachaisri
Abstract:
This research aims to compare the percentage of correct classification of the Empirical Bayes (EB) method with that of the classical method when data are constructed as near normal, short-tailed or long-tailed symmetric, and short-tailed or long-tailed asymmetric. The study is performed using a conjugate prior for the normal distribution with known mean and unknown variance. The estimated hyperparameters obtained from the EB method are substituted into the posterior predictive probability and used to predict new observations. Data are generated, consisting of a training set and a test set, with sample sizes of 100, 200 and 500 for binary classification. The results showed that the EB method exhibited improved performance over the classical method in all situations under study.
Keywords: Classification, Empirical Bayes, Posterior predictive probability.
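A minimal sketch of the posterior predictive classification the abstract describes, for a normal likelihood with known mean and unknown variance: an inverse-gamma prior on the variance gives a Student-t predictive density. The hyperparameters are fixed here for brevity, whereas the EB method would estimate them from the data.

```python
import math

def log_predictive(y, train, mu, alpha=1.0, beta=1.0):
    # Known mean mu, inverse-gamma(alpha, beta) prior on the variance:
    # the posterior predictive of a new y is Student-t with 2*a_n degrees
    # of freedom, location mu and squared scale b_n / a_n.
    n = len(train)
    a_n = alpha + 0.5 * n
    b_n = beta + 0.5 * sum((x - mu) ** 2 for x in train)
    scale2, df = b_n / a_n, 2.0 * a_n
    z2 = (y - mu) ** 2 / scale2
    return (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
            - 0.5 * math.log(df * math.pi * scale2)
            - (df + 1) / 2 * math.log1p(z2 / df))

# Binary classification: assign y to the class with the higher predictive density
class0, class1 = [0.1, -0.3, 0.2, 0.05], [1.8, 2.2, 2.0, 1.9]
y = 1.5
label = int(log_predictive(y, class1, mu=2.0) > log_predictive(y, class0, mu=0.0))
print("predicted class:", label)
```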
2361 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution over the various parameters and the latent function by using variational Bayesian approximations and importance sampling, and we then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than other approximation methods.
Keywords: Multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence.
2360 Wet Polymeric Precipitation Synthesis for Monophasic Tricalcium Phosphate
Authors: I. Grigoraviciute-Puroniene, K. Tsuru, E. Garskaite, Z. Stankeviciute, A. Beganskiene, K. Ishikawa, A. Kareiva
Abstract:
Tricalcium phosphate (β-Ca3(PO4)2, β-TCP) powders were synthesized using a wet polymeric precipitation method for, to the best of our knowledge, the first time. The results of X-ray diffraction analysis showed the formation of an almost single Ca-deficient hydroxyapatite (CDHA) phase of poor crystallinity already at room temperature. On increasing the calcination temperature up to 800 °C, crystalline β-TCP was obtained as the main phase. It was demonstrated that infrared spectroscopy is a very effective method for characterizing the formation of β-TCP. The SEM results showed that the β-TCP solids were homogeneous, with a narrow particle size distribution; the powders consisted of spherical particles varying in size from 100 to 300 nm. The fabricated β-TCP specimens were implanted into the bones of rats and maintained for 1-2 months.
Keywords: β-TCP, bone regeneration, wet chemical processing, polymeric precipitation.
2359 Human Action Recognition Using Variational Bayesian HMM with Dirichlet Process Mixture of Gaussian Wishart Emission Model
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we present a human action recognition method using a variational Bayesian HMM with a Dirichlet process mixture (DPM) of Gaussian-Wishart emission models (GWEM). First, we define the Bayesian HMM based on the Dirichlet process, which allows an infinite number of Gaussian-Wishart components to support continuous emission observations. Second, we consider an efficient variational Bayesian inference method that can be applied to derive the posterior distribution of the hidden variables and model parameters for the proposed model from training data. We then derive the predictive distribution that may be used to classify new actions. Third, the paper proposes a process for extracting appropriate spatio-temporal feature vectors that can be used to recognize a wide range of human behaviors from input video images. Finally, we conduct experiments to evaluate the performance of the proposed method. The experimental results show that the presented method is more effective for human action recognition than existing methods.
Keywords: Human action recognition, Bayesian HMM, Dirichlet process mixture model, Gaussian-Wishart emission model, Variational Bayesian inference, Prior distribution and approximate posterior distribution, KTH dataset.
2358 The Influence of Beta Shape Parameters in Project Planning
Authors: Alexios Kotsakis, Stefanos Katsavounis, Dimitra Alexiou
Abstract:
Networks can be utilized to represent project planning problems, using nodes for activities and arcs to indicate precedence relationships between them. For fixed activity durations, a simple algorithm calculates the amount of time required to complete a project and identifies the activities that comprise the critical path. The Program Evaluation and Review Technique (PERT) generalizes the above model by incorporating uncertainty, allowing activity durations to be random variables, nevertheless producing a relatively crude solution to planning problems. In this paper, based on the findings of the relevant literature, which strongly suggest that a Beta distribution can be employed to model earthmoving activities, we utilize Monte Carlo simulation to estimate the project completion time distribution and measure the influence of skewness, an element inherent in the activities of modern technical projects. We also extract the activity criticality index, with the ultimate goal of producing more accurate planning estimations.
Keywords: Beta distribution, PERT, Monte Carlo Simulation, skewness, project completion time distribution.
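A minimal Monte Carlo sketch of the approach described above, on a toy two-path network rather than the authors' earthmoving data: each activity duration is a scaled Beta draw (skewed shapes use a != b), and the criticality index is the fraction of runs in which an activity lies on the longest path.

```python
import random

# Toy network: durations are Beta(a, b) scaled to [low, high]
DURATIONS = {
    "A": (2, 5, 2, 8),   # right-skewed
    "B": (5, 2, 3, 10),  # left-skewed
    "C": (2, 2, 4, 9),   # symmetric
    "D": (3, 3, 1, 5),
}
PATHS = [["A", "B", "D"], ["A", "C", "D"]]  # A precedes B and C; both precede D

def simulate(n_runs=20000):
    completion = []
    criticality = {act: 0 for act in DURATIONS}
    for _ in range(n_runs):
        d = {act: lo + (hi - lo) * random.betavariate(a, b)
             for act, (a, b, lo, hi) in DURATIONS.items()}
        lengths = [sum(d[act] for act in path) for path in PATHS]
        completion.append(max(lengths))
        for act in PATHS[lengths.index(max(lengths))]:  # critical path activities
            criticality[act] += 1
    return completion, {act: c / n_runs for act, c in criticality.items()}

times, crit = simulate()
print("mean completion time:", sum(times) / len(times))
print("criticality indices:", crit)
```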
2357 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data
Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan
Abstract:
The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation acts as a motivating starting point. In this work, we extend NF neural networks to the case where an external predictor x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zN component is a high-dimensional independent Gaussian vector, which explains the variations in y not, or less, related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework, while significantly improving the interpretation of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations, due to factors such as lighting condition and subject id, from the other random variations. Further, the experiments show that an unconditional NF neural network, based on an unsupervised model of z, such as a Gaussian mixture, fails to generate interpretable results.
Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.
2356 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and predicted variables. Past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when dealing with large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured) and its variety, big data cannot be analyzed efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
Keywords: Predictive analysis, big data, predictive analysis algorithms, CART algorithm.
2355 Status Report of the GERDA Phase II Startup
Authors: Valerio D’Andrea
Abstract:
The GERmanium Detector Array (GERDA) experiment, located at the Laboratori Nazionali del Gran Sasso (LNGS) of INFN, searches for the neutrinoless double beta decay (0νββ) of 76Ge. Germanium diodes enriched to ∼86% in the double beta emitter 76Ge (enrGe) are exposed as both source and detector of the 0νββ decay. Neutrinoless double beta decay is considered a powerful probe to address still-open issues in the neutrino sector of the (beyond) Standard Model of particle physics. Since 2013, just after the completion of the first part of its experimental program (Phase I), the GERDA setup has been upgraded to perform its next step in the 0νββ searches (Phase II). Phase II aims to reach a sensitivity to the 0νββ decay half-life larger than 10^26 yr in about 3 years of physics data taking, exposing a detector mass of about 35 kg of enrGe with a background index of about 10^−3 cts/(keV·kg·yr). One of the main new implementations is the liquid argon scintillation light read-out, used to veto events that deposit their energy only partially in the Ge and partially in the surrounding LAr. In this paper, the GERDA Phase II goals, the upgrade work and a few selected features from the 2015 commissioning and 2016 calibration runs are presented. The main Phase I achievements are also reviewed.
Keywords: GERDA, double beta decay, germanium, LNGS.
2354 Variational EM Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we propose a variational EM inference algorithm for the multiclass Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can simultaneously derive both the posterior distribution of the latent function and estimators of the hyperparameters in a multiclass Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework, and is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using Bayes' formula and the LA technique, we derive an approximation to the posterior distribution of the latent function, which indicates the possibility that each observation belongs to a certain class. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyperparameters of the covariance matrix necessary to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to the human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
Keywords: Bayesian rule, Gaussian process classification model with multiclass, Gaussian process prior, human action classification, Laplace approximation, variational EM algorithm.
2353 Exergy Analysis of Combined Cycle of Air Separation and Natural Gas Liquefaction
Authors: Hanfei Tuo, Yanzhong Li
Abstract:
This paper presents a novel combined cycle for air separation and natural gas liquefaction. The idea is that natural gas can be liquefied while gaseous or liquid nitrogen and oxygen are produced in one combined cryogenic system. Cycle simulation and exergy analysis were performed to evaluate the process and thereby reveal the influence of the crucial parameter, the flow rate ratio β through the two-stage expanders, on the heat transfer temperature difference, its distribution, and the consequent exergy loss. Composite curves for the combined hot streams (feed natural gas and recycled nitrogen) and the cold stream showed the degree of optimization available in this process if an appropriate β is chosen. The results indicated that increasing β reduces the temperature difference and the exergy loss in the heat exchange process. However, the maximum value of β is constrained by the minimum temperature difference specified in heat exchanger design standards and by the heat exchanger size. The optimal value βopt under different operating conditions, corresponding to the required minimum temperature differences, was investigated.
Keywords: combined cycle simulation, exergy analysis, natural gas liquefaction.
2352 Time Series Forecasting Using a Hybrid RBF Neural Network and AR Model Based On Binomial Smoothing
Authors: Fengxia Zheng, Shouming Zhong
Abstract:
The hybrid ARIMA-ANN model, which combines the autoregressive integrated moving average (ARIMA) model with an artificial neural network (ANN), is a valuable tool for modeling and forecasting nonlinear time series, yet the over-fitting problem is more likely to occur in neural network models. This paper provides a hybrid methodology, called BS-RBFAR, that combines a radial basis function (RBF) neural network with an autoregressive (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing. The method is examined using the Canadian lynx data. Empirical results indicate that the over-fitting problem can be eased using an RBF neural network based on binomial smoothing (BS-RBF), and that the hybrid BS-RBFAR model can be an effective way to improve the forecasting accuracy achieved by BS-RBF used separately.
Keywords: Binomial smoothing (BS), hybrid, Canadian Lynx data, forecasting accuracy.
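Binomial smoothing is commonly implemented as a weighted moving average whose weights are normalized binomial coefficients; the sketch below uses that textbook form, which may differ in detail from the BS variant used in the paper.

```python
from math import comb

def binomial_smooth(series, order=4):
    # Centered moving average with binomial-coefficient weights
    # C(order, k) / 2^order, k = 0..order
    weights = [comb(order, k) / 2 ** order for k in range(order + 1)]
    half = order // 2
    smoothed = []
    for i in range(len(series)):
        acc, wsum = 0.0, 0.0
        for k, w in enumerate(weights):
            j = i + k - half
            if 0 <= j < len(series):  # renormalize at the series edges
                acc += w * series[j]
                wsum += w
        smoothed.append(acc / wsum)
    return smoothed

print(binomial_smooth([1.0, 3.0, 2.0, 5.0, 4.0, 6.0, 5.0]))
```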
2351 Computer Modeling of Drug Distribution after Intravitreal Administration
Authors: N. Haghjou, M. J. Abdekhodaie, Y. L. Cheng, M. Saadatmand
Abstract:
Intravitreal injection (IVI) is the most common treatment for eye posterior segment diseases such as endophthalmitis, retinitis, age-related macular degeneration, diabetic retinopathy, uveitis, and retinal detachment. Most of the drugs used to treat vitreoretinal diseases have a narrow concentration range in which they are effective, and may be toxic at higher concentrations. Therefore, it is critical to know the drug distribution within the eye following intravitreal injection. With knowledge of the drug distribution, ophthalmologists can decide on the drug injection frequency while minimizing damage to tissues. The goal of this study was to develop a computer model to predict the intraocular concentrations and pharmacokinetics of intravitreally injected drugs. A finite volume model was created to predict the distribution of two drugs with different physicochemical properties in the rabbit eye. The model parameters were obtained from a literature review. To validate this numerical model, the in vivo spatial concentration profiles from the lens to the retina were compared with the numerical data. The difference between the numerical and experimental data was less than 5%. This validation provides strong support for the numerical methodology and associated assumptions of the current study.
Keywords: Posterior segment, Intravitreal injection (IVI), Pharmacokinetic, Modelling, Finite volume method.
2350 Bayesian Decision Approach to Protection on the Flood Event in Upper Ayeyarwady River, Myanmar
Authors: Min Min Swe Zin
Abstract:
This paper introduces the foundations of Bayesian probability theory and the Bayesian decision method. The main goal of Bayesian decision theory is to minimize the expected loss of a decision, or equivalently to minimize the expected risk. The purposes of this study are to review the decision process on the issue of flood occurrences and to suggest a possible process for improving decisions. This study examines the problem structure of flood occurrences and theoretically explicates the decision-analytic approach, based on Bayesian decision theory, and its application to flood occurrences in environmental engineering. We discuss flood occurrences based on the annual maximum water level (in cm), a 43-year record available from 1965 to 2007 at the Sagaing gauging station on the Ayeyarwady River, with a drainage area of 120193 sq km, using the Bayesian decision method. As a result, we discuss the loss and risk of whether vast areas of agricultural land will be inundated in the coming year, based on the two standard maximum water levels over the 43 years, and we also forecast whether these lands will be safe from flood water during the next 10 years.
Keywords: Bayesian decision method, conditional binomial distribution, minimax rules, prior beta distribution.
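A minimal sketch of the beta-binomial machinery such a decision analysis rests on, assuming a Beta prior on the annual exceedance probability updated with x exceedance years out of n; the loss structure and decision rules of the paper are not reproduced.

```python
from math import exp, lgamma

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def prob_no_flood_next_m_years(x, n, m, a=1.0, b=1.0):
    # Beta(a, b) prior on the annual exceedance probability p, updated with
    # x exceedance years out of n. Under the posterior,
    # P(no exceedance in the next m years) = E[(1 - p)^m]
    # = B(a + x, b + n - x + m) / B(a + x, b + n - x).
    a_post, b_post = a + x, b + n - x
    return exp(log_beta(a_post, b_post + m) - log_beta(a_post, b_post))

# Example: 5 exceedances of a standard level in 43 years; 10-year outlook
print(prob_no_flood_next_m_years(x=5, n=43, m=10))
```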
2349 Multivariable Predictive PID Control for Quadruple Tank
Authors: Qamar Saeed, Vali Uddin, Reza Katebi
Abstract:
In this paper, a multivariable predictive PID controller has been implemented on a multi-input multi-output control problem, the quadruple tank system, and compared with a simple multiloop PI controller. One of the salient features of this system is an adjustable transmission zero, which can be tuned to operate in both minimum and non-minimum phase configurations through the flow distribution to the upper and lower tanks. Stability and performance analyses have also been carried out for this highly interactive two-input two-output system, in both minimum and non-minimum phases. Simulations of the control system revealed that better performance is obtained with the predictive PID design.
Keywords: Proportional-integral-derivative control, generalized predictive control, predictive PID control, multivariable systems.
2348 An EWMA p Chart Based On Improved Square Root Transformation
Authors: S. Sukparungsee
Abstract:
Generally, the traditional Shewhart p chart was developed for charting binomial data. This chart was derived using the normal approximation, under the conditions of a low defect level and a small to moderate sample size. Real applications, however, often depart from these assumptions due to skewness in the exact distribution. In this paper, a modified Exponentially Weighted Moving Average (EWMA) control chart for detecting a change in binomial data is proposed by improving the square root transformation, namely the ISRT p EWMA control chart. The numerical results show that the ISRT p EWMA chart is superior to the ISRT p chart for small to moderate shifts; otherwise, the latter is better for large shifts.
Keywords: Number of defects, Exponentially Weighted Moving Average, Average Run Length, Square root transformations.
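A minimal EWMA-on-transformed-counts sketch: the plain square root transform below is a variance-stabilizing stand-in, not the paper's improved (ISRT) transformation, and the in-control level is simply estimated from the series.

```python
import math

def ewma_sqrt_p_chart(counts, lam=0.2, L=3.0):
    # Square-root transform of binomial counts, then an EWMA with smoothing
    # constant lam and L-sigma limits; Var(z_t) uses the standard
    # time-varying EWMA variance s^2 * lam/(2-lam) * (1 - (1-lam)^(2t)).
    y = [math.sqrt(x) for x in counts]
    mu0 = sum(y) / len(y)  # in-control level, estimated from the data
    s2 = sum((v - mu0) ** 2 for v in y) / (len(y) - 1)
    z, rows = mu0, []
    for t, v in enumerate(y, start=1):
        z = lam * v + (1 - lam) * z
        limit = L * math.sqrt(s2 * lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        rows.append((t, round(z, 3), round(mu0 + limit, 3), abs(z - mu0) > limit))
    return rows  # (period, EWMA value, upper limit, signal?)

for row in ewma_sqrt_p_chart([3, 5, 4, 6, 4, 5, 9, 11, 12]):
    print(row)
```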
2347 Design of Smith-like Predictive Controller with Communication Delay Adaptation
Authors: Jasmin Velagic
Abstract:
This paper addresses the design of a predictive networked controller with adaptation to a communication delay. The networked control system contains random delays from sensor to controller and from controller to actuator. The proposed predictive controller includes an adaptation loop, which decreases the influence of the communication delay on the control performance. The predictive controller also contains a filter, which improves the robustness of the control system. The performance of the proposed adaptive predictive controller is demonstrated by simulation results in comparison with a PI controller and a predictive controller with constant delay.
Keywords: Predictive control, adaptation, communication delay, communication network.
2346 Beta-spline Surface Fitting to Multi-slice Images
Authors: Normi Abdul Hadi, Arsmah Ibrahim, Fatimah Yahya, Jamaludin Md. Ali
Abstract:
The Beta-spline is built on G2 continuity, which guarantees the smoothness of curves and surfaces generated with it. This curve is usually preferred for object design rather than reconstruction. This study, however, employs the Beta-spline in reconstructing a 3-dimensional G2 image of the Stanford Rabbit. The original data consist of multi-slice binary images of the rabbit. The result is then compared with related works using other techniques.
Keywords: Beta-spline, multi-slice image, rectangular surface, 3D reconstruction.
2345 Acceptance Single Sampling Plan with Fuzzy Parameter with The Using of Poisson Distribution
Authors: Ezzatallah Baloui Jamkhaneh, Bahram Sadeghpour-Gildeh, Gholamhossein Yari
Abstract:
The purpose of this paper is to present the acceptance single sampling plan when the fraction of nonconforming items is a fuzzy number, modeled based on the fuzzy Poisson distribution. We have shown that the operating characteristic (OC) curve of the plan is a band with upper and lower bounds whose width depends on the ambiguity of the proportion parameter in the lot when the sample size and acceptance number are fixed. Finally, we complete the discussion with a numerical example, and we compare the OC bands obtained using the binomial distribution with those obtained using the Poisson distribution.
Keywords: Statistical quality control, acceptance single sampling, fuzzy number.
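A minimal sketch of the OC band idea, summarizing the fuzzy fraction nonconforming by an interval [p, p + ambiguity] under the Poisson model; the paper's fuzzy-number formulation is richer than this interval shortcut.

```python
import math

def poisson_accept_prob(n, c, p):
    # P(accept lot) = P(X <= c) with X ~ Poisson(n * p)
    lam = n * p
    return sum(math.exp(-lam) * lam ** k / math.factorial(k) for k in range(c + 1))

def oc_band(n, c, p_values, ambiguity):
    # With the fraction nonconforming ranging over [p, p + ambiguity],
    # the OC curve widens into a band; the lower bound uses the
    # pessimistic end of the interval.
    return [(p, poisson_accept_prob(n, c, p + ambiguity),
                poisson_accept_prob(n, c, p)) for p in p_values]

# Example: sample size n = 100, acceptance number c = 2
for p, lo, hi in oc_band(100, 2, [0.005, 0.01, 0.02, 0.04], ambiguity=0.01):
    print(f"p = {p}: OC band [{lo:.4f}, {hi:.4f}]")
```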
2344 Statistical Analysis for Overdispersed Medical Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Many researchers have suggested the use of zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models for modeling overdispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB always provide a better fit than the standard Poisson and negative binomial models for such data. In this study, we propose the use of zero-inflated inverse trinomial (ZIIT), zero-inflated Poisson inverse Gaussian (ZIPIG) and zero-inflated strict arcsine (ZISA) models for modeling overdispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that these three models can serve as alternatives for modeling overdispersed medical count data, as supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian and strict arcsine are discrete distributions with a cubic variance function of the mean; therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails. They are recommended for modeling overdispersed medical count data when ZIP and ZINB are inadequate.
Keywords: Zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit.
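The zero-inflation device these models share is easiest to see in the familiar ZIP case, sketched below; the ZIIT, ZIPIG and ZISA models discussed in the abstract swap in heavier-tailed base distributions for the Poisson.

```python
import math

def zip_pmf(k, pi, lam):
    # Zero-inflated Poisson: with probability pi the count is a structural
    # zero; otherwise it comes from a Poisson(lam) distribution.
    base = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1 - pi) * base

# Excess zeros relative to a plain Poisson with the same rate
print(zip_pmf(0, pi=0.3, lam=2.0), math.exp(-2.0))
```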
2343 The Study of the Discrete Risk Model with Random Income
Authors: Peichen Zhao
Abstract:
In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability. Second, we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, we give two examples of ruin quantities to illustrate applications of the recursive formula and the asymptotic estimate of the penalty function.
Keywords: Discounted penalty function, compound binomial process, recursive formula, discrete renewal equation, asymptotic estimate.
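A Monte Carlo sketch of a discrete-time surplus process with random (Bernoulli) premium income, as one way to sanity-check recursive ruin formulas of this kind; the paper's exact premium process, ruin definition and horizon may differ.

```python
import random

def ruin_probability(u0, horizon, p_premium, q_claim, claim_sizes, n_sims=20000):
    # Each period: a premium of 1 arrives with probability p_premium (random
    # income); a claim occurs with probability q_claim, with size drawn
    # uniformly from claim_sizes. Ruin = surplus falling below zero.
    ruined = 0
    for _ in range(n_sims):
        u = u0
        for _ in range(horizon):
            u += random.random() < p_premium
            if random.random() < q_claim:
                u -= random.choice(claim_sizes)
            if u < 0:
                ruined += 1
                break
    return ruined / n_sims

print(ruin_probability(u0=5, horizon=200, p_premium=0.9, q_claim=0.3,
                       claim_sizes=[1, 2, 3]))
```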
2342 Nonlinear Model Predictive Control of Water Quality in Drinking Water Distribution Systems with DBPs Objectives
Authors: Mingyu Xie, Mietek Brdys
Abstract:
The paper develops a Non-Linear Model Predictive Controller (NMPC) for water quality in Drinking Water Distribution Systems (DWDS), based on an advanced non-linear quality dynamics model that includes disinfection by-products (DBPs). Special attention is paid to analyzing the impact of the flow trajectories prescribed by the upper control level of the recently developed two-time-scale architecture for integrated quality and quantity control in DWDS. The new quality controller is to operate within this architecture, in the fast time scale, as the lower level quality controller. The controller performance is validated by a comprehensive simulation study based on an example case-study DWDS.
Keywords: Model predictive control, hierarchical control structure, genetic algorithm, water quality with DBPs objectives.
2341 Synthesis of Wavelet Filters using Wavelet Neural Networks
Authors: Wajdi Bellil, Chokri Ben Amar, Adel M. Alimi
Abstract:
An application of Beta wavelet networks to synthesize high-pass and low-pass wavelet filters is investigated in this work. A Beta wavelet network is constructed using a parametric function called the Beta function in order to solve a nonlinear approximation problem. We combine filter design theory with wavelet network approximation to synthesize perfect reconstruction filters. The filter order is given by the number of neurons in the hidden layer of the neural network. In this paper, we use only the first derivative of the Beta function to illustrate the proposed design procedure and exhibit its performance.
Keywords: Beta wavelets, wavenet, multiresolution analysis, perfect filter reconstruction, salient point detection, repeatability.
2340 Effect of Geographical Co-Ordinates on the Parameters in the Rain Rate Model for Radio Propagation Applications
Authors: Olatinwo M. O., Oyeleke Olaosebikan, David Henry O.
Abstract:
Rain attenuation plays a major role in the design of satellite and terrestrial microwave radio links; hence, a good knowledge of its effect is of great interest to engineers and scientists, since a high level of accuracy is often required of the rain rate distribution, which expresses rain rates from the lowest to the highest value. This study proposes a model that expresses the rain rate parameters alpha (α) and beta (β) as functions of geographical location at 0.01% of the time. The tropical locations used in developing the model were Ilorin, Ile-Ife, Douala, Dar es Salaam, Nairobi, Lusaka, and Brasilia.
This expression clearly confirms the variability of rainfall from place to place. When a consistency test was carried out using the expression to generate rain rates for each location examined, the results obtained were reliable for rain intensities between 5 mm/h and 200 mm/h. The variability of α and β with latitude also shows that different latitudes have different cumulative rain distributions. The model proposed in this study should be a useful tool for radio engineers, since the precipitation effect is among the factors to consider when designing satellite and terrestrial microwave radio links.
Keywords: Rain rate, attenuation, geographical location.
2339 Bayesian Online Learning of Corresponding Points of Objects with Sequential Monte Carlo
Authors: Miika Toivanen, Jouko Lampinen
Abstract:
This paper presents an online method that learns the corresponding points of an object from un-annotated grayscale images containing instances of the object. In the first image processed, an ensemble of node points is automatically selected and then matched in the subsequent images. A Bayesian posterior distribution for the locations of the nodes in the images is formed. The likelihood is formed from Gabor responses, and the prior assumes the mean shape of the node ensemble to be similar in a translation- and scale-free space. An association model is applied to separate object nodes from background nodes. The posterior distribution is sampled with a Sequential Monte Carlo method. The matched object nodes are inferred to be the corresponding points of the object instances. The results show that our system matches the object nodes as accurately as other methods that train the model with annotated training images.
Keywords: Bayesian modeling, Gabor filters, online learning, Sequential Monte Carlo.
2338 Solving the Flexible Job Shop Scheduling Problem with Uniform Processing Time Uncertainty
Authors: Nasr Al-Hinai, Tarek Y. ElMekkawy
Abstract:
The performance of schedules released to a shop floor may be greatly affected by unexpected disruptions. Thus, this paper considers the flexible job shop scheduling problem when the processing times of some operations are represented by a uniform distribution with given lower and upper bounds. The objective is to find a predictive schedule that can deal with this uncertainty. The paper compares two genetic approaches for obtaining the predictive schedule. To determine the performance of the predictive schedules obtained by both approaches, an experimental study is conducted on a number of benchmark problems.
Keywords: Genetic algorithm, meta-heuristic, robust scheduling, uncertainty of processing times.