Search results for: negative binomial distribution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2686


2686 Moment Estimators of the Parameters of Zero-One Inflated Negative Binomial Distribution

Authors: Rafid Saeed Abdulrazak Alshkaki

Abstract:

In this paper, the zero-one inflated negative binomial distribution is considered along with some of its structural properties, and its parameters are estimated using the method of moments. It is found that the method of moments is not a suitable approach for estimating the parameters of zero-one inflated negative binomial models and may lead to incorrect conclusions.
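
As a point of reference for the method of moments in this setting, below is a minimal sketch for the ordinary (non-inflated) negative binomial, where the first two sample moments identify the two parameters; the zero-one inflated version treated in the paper adds inflation probabilities and therefore needs further moment equations. All names and data are illustrative.

```python
import numpy as np

def nb_moment_estimates(y):
    """Method-of-moments estimates (r, p) for an ordinary negative binomial sample,
    using E[Y] = r(1-p)/p and Var[Y] = r(1-p)/p^2.
    Illustrative only; the zero-one inflated NB needs additional moment equations."""
    y = np.asarray(y, dtype=float)
    m, v = y.mean(), y.var(ddof=1)
    if v <= m:
        raise ValueError("sample variance must exceed the mean (overdispersion)")
    p_hat = m / v             # ratio of the two moment equations
    r_hat = m ** 2 / (v - m)  # solve the two equations for r
    return r_hat, p_hat

# Example: data simulated from NB(r=3, p=0.4)
rng = np.random.default_rng(0)
print(nb_moment_estimates(rng.negative_binomial(3, 0.4, size=2000)))
```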

Keywords: Zero-one inflated models, negative binomial distribution, moment estimators, non-negative integer sampling.

2685 Estimation of Bayesian Sample Size for Binomial Proportions Using Areas P-tolerance with Lowest Posterior Loss

Authors: H. Bevrani, N. Najafi

Abstract:

This paper uses p-tolerance with the lowest posterior loss, a quadratic loss function, the average length criterion, the average coverage criterion, and the worst outcome criterion to compute the sample size needed to estimate a proportion under a binomial likelihood with a Beta prior distribution. The proposed methodology is examined, and its effectiveness is demonstrated.

Keywords: Bayesian inference, beta-binomial distribution, LPL criteria, quadratic loss function.

2684 A Note on Negative Hypergeometric Distribution and Its Approximation

Authors: S. B. Mansuri

Abstract:

In this paper, we first describe the negative hypergeometric distribution and its properties. We then use the w-function and the Stein identity to give a result on the Poisson approximation to the negative hypergeometric distribution, in terms of the total variation distance between the negative hypergeometric and Poisson distributions and an upper bound on that distance.
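
Such an approximation can also be checked numerically. The sketch below uses SciPy's nhypergeom distribution (available in recent SciPy releases) with arbitrary illustrative parameters, not the Stein-Chen bound derived in the paper, and computes the total variation distance against a Poisson law with matching mean.

```python
import numpy as np
from scipy import stats

# Illustrative parameters: M objects in total, n of them "successes",
# sampling without replacement stops after r failures.
M, n, r = 200, 10, 5
nhg = stats.nhypergeom(M, n, r)
pois = stats.poisson(nhg.mean())   # Poisson law with the same mean

# Total variation distance: 0.5 * sum_k |P(X = k) - P(Y = k)|
ks = np.arange(0, n + 51)          # covers both supports (Poisson tail is negligible here)
tv = 0.5 * np.abs(nhg.pmf(ks) - pois.pmf(ks)).sum()
print(f"total variation distance ~ {tv:.4f}")
```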

Keywords: Negative hypergeometric distribution, Poisson distribution, Poisson approximation, Stein-Chen identity, w-function.

2683 Statistical Analysis for Overdispersed Medical Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Many researchers have suggested the use of zero inflated Poisson (ZIP) and zero inflated negative binomial (ZINB) models for overdispersed medical count data with extra variation caused by excess zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB consistently provide a better fit than the ordinary Poisson and negative binomial models. In this study, we propose the use of the zero inflated inverse trinomial (ZIIT), zero inflated Poisson inverse Gaussian (ZIPIG) and zero inflated strict arcsine (ZISA) models for overdispersed medical count data; these models are not widely used, especially in the medical field. Inverse trinomial, Poisson inverse Gaussian and strict arcsine are discrete distributions with a cubic variance function of the mean, so ZIIT, ZIPIG and ZISA can accommodate data with excess zeros and very heavy tails. The results, supported by an application to a real-life medical data set, show that the three suggested models can serve as alternatives for modeling overdispersed medical count data and are recommended when ZIP and ZINB are inadequate.
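
For the two benchmark models mentioned above, a minimal fitting sketch with statsmodels might look as follows; the proposed ZIIT, ZIPIG and ZISA models are not available in standard libraries and would need custom likelihoods, and the data and column names here are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import (ZeroInflatedPoisson,
                                               ZeroInflatedNegativeBinomialP)

# Placeholder data: 'visits' is an overdispersed medical count with extra zeros
rng = np.random.default_rng(1)
df = pd.DataFrame({"age": rng.normal(50, 10, 500),
                   "severity": rng.integers(1, 5, 500)})
mu = np.exp(0.02 * df["age"] + 0.3 * df["severity"] - 1.0)
df["visits"] = np.where(rng.random(500) < 0.3, 0, rng.poisson(mu))

X = sm.add_constant(df[["age", "severity"]])
infl = np.ones((len(df), 1))   # intercept-only inflation part
zip_fit = ZeroInflatedPoisson(df["visits"], X, exog_infl=infl).fit(disp=0)
zinb_fit = ZeroInflatedNegativeBinomialP(df["visits"], X, exog_infl=infl,
                                         p=2).fit(disp=0, maxiter=200)
print("ZIP AIC:", zip_fit.aic, " ZINB AIC:", zinb_fit.aic)
```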

Keywords: Zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit.

2682 Using Artificial Neural Network to Predict Collisions on Horizontal Tangents of 3D Two-Lane Highways

Authors: Omer F. Cansiz, Said M. Easa

Abstract:

The purpose of this study is to predict collision frequency on horizontal tangents combined with vertical curves using artificial neural network (ANN) methods, and to compare the proposed ANN models with existing regression models. First, the variables that affect collision frequency were investigated; only the annual average daily traffic, section length, access density, rate of vertical curvature, and the smaller curve radius before and after the tangent were statistically significant for the relevant combinations. Second, three statistical models (negative binomial, zero inflated Poisson and zero inflated negative binomial) were developed using the significant variables for three alignment combinations. Third, ANN models were developed using the same variables for each combination. The results clearly show that the ANN models have lower mean square error values than the statistical models. Similarly, the AIC values of the ANN models are smaller than those of the regression models for all combinations. Consequently, the ANN models have better statistical performance than the regression models for estimating collision frequency. The ANN models presented in this paper are recommended for evaluating the safety impacts of 3D alignment elements on horizontal tangents.

Keywords: Collision frequency, horizontal tangent, 3D two-lane highway, negative binomial, zero inflated Poisson, artificial neural network.

2681 Modelling Dengue Fever (DF) and Dengue Haemorrhagic Fever (DHF) Outbreak Using Poisson and Negative Binomial Model

Authors: W. Y. Wan Fairos, W. H. Wan Azaki, L. Mohamad Alias, Y. Bee Wah

Abstract:

Dengue fever has become a major concern for health authorities all over the world, particularly in tropical countries, which are experiencing the most worrying outbreaks of dengue fever (DF) and dengue haemorrhagic fever (DHF). DF and DHF epidemics have become leading causes of hospital admissions and deaths in Malaysia. This paper therefore examines the environmental factors that may influence the recent dengue outbreaks. The aim of this study is twofold: first, to establish a statistical model describing the relationship between the number of dengue cases and a range of explanatory variables, and second, to identify the lags at which the explanatory variables affect dengue incidence the most. The explanatory variables are cloud cover, percentage of relative humidity, amount of rainfall, maximum temperature, minimum temperature and wind speed. Poisson and negative binomial regression analyses were used. The results, based on 915 daily observations from July 2006 to December 2008, reveal that daily temperature and wind speed significantly influence the incidence of dengue fever two and three weeks after their occurrence, whereas the effect of humidity appears to be significant only after two weeks.
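
A rough sketch of this kind of lag analysis with Poisson and negative binomial GLMs in statsmodels is shown below; the file and column names are hypothetical stand-ins for the study's data.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical daily data with columns: cases, max_temp, min_temp, wind, humidity, rain, cloud
df = pd.read_csv("dengue_daily.csv")

# Lag the climate variables by 14 and 21 days (2 and 3 weeks)
for var in ["max_temp", "min_temp", "wind", "humidity", "rain", "cloud"]:
    df[f"{var}_lag14"] = df[var].shift(14)
    df[f"{var}_lag21"] = df[var].shift(21)
df = df.dropna()

formula = ("cases ~ max_temp_lag14 + min_temp_lag21 + wind_lag21 + "
           "humidity_lag14 + rain_lag14 + cloud_lag14")
poisson_fit = smf.glm(formula, data=df, family=sm.families.Poisson()).fit()
# alpha is fixed here for illustration; statsmodels' discrete NegativeBinomial model can estimate it
negbin_fit = smf.glm(formula, data=df,
                     family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(poisson_fit.summary())
print("Poisson AIC:", poisson_fit.aic, " NB AIC:", negbin_fit.aic)
```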

Keywords: Dengue Fever, Dengue Hemorrhagic Fever, Negative Binomial Regression model, Poisson Regression model.

2680 Air Pollution and Respiratory-Related Restricted Activity Days in Tunisia

Authors: Mokhtar Kouki, Inès Rekik

Abstract:

This paper assesses the relationship between air pollution and morbidity in Tunisia. Air pollution is measured by the ambient ozone concentration, and morbidity is measured by the number of respiratory-related restricted activity days during the two-week period prior to the interview. Socioeconomic data are also collected in order to adjust for confounding covariates. Our sample is composed of 407 Tunisian respondents; 44.7% are women, the average age is 35.2, nearly 69% live in a house built after 1980, and 27.8% reported at least one day of respiratory-related restricted activity. The model consists of a regression of the number of respiratory-related restricted activity days on the air quality measure and the socioeconomic covariates. In order to account for zero inflation and heterogeneity, we estimate several models (Poisson, negative binomial, zero inflated Poisson, Poisson hurdle, negative binomial hurdle and finite mixture Poisson). Bootstrapping and post-stratification techniques are used to correct for sample bias. According to the Akaike information criterion, the hurdle negative binomial model has the best goodness of fit. The main result indicates that, after adjusting for socioeconomic covariates, the ozone concentration increases the probability of a positive number of restricted activity days.
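
One way to see what the winning hurdle negative binomial involves is to write it as two independent parts: a logit model for zero versus positive counts and a zero-truncated negative binomial for the positive counts. The following is a simplified, self-contained sketch with a single covariate and simulated data, not the paper's specification.

```python
import numpy as np
from scipy import stats, optimize
import statsmodels.api as sm

def fit_hurdle_nb(y, x):
    """Two-part hurdle: logit for P(y > 0), zero-truncated NB2 for y | y > 0."""
    X = sm.add_constant(x)

    # Part 1: hurdle (zero vs. positive)
    logit_fit = sm.Logit((y > 0).astype(int), X).fit(disp=0)

    # Part 2: zero-truncated NB2 on the positive counts
    Xp, yp = X[y > 0], y[y > 0]

    def negloglik(params):
        beta, alpha = params[:-1], np.exp(params[-1])
        mu = np.exp(Xp @ beta)
        n, p = 1.0 / alpha, 1.0 / (1.0 + alpha * mu)
        # truncated log-likelihood: log f(y) - log(1 - f(0))
        ll = stats.nbinom.logpmf(yp, n, p) - np.log1p(-stats.nbinom.pmf(0, n, p))
        return -ll.sum()

    res = optimize.minimize(negloglik, np.zeros(X.shape[1] + 1), method="L-BFGS-B")
    k = len(res.x) + len(logit_fit.params)
    loglik = -res.fun + logit_fit.llf
    return {"aic": 2 * k - 2 * loglik, "logit": logit_fit, "count_params": res.x}

# Simulated placeholder data (not the ozone data set)
rng = np.random.default_rng(2)
x = rng.normal(size=400)
y = np.where(rng.random(400) < 0.4, 0, rng.negative_binomial(2, 0.4, 400) + 1)
print("hurdle NB AIC:", fit_hurdle_nb(y, x)["aic"])
```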

Keywords: Bootstrapping, hurdle negbin model, overdispersion, ozone concentration, respiratory-related restricted activity days.

2679 A Study on Exclusive Breastfeeding using Over-dispersed Statistical Models

Authors: Naushad Mamode Khan, Cheika Jahangeer, Maleika Heenaye-Mamode Khan

Abstract:

Breastfeeding is an important part of maternal life. In this paper, we focus on exclusive breastfeeding, the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding is very important during the first six months because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and other problems. In Mauritius, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. We give an overview of exclusive breastfeeding in Mauritius and the factors influencing it, and we analyze local exclusive breastfeeding practices using the generalized Poisson regression model and the negative binomial model, since the data are over-dispersed.

Keywords: Exclusive breastfeeding, regression model, generalized Poisson, negative binomial.

2678 Child Homicide Victimization and Community Context: A Research Note

Authors: Bohsiu Wu

Abstract:

Among serious crimes, child homicide is a rather rare event. However, the killing of children stirs up an emotion in society that makes other criminal acts pale in comparison. This study examines the relevance of three possible community-level explanations for child homicide: social deprivation, female empowerment, and social isolation. The social deprivation hypothesis posits that child homicide results from a lack of resources in communities. The female empowerment hypothesis argues that a higher female status translates into a greater capability to prevent child homicide. Finally, the social isolation hypothesis regards child homicide as a result of a lack of social connectivity. Child homicide data, aggregated by US postal ZIP codes in California from 1990 to 1999, were analyzed with a negative binomial regression. The results demonstrate that social deprivation is the most salient and consistent predictor of child homicide victimization at the ZIP-code level, while social isolation and female labor force participation are weak predictors across communities. Further, the negative binomial regression shows that communities with a higher, not lower, degree of female labor force participation are associated with higher counts of child homicide; it is possible that poor communities with higher female employment have less capacity to provide the necessary care and protection for children. Policies aimed at reducing social deprivation and strengthening female empowerment have the potential to reduce child homicide in the community.

Keywords: Child homicide, deprivation, empowerment, isolation.

2677 Time Series Forecasting Using a Hybrid RBF Neural Network and AR Model Based On Binomial Smoothing

Authors: Fengxia Zheng, Shouming Zhong

Abstract:

ANN-ARIMA, which combines the autoregressive integrated moving average (ARIMA) model with an artificial neural network (ANN), is a valuable tool for modeling and forecasting nonlinear time series, yet the over-fitting problem is more likely to occur in neural network models. This paper provides a hybrid methodology, called BS-RBFAR, that combines a radial basis function (RBF) neural network with an autoregressive (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing. The method is examined using the Canadian lynx data. Empirical results indicate that the over-fitting problem can be eased by using an RBF neural network based on binomial smoothing (BS-RBF), and that the hybrid BS-RBFAR model can be an effective way to improve the forecasting accuracy achieved by BS-RBF alone.
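
The binomial smoothing step itself amounts to convolving the series with normalized binomial-coefficient weights. The sketch below shows BS followed by an AR fit on a synthetic cyclical series standing in for the (log-scaled) Canadian lynx data; it is our own illustration of the idea, not the authors' exact BS-RBFAR pipeline.

```python
import numpy as np
from scipy.special import comb
from statsmodels.tsa.ar_model import AutoReg

def binomial_smooth(series, order=2):
    """Smooth a series by convolving with normalized binomial coefficients
    (order=2 gives the familiar [1, 2, 1] / 4 kernel)."""
    weights = np.array([comb(order, k) for k in range(order + 1)], dtype=float)
    weights /= weights.sum()
    return np.convolve(series, weights, mode="valid")

# Synthetic cyclical series in place of the log-scaled lynx record
rng = np.random.default_rng(3)
t = np.arange(114)
series = 3.0 + 0.8 * np.sin(2 * np.pi * t / 9.6) + rng.normal(0, 0.2, t.size)

smoothed = binomial_smooth(series, order=2)
ar_fit = AutoReg(smoothed, lags=2).fit()
forecast = ar_fit.predict(start=len(smoothed), end=len(smoothed) + 9)
print(ar_fit.params)
print(forecast)
```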

Keywords: Binomial smoothing (BS), hybrid, Canadian Lynx data, forecasting accuracy.

2676 An EWMA p Chart Based On Improved Square Root Transformation

Authors: S. Sukparungsee

Abstract:

The traditional Shewhart p chart was developed for charting binomial data using the normal approximation, under the conditions of a low defect level and a small to moderate sample size. Real applications, however, often depart from these assumptions because of skewness in the exact distribution. This paper proposes a modified Exponentially Weighted Moving Average (EWMA) control chart for detecting a change in binomial data based on an improved square root transformation, namely the ISRT p EWMA control chart. The numerical results show that the ISRT p EWMA chart is superior to the ISRT p chart for small to moderate shifts, whereas the latter is better for large shifts.
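
A bare-bones EWMA recursion on square-root-transformed counts is sketched below; the centre line and variance are estimated from a Phase I sample, and the specific improved square root transformation and control-limit constants of the paper are not reproduced.

```python
import numpy as np

def ewma_chart(counts, lam=0.2, L=3.0, phase1=50):
    """EWMA chart on square-root-transformed binomial counts (generic sketch)."""
    y = np.sqrt(np.asarray(counts, dtype=float))   # simple variance-stabilising transform
    mu0, sigma0 = y[:phase1].mean(), y[:phase1].std(ddof=1)

    z = np.empty_like(y)
    z_prev = mu0
    for t, yt in enumerate(y):
        z_prev = lam * yt + (1 - lam) * z_prev     # EWMA recursion
        z[t] = z_prev

    t_idx = np.arange(1, len(y) + 1)
    half = L * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t_idx)))
    return z, mu0 - half, mu0 + half

# Example: defect counts from Binomial(n=500, p=0.02) with a late upward shift
rng = np.random.default_rng(4)
x = np.r_[rng.binomial(500, 0.02, 80), rng.binomial(500, 0.035, 20)]
z, lcl, ucl = ewma_chart(x)
print("signals at samples:", np.where((z > ucl) | (z < lcl))[0])
```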

Keywords: Number of defects, Exponentially Weighted Moving Average, Average Run Length, Square root transformations.

2675 Acceptance Single Sampling Plan with Fuzzy Parameter with The Using of Poisson Distribution

Authors: Ezzatallah Baloui Jamkhaneh, Bahram Sadeghpour-Gildeh, Gholamhossein Yari

Abstract:

The purpose of this paper is to present the acceptance single sampling plan when the fraction of nonconforming items is a fuzzy number, modeled with the fuzzy Poisson distribution. We show that, for a fixed sample size and acceptance number, the operating characteristic (OC) curve of the plan becomes a band with upper and lower bounds whose width depends on the ambiguity of the proportion parameter in the lot. The discussion is completed with a numerical example, and the OC bands obtained with the binomial distribution are compared with those obtained with the Poisson distribution.
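
The OC band idea can be reproduced numerically: for a fixed sample size n and acceptance number c, the acceptance probability under the Poisson model is P(X <= c) with X ~ Poisson(np), and an interval-valued (fuzzy) fraction nonconforming yields an upper and a lower OC curve. The values below are illustrative, not those of the paper's example.

```python
import numpy as np
from scipy.stats import poisson, binom

n, c = 80, 2                       # sample size and acceptance number (illustrative)
p_grid = np.linspace(0.0, 0.10, 101)
spread = 0.01                      # half-width of the fuzzy fraction nonconforming

# Poisson-based OC band: a lower p gives a higher acceptance probability
oc_upper = poisson.cdf(c, n * np.clip(p_grid - spread, 0, None))
oc_lower = poisson.cdf(c, n * (p_grid + spread))

# Corresponding binomial-based band, for the comparison made in the paper
oc_upper_bin = binom.cdf(c, n, np.clip(p_grid - spread, 0, 1))
oc_lower_bin = binom.cdf(c, n, np.clip(p_grid + spread, 0, 1))

i = 50  # index of p = 0.05 on the grid
print("Poisson OC band width at p = 0.05:", oc_upper[i] - oc_lower[i])
print("Binomial OC band width at p = 0.05:", oc_upper_bin[i] - oc_lower_bin[i])
```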

Keywords: Statistical quality control, acceptance single sampling, fuzzy number.

2674 The Study of the Discrete Risk Model with Random Income

Authors: Peichen Zhao

Abstract:

In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability. Second, we examine the expected discounted penalty function and show that it satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, we give two examples of ruin quantities to illustrate applications of the recursive formula and of the asymptotic estimate for the penalty function.

Keywords: Discounted penalty function, compound binomial process, recursive formula, discrete renewal equation, asymptotic estimate.

2673 Study on the Effect of Road Infrastructure, Socio-Economic and Demographic Features on Road Crashes in Bangladesh

Authors: Shakil M. Rifaat, Md. H. Rahman, Mohammed Mosabbir Pasha

Abstract:

Road crashes not only claim lives and inflict injuries but also impose an economic burden on society through lost productivity. The problem of deaths and injuries resulting from road traffic crashes is now acknowledged to be a global phenomenon, with authorities in virtually all countries concerned about the growth in the number of people killed and seriously injured on their roads. However, the road crash situation in a developing country like Bangladesh is much worse than that of developed countries. Developing proper countermeasures requires identifying the factors affecting crash occurrence. The objective of this study is to examine the effect of district-wise road infrastructure, socioeconomic and demographic features on crash occurrence, taking the individual district as the unit of analysis, which has not been explored much in the past. Reported crash data obtained from the Bangladesh Road Transport Authority (BRTA) for the years 2004 to 2010 are used to develop a negative binomial model. The model reveals the effect of road length (both paved and unpaved), road infrastructure and several socioeconomic characteristics on district-level crash frequency in Bangladesh.

Keywords: Demographic, Negative Binomial Model, Road Infrastructure, Socio-economic, Traffic Safety.

2672 On the Use of Correlated Binary Model in Social Network Analysis

Authors: Elsayed A. Habib Elamir

Abstract:

In social network analysis, the mean nodal degree and the density of the graph can be considered measures of the activity of all actors in the network; they are important properties of a graph and useful for making comparisons among networks. Since subjects in a family or organization are exposed to common environmental factors, the association between their responses is of prime interest. We therefore study the distribution of the mean nodal degree and the density of the graph under correlated binary units, using the cross product ratio to capture the intra-unit association among subjects. A computer program and an application are given to show the benefits of the method.

Keywords: Correlated binary data, cross product ratio, density of the graph, multiplicative binomial distribution.

2671 Fermat’s Last Theorem a Simple Demonstration

Authors: Jose William Porras Ferreira

Abstract:

This paper presents two solutions to Fermat's Last Theorem (FLT). The first uses some algebraic results related to the Pythagorean theorem, the expression of the equations and an analysis of their behavior when the powers are compared, and, with the Well Ordering Principle of the natural numbers, a demonstration for the Fermat equation is obtained. The second uses the connection between the Fermat equation and Pascal's triangle, i.e. Newton's binomial coefficients, where the Fermat equation does not satisfy the first coefficient; hence it is impossible that:

z^n = x^n + y^n, for n > 2 and (x, y, z) ∈ Z+ \ {0}.

 

Keywords: Fermat’s Last Theorem, Pythagorean Theorem, Newton Binomial Coefficients, Pascal’s Triangle, Well Ordering Principle.

2670 Base Change for Fisher Metrics: Case of the q−Gaussian Inverse Distribution

Authors: Gabriel I. Loaiza O., Carlos A. Cadavid M., Juan C. Arango P.

Abstract:

It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has negative constant curvature κ = −1/2, as does the family of usual Gaussian distributions. In the present paper, we first arrive at this result by following a different path, much simpler than the previous ones: we put the family in exponential form, thus endowing it with a new set of parameters, or coordinates, θ1, θ2; we determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the inverse q-Gaussian distribution family (q < 3) as the family obtained by replacing the usual exponential function with the Tsallis q-exponential function in the expression for the inverse Gaussian distribution, and observe that it supports two possible geometries, the Fisher and the q-Fisher geometry. Finally, we apply our strategy to obtain results about the Fisher and q-Fisher geometry of the inverse q-Gaussian distribution family, similar to the ones obtained for the inverse Gaussian distribution family.
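
For reference, in the usual (μ, λ) parametrization the inverse Gaussian density and its Fisher information matrix take the following textbook form, which is the starting point of the curvature computation (the paper's own derivation proceeds through exponential-family coordinates):

```latex
f(x;\mu,\lambda)=\sqrt{\frac{\lambda}{2\pi x^{3}}}\,
  \exp\!\left(-\frac{\lambda(x-\mu)^{2}}{2\mu^{2}x}\right),\qquad x>0,
\qquad
I(\mu,\lambda)=\begin{pmatrix}\lambda/\mu^{3} & 0\\ 0 & 1/(2\lambda^{2})\end{pmatrix}.
```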

Keywords: Base of Changes, Information Geometry, Inverse Gaussian distribution, Inverse q-Gaussian distribution, Statistical Manifolds.

2669 A Posterior Predictive Model-Based Control Chart for Monitoring Healthcare

Authors: Yi-Fan Lin, Peter P. Howley, Frank A. Tuyl

Abstract:

Quality measurement and reporting systems are used in healthcare internationally. In Australia, the Australian Council on Healthcare Standards records and reports hundreds of clinical indicators (CIs) nationally across the healthcare system. These CIs are measures of performance in the clinical setting, and are used as a screening tool to help assess whether a standard of care is being met. Existing analysis and reporting of these CIs incorporate Bayesian methods to address sampling variation; however, such assessments are retrospective in nature, reporting upon the previous six or twelve months of data. The use of Bayesian methods within statistical process control for monitoring systems is an important pursuit to support more timely decision-making. Our research has developed and assessed a new graphical monitoring tool, similar to a control chart, based on the beta-binomial posterior predictive (BBPP) distribution to facilitate the real-time assessment of health care organizational performance via CIs. The BBPP charts have been compared with the traditional Bernoulli CUSUM (BC) chart by simulation. The more traditional “central” and “highest posterior density” (HPD) interval approaches were each considered to define the limits, and the multiple charts were compared via in-control and out-of-control average run lengths (ARLs), assuming that the parameter representing the underlying CI rate (proportion of cases with an event of interest) required estimation. Preliminary results have identified that the BBPP chart with HPD-based control limits provides better out-of-control run length performance than the central interval-based and BC charts. Further, the BC chart’s performance may be improved by using Bayesian parameter estimation of the underlying CI rate.
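
A minimal sketch of the BBPP construction with SciPy is given below: a Beta(a, b) prior on the CI rate is updated with past cases, the beta-binomial posterior predictive is formed for the next batch, and an HPD-style region over its discrete support serves as the chart limits. All numbers are hypothetical, and the HPD search is a simple enumeration rather than the authors' implementation.

```python
import numpy as np
from scipy.stats import betabinom

# Beta(a, b) prior on the clinical indicator rate, updated with s events in m past cases
a, b = 1.0, 1.0
s, m = 12, 400
a_post, b_post = a + s, b + m - s

# Beta-binomial posterior predictive for the number of events among the next n cases
n = 50
pred = betabinom(n, a_post, b_post)

def hpd_region(dist, n, mass=0.99):
    """Smallest set of support points capturing `mass` predictive probability;
    its extremes are used as chart limits."""
    ks = np.arange(n + 1)
    pmf = dist.pmf(ks)
    order = np.argsort(pmf)[::-1]                            # most probable counts first
    cutoff = np.searchsorted(np.cumsum(pmf[order]), mass) + 1
    keep = ks[order[:cutoff]]
    return keep.min(), keep.max()

lcl, ucl = hpd_region(pred, n)
print(f"BBPP chart limits for the next {n} cases: [{lcl}, {ucl}]")
# A batch whose event count falls outside [lcl, ucl] would signal a change in the CI rate.
```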

Keywords: Average run length, Bernoulli CUSUM chart, beta binomial posterior predictive distribution, clinical indicator, health care organization, highest posterior density interval.

2668 Evaluating Efficiency of Nina Distribution Company Using Window Data Envelopment Analysis and Malmquist Index

Authors: Hossein Taherian Far, Ali Bazaee

Abstract:

Achieving sustained economic growth, and with it economic development, is a target for all countries, and the distribution industry plays an important role in the growth and development of any nation, so estimating the efficiency and productivity of this industry and identifying the factors influencing it is necessary. The objective of the present study is to measure the efficiency and productivity of seven branches of Nina Distribution Company using window data envelopment analysis and the Malmquist productivity index from spring 2013 to summer 2015. Fixed assets, payroll personnel, operating costs and duration of collection of receivables were selected as inputs, and net sales, gross profit and percentage of coverage to customers were selected as outputs. Window data envelopment analysis was then performed, and productivity change was measured using the Malmquist index. The results indicate that the average technical efficiency in the window Data Envelopment Analysis (DEA) model follows a fluctuating but sustainable trend, whereas the average management efficiency in the window DEA model shows negative growth (a decline) of about 13%. The mean scale efficiency in all windows, except the second one, which faces a decline of 8%, shows growth of 18% compared to the first window. On the other hand, the mean change in total factor productivity across all branches shows an average negative growth (decrease) of 12%, which is the result of a negative change in technology.

Keywords: Nina Distribution Company branches, window data envelopment analysis, Malmquist productivity index.

2667 Negative Temperature Dependence of a Gravity - A Reality

Authors: Alexander L. Dmitriev, Sophia A. Bulgakova

Abstract:

The temperature dependence of the force of gravitation is one of the fundamental problems of physics. The problem has special importance because the general theory of relativity, which predicts only a very weak positive influence of a body's temperature on its weight, effectively rules out the possibility of measuring a negative influence of temperature on gravity under laboratory conditions. Indeed, recognizing a negative temperature dependence of gravitation would mean, for example, that reaching a singularity (a "black hole") in a gravitational collapse is fundamentally impossible. Laboratory experiments with precise weighing of heated metal samples, indicating a negative influence of body temperature on physical weight, are described, and the influence of measurement errors is analyzed. Calculations of the temperature distribution in the volume of the bar, consistent with the experimental time dependence of the sample weights, are carried out. A physical substantiation of the negative temperature dependence of body weight, based on the correlation between the acceleration of the thermal motion of a body's micro-particles and its absolute temperature, is given.

Keywords: Gravitation, temperature, weight.

2666 Bayesian Decision Approach to Protection on the Flood Event in Upper Ayeyarwady River, Myanmar

Authors: Min Min Swe Zin

Abstract:

This paper introduces the foundations of Bayesian probability theory and the Bayesian decision method, whose main goal is to minimize the expected loss, or risk, of a decision. The purposes of this study are to review the decision process for flood occurrences and to suggest possible improvements to it. The study examines the problem structure of flood occurrences and explicates the decision-analytic approach based on Bayesian decision theory and its application to flood occurrences in environmental engineering. We analyze flood occurrences using the annual maximum water level (in cm) from a 43-year record, available from 1965 to 2007, at the Sagaing gauging station on the Ayeyarwady River, with a drainage area of 120,193 sq km, using the Bayesian decision method. Based on two standard maximum water levels over the 43 years, we discuss the loss and risk associated with vast areas of agricultural land being inundated, or not, in the coming year, and we also forecast whether those lands will be safe from flood water during the next 10 years.
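
The decision calculation in such a setting can be sketched with a beta-binomial model: count how many recorded years exceeded the chosen water level, update a Beta prior on the annual exceedance probability, and choose the action with the smaller expected loss. The counts, prior and loss values below are placeholders, not the Sagaing record.

```python
# Placeholder data: k of the n recorded years exceeded the critical water level
n_years, k_exceed = 43, 9
a0, b0 = 1.0, 1.0                                  # Beta prior on the exceedance probability
a_post, b_post = a0 + k_exceed, b0 + (n_years - k_exceed)

# Posterior-mean (predictive) probability that next year's maximum exceeds the level
p_flood = a_post / (a_post + b_post)

# Hypothetical losses in the same units: protecting always costs C,
# doing nothing costs D only if the flood occurs
C, D = 10.0, 120.0
expected_loss = {"protect": C, "do_nothing": p_flood * D}
decision = min(expected_loss, key=expected_loss.get)
print(p_flood, expected_loss, decision)

# Chance of at least one exceedance in the next 10 years, holding p at its posterior mean
p_10yr = 1 - (1 - p_flood) ** 10
print(f"P(at least one flood in 10 years) ~ {p_10yr:.2f}")
```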

Keywords: Bayesian decision method, conditional binomial distribution, minimax rules, prior beta distribution.

2665 Loss Analysis by Loading Conditions of Distribution Transformers

Authors: A. Bozkurt, C. Kocatepe, R. Yumurtaci, İ. C. Tastan, G. Tulun

Abstract:

With growing energy demand and dwindling natural energy sources, the efficient use of energy has become increasingly important in recent years. Most of the losses between the point where electricity is produced and the point of consumption occur in the energy distribution system. This study analyzes the losses arising in distribution transformers and distribution power cables, which account for most of the losses in the distribution system. Transformer losses in a real distribution system are analyzed with the CYME Power Engineering software, and these losses are presented for different voltage levels and different loading conditions.

Keywords: Distribution system, distribution transformer, power cable, technical losses.

2664 Zero Truncated Strict Arcsine Model

Authors: Y. N. Phang, E. F. Loh

Abstract:

The zero truncated model is usually used in modeling count data without zeros; it is the opposite of the zero inflated model. Zero truncated Poisson and zero truncated negative binomial models have been discussed and used by researchers in analyzing the abundance of rare species and hospital length of stay, and zero truncated models serve as the basis for developing hurdle models. In this study, we develop a new model, the zero truncated strict arcsine model, which can be used as an alternative for modeling count data without zeros and with extra variation. Two simulated data sets and one real-life data set are fitted to the developed model, and the results show that the model provides a good fit to the data. The maximum likelihood estimation method is used to estimate the parameters.
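
Since the strict arcsine pmf is not part of standard libraries, the zero-truncation and maximum likelihood machinery can be illustrated with the simpler zero-truncated Poisson: each pmf value is divided by 1 - P(Y = 0), and any other base pmf, including the strict arcsine, could be slotted into the same template.

```python
import numpy as np
from scipy import stats, optimize

def fit_zero_truncated_poisson(y):
    """MLE from the template  log L = sum[ log f(y) - log(1 - f(0)) ],
    here with a Poisson base pmf (the strict arcsine pmf could replace it)."""
    y = np.asarray(y)

    def negloglik(log_lam):
        lam = np.exp(log_lam[0])
        ll = stats.poisson.logpmf(y, lam) - np.log1p(-np.exp(-lam))
        return -ll.sum()

    res = optimize.minimize(negloglik, x0=[np.log(y.mean())], method="L-BFGS-B")
    return np.exp(res.x[0])

# Placeholder positive-count data in place of the paper's data sets
rng = np.random.default_rng(5)
raw = rng.poisson(2.5, size=1000)
print("estimated rate:", fit_zero_truncated_poisson(raw[raw > 0]))
```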

Keywords: Hurdle models, maximum likelihood estimation method, positive count data.

2663 Reliability Evaluation of Distribution System Considering Distributed Generation

Authors: Raju Kaduru, Narsaiah Srinivas Gondlala

Abstract:

This paper presents an analytical approach for evaluating distribution system reliability indices in the presence of distributed generation (DG). Distributed generation is modeled, and the distribution system reliability indices are evaluated using the frequency-duration technique; the implementation of the model and case studies are discussed. The results show how the location of DG affects the distribution reliability indices. In this respect, the impact of DG on the distribution system is investigated using the IEEE Roy Billinton Test System (RBTS2), including feeder 1. The approach will therefore help distribution system planners with DG resource placement.

Keywords: Distributed Generation, DG Location, Distribution System, Reliability Indices.

2662 Metal Berthelot Tubes with Windows for Observing Cavitation under Static Negative Pressure

Authors: K. Hiro, Y. Imai, T. Sasayama

Abstract:

Cavitation under static negative pressure is not well understood. The Berthelot method of generating such negative pressure can be a means to study cavitation inception. In this study, metal Berthelot tubes with built-in observation windows are newly developed and checked as to whether high static negative pressure is generated. The negative pressure in a tube with a pair consisting of a corundum plate and an aluminum gasket increased with temperature cycles, a trend similar to that reported previously.

Keywords: Berthelot method, negative pressure, cavitation.

2661 Exponentiated Transmuted Weibull Distribution A Generalization of the Weibull Distribution

Authors: Abd El Hady N. Ebraheim

Abstract:

This paper introduces a new generalization of the two-parameter Weibull distribution, obtained using the quadratic rank transmutation map and named the exponentiated transmuted Weibull (ETW) distribution. The ETW distribution has the advantage of being capable of modeling various shapes of aging and failure criteria. Furthermore, eleven lifetime distributions, such as the Weibull, exponentiated Weibull, Rayleigh and exponential distributions, among others, follow as special cases. The properties of the new model are discussed, and maximum likelihood estimation is used to estimate the parameters. Explicit expressions are derived for the quantiles, the moments of the distribution are derived, and the order statistics are examined.
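
The construction can be sketched directly from the two ingredients named in the abstract: apply the quadratic rank transmutation map to a baseline Weibull CDF and raise the result to a power, with random variates generated by the inversion method mentioned in the keywords. The parametrization below is a generic one and may differ from the paper's notation.

```python
import numpy as np
from scipy.optimize import brentq

def etw_cdf(x, shape, scale, lam, a):
    """Exponentiated transmuted Weibull CDF (generic parametrization):
    baseline Weibull G, quadratic rank transmutation map, then exponentiation."""
    G = 1.0 - np.exp(-(np.asarray(x, dtype=float) / scale) ** shape)
    transmuted = (1.0 + lam) * G - lam * G ** 2      # requires |lam| <= 1
    return transmuted ** a

def etw_rvs(size, shape, scale, lam, a, rng=None):
    """Random variates by the inversion method: solve F(x) = u numerically."""
    rng = rng or np.random.default_rng()
    u = rng.random(size)
    upper = scale * 50.0                             # crude bracket for the root search
    return np.array([brentq(lambda x: etw_cdf(x, shape, scale, lam, a) - ui, 1e-12, upper)
                     for ui in u])

# Sanity check: lam = 0, a = 1 reduces to the ordinary Weibull CDF
x = np.array([0.5, 1.0, 2.0])
print(etw_cdf(x, shape=1.5, scale=1.0, lam=0.0, a=1.0))
print(etw_rvs(5, shape=1.5, scale=1.0, lam=0.3, a=2.0, rng=np.random.default_rng(6)))
```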

Keywords: Exponentiated, Inversion Method, Maximum Likelihood Estimation, Transmutation Map.

2660 Modelling Sudden Deaths from Myocardial Infarction and Stroke

Authors: Yusoff Y. S., Streftaris G., Waters H. R.

Abstract:

Death within 30 days is an important factor to examine, as there is a significant risk of death immediately following, or soon after, a myocardial infarction (MI) or stroke. In this paper, we model deaths within 30 days following an MI or stroke in the UK and examine how the probabilities of sudden death from MI or stroke changed over the period 1981-2000. We model the sudden deaths using a generalized linear model (GLM), fitted with the R statistical package, under a binomial distribution for the number of sudden deaths. The model is parameterized using the extensive and detailed data from the Framingham Heart Study, adjusted to match UK rates. The results show a reduction over time in sudden deaths following an MI, but no significant improvement for sudden deaths following a stroke.

Keywords: Sudden deaths, myocardial infarction, stroke, ischemic heart disease.

2659 Efficient Design of Distribution Logistics by Using a Model-Based Decision Support System

Authors: J. Becker, R. Arnold

Abstract:

The design of distribution logistics has a decisive impact on a company's logistics costs and performance, and such solutions therefore make an essential contribution to corporate success. This article describes a decision support system for analyzing the potential of distribution logistics in terms of logistics costs and performance. In contrast to previous business process re-engineering (BPR) procedures, this method maps distribution logistics holistically under variable distribution structures. Combined with qualitative measures, the decision support system will contribute to a more efficient design of distribution logistics.

Keywords: Decision support system, distribution logistics, potential analyses, supply chain management.

2658 Optimal Compensation of Reactive Power in the Restructured Distribution Network

Authors: Atefeh Pourshafie, Mohsen. Saniei, S. S. Mortazavi, A. Saeedian

Abstract:

In this paper, the optimal capacitor placement problem is formulated for a restructured distribution network, in which the distribution network operator can also sell reactive energy as a service to the transmission system. A search is therefore performed for the optimal location, size and number of capacitor banks, with the objectives of loss reduction, maximum income from selling reactive energy to the transmission system, and return on investment for the capacitors. Because the results are influenced by the economic value of reactive energy, the problem is solved for various values of it; the optimization technique implemented is a genetic algorithm. For each economic value of reactive power, the threshold price for selling reactive power is obtained as the point at which the return-on-investment index increases and changes from zero or negative values to positive values. Increasing this economic parameter is reasonable as long as the network losses remain lower than the losses before compensation.

Keywords: capacitor placement, deregulated electric market, distribution network optimization.

2657 Learning to Recommend with Negative Ratings Based on Factorization Machine

Authors: Caihong Sun, Xizi Zhang

Abstract:

Rating prediction is an important problem for recommender systems: the task is to predict the rating a user would give to an item. Most existing algorithms for this task ignore the effect of the negative ratings users give to items, yet negative ratings have a significant impact on users' purchasing decisions in practice. In this paper, we present a rating prediction algorithm based on factorization machines that considers the effect of negative ratings, inspired by loss aversion theory. We develop a concave and a convex negative disgust function to evaluate the negative ratings. Experiments conducted on the MovieLens dataset demonstrate the effectiveness of the proposed methods in comparison with four other state-of-the-art approaches, and show that negative ratings matter considerably for the accuracy of rating predictions.

Keywords: Factorization machines, feature engineering, negative ratings, recommendation systems.
