Search results for: Binomial smoothing (BS)
Paper Count: 116


116 Time Series Forecasting Using a Hybrid RBF Neural Network and AR Model Based On Binomial Smoothing

Authors: Fengxia Zheng, Shouming Zhong

Abstract:

The hybrid ANN-ARIMA approach, which combines an autoregressive integrated moving average (ARIMA) model with an artificial neural network (ANN), is a valuable tool for modeling and forecasting nonlinear time series, yet neural network models are prone to over-fitting. This paper presents a hybrid methodology, called BS-RBFAR, that combines a radial basis function (RBF) neural network with an autoregressive (AR) model based on the binomial smoothing (BS) technique, which is efficient in data processing. The method is examined using the Canadian lynx data set. Empirical results indicate that the over-fitting problem can be eased by an RBF neural network based on binomial smoothing (BS-RBF), and that the hybrid BS-RBFAR model is an effective way to improve the forecasting accuracy achieved by BS-RBF used separately.

Keywords: Binomial smoothing (BS), hybrid, Canadian Lynx data, forecasting accuracy.
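
As a concrete illustration, binomial smoothing is commonly understood as convolution of the series with normalized binomial coefficients; the sketch below (plain NumPy, not the authors' BS-RBFAR code) shows that reading of the preprocessing step.

```python
import numpy as np
from scipy.special import comb

def binomial_smooth(x, order=4):
    """Smooth a 1-D series by convolving with normalized binomial coefficients."""
    k = np.arange(order + 1)
    w = comb(order, k) / 2.0**order          # e.g. order=2 -> [0.25, 0.5, 0.25]
    return np.convolve(x, w, mode="same")    # edge samples are only partially weighted

t = np.linspace(0, 6 * np.pi, 200)
noisy = np.sin(t) + 0.3 * np.random.randn(t.size)
smooth = binomial_smooth(noisy, order=6)
```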

115 Using “Eckel” Model to Measure Income Smoothing Practices: The Case of French Companies

Authors: Feddaoui Amina

Abstract:

Income smoothing represents an attempt by a company's management to reduce variations in earnings through the manipulation of accounting principles. In this study, we measure income smoothing practices in a sample of 30 French joint stock companies over the period 2007-2009, using the dummy variables method and the Eckel model, together with a binomial test in SPSS, to confirm or refute our hypothesis. The study concludes that there are no statistically significant indicators of income smoothing in the sample of French companies studied over 2007-2009; the income series of the sample are therefore characterized by stability and the absence of volatility, without any intervention of management through accounting manipulation. Nevertheless, this type of accounting manipulation should be taken into account, and control bodies should make efforts to apply the Eckel model and generalize its use at the global level.

Keywords: Income, smoothing, “Eckel”, French companies.
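
For readers unfamiliar with the Eckel model, the index is commonly computed as the ratio of the coefficient of variation of income changes to that of sales changes, with values below one read as evidence of smoothing; a minimal sketch with hypothetical figures:

```python
import numpy as np

def eckel_index(income, sales):
    """Eckel smoothing index: CV of income changes over CV of sales changes.
    A value below 1 is usually read as evidence of income smoothing."""
    d_income, d_sales = np.diff(income), np.diff(sales)
    cv = lambda d: np.std(d, ddof=1) / abs(np.mean(d))
    return cv(d_income) / cv(d_sales)

income = np.array([10.2, 10.8, 11.1, 11.5, 12.0])   # hypothetical annual figures
sales  = np.array([50.0, 58.0, 52.0, 66.0, 61.0])
print(eckel_index(income, sales) < 1)                # True suggests smoothing
```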

114 Moment Estimators of the Parameters of Zero-One Inflated Negative Binomial Distribution

Authors: Rafid Saeed Abdulrazak Alshkaki

Abstract:

In this paper, the zero-one inflated negative binomial distribution is considered, along with some of its structural properties, and its parameters are estimated using the method of moments. It is found that the method of moments is not a suitable approach for estimating the parameters of zero-one inflated negative binomial models and may lead to incorrect conclusions.

Keywords: Zero-one inflated models, negative binomial distribution, moment estimator, non-negative integer sampling.

113 Peakwise Smoothing of Data Models using Wavelets

Authors: D. Sudheer Reddy, N. Gopal Reddy, P. V. Radhadevi, J. Saibaba, Geeta Varadan

Abstract:

Smoothing or filtering is the first preprocessing step for noise suppression in many data analysis applications. The moving average is the most popular smoothing method, and its generalization led to the development of the Savitzky-Golay filter. Many window smoothing methods have been developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and the Kaiser window. Function approximation of the data by polynomial regression, Fourier expansion, or wavelet expansion also yields smoothed data, and wavelets in particular smooth the data to a great extent by thresholding the wavelet coefficients. However, almost all smoothing methods destroy peaks and flatten them as the support of the window is increased. In certain applications it is desirable to retain the peaks while smoothing the data as much as possible. In this paper we present a methodology, called peak-wise smoothing, that smooths the data to any desired level without losing the major peak features.

Keywords: smoothing, moving average, peakwise smoothing, spatial density models, planar shape models, wavelets.
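
One simple way to read the peak-wise idea (not the authors' wavelet-based construction) is to smooth everywhere except a guard band around detected peaks; a sketch using SciPy:

```python
import numpy as np
from scipy.signal import find_peaks

def peak_preserving_smooth(x, window=11, guard=3):
    """Moving-average smoothing that leaves a small guard band around detected
    peaks untouched; an illustration of the peak-wise idea only."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(x, kernel, mode="same")
    peaks, _ = find_peaks(x, prominence=0.5)
    keep = np.zeros_like(x, dtype=bool)
    for p in peaks:
        keep[max(0, p - guard):p + guard + 1] = True
    return np.where(keep, x, smoothed)

t = np.linspace(0, 10, 400)
signal = (np.exp(-40 * (t - 3) ** 2) + np.exp(-40 * (t - 7) ** 2)
          + 0.1 * np.random.randn(t.size))
result = peak_preserving_smooth(signal)
```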

112 Estimation of Train Operation Using an Exponential Smoothing Method

Authors: Taiyo Matsumura, Kuninori Takahashi, Takashi Ono

Abstract:

The purpose of this research is to improve convenience for passengers and level crossing users waiting for trains, and to prevent accidents caused by forcible entry into level crossings, by providing information on when the next train will pass through or arrive. In this paper, we propose methods for estimating train operation by means of an average value method, a variable response smoothing method, and an exponential smoothing method, on the basis of open data that have low accuracy but whose performance schedules are distributed in real time. We then examine the accuracy of the estimates. The results show that applying the exponential smoothing method is valid.

Keywords: Exponential smoothing method, open data, operation estimation, train schedule.
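
The exponential smoothing recursion the paper relies on is s_t = alpha * x_t + (1 - alpha) * s_(t-1); a minimal sketch on hypothetical passage intervals:

```python
def exponential_smoothing(series, alpha=0.3):
    """Return the exponentially smoothed levels s_t = alpha*x_t + (1-alpha)*s_(t-1)."""
    level = series[0]
    levels = [level]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
        levels.append(level)
    return levels

delays = [120, 95, 150, 110, 130, 90]     # hypothetical intervals between trains (s)
print(exponential_smoothing(delays)[-1])  # estimate used for the next interval
```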

111 A Comparison of the Nonparametric Regression Models using Smoothing Spline and Kernel Regression

Authors: Dursun Aydin

Abstract:

This paper studies the use of nonparametric models for Gross National Product data in Turkey and for the Stanford heart transplant data. Two nonparametric techniques, smoothing spline and kernel regression, are discussed. The main goal is to compare the techniques used for prediction in nonparametric regression models. According to the results of the numerical studies, it is concluded that smoothing spline regression estimators perform better than kernel regression estimators.

Keywords: Kernel regression, Nonparametric models, Prediction, Smoothing spline.
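
To make the comparison concrete, the sketch below fits a smoothing spline (SciPy's UnivariateSpline) and a Nadaraya-Watson kernel regression to the same simulated data; the data and bandwidth choices are illustrative, not those of the paper.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 120))
y = np.sin(x) + 0.3 * rng.standard_normal(x.size)

# Smoothing spline: the parameter s trades off fidelity against roughness.
spline = UnivariateSpline(x, y, s=len(x) * 0.3**2)

# Nadaraya-Watson kernel regression with a Gaussian kernel of bandwidth h.
def kernel_regression(x0, x, y, h=0.5):
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

grid = np.linspace(0, 10, 200)
spline_fit, kernel_fit = spline(grid), kernel_regression(grid, x, y)
```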

110 Microarrays Denoising via Smoothing of Coefficients in Wavelet Domain

Authors: Mario Mastriani, Alberto E. Giraldez

Abstract:

We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on smoothing the coefficients of the highest-frequency subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each highest subband, and reconstruct the microarray from the modified wavelet coefficients. This process is applied a single time and exclusively to the first level of decomposition; in most cases, a multiresolution analysis is therefore not necessary. Denoising results compare favorably with most methods currently in use.

Keywords: Directional smoothing, denoising, edge preservation, microarrays, thresholding, wavelets
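
A minimal sketch of the idea, smoothing only the first-level detail subbands of a 2-D wavelet decomposition (here a uniform filter stands in for the authors' directional smoothing):

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def denoise_microarray(img, wavelet="db2", size=3):
    """Single-level 2-D DWT, smooth only the highest-frequency subbands,
    then reconstruct; a sketch of the idea, not the authors' exact filter."""
    cA, (cH, cV, cD) = pywt.dwt2(img, wavelet)
    cH, cV, cD = (uniform_filter(c, size=size) for c in (cH, cV, cD))
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)

noisy = np.random.rand(128, 128)          # placeholder for a microarray image
clean = denoise_microarray(noisy)
```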

109 Estimation of Bayesian Sample Size for Binomial Proportions Using Areas P-tolerance with Lowest Posterior Loss

Authors: H. Bevrani, N. Najafi

Abstract:

This paper uses p-tolerance with the lowest posterior loss, the quadratic loss function, the average length criterion, the average coverage criterion, and the worst outcome criterion to compute the sample size needed to estimate a proportion in the binomial probability function with a Beta prior distribution. The proposed methodology is examined and its effectiveness is shown.

Keywords: Bayesian inference, Beta-binomial distribution, LPL criteria, quadratic loss function.
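
As an illustration of one of the listed criteria, the sketch below picks the smallest n whose average equal-tailed posterior interval length (weighted by the Beta-Binomial prior predictive) meets a target; the paper's lowest-posterior-loss regions would replace the equal-tailed intervals used here.

```python
import numpy as np
from scipy.stats import beta, betabinom

def average_length(n, a=1, b=1, level=0.95):
    """Average length of the equal-tailed posterior interval for a binomial
    proportion, weighted by the Beta-Binomial prior predictive of the count x."""
    x = np.arange(n + 1)
    weights = betabinom.pmf(x, n, a, b)
    lo = beta.ppf((1 - level) / 2, a + x, b + n - x)
    hi = beta.ppf(1 - (1 - level) / 2, a + x, b + n - x)
    return np.sum(weights * (hi - lo))

target = 0.10
n = next(n for n in range(1, 2000) if average_length(n) <= target)
print(n)  # smallest n meeting the average length criterion
```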

108 Discrete Polynomial Moments and Savitzky-Golay Smoothing

Authors: Paul O'Leary, Matthew Harker

Abstract:

This paper presents a unified theory for local (Savitzky-Golay) and global polynomial smoothing. The algebraic framework can represent any polynomial approximation and is seamless from low-degree local to high-degree global approximations. Representing the smoothing operator as a projection onto orthonormal basis functions enables the computation of the covariance matrix for noise propagation through the filter, the noise gain, and the frequency response of the polynomial filters. A virtually perfect Gram polynomial basis is synthesized, whereby polynomials of degree d = 1000 can be synthesized without significant errors. The perfect basis ensures that the filters are strictly polynomial preserving. Given n points and a support length ls = 2m + 1, the smoothing operator is strictly linear phase for the points xi, i = m+1, ..., n-m. The method is demonstrated on geometric surface data lying on an invariant 2D lattice.

Keywords: Gram polynomials, Savitzky-Golay Smoothing, Discrete Polynomial Moments
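
A quick contrast of local versus global polynomial smoothing, using SciPy's standard Savitzky-Golay filter and an ordinary polynomial fit (not the paper's Gram-polynomial basis, which supports far higher degrees):

```python
import numpy as np
from scipy.signal import savgol_filter

t = np.linspace(0, 1, 501)
noisy = np.exp(-t) * np.cos(20 * t) + 0.05 * np.random.randn(t.size)

# Local polynomial (Savitzky-Golay) smoothing: window of 31 points, cubic fit.
local = savgol_filter(noisy, window_length=31, polyorder=3)

# Global polynomial smoothing: least-squares projection onto a degree-15 basis.
global_fit = np.polynomial.polynomial.Polynomial.fit(t, noisy, deg=15)(t)
```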

107 Forecasting Unemployment Rate in Selected European Countries Using Smoothing Methods

Authors: Ksenija Dumičić, Anita Čeh Časni, Berislav Žmuk

Abstract:

The aim of this paper is to select the most accurate method for forecasting future values of the unemployment rate in selected European countries. To this end, several forecasting techniques suited to time series with a trend component were selected, namely double exponential smoothing (also known as Holt's method) and the Holt-Winters method, which accounts for trend and seasonality. The results of the empirical analysis showed that the optimal model for forecasting the unemployment rate in Greece was the Holt-Winters additive method. In the case of Spain, according to MAPE, the optimal model was the double exponential smoothing model. Furthermore, for Croatia and Italy the best forecasting model for the unemployment rate was the Holt-Winters multiplicative model, whereas for Portugal it was the double exponential smoothing model. Our findings are in line with European Commission unemployment rate estimates.

Keywords: European Union countries, exponential smoothing methods, forecast accuracy, unemployment rate.
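
A sketch of such a model comparison on hypothetical monthly data, using statsmodels' ExponentialSmoothing for both double exponential smoothing and the Holt-Winters additive variant:

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
# Hypothetical monthly unemployment rates with trend and mild seasonality.
t = np.arange(120)
y = 10 + 0.02 * t + 1.5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.3, t.size)

holt = ExponentialSmoothing(y, trend="add").fit()                      # double ES
hw_add = ExponentialSmoothing(y, trend="add", seasonal="add",
                              seasonal_periods=12).fit()               # Holt-Winters

def mape(actual, fitted):
    return np.mean(np.abs((actual - fitted) / actual)) * 100

print(mape(y, holt.fittedvalues), mape(y, hw_add.fittedvalues))
forecast = hw_add.forecast(12)     # forecast for the next 12 months
```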

106 The Study of the Discrete Risk Model with Random Income

Authors: Peichen Zhao

Abstract:

In this paper, we extend the compound binomial model to the case where the premium income process, based on a binomial process, is no longer a linear function. First, a recursive formula is derived for the non-ruin probability. Second, we show that the expected discounted penalty function satisfies a defective renewal equation. Third, an asymptotic estimate for the expected discounted penalty function is given. Finally, two examples of ruin quantities illustrate applications of the recursive formula and of the asymptotic estimate for the penalty function.

Keywords: Discounted penalty function, compound binomial process, recursive formula, discrete renewal equation, asymptotic estimate.
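
For intuition, a plain Monte Carlo check of ruin in the classical compound binomial model with a constant unit premium (the paper's generalization would replace that premium with a random income):

```python
import numpy as np

rng = np.random.default_rng(42)

def ruin_probability(u0=5, p=0.3, claim_sizes=(1, 2, 3), claim_probs=(0.5, 0.3, 0.2),
                     horizon=500, n_paths=5_000):
    """Monte Carlo estimate of the finite-horizon ruin probability in a compound
    binomial model: unit premium each period, a claim with probability p."""
    ruined = 0
    for _ in range(n_paths):
        u = u0
        for _ in range(horizon):
            u += 1                                          # premium income
            if rng.random() < p:                            # a claim occurs
                u -= rng.choice(claim_sizes, p=claim_probs)
            if u < 0:                                       # ruin: surplus below zero
                ruined += 1
                break
    return ruined / n_paths

print(ruin_probability())
```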

105 Proposal of Additional Fuzzy Membership Functions in Smoothing Transition Autoregressive Models

Authors: E. Giovanis

Abstract:

In this paper we present, propose, and examine additional membership functions for Smoothing Transition Autoregressive (STAR) models. More specifically, we present the hyperbolic tangent, Gaussian, and generalized bell functions. Because STAR models follow a fuzzy logic approach, more fuzzy membership functions should be tested. Furthermore, fuzzy rules can be incorporated, or other training or computational methods, such as error backpropagation or a genetic algorithm, can be applied instead of nonlinear least squares. We examine two macroeconomic variables of the US economy: the inflation rate and the 6-month Treasury bill interest rate.

Keywords: Forecast, Fuzzy membership functions, Smoothing transition, Time-series
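
The candidate transition functions can be written down directly; the sketch below defines the logistic baseline together with the proposed hyperbolic tangent, Gaussian, and generalized bell alternatives (parameter values are illustrative):

```python
import numpy as np

# Candidate transition (membership) functions G(s) in [0, 1] for a STAR model:
# y_t = phi1' x_t * (1 - G(s_t)) + phi2' x_t * G(s_t) + e_t
def logistic(s, gamma=2.0, c=0.0):
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

def tanh_transition(s, gamma=2.0, c=0.0):
    return 0.5 * (1.0 + np.tanh(gamma * (s - c)))

def gaussian(s, c=0.0, sigma=1.0):
    return np.exp(-((s - c) ** 2) / (2 * sigma**2))

def generalized_bell(s, a=1.0, b=2.0, c=0.0):
    return 1.0 / (1.0 + np.abs((s - c) / a) ** (2 * b))

s = np.linspace(-4, 4, 9)
print(np.round(generalized_bell(s), 3))
```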

104 Application of Adaptive Neuro-Fuzzy Inference System in Smoothing Transition Autoregressive Models

Authors: E. Giovanis

Abstract:

In this paper we propose and examine an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Smoothing Transition Autoregressive (STAR) modeling. Because STAR models follow a fuzzy logic approach, fuzzy rules can be incorporated in the nonlinear part, or other training or computational methods, such as the error backpropagation algorithm, can be applied instead of nonlinear least squares. Furthermore, additional fuzzy membership functions can be examined besides the logistic and exponential, such as the triangular, Gaussian, and generalized bell functions, among others. We examine two macroeconomic variables of the US economy: the inflation rate and the 6-month Treasury bill interest rate.

Keywords: Forecasting, Neuro-Fuzzy, Smoothing transition, Time-series

103 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models

Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo de Magalhães

Abstract:

This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. To handle this difficulty, the proposed resolution method adopts a smoothing strategy using a special class of C∞ differentiable functions. The final estimate is obtained by solving a sequence of differentiable subproblems that gradually approach the original conceptual problem. The use of this technique, called the Hyperbolic Smoothing Method (HSM), makes it possible to apply the most powerful minimization algorithms and allows the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments illustrates both the reliability and the efficiency of the proposed approach.

Keywords: Rainfall-runoff models, optimization procedure, automatic parameter calibration, hyperbolic smoothing method.
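
The core device can be illustrated with the standard hyperbolic smoothing of a threshold: max(y, 0) is replaced by the C∞ function (y + sqrt(y^2 + tau^2))/2 and tau is driven towards zero; a minimal sketch, not the paper's full calibration loop:

```python
import numpy as np

def hyperbolic_max(y, tau):
    """C-infinity approximation of max(y, 0): (y + sqrt(y^2 + tau^2)) / 2.
    As tau -> 0 it converges to the non-differentiable threshold itself."""
    return 0.5 * (y + np.sqrt(y * y + tau * tau))

y = np.linspace(-2, 2, 9)
for tau in (1.0, 0.1, 0.01):
    print(tau, np.round(hyperbolic_max(y, tau), 3))
# A calibration would solve a sequence of smooth subproblems with decreasing tau.
```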

102 An Improved Illumination Normalization based on Anisotropic Smoothing for Face Recognition

Authors: Sanghoon Kim, Sun-Tae Chung, Souhwan Jung, Seongwon Cho

Abstract:

Robust face recognition under various illumination environments is very difficult and must be achieved for successful commercialization. In this paper, we propose an improved illumination normalization method for face recognition. Illumination normalization based on anisotropic smoothing is known to be effective among illumination normalization methods, but it deteriorates the intensity contrast of the original image and produces less sharp edges. The proposed method improves the previous anisotropic smoothing-based illumination normalization so that it increases the intensity contrast and enhances the edges while diminishing the effect of illumination variations. As a result of these improvements, face images preprocessed by the proposed illumination normalization method have more distinctive (Gabor) feature vectors for face recognition. Experiments on face recognition based on Gabor feature vector similarity verify the effectiveness of the proposed illumination normalization method.

Keywords: Illumination Normalization, Face Recognition, Anisotropic smoothing, Gabor feature vector.
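
For reference, the anisotropic smoothing underlying such normalization is typically a Perona-Malik type diffusion; the sketch below shows plain diffusion only, without the contrast and edge enhancements the paper adds:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, lam=0.2):
    """Perona-Malik anisotropic diffusion for an image scaled to [0, 1].
    Periodic boundaries via np.roll are used for brevity."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Differences towards the four neighbours.
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping function: little diffusion across strong edges.
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += lam * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

face = np.random.rand(64, 64)      # placeholder for a face image
smoothed = perona_malik(face)
```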

101 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016

Authors: Dimitra Alexiou

Abstract:

During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore preferences with regard to the month of travel and the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and the forecast obtained with exponential smoothing, useful conclusions are drawn that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes for the coming years. The results and the computed forecast can also be used for decision making by private tourist enterprises investing in Greece. With regard to the statistical methods, simple exponential smoothing of the time series is employed, and the search for the best forecast for 2017 and 2018 determines the value of the smoothing coefficient. All statistical computations and graphics were produced with Microsoft Excel.

Keywords: Tourism, statistical methods, exponential smoothing, land spatial planning, economy, Microsoft Excel.

100 Despeckling of Synthetic Aperture Radar Images Using Inner Product Spaces in Undecimated Wavelet Domain

Authors: Syed Musharaf Ali, Muhammad Younus Javed, Naveed Sarfraz Khattak, Athar Mohsin, Umar Farooq

Abstract:

This paper introduces an effective speckle reduction method for synthetic aperture radar (SAR) images using inner product spaces in the undecimated wavelet domain. There are two major areas of the projection-onto-span algorithm where improvement can be made. The first is the use of the undecimated wavelet transform instead of the discrete wavelet transform; the second is the use of a smoothing filter, namely a directional smoothing filter, as an additional step. The proposed method does not need any noise estimation or thresholding technique. Moreover, it gives good results on both single-polarimetric and fully polarimetric SAR images.

Keywords: Directional Smoothing, Inner product, Length of vector, Undecimated wavelet transformation.

99 Using Artificial Neural Network to Predict Collisions on Horizontal Tangents of 3D Two-Lane Highways

Authors: Omer F. Cansiz, Said M. Easa

Abstract:

The purpose of this study is to predict collision frequency on horizontal tangents combined with vertical curves using artificial neural network (ANN) methods, and to compare the proposed ANN models with existing regression models. First, the variables that affect collision frequency were investigated; only the annual average daily traffic, section length, access density, rate of vertical curvature, and the smaller curve radius before and after the tangent were statistically significant for the related combinations. Second, three statistical models (negative binomial, zero-inflated Poisson, and zero-inflated negative binomial) were developed using the significant variables for three alignment combinations. Third, ANN models were developed by applying the same variables for each combination. The results clearly show that the ANN models have lower mean square error values than the statistical models. Similarly, the AIC values of the ANN models are smaller than those of the regression models for all combinations. Consequently, the ANN models have better statistical performance than the statistical models for estimating collision frequency. The ANN models presented in this paper are recommended for evaluating the safety impacts of 3D alignment elements on horizontal tangents.

Keywords: Collision frequency, horizontal tangent, 3D two-lane highway, negative binomial, zero inflated Poisson, artificial neural network.
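
A sketch of the regression side of such a comparison, fitting Poisson and negative binomial GLMs to simulated over-dispersed segment counts and comparing AIC (variable names and values are hypothetical):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
# Hypothetical segment-level covariates: log(AADT), section length, access density.
X = np.column_stack([rng.normal(9, 0.5, n), rng.uniform(0.5, 3, n), rng.uniform(0, 10, n)])
mu = np.exp(-6 + 0.6 * X[:, 0] + 0.4 * X[:, 1] + 0.05 * X[:, 2])
y = rng.poisson(mu * rng.gamma(2.0, 0.5, n))          # over-dispersed counts

Xc = sm.add_constant(X)
poisson_fit = sm.GLM(y, Xc, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, Xc, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(poisson_fit.aic, negbin_fit.aic)                # lower AIC indicates a better fit
```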

98 Fermat’s Last Theorem a Simple Demonstration

Authors: Jose William Porras Ferreira

Abstract:

This paper presents two solutions to Fermat's Last Theorem (FLT). The first uses some algebraic results related to the Pythagorean theorem, the expression of the corresponding equations and an analysis of their behavior, together with the "Well Ordering Principle" of the natural numbers. The second uses the connection between powers through Pascal's triangle, that is, Newton's binomial coefficients, where the Fermat equation fails to satisfy the first coefficient. In both cases it is demonstrated that it is impossible that

z^n = x^n + y^n for n > 2 and x, y, z ∈ Z⁺ \ {0}.

Keywords: Fermat’s Last Theorem, Pythagorean Theorem, Newton Binomial Coefficients, Pascal’s Triangle, Well Ordering Principle.

97 Intelligent Assistive Methods for Diagnosis of Rheumatoid Arthritis Using Histogram Smoothing and Feature Extraction of Bone Images

Authors: SP. Chokkalingam, K. Komathy

Abstract:

Advances in the field of image processing envision a new era of evaluation techniques and procedures in many fields, among them biomedicine, for both prognosis and diagnosis of diseases. This plethora of methods provides a wide range of options to select from, but it also causes confusion in finding the most suitable process. Our objective is to apply a series of techniques to bone scans in order to detect the occurrence of rheumatoid arthritis (RA) as accurately as possible. Compared with other existing techniques, the proposed system tends to be more effective, as it relies on methodologies that have proved to be better and more consistent, and computer-aided diagnosis provides a more accurate and consistent basis that helps improve the efficiency of the system. The image first undergoes histogram smoothing and specification, a morphing operation, boundary detection by an edge-following algorithm, and finally image subtraction to determine the presence of rheumatoid arthritis in an efficient and effective way. Preprocessing removes noise from the images, segmentation identifies the region of interest, and histogram smoothing is applied to a specific portion of the images. Gray-level co-occurrence matrix (GLCM) features such as mean, median, energy, and correlation, together with bone mineral density (BMD), are then extracted and stored in a database. This data set is trained with inflamed and non-inflamed values, a neural network checks the status of all new images, and rough set theory is applied for further reduction.

Keywords: Computer Aided Diagnosis, Edge Detection, Histogram Smoothing, Rheumatoid Arthritis.

96 Hourly Electricity Load Forecasting: An Empirical Application to the Italian Railways

Authors: M. Centra

Abstract:

Due to the liberalization of countless electricity markets, load forecasting has become crucial to all public utilities for which electricity is a strategic variable. With the goal of contributing to the forecasting process inside public utilities, this paper addresses the issue of applying the Holt-Winters exponential smoothing technique and the time series analysis for forecasting the hourly electricity load curve of the Italian railways. The results of the analysis confirm the accuracy of the two models and therefore the relevance of forecasting inside public utilities.

Keywords: ARIMA models, Exponential smoothing, Electricity, Load forecasting, Rail transportation.

95 Modelling Dengue Fever (DF) and Dengue Haemorrhagic Fever (DHF) Outbreak Using Poisson and Negative Binomial Model

Authors: W. Y. Wan Fairos, W. H. Wan Azaki, L. Mohamad Alias, Y. Bee Wah

Abstract:

Dengue fever has become a major concern for health authorities all over the world, particularly in tropical countries, which are experiencing the most worrying outbreaks of dengue fever (DF) and dengue haemorrhagic fever (DHF). The DF and DHF epidemics have thus become main causes of hospital admissions and deaths in Malaysia. This paper therefore attempts to examine the environmental factors that may influence the recent dengue outbreaks. The aim of this study is twofold: first, to establish a statistical model describing the relationship between the number of dengue cases and a range of explanatory variables, and second, to identify the lag operators for the explanatory variables that affect dengue incidence the most. The explanatory variables include the level of cloud cover, percentage of relative humidity, amount of rainfall, maximum temperature, minimum temperature, and wind speed. Poisson and negative binomial regression analyses were used in this study. The results of the analyses on 915 observations (daily data from July 2006 to December 2008) reveal that the climatic factors of daily temperature and wind speed significantly influence the incidence of dengue fever 2 and 3 weeks after their occurrence, whereas the effect of humidity appears to be significant only after 2 weeks.

Keywords: Dengue Fever, Dengue Hemorrhagic Fever, Negative Binomial Regression model, Poisson Regression model.

94 Air Pollution and Respiratory-Related Restricted Activity Days in Tunisia

Authors: Mokhtar Kouki Inès Rekik

Abstract:

This paper focuses on assessing the relationship between air pollution and morbidity in Tunisia. Air pollution is measured by the ambient ozone concentration, and morbidity is measured by the number of respiratory-related restricted activity days during the two-week period prior to the interview. Socioeconomic data were also collected in order to adjust for confounding covariates. Our sample is composed of 407 Tunisian respondents; 44.7% are women, the average age is 35.2, nearly 69% live in a house built after 1980, and 27.8% reported at least one day of respiratory-related restricted activity. The model consists of regressing the number of respiratory-related restricted activity days on the air quality measure and the socioeconomic covariates. In order to account for zero-inflation and heterogeneity, we estimate several models (Poisson, negative binomial, zero-inflated Poisson, Poisson hurdle, negative binomial hurdle, and finite mixture Poisson). Bootstrapping and post-stratification techniques are used to correct for sample bias. According to the Akaike information criterion, the negative binomial hurdle model has the best goodness of fit. The main result indicates that, after adjusting for socioeconomic factors, the ozone concentration increases the probability of a positive number of restricted activity days.

Keywords: Bootstrapping, hurdle negbin model, overdispersion, ozone concentration, respiratory-related restricted activity days.

93 An EWMA p Chart Based On Improved Square Root Transformation

Authors: S. Sukparungsee

Abstract:

The traditional Shewhart p chart was developed for charting binomial data using the normal approximation, under the conditions of a low defect level and a small to moderate sample size. Real applications, however, often depart from these assumptions because of skewness in the exact distribution. In this paper, a modified Exponentially Weighted Moving Average (EWMA) control chart for detecting a change in binomial data is proposed by improving the square root transformation, namely the ISRT p EWMA control chart. The numerical results show that the ISRT p EWMA chart is superior to the ISRT p chart for small to moderate shifts, whereas the latter is better for large shifts.

Keywords: Number of defects, Exponentially Weighted Moving Average, Average Run Length, Square root transformations.
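
A sketch of an EWMA chart on square-root transformed counts, using the generic y = sqrt(x + 3/8) transform as a stand-in for the paper's improved square root transformation:

```python
import numpy as np

def ewma_chart(counts, n, p0, lam=0.2, L=3.0):
    """EWMA chart on square-root transformed binomial counts.
    y = sqrt(x + 3/8) is a generic variance-stabilizing transform here;
    the paper's 'improved' transformation refines this step."""
    y = np.sqrt(np.asarray(counts) + 3.0 / 8.0)
    mu0 = np.sqrt(n * p0 + 3.0 / 8.0)            # approximate in-control mean
    sigma = 0.5 * np.sqrt(1.0 - p0)              # approximate stabilized std dev
    z, zs, limits = mu0, [], []
    for t, yt in enumerate(y, start=1):
        z = lam * yt + (1 - lam) * z
        width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        zs.append(z)
        limits.append((mu0 - width, mu0 + width))
    return np.array(zs), limits

counts = np.random.binomial(200, 0.04, size=30)   # simulated defect counts
z, lim = ewma_chart(counts, n=200, p0=0.04)
```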

92 A Study on Exclusive Breastfeeding using Over-dispersed Statistical Models

Authors: Naushad Mamode Khan, Cheika Jahangeer, Maleika Heenaye-Mamode Khan

Abstract:

Breastfeeding is an important concept in the maternal life of a woman. In this paper, we focus on exclusive breastfeeding, that is, feeding a baby no milk other than breast milk. This type of breastfeeding is very important during the first six months because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, in Mauritius, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection, and urinary tract infection. In this paper, we give an overview of exclusive breastfeeding in Mauritius and the factors influencing it. We further analyze the local practices of exclusive breastfeeding using the generalized Poisson regression model and the negative binomial model, since the data are over-dispersed.

Keywords: Exclusive breastfeeding, regression model, generalized Poisson, negative binomial.

91 Child Homicide Victimization and Community Context: A Research Note

Authors: Bohsiu Wu

Abstract:

Among serious crimes, child homicide is a rather rare event. However, the killing of children stirs up a special type of emotion in society that makes other criminal acts pale in comparison. This study examines the relevancy of three possible community-level explanations for child homicide: social deprivation, female empowerment, and social isolation. The social deprivation hypothesis posits that child homicide results from lack of resources in communities. The female empowerment hypothesis argues that a higher female status translates into a higher level of capability to prevent child homicide. Finally, the social isolation hypothesis regards child homicide as a result of lack of social connectivity. Child homicide data, aggregated by US postal ZIP codes in California from 1990 to 1999, were analyzed with a negative binomial regression. The results of the negative binomial analysis demonstrate that social deprivation is the most salient and consistent predictor among all other factors in explaining child homicide victimization at the ZIP-code level. Both social isolation and female labor force participation are weak predictors of child homicide victimization across communities. Further, results from the negative binomial regression show that it is the communities with a higher, not lower, degree of female labor force participation that are associated with a higher count of child homicide. It is possible that poor communities with a higher level of female employment have a lesser capacity to provide the necessary care and protection for the children. Policies aiming at reducing social deprivation and strengthening female empowerment possess the potential to reduce child homicide in the community.

Keywords: Child homicide, deprivation, empowerment, isolation.

90 Orthogonal Regression for Nonparametric Estimation of Errors-in-Variables Models

Authors: Anastasiia Yu. Timofeeva

Abstract:

Two new algorithms for nonparametric estimation of errors-in-variables models are proposed. The first algorithm is based on a penalized regression spline: the spline is represented as a piecewise-linear function, and orthogonal regression is estimated for each linear portion. This algorithm is iterative. The second algorithm involves locally weighted regression estimation; when the independent variable is measured with error, such estimation is a complex nonlinear optimization problem. The simulation results show the advantage of the second algorithm under the assumption that the true values of the smoothing parameters are known. Nevertheless, using goodness-of-fit indexes for smoothing parameter selection gives similar results, though with an oversmoothing effect.

Keywords: Grade point average, orthogonal regression, penalized regression spline, locally weighted regression.
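
Each linear portion of the spline is fitted by orthogonal (total least squares) regression; a minimal sketch of that building block for a single 2-D segment, using the smallest singular vector of the centered data:

```python
import numpy as np

def orthogonal_line_fit(x, y):
    """Total least squares (orthogonal regression) line through 2-D data:
    minimize perpendicular distances via the smallest right singular vector."""
    X = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    a, b = vt[-1]                       # normal vector of the fitted line
    slope = -a / b
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

rng = np.random.default_rng(7)
true_x = np.linspace(0, 10, 80)
x = true_x + rng.normal(0, 0.4, true_x.size)     # errors in the regressor too
y = 2.0 + 0.8 * true_x + rng.normal(0, 0.4, true_x.size)
print(orthogonal_line_fit(x, y))
```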

89 Forecasting Rainfall in Thailand: A Case Study of Nakhon Ratchasima Province

Authors: N. Sopipan

Abstract:

In this paper, we study rainfall time series from weather stations in Nakhon Ratchasima province, Thailand, using various statistical methods that enable us to analyse the behaviour of rainfall in the study areas. Time series analysis is an important tool in modelling and forecasting rainfall. ARIMA and Holt-Winters models were built, the latter on the basis of exponential smoothing. All the models proved to be adequate, so it is possible to provide information that can help decision makers establish strategies for the proper planning of agriculture, drainage systems, and other water resource applications in Nakhon Ratchasima province. The best forecasting performance was obtained with the ARIMA(1,0,1)(1,0,1)12 model.

Keywords: ARIMA Models, Exponential Smoothing, Holt-Winters model.
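
The selected specification can be fitted directly with statsmodels' SARIMAX; a sketch on hypothetical monthly rainfall (the data here are simulated, not the Nakhon Ratchasima records):

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(11)
t = np.arange(240)
# Hypothetical monthly rainfall with a 12-month seasonal pattern.
rain = 80 + 40 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 15, t.size)

model = SARIMAX(rain, order=(1, 0, 1), seasonal_order=(1, 0, 1, 12))
fit = model.fit(disp=False)
print(fit.aic)
forecast = fit.forecast(steps=12)       # rainfall forecast for the next year
```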

88 Fast Calculation for Particle Interactions in SPH Simulations: Outlined Sub-domain Technique

Authors: Buntara Sthenly Gan, Naohiro Kawada

Abstract:

A simple and easy algorithm is presented for the fast calculation of the kernel functions required in fluid simulations using the Smoothed Particle Hydrodynamics (SPH) method. The proposed algorithm improves the linked-list algorithm and adopts the pair-wise interaction technique, both of which are widely used for evaluating kernel functions in SPH fluid simulations. The algorithm is easy to implement without any programming complexities. Benchmark examples are used to show the simulation time saved by the proposed algorithm, and parametric studies on the number of sub-domain divisions, the smoothing length, and the total number of particles are conducted to show the effectiveness of the present technique. A compact formulation is proposed for practical usage.

Keywords: Outlined sub-domain technique, fluid simulation, smoothed particle hydrodynamics (SPH), particle interaction.
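
The linked-list idea is to bin particles into cells of the smoothing length so that only particles in the same or adjacent cells are tested against the support radius; a minimal 2-D sketch of such a neighbour search (function name and parameters are illustrative, not the paper's implementation):

```python
import numpy as np
from collections import defaultdict

def neighbour_pairs(positions, h):
    """Cell linked-list search: bin particles into cells of size h, then test
    only particles in the same or adjacent cells against the support radius h."""
    cells = defaultdict(list)
    for i, p in enumerate(positions):
        cells[tuple((p // h).astype(int))].append(i)
    pairs = []
    for (cx, cy), members in cells.items():
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in cells.get((cx + dx, cy + dy), []):
                    for i in members:
                        if i < j and np.linalg.norm(positions[i] - positions[j]) < h:
                            pairs.append((i, j))
    return pairs

pts = np.random.rand(500, 2)
print(len(neighbour_pairs(pts, h=0.05)))
```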

87 A Comparison of the Sum of Squares in Linear and Partial Linear Regression Models

Authors: Dursun Aydın

Abstract:

In this paper, the linear regression model is estimated by the ordinary least squares method, and the partially linear regression model is estimated by the penalized least squares method using a smoothing spline. The differences and similarities between the sums of squares of the linear regression and partially linear regression (semiparametric) models are then investigated. It is shown that the sums of squares in linear regression reduce to the corresponding sums of squares in the partially linear regression model, and that the various sums of squares in linear regression are analogous to different deviance statements in partially linear regression. In addition, the coefficient of determination derived for the linear regression model is easily generalized to the coefficient of determination of the partially linear regression model. To support these claims, two different applications are made using a simulated data set and a real data set.

Keywords: Partial Linear Regression Model, Linear Regression Model, Residuals, Deviance, Smoothing Spline.
