Search results for: maximum likelihood quantum amplitude estimation (MLQAE)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3013

3013 Digital Material Characterization Using the Quantum Fourier Transform

Authors: Felix Givois, Nicolas R. Gauger, Matthias Kabel

Abstract:

Efficient digital material characterization is of great interest in many fields of application. It consists of the following three steps. First, a 3D reconstruction of 2D scans must be performed. Then, the resulting gray-value image of the material sample is enhanced by image processing methods. Finally, partial differential equations (PDE) are solved on the segmented image, and by averaging the resulting solution fields, effective properties like stiffness or conductivity can be computed. Due to the high resolution of current CT images, the latter is typically performed with matrix-free solvers. Among them, a solver that uses the explicit formula of the Green-Eshelby operator in Fourier space has been proposed by Moulinec and Suquet. Its most complex algorithmic part is the fast Fourier transform (FFT). In our talk, we will discuss the potential quantum advantage that can be obtained by replacing the FFT with the quantum Fourier transform (QFT). We will especially show that the data transfer for noisy intermediate-scale quantum (NISQ) devices can be improved by using appropriate boundary conditions for the PDE, which also allows using semi-classical versions of the QFT. In the end, we will compare the results of the QFT-based algorithm for simple geometries with the results of the FFT-based homogenization method.
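
The swap is possible because, up to normalization and index convention, the QFT is the same unitary as the discrete Fourier transform that the FFT computes, applied to amplitude-encoded data. A minimal numpy sketch of this equivalence (an illustration, not the authors' implementation; the qubit count and amplitude encoding are assumptions):

```python
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Unitary of the quantum Fourier transform on n_qubits qubits.

    Up to normalization and sign convention this is the DFT matrix of
    size N = 2**n_qubits, so a QFT call can stand in for an FFT.
    """
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return omega ** (j * k) / np.sqrt(N)

# Amplitude-encode a small 1D "material" signal and compare transforms.
n = 3                                       # 3 qubits -> 8 grid points
x = np.random.default_rng(0).normal(size=2 ** n)
state = x / np.linalg.norm(x)               # amplitude encoding
qft_out = qft_matrix(n) @ state
fft_out = np.fft.ifft(state, norm="ortho")  # matches the QFT sign convention
print(np.allclose(qft_out, fft_out))        # True
```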

Keywords: maximum likelihood quantum amplitude estimation (MLQAE), numerical homogenization, quantum Fourier transform (QFT), NISQ devices

Procedia PDF Downloads 78
3012 Frequency Offset Estimation Schemes Based on ML for OFDM Systems in Non-Gaussian Noise Environments

Authors: Keunhong Chae, Seokho Yoon

Abstract:

In this paper, frequency offset (FO) estimation schemes robust to non-Gaussian noise environments are proposed for orthogonal frequency division multiplexing (OFDM) systems. First, a maximum-likelihood (ML) estimation scheme for non-Gaussian noise environments is proposed; then, the complexity of the ML estimation scheme is reduced by employing a reduced set of candidate values. Numerical results demonstrate that the proposed schemes provide a significant performance improvement over the conventional estimation scheme in non-Gaussian noise environments while maintaining performance similar to that obtained in Gaussian noise environments.
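
Structurally, such estimators maximize a likelihood metric over a finite grid of FO candidates, and shrinking the candidate set is what reduces complexity. The sketch below uses the correlation metric that is ML under Gaussian noise; the paper's contribution is a metric matched to non-Gaussian noise, which would replace the correlation. The pilot design and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                                   # number of subcarriers
eps_true = 0.23                          # FO in subcarrier spacings

# Known training symbol (hypothetical QPSK pilot in the time domain).
s = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
n_idx = np.arange(N)
r = s * np.exp(2j * np.pi * eps_true * n_idx / N)          # apply FO
r += 0.1 * (rng.normal(size=N) + 1j * rng.normal(size=N))  # noise

# Grid-search ML: correlate with FO-rotated replicas of the pilot,
# evaluated only on a reduced set of candidate values.
candidates = np.linspace(-0.5, 0.5, 101)
metric = [abs(np.sum(r * np.conj(s) * np.exp(-2j * np.pi * e * n_idx / N)))
          for e in candidates]
eps_hat = candidates[int(np.argmax(metric))]
print(f"estimated FO: {eps_hat:.2f} (true: {eps_true})")
```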

Keywords: frequency offset estimation, maximum-likelihood, non-Gaussian noise environment, OFDM, training symbol

Procedia PDF Downloads 353
3011 ML-Based Blind Frequency Offset Estimation Schemes for OFDM Systems in Non-Gaussian Noise Environments

Authors: Keunhong Chae, Seokho Yoon

Abstract:

This paper proposes frequency offset (FO) estimation schemes robust to non-Gaussian noise for orthogonal frequency division multiplexing (OFDM) systems. A maximum-likelihood (ML) scheme and a low-complexity estimation scheme are proposed by applying the probability density function of the cyclic prefix of OFDM symbols to the ML criterion. Simulation results confirm that the proposed schemes offer a significant FO estimation performance improvement over the conventional estimation scheme in non-Gaussian noise environments.

Keywords: frequency offset, cyclic prefix, maximum-likelihood, non-Gaussian noise, OFDM

Procedia PDF Downloads 476
3010 Residual Life Estimation Based on Multi-Phase Nonlinear Wiener Process

Authors: Hao Chen, Bo Guo, Ping Jiang

Abstract:

Residual life (RL) estimation based on a multi-phase nonlinear Wiener process is studied in this paper; it is significant for complicated products with small samples. Firstly, the nonlinear Wiener model with random parameters is introduced, and a multi-phase nonlinear Wiener model is proposed to describe degradation processes that are nonlinear and separated into different phases. Then the multi-phase RL probability density function based on the presented model is derived approximately in closed form, and parameter estimation is achieved with the method of maximum likelihood estimation (MLE). Finally, the method is applied to estimate the RL of high-voltage pulse capacitors. Compared with three other models by log-likelihood function (Log-LF) and Akaike information criterion (AIC), the results show that the proposed degradation model captures the degradation process of high-voltage pulse capacitors in a better way and provides a more reliable result.
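
For a single phase, the likelihood follows from the independent Gaussian increments of the Wiener process: with a power-law time scale Lambda(t) = t^b (an illustrative choice of nonlinearity), increments satisfy dX ~ N(mu*dLambda, sigma^2*dLambda). A minimal sketch that simulates one phase and recovers (mu, sigma, b) by numerical MLE; the paper's multi-phase model adds change points and per-phase parameters on top of this:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# One phase of a nonlinear Wiener path X(t) = mu*t**b + sigma*B(t**b),
# observed on a regular grid; increments are independent Gaussians.
mu, sigma, b = 1.5, 0.4, 0.7
t = np.linspace(0.0, 10.0, 61)[1:]
dlam = np.diff(np.concatenate([[0.0], t ** b]))
x = np.cumsum(rng.normal(mu * dlam, sigma * np.sqrt(dlam)))

def negloglik(theta):
    m, s, bb = theta
    if s <= 0 or bb <= 0:
        return np.inf
    dl = np.diff(np.concatenate([[0.0], t ** bb]))
    dx = np.diff(np.concatenate([[0.0], x]))
    var = s ** 2 * dl
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (dx - m * dl) ** 2 / var)

res = minimize(negloglik, x0=[1.0, 1.0, 1.0], method="Nelder-Mead")
print(res.x)  # should land near (1.5, 0.4, 0.7)
```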

Keywords: multi-phase nonlinear Wiener process, residual life estimation, maximum likelihood estimation, high-voltage pulse capacitor

Procedia PDF Downloads 453
3009 The Generalized Pareto Distribution as a Model for Sequential Order Statistics

Authors: Mahdy Esmailian, Mahdi Doostparast, Ahmad Parsian

Abstract:

In this article, type II censored samples of sequential order statistics (SOS) coming from the generalized Pareto distribution are considered. Maximum likelihood (ML) estimators of the unknown parameters are derived on the basis of the available multiple SOS data. Necessary conditions for the existence and uniqueness of the derived ML estimates are given. Due to the complexity of the proposed likelihood function, a useful re-parametrization is suggested. For illustrative purposes, a Monte Carlo simulation study is conducted and an illustrative example is analysed.
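
As a baseline under the simplifying assumption of a complete (uncensored) sample, the ML fit of the generalized Pareto distribution is direct with scipy; the multiple-SOS likelihood of the paper would replace the standard objective used internally here:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
# Synthetic complete sample; shape and scale are illustrative values.
data = genpareto.rvs(c=0.3, scale=2.0, size=200, random_state=rng)

# scipy's fit performs maximum likelihood; pinning the location at zero
# gives the two-parameter GPD.
c_hat, loc_hat, scale_hat = genpareto.fit(data, floc=0)
print(f"shape = {c_hat:.3f}, scale = {scale_hat:.3f}")
```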

Keywords: Bayesian estimation, generalized Pareto distribution, maximum likelihood estimation, sequential order statistics

Procedia PDF Downloads 509
3008 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

Data arising in many settings frequently have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. Power of the test was adequate for most of the combinations in all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model.

Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error

Procedia PDF Downloads 142
3007 Forecasting the Volatility of Geophysical Time Series with Stochastic Volatility Models

Authors: Maria C. Mariani, Md Al Masum Bhuiyan, Osei K. Tweneboah, Hector G. Huizar

Abstract:

This work is devoted to the modeling of geophysical time series. A stochastic technique with time-varying parameters is used to forecast the volatility of data arising in geophysics. In this study, the volatility is defined as a logarithmic first-order autoregressive process. We observe that the inclusion of log-volatility into the time-varying parameter estimation significantly improves forecasting, which is carried out via maximum likelihood estimation. This allows us to conclude that the estimation algorithm for the corresponding one-step-ahead suggested volatility (with ±2 standard prediction errors) is very feasible since it possesses good convergence properties.

Keywords: augmented Dickey-Fuller test, geophysical time series, maximum likelihood estimation, stochastic volatility model

Procedia PDF Downloads 315
3006 Frequency Analysis of Minimum Ecological Flow and Gage Height in Indus River Using Maximum Likelihood Estimation

Authors: Tasir Khan, Yejuan Wan, Kalim Ullah

Abstract:

Hydrological frequency analysis has been conducted to estimate the minimum flow elevation of the Indus River in Pakistan in order to protect the ecosystem. The maximum likelihood estimation (MLE) technique is used to find the best-fitted distribution for minimum ecological flows at nine stations of the Indus River in Pakistan. Four distributions commonly used in hydrological frequency analysis are fitted at all sites: the generalized extreme value (GEV), generalized logistic (GLO), generalized Pareto (GPA), and Pearson type 3 (PE3) distributions. The performance of these distributions is compared using goodness-of-fit tests such as the Kolmogorov-Smirnov test, Anderson-Darling test, and chi-square test. The study concludes that, under MLE, GEV and GPA are the most suitable distributions and can be effectively applied to all the proposed sites. Quantiles are estimated for return periods from 5 to 1000 years using MLE, which is robust for larger sample sizes. The results of these analyses can be used for water resources research, including water quality management, designing irrigation systems, determining downstream flow requirements for hydropower, and assessing the impact of long-term drought on the country's aquatic system.
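
The fit-test-quantile pipeline can be sketched with scipy on synthetic data (the station records are not reproduced here, so all numbers are illustrative; note that scipy's genextreme shape c corresponds to minus the shape parameter xi in the usual GEV convention):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
flows = stats.genextreme.rvs(c=0.1, loc=300, scale=50, size=80,
                             random_state=rng)  # synthetic annual flows

# Maximum likelihood fit of the GEV.
c, loc, scale = stats.genextreme.fit(flows)

# Goodness of fit: Kolmogorov-Smirnov against the fitted distribution
# (p-values are optimistic when parameters are estimated from the data).
ks = stats.kstest(flows, "genextreme", args=(c, loc, scale))
print(f"KS statistic = {ks.statistic:.3f}")

# Quantiles for return periods T; for low-flow (minima) analysis the
# 1/T quantile is relevant, rather than 1 - 1/T as for maxima.
for T in (5, 50, 100, 1000):
    print(T, round(stats.genextreme.ppf(1 / T, c, loc, scale), 1))
```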

Keywords: minimum ecological flow, frequency distribution, Indus River, maximum likelihood estimation

Procedia PDF Downloads 77
3005 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the five-parameter Wakeby distribution as a theoretical reference model. The number and the quality of the parameters indicate that this distribution may be the appropriate choice for the interpolation of hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena producing heavy tails. The proposed estimation methods for determining the values of the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the method of probability weighted moments (PWM), although this has often shown difficulty of convergence, or rather, convergence to an inappropriate parameter configuration. In this paper, we analyze the problem of likelihood estimation for a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation situations. The reasons for preferring it lie in the sampling and asymptotic properties of maximum likelihood estimators, which improve on the PWM estimates by providing indications of their variability and, therefore, of their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.
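
Because the Wakeby model is defined only through its quantile function Q(F), both sampling and likelihood evaluation must go through Q. A minimal sketch, assuming Hosking's five-parameter form of Q (the parameter values are hypothetical):

```python
import numpy as np
from scipy.optimize import brentq

# Wakeby quantile function, assumed parameterization:
# Q(F) = xi + a/b * (1 - (1-F)**b) - c/d * (1 - (1-F)**(-d))
def wakeby_q(F, xi, a, b, c, d):
    return xi + a / b * (1 - (1 - F) ** b) - c / d * (1 - (1 - F) ** (-d))

params = (0.0, 5.0, 0.9, 1.2, 0.15)  # hypothetical values

# Sampling is direct: plug uniforms into the quantile function.
rng = np.random.default_rng(5)
sample = wakeby_q(rng.uniform(size=1000), *params)

# For likelihood work the density comes through the quantile function:
# f(x) = 1 / Q'(F) at F = Q^{-1}(x), with Q inverted numerically.
def wakeby_pdf(x, xi, a, b, c, d, eps=1e-10):
    F = brentq(lambda u: wakeby_q(u, xi, a, b, c, d) - x, eps, 1 - eps)
    dQ = a * (1 - F) ** (b - 1) + c * (1 - F) ** (-d - 1)  # Q'(F)
    return 1.0 / dQ

print(wakeby_pdf(np.median(sample), *params))
```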

Keywords: generalized extreme values, likelihood estimation, precipitation data, Wakeby distribution

Procedia PDF Downloads 137
3004 Point Estimation for the Type II Generalized Logistic Distribution Based on Progressively Censored Data

Authors: Rana Rimawi, Ayman Baklizi

Abstract:

Skewed distributions are important models that are frequently used in applications. Generalized distributions form a class of skewed distributions and have gained widespread use in applications because of their flexibility in data analysis. More specifically, the generalized logistic distribution, with its different types, has received considerable attention recently. In this study, based on progressively type-II censored data, we consider point estimation for the type II generalized logistic distribution (Type II GLD). We develop several estimators for its unknown parameters, including maximum likelihood estimators (MLE), Bayes estimators and best linear unbiased estimators (BLUE). The estimators are compared by simulation based on the criteria of bias and mean square error (MSE). An illustrative example with a real data set is given.

Keywords: point estimation, type II generalized logistic distribution, progressive censoring, maximum likelihood estimation

Procedia PDF Downloads 197
3003 Exponentiated Transmuted Weibull Distribution: A Generalization of the Weibull Probability Distribution

Authors: Abd El Hady N. Ebraheim

Abstract:

This paper introduces a new generalization of the two-parameter Weibull distribution. To this end, the quadratic rank transmutation map has been used. This new distribution is named the exponentiated transmuted Weibull (ETW) distribution. The ETW distribution has the advantage of being capable of modeling various shapes of aging and failure criteria. Furthermore, eleven lifetime distributions, such as the Weibull, exponentiated Weibull, Rayleigh and exponential distributions, among others, follow as special cases. The properties of the new model are discussed, and maximum likelihood estimation is used to estimate the parameters. Explicit expressions are derived for the quantiles. The moments of the distribution are derived, and the order statistics are examined.
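
Explicit quantiles make inversion sampling straightforward. A minimal sketch, assuming the common parameterization F(x) = [(1+lam)G(x) - lam*G(x)^2]^theta with G the two-parameter Weibull CDF (this form and all parameter values are assumptions for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(6)

def etw_rvs(n, shape, scale, lam, theta):
    """Inversion sampling, assuming F(x) = ((1+lam)*G - lam*G**2)**theta
    with G the Weibull CDF and -1 <= lam <= 1, lam != 0."""
    u = rng.uniform(size=n)
    y = u ** (1.0 / theta)  # undo the exponentiation
    # Solve lam*G**2 - (1+lam)*G + y = 0 for the root in [0, 1].
    g = ((1 + lam) - np.sqrt((1 + lam) ** 2 - 4 * lam * y)) / (2 * lam)
    return scale * (-np.log1p(-g)) ** (1.0 / shape)  # invert Weibull CDF

x = etw_rvs(10_000, shape=1.8, scale=2.0, lam=0.5, theta=1.3)
print(x.mean(), np.quantile(x, 0.95))
```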

Keywords: exponentiated, inversion method, maximum likelihood estimation, transmutation map

Procedia PDF Downloads 565
3002 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emerging stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via maximum likelihood estimation. Using the technique, we observe the long-memory behavior of the data sets and obtain the one-step-ahead predicted log-volatility with ±2 standard errors, even though the observed noise varies according to a normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, and the estimation algorithm scales to large data sets because it has good convergence properties.
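
An Ornstein-Uhlenbeck process dX = kappa*(theta - X)dt + sigma*dW has an exact Gaussian transition density, so its likelihood is available in closed form. A minimal sketch that simulates a path (a stand-in for log-volatility; the parameter values and daily grid are illustrative, not the authors' market data) and recovers the parameters by MLE:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
kappa, theta, sigma, dt, n = 2.0, -1.0, 0.5, 1 / 252, 2000

# Exact discretization: X_{t+dt} | X_t is Gaussian.
x = np.empty(n); x[0] = theta
a = np.exp(-kappa * dt)
sd = sigma * np.sqrt((1 - a ** 2) / (2 * kappa))
for i in range(1, n):
    x[i] = theta + (x[i - 1] - theta) * a + sd * rng.normal()

def negloglik(p):
    k, th, s = p
    if k <= 0 or s <= 0:
        return np.inf
    a = np.exp(-k * dt)
    var = s ** 2 * (1 - a ** 2) / (2 * k)
    mean = th + (x[:-1] - th) * a
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (x[1:] - mean) ** 2 / var)

res = minimize(negloglik, x0=[1.0, 0.0, 1.0], method="Nelder-Mead")
print(res.x)  # should land near (2.0, -1.0, 0.5)
```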

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 241
3001 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring

Authors: Daniel Fundi Murithi

Abstract:

Data from economic, social, clinical, and industrial studies are in some way incomplete or incorrect due to censoring. Such data may have adverse effects if used in the estimation problem. We propose the use of maximum likelihood estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are obtained under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and the Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the Expectation-Maximization algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller root mean squared errors than those generated via the Newton-Raphson (NR) algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the Expectation-Maximization (EM) algorithm performs better than the Newton-Raphson (NR) algorithm under the progressive type-II censoring scheme.
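
Under progressive type-II censoring with removal numbers R_i, the likelihood is proportional to the product over observed failures of f(x_i) * S(x_i)^R_i. The sketch below simulates such a sample from the two-parameter Rayleigh distribution and maximizes the censored likelihood with a generic numerical optimizer (standing in for the paper's EM and NR iterations; the censoring scheme is an illustrative choice):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)

def progressive_type2(lifetimes, R):
    """Direct simulation: after the i-th observed failure, withdraw R[i]
    randomly chosen surviving units from the test."""
    pool = sorted(lifetimes)
    obs = []
    for r in R:
        obs.append(pool.pop(0))                # next failure
        for _ in range(r):                     # random withdrawals
            pool.pop(rng.integers(len(pool)))
    return np.array(obs)

# Two-parameter Rayleigh: S(x) = exp(-(x-mu)**2 / (2*lam**2)), x > mu.
mu_t, lam_t, n, m = 5.0, 2.0, 50, 20
life = mu_t + lam_t * np.sqrt(-2 * np.log(rng.uniform(size=n)))
R = [2] * 10 + [1] * 10                        # removals; m + sum(R) = n
obs = progressive_type2(life, R)

def negloglik(p):
    mu, lam = p
    if lam <= 0 or mu >= obs.min():
        return np.inf
    z = obs - mu
    logf = np.log(z) - 2 * np.log(lam) - z ** 2 / (2 * lam ** 2)
    logS = -(z ** 2) / (2 * lam ** 2)
    return -np.sum(logf + np.array(R) * logS)

res = minimize(negloglik, x0=[obs.min() - 1.0, 1.0], method="Nelder-Mead")
print(res.x)  # should land near (5.0, 2.0)
```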

Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring

Procedia PDF Downloads 163
3000 Parameter Estimation for the Mixture of Generalized Gamma Model

Authors: Wikanda Phaphan

Abstract:

The mixture generalized gamma distribution is a combination of two distributions: the generalized gamma distribution and the length-biased generalized gamma distribution. These two distributions were presented by Suksaengrakcharoen and Bodhisuwan in 2014. The findings showed that the probability density function (pdf) is fairly complex, which creates problems in parameter estimation. The problem is that the estimators cannot be calculated in closed form, so numerical estimation must be used. In this study, we present a new method of parameter estimation using the expectation-maximization (EM) algorithm, the conjugate gradient method, and the quasi-Newton method. The data were generated by the acceptance-rejection method and used to estimate α, β, λ and p, where λ is the scale parameter, p is the weight parameter, and α and β are the shape parameters. The Monte Carlo technique is used to assess the estimators' performance, with sample sizes of 10, 30 and 100 and 20 replications in each case. We evaluated the effectiveness of the estimators by considering their mean squared errors and bias. The findings revealed that the EM algorithm estimates were close to the actual values, and that the maximum likelihood estimators obtained via the conjugate gradient and quasi-Newton methods are less precise than those obtained via the EM algorithm.

Keywords: conjugate gradient method, quasi-Newton method, EM-algorithm, generalized gamma distribution, length biased generalized gamma distribution, maximum likelihood method

Procedia PDF Downloads 219
2999 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we have analyzed maximum precipitation data during a particular period of time obtained from different stations in the Global Historical Climatology Network of the USA. One important point to mention is that some stations are shut down on certain days for some reason or other; hence, the maximum values are recorded by excluding those readings. It is assumed that the number of stations that operate follows a zero-truncated Poisson random variable and that the daily precipitation follows a lognormal random variable. We call this model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters, and it can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, it is observed that the proposed model provides a better fit than some of the existing models.
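
The model's density follows by summing over the zero-truncated Poisson number of operating stations: with z = (log m - mu)/sigma, the CDF of the daily maximum is (exp(lambda*Phi(z)) - 1)/(exp(lambda) - 1), and differentiating gives the density coded below. The sketch simulates daily maxima and fits the three parameters by direct numerical MLE, a stand-in for the paper's EM algorithm; all numbers are illustrative:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(9)

# Simulate: N operating stations ~ zero-truncated Poisson(lam); the daily
# maximum is the largest of N iid lognormal(mu, sigma) readings.
lam, mu, sigma = 4.0, 1.0, 0.5
N = stats.poisson.rvs(lam, size=500, random_state=rng)
N = N[N > 0][:300]                     # crude zero-truncation by rejection
m = np.array([stats.lognorm.rvs(sigma, scale=np.exp(mu), size=k,
                                random_state=rng).max() for k in N])

# Log-density of the compound truncated Poisson lognormal model:
# g(m) = lam * phi(z) * exp(lam*Phi(z)) / (m * sigma * (exp(lam) - 1)).
def negloglik(p):
    la, m0, s = p
    if la <= 0 or s <= 0:
        return np.inf
    z = (np.log(m) - m0) / s
    logg = (np.log(la) + stats.norm.logpdf(z) + la * stats.norm.cdf(z)
            - np.log(m) - np.log(s) - np.log(np.expm1(la)))
    return -np.sum(logg)

res = optimize.minimize(negloglik, x0=[2.0, 0.0, 1.0], method="Nelder-Mead")
print(res.x)  # should land near (4.0, 1.0, 0.5)
```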

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 108
2998 A Flexible Pareto Distribution Using α-Power Transformation

Authors: Shumaila Ehtisham

Abstract:

In statistical distribution theory, adding an extra parameter to classical distributions is a usual practice. In this study, a new distribution referred to as the α-Power Pareto distribution is introduced by including such a parameter. Several properties of the proposed distribution, including explicit expressions for the moment generating function, mode, quantiles, entropies and order statistics, are obtained. Unknown parameters have been estimated using the maximum likelihood estimation technique. Two real datasets have been considered to examine the usefulness of the proposed distribution. It has been observed that the α-Power Pareto distribution outperforms different variants of the Pareto distribution on the basis of model selection criteria.

Keywords: α-power transformation, maximum likelihood estimation, moment generating function, Pareto distribution

Procedia PDF Downloads 215
2997 Reliability and Probability Weighted Moment Estimation for Three Parameter Mukherjee-Islam Failure Model

Authors: Ariful Islam, Showkat Ahmad Lone

Abstract:

The Mukherjee-Islam model is commonly used as a simple lifetime distribution to assess system reliability. The model exhibits a better fit for failure information and provides more appropriate information about the hazard rate and other reliability measures, as shown by various authors. It is possible to introduce a location parameter (i.e., a time before which failure cannot occur), which makes it a more useful failure distribution than the existing ones. Even after shifting the location of the distribution, it can represent decreasing, constant and increasing failure rates. It has been shown to represent the appropriate lower tail of the distribution of random variables having a fixed lower bound. This study presents the reliability computations and probability weighted moment estimation for the three-parameter model. A comparative analysis is carried out between the three-parameter finite range model and some existing bathtub-shaped curve-fitting models. Since the probability weighted moment method is used, the results obtained can also be applied to small-sample cases. The maximum likelihood estimation method is also applied in this study.

Keywords: comparative analysis, maximum likelihood estimation, Mukherjee-Islam failure model, probability weighted moment estimation, reliability

Procedia PDF Downloads 273
2996 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects

Authors: Sami Mestiri, Abdeljelil Farhat

Abstract:

The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for estimating the logistic regression model with random effects are applied: the penalized quasi-likelihood (PQL) method and the Gibbs sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as the ROC curve.

Keywords: forecasting, credit risk, penalized quasi-likelihood, Gibbs sampler, logistic regression with random effects, ROC curve

Procedia PDF Downloads 542
2995 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme data in an observation record can occur due to unusual circumstances. Such data can provide important information that cannot be provided by other data, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method; the distribution of extreme data sets taken with this method is called the extreme value distribution, which here is the Gumbel distribution with two parameters. The maximum likelihood (ML) estimates of the Gumbel parameters cannot be determined exactly, so a numerical approach is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the parameter value changes at each iteration, and it is modified with the addition of a step length to guarantee convergence; however, the second derivative can require complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by updating an approximation of the second derivative at each iteration, so only the gradient vector is needed in place of the exact Hessian matrix. The parameter estimation of the Gumbel distribution by a numerical approach using the quasi-Newton BFGS method is done by calculating the parameter values that maximize the likelihood. This research covers both theory, through the study of several journals and textbooks, and application. The results of this study are the quasi-Newton BFGS algorithm and the estimates of the Gumbel distribution parameters. The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The results indicate that the intensity of the high rainfall that occurred in Purworejo District has decreased, and the range of rainfall has also decreased.
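
A minimal sketch of the estimator: scipy's BFGS applied to the Gumbel negative log-likelihood sum(log(beta) + z + exp(-z)) with z = (x - mu)/beta. The synthetic maxima stand in for the Purworejo rainfall series, and the scale is optimized on the log scale to keep beta positive:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(10)
mu_t, beta_t = 60.0, 15.0                   # illustrative "rainfall" values
x = mu_t - beta_t * np.log(-np.log(rng.uniform(size=200)))  # Gumbel draws

def negloglik(p):
    mu, log_beta = p                        # log-parameterization: beta > 0
    beta = np.exp(log_beta)
    z = (x - mu) / beta
    return np.sum(log_beta + z + np.exp(-z))

# BFGS builds up an approximation of the Hessian from gradient
# differences, so no analytic second derivatives are required.
res = minimize(negloglik, x0=[x.mean(), np.log(x.std())], method="BFGS")
print(res.x[0], np.exp(res.x[1]))           # should land near (60, 15)
```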

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 324
2994 Comparative Analysis of Two Approaches to Joint Signal Detection, ToA and AoA Estimation in Multi-Element Antenna Arrays

Authors: Olesya Bolkhovskaya, Alexey Davydov, Alexander Maltsev

Abstract:

In this paper, two approaches to joint signal detection, time of arrival (ToA) and angle of arrival (AoA) estimation in a multi-element antenna array are investigated. Two scenarios were considered: in the first, the waveform of the useful signal is known a priori; in the second, the waveform of the desired signal is unknown. For the first scenario, antenna array signal processing based on multi-element matched filtering (MF) with a subsequent non-coherent detection scheme and maximum likelihood (ML) parameter estimation blocks is exploited. For the second scenario, signal processing based on estimation of the covariance matrix of the antenna array elements, with subsequent eigenvector analysis and ML parameter estimation blocks, is applied. The performance characteristics of both signal processing schemes are thoroughly investigated and compared for different useful signal and noise parameters.

Keywords: antenna array, signal detection, ToA, AoA estimation

Procedia PDF Downloads 496
2993 An Estimating Parameter of the Mean in Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares estimation of the mean parameter of the normal distribution by the maximum likelihood (ML), Bayes, and Markov chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data, the Bayes estimator is derived from the prior distribution, and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. Hypothesis testing is then used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variance 4, 9 and 16, with the sample size set to 10, 20, 30 and 50. From the results, it can be seen that the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed with a prior distribution of mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
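
A minimal sketch of the three estimators on one simulated data set, matching the abstract's setting (true mean 2, variance 9, prior N(1, 12)); the inverse-gamma hyperparameters for the Gibbs sampler are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(11)
n, true_mu, true_var = 20, 2.0, 9.0
x = rng.normal(true_mu, np.sqrt(true_var), size=n)

# ML estimator: the sample mean.
mu_ml = x.mean()

# Conjugate Bayes estimator with known variance and prior N(1, 12).
m0, v0 = 1.0, 12.0
v_post = 1.0 / (1.0 / v0 + n / true_var)
mu_bayes = v_post * (m0 / v0 + n * x.mean() / true_var)

# Gibbs sampler for (mu, var) with N(m0, v0) and Inv-Gamma(a0, b0) priors.
a0, b0 = 2.0, 2.0                        # illustrative hyperparameters
mu_s, var_s, draws = x.mean(), x.var(), []
for it in range(5000):
    v_n = 1.0 / (1.0 / v0 + n / var_s)
    mu_s = rng.normal(v_n * (m0 / v0 + n * x.mean() / var_s), np.sqrt(v_n))
    b_n = b0 + 0.5 * np.sum((x - mu_s) ** 2)
    var_s = 1.0 / rng.gamma(a0 + n / 2, 1.0 / b_n)  # inverse-gamma draw
    if it >= 1000:                       # discard burn-in
        draws.append(mu_s)
print(mu_ml, mu_bayes, np.mean(draws))
```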

Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution

Procedia PDF Downloads 356
2992 Efficient Estimation for the Cox Proportional Hazards Cure Model

Authors: Khandoker Akib Mohammad

Abstract:

While analyzing time-to-event data, it is possible that a certain fraction of subjects will never experience the event of interest, and they are said to be cured. When this feature of survival models is taken into account, the models are commonly referred to as cure models. In the presence of covariates, the conditional survival function of the population can be modelled by using the cure model, which depends on the probability of being uncured (incidence) and the conditional survival function of the uncured subjects (latency); a combination of logistic regression and Cox proportional hazards (PH) regression is used to model the incidence and latency, respectively. In this paper, we have shown the asymptotic normality of the profile likelihood estimator via asymptotic expansion of the profile likelihood and obtained the explicit form of the variance estimator with an implicit function in the profile likelihood. We have also shown that the efficient score function based on projection theory and the profile likelihood score function are equal. Our contribution in this paper is that we have expressed the efficient information matrix as the variance of the profile likelihood score function. A simulation study suggests that the estimated standard errors from bootstrap samples (the SMCURE package) and from the profile likelihood score function (our approach) provide similar and comparable results. The numerical results of our proposed method are also shown using the melanoma data from the SMCURE R package, and we compare the results with the output obtained from the SMCURE package.

Keywords: Cox PH model, cure model, efficient score function, EM algorithm, implicit function, profile likelihood

Procedia PDF Downloads 143
2991 Bayesian Estimation under Different Loss Functions Using Gamma Prior for the Case of Exponential Distribution

Authors: Md. Rashidul Hasan, Atikur Rahman Baizid

Abstract:

The Bayesian estimation approach is a non-classical estimation technique in statistical inference and is very useful in real-world situations. The aim of this paper is to study the Bayes estimators of the parameter of the exponential distribution under different loss functions and then compare them with each other as well as with the classical maximum likelihood estimator (MLE). In real life, we always try to minimize the loss, and we also want to gather some prior information (a distribution) about the problem in order to solve it accurately. Here the gamma prior is used as the prior distribution of the exponential parameter for finding the Bayes estimator. In our study, we used different symmetric and asymmetric loss functions, such as the squared error loss function, the quadratic loss function, the modified linear exponential (MLINEX) loss function and the non-linear exponential (NLINEX) loss function. Finally, the mean square errors (MSE) of the estimators are obtained and presented graphically.
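
For exponential data with rate theta and a Gamma(a, b) prior, the posterior is Gamma(a + n, b + sum(x)), and under squared error loss the Bayes estimator is the posterior mean; the MLINEX and NLINEX losses would use a different posterior functional. A minimal sketch comparing the MSE of the MLE and this Bayes estimator (the prior hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(12)
theta, n, a, b, reps = 2.0, 15, 2.0, 1.0, 20_000  # illustrative values

mle = np.empty(reps)
bayes = np.empty(reps)
for i in range(reps):
    x = rng.exponential(1.0 / theta, size=n)
    mle[i] = n / x.sum()                 # maximum likelihood estimator
    bayes[i] = (a + n) / (b + x.sum())   # posterior mean (SE loss)

print("MSE, MLE  :", np.mean((mle - theta) ** 2))
print("MSE, Bayes:", np.mean((bayes - theta) ** 2))
```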

Keywords: Bayes estimator, maximum likelihood estimator (MLE), modified linear exponential (MLINEX) loss function, Squared Error (SE) loss function, non-linear exponential (NLINEX) loss function

Procedia PDF Downloads 384
2990 On Parameter Estimation of Simultaneous Linear Functional Relationship Model for Circular Variables

Authors: N. A. Mokhtar, A. G. Hussin, Y. Z. Zubairi

Abstract:

This paper proposes a new simultaneous simple linear functional relationship model by assuming equal error variances. We derive the maximum likelihood estimates of the parameters in the simultaneous model and the covariance. We show by a simulation study that the small bias values of the parameter estimates suggest the suitability of the estimation method. As an illustration, the proposed simultaneous model is applied to real data on wind direction and wave direction measured by two different instruments.

Keywords: simultaneous linear functional relationship model, Fisher information matrix, parameter estimation, circular variables

Procedia PDF Downloads 366
2989 Estimation of Stress-Strength Parameter for Burr Type XII Distribution Based on Progressive Type-II Censoring

Authors: A. M. Abd-Elfattah, M. H. Abu-Moussa

Abstract:

In this paper, the estimation of the stress-strength parameter R = P(Y < X) is considered, where X and Y, the strength and stress respectively, are two independent random variables following the Burr Type XII distribution. The samples taken for X and Y are progressively type-II censored. The maximum likelihood estimator (MLE) of R is obtained when the common parameter is unknown. When the common parameter is known, the MLE, the uniformly minimum variance unbiased estimator (UMVUE) and the Bayes estimator of R = P(Y < X) are obtained. The exact confidence interval of R based on the MLE is obtained. The performance of the proposed estimators is compared using computer simulation.

Keywords: Burr Type XII distribution, progressive type-II censoring, stress-strength model, unbiased estimator, maximum-likelihood estimator, uniformly minimum variance unbiased estimator, confidence intervals, Bayes estimator

Procedia PDF Downloads 456
2988 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups

Authors: Naushad Mamode Khan

Abstract:

The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs) and handles equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint likelihood CMP function is difficult to specify, which restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was instead considered, but it is rather computationally intensive and may not even estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as GQL-I while being far more computationally stable.

Keywords: longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL

Procedia PDF Downloads 354
2987 Tenants Use Less Input on Rented Plots: Evidence from Northern Ethiopia

Authors: Desta Brhanu Gebrehiwot

Abstract:

The study aims to investigate the impact of land tenure arrangements on fertilizer use per hectare in Northern Ethiopia, using household- and plot-level data for the analysis. Land tenure contracts such as sharecropping and fixed-rent arrangements are endogenous, since different unobservable characteristics may affect renting-out decisions. Thus, the appropriate method of analysis is the instrumental variable estimation technique. Therefore, the family of instrumental variable estimation methods was used: two-stage least squares (2SLS), the generalized method of moments (GMM), limited information maximum likelihood (LIML), and instrumental variable Tobit (IV-Tobit). Besides, a two-step method to handle a binary endogenous variable is applied: in the first step, a probit model includes the instruments, and in the second step, maximum likelihood estimation (MLE) is used (the "etregress" command in Stata 14). Fertilizer use per hectare was lower on sharecropped and fixed-rent plots relative to owner-operated plots. The result supports the Marshallian inefficiency principle in sharecropping. The difference in fertilizer use per hectare could be explained by a lack of incentivizing, detailed contract forms: giving a larger share of the output to the tenant under sharecropping contracts would motivate the use of more fertilizer on rented plots to maximize production, whereas most sharecropping arrangements share output equally between tenants and landlords.

Keywords: tenure-contracts, endogeneity, plot-level data, Ethiopia, fertilizer

Procedia PDF Downloads 86
2986 A New Distribution and Application on the Lifetime Data

Authors: Gamze Ozel, Selen Cakmakyapan

Abstract:

We introduce a new model called the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution using the Marshall-Olkin transformation and has increasing and decreasing shapes of the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, some entropy measures, and order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potential of the new model is illustrated by means of a real-life data set.

Keywords: Marshall-Olkin distribution, Rayleigh distribution, estimation, maximum likelihood

Procedia PDF Downloads 501
2985 Introduction of Robust Multivariate Process Capability Indices

Authors: Behrooz Khalilloo, Hamid Shahriari, Emad Roghanian

Abstract:

Process capability indices (PCIs) are important concepts in statistical quality control; they measure the capability of processes and how well processes meet certain specifications. An important issue in statistical quality control is parameter estimation. Under the assumption of multivariate normality, the distribution parameters, the mean vector and the variance-covariance matrix, must be estimated when they are unknown. Classic estimation methods like the method of moments (MME) or maximum likelihood estimation (MLE) estimate the population parameters well when the data are not contaminated. But when outliers exist in the data, MME and MLE are weak estimators of the population parameters, so estimators that perform well in the presence of outliers are needed. In this work, robust M-estimators are used for estimating these parameters, and robust process capability indices based on the robust parameter estimators are introduced. The performance of these robust estimators in the presence of outliers, and their effects on the process capability indices, are evaluated with real and simulated multivariate data. The results indicate that the proposed robust capability indices perform much better than the existing process capability indices.

Keywords: multivariate process capability indices, robust M-estimator, outlier, multivariate quality control, statistical quality control

Procedia PDF Downloads 283
2984 Platform Virtual for Joint Amplitude Measurement Based in MEMS

Authors: Mauro Callejas-Cuervo, Andrea C. Alarcon-Aldana, Andres F. Ruiz-Olaya, Juan C. Alvarez

Abstract:

Motion capture (MC) is the construction of a precise and accurate digital representation of a real motion. MC systems have been used in recent years in a wide range of applications, from film special effects and animation, interactive entertainment and medicine to highly competitive sport, where maximum performance and low injury risk during training and competition are sought. This paper presents a technological platform based on inertial and magnetic sensors, intended for joint amplitude monitoring and telerehabilitation processes, with an efficient compromise between cost and technical performance. The platform's particular features offer high social impact potential by making telerehabilitation accessible to large population sectors of marginal socio-economic status, especially in underdeveloped countries where, in contrast to developed countries, specialists are scarce and high technology is unavailable or nonexistent. The platform integrates high-resolution, low-cost inertial and magnetic sensors with adequate user interfaces and communication protocols to provide a diagnosis service over the web or other available communication networks. The amplitude information is generated by the sensors and then transferred to a computing device with interfaces that make it accessible to inexperienced personnel, providing high social value. Amplitude measurements of the virtual platform system presented a good fit to the respective reference system. Analyzing the robotic arm results (estimation errors RMSE1 = 2.12° and RMSE2 = 2.28°), it can be observed that during arm motion in either direction the estimation error is negligible; in fact, error appears only during direction reversal, which is easily explained by the nature of inertial sensors and their relation to acceleration. Inertial sensors present a time-constant delay which acts as a first-order filter, attenuating signals at large acceleration values, as is the case for a change of direction in motion. A damped response of the virtual platform can also be seen: error analysis shows an underestimation of amplitude at maximum amplitude, whereas an overestimation is observed at minimum amplitude. This work presents and describes the virtual platform as a motion capture system suitable for telerehabilitation, with the cost-quality and precision-accessibility trade-offs optimized. These characteristics, achieved by efficiently using state-of-the-art, accessible generic technology in sensors and hardware, together with adequate software for capture, transmission, analysis and visualization, provide the capacity to offer good telerehabilitation services to large, more or less marginalized populations where technologies and specialists are not available but basic communication networks are accessible.

Keywords: inertial sensors, joint amplitude measurement, MEMS, telerehabilitation

Procedia PDF Downloads 259