Search results for: unbiased estimator
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 261

201 X̄ and S Control Charts Based on the Weighted Standard Deviation Method

Authors: Derya Karagöz

Abstract:

A Shewhart chart based on the normality assumption is not appropriate for skewed distributions, since its Type-I error rate is inflated. This study presents X̄ and S control charts for monitoring process variability under skewed distributions. We propose Weighted Standard Deviation (WSD) X̄ and S control charts. In the WSD X̄ and S charts, the process standard deviation is estimated with the standard deviation estimator, as it is simple and easy to compute. Unlike the Shewhart control chart, the proposed charts construct asymmetric upper and lower limits in accordance with the direction and degree of skewness. The performance of the proposed charts is compared with that of other heuristic charts for skewed distributions in a simulation study. The simulations show that the proposed control charts have good properties for skewed distributions and large sample sizes.
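As an illustration, a minimal sketch of asymmetric WSD-style X̄ limits follows; the weighting rule (splitting sigma by the fraction of observations below the mean) and the factor L = 3 are assumptions for illustration, not the paper's exact constants.

```python
import numpy as np

def wsd_xbar_limits(subgroups, L=3.0):
    """Asymmetric X-bar limits via a weighted-standard-deviation rule.

    subgroups: 2-D array, one subgroup of measurements per row.
    The overall sigma is split into upper/lower parts according to the
    fraction of observations below the mean, so the longer tail of a
    skewed distribution receives the wider control limit.
    """
    sub = np.asarray(subgroups, dtype=float)
    data, n = sub.ravel(), sub.shape[1]
    mu, sigma = data.mean(), data.std(ddof=1)
    p = np.mean(data <= mu)                      # captures skew direction
    ucl = mu + L * (2.0 * p) * sigma / np.sqrt(n)
    lcl = mu - L * (2.0 * (1.0 - p)) * sigma / np.sqrt(n)
    return lcl, mu, ucl

# Example with right-skewed subgroup data
rng = np.random.default_rng(0)
print(wsd_xbar_limits(rng.lognormal(0.0, 0.8, size=(50, 5))))
```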

Keywords: weighted standard deviation, MAD, skewed distributions, S control charts

Procedia PDF Downloads 368
200 New Segmentation of Piecewise Linear Regression Models Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

Piecewise linear regression models are very flexible models for modeling data. When a piecewise linear regression model is fitted to data, its parameters are generally unknown. This paper studies the problem of parameter estimation for piecewise linear regression models. A Bayesian method is used to estimate the parameters, but the Bayes estimator cannot be found analytically. To overcome this problem, the reversible jump MCMC algorithm is proposed. The reversible jump MCMC algorithm generates a Markov chain that converges to the posterior distribution of the parameters of the piecewise linear regression model. The resulting Markov chain is used to calculate the Bayes estimator of the parameters.
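A schematic toy version of such a reversible jump sampler is sketched below. It treats only the birth/death moves over breakpoints, assumes a Gaussian likelihood with known noise scale, a Poisson prior on the number of breakpoints, and a per-segment OLS profile fit, and omits the proposal-ratio and Jacobian bookkeeping of the full algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def seg_loglik(x, y, bps, sigma=0.5):
    """Gaussian log-likelihood with a separate OLS line per segment."""
    edges = np.concatenate(([x.min() - 1e-9], np.sort(bps), [x.max()]))
    ll = 0.0
    for a, b in zip(edges[:-1], edges[1:]):
        m = (x > a) & (x <= b)
        if m.sum() < 3:
            return -np.inf                       # reject tiny segments
        X = np.column_stack([np.ones(m.sum()), x[m]])
        beta, *_ = np.linalg.lstsq(X, y[m], rcond=None)
        ll -= 0.5 * np.sum((y[m] - X @ beta) ** 2) / sigma**2
    return ll

def rj_sampler(x, y, iters=3000, lam=1.0):
    """Toy birth/death moves over the number and location of breakpoints."""
    bps, ll, trace = [], seg_loglik(x, y, []), []
    for _ in range(iters):
        if rng.random() < 0.5 or not bps:        # birth: add a breakpoint
            prop = bps + [rng.uniform(x.min(), x.max())]
            log_prior_ratio = np.log(lam / len(prop))   # Poisson(lam) prior
        else:                                    # death: drop one at random
            prop = list(bps)
            prop.pop(rng.integers(len(prop)))
            log_prior_ratio = np.log(len(bps) / lam)
        ll_prop = seg_loglik(x, y, prop)
        if np.log(rng.random()) < ll_prop - ll + log_prior_ratio:
            bps, ll = prop, ll_prop
        trace.append(len(bps))
    return bps, trace

# Two-segment toy data: slope changes at x = 5
x = np.sort(rng.uniform(0, 10, 200))
y = np.where(x < 5, 1.0 * x, 5.0 + 3.0 * (x - 5)) + rng.normal(0, 0.5, x.size)
bps, trace = rj_sampler(x, y)
print("posterior mode of breakpoint count:", np.bincount(trace).argmax())
```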

Keywords: regression, piecewise, Bayesian, reversible jump MCMC

Procedia PDF Downloads 489
199 Segmentation of Piecewise Polynomial Regression Model by Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

The piecewise polynomial regression model is a very flexible model for modeling data. When the piecewise polynomial regression model is fitted to data, its parameters are generally unknown. This paper studies the parameter estimation problem for the piecewise polynomial regression model. A Bayesian method is used to estimate the parameters. Unfortunately, the Bayes estimator cannot be found analytically. A reversible jump MCMC algorithm is proposed to solve this problem. The reversible jump MCMC algorithm generates a Markov chain that converges to the posterior distribution of the piecewise polynomial regression model parameters. The resulting Markov chain is used to calculate the Bayes estimator of the parameters.
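The only structural change relative to the piecewise linear case of the previous entry is the per-segment design matrix; a minimal sketch of a degree-p polynomial design for one segment (assuming NumPy) is:

```python
import numpy as np

def poly_design(x_seg, degree=2):
    """Vandermonde-style design matrix [1, x, x^2, ...] for one segment."""
    return np.vander(x_seg, N=degree + 1, increasing=True)

print(poly_design(np.array([1.0, 2.0, 3.0])))
```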

Keywords: piecewise regression, Bayesian, reversible jump MCMC, segmentation

Procedia PDF Downloads 340
198 New Estimation in Autoregressive Models with Exponential White Noise by Using Reversible Jump MCMC Algorithm

Authors: Suparman Suparman

Abstract:

The white noise in an autoregressive (AR) model is often assumed to be normally distributed. In applications, however, the white noise frequently does not follow a normal distribution. This paper aims to estimate the parameters of an AR model with exponential white noise. A Bayesian method is adopted: a prior distribution for the parameters of the AR model is selected and combined with the likelihood function of the data to obtain a posterior distribution. Based on this posterior distribution, a Bayesian estimator for the parameters of the AR model is derived. Because the order of the AR model is itself treated as a parameter, this Bayesian estimator cannot be calculated explicitly. To resolve this problem, the reversible jump Markov chain Monte Carlo (MCMC) method is adopted. As a result, the parameters of the AR model, including its order, can be estimated simultaneously.
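As a simplified illustration of the Bayesian step, with the AR order fixed at one so that the reversible jump move over the order is omitted, a random-walk Metropolis sampler for an AR(1) model with Exp(λ) noise might look as follows; the priors (flat on the AR coefficient, Exp(1) on λ) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def loglik(x, a, lam):
    """AR(1) with exponential noise: x[t] = a*x[t-1] + e[t], e[t] ~ Exp(lam)."""
    e = x[1:] - a * x[:-1]
    if lam <= 0 or np.any(e < 0):
        return -np.inf                # exponential noise must be non-negative
    return e.size * np.log(lam) - lam * e.sum()

def metropolis(x, iters=20000, step=0.02):
    a, lam = 0.0, 1.0
    lp = loglik(x, a, lam) - lam      # Exp(1) prior on lam, flat prior on a
    draws = []
    for _ in range(iters):
        a_p = a + step * rng.normal()
        lam_p = abs(lam + step * rng.normal())   # reflecting walk stays > 0
        lp_p = loglik(x, a_p, lam_p) - lam_p
        if np.log(rng.random()) < lp_p - lp:
            a, lam, lp = a_p, lam_p, lp_p
        draws.append((a, lam))
    return np.array(draws)

# Simulate data and recover the parameters
true_a, true_lam, n = 0.6, 2.0, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = true_a * x[t - 1] + rng.exponential(1 / true_lam)
draws = metropolis(x)
print(draws[len(draws) // 2:].mean(axis=0))      # posterior means, post burn-in
```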

Keywords: autoregressive (AR) model, exponential white noise, Bayesian, reversible jump Markov chain Monte Carlo (MCMC)

Procedia PDF Downloads 328
197 Fathers' Knowledge and Attitude towards Breastfeeding: A Cross-Sectional Study

Authors: Jacqueline R. Llamas, Agnes Regal

Abstract:

Objective: To determine the breastfeeding knowledge and attitudes of fathers seen at the University of Santo Tomas Hospital. Design: Cross-sectional design. Setting: University of Santo Tomas Hospital (USTH). Participants: 156 fathers who were accompanying their wives/children at the USTH. Findings: The Iowa Infant Feeding Attitude Scale showed fathers to be generally unbiased as to whether their child is fed breast milk or milk formula. About 85% agreed that breast milk is the ideal food for babies, 79% believed that breastfed babies are healthier than formula-fed ones, and 55% did not believe that breast milk lacks iron. About 80% agreed that breast milk is easily digested, 87% were aware of its economic value, and 57% agreed on its convenience. Breastfeeding support was noted: 55% of the fathers would encourage mothers to breastfeed so as not to miss the joys of motherhood, and 91% believed that breastfeeding increases mother-infant bonding. About 57% do not feel left out when the mothers breastfeed. However, 46.6% support the decision of their wives to switch to formula feeding once they go back to work, only 42% find breastfeeding in public acceptable, and 57% would not allow breastfeeding by mothers who drink alcohol. Conclusion: Although the fathers' attitude is unbiased between breastfeeding and formula feeding, the majority of the fathers appreciate breastfeeding and its benefits. The fathers' level of education, age, profession, household income, and number of children also had an effect on their attitudes towards breastfeeding.

Keywords: father, breastfeeding, breast milk, knowledge

Procedia PDF Downloads 391
196 A Nonlinear Dynamical System with Application

Authors: Abdullah Eqal Al Mazrooei

Abstract:

In this paper, a nonlinear dynamical system is presented. This system belongs to the bilinear class. Bilinear systems are a very important kind of nonlinear system because they have many applications in real life. They are used in biology, chemistry, manufacturing, engineering, and economics, where linear models are ineffective or inadequate. They have also recently been used to analyze and forecast weather conditions. Bilinear systems have three advantages: first, they define many problems of great applied importance; second, they provide approximations to nonlinear systems; third, they have rich geometric and algebraic structures, which promise to be a fruitful field of research for scientists and applications. The type of nonlinearity treated and analyzed here consists of a bilinear interaction between the state vector and the system input. By using some properties of the tensor product, such systems can be transformed into linear systems. Here, however, we discuss the nonlinearity that arises when the state vector is multiplied by itself. The model is thus able to handle evolutions of the Lotka-Volterra type or the Lorenz weather models, enabling a wider and more flexible application of such models. As an application, an estimator is used to estimate temperatures. The results prove the efficiency of the proposed system.
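A minimal sketch of a system in which the state multiplies itself, here a Lotka-Volterra model with a simple output-injection estimator running alongside it, is shown below; the Euler discretization and the observer gain are assumed values for illustration, not the paper's estimator.

```python
import numpy as np

def lotka_volterra(z, alpha=1.0, beta=0.4, delta=0.1, gamma=0.4):
    """Quadratic (state-times-state) dynamics: prey x, predator y."""
    x, y = z
    return np.array([alpha * x - beta * x * y,
                     delta * x * y - gamma * y])

dt, steps = 0.01, 5000
z = np.array([10.0, 5.0])            # true state
z_hat = np.array([8.0, 6.0])         # estimator state, deliberately wrong
L = 0.5                              # observer gain (an assumed value)

for _ in range(steps):
    z = z + dt * lotka_volterra(z)                 # true system (Euler step)
    y_meas = z[0]                                  # measure prey only
    # model copy plus output-injection correction on the measured state
    z_hat = z_hat + dt * lotka_volterra(z_hat)
    z_hat[0] += dt * L * (y_meas - z_hat[0])

print("true:", z, "estimate:", z_hat)
```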

Keywords: Lorenz models, nonlinear systems, nonlinear estimator, state-space model

Procedia PDF Downloads 233
195 A Novel Software Model for Enhancement of System Performance and Security through an Optimal Placement of PMU and FACTS

Authors: R. Kiran, B. R. Lakshmikantha, R. V. Parimala

Abstract:

Secure operation of power systems requires monitoring of the system operating conditions. Phasor measurement units (PMUs) are devices that use synchronized signals from GPS satellites and provide phasor information of the voltages and currents at a given substation. The optimal locations for the PMUs must be determined in order to avoid redundant use of PMUs. The objective of this paper is to make the system observable using a minimum number of PMUs, and to implement stability software on a 220 kV grid for on-line estimation of the power system transfer capability, based on voltage and thermal limitations, and for security monitoring. This software utilizes State Estimator (SE) and synchrophasor PMU data sets for determining the power system operational margin under normal and contingency conditions. The software improves the security of the transmission system by continuously monitoring the operational margin, expressed in MW or in bus voltage angles, and alerts the operator if the margin violates a pre-defined threshold.
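To give a feel for the observability side of the problem, a sketch of minimal PMU placement as a greedy set cover follows; the toy network and the rule that a PMU observes its own bus and all adjacent buses are simplifying assumptions, and the paper's exact formulation may differ.

```python
import numpy as np

def greedy_pmu_placement(adj):
    """Greedy set cover: repeatedly place a PMU at the bus whose coverage
    (its own bus plus neighbours) observes the most unobserved buses."""
    n = len(adj)
    coverage = [{i} | set(np.flatnonzero(adj[i])) for i in range(n)]
    unobserved, placed = set(range(n)), []
    while unobserved:
        best = max(range(n), key=lambda i: len(coverage[i] & unobserved))
        placed.append(best)
        unobserved -= coverage[best]
    return placed

# Toy 7-bus ring network (adjacency matrix is illustrative only)
A = np.array([[0,1,0,0,0,0,1],
              [1,0,1,0,0,0,0],
              [0,1,0,1,0,0,0],
              [0,0,1,0,1,0,0],
              [0,0,0,1,0,1,0],
              [0,0,0,0,1,0,1],
              [1,0,0,0,0,1,0]])
print("PMU buses:", greedy_pmu_placement(A))
```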

Keywords: state estimator (SE), flexible ac transmission systems (FACTS), optimal location, phasor measurement units (PMU)

Procedia PDF Downloads 386
194 Weighted Rank Regression with Adaptive Penalty Function

Authors: Kang-Mo Jung

Abstract:

The use of regularization in statistical methods has become popular. The least absolute shrinkage and selection operator (LASSO) framework has become the standard tool for sparse regression. However, it is well known that the LASSO is sensitive to outliers and leverage points. We consider a new robust estimator composed of a weighted loss function of the pairwise differences of residuals and an adaptive penalty function regulating the tuning parameter for each variable. Rank regression is resistant to regression outliers, but not to leverage points. By adopting a weighted loss function, the proposed method is robust to leverage points in the predictor variables. Furthermore, the adaptive penalty function gives good statistical properties in variable selection, such as the oracle property and consistency. We develop an efficient algorithm to compute the proposed estimator using base functions in R, with an optimal tuning parameter chosen by the Bayesian information criterion (BIC). Numerical simulations show that the proposed estimator is effective for analyzing real and contaminated data sets.
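A sketch of the estimator's two ingredients, a weighted pairwise-difference (rank) loss and an adaptive L1 penalty, follows; the leverage-based weight rule, the penalty weights from a pilot fit, and the derivative-free optimizer are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def weighted_rank_loss(beta, X, y, w, lam, pen_w):
    """Weighted rank (pairwise residual difference) loss + adaptive L1."""
    r = y - X @ beta
    i, j = np.triu_indices(len(r), k=1)
    loss = np.sum(w[i] * w[j] * np.abs(r[i] - r[j])) / len(i)
    return loss + lam * np.sum(pen_w * np.abs(beta))

rng = np.random.default_rng(3)
n, p = 80, 3
X = rng.normal(size=(n, p))
X[0] *= 8                                   # plant a leverage point
y = X @ np.array([2.0, 0.0, -1.0]) + rng.normal(scale=0.5, size=n)

# Downweight high-leverage rows (a simple robust-distance style rule)
d = np.sqrt(((X - X.mean(0)) ** 2).sum(1))
w = np.minimum(1.0, np.median(d) / d)

beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
pen_w = 1.0 / (np.abs(beta0) + 1e-6)        # adaptive weights from pilot fit
fit = minimize(weighted_rank_loss, beta0, args=(X, y, w, 0.05, pen_w),
               method="Nelder-Mead")
print(fit.x)                                 # close to (2, 0, -1)
```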

Keywords: adaptive penalty function, robust penalized regression, variable selection, weighted rank regression

Procedia PDF Downloads 431
193 Median-Based Nonparametric Estimation of Returns in Mean-Downside Risk Portfolio Frontier

Authors: H. Ben Salah, A. Gannoun, C. de Peretti, A. Trabelsi

Abstract:

The Downside Risk (DSR) model for portfolio optimisation makes it possible to overcome the drawbacks of the classical mean-variance model concerning the asymmetry of returns and the risk perception of investors. The optimization in this model deals with a positive definite matrix that is endogenous with respect to the portfolio weights, an aspect that makes the problem far more difficult to handle. For this purpose, Athayde (2001) developed a recursive minimization procedure that ensures convergence to the solution. However, when only a finite number of observations is available, the portfolio frontier is not very smooth. To overcome this, Athayde (2003) proposed a mean kernel estimation of the returns, so as to create a smoother portfolio frontier, providing an effect similar to the case of continuous observations. In this paper, taking advantage of the robustness of the median, we replace the mean estimator in Athayde's model by a nonparametric median estimator of the returns, and give a new version of the former algorithm of Athayde (2001, 2003). We then analyse the properties of this improved portfolio frontier and apply the new method to real examples.
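A sketch of the recursive semivariance minimization is given below, with a median benchmark standing in for the paper's nonparametric median estimation of returns; the scalar benchmark, the small ridge term, and the absence of short-sale constraints are simplifying assumptions.

```python
import numpy as np

def downside_frontier_weights(R, benchmark=None, iters=50):
    """Recursive minimization of portfolio semivariance (Athayde-style).

    R: T x N matrix of asset returns. The semivariance matrix is
    endogenous: it is rebuilt from the periods where the current
    portfolio falls below the benchmark, then re-minimized.
    """
    T, N = R.shape
    b = np.median(R) if benchmark is None else benchmark
    w = np.full(N, 1.0 / N)                      # start from equal weights
    for _ in range(iters):
        port = R @ w
        D = R[port < b] - b                      # returns in downside periods
        M = D.T @ D / T                          # endogenous semivariance matrix
        Minv1 = np.linalg.solve(M + 1e-8 * np.eye(N), np.ones(N))
        w_new = Minv1 / Minv1.sum()              # min-semivariance weights
        if np.allclose(w, w_new, atol=1e-10):
            break
        w = w_new
    return w

rng = np.random.default_rng(4)
R = rng.normal(0.001, 0.02, size=(500, 4)) * np.array([1.0, 1.5, 0.8, 1.2])
print(downside_frontier_weights(R))
```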

Keywords: downside risk, kernel method, median, nonparametric estimation, semivariance

Procedia PDF Downloads 459
192 Environmentally Adaptive Acoustic Echo Suppression for Barge-in Speech Recognition

Authors: Jong Han Joo, Jung Hoon Lee, Young Sun Kim, Jae Young Kang, Seung Ho Choi

Abstract:

In this study, we propose a novel technique for acoustic echo suppression (AES) during speech recognition under barge-in conditions. Conventional AES methods based on spectral subtraction apply fixed weights to the echo path transfer function (EPTF) estimated on the current signal segment and to the EPTF estimated up to the previous time interval. We propose a new approach that adaptively updates the weight parameters in response to abrupt changes in the acoustic environment caused by background noise or double-talk. Furthermore, we devised a voice activity detector and an initial time-delay estimator for barge-in speech recognition in communication networks. The initial time delay is estimated using a log-spectral distance measure as well as cross-correlation coefficients. Experimental results show that the developed techniques can be successfully applied in barge-in speech recognition systems.

Keywords: acoustic echo suppression, barge-in, speech recognition, echo path transfer function, initial delay estimator, voice activity detector

Procedia PDF Downloads 344
191 Home Range and Spatial Interaction Modelling of Black Bears

Authors: Fekadu L. Bayisa, Elvan Ceyhan, Todd D. Steury

Abstract:

Interaction between individuals within the same species is an important component of population dynamics. An interaction can be either static (based on spatial overlap) or dynamic (based on movement interactions). Using GPS collar data, we can quantify both static and dynamic interactions between black bears. The goal of this work is to determine the level of black bear interactions using the 95% and 50% home ranges, and to model black bear spatial interactions, which could be attraction, avoidance/repulsion, or no interaction at all, in order to gain new insights and improve our understanding of ecological processes. Recent methodological developments in home range estimation, inhomogeneous multitype/cross-type summary statistics, and envelope testing methods are explored to study the nature of black bear interactions. Our findings, in general, indicate that black bears of one type in our data set tend to cluster around another type.
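As one building block, a 95% home range can be sketched as the highest-density region of a kernel density estimate of the GPS fixes; the grid-based construction below is an illustration and ignores the autocorrelation adjustment implied by the keywords.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_home_range_mask(points, level=0.95, grid=200):
    """Grid cells inside the 95% highest-density region of the fixes."""
    kde = gaussian_kde(points.T)
    xg = np.linspace(points[:, 0].min(), points[:, 0].max(), grid)
    yg = np.linspace(points[:, 1].min(), points[:, 1].max(), grid)
    XX, YY = np.meshgrid(xg, yg)
    dens = kde(np.vstack([XX.ravel(), YY.ravel()]))
    order = np.argsort(dens)[::-1]                 # densest cells first
    csum = np.cumsum(dens[order]) / dens.sum()
    thresh = dens[order][np.searchsorted(csum, level)]
    return dens.reshape(grid, grid) >= thresh

rng = np.random.default_rng(11)
fixes = rng.normal(size=(500, 2))                  # illustrative GPS fixes
print("share of grid cells in 95% range:", kde_home_range_mask(fixes).mean())
```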

Keywords: autocorrelated kernel density estimator, cross-type summary function, inhomogeneous multitype Poisson process, kernel density estimator, minimum convex polygon, pointwise and global envelope tests

Procedia PDF Downloads 49
190 Stereoselective Glycosylation and Functionalization of Unbiased Site of Sweet System via Dual-Catalytic Transition Metal Systems/Wittig Reaction

Authors: Mukul R. Gupta, Rajkumar Gandhi, Rajitha Sachan, Naveen K. Khare

Abstract:

The field of glycoscience has burgeoned in the last several decades, leading to the identification of many glycosides that could serve critical roles in a wide range of biological processes. This has prompted a resurgence of synthetic interest, with a particular focus on new approaches to construct glycosidic bonds selectively. Despite the numerous elegant strategies and methods developed for the formation of glycosidic bonds, the stereoselective construction of glycosides remains challenging. We have recently developed a novel hexafluoroisopropanol (HFIP)-catalyzed stereoselective glycosylation method using a KDN imidate glycosyl donor and a variety of alcohols, in excellent yield. This method is broadly applicable to a wide range of substrates, with excellent selectivity of the glycoside. We also report the functionalization of the unbiased site of the newly formed glycosides by dual-catalytic transition metal systems (Ru- or Fe-based), using the innovative 'Reverse & Catalyst' strategy: a reversible activation reaction by one catalyst is combined with a functionalization reaction by another, enabling functionalization of substrates at their inherently unreactive sites. In addition, we target the synthesis of diSia derivatives by the Wittig reaction. The synthetic method proceeds under mild conditions, demonstrates the functional-group tolerance of the dual-catalytic systems, and highlights the potential of the multicatalytic approach to address challenging transformations while avoiding multistep procedures in carbohydrate synthesis.

Keywords: KDN, stereoselective glycosylation, dual-catalytic functionalization, Wittig reaction

Procedia PDF Downloads 163
189 Carbohydrate Intake Estimation in Type I Diabetic Patients Described by UVA/Padova Model

Authors: David A. Padilla, Rodolfo Villamizar

Abstract:

In recent years, closed-loop control strategies have been developed to establish a healthy glucose profile in type 1 diabetes mellitus (T1DM) patients. However, the controller itself is unable to define a suitable reference trajectory for glucose. In this paper, a control strategy is proposed where the shape of the reference trajectory is generated based on the amount of carbohydrates present during the digestive process. Since no sensor exists to measure the amount of carbohydrates consumed, an estimator is proposed. This paper thus presents the entire process of designing a carbohydrate estimator, which allows the disturbance to be estimated for a model predictive controller (MPC) in a T1DM patient; the estimate is used to establish a reference profile and improve the response of the controller by providing information on the ingested carbohydrates. The dynamics of the diabetic model are given by the equations of the UVA/Padova model in the T1DMS simulator. The system was developed and simulated in Simulink, taking into account the noise and the limitations of the glucose control system actuators.

Keywords: estimation, glucose control, predictive controller, MPC, UVA/Padova

Procedia PDF Downloads 231
188 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures

Authors: Adriano Z. Zambom, Preethi Ravikumar

Abstract:

One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effects of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work, the efficiency of completely nonparametric regression estimators such as Loess is compared to that of estimators that assume additivity, in several situations including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean square error of the estimators with respect to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike information criterion is proposed, computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of time it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed because of the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure, and the selected variables are identified.
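The backward elimination idea can be sketched as follows, with an ordinary least-squares fit standing in for the paper's additive or nonparametric estimators; the Gaussian AIC formula is the usual textbook form.

```python
import numpy as np

def aic(X, y):
    """Gaussian AIC for a least-squares fit with intercept."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)
    return n * np.log(rss / n) + 2 * Xd.shape[1]

def backward_elimination(X, y):
    """Drop the variable whose removal lowers AIC most, until none does."""
    active = list(range(X.shape[1]))
    best = aic(X[:, active], y)
    improved = True
    while improved and len(active) > 1:
        improved = False
        for j in list(active):
            trial = [k for k in active if k != j]
            score = aic(X[:, trial], y)
            if score < best:
                best, active, improved = score, trial, True
    return active

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(size=200)   # only 0 and 3 matter
print("selected variables:", backward_elimination(X, y))
```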

Keywords: additive model, nonparametric regression, variable selection, Akaike information criterion

Procedia PDF Downloads 240
187 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensic Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute the likelihood ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, each involving a set of assumptions and methods for a given data set. It is therefore important to know how repeatable and reproducible our estimated LR is. This paper evaluates the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented, so as to avoid an incorrect estimate being used to deliver a wrong judgment in a court of law. The LR is fundamentally a Bayesian concept, and we used two LR estimators, namely logistic regression (LoR) and the kernel density estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months, to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals in different regions, which implies that the LR cannot be estimated with the same certainty everywhere. Though the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
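The two score-based LR estimators can be sketched as follows, treating the LR as a function of comparison scores; the simulated score distributions and bandwidth defaults are assumptions for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
# Similarity scores for same-writer (H_p) and different-writer (H_d) pairs
s_same = rng.normal(0.8, 0.15, 300)
s_diff = rng.normal(0.4, 0.20, 300)

# --- KDE estimator: LR = f(score | H_p) / f(score | H_d)
f_p, f_d = gaussian_kde(s_same), gaussian_kde(s_diff)
def lr_kde(s):
    return f_p(s) / f_d(s)

# --- Logistic-regression estimator: posterior odds divided by prior odds
X = np.concatenate([s_same, s_diff]).reshape(-1, 1)
lab = np.concatenate([np.ones(300), np.zeros(300)])
clf = LogisticRegression().fit(X, lab)
def lr_lor(s):
    p = clf.predict_proba(np.atleast_2d(s).T)[:, 1]
    prior_odds = lab.mean() / (1 - lab.mean())
    return (p / (1 - p)) / prior_odds

score = np.array([0.75])
print("LR(KDE) =", lr_kde(score), "LR(LoR) =", lr_lor(score))
```

A bootstrap over the score data would then give the confidence intervals for the estimated LR discussed in the abstract.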

Keywords: confidence interval, handwriting, kernel density estimator (KDE), logistic regression (LoR), repeatability, reproducibility

Procedia PDF Downloads 95
186 An Unbiased Profiling of Immune Repertoire via Sequencing and Analyzing T-Cell Receptor Genes

Authors: Yi-Lin Chen, Sheng-Jou Hung, Tsunglin Liu

Abstract:

The adaptive immune system recognizes a wide range of antigens by expressing a large number of structurally distinct T cell and B cell receptor genes. The distinct receptor genes arise from complex rearrangements called V(D)J recombination and constitute the immune repertoire. A common method of profiling the immune repertoire is to amplify recombined receptor genes using multiple primers and high-throughput sequencing. This multiplex-PCR approach is efficient; however, the resulting repertoire can be distorted because of primer bias. To eliminate primer bias, 5' RACE is an alternative amplification approach. However, the application of the RACE approach is limited by its low efficiency (i.e., the majority of the data are non-regular receptor sequences, e.g., sequences containing intronic segments) and by the lack of convenient analysis tools. We propose a computational tool that can correctly identify non-regular receptor sequences in RACE data by aligning receptor sequences against the whole gene, instead of only the exon regions as done in all other tools. Using our tool, the remaining regular data allow an accurate profiling of the immune repertoire. In addition, the RACE approach is improved to yield a higher fraction of regular T-cell receptor sequences. Finally, we quantify the degree of primer bias of a multiplex-PCR approach by comparing it to the RACE approach. The results reveal significant differences in the frequency of VJ combinations between the two approaches. Together, we provide a new experimental and computational pipeline for unbiased profiling of the immune repertoire. As immune repertoire profiling has many applications, e.g., tracing bacterial and viral infection, detection of T cell lymphoma and minimal residual disease, and monitoring cancer immunotherapy, our work should benefit scientists interested in these applications.

Keywords: immune repertoire, T-cell receptor, 5' RACE, high-throughput sequencing, sequence alignment

Procedia PDF Downloads 161
185 Analysis of Filtering in Stochastic Systems on Continuous-Time Memory Observations in the Presence of Anomalous Noises

Authors: S. Rozhkova, O. Rozhkova, A. Harlova, V. Lasukov

Abstract:

For the optimal mean-square unbiased filter, in the case of anomalous noises acting in the observation memory channel, we prove the insensitivity of the filter to inaccurate knowledge of the anomalous noise intensity matrix, and its equivalence to a truncated filter constructed only from the non-anomalous components of the observation vector.

Keywords: mathematical expectation, filtration, anomalous noise, memory

Procedia PDF Downloads 334
184 Efficient Principal Components Estimation of Large Factor Models

Authors: Rachida Ouysse

Abstract:

This paper proposes a constrained principal components (CnPC) estimator for efficient estimation of large-dimensional factor models when errors are cross-sectionally correlated and the number of cross-sections (N) may be larger than the number of observations (T). Although the principal components (PC) method is consistent for any path of the panel dimensions, it is inefficient, as the errors are treated as homoskedastic and uncorrelated. The new CnPC exploits the assumption of bounded cross-sectional dependence, which defines Chamberlain and Rothschild's (1983) approximate factor structure, as an explicit constraint, and solves a constrained PC problem. The CnPC method is computationally equivalent to the PC method applied to a regularized form of the data covariance matrix. Unlike maximum likelihood type methods, the CnPC method does not require inverting a large covariance matrix and thus is valid for panels with N ≥ T. The paper derives a convergence rate and an asymptotic normality result for the CnPC estimators of the common factors. We provide feasible estimators and show in a simulation study that they are more accurate than the PC estimator, especially for panels with N larger than T, and than the generalized PC type estimators, especially for panels with N almost as large as T.
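The PC baseline and the regularized-covariance reading of CnPC can be sketched as follows; the diagonal shrinkage rule here is an assumption standing in for the paper's exact regularization.

```python
import numpy as np

def pc_factors(X, r):
    """Standard PC estimator: factors = sqrt(T) * top eigenvectors of XX'/T."""
    T = X.shape[0]
    _, vecs = np.linalg.eigh(X @ X.T / T)
    F = np.sqrt(T) * vecs[:, ::-1][:, :r]          # top-r eigenvectors
    return F, X.T @ F / T                          # factors, loadings

def cnpc_factors(X, r, shrink=0.5):
    """PC applied to a regularized form of the data covariance (sketch).

    Shrinks the T x T covariance toward its diagonal to damp the
    cross-sectional error correlation before extracting factors."""
    T = X.shape[0]
    S = X @ X.T / T
    S_reg = (1 - shrink) * S + shrink * np.diag(np.diag(S))
    _, vecs = np.linalg.eigh(S_reg)
    F = np.sqrt(T) * vecs[:, ::-1][:, :r]
    return F, X.T @ F / T

rng = np.random.default_rng(7)
T, N, r = 100, 200, 2                   # an N > T panel
F0 = rng.normal(size=(T, r))
L0 = rng.normal(size=(N, r))
X = F0 @ L0.T + rng.normal(size=(T, N))
F_pc, _ = pc_factors(X, r)
F_cn, _ = cnpc_factors(X, r)
print(F_pc.shape, F_cn.shape)
```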

Keywords: high dimensionality, unknown factors, principal components, cross-sectional correlation, shrinkage regression, regularization, pseudo-out-of-sample forecasting

Procedia PDF Downloads 121
183 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millennium

Authors: Janne Engblom, Elias Oikarinen

Abstract:

The understanding of housing price dynamics is of importance to a great number of agents: to portfolio investors, banks, real estate brokers, and construction companies, as well as to policy makers and households. A panel dataset is one that follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication of a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for the cross-sectional dependence caused by common structures of the economy; in the presence of cross-sectional dependence, standard OLS gives biased estimates. In this study, U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of housing prices as the dependent variable, and the first differences of per capita income, the interest rate, the housing stock, and the lagged price, together with the deviation of housing prices from their long-run equilibrium level, as independent variables. These deviations were also estimated from the data. The aim of the analysis was to provide estimates and compare them between 1980-1999 and 2000-2012. Based on data for 50 U.S. cities over 1980-2012, the differences in the short-run housing price dynamics estimates were mostly significant when the two time periods were compared. Significance tests of the differences were provided by a model containing interaction terms between the independent variables and a time dummy variable. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach, indicating a good fit of the CCE model. The estimates of the dynamic panel data model were in line with the theory of housing price dynamics. The results also suggest that the dynamics of a housing market evolve over time.
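The core mechanics of the CCE approach, augmenting each unit's regression with cross-sectional averages of the dependent and independent variables so that unobserved common factors are absorbed, can be sketched as follows (a static mean-group version with a single regressor, for brevity):

```python
import numpy as np

def cce_mean_group_slope(Y, X):
    """Common Correlated Effects: per-unit OLS augmented with the
    cross-sectional means of y and x; returns the mean-group slope.

    Y: T x N panel of the dependent variable.
    X: T x N panel of a single regressor (one regressor for simplicity).
    """
    T, N = Y.shape
    ybar, xbar = Y.mean(axis=1), X.mean(axis=1)    # cross-sectional averages
    slopes = []
    for i in range(N):
        Z = np.column_stack([np.ones(T), X[:, i], ybar, xbar])
        coef, *_ = np.linalg.lstsq(Z, Y[:, i], rcond=None)
        slopes.append(coef[1])                     # slope on own regressor
    return np.mean(slopes)                         # mean-group CCE estimate

rng = np.random.default_rng(8)
T, N = 120, 50
f = rng.normal(size=T)                             # unobserved common factor
X = rng.normal(size=(T, N)) + np.outer(f, rng.normal(size=N))
Y = 0.7 * X + np.outer(f, rng.normal(size=N)) + rng.normal(size=(T, N))
print("CCE mean-group slope (true 0.7):", cce_mean_group_slope(Y, X))
```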

Keywords: dynamic model, panel data, cross-sectional dependence, interaction model

Procedia PDF Downloads 230
182 Efficient Estimation for the Cox Proportional Hazards Cure Model

Authors: Khandoker Akib Mohammad

Abstract:

While analyzing time-to-event data, it is possible that a certain fraction of subjects will never experience the event of interest; these subjects are said to be cured. When this feature of survival models is taken into account, the models are commonly referred to as cure models. In the presence of covariates, the conditional survival function of the population can be modelled by the cure model, which depends on the probability of being uncured (incidence) and the conditional survival function of the uncured subjects (latency); a combination of logistic regression and Cox proportional hazards (PH) regression is used to model the incidence and the latency, respectively. In this paper, we show the asymptotic normality of the profile likelihood estimator via an asymptotic expansion of the profile likelihood, and obtain the explicit form of the variance estimator with an implicit function in the profile likelihood. We also show that the efficient score function based on projection theory and the profile likelihood score function are equal. Our contribution in this paper is to express the efficient information matrix as the variance of the profile likelihood score function. A simulation study suggests that the estimated standard errors from bootstrap samples (SMCURE package) and from the profile likelihood score function (our approach) are similar and comparable. The numerical results of our proposed method are also illustrated using the melanoma data from the SMCURE R package, and we compare the results with the output obtained from the SMCURE package.

Keywords: Cox PH model, cure model, efficient score function, EM algorithm, implicit function, profile likelihood

Procedia PDF Downloads 109
181 Low Complexity Carrier Frequency Offset Estimation for Cooperative Orthogonal Frequency Division Multiplexing Communication Systems without Cyclic Prefix

Authors: Tsui-Tsai Lin

Abstract:

Cooperative orthogonal frequency division multiplexing (OFDM) transmission, which possesses the advantages of better connectivity, expanded coverage, and resistance to frequency-selective fading, has become a powerful solution for the physical layer in wireless communications. However, such a hybrid scheme suffers from the carrier frequency offset (CFO) effects inherited from OFDM-based systems, which lead to a significant degradation in performance. In addition, inserting a cyclic prefix (CP) at the head of each symbol block to combat inter-symbol interference reduces spectral efficiency. The design of CFO estimation for cooperative OFDM systems without CP remains an open problem. This motivates us to develop a low-complexity CFO estimator for the cooperative OFDM decode-and-forward (DF) communication system without CP over multipath fading channels. Specifically, using a block-type pilot, the CFO estimate is first derived according to the least-squares criterion. Reliable performance can be obtained through an exhaustive two-dimensional (2D) search, at the penalty of heavy computational complexity. As a remedy, an alternative solution realized with an iterative approach is proposed for the CFO estimation. In contrast to the 2D-search estimator, the iterative method enjoys a substantially reduced implementation complexity without sacrificing estimation performance. Computer simulations are presented to demonstrate the efficacy of the proposed CFO estimation.

Keywords: cooperative transmission, orthogonal frequency division multiplexing (OFDM), carrier frequency offset, iteration

Procedia PDF Downloads 242
180 Quantitative Ranking Evaluation of Wine Quality

Authors: A. Brunel, A. Kernevez, F. Leclere, J. Trenteseaux

Abstract:

Today, wine quality is evaluated only by wine experts, each with their own personal taste, even if they may agree on some common features. Producers therefore do not have any unbiased way to independently assess the quality of their products. A tool is proposed here to evaluate wine quality by an objective ranking based on the variables entering wine elaboration, analysed through the principal component analysis (PCA) method. Actual climatic data are compared by measuring the relative distance between the considered wines, from which the overall ranking is derived.
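The ranking mechanics can be sketched in a few lines: project the standardized wine-by-variable matrix onto its principal components and rank each wine by its distance to a reference point in that space; the variables and the "ideal wine" reference are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
# Rows = wines, columns = elaboration/climate variables (illustrative)
W = rng.normal(size=(12, 5))

Z = (W - W.mean(0)) / W.std(0)                 # standardize variables
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T                          # first two principal components

ref = scores.max(axis=0)                       # an assumed "ideal wine" point
dist = np.linalg.norm(scores - ref, axis=1)
ranking = np.argsort(dist)                     # closest to ideal ranks first
print("wine ranking:", ranking)
```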

Keywords: wine, grape, weather conditions, rating, climate, principal component analysis, metric analysis

Procedia PDF Downloads 284
179 Karachi Electric Power Technical and Financial Performance Evaluation after Privatization

Authors: Fawad Azeem

Abstract:

This paper deals with a comparative analysis of Karachi Electric (KE) before and after privatization. Technical as well as financial analyses have been carried out based on KE's available statistics for the last decade. Karachi Electric has evolved into a better entity in terms of its financial and technical achievements. On the other hand, human resources have been seriously affected by the mass firing of employees from the organization. The study and analysis show that transparent and unbiased privatization practices applied to institutions like KE that were in serious trouble can raise the standards of the institution. Further, for broader social benefit, privatization must not harm employment opportunities.

Keywords: Karachi Electric, power, energy, privatization

Procedia PDF Downloads 327
178 Stereo Motion Tracking

Authors: Yudhajit Datta, Hamsi Iyer, Jonathan Bandi, Ankit Sethia

Abstract:

Motion tracking and stereo vision are complicated, albeit well-understood, problems in computer vision. Existing software packages that combine the two approaches to perform stereo motion tracking typically employ complicated and computationally expensive procedures. The purpose of this study is to create a simple and effective solution capable of combining the two approaches. The study explores a strategy that combines two techniques: two-dimensional motion tracking using a Kalman filter, and object depth estimation using stereo vision. In conventional approaches, objects in the scene of interest are observed using a single camera. For stereo motion tracking, however, the scene of interest is observed using video feeds from two calibrated cameras. From the two simultaneous measurements, the depth of the object from the plane containing the cameras is calculated. The approach attempts to capture the entire three-dimensional spatial information of each object in the scene and represent it through a software estimator object. In discrete intervals, the estimator tracks object motion in the plane parallel to the plane containing the cameras, and updates the perpendicular distance of the object from that plane as the depth. The ability to efficiently track the motion of objects in three-dimensional space using a simplified approach could prove to be an indispensable tool in a variety of surveillance scenarios, from high-security surveillance scenes such as the premises of bank vaults, prisons, or other detention facilities, to low-cost applications in supermarkets and car parking lots.
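The two building blocks combine naturally: triangulated depth from disparity feeds the depth state, while a constant-velocity Kalman filter tracks the in-plane motion. A compact sketch follows; the focal length, baseline, noise levels, and the assumption that pixel coordinates are measured relative to the principal point are illustrative.

```python
import numpy as np

f_px, baseline = 700.0, 0.12          # focal length [px], camera baseline [m]

def depth_from_disparity(xl, xr):
    """Triangulation: Z = f * B / disparity."""
    return f_px * baseline / (xl - xr)

# Constant-velocity Kalman filter on (x, y, z, vx, vy, vz)
dt = 1.0 / 30.0
F = np.eye(6); F[:3, 3:] = dt * np.eye(3)      # state transition
H = np.hstack([np.eye(3), np.zeros((3, 3))])   # position is measured
Q = 1e-3 * np.eye(6)                           # process noise (assumed)
R = 1e-2 * np.eye(3)                           # measurement noise (assumed)

def kalman_step(x, P, z):
    x, P = F @ x, F @ P @ F.T + Q                              # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)               # gain
    x = x + K @ (z - H @ x)                                    # update
    P = (np.eye(6) - K @ H) @ P
    return x, P

x, P = np.zeros(6), np.eye(6)
# One measurement: pixel coords in left/right images -> (x, y, depth)
xl, xr, yl = 420.0, 380.0, 260.0
Z = depth_from_disparity(xl, xr)
meas = np.array([xl * Z / f_px, yl * Z / f_px, Z])             # back-project
x, P = kalman_step(x, P, meas)
print("state estimate:", x)
```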

Keywords: kalman filter, stereo vision, motion tracking, matlab, object tracking, camera calibration, computer vision system toolbox

Procedia PDF Downloads 296
177 Enhancement of Primary User Detection in Cognitive Radio by Scattering Transform

Authors: A. Moawad, K. C. Yao, A. Mansour, R. Gautier

Abstract:

The detection of an occupied frequency band is a major issue in cognitive radio systems. The detection process becomes difficult if the signal occupying the band of interest has faded amplitude due to multipath effects, which make it hard for an occupying user to be detected. This work mitigates the missed-detection problem in the context of cognitive radio over a frequency-selective fading channel by proposing a blind channel estimation method based on the scattering transform. By initially applying conventional energy detection, the missed-detection probability is evaluated, and if it is greater than or equal to 50%, channel estimation is applied to the received signal, followed by channel equalization to reduce the channel effects. In the proposed channel estimator, we modify the Morlet wavelet by using its first derivative for better frequency resolution. A mathematical description of the modified function and its frequency resolution is formulated in this work. The improved frequency resolution is required to follow the spectral variation of the channel. The channel estimation error is evaluated in the mean-square sense for different channel settings, and energy detection is applied to the equalized received signal. The simulation results show an improvement in the missed-detection probability compared to detection based on principal component analysis. This improvement is achieved at the expense of increased estimator complexity, which depends on the number of wavelet filters as related to the channel taps. The detection performance also shows an improvement in detection probability for low signal-to-noise scenarios over principal component analysis-based energy detection.

Keywords: channel estimation, cognitive radio, scattering transform, spectrum sensing

Procedia PDF Downloads 175
176 Household's Willingness to Pay for Safe Non-Timber Forest Products at Morikouali-Ye Community Forest in Cameroon

Authors: Eke Balla Sophie Michelle

Abstract:

Forests provide a wide range of environmental goods and services, among them biodiversity and consumption goods, which constitute public goods. Despite the importance of non-timber forest products (NTFPs) in sustaining livelihoods and smoothing poverty in rural communities, they are highly depleted and poorly conserved. Yokadouma is a town where NTFPs are renewable resources under active exploitation. Such exploitation takes place under the same conditions as in other localities that have experienced rapid depletion of their NTFPs, destined for cities across Cameroon, Central Africa, and overseas. Given these realities, it is necessary to assess the consequences of this overexploitation through its negative effects on both the population and the environment. Therefore, to enhance participatory conservation initiatives, this study determines households' willingness to pay for sustainable exploitation of NTFPs in the Morikouali-ye community forest (CF), eastern region of Cameroon, using the contingent valuation method (CVM) through two approaches, one parametric (the logit model) and the other non-parametric (the Turnbull lower-bound estimator). The results indicate that five products are the most collected in the study area: Irvingia gabonensis, Ricinodendron heudelotii, Gnetum, jujube, and bark; their sale contributes significantly, accounting for 41% of total household income. The average willingness to pay from the logit model and the Turnbull estimator is 6845.2861 FCFA and 4940 FCFA, respectively, per household per year, with a social cost of degradation estimated at 3237820.3253 FCFA per year. The probability of paying increases with income, gender, the number of women in the household, age, and commercial NTFP activity, and decreases with awareness of the concept of sustainable development.
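The Turnbull lower-bound calculation behind the second WTP figure can be sketched as follows; the bid levels and responses are illustrative (not the study's data), and the simple cumulative-maximum monotonization stands in for the full pool-adjacent-violators step.

```python
import numpy as np

def turnbull_lower_bound(bids, n_no, n_total):
    """Turnbull lower-bound mean WTP from dichotomous-choice CV data.

    bids: sorted bid amounts; n_no / n_total: 'no' counts and sample
    sizes at each bid. Mass falling in [t_j, t_{j+1}) is valued at t_j,
    which is what makes the estimate a lower bound.
    """
    F = np.asarray(n_no, float) / np.asarray(n_total, float)  # P(WTP < bid)
    F = np.maximum.accumulate(F)       # crude monotonization (full PAVA
                                       # would average the pooled cells)
    t = np.concatenate(([0.0], np.asarray(bids, float)))
    mass = np.diff(np.concatenate(([0.0], F, [1.0])))
    return float(np.sum(t * mass))

bids = [1000, 2500, 5000, 10000]                 # FCFA (illustrative)
print(turnbull_lower_bound(bids, n_no=[12, 25, 40, 52],
                           n_total=[60, 60, 60, 60]))
```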

Keywords: non-timber forest product, contingent valuation method, willingness to pay, sustainable development

Procedia PDF Downloads 410
175 A Review of Quantitative Psychology in Our Life

Authors: Shubham Tandon, Rajni Goel

Abstract:

The prime objective of our review paper is to study the impact of quantitative psychology on our daily life. Quantitative techniques have been studied with the aim of discovering solutions in a rigorous way. To obtain unbiased and correct results, statistics and other useful mathematical tools have been reviewed. Many psychologists use quantitative techniques in their work, as these ensure accurate outcomes through the use of precise criteria in understanding the minds and conditions of individuals. Proper experimentation and observational tools are also considered, to avoid the possibility of invalid data.

Keywords: quantitative psychology, psychologists, statistics, person, results, minds

Procedia PDF Downloads 68
174 Fractional-Order Modeling of GaN High Electron Mobility Transistors for Switching Applications

Authors: Anwar H. Jarndal, Ahmed S. Elwakil

Abstract:

In this paper, a fractional-order model for the pad parasitic effects of a GaN HEMT on a Si substrate is developed and validated. An open de-embedding structure is used to characterize and de-embed the substrate loading parasitic effects. Unbiased device measurements are used to extract the parasitic inductances and resistances. The model shows very good agreement with S-parameter measurements under different bias conditions. It has been found that this approach can improve the simulation of the intrinsic part of the transistor, which is very important for the small- and large-signal modeling process.

Keywords: fractional-order modeling, GaN HEMT, Si substrate, open de-embedding structure

Procedia PDF Downloads 327
173 Trends of Seasonal and Annual Rainfall in the South-Central Climatic Zone of Bangladesh Using Mann-Kendall Trend Test

Authors: M. T. Islam, S. H. Shakif, R. Hasan, S. H. Kobi

Abstract:

Investigation of rainfall trends is crucial considering climate change, food security, and the economy of a particular region. This research aims to study seasonal and annual precipitation trends and their abrupt changes over time in the south-central climatic zone of Bangladesh, using monthly time series data spanning 50 years (1970-2019). A trend-free pre-whitening method has been employed to make the necessary adjustments for autocorrelation in the rainfall data. Trends in rainfall and their intensity have been assessed using the non-parametric Mann-Kendall test and the Theil-Sen estimator. Significant changes and fluctuation points in the data series have been detected using the sequential Mann-Kendall test at the 95% confidence limit. The findings show that most of the rainfall stations in the study area have a decreasing precipitation pattern throughout all seasons. The maximum decline in rainfall intensity was found for the Tangail station (-8.24 mm/year) during the monsoon. The Madaripur and Chandpur stations showed slight positive trends in post-monsoon rainfall. In terms of annual precipitation, a negative rainfall pattern was identified at each station, with a maximum decrease of 14.48 mm/year at Chandpur. However, all the trends are statistically non-significant at the 95% confidence level, and their monotonic association with time ranges from very weak to weak. From the sequential Mann-Kendall test, the change points for the annual and seasonal downward precipitation trends occur mostly after the 1990s for the Dhaka and Barishal stations. For Chandpur, the fluctuation points arrive after the mid-1970s in most cases.
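For reference, the Mann-Kendall test statistic and the Theil-Sen slope can be computed in a few lines; this is the standard textbook form with the normal approximation and without the tie correction.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (normal approximation, no tie correction)."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return z, p

def sen_slope(x):
    """Theil-Sen estimator: median of all pairwise slopes."""
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i)
              for i in range(n - 1) for j in range(i + 1, n)]
    return np.median(slopes)

rain = np.array([1850, 1790, 1910, 1760, 1700, 1820, 1650, 1600,
                 1700, 1580, 1640, 1550])        # illustrative annual totals
z, p = mann_kendall(rain)
print(f"Z = {z:.2f}, p = {p:.3f}, Sen slope = {sen_slope(rain):.1f} mm/year")
```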

Keywords: trend analysis, Mann-Kendall test, Theil-Sen estimator, sequential Mann-Kendall test, rainfall trend

Procedia PDF Downloads 53
172 Genomic Prediction Reliability Using Haplotypes Defined by Different Methods

Authors: Sohyoung Won, Heebal Kim, Dajeong Lim

Abstract:

Genomic prediction is an effective way to measure the breeding abilities of livestock based on genomic estimated breeding values, which are statistically predicted from genotype data using best linear unbiased prediction (BLUP). Using haplotypes, clusters of linked single nucleotide polymorphisms (SNPs), as markers instead of individual SNPs can improve the reliability of genomic prediction, since the probability that a quantitative trait locus is in strong linkage disequilibrium (LD) with the markers is higher. To use haplotypes efficiently in genomic prediction, optimal ways of defining haplotypes are needed. In this study, 770K SNP chip data were collected from a Hanwoo (Korean cattle) population consisting of 2506 cattle. Haplotypes were first defined in three different ways from the 770K SNP chip data: 1) by the length of the haplotype (bp), 2) by the number of SNPs, and 3) by k-medoids clustering based on LD. To compare the methods in parallel, haplotypes defined by all methods were set to comparable sizes; in each method, haplotypes defined to have an average of 5, 10, 20, or 50 SNPs were tested. A modified GBLUP method using haplotype alleles as predictor variables was implemented for testing the prediction reliability of each haplotype set. The conventional genomic BLUP (GBLUP) method, which uses individual SNPs, was also tested to evaluate the performance of the haplotype sets in genomic prediction. Carcass weight was used as the phenotype for testing. As a result, haplotypes defined by all three methods showed increased reliability compared to conventional GBLUP, with few differences in reliability between the haplotype-defining methods. The reliability of genomic prediction was highest when the average number of SNPs per haplotype was 20 in all three methods, implying that haplotypes including around 20 SNPs can be optimal markers for genomic prediction. When the number of alleles generated by each haplotype-defining method was compared, clustering by LD generated the fewest alleles. Using haplotype alleles for genomic prediction showed better performance, suggesting improved accuracy in genomic selection. The number of predictor variables decreased when the LD-based method was used, while all three haplotype-defining methods showed similar performance. This suggests that defining haplotypes based on LD can reduce computational costs and allow efficient prediction. Finding optimal ways to define haplotypes and using haplotype alleles as markers can provide improved performance and efficiency in genomic prediction.
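The conventional GBLUP baseline referred to above can be sketched as follows, with the genomic relationship matrix built VanRaden-style from the marker matrix; haplotype-allele dosage columns could be substituted for the SNP columns in exactly the same way, and the variance ratio is an assumed value.

```python
import numpy as np

def gblup(Z, y, lam=1.0):
    """Genomic BLUP of breeding values u for y = mu + u + e, u ~ N(0, s2u*G).

    Z: individuals x markers genotype matrix (0/1/2 dosages).
    lam = s2e / s2u, an assumed variance ratio.
    """
    p = Z.mean(axis=0) / 2.0                       # allele frequencies
    W = Z - 2 * p                                  # center by 2p (VanRaden)
    G = W @ W.T / (2 * np.sum(p * (1 - p)))        # genomic relationship matrix
    n = len(y)
    mu = y.mean()
    # BLUP: u_hat = G (G + lam I)^{-1} (y - mu)
    u_hat = G @ np.linalg.solve(G + lam * np.eye(n), y - mu)
    return mu, u_hat

rng = np.random.default_rng(10)
n, m = 200, 1000
Z = rng.integers(0, 3, size=(n, m)).astype(float)
beta = rng.normal(scale=0.05, size=m)
y = Z @ beta + rng.normal(size=n)
mu, u = gblup(Z, y)
print("corr(true genetic value, GBLUP):",
      np.corrcoef(Z @ beta, u)[0, 1].round(2))
```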

Keywords: best linear unbiased predictor, genomic prediction, haplotype, linkage disequilibrium

Procedia PDF Downloads 110