Search results for: Blind estimation parameters
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4589

4349 Change Detection and Non Stationary Signals Tracking by Adaptive Filtering

Authors: Mounira Rouainia, Noureddine Doghmane

Abstract:

In this paper we consider the problem of change detection and tracking of non-stationary signals. Using parametric estimation of signals based on least squares lattice adaptive filters, we consider statistical parametric methods for change detection using likelihood ratios and hypothesis tests. In order to track signal dynamics, we introduce a compensation procedure in the adaptive estimation. This improves the adaptive estimation performance and speeds up its convergence after a change has been detected.
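
As a minimal illustration of the hypothesis-test side of this approach, the sketch below runs a generalized likelihood ratio (GLR) test for a mean change on a synthetic Gaussian sequence; in the paper the test statistics are built from lattice-filter prediction errors, and the signal, variance and threshold used here are assumptions for illustration only.

```python
# Hedged sketch: GLR test for a single change in the mean of a Gaussian sequence with
# known variance. The synthetic signal and the detection threshold are assumptions.
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 1.0
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.5, 1.0, 100)])  # change at t = 200
N = x.size

# GLR statistic: maximize over candidate change times k the log-likelihood ratio of
# "mean jumps at k" against "no change" (closed form for Gaussian data, known variance).
glr = np.full(N, -np.inf)
for k in range(10, N - 10):
    delta = x[k:].mean() - x[:k].mean()
    glr[k] = (k * (N - k) / N) * delta**2 / (2.0 * sigma2)

k_hat = int(np.argmax(glr))
threshold = 10.0  # assumed threshold; in practice tuned for a target false-alarm rate
print("change detected:", bool(glr[k_hat] > threshold), "estimated change time:", k_hat)
```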

Keywords: Change detection, hypothesis test, likelihood ratio, least squares lattice adaptive filters.

4348 Mathematical Modeling of an Avalanche Release and Estimation of Flow Parameters by Numerical Method

Authors: Mahmoud Zarrini

Abstract:

Avalanche release of snow is modeled in the present study. Snow is assumed to behave as a semi-solid, and the governing equations are derived from a continuum approach. The dynamical equations are solved for two different zones (the starting zone and the track zone) using appropriate initial and boundary conditions. The effects of density (ρ), eddy viscosity (η), slope angle (θ) and slab depth (R) on the flow parameters are examined. Numerical methods are employed for computing the nonlinear differential equations. One of the most interesting and fundamental innovations of the present study is obtaining the initial condition for the velocity computation by a numerical approach; this velocity information is derived from fracture mechanics concepts applicable to snow. The results for the flow parameters are found to be in qualitative agreement with published results.

Keywords: Snow avalanche, fracture mechanics, avalanche velocity, avalanche zones.

4347 Higher Order Statistics for Identification of Minimum Phase Channels

Authors: Mohammed Zidane, Said Safi, Mohamed Sabri, Ahmed Boumezzough

Abstract:

This paper describes a blind algorithm, compared with two other algorithms proposed in the literature, for estimating the parameters of minimum phase channels. In order to identify the impulse response of these channels blindly, we use Higher Order Statistics (HOS) to build our algorithm. Simulation results in a noisy environment demonstrate that the proposed method can estimate the magnitude and phase of these channels blindly and with high accuracy, without any information about the input except that the input excitation is independent and identically distributed (i.i.d.) and non-Gaussian.

Keywords: System Identification, Higher Order Statistics, Communication Channels.

4346 Performance Analysis of MUSIC, Root-MUSIC and ESPRIT DOA Estimation Algorithm

Authors: N. P. Waweru, D. B. O. Konditi, P. K. Langat

Abstract:

Direction of Arrival (DOA) estimation refers to defining a mathematical function, called a pseudospectrum, that gives an indication of the angle at which a signal impinges on an antenna array. This estimation is an efficient method of improving the quality of service in a communication system: by focusing reception and transmission only in the estimated direction, fidelity is increased and interferers can be suppressed. This improvement depends largely on the performance of the algorithm employed in the estimation. Many DOA algorithms exist, among them MUSIC, Root-MUSIC and ESPRIT. In this paper, the performance of these three algorithms is analyzed in terms of complexity, accuracy (assessed and characterized by the CRLB) and memory requirements, in various environments and for various array sizes. It is found that all three algorithms offer high resolution and that their performance depends on the operating environment and the array size.
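
For concreteness, the following sketch computes a MUSIC pseudospectrum for a simulated uniform linear array; the array geometry, source angles, noise level and snapshot count are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: MUSIC pseudospectrum for a uniform linear array (ULA).
import numpy as np
from scipy.signal import find_peaks

M, d = 8, 0.5                              # sensors and spacing in wavelengths (assumed)
true_doas = np.deg2rad([-20.0, 35.0])      # assumed source directions
snapshots, num_sources = 200, len(true_doas)

def steering(theta):
    # Steering vector of the M-element ULA for angle theta (radians).
    return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

# Simulate snapshots: two uncorrelated sources plus white noise.
rng = np.random.default_rng(0)
A = np.column_stack([steering(t) for t in true_doas])
S = (rng.standard_normal((num_sources, snapshots)) + 1j * rng.standard_normal((num_sources, snapshots))) / np.sqrt(2)
N = 0.1 * (rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots))) / np.sqrt(2)
X = A @ S + N

# Autocorrelation matrix, eigenvalue decomposition, noise subspace.
R = X @ X.conj().T / snapshots
eigvals, eigvecs = np.linalg.eigh(R)       # eigenvalues in ascending order
En = eigvecs[:, : M - num_sources]         # noise-subspace eigenvectors

# MUSIC pseudospectrum: peaks indicate the estimated directions of arrival.
grid = np.deg2rad(np.linspace(-90, 90, 1801))
pseudo = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t)) ** 2 for t in grid])
peaks, _ = find_peaks(pseudo)
top = peaks[np.argsort(pseudo[peaks])[-num_sources:]]
print("estimated DOAs (deg):", np.sort(np.rad2deg(grid[top])))
```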

Keywords: Direction of Arrival, Autocorrelation matrix, Eigenvalue decomposition, MUSIC, ESPRIT, CRLB.

4345 Kalman Filter Gain Elimination in Linear Estimation

Authors: Nicholas D. Assimakis

Abstract:

In linear estimation, the traditional Kalman filter uses the Kalman filter gain to produce the estimate and prediction of the n-dimensional state vector from the m-dimensional measurement vector. The computation of the Kalman filter gain requires the inversion of an m x m matrix in every iteration. In this paper, a variation of the Kalman filter that eliminates the Kalman filter gain is proposed. In the time-varying case, the elimination of the Kalman filter gain requires the inversion of an n x n matrix and of an m x m matrix in every iteration. In the time-invariant case, it requires the inversion of an n x n matrix in every iteration. The proposed Kalman filter gain elimination algorithm may be faster than the conventional Kalman filter, depending on the model dimensions.
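
A minimal sketch of one step of the conventional Kalman filter is given below, with a comment marking the m x m inversion in the gain computation that the paper proposes to avoid; the model matrices are arbitrary illustrative assumptions, not the paper's.

```python
# Hedged sketch: one iteration of the conventional Kalman filter.
import numpy as np

n, m = 4, 2                                           # state and measurement dimensions (assumed)
rng = np.random.default_rng(0)
F = np.eye(n) + 0.01 * rng.standard_normal((n, n))    # state transition matrix (assumed)
H = rng.standard_normal((m, n))                       # measurement matrix (assumed)
Q, R = 0.01 * np.eye(n), 0.1 * np.eye(m)              # process/measurement noise covariances (assumed)

def kalman_step(x, P, z):
    # Prediction.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Kalman filter gain: here the m x m innovation covariance is inverted in every
    # iteration, which is the cost the paper's gain-elimination variant targets.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # Update.
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(n) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.zeros(n), np.eye(n)
z = rng.standard_normal(m)                            # one synthetic measurement
x, P = kalman_step(x, P, z)
print("updated state estimate:", x)
```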

Keywords: Discrete time, linear estimation, Kalman filter, Kalman filter gain.

4344 Estimation of Structural Parameters in Time Domain Using One Dimensional Piezo Zirconium Titanium Patch Model

Authors: N. Jinesh, K. Shankar

Abstract:

This article presents a method of using a one-dimensional piezoelectric patch-on-beam model for structural identification. A hybrid element consisting of a one-dimensional beam element and a PZT sensor, with reduced material properties, is used. This model is convenient and simple for the identification of beams. The accuracy of this element is first verified against a corresponding 3D finite element model (FEM). The structural identification is carried out as an inverse problem, whereby parameters are identified by minimizing the deviation between the predicted and measured voltage response of the patch when subjected to excitation. A non-classical optimization algorithm, Particle Swarm Optimization, is used to minimize this objective function. The signals are polluted with 5% Gaussian noise to simulate experimental noise. The proposed method is applied to a beam structure, and the identified parameters are stiffness and damping. The model is also validated experimentally.
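
The sketch below shows the kind of particle swarm loop used to minimize such an objective; the quadratic objective is only a stand-in for the paper's voltage-mismatch function, and the swarm settings are assumptions.

```python
# Hedged sketch: minimal particle swarm optimization (PSO) loop for parameter identification.
import numpy as np

def objective(theta):
    # Placeholder for the voltage-mismatch objective; "true" parameters assumed at (2.0, 0.05).
    return np.sum((theta - np.array([2.0, 0.05])) ** 2)

rng = np.random.default_rng(0)
n_particles, dim, iters = 30, 2, 200
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration coefficients (assumed)

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("identified parameters:", gbest)
```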

Keywords: Structural identification, PZT patches, inverse problem, particle swarm optimization.

4343 Studies on Seasonal Variations of Physico-Chemical Parameters of Fish Farm at Govt. Nursery Unit, Muzaffargarh, Punjab, Pakistan

Authors: Muhammad Naeem, Abdus Salam, Muhammad Ashraf, Muhammad Imran, Mehtab Ahmad, Muhammad Jamshed Khan, Muhammad Mazhar Ayaz, Muzaffar Ali, Arshad Ali, Memoona Qayyum, Abir Ishtiaq

Abstract:

The present study was designed to demonstrate the seasonal variations in the physico-chemical parameters of a fish farm at the Govt. Nursery Unit, Muzaffargarh, Department of Fisheries, Govt. of Punjab, Pakistan, over a period of eight months from January to August 2008. Water samples were collected every fifteen days and analyzed for air temperature, water temperature, light penetration, pH, total dissolved oxygen, cloud cover, carbonates, bicarbonates, total carbonates, total dissolved solids, chlorides, calcium and hardness. Seasonal fluctuations were observed in all the physico-chemical parameters of the fish farm. Overall, the physico-chemical parameters of the fish pond water remained within the tolerable range throughout the study period.

Keywords: Freshwater, Fish farm, Water quality, Seasonal variation, Chemical factor

4342 Application of GIS and Statistical Multivariate Techniques for Estimation of Soil Erosion and Sediment Yield

Authors: Masoud Nasri, Ali Gholami, Ali Najafi

Abstract:

In recent years, most regions of the world have been exposed to degradation and erosion caused by increasing population and overuse of land resources. Understanding the most important factors in soil erosion and sediment yield is key to decision making and planning. In this study, sediment yield and soil erosion were estimated, and the priority of the different soil erosion factors used in the MPSIAC method of soil erosion estimation was evaluated, for the AliAbad watershed in the southwest of Isfahan Province, Iran. Information layers for the parameters were created using GIS techniques. A multivariate procedure was then applied to estimate sediment yield and to find the most important soil erosion factors in the model. The results showed that land use, geology, and land and soil cover are the most important factors describing the soil erosion estimated by the MPSIAC model.

Keywords: land degradation, Soil erosion, Sediment yield, Aliabad, GIS technique, Land use.

4341 Hazard Rate Estimation of Temporal Point Process, Case Study: Earthquake Hazard Rate in Nusatenggara Region

Authors: Sunusi N., Kresna A. J., Islamiyati A., Raupong

Abstract:

Hazard rate estimation is one of the important topics in forecasting earthquake occurrence. Forecasting earthquake occurrence is part of statistical seismology, where the main subject is the point process. Generally, the earthquake hazard rate is estimated from the point process likelihood equation, called the Hazard Rate Likelihood of Point Process (HRLPP). In this research, we develop an estimation method, hazard rate single decrement (HRSD), adapted from estimation methods used in actuarial studies. Here, each individual is associated with an earthquake, and the inter-event time is exponentially distributed. The epicenter and occurrence time of each earthquake are used to estimate the hazard rate. Finally, a case study of earthquake hazard rate is given, and the hazard rates obtained by the HRLPP and HRSD methods are compared.
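
Under the exponential inter-event time assumption mentioned above, the hazard rate is constant and its maximum likelihood estimate is simply the reciprocal of the mean inter-event time; the sketch below illustrates this on a synthetic catalogue (the occurrence times are assumed, not the Nusa Tenggara data).

```python
# Hedged sketch: constant hazard rate under exponentially distributed inter-event times.
import numpy as np

rng = np.random.default_rng(0)
event_times = np.sort(rng.uniform(0.0, 50.0, 120))   # assumed occurrence times in years
inter_event = np.diff(event_times)
hazard_rate = 1.0 / inter_event.mean()               # MLE of the constant rate, events per year
print("estimated constant hazard rate (events/year):", round(hazard_rate, 3))
```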

Keywords: Earthquake forecast, Hazard Rate, Likelihood point process, Point process.

4340 A Robust Frequency Offset Estimation Scheme for OFDM System with Cyclic Delay Diversity

Authors: Won-Jae Shin, Young-Hwan You

Abstract:

Cyclic delay diversity (CDD) is a simple technique to intentionally increase the frequency selectivity of channels for orthogonal frequency division multiplexing (OFDM). This paper proposes a residual carrier frequency offset (RFO) estimation scheme for an OFDM-based broadcasting system using CDD. In order to improve the RFO estimation, the paper addresses how to choose the amount of cyclic delay and the pilot pattern used to estimate the RFO. Computer simulations show that the proposed estimator benefits from a properly chosen delay parameter and performs robustly.

Keywords: OFDM, cyclic delay diversity, FM system, synchronization

4339 Application of an Analytical Model to Obtain Daily Flow Duration Curves for Different Hydrological Regimes in Switzerland

Authors: Ana Clara Santos, Maria Manuela Portela, Bettina Schaefli

Abstract:

This work assesses the performance of an analytical model framework for generating daily flow duration curves (FDCs) based on the climatic characteristics of catchments and on their streamflow recession coefficients. In this framework, precipitation is considered a stochastic process, modeled as a marked Poisson process, and recession is considered deterministic, with parameters that can be computed from different models. The framework was tested for three case studies with different hydrological regimes located in Switzerland: pluvial, snow-dominated and glacier. For that purpose, five time intervals were analyzed (the four meteorological seasons and the civil year) and two developments of the model were tested: one considering a linear recession model and the other adopting a nonlinear recession model. These developments were combined with recession coefficients obtained from two different approaches: forward and inverse estimation. The performance of the analytical framework with forward parameter estimation is poor in comparison with inverse estimation for both the linear and nonlinear models. For the pluvial catchment, the inverse estimation shows exceptionally good results, especially for the nonlinear model, clearly suggesting that the model has the ability to describe FDCs. For the snow-dominated and glacier catchments, the seasonal results are better than the annual ones, suggesting that the model can describe streamflows in those conditions and that future efforts should focus on improving and combining seasonal curves instead of considering single annual ones.
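
For reference, the empirical quantity that such a framework aims to reproduce can be computed directly from a discharge record; the sketch below builds an empirical daily FDC from a synthetic series (the discharge data and the plotting-position choice are assumptions).

```python
# Hedged sketch: empirical daily flow duration curve from a discharge series.
import numpy as np

rng = np.random.default_rng(0)
q = rng.lognormal(mean=1.0, sigma=0.8, size=365)        # assumed daily discharges (m^3/s)
q_sorted = np.sort(q)[::-1]                             # discharges in descending order
exceedance = np.arange(1, q.size + 1) / (q.size + 1)    # Weibull plotting positions
for p in (0.05, 0.50, 0.95):
    print(f"discharge exceeded {int(p * 100)}% of the time: "
          f"{np.interp(p, exceedance, q_sorted):.2f} m^3/s")
```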

Keywords: Analytical streamflow distribution, stochastic process, linear and non-linear recession, hydrological modelling, daily discharges.

4338 Estimating Cost of R&D Activities for Feasibility Study of Public R&D Investment

Authors: Ie-jung Choi

Abstract:

Since feasibility studies of R&D programs were initiated in 2008 to support efficient public R&D investment, these studies have improved in precision. Although experience with such studies has accumulated to a certain point, methodological improvement is still required. The feasibility studies of R&D programs involve various viewpoints, such as technology, policy, and economics. This research provides improvements to the economic perspective, especially the cost estimation process for R&D activities. First, the fundamental concept of cost estimation is reviewed. After the review, statistical and econometric analysis methods are applied as the empirical analysis. Finally, limitations and further research directions are provided.

Keywords: Cost Estimation, R&D Program, Feasibility Analysis Study.

4337 Low-complexity Integer Frequency Offset Synchronization for OFDMA System

Authors: Young-Jae Kim, Young-Hwan You

Abstract:

This paper presents an integer frequency offset (IFO) estimation scheme for the 3GPP long term evolution (LTE) downlink system. Firstly, the conventional joint detection method for the IFO and sector cell index (CID) information is introduced. Secondly, an IFO estimation scheme that does not require explicit sector CID information is proposed, which reduces the time delay in comparison with the conventional joint method. The proposed method is also computationally efficient and achieves almost the same performance as the conventional method over the Pedestrian and Vehicular channel models.

Keywords: LTE, OFDMA, primary synchronization signal (PSS), IFO, CID

4336 Human Pose Estimation using Active Shape Models

Authors: Changhyuk Jang, Keechul Jung

Abstract:

Human pose estimation can be performed using Active Shape Models. Existing techniques that apply Active Shape Models to human-body research, such as human detection, primarily use the silhouette of the human body. Such techniques cannot accurately estimate poses involving the two arms and legs, since the silhouette represents the body shape only as a rough outline. To solve this problem, we model the human body as a stick figure, or "skeleton", which can accommodate various poses. To obtain effective estimation results, we apply background subtraction and a modified matching algorithm of the original Active Shape Models in the fitting process. The model was built from 600 images of human bodies and has 17 landmark points indicating body joints and key features of the pose. The maximum number of iterations for the fitting process was 30, and the execution time was less than 0.03 s.

Keywords: Active shape models, skeleton, pose estimation.

4335 Estimation of the Mean of the Selected Population

Authors: Kalu Ram Meena, Aditi Kar Gangopadhyay, Satrajit Mandal

Abstract:

Two normal populations with different means and the same known variance are considered. The population with the smaller sample mean is selected. Various estimators are constructed for the mean of the selected normal population. Finally, they are compared with respect to bias and MSE risks by the method of Monte Carlo simulation, and their performance is analysed with the help of graphs.
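
As a back-of-the-envelope illustration of why estimation after selection is non-trivial, the sketch below estimates by Monte Carlo the bias and MSE of the naive estimator (the selected sample mean); the means, variance and sample size are assumptions, and the Brewster-Zidek type estimators studied in the paper are not implemented here.

```python
# Hedged sketch: Monte Carlo bias/MSE of the naive estimator of the selected mean.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.0, 0.5])                  # true means (assumed)
sigma, n, reps = 1.0, 20, 100_000          # known std dev, sample size, replications (assumed)

# Sample means of both populations for each replication, then select the smaller one.
xbar = rng.normal(mu, sigma / np.sqrt(n), size=(reps, 2))
sel = np.argmin(xbar, axis=1)
est = xbar[np.arange(reps), sel]           # naive estimator: the selected sample mean
truth = mu[sel]                            # mean of the population actually selected

print("bias of naive estimator:", round(float(np.mean(est - truth)), 4))
print("MSE of naive estimator: ", round(float(np.mean((est - truth) ** 2)), 4))
```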

Keywords: Estimation after selection, Brewster-Zidek technique.

4334 Application of Generalized Autoregressive Score Model to Stock Returns

Authors: Katleho Daniel Makatjane, Diteboho Lawrence Xaba, Ntebogang Dinah Moroke

Abstract:

The current study investigates the behaviour of time-varying parameters that are based on the score function of the predictive model density at time t. The mechanism for updating the parameters over time is the scaled score of the likelihood function. The results reveal high persistence of the time-varying parameters, as the location parameter is high and the skewness parameter implies a departure of the scale parameter from normality, with the unconditional parameter at 1.5. The results also reveal persistent leptokurtic behaviour in the stock returns, which implies that the returns are heavy tailed. Prior to model estimation, the White Neural Network test showed that the stock price can be modelled by a GAS model. Finally, we propose further research, specifically modelling the time-varying parameters with a more detailed model that accounts for the heavy-tailed distribution of the series and computes the risk measure associated with the returns.
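
To make the score-driven update concrete, the sketch below runs a Gaussian GAS(1,1) recursion for a time-varying variance, where with inverse-information scaling the scaled score reduces to y_t^2 - f_t; the returns series and coefficients are illustrative assumptions, not the paper's data or estimates.

```python
# Hedged sketch: Gaussian GAS(1,1) recursion for a time-varying conditional variance.
import numpy as np

rng = np.random.default_rng(0)
y = rng.standard_t(df=5, size=500) * 0.01      # synthetic heavy-tailed "returns" (assumed)
omega, alpha, beta = 1e-6, 0.05, 0.94          # assumed GAS coefficients

f = np.empty(y.size)
f[0] = np.var(y)
for t in range(y.size - 1):
    # Scaled score of the Gaussian density with variance f[t]: s_t = y_t^2 - f_t.
    s_t = y[t] ** 2 - f[t]
    f[t + 1] = omega + alpha * s_t + beta * f[t]

print("last fitted conditional variance:", f[-1])
```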

Keywords: Generalized autoregressive score model, stock returns, time-varying.

4333 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand

Authors: Chukiat Chaiboonsri, Satawat Wannapan

Abstract:

This paper applies an asymmetric information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables analyzed statistically are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by the private sector, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand has perfect information (symmetrical data). The second was the Maximum Entropy Bootstrapping (MEboot) approach, based on a process that attempts to deal with imperfect information and reduce uncertainty in the data observations (asymmetrical data). In addition, tourism leakages were investigated using a simple model based on the injections and leakages concept. The empirical findings show that the parameters computed by the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period capable of stimulating the economy.

Keywords: Thailand tourism, maximum entropy bootstrapping approach, macroeconomic model, asymmetric information.

4332 Development of Cooling Load Demand Program for Building in Malaysia

Authors: Zamri Noranai, Dayang Siti Zainab Abang Bujang, Rosli Asmawi, Hamidon Salleh, Mohammad Zainal Md Yusof

Abstract:

Air conditioning is used mainly as a human comfort medium and is used most often in countries where daily temperatures are high. Scientifically, air conditioning is defined as a process of controlling the moisture, cooling, heating and cleaning of air. Without proper estimation of the cooling load, a large amount of energy is wasted because the air conditioning system is not matched to the heat gains from the surroundings: either the room is too big, so the air conditioner uses more energy to cool it, or the air conditioner is too small for the room. This study develops a program to calculate the cooling load, which makes cooling load estimation easy and allows hourly and yearly estimates to be compared. The software developed in previous studies was not user-friendly, which is a problem for individuals without proper knowledge of cooling load calculation; easy access and user-friendliness should be the main design objectives. This program allows the cooling load to be estimated by any user, rather than estimated by rule of thumb. Several limitations of the case study were checked to ensure that it meets Malaysian building specifications. Finally, validation was done by comparing manual calculations with the developed program.
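
As a toy example of the kind of component such a program sums, the sketch below computes the sensible conduction gain through a single wall from Q = U * A * dT; the coefficient, area and temperature difference are illustrative assumptions, not Malaysian building-code values.

```python
# Hedged sketch: one cooling-load component, conduction gain through a wall (Q = U * A * dT).
U = 2.5    # overall heat transfer coefficient, W/(m^2 K) (assumed)
A = 12.0   # wall area, m^2 (assumed)
dT = 8.0   # outdoor-indoor temperature difference, K (assumed)

Q_wall = U * A * dT
print(f"conduction gain through the wall: {Q_wall:.0f} W")
```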

Keywords: Building, Energy, Cooling Load

4331 Verified Experiment: Intelligent Fuzzy Weighted Input Estimation Method to Inverse Heat Conduction Problem

Authors: Chen-Yu Wang, Tsung-Chien Chen, Ming-Hui Lee, Jen-Feng Huang

Abstract:

In this paper, the intelligent fuzzy weighted input estimation method (FWIEM) is applied to the inverse heat conduction problem (IHCP) to estimate the unknown time-varying heat flux efficiently. The feasibility of the method is verified by a temperature measurement experiment. We focus on heat flux estimation for three kinds of samples (copper, iron and steel/AISI 304) with the same 3 mm thickness. The temperature measurements are then taken as the inputs to the FWIEM to estimate the heat flux. The experimental results show that the proposed algorithm can estimate the unknown time-varying heat flux on-line.

Keywords: Fuzzy Weighted Input Estimation Method, IHCP, Heat Flux.

4330 An Efficient Separation for Convolutive Mixtures

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Dylan Menzies, Ismail Shahin

Abstract:

This paper describes a new, efficient blind source separation method that uses a non-uniform filter bank and a new structure with different sub-bands. The method provides reduced permutation and increased convergence speed compared to the full-band algorithm. Recently, several structures have been suggested to deal with two problems: reducing permutation and increasing the convergence speed of the adaptive algorithm for correlated input signals. The permutation problem is avoided by using adaptive filters of orders lower than the full-band adaptive filter, operating at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated in each sub-band than the input signal at full band, which can promote better convergence rates.

Keywords: Blind source separation (BSS), estimates, full-band, mixtures, Sub-band.

4329 An Interlacing Technique-Based Blind Video Watermarking Using Wavelet

Authors: B. Sridhar, C. Arun

Abstract:

The rapid growth of multimedia technology demands secure and efficient access to information, but it also undermines confidence because of unauthorized duplication; hence the protection of multimedia content is becoming more important. Watermarking addresses the issue of unlawful copying of digital data. In this paper, a blind video watermarking technique is proposed. The luminance layer of selected frames is interlaced into two shares of even and odd rows of an image, which are then deinterlaced, and the coefficients of the two shares are equalized. The color watermark is split into blocks, and the pieces of each block are concealed in one of the shares under the wavelet transform. The two shares are stacked back into a single image by re-introducing the interlaced even and odd rows, and finally the chrominance bands are concatenated with the watermarked luminance band. The safeguard level of the secret information is high, and the watermark is undetectable. Results show that the quality of the video is not changed, and better PSNR values are obtained.

Keywords: Authentication, data security, deinterlaced, wavelet transform, watermarking.

4328 Application of GM (1, 1) Model Group Based on Recursive Solution in China's Energy Demand Forecasting

Authors: Yeqing Guan, Fen Yang

Abstract:

To learn about China's future energy demand, this paper first proposes a GM(1,1) model group based on recursive solution of the parameter estimation and sets up a general solving algorithm for the model group. This method avoids the problems of past research: re-modeling, loss of information and a large amount of calculation. The paper establishes all-data GM(1,1), metabolic GM(1,1) and new-information GM(1,1) models using the historical data of energy consumption in China for the years 2005-2010 and the added data of 2011; after modeling, simulation and comparison of accuracies, the optimal models were obtained and used for prediction. The results show that the total energy demand of China will be 37.2221 billion tons of coal equivalent in 2012 and 39.7973 billion tons of coal equivalent in 2013, consistent with the overall planning of energy demand in the 12th Five-Year Plan.
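
As a reference point for the model group, the sketch below fits a single GM(1,1) model by least squares and produces fitted and forecast values; the series is synthetic and the implementation is a generic textbook GM(1,1), not the paper's recursive solution.

```python
# Hedged sketch: basic GM(1,1) grey model fitted by least squares.
import numpy as np

x0 = np.array([27.0, 28.6, 30.7, 32.5, 34.8, 36.0])   # assumed original series
x1 = np.cumsum(x0)                                     # accumulated generating operation (AGO)
z1 = 0.5 * (x1[1:] + x1[:-1])                          # background values

# Least squares estimation of the development coefficient a and grey input b.
B = np.column_stack([-z1, np.ones_like(z1)])
Y = x0[1:]
a, b = np.linalg.lstsq(B, Y, rcond=None)[0]

def gm11_values(steps_ahead=1):
    # Time response of the whitened equation, restored by first differencing (inverse AGO).
    k = np.arange(len(x0) + steps_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate(([x0[0]], np.diff(x1_hat)))

print("a =", round(a, 4), " b =", round(b, 4))
print("fitted values + forecasts:", np.round(gm11_values(steps_ahead=2), 2))
```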

Keywords: energy demands, GM(1, 1) model group, least square estimation, prediction

4327 Parameter Estimation of Diode Circuit Using Extended Kalman Filter

Authors: Amit Kumar Gautam, Sudipta Majumdar

Abstract:

This paper presents parameter estimation of a single-phase rectifier using extended Kalman filter (EKF). The state space model has been obtained using Kirchhoff’s current law (KCL) and Kirchhoff’s voltage law (KVL). The capacitor voltage and diode current of the circuit have been estimated using EKF. Simulation results validate the better accuracy of the proposed method as compared to the least mean square method (LMS). Further, EKF has the advantage that it can be used for nonlinear systems.

Keywords: Extended Kalman filter, parameter estimation, single phase rectifier, state space modelling.

4326 Segmentation of Piecewise Polynomial Regression Model by Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

The piecewise polynomial regression model is a very flexible model for modeling data. When a piecewise polynomial regression model is fitted to data, its parameters are generally unknown. This paper studies the parameter estimation problem for the piecewise polynomial regression model. The method used to estimate the parameters is the Bayesian method. Unfortunately, the Bayes estimator cannot be found analytically, so a reversible jump MCMC algorithm is proposed to solve this problem. The reversible jump MCMC algorithm generates a Markov chain that converges to the posterior distribution of the piecewise polynomial regression model parameters. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise polynomial regression model.

Keywords: Piecewise, Bayesian, reversible jump MCMC, segmentation.

4325 Human Growth Curve Estimation through a Combination of Longitudinal and Cross-sectional Data

Authors: Sedigheh Mirzaei S., Debasis Sengupta

Abstract:

Parametric models have been quite popular for studying human growth, particularly in relation to biological parameters such as peak size velocity and age at peak size velocity. Longitudinal data are generally considered to be vital for fitting a parametric model to individual-specific data, and for studying the distribution of these biological parameters in a human population. However, cross-sectional data are easier to obtain than longitudinal data. In this paper, we present a method of combining longitudinal and cross-sectional data for the purpose of estimating the distribution of the biological parameters. We demonstrate, through simulations in the special case of the Preece-Baines model, how estimates based on longitudinal data can be improved upon by harnessing the information contained in cross-sectional data. We study the extent of improvement for different mixes of the two types of data, and finally illustrate the use of the method through data collected by the Indian Statistical Institute.
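
The simulations referred to above use the Preece-Baines family; as a minimal sketch, the code below defines what is commonly written as Preece-Baines Model 1 and fits it to synthetic height data by least squares (the parameter values, ages and noise level are assumptions).

```python
# Hedged sketch: Preece-Baines Model 1 fitted to synthetic height-for-age data.
import numpy as np
from scipy.optimize import curve_fit

def preece_baines(t, h1, h_theta, s0, s1, theta):
    # h1: adult size, h_theta: size at age theta, s0/s1: rate constants, theta: age near peak velocity.
    return h1 - 2.0 * (h1 - h_theta) / (np.exp(s0 * (t - theta)) + np.exp(s1 * (t - theta)))

rng = np.random.default_rng(0)
ages = np.linspace(2, 18, 33)
true_params = (175.0, 162.0, 0.1, 1.2, 13.5)          # assumed parameters
heights = preece_baines(ages, *true_params) + rng.normal(0, 0.5, ages.size)

popt, _ = curve_fit(preece_baines, ages, heights,
                    p0=(170, 160, 0.1, 1.0, 13.0), maxfev=10000)
print("estimated (h1, h_theta, s0, s1, theta):", np.round(popt, 3))
```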

Keywords: Preece-Baines growth model, MCMC method, Mixed effect model

4324 An Integrated Software Architecture for Bandwidth Adaptive Video Streaming

Authors: T. Arsan

Abstract:

Video streaming over lossy IP networks is a very important issue, due to the heterogeneous structure of networks. The infrastructure of the Internet exhibits variable bandwidths, delays, congestion and time-varying packet losses. Because of these variable attributes of the Internet, video streaming applications should not only have good end-to-end transport performance but also a robust rate control and a multipath rate allocation mechanism. To provide video streaming service quality, other components such as bandwidth estimation and an adaptive rate controller should also be taken into consideration. This paper gives an overview of the video streaming concept and of bandwidth estimation tools, and then introduces a special architecture for bandwidth adaptive video streaming. A bandwidth estimation algorithm (pathChirp), optimized rate controllers and a multipath rate allocation algorithm are considered as an all-in-one solution to the video streaming problem. This solution is directed and optimized by a decision center designed to obtain the maximum quality at the receiving side.

Keywords: Adaptive Video Streaming, Bandwidth Estimation, QoS, Software Architecture.

4323 Monte Carlo Estimation of Heteroscedasticity and Periodicity Effects in a Panel Data Regression Model

Authors: Nureni O. Adeboye, Dawud A. Agunbiade

Abstract:

This research investigates the effects of heteroscedasticity and periodicity in a Panel Data Regression Model (PDRM) by extending previous work on balanced panel data estimation within the context of fitting a PDRM for bank audit fees. The estimation of the model was achieved through the derivation of a joint Lagrange Multiplier (LM) test for homoscedasticity and zero serial correlation, a conditional LM test for zero serial correlation given heteroscedasticity of varying degrees, and a conditional LM test for homoscedasticity given first-order positive serial correlation, via a two-way error component model. Monte Carlo simulations were carried out for 81 different variations, whose design assumed a uniform distribution under a linear heteroscedasticity function. Each variation was iterated 1000 times, and the assessment of the three estimators considered is based on the variance, absolute bias (ABIAS), mean square error (MSE) and root mean square error (RMSE) of the parameter estimates. Eighteen different models were fitted under the specified conditions, and the best-fitting model is the within estimator when heteroscedasticity is severe at either zero or positive serial correlation. The LM test results showed that the tests have good size and power, as all three tests are significant at 5% for the specified linear form of the heteroscedasticity function, establishing that bank operations are severely heteroscedastic in nature with little or no periodicity effects.

Keywords: Audit fee, heteroscedasticity, Lagrange multiplier test, periodicity.

4322 Implementation of SU-MIMO and MU-MIMO GTD System under Imperfect CSI Knowledge

Authors: Parit Kanjanavirojkul, Kiatwarakorn Keeratishananond, Prapun Suksompong

Abstract:

We study the performance of a compressed beamforming weight feedback technique in a generalized triangular decomposition (GTD) based MIMO system. GTD is a beamforming technique that enjoys QoS flexibility. The technique, however, performs at its optimum only when full knowledge of the channel state information (CSI) is available at the transmitter. This is impossible in a real system, where there are channel estimation errors and limited feedback. We suggest a way to implement quantized beamforming weight feedback, which can significantly reduce the feedback data, in a GTD-based MIMO system, and we investigate the performance of the system. Interestingly, we find that compressed beamforming weight feedback does not degrade the BER performance of the system at low input power, while channel estimation error and quantization do. For comparison, GTD is more sensitive to compression and quantization, while SVD is more sensitive to channel estimation error. We also explore the performance of a GTD-based MU-MIMO system, and find that the BER performance starts to degrade substantially at around -20 dB channel estimation error.

Keywords: MIMO, MU-MIMO, GTD, Imperfect CSI.

4321 On the Modeling and State Estimation for Dynamic Power System

Authors: A. Thabet, M. Boutayeb, M. N. Abdelkrim

Abstract:

This paper investigates a method for the state estimation of nonlinear systems described by a class of differential-algebraic equation (DAE) models using the extended Kalman filter. The method involves the use of a transformation from a DAE to an ordinary differential equation (ODE). A relevant dynamic power system model using decoupled techniques is proposed. The estimation technique consists of a state estimator based on the EKF together with a local stability analysis. High performance is illustrated through a simulation study applied to the IEEE 13-bus test system.

Keywords: Power system, Dynamic decoupled model, Extended Kalman Filter, Convergence analysis, Time computing.

4320 Estimation of Skew Angle in Binary Document Images Using Hough Transform

Authors: Nandini N., Srikanta Murthy K., G. Hemantha Kumar

Abstract:

This paper presents two novel techniques for skew estimation of binary document images. These algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate-and-thin approach, the selected characters are blocked and dilated to obtain word blocks, and thinning is then applied; the final image fed to the Hough transform contains the thinned coordinates of the word blocks. The methods succeed in reducing the computational complexity of Hough transform based skew estimation algorithms. Promising experimental results are provided to prove the effectiveness of the proposed methods.
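
In the spirit of the word centroid approach, the sketch below estimates skew from synthetic word-centroid coordinates with a coarse Hough-style accumulator over candidate angles; the page geometry, angle range and bin width are assumptions.

```python
# Hedged sketch: skew estimation from word centroids via a Hough-style angle search.
import numpy as np

rng = np.random.default_rng(0)
true_skew = np.deg2rad(3.0)                            # assumed skew of the synthetic page
xs = rng.uniform(0, 800, 300)                          # centroid x-coordinates (pixels)
rows = rng.integers(0, 10, 300) * 40.0                 # ten text lines, 40 px apart (assumed)
ys = rows + xs * np.tan(true_skew) + rng.normal(0, 1.0, 300)

angles = np.deg2rad(np.linspace(-15, 15, 301))         # candidate skew angles
rho_res = 2.0                                          # accumulator bin width in pixels
best_alpha, best_score = 0.0, -np.inf
for alpha in angles:
    # Signed distance of each centroid across text lines; centroids on the same line
    # collapse into the same bin when alpha matches the true skew.
    r = ys * np.cos(alpha) - xs * np.sin(alpha)
    hist, _ = np.histogram(r, bins=np.arange(r.min(), r.max() + rho_res, rho_res))
    score = float(np.sum(hist.astype(float) ** 2))     # concentration of votes
    if score > best_score:
        best_alpha, best_score = alpha, score

print("estimated skew angle (deg):", round(np.rad2deg(best_alpha), 2))
```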

Keywords: Dilation, Document processing, Hough transform, Optical Character Recognition, Skew estimation, and Thinning.
