Search results for: Pitman estimator.
Paper Count: 131

41 Does Corporate Governance or Transparency Affect Foreign Direct Investment?

Authors: Haksoon Kim

Abstract:

The paper investigates the relationship between foreign direct investment (FDI) and corporate governance or transparency by examining country-level FDI flows, FDI inward performance, and corporate governance and transparency variables. From a regression analysis with the Newey-West estimator on panel data for 28 countries over 1990-2002, we find strong positive relationships between the corporate governance or transparency level of host countries and FDI inward performance within those countries. A strong positive relationship is also found between the anti-director rights level or the number of analysts in host countries and FDI inward performance. In addition, we find a positive relationship between the number of analysts in host countries and FDI inflows. The empirical results are consistent with stock market liberalization and corporate governance explanations of FDI.
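
The core estimation step named in the abstract, a regression with Newey-West (HAC) standard errors, can be sketched with statsmodels. The panel below is synthetic and the variable names (governance, analysts, fdi_perf) are illustrative assumptions, not the authors' data or exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical country-year panel: FDI inward performance regressed on a
# governance index and analyst coverage, pooled OLS with Newey-West errors.
rng = np.random.default_rng(0)
n_countries, n_years = 28, 13              # 28 countries, 1990-2002 as in the study
n = n_countries * n_years
df = pd.DataFrame({
    "governance": rng.normal(size=n),
    "analysts":   rng.poisson(5, size=n),
})
df["fdi_perf"] = 0.4 * df["governance"] + 0.1 * df["analysts"] + rng.normal(size=n)

X = sm.add_constant(df[["governance", "analysts"]])
# cov_type="HAC" with maxlags gives Newey-West standard errors
result = sm.OLS(df["fdi_perf"], X).fit(cov_type="HAC", cov_kwds={"maxlags": 2})
print(result.summary())
```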

Keywords: corporate governance, corporate transparency, FDI flows, FDI inward performance

40 A Method for 3D Mesh Adaptation in FEA

Authors: S. Sfarni, E. Bellenger, J. Fortin, M. Guessasma

Abstract:

The use of mechanical simulation (in particular finite element analysis) requires managing assumptions in order to analyse a real, complex system. In finite element analysis (FEA), two modeling steps require assumptions before computations can be carried out and results obtained: building the physical model and building the simulation model. The simplification assumptions made on the analysed system in these two steps can generate two kinds of errors: physical modeling errors (mathematical model, domain simplifications, material properties, boundary conditions and loads) and mesh discretization errors. This paper proposes a mesh adaptation method based on an h-adaptive scheme combined with an error estimator in order to choose the mesh of the simulation model. This method allows the mesh of the simulation model to be chosen so as to control both the cost and the quality of the finite element analysis.
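
A minimal 1D illustration of the h-adaptive idea, assuming a crude derivative-jump indicator in place of the paper's actual error estimator; the mesh, refinement fraction and test field are arbitrary choices, not the authors' setup.

```python
import numpy as np

def error_indicator(u, x):
    """Crude element-wise indicator: jump of the slope across interior nodes
    (a stand-in for a real FEA error estimator such as Zienkiewicz-Zhu)."""
    du = np.diff(u) / np.diff(x)          # slope on each element
    jumps = np.abs(np.diff(du))           # derivative jump at interior nodes
    eta = np.zeros(len(x) - 1)
    eta[:-1] += 0.5 * jumps               # share each jump between neighbours
    eta[1:] += 0.5 * jumps
    return eta

def h_adapt(f, x, n_cycles=4, frac=0.3):
    """Refine the elements carrying the largest estimated error by bisection."""
    for _ in range(n_cycles):
        u = f(x)                                          # field on current mesh
        eta = error_indicator(u, x)
        worst = np.argsort(eta)[-max(1, int(frac * len(eta))):]
        midpoints = 0.5 * (x[worst] + x[worst + 1])
        x = np.sort(np.concatenate([x, midpoints]))
    return x

x0 = np.linspace(0.0, 1.0, 11)
mesh = h_adapt(lambda x: np.tanh(20 * (x - 0.5)), x0)     # steep gradient at x = 0.5
print(len(mesh), "nodes; finest spacing", np.min(np.diff(mesh)))
```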

Keywords: Finite element, discretization errors, adaptivity.

39 Effects of Market Share and Diversification on Nonlife Insurers' Performance

Authors: M. Pervan, T. Pavic Kramaric

Abstract:

The aim of this paper is to investigate the influence of market share and diversification on nonlife insurers' performance. These relationships have been investigated in different industries and disciplines (economics, management, ...), yet no consistency exists in either the magnitude or the statistical significance of the relationship between market share (or diversification) on one side and companies' performance on the other. Moreover, the direction of the relationship is also somewhat questionable: while some authors find it to be positive, others report a negative association. In order to test the influence of market share and diversification on companies' performance in the Croatian nonlife insurance industry for the period from 1999 to 2009, we designed an empirical model that includes the following independent variables: firms' profitability from previous years, market share, diversification, and control variables (i.e. ownership, industrial concentration, GDP per capita, inflation). Using the two-step generalized method of moments (GMM) estimator, we found evidence of a positive and statistically significant influence of both market share and diversification on insurers' profitability.

Keywords: Diversification, market share, nonlife insurance

38 A Novel Multiresolution-Based Optimization Scheme for Robust Affine Parameter Estimation

Authors: J. Dinesh Peter

Abstract:

This paper describes a new method for affine parameter estimation between image sequences. Usually, parameter estimation is performed by least squares with a quadratic error function. However, this technique is sensitive to the presence of outliers, so parameter estimation techniques for image processing applications must be robust enough to withstand the influence of outliers. To this end, robust estimation functions requiring non-quadratic and possibly non-convex potentials, adopted from the statistics literature, have been used. To optimize the error function in a framework that seeks a global optimum, the minimization begins with a convex estimator at the coarser level and gradually introduces non-convexity, i.e., moves from soft to hard redescending non-convex estimators as the iterations reach finer levels of the multiresolution pyramid. The performance of the proposed method is compared with the results obtained individually using two different estimators.
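
One common way to realize a redescending M-estimator for affine parameters is iteratively reweighted least squares; the sketch below uses Tukey's biweight on synthetic point correspondences and runs at a single resolution, so the coarse-to-fine schedule described above is only noted, not implemented.

```python
import numpy as np

def fit_affine_irls(src, dst, n_iter=20, c=4.685):
    """Robust affine fit dst ~ A @ src + t via iteratively reweighted least
    squares with Tukey's biweight (one member of the robust M-estimator family)."""
    n = len(src)
    # design matrix for parameters [a11 a12 tx a21 a22 ty]
    X = np.zeros((2 * n, 6))
    X[0::2, 0:2] = src; X[0::2, 2] = 1.0
    X[1::2, 3:5] = src; X[1::2, 5] = 1.0
    y = dst.reshape(-1)
    w = np.ones(2 * n)
    for _ in range(n_iter):
        W = np.sqrt(w)
        p, *_ = np.linalg.lstsq(W[:, None] * X, W * y, rcond=None)
        r = y - X @ p
        s = 1.4826 * np.median(np.abs(r)) + 1e-12     # robust scale (MAD)
        u = np.clip(np.abs(r) / (c * s), 0.0, 1.0)
        w = (1.0 - u**2) ** 2                         # Tukey biweight weights
    return p.reshape(2, 3)

rng = np.random.default_rng(1)
src = rng.uniform(0, 100, size=(200, 2))
A_true = np.array([[1.05, 0.08, 3.0], [-0.06, 0.98, -2.0]])
dst = src @ A_true[:, :2].T + A_true[:, 2]
dst[:20] += rng.uniform(-50, 50, size=(20, 2))        # 10% gross outliers
print(fit_affine_irls(src, dst))                      # close to A_true despite outliers
```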

Keywords: Image Processing, Affine parameter estimation, Outliers, Robust Statistics, Robust M-estimators

37 Linear Quadratic Gaussian/Loop Transfer Recovery Flight Control on a Nonlinear Model

Authors: T. Sanches, K. Bousson

Abstract:

As part of the development of a 4D autopilot system for unmanned aerial vehicles (UAVs), i.e. a time-dependent robust trajectory generation and control algorithm, this work addresses the problem of optimal path control based on flight sensor data outputs that may be unreliable due to noise in data acquisition and/or transmission under certain circumstances. Although several filtering methods, such as the Kalman-Bucy filter or Linear Quadratic Gaussian/Loop Transfer Recovery (LQG/LTR) control, are available, the complexity of the control system, together with the robustness and reliability required of such a system on a UAV for airworthiness-certifiable autonomous flight, demanded the development of a proper robust filter for a nonlinear system, as a way of further mitigating error propagation to the control system and improving its performance. Accordingly, a nonlinear algorithm based upon the LQG/LTR, validated through computational simulation testing, is proposed in this paper.

Keywords: Autonomous flight, LQG/LTR, nonlinear state estimator, robust flight control and stability.

36 A Self-Adaptive Genetic-Based Algorithm for the Identification and Elimination of Bad Data

Authors: A. A. Hossam-Eldin, E. N. Abdallah, M. S. El-Nozahy

Abstract:

The identification and elimination of bad measurements is one of the basic functions of a robust state estimator, as bad data corrupt the results of state estimation obtained with the popular weighted least squares method. However, this is a difficult problem to handle, especially when dealing with multiple interacting, conforming errors. In this paper, a self-adaptive genetic-based algorithm is proposed. The algorithm utilizes the results of the classical linearized normal residuals approach to tune the genetic operators; thus, instead of performing a randomized search throughout the whole search space, it is more likely to perform a directed search, and the optimum solution is obtained at very early stages (at most five generations). The algorithm also utilizes accumulated databases of already computed cases to reduce the computational burden to a minimum. Tests are conducted with reference to the standard IEEE test systems, and the test results are very promising.
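
The classical linearized normal-residuals step that seeds the genetic search can be sketched for a linear measurement model; the toy network, noise levels and 3-sigma threshold below are illustrative assumptions, not the paper's IEEE test cases.

```python
import numpy as np

def largest_normalized_residual(H, z, sigma, threshold=3.0):
    """Classical normalized-residual test: WLS state estimate, residual
    covariance, and the index of the most suspect measurement (or None)."""
    R = np.diag(sigma ** 2)
    Rinv = np.diag(1.0 / sigma ** 2)
    G = H.T @ Rinv @ H                        # gain matrix
    x_hat = np.linalg.solve(G, H.T @ Rinv @ z)
    r = z - H @ x_hat                         # measurement residuals
    Omega = R - H @ np.linalg.solve(G, H.T)   # residual covariance
    r_norm = np.abs(r) / np.sqrt(np.diag(Omega))
    worst = int(np.argmax(r_norm))
    return x_hat, r_norm, (worst if r_norm[worst] > threshold else None)

# toy linear (DC-like) state estimation example with one gross error
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0], [1.0, 1.0]])
x_true = np.array([1.02, 0.97])
sigma = np.full(4, 0.01)
z = H @ x_true + np.random.default_rng(2).normal(0, 0.01, 4)
z[2] += 0.2                                   # bad datum on measurement 2
x_hat, r_norm, suspect = largest_normalized_residual(H, z, sigma)
print("suspect measurement index:", suspect)
```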

Keywords: Bad Data, Genetic Algorithms, Linearized Normal residuals, Observability, Power System State Estimation.

35 Variogram Fitting Based on the Wilcoxon Norm

Authors: Hazem Al-Mofleh, John Daniels, Joseph McKean

Abstract:

Within geostatistics research, effective estimation of the variogram points has been examined, particularly in developing robust alternatives. The parametric fit of these variogram points, which eventually defines the kriging weights, however, has not received the same attention from a robust perspective. This paper proposes the use of the non-linear Wilcoxon norm over weighted non-linear least squares as a robust variogram fitting alternative. First, we introduce the concepts of variogram estimation and fitting. Then, as an alternative to non-linear weighted least squares, we discuss the non-linear Wilcoxon estimator. Next, the robustness properties of the non-linear Wilcoxon are demonstrated using a contaminated spatial data set. Finally, under simulated conditions, spatial processes with increasing levels of contamination have their variogram points estimated and fitted. In the fitting of these variogram points, both non-linear weighted least squares and non-linear Wilcoxon fits are examined for efficiency. At all levels of contamination (including 0%), using a robust estimation and robust fitting procedure, the non-linear Wilcoxon outperforms weighted least squares.
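
A rank-based (Wilcoxon-norm) fit can be obtained by minimizing Jaeckel's dispersion of the residuals with Wilcoxon scores; the sketch below fits a spherical variogram model to illustrative empirical variogram points and is not the authors' code or data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import rankdata

def spherical(h, nugget, sill, rng_):
    """Spherical variogram model gamma(h)."""
    rng_ = max(rng_, 1e-9)
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h <= rng_, g, nugget + sill)

def wilcoxon_dispersion(theta, h, gamma_hat):
    """Jaeckel dispersion sum a(R(e_i)) * e_i with Wilcoxon scores
    a(i) = sqrt(12) * (i/(n+1) - 1/2); minimizing gives the rank-based fit."""
    e = gamma_hat - spherical(h, *theta)
    n = len(e)
    a = np.sqrt(12.0) * (rankdata(e) / (n + 1.0) - 0.5)
    return np.sum(a * e)

# illustrative empirical variogram points with one contaminated lag
h = np.array([5, 10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
gamma_hat = spherical(h, 0.1, 1.0, 55.0) + np.random.default_rng(3).normal(0, 0.03, h.size)
gamma_hat[6] += 1.5

theta0 = np.array([0.05, 0.8, 40.0])          # nugget, partial sill, range
fit = minimize(wilcoxon_dispersion, theta0, args=(h, gamma_hat), method="Nelder-Mead")
print("nugget, sill, range:", fit.x)
```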

Keywords: Non-Linear Wilcoxon, robust estimation, Variogram estimation.

34 Computational Intelligence Hybrid Learning Approach to Time Series Forecasting

Authors: Chunshien Li, Jhao-Wun Hu, Tai-Wei Chiang, Tsunghan Wu

Abstract:

Time series forecasting is an important and widely popular topic in research on system modeling. This paper describes how to use a hybrid PSO-RLSE neuro-fuzzy learning approach for the problem of time series forecasting. The PSO algorithm is used to update the premise parameters of the proposed prediction system, and the RLSE is used to update the consequent parameters. Thanks to the hybrid learning (HL) approach for the neuro-fuzzy system, the prediction performance is excellent and the speed of learning convergence is much faster than that of the other compared approaches. In the experiments, we use the well-known Mackey-Glass chaotic time series. According to the experimental results, the prediction performance and accuracy in time series forecasting by the proposed approach are much better than those of the other compared approaches, as shown in Table IV. Excellent prediction performance by the proposed approach has been observed.
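
The RLSE part of the hybrid scheme is a standard recursive least-squares update for the linear (consequent) parameters. A minimal sketch on a synthetic identification task follows; the forgetting factor, initialization and toy model are arbitrary assumptions, not the paper's neuro-fuzzy system.

```python
import numpy as np

class RLSE:
    """Recursive least-squares estimator: theta and P are updated per sample."""
    def __init__(self, n_params, lam=1.0, delta=1e4):
        self.theta = np.zeros(n_params)       # parameter estimate
        self.P = delta * np.eye(n_params)     # inverse correlation matrix
        self.lam = lam                        # forgetting factor

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)    # gain vector
        err = y - phi @ self.theta            # a priori prediction error
        self.theta = self.theta + k * err
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# identify y = 2*x1 - 0.5*x2 + 1 from streaming samples
rng = np.random.default_rng(4)
rls = RLSE(3)
for _ in range(500):
    x = rng.normal(size=2)
    phi = np.array([x[0], x[1], 1.0])
    y = 2.0 * x[0] - 0.5 * x[1] + 1.0 + rng.normal(0, 0.01)
    rls.update(phi, y)
print(rls.theta)   # approximately [2.0, -0.5, 1.0]
```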

Keywords: forecasting, hybrid learning (HL), neuro-fuzzy system (NFS), particle swarm optimization (PSO), recursive least-squares estimator (RLSE), time series

33 Belt Conveyor Dynamics in Transient Operation for Speed Control

Authors: D. He, Y. Pang, G. Lodewijks

Abstract:

Belt conveyors play an important role in continuous dry bulk material transport, especially in the mining industry. Speed control is expected to reduce the energy consumption of belt conveyors. Transient operation is the operation of increasing or decreasing conveyor speed for speed control. According to the literature review, current research rarely takes conveyor dynamics in transient operation into account. However, in belt conveyor speed control the conveyor's dynamic behavior is significantly important, since poor dynamics may result in risks. In this paper, the potential risks in transient operation are analyzed. An existing finite element model is applied to build a conveyor model, and simulations are carried out to analyze the conveyor dynamics. In order to realize soft speed regulation, Harrison's sinusoidal acceleration profile is applied, and the Lodewijks estimator is built to approximate the required acceleration time. A long inclined belt conveyor is studied in two major simulations, and its dynamics are presented.
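
A sinusoidal (Harrison-type) acceleration profile can be generated as below, assuming the common raised-cosine speed form; the ramp time T, which the Lodewijks estimator would supply, is simply fixed here, and the exact formulation should be checked against the paper.

```python
import numpy as np

def harrison_profile(v0, v1, T, t):
    """Sinusoidal speed ramp from v0 to v1 over ramp time T: speed follows a
    raised cosine, so acceleration is a half sine starting and ending at zero
    (soft speed regulation). Sketch only."""
    t = np.clip(t, 0.0, T)
    v = v0 + 0.5 * (v1 - v0) * (1.0 - np.cos(np.pi * t / T))
    a = 0.5 * (v1 - v0) * (np.pi / T) * np.sin(np.pi * t / T)
    return v, a

t = np.linspace(0, 30, 301)                    # 30 s ramp, illustrative
v, a = harrison_profile(2.0, 4.0, 30.0, t)
print("peak acceleration [m/s^2]:", a.max())   # equals pi*(v1 - v0)/(2*T)
```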

Keywords: Belt conveyor, speed control, transient operation, dynamics

32 Inferences on Compound Rayleigh Parameters with Progressively Type-II Censored Samples

Authors: Abdullah Y. Al-Hossain

Abstract:

This paper considers inference under progressive type-II censoring with a compound Rayleigh failure time distribution. The maximum likelihood (ML) and Bayes methods are used for estimating the unknown parameters as well as some lifetime parameters, namely the reliability and hazard functions. We obtain Bayes estimators using conjugate priors for the shape and scale parameters. When the two parameters are unknown, closed-form expressions for the Bayes estimators cannot be obtained, so we use Lindley's approximation to compute the Bayes estimates. Another Bayes estimator is obtained based on a continuous-discrete joint prior for the unknown parameters. An example with real data is discussed to illustrate the proposed method. Finally, we compare these estimators with the maximum likelihood estimators using a Monte Carlo simulation study.

Keywords: Progressive type II censoring, compound Rayleigh failure time distribution, maximum likelihood estimation, Bayes estimation, Lindley's approximation method, Monte Carlo simulation.

31 An Evaluation Method of Accelerated Storage Life Test for Typical Mechanical and Electronic Products

Authors: Jinyong Yao, Hongzhi Li, Chao Du, Jiao Li

Abstract:

The reliability of long-term storage products is related to the availability of the whole system, so evaluation of storage life is of great necessity. These products are usually highly reliable, and little failure information can be collected. In this paper, an analytical method based on data from an accelerated storage life test is proposed to evaluate the reliability index of long-term storage products. First, singularities are eliminated by data normalization and residual analysis. Second, with the preprocessed data, a degradation path model is built to obtain pseudo life values. Then, by a life distribution hypothesis, we obtain the parameter estimators at the high stress levels and verify failure mechanism consistency. Finally, the life distribution under the normal stress level is extrapolated via the acceleration model, and an evaluation of the actual average life becomes available. An application example with a camera stabilization device is provided to illustrate the proposed methodology.
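
The degradation-path step can be sketched as fitting a linear path per unit and extrapolating the pseudo failure time at which the path crosses a threshold; the units, threshold and noise below are hypothetical, and the subsequent life-distribution fit and acceleration-model extrapolation are not shown.

```python
import numpy as np

def pseudo_lives(times, paths, threshold):
    """Fit a linear degradation path y = a + b*t per unit and return the
    pseudo failure time (threshold - a) / b for each unit."""
    lives = []
    for y in paths:
        b, a = np.polyfit(times, y, 1)       # slope, intercept
        lives.append((threshold - a) / b)
    return np.array(lives)

rng = np.random.default_rng(5)
t = np.linspace(0, 1000, 11)                 # hours under an accelerated stress level
# five hypothetical units degrading linearly with unit-to-unit slope variation
paths = [0.002 * (1 + 0.3 * rng.normal()) * t + rng.normal(0, 0.02, t.size)
         for _ in range(5)]
print(pseudo_lives(t, paths, threshold=5.0))  # pseudo lives in hours
```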

Keywords: Accelerated storage life test, failure mechanism consistency, life distribution, reliability.

30 Stator-Flux-Oriented Based Encoderless Direct Torque Control for Synchronous Reluctance Machines Using Sliding Mode Approach

Authors: J. Soltani, H. Abootorabi Zarchi, Gh. R. Arab Markadeh

Abstract:

In this paper a sliding-mode torque and flux control is designed for an encoderless synchronous reluctance motor drive. The sliding-mode plus PI controllers are designed in the stator-flux field-oriented reference frame and are able to track the reference signals with minimal pulsations in steady state. In addition, with these controllers a fast dynamic response is also achieved for the drive system. The proposed control scheme is robust to parameter variations except for the stator resistance. To solve this problem, a simple estimator is used for on-line detection of this parameter. Moreover, the rotor position and speed are estimated by on-line computation of the stator-flux space vector. The effectiveness and capability of the proposed control approach are verified by both simulation and experimental results.

Keywords: Synchronous Reluctance Motor, Direct Torque and Flux Control, Sliding Mode, Field-Oriented Frame, Encoderless.

29 Random Projections for Dimensionality Reduction in ICA

Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi

Abstract:

In this paper we present a technique to speed up ICA based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular we refer to the FastICA algorithm, which uses the kurtosis as the statistical property to be maximized. By performing a particular Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original dimension d, which guarantees a narrow confidence interval for the estimator with a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β easily computed a priori on the basis of the observations only. Extensive simulations have been carried out on different sets of real-world signals. They show that the achievable dimensionality reduction is in fact very high, that it preserves the quality of the decomposition, and that it impressively speeds up FastICA. On the other hand, a set of signals for which the estimated reduction rate is greater than 1 exhibits bad decomposition results if reduced, thus validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
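
The projection-then-ICA idea can be sketched with scikit-learn; note that sklearn's FastICA maximizes negentropy (logcosh) rather than kurtosis, and the reduced dimension k is fixed by hand here instead of being derived from the paper's control parameter β.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.random_projection import GaussianRandomProjection

# Random (Johnson-Lindenstrauss-style) projection from d to k channels,
# followed by ICA on the reduced data; all sizes below are illustrative.
rng = np.random.default_rng(6)
n_samples, n_sources, d = 5000, 4, 64
S = rng.laplace(size=(n_samples, n_sources))     # independent non-Gaussian sources
A = rng.normal(size=(n_sources, d))              # mixing into d observed channels
X = S @ A                                        # observed d-dimensional data

k = 16                                           # reduced dimension (k/d = 0.25)
X_red = GaussianRandomProjection(n_components=k, random_state=0).fit_transform(X)
S_est = FastICA(n_components=n_sources, random_state=0).fit_transform(X_red)
print(S_est.shape)                               # (5000, 4) estimated sources
```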

Keywords: Independent Component Analysis, FastICA algorithm, Higher-order statistics, Johnson-Lindenstrauss lemma.

28 Power System Security Constrained Economic Dispatch Using Real Coded Quantum Inspired Evolution Algorithm

Authors: A. K. Al-Othman, F. S. Al-Fares, K. M. EL-Nagger

Abstract:

This paper presents a new optimization technique based on quantum computing principles to solve the security constrained power system economic dispatch (SCED) problem. The proposed technique is a population-based algorithm which uses some quantum computing elements in coding and evolving groups of potential solutions to reach the optimum following a partially directed random approach. The SCED problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. The Real Coded Quantum-Inspired Evolution Algorithm (RQIEA) is then applied to solve the constrained optimization formulation. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and proves that RQIEA is well suited for solving the SCED problem.

Keywords: State Estimation, Fuzzy Linear Regression, Fuzzy Linear State Estimator (FLSE), Measurements Uncertainty.

27 Performances Comparison of Neural Architectures for On-Line Speed Estimation in Sensorless IM Drives

Authors: K. Sedhuraman, S. Himavathi, A. Muthuramalingam

Abstract:

The performance of a sensorless controlled induction motor drive depends on the accuracy of the estimated speed. Conventional estimation techniques, being mathematically complex, require more execution time, resulting in poor dynamic response. The nonlinear mapping capability and powerful learning algorithms of neural networks provide a promising alternative for on-line speed estimation. The on-line speed estimator requires the NN model to be accurate, simple in design, structurally compact and computationally inexpensive to ensure fast execution and effective control in real-time implementation. This in turn depends to a large extent on the type of neural architecture. This paper investigates three types of neural architectures for on-line speed estimation, and their performance is compared in terms of accuracy, structural compactness, computational complexity and execution time. The neural architecture suitable for on-line speed estimation is identified, and the promising results obtained are presented.

Keywords: Sensorless IM drives, rotor speed estimators, artificial neural network, feed-forward architecture, single neuron cascaded architecture.

26 Second Order Sliding Mode Observer Using MRAS Theory for Sensorless Control of Multiphase Induction Machine

Authors: Mohammad Jafarifar

Abstract:

This paper presents a speed estimation scheme based on the second-order sliding-mode Super Twisting Algorithm (STA) and Model Reference Adaptive System (MRAS) estimation theory for sensorless control of a multiphase induction machine. A stator current observer is designed based on the STA and is utilized to take the place of the reference voltage model of the standard MRAS algorithm. The observer is insensitive to variations of the rotor resistance and magnetizing inductance once the states reach the sliding mode. Derivatives of the rotor flux are obtained and designed as the state of the MRAS, thus eliminating the integration. Compared with a first-order sliding-mode speed estimator, the proposed scheme makes full use of the auxiliary sliding-mode surface, thus alleviating the chattering behavior without increasing the complexity. Simulation results show the robustness and effectiveness of the proposed scheme.

Keywords: Multiphase induction machine, field oriented control, sliding mode, super twisting algorithm, MRAS algorithm.

25 Perceptual JPEG Compliant Coding by Using DCT-Based Visibility Thresholds of Color Images

Authors: Kuo-Cheng Liu

Abstract:

Effective estimation of the just noticeable distortion (JND) of images helps increase the efficiency of a compression algorithm in which both the statistical redundancy and the perceptual redundancy should be accurately removed. In this paper, we design a DCT-based model for estimating the JND profiles of color images. Based on a mathematical model measuring the base detection threshold for each DCT coefficient in each color component, the luminance masking adjustment, the contrast masking adjustment and the cross masking adjustment are utilized for the luminance component, and a variance-based masking adjustment based on the coefficient variation within the block is proposed for the chrominance components. In order to verify the proposed model, the JND estimator is incorporated into a conventional JPEG coder to improve the compression performance. A subjective and fair viewing test is designed to evaluate the visual quality of the coded images under the specified viewing condition. The simulation results show that the JPEG coder integrated with the proposed DCT-based JND model gives better coding bit rates at visually lossless quality for a variety of color images.

Keywords: Just-noticeable distortion (JND), discrete cosine transform (DCT), JPEG.

24 Application of Mutual Information based Least dependent Component Analysis (MILCA) for Removal of Ocular Artifacts from Electroencephalogram

Authors: V Krishnaveni, S Jayaraman, K Ramadoss

Abstract:

The electrical potentials generated during eye movements and blinks are one of the main sources of artifacts in electroencephalogram (EEG) recordings and can propagate widely across the scalp, masking and distorting brain signals. In recent times, signal separation algorithms have been widely used for removing artifacts from observed EEG data. In this paper, a recently introduced signal separation algorithm, Mutual Information based Least dependent Component Analysis (MILCA), is employed to separate ocular artifacts from the EEG. The aim of MILCA is to minimize the mutual information (MI) between the independent components (estimated sources) under a pure rotation. The performance of this algorithm is compared with eleven popular algorithms (Infomax, Extended Infomax, FastICA, SOBI, TDSEP, JADE, OGWE, MS-ICA, SHIBBS, Kernel-ICA, and RADICAL) in terms of the actual independence and uniqueness of the estimated source components obtained for different sets of EEG data with ocular artifacts, using a reliable MI estimator. Results show that MILCA is best at separating the ocular artifacts from the EEG and is recommended for further analysis.

Keywords: Electroencephalogram, Ocular Artifacts (OA), Independent Component Analysis (ICA), Mutual Information (MI), Mutual Information based Least dependent Component Analysis (MILCA)

23 Enhancing the Performance of Wireless Sensor Networks Using Low Power Design

Authors: N. Mahendran, R. Madhuranthi

Abstract:

Wireless sensor networks (WSNs) are constantly in demand to process information more rapidly with lower energy and area cost. Presently, processor-based solutions find it difficult to achieve high processing speed with low power consumption. This paper presents a simple and accurate data processing scheme for a low-power wireless sensor node, based on a reduced number of processing elements (PEs). The presented model provides a simple recursive structure (SRS) to process the sampled data in the wireless sensor environment and to reduce the power consumption of the wireless sensor node. Based on this model, the incoming samples are processed to produce a smaller amount of data sufficient to reconstruct the original signal. The ModelSim simulator is used to simulate the SRS structure, and functional simulation is carried out for the validation of the presented architecture. The Xilinx Power Estimator (XPE) tool is used to measure the power consumption. The experimental results show an average power consumption of 91 mW, a 42% improvement compared to the folded tree architecture.

Keywords: Power consumption, energy efficiency, low power WSN node, recursive structure, sleep/wake scheduling.

22 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition

Authors: A. Bayaga

Abstract:

This research provides a technical account of estimating transition probabilities using a time-homogeneous Markov jump process applied to South African HIV/AIDS data from Statistics South Africa. It employs a maximum likelihood estimator (MLE) model to explore the possible influence of transition probabilities on mortality cases, with the data based on actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic. The model was fitted to age-specific HIV prevalence data and recorded death data using the MLE model. Although previous model results suggest that HIV prevalence and AIDS mortality rates in South Africa declined over 2002-2013, our results differ evidently from the generally accepted HIV models (Spectrum/EPP and ASSA2008) for South Africa. Supplementary research is therefore needed to refine the demographic parameters in the model and to apply it to each of the nine provinces of South Africa.
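
For a time-homogeneous Markov jump process, the MLE of each transition intensity is the observed number of jumps divided by the total time at risk in the origin state, and transition probabilities over an interval follow from the matrix exponential of the intensity matrix. The three-state example and counts below are hypothetical, not the Statistics South Africa data.

```python
import numpy as np
from scipy.linalg import expm

def mle_intensity(transitions, sojourn_time, n_states):
    """MLE for a time-homogeneous Markov jump process:
    q_ij = (# observed i->j jumps) / (total time spent in state i)."""
    Q = np.zeros((n_states, n_states))
    for (i, j), count in transitions.items():
        Q[i, j] = count / sojourn_time[i]
    np.fill_diagonal(Q, -Q.sum(axis=1))       # rows of Q sum to zero
    return Q

# hypothetical 3-state illness-death style data: 0 = HIV-, 1 = HIV+, 2 = dead
transitions = {(0, 1): 120, (0, 2): 15, (1, 2): 60}        # observed jump counts
sojourn_time = np.array([5000.0, 1500.0, np.inf])          # person-years per state
Q = mle_intensity(transitions, sojourn_time, 3)
P_5yr = expm(Q * 5.0)                          # transition probabilities over 5 years
print(np.round(P_5yr, 3))
```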

Keywords: AIDS mortality rates, Epidemiological model, Time-homogeneous Markov Jump Process, Transition Probability, Statistics South Africa.

21 Robotic End-Effector Impedance Control without Expensive Torque/Force Sensor

Authors: Shiuh-Jer Huang, Yu-Chi Liu, Su-Hai Hsiang

Abstract:

A novel low-cost impedance control structure is proposed for monitoring the contact force between the end-effector and the environment without installing an expensive force/torque sensor. Theoretically, the end-effector contact force can be estimated from the superposition of the individual joint control torques. There is a nonlinear matrix mapping function between the joint motor control inputs and the end-effector actuating force/torque vector. This new force control structure can be implemented based on this estimated mapping matrix. First, the robot end-effector is manipulated to specified positions, and then the force controller is actuated based on the Hall-sensor current feedback of each joint motor. The model-free fuzzy sliding mode control (FSMC) strategy is employed to design the position and force controllers, respectively. All the hardware circuits and software control programs are implemented on an Altera Nios II embedded development kit to constitute an embedded system structure for a retrofitted Mitsubishi 5-DOF robot. Experimental results show that the PI and FSMC force control algorithms can achieve a reasonable contact force monitoring objective with this hardware control structure.
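
In the static case, the torque-to-force mapping idea reduces to a Jacobian-transpose relation; the sketch below assumes a hypothetical 2-link Jacobian and motor torque constants, and ignores gravity and friction compensation, so it is only an illustration of the estimation principle.

```python
import numpy as np

def estimate_contact_force(J, tau):
    """Static force estimate from joint torques: tau = J^T F  =>  F = pinv(J^T) tau."""
    return np.linalg.pinv(J.T) @ tau

# hypothetical 2-link planar arm Jacobian at some configuration
J = np.array([[-0.35, -0.15],
              [ 0.42,  0.20]])           # maps joint rates to end-effector velocity
k_t = np.array([0.08, 0.05])             # torque constants [Nm/A], hypothetical
currents = np.array([1.2, -0.4])         # Hall-sensor current feedback [A]
tau = k_t * currents                     # joint torques inferred from motor currents
print(estimate_contact_force(J, tau))    # estimated contact force [N]
```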

Keywords: Robot, impedance control, fuzzy sliding mode control, contact force estimator.

20 3D Locomotion and Fractal Analysis of Goldfish for Acute Toxicity Bioassay

Authors: Kittiwann Nimkerdphol, Masahiro Nakagawa

Abstract:

The biological reactions of individuals of a test animal to a toxic substance are unique and can be used as an indication of the presence of the toxic substance. However, distinguishing such phenomena requires a very complicated system, and analyzing the data in three dimensions is even more complicated. In this paper, a system that uses fractal analysis to evaluate in vitro biological responses to acute toxicity from the stochastic, self-affine, non-stationary signal of 3D goldfish swimming is introduced. Regular digital camcorders are utilized by the proposed algorithm 3DCCPC to effectively capture and reconstruct the 3D movements of the fish. The Critical Exponent Method (CEM) has been adopted as the fractal estimator. The hypothesis was that the swimming of goldfish exposed to acute toxicity would show a fractal property related to the toxic concentration. The experimental results supported the hypothesis by showing that the swimming of goldfish under different toxic concentrations has fractal properties. They also show that the fractal dimension of the swimming is related to the pH value by FD ≈ 0.26 pH + 0.05. With the proposed system, the fish is allowed to swim freely in all directions to react to the toxicant. In addition, the trajectories are precisely evaluated by fractal analysis with the critical exponent method, and hence the results are obtained with a much higher degree of confidence.

Keywords: 3D locomotion, bioassay, critical exponent method, CEM, fractal analysis, goldfish.

19 Maternal Health Outcome and Economic Growth in Sub-Saharan Africa: A Dynamic Panel Analysis

Authors: Okwan Frank

Abstract:

Maternal health outcome is one of the major population development challenges in Sub-Saharan Africa. The region has the highest maternal mortality ratio, despite its progressive economic growth during the global economic crisis. It has been hypothesized that an increase in economic growth will reduce the level of maternal mortality. The purpose of this study is to investigate the existence of a negative relationship between the health outcome proxied by the maternal mortality ratio and economic growth in Sub-Saharan Africa. The study used the Pooled Mean Group estimator of the Autoregressive Distributed Lag (ARDL) model and the Kao cointegration test to examine the short-run and long-run relationship between maternal mortality and economic growth. The results of the cointegration test showed the existence of a long-run relationship between the variables considered in the study. The long-run Pooled Mean Group estimates confirmed the hypothesis of an inverse relationship between the maternal health outcome proxied by the maternal mortality ratio and economic growth proxied by Gross Domestic Product (GDP) per capita. Thus, increasing economic growth by investing in health care systems to reduce pregnancy and childbirth complications will help reduce maternal mortality in the sub-region.

Keywords: Economic growth, maternal mortality, pooled mean group, Sub-Saharan Africa.

18 High Performance of Direct Torque and Flux Control of a Double Stator Induction Motor Drive with a Fuzzy Stator Resistance Estimator

Authors: K. Kouzi

Abstract:

In order to achieve stable, high-performance direct torque and flux control (DTFC) of a double star induction motor (DSIM) drive, proper on-line adaptation of the stator resistance is very important. This is due to the variation of the stator resistance during operation, which introduces errors in the estimated flux position and in the magnitude of the stator flux. Errors in the estimated stator flux deteriorate the performance of the DTFC drive, and the effect of estimation errors is especially important at low speed. Our aim is therefore to overcome the sensitivity of the DTFC to stator resistance variation by proposing an on-line fuzzy estimation of the stator resistance. The fuzzy estimation method is based on an on-line stator resistance correction driven by the stator current estimation error and its variation. The fuzzy logic controller gives the future stator resistance increment at its output. The main advantage of the suggested control algorithm is that it avoids the drive instability that may occur in certain situations and ensures tracking of the actual stator resistance. The validity of the technique and the improvement of the whole system performance are proved by the results.

Keywords: Direct torque control, dual stator induction motor, fuzzy logic estimation, stator resistance adaptation.

17 A Renovated Cook's Distance Based on the Buckley-James Estimate in Censored Regression

Authors: Nazrina Aziz, Dong Q. Wang

Abstract:

Various methods based on regression ideas have been created to handle data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's distance, RD*_i, and has been developed based on Cook's idea. The renovated Cook's distance RD*_i has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, since it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the coefficient estimate when the i-th case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*_i in which information from all p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
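
For reference, the classical (uncensored) Cook's distance that the renovated RD*_i extends is readily computed with statsmodels; the Buckley-James-based, censored-data version itself is not implemented here, and the data below are synthetic.

```python
import numpy as np
import statsmodels.api as sm

# Classical Cook's distance on a synthetic regression with one influential point.
rng = np.random.default_rng(7)
x = rng.uniform(0, 10, 40)
y = 1.5 + 0.8 * x + rng.normal(0, 1.0, 40)
x[10], y[10] = 25.0, 5.0                          # high-leverage, off-trend case

fit = sm.OLS(y, sm.add_constant(x)).fit()
cooks_d, _ = fit.get_influence().cooks_distance   # Cook's distance per case
print("most influential case:", np.argmax(cooks_d), "D =", cooks_d.max())
```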

Keywords: Buckley-James estimators, censored regression, censored data, diagnostic analysis, product-limit estimator, renovated Cook's Distance.

16 Trajectory Guided Recognition of Hand Gestures having only Global Motions

Authors: M. K. Bhuyan, P. K. Bora, D. Ghosh

Abstract:

One very interesting field of research in pattern recognition that has gained much attention in recent times is gesture recognition. In this paper, we consider a form of dynamic hand gestures characterized by the total movement of the hand (arm) in space. For these types of gestures, the shape of the hand (palm) during gesturing does not bear any significance. In our work, we propose a model-based method for tracking hand motion in space and thereby estimating the hand motion trajectory. We employ the dynamic time warping (DTW) algorithm for time alignment and normalization of the spatio-temporal variations that exist among samples belonging to the same gesture class. During training, one template trajectory and one prototype feature vector are generated for every gesture class. The features used in our work include some static and dynamic motion trajectory features. Recognition is accomplished in two stages. In the first stage, all unlikely gesture classes are eliminated by comparing the input gesture trajectory to all the template trajectories. In the next stage, the feature vector extracted from the input gesture is compared to all the class prototype feature vectors using a distance classifier. Experimental results demonstrate that our proposed trajectory estimator and classifier are suitable for Human Computer Interaction (HCI) platforms.
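
A minimal dynamic time warping distance between two variable-length trajectories can be written as below; the circular test trajectories are illustrative, and no warping-window constraint or feature extraction from the paper is applied.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two trajectories, each an array
    of shape (T, d); aligns samples of unequal length by dynamic programming."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])    # local distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# two noisy samples of the same circular hand trajectory at different speeds
t1 = np.linspace(0, 2 * np.pi, 60)
t2 = np.linspace(0, 2 * np.pi, 45)
traj1 = np.c_[np.cos(t1), np.sin(t1)] + np.random.default_rng(8).normal(0, 0.02, (60, 2))
traj2 = np.c_[np.cos(t2), np.sin(t2)]
print(dtw_distance(traj1, traj2))
```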

Keywords: Hand gesture, human computer interaction, key video object plane, dynamic time warping.

15 Objects Extraction by Cooperating Optical Flow, Edge Detection and Region Growing Procedures

Authors: C. Lodato, S. Lopes

Abstract:

The image segmentation method described in this paper has been developed as a pre-processing stage for methodologies and tools for content-based video/image indexing and retrieval. The method solves the problem of extracting whole objects from the background and produces images of single complete objects from videos or photos. The extracted images are used for calculating the object visual features necessary for both the indexing and retrieval processes. The segmentation algorithm is based on the cooperation among an optical flow evaluation method, edge detection and region growing procedures. The optical flow estimator belongs to the class of differential methods. It permits the detection of motions ranging from a fraction of a pixel to a few pixels per frame, achieves good results in the presence of noise without the need for a filtering pre-processing stage, and includes a specialised model for moving object detection. The first task of the presented method exploits the cues from motion analysis to detect moving areas. Objects and background are then refined using edge detection and seeded region growing procedures, respectively. All the tasks are performed iteratively until objects and background are completely resolved. The method has been applied to a variety of indoor and outdoor scenes in which objects of different type and shape appear on variously textured backgrounds.

Keywords: Image Segmentation, Motion Detection, Object Extraction, Optical Flow

14 State Estimation Based on Unscented Kalman Filter for Burgers’ Equation

Authors: Takashi Shimizu, Tomoaki Hashimoto

Abstract:

Controlling the flow of fluids is a challenging problem that arises in many fields. Burgers' equation is a fundamental equation for several flow phenomena such as traffic, shock waves, and turbulence. An optimal feedback control method, so-called model predictive control, has been proposed for Burgers' equation. However, the model predictive control method is inapplicable to systems whose state variables are not all exactly known. From a practical point of view, it is unusual that all the state variables of a system are exactly known, because the state variables are measured through output sensors and only limited parts of them are available. In fact, flow velocities of fluid systems usually cannot be measured over the whole spatial domain. Hence, any practical feedback controller for fluid systems must incorporate some type of state estimator. To apply model predictive control to fluid systems described by Burgers' equation, a state estimation method for Burgers' equation with limited measurable state variables needs to be established. For this purpose, we apply the unscented Kalman filter to estimate the state variables of fluid systems described by Burgers' equation. The objective of this study is to establish a state estimation method based on the unscented Kalman filter for Burgers' equation. The effectiveness of the proposed method is verified by numerical simulations.
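
A sketch of the idea using the third-party filterpy library: a coarse finite-difference discretization of Burgers' equation serves as the UKF process model, and only three grid points are "measured". Grid size, noise covariances and sigma-point parameters are arbitrary assumptions, not the authors' settings.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

N, dx, dt, nu = 20, 1.0 / 20, 0.001, 0.05            # grid points, spacing, step, viscosity

def burgers_step(u, dt):
    """One explicit finite-difference step of u_t + u*u_x = nu*u_xx (periodic BC)."""
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx ** 2
    return u + dt * (-u * ux + nu * uxx)

def hx(u):
    """Only a few velocities are measured (sparse sensors)."""
    return u[[2, 9, 16]]

points = MerweScaledSigmaPoints(n=N, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=N, dim_z=3, dt=dt, fx=burgers_step, hx=hx, points=points)
ukf.x = np.zeros(N)                                   # initial state guess
ukf.P *= 0.5
ukf.Q = 1e-4 * np.eye(N)
ukf.R = 1e-3 * np.eye(3)

x_grid = np.linspace(0, 1, N, endpoint=False)
u_true = np.sin(2 * np.pi * x_grid)                   # unknown true velocity field
rng = np.random.default_rng(9)
for _ in range(200):
    u_true = burgers_step(u_true, dt)                 # simulate the true system
    z = hx(u_true) + rng.normal(0, 0.03, 3)           # noisy sparse measurements
    ukf.predict()
    ukf.update(z)
print("RMS estimation error:", np.sqrt(np.mean((ukf.x - u_true) ** 2)))
```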

Keywords: State estimation, fluid systems, observer systems, unscented Kalman filter.

13 Monte Carlo Estimation of Heteroscedasticity and Periodicity Effects in a Panel Data Regression Model

Authors: Nureni O. Adeboye, Dawud A. Agunbiade

Abstract:

This research investigates the effects of heteroscedasticity and periodicity in a panel data regression model (PDRM) by extending previous work on balanced panel data estimation in the context of fitting a PDRM for banks' audit fees. The estimation of such a model was achieved through the derivation of a joint Lagrange Multiplier (LM) test for homoscedasticity and zero serial correlation, a conditional LM test for zero serial correlation given heteroscedasticity of varying degrees, and a conditional LM test for homoscedasticity given first-order positive serial correlation, via a two-way error component model. Monte Carlo simulations were carried out for 81 different variations, whose design assumed a uniform distribution under a linear heteroscedasticity function. Each variation was iterated 1000 times, and the assessment of the three estimators considered is based on the variance, absolute bias (ABIAS), mean square error (MSE) and root mean square error (RMSE) of the parameter estimates. Eighteen different models were fitted under different specified conditions, and the best-fitted model is that of the within estimator when heteroscedasticity is severe at either zero or positive serial correlation. The LM test results showed that the tests have good size and power, as all three tests are significant at 5% for the specified linear form of the heteroscedasticity function, which establishes that banks' operations are severely heteroscedastic in nature, with little or no periodicity effects.
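
As a reduced illustration of LM testing for heteroscedasticity, a cross-sectional Breusch-Pagan test is shown below; the paper's joint and conditional panel LM tests for a two-way error component model are more involved and are not reproduced, and the variables are synthetic.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

# Synthetic cross-section: error variance grows with a size-like regressor,
# mimicking heteroscedastic audit fees.
rng = np.random.default_rng(10)
n = 500
size = rng.uniform(1, 10, n)                       # hypothetical bank size measure
X = sm.add_constant(size)
y = 2.0 + 0.5 * size + rng.normal(0, 0.2 * size)   # heteroscedastic errors

resid = sm.OLS(y, X).fit().resid
lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(resid, X)
print(f"LM = {lm_stat:.2f}, p = {lm_pvalue:.4f}")  # small p => reject homoscedasticity
```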

Keywords: Audit fee, heteroscedasticity, Lagrange multiplier test, periodicity.

12 Variational EM Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we propose a variational EM inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. The algorithm simultaneously derives both the posterior distribution of the latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework, and proceeds in two steps, called the expectation and maximization steps. First, in the expectation step, using Bayes' formula and the LA technique, we derive an approximate posterior distribution of the latent function indicating the possibility that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix necessary to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
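
For comparison, scikit-learn's multi-class GP classifier also relies on a Laplace approximation to the latent posterior (fitted one-vs-rest, with marginal-likelihood hyper-parameter tuning rather than the variational EM scheme proposed above); it is shown on the Iris data purely as an accessible baseline, not on the KTH experiments.

```python
from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

# Multi-class GP classification with an RBF kernel; hyper-parameters are tuned
# by maximizing the (Laplace-approximated) marginal likelihood.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gpc.fit(X_tr, y_tr)
print("test accuracy:", gpc.score(X_te, y_te))
print("optimized kernel:", gpc.kernel_)
```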

Keywords: Bayesian rule, Gaussian process classification model with multiclass, Gaussian process prior, human action classification, Laplace approximation, variational EM algorithm.
