Search results for: Background Noise Statistical Modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4321


4021 Semi-automatic Background Detection in Microscopic Images

Authors: Alessandro Bevilacqua, Alessandro Gherardi, Ludovico Carozza, Filippo Piccinini

Abstract:

Recent years have seen an increasing use of image analysis techniques in biomedical imaging, in particular in microscopic imaging. Most of these techniques rely on a background image free of objects of interest, whether cells or histological samples, to perform further analysis such as segmentation or mosaicing. Commonly, this image is an empty field acquired in advance. However, acquiring an empty field is often not feasible; the empty field may also differ from the background of the sample actually being studied, because of interaction with the organic matter; and it can be expensive, for instance in live-cell analyses. We propose a non-parametric, general-purpose approach in which the background is built automatically from a sequence of images that may also contain objects of interest. The amount of object-free area in each image only affects how quickly the background is obtained. Experiments with different kinds of microscopic images demonstrate the effectiveness of our approach.
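For intuition, a minimal non-parametric baseline in the spirit of this abstract is a pixel-wise temporal median over the sequence, which ignores foreground objects as long as each pixel is object-free in most frames. The sketch below is illustrative only and is not the authors' algorithm; the synthetic frames and object motion are invented.

```python
# Pixel-wise temporal median as a simple background estimate from a sequence
# that still contains moving objects of interest (illustrative baseline only).
import numpy as np

def estimate_background(frames):
    """frames: iterable of 2-D grayscale arrays of identical shape."""
    stack = np.stack(list(frames), axis=0).astype(np.float64)
    return np.median(stack, axis=0)

# Synthetic usage: a flat noisy field plus a bright object that moves fast
# enough to leave every pixel object-free in most frames.
rng = np.random.default_rng(0)
frames = []
for t in range(20):
    img = 100.0 + rng.normal(0.0, 2.0, size=(64, 64))     # noisy empty field
    img[10 + 2 * t:20 + 2 * t, 30:40] += 80.0             # moving object
    frames.append(img)
background = estimate_background(frames)
print(background.mean())   # close to 100, the true background level
```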

Keywords: Microscopy, flat field correction, background estimation, image segmentation.

4020 Statistical Process Optimization Through Multi-Response Surface Methodology

Authors: S. Raissi, R. Eslami Farsani

Abstract:

In recent years, response surface methodology (RSM) has attracted the attention of many quality engineers in different industries. Most of the published literature on robust design methodology is concerned with optimization of a single response or quality characteristic that is often most critical to consumers. For most products, however, quality is multidimensional, so it is common to observe multiple responses in an experimental situation. Through this paper, interested readers will become familiar with this methodology via a survey of the most cited technical papers. It is believed that the procedure proposed in this study can resolve a complex parameter design problem with more than two responses. It can be applied to areas where there are large data sets and a number of responses are to be optimized simultaneously. In addition, the proposed procedure is relatively simple and can be implemented easily using ready-made standard statistical packages.
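As an illustration of combining several responses into one objective, the sketch below uses the desirability-function approach that is commonly paired with multi-response surface methodology; the abstract does not name a specific aggregation method, so this choice, the ramp limits and the example predictions are assumptions.

```python
# Desirability-function aggregation for two responses (illustrative sketch).
import numpy as np

def desirability_larger_is_better(y, low, high, weight=1.0):
    """Map a response to [0, 1]: 0 below `low`, 1 above `high`, ramp between."""
    d = np.clip((y - low) / (high - low), 0.0, 1.0)
    return d ** weight

def overall_desirability(ds):
    ds = np.asarray(ds, dtype=float)
    return ds.prod() ** (1.0 / len(ds))        # geometric mean of desirabilities

# Two fitted response surfaces evaluated at one candidate factor setting.
y1, y2 = 42.0, 0.87                            # e.g. predicted strength and yield
d1 = desirability_larger_is_better(y1, low=30.0, high=50.0)
d2 = desirability_larger_is_better(y2, low=0.80, high=0.95)
print(overall_desirability([d1, d2]))          # settings with a higher D are preferred
```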

Keywords: Multi-Response Surface Methodology (MRSM), Design of Experiments (DOE), Process modeling, Quality improvement, Robust Design.

4019 Modelling and Simulation of the Freezing Systems and Heat Pumps Using Unisim® Design

Authors: C. Patrascioiu

Abstract:

The paper describes the modeling and simulation of heat pump processes. The main objective of the study is the use of a heat pump in propene–propane distillation processes. The modeling and simulation tool is the Unisim® Design simulator. The paper is structured in three parts: an overview of gas compression, the modeling and simulation of freezing systems, and the modeling and simulation of heat pumps. For each of these systems, the Unisim® Design simulation diagrams, the input–output system structure and the numerical results are presented. Future studies will consider modeling and simulation of the propene–propane distillation process with a heat pump.

Keywords: Distillation, heat pump, simulation, Unisim Design.

4018 Robust Coherent Noise Suppression by Point Estimation of the Cauchy Location Parameter

Authors: Ephraim Gower, Thato Tsalaile, Monageng Kgwadi, Malcolm Hawksford

Abstract:

This paper introduces a new point estimation algorithm, with particular focus on coherent noise suppression, given several measurements of the device under test, where it is assumed that 1) the noise is first-order stationary and 2) the device under test is linear and time-invariant. The algorithm exploits the robustness of the Pitman estimator of the Cauchy location parameter through the initial scaling of the test signal by a centred Gaussian variable of predetermined variance. Mathematical derivations and simulation results show that the proposed algorithm is more accurate and more consistently robust to outliers, across differently tailed density functions, than the conventional methods of the sample mean (coherent averaging) and the sample median.
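A minimal numerical sketch of the Pitman estimator of the Cauchy location parameter is given below: it evaluates the posterior-mean form θ̂ = ∫θ L(θ)dθ / ∫L(θ)dθ on a grid centred on the sample median. This illustrates only the estimator named in the abstract, not the authors' full suppression algorithm; the grid settings and test data are assumptions.

```python
# Pitman (minimum-risk equivariant) location estimate for a Cauchy sample,
# computed by numerical integration of the likelihood over a grid.
import numpy as np

def pitman_cauchy_location(x, scale=1.0, grid_pad=20.0, grid_points=4001):
    x = np.asarray(x, dtype=float)
    centre = np.median(x)
    theta = np.linspace(centre - grid_pad, centre + grid_pad, grid_points)
    # Log-likelihood of Cauchy(theta, scale) evaluated for every grid value.
    log_lik = -np.log1p(((x[None, :] - theta[:, None]) / scale) ** 2).sum(axis=1)
    w = np.exp(log_lik - log_lik.max())          # rescaled for numerical stability
    return float(np.trapz(theta * w, theta) / np.trapz(w, theta))

rng = np.random.default_rng(1)
sample = rng.standard_cauchy(50) + 3.0           # true location is 3
print(pitman_cauchy_location(sample))            # robust to the heavy tails
```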

Keywords: Central limit theorem, Fisher-Cramer Rao, gamma function, Pitman estimator.

4017 Power Integrity Analysis of Power Delivery System in High Speed Digital FPGA Board

Authors: Anil Kumar Pandey

Abstract:

Power plane noise is the most significant source of signal integrity (SI) issues in a high-speed digital design. In this paper, a power integrity (PI) analysis of the multiple power planes in the power delivery system of a 12-layer high-speed FPGA board is presented. All 10 power planes of the HSD board are analyzed separately using a 3D electromagnetic-based PI solver; a transient simulation is then performed on the combined PI data of all planes, together with the voltage regulator modules (VRMs) and 70 current-drawing chips, to obtain the board-level power noise coupling onto different high-speed signals. Decoupling capacitors are placed between the power planes and ground to reduce power noise coupling with the signals.

Keywords: Channel simulation, electromagnetic simulation, power-aware signal integrity analysis, power integrity, PIPro.

4016 Wiener Filter as an Optimal MMSE Interpolator

Authors: Tsai-Sheng Kao

Abstract:

The ideal sinc filter, which ignores the noise statistics, is often applied to generate an arbitrary sample of a bandlimited signal from uniformly sampled data. In this article, an optimal interpolator is proposed; it reaches the minimum mean square error (MMSE) at its output in the presence of noise. The resulting interpolator is thus a Wiener filter, and both the optimal infinite impulse response (IIR) and finite impulse response (FIR) filters are presented. The mean square errors (MSEs) for interpolators of different impulse-response lengths are obtained by computer simulation; the results show that the MSEs of the proposed interpolators of reasonable length improve by about 0.4 dB under flat power spectra in a noisy environment with a signal-to-noise ratio (SNR) of 10 dB. As expected, the results also demonstrate improvements in the MSEs, for various fractional delays, of the optimal interpolator over the ideal sinc filter at a fixed impulse-response length.
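The sketch below shows one way the FIR MMSE (Wiener) fractional-delay interpolator can be obtained in closed form, h = R⁻¹p, for a flat-spectrum signal occupying a fraction of the Nyquist band and observed in white noise. The bandwidth, SNR and filter length are illustrative assumptions, not the paper's exact setup; the ideal sinc taps are shown for comparison.

```python
# FIR Wiener fractional-delay interpolator: h = R^{-1} p, where R is the
# correlation matrix of the noisy samples and p the cross-correlation with the
# desired fractionally delayed sample (flat spectrum over a fraction B of the
# Nyquist band, white observation noise).
import numpy as np

def wiener_fractional_delay(num_taps=11, mu=0.3, snr_db=10.0, bandwidth=0.8):
    n = np.arange(num_taps) - num_taps // 2
    snr = 10.0 ** (snr_db / 10.0)
    R = np.sinc(bandwidth * (n[:, None] - n[None, :])) + np.eye(num_taps) / snr
    p = np.sinc(bandwidth * (n - mu))
    return np.linalg.solve(R, p)

h_wiener = wiener_fractional_delay()
h_sinc = np.sinc(np.arange(11) - 5 - 0.3)        # ideal sinc taps, noise ignored
print(h_wiener.round(3))
print(h_sinc.round(3))
```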

Keywords: Interpolator, minimum mean square error, Wiener filter.

4015 Using Combination of Optimized Recurrent Neural Network with Design of Experiments and Regression for Control Chart Forecasting

Authors: R. Behmanesh, I. Rahimi

Abstract:

A recurrent neural network (RNN) is an efficient tool for modeling production control processes as well as services. In this paper, an RNN was combined with a regression model, and the combination was used to check whether the data produced by the model, compared with the actual data, are valid for a variables control chart. A maintenance process in a workshop of the Esfahan Oil Refining Co. (EORC) was used to illustrate the models. First, a regression model was fitted to predict the response time of the process from the identified factors; then the error between the actual and predicted response times was used as the output of the RNN, with the same factors as inputs. Finally, the data predicted by the combined model were checked against test values in statistical process control to determine whether the forecasting accuracy is acceptable. In the training of the RNN, a design of experiments was used to optimize the network.

Keywords: RNN, DOE, regression, control chart.

4014 Smartphone Video Source Identification Based on Sensor Pattern Noise

Authors: Raquel Ramos López, Anissa El-Khattabi, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

An increasing number of mobile devices with integrated cameras has meant that most digital video now comes from these devices. These videos can be made anytime, anywhere and for different purposes. They can also be shared on the Internet within a short period of time and may sometimes contain recordings of illegal acts. The need to reliably trace their origin becomes evident when these videos are used for forensic purposes. This work proposes an algorithm to identify the brand and model of the mobile device that generated a video. The procedure is as follows: after the relevant video information is obtained, a classification algorithm based on sensor pattern noise and the wavelet transform performs the identification. We also present experimental results that support the validity of the techniques used and show promising results.
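A hedged sketch of the fingerprint-and-match idea is given below: each device's sensor pattern noise is approximated by averaging noise residuals (frame minus a denoised version), and a query video is attributed to the device whose fingerprint correlates best with the query residual. A Gaussian blur stands in for the wavelet-based denoiser used in the paper, and all names and parameters are illustrative.

```python
# PRNU-style source identification: average noise residuals form a camera
# fingerprint; a query is assigned by maximum normalized correlation.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(frame, sigma=1.5):
    frame = frame.astype(np.float64)
    return frame - gaussian_filter(frame, sigma)     # high-frequency residual

def camera_fingerprint(frames):
    return np.mean([noise_residual(f) for f in frames], axis=0)

def correlation(a, b):
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def identify(query_frames, fingerprints):
    """fingerprints: dict mapping device name -> fingerprint array."""
    residual = np.mean([noise_residual(f) for f in query_frames], axis=0)
    return max(fingerprints, key=lambda k: correlation(residual, fingerprints[k]))
```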

Keywords: Digital video, forensics analysis, key frame, mobile device, PRNU, sensor noise, source identification.

4013 A Statistical Approach for Predicting and Optimizing Depth of Cut in AWJ Machining for 6063-T6 Al Alloy

Authors: Farhad Kolahan, A. Hamid Khajavi

Abstract:

In this paper, a set of experimental data has been used to assess the influence of abrasive water jet (AWJ) process parameters in cutting 6063-T6 aluminum alloy. The process variables considered here include nozzle diameter, jet traverse rate, jet pressure and abrasive flow rate. The effects of these input parameters are studied on depth of cut (h), one of the most important characteristics of AWJ machining. The Taguchi method and regression modeling are used to establish the relationships between input and output parameters. The adequacy of the model is evaluated using the analysis of variance (ANOVA) technique. In the next stage, the proposed model is embedded into a Simulated Annealing (SA) algorithm to optimize the AWJ process parameters. The objective is to determine a suitable set of process parameters that can produce a desired depth of cut, considering the ranges of the process parameters. Computational results prove the effectiveness of the proposed model and optimization procedure.
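To make the optimization stage concrete, the sketch below embeds a regression-type model for depth of cut in a simulated annealing search over the parameter box; the coefficients, parameter ranges and target depth are placeholders, not the paper's fitted model.

```python
# Simulated annealing over AWJ process parameters using a placeholder
# regression model for depth of cut; the goal is to hit a target depth.
import math
import random

def depth_of_cut(x):
    d, v, p, m = x        # nozzle diameter, traverse rate, pressure, abrasive flow
    return 0.8 * d - 0.02 * v + 0.015 * p + 0.3 * m      # placeholder model

BOUNDS = [(0.8, 1.2), (50, 400), (100, 350), (0.2, 0.8)]  # illustrative ranges
TARGET = 6.0                                              # desired depth of cut

def neighbour(x):
    i = random.randrange(len(x))
    lo, hi = BOUNDS[i]
    y = list(x)
    y[i] = min(hi, max(lo, y[i] + random.gauss(0.0, 0.05 * (hi - lo))))
    return y

def simulated_annealing(iters=5000, t0=1.0, alpha=0.999):
    x = [random.uniform(lo, hi) for lo, hi in BOUNDS]
    cost = abs(depth_of_cut(x) - TARGET)
    t = t0
    for _ in range(iters):
        y = neighbour(x)
        c = abs(depth_of_cut(y) - TARGET)
        if c < cost or random.random() < math.exp(-(c - cost) / t):
            x, cost = y, c                                # accept move
        t *= alpha                                        # cool down
    return x, cost

print(simulated_annealing())
```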

Keywords: AWJ machining, Mathematical modeling, Simulated Annealing, Optimization

4012 A Study on Evaluation of Strut Type Suspension Noise Caused by Rubber Degradation

Authors: Gugyong Kim, Sugnsu Kang, Yongjun Lee, Sooncheol Park, Wonwook Jung

Abstract:

When cars leave the factory, strut noises are very small and therefore difficult to perceive. As the time in use and travel distance increase, however, strut noises become larger and cause users considerable discomfort. The noises encountered in the field also include engine noises and flow noises, which makes it difficult to clearly discern the noises generated by the struts. This study developed a test method that can reproduce field strut noises in the laboratory. Using the newly developed noise evaluation test, this study analyzed the effects that insulator performance degradation and failure can have on car noises. The study also confirmed that an insulator durability test based on simple back-and-forth motion cannot completely reflect the state of parts failure in the field. Based on this, the study further confirmed that field noises can be reproduced through a durability test that considers heat aging.

Keywords: Insulator, noise, performance degradation, strut

4011 Aquatic Modeling: An Interplay between Scales

Authors: Christina G. Siontorou

Abstract:

This paper presents an integrated knowledge-based approach to multi-scale modeling of aquatic systems, with a view to enhancing predictive power and aiding environmental management and policy-making. The basic phases of this approach are exemplified for a bay in the Saronicos Gulf (Attiki, Greece). The results showed a significant problem with rising phytoplankton blooms linked to excessive microbial growth, arising mostly from increased nitrogen inflows; therefore, the nitrification/denitrification processes of the benthic and water-column sub-systems provided the quality variables to be monitored for assessing environmental status. It is thereby demonstrated that the proposed approach facilitates modeling choices and implementation decisions, while providing substantial support for capitalizing on knowledge and experience in long-term water management.

Keywords: Aquatic ecosystem, integrated modeling, multi-scale modeling, ontological platform.

4010 A New Heuristic Statistical Methodology for Optimizing Queuing Networks Using Discrete Event Simulation

Authors: Mohamad Mahdavi

Abstract:

Most real queuing systems have special properties and constraints that cannot be analyzed directly using the results of solved classical queuing models; the absence of Markov-chain features, non-exponential patterns and service constraints are examples of such conditions. This paper presents an applied general algorithm for analyzing and optimizing queuing systems. The stages of the algorithm are described through a real case study consisting of an almost completely non-Markovian system with a limited number of customers and limited capacities, as well as many of the common exceptions of real queuing networks. Simulation is used to optimize this system. The stages introduced in this article include primary modeling, determining the kind of queuing system, index definition, statistical analysis and goodness-of-fit testing, model validation, and optimization of the system by simulation.

Keywords: Estimation, queuing system, simulation model, probability distribution, non-Markov chain.

4009 Parallel Priority Region Approach to Detect Background

Authors: Sallama Athab, Hala Bahjat, Zhang Yinghui

Abstract:

Background detection is essential in video analysis; optimization is often needed in order to achieve real-time computation. Information gathered by dual cameras placed at the front and rear of an Autonomous Vehicle (AV) is integrated for background detection. In this paper, real-time computation is achieved in the proposed technique by using Priority Regions (PR) and parallel processing together: each frame is divided into regions, and each region is then processed in parallel. The PR division depends on the driver's view limitations. The background detection system is built on Temporal Difference (TD) and Gaussian Filtering (GF); the temporal-difference thresholds and the Gaussian sigma (weight) values are set per region, based on PR characteristics. The experimental results were obtained on real scenes. The speed and accuracy are compared with traditional background detection techniques, and the effectiveness of PR and parallel processing is also discussed in this paper.
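The sketch below illustrates the region-wise temporal-difference idea: each frame is split into regions, each region is Gaussian-smoothed and differenced against the previous frame with its own threshold, and the regions are handled in parallel. The region layout, sigmas and thresholds are illustrative, not the paper's calibrated priority regions.

```python
# Region-wise background/foreground detection: temporal difference of
# Gaussian-filtered regions with per-region thresholds, processed in parallel.
import numpy as np
from concurrent.futures import ThreadPoolExecutor
from scipy.ndimage import gaussian_filter

def detect_region(prev_region, cur_region, sigma, threshold):
    diff = np.abs(gaussian_filter(cur_region.astype(float), sigma)
                  - gaussian_filter(prev_region.astype(float), sigma))
    return diff > threshold                      # True where pixels changed

def detect_frame(prev_frame, cur_frame, n_rows=2, n_cols=2,
                 sigmas=(1.0, 1.0, 2.0, 2.0), thresholds=(10, 10, 20, 20)):
    h, w = cur_frame.shape
    rows = np.array_split(np.arange(h), n_rows)
    cols = np.array_split(np.arange(w), n_cols)
    slices, tasks = [], []
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            k = i * n_cols + j
            sl = (slice(r[0], r[-1] + 1), slice(c[0], c[-1] + 1))
            slices.append(sl)
            tasks.append((prev_frame[sl], cur_frame[sl], sigmas[k], thresholds[k]))
    mask = np.zeros_like(cur_frame, dtype=bool)
    with ThreadPoolExecutor() as pool:
        for sl, region_mask in zip(slices, pool.map(lambda t: detect_region(*t), tasks)):
            mask[sl] = region_mask
    return mask
```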

Keywords: Autonomous Vehicle, Background Detection, Dual Camera, Gaussian Filtering, Parallel Processing.

4008 Real Time Monitoring of Long Slender Shaft by Distributed-Lumped Modeling Techniques

Authors: Sina Babadi, K. M. Ebrahimi

Abstract:

The aim of this paper is to determine, in real time, the stress levels at the end of a long slender shaft, such as a drilling assembly used in the oil or gas industry, using a mathematical model. The torsional deflection experienced by this type of drilling shaft (about 4 km long, a 20 cm diameter hollow shaft with a wall thickness of 1 cm) can only be determined using a distributed modeling technique. The main objective of this project is to calculate the angular velocity and torque at the end of the shaft by the TLM method and to analyze the behavior of the system through its transient response. The obtained result is compared with a lumped modeling technique; the importance of these results becomes evident only after this comparison. The two approaches yield different transient responses, and because of the length of the shaft the transient response is very important in this project.

Keywords: Distributed Lumped modeling, Lumped modeling, Drill string, Angular Velocity, Torque.

4007 The Impact of Environmental Dynamism on Strategic Outsourcing Success

Authors: Mohamad Ghozali Hassan, Abdul Aziz Othman, Mohd Azril Ismail

Abstract:

Adapting quickly to environmental dynamism is essential for an organization to develop outsourcing strategy and management in order to sustain competitive advantage. This research used Partial Least Squares Structural Equation Modeling (PLS-SEM) to investigate the factors through which environmental dynamism impacts strategic outsourcing success among electrical and electronic manufacturing industries in outsourcing management. Statistical results confirm that the inclusion of customer demand, technological change, and competition level, as a new combined conception of environmental dynamism, has positive effects on outsourcing success. Additionally, this research demonstrates the suitability of PLS-SEM as a statistical analysis for furnishing a better understanding of environmental dynamism in outsourcing management in Malaysia. The practical findings contribute to academics and practitioners in the field of outsourcing management.

Keywords: Environmental Dynamism, Customer Demand, Technological Change, Competition Level, Outsourcing Success.

4006 An Approach to Correlate the Statistical-Based Lorenz Method, as a Way of Measuring Heterogeneity, with Kozeny-Carman Equation

Authors: H. Khanfari, M. Johari Fard

Abstract:

Dealing with carbonate reservoirs can be mind-boggling for reservoir engineers because of the various diagenetic processes that produce a variety of properties throughout the reservoir. A good estimate of reservoir heterogeneity, defined as the variation in rock properties with location in a reservoir or formation, can help in modeling the reservoir and thus offers a better understanding of its behavior. Most reservoirs are heterogeneous formations whose mineralogy, organic content, natural fractures, and other properties vary from place to place. Over the years, reservoir engineers have tried to establish methods to describe this heterogeneity, because heterogeneity is important in modeling reservoir flow and in well testing. Geological methods describe the variations in rock properties in terms of the similarity of the environments in which different beds were deposited. To characterize the vertical heterogeneity of a reservoir, two methods are generally used in petroleum work: the Dykstra-Parsons permeability variation (V) and the Lorenz coefficient (L), both of which are reviewed briefly in this paper. The Lorenz concept is based on statistics and has been used in petroleum engineering from that point of view. In this paper, we correlate the statistical Lorenz method with a petroleum concept, the Kozeny-Carman equation, and derive the straight-line Lorenz plot for a homogeneous system. Finally, we apply the two methods to a heterogeneous field in southern Iran and discuss each separately, with numbers and figures. As expected, these methods show great departure from homogeneity; therefore, for future investment, the reservoir needs to be treated carefully.
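For reference, the sketch below computes the Lorenz coefficient from layered permeability, porosity and thickness data: layers are ordered by flow capacity, cumulative flow capacity is plotted against cumulative storage capacity, and L is twice the area between that curve and the diagonal (L = 0 for a homogeneous system). The layer data are invented for illustration.

```python
# Lorenz coefficient of vertical heterogeneity from layer k, phi and h values.
import numpy as np

def lorenz_coefficient(k, phi, h):
    k, phi, h = map(np.asarray, (k, phi, h))
    order = np.argsort(k / phi)[::-1]                 # most conductive layers first
    flow = np.cumsum(k[order] * h[order]) / np.sum(k * h)
    storage = np.cumsum(phi[order] * h[order]) / np.sum(phi * h)
    flow = np.concatenate(([0.0], flow))
    storage = np.concatenate(([0.0], storage))
    area_under_curve = np.trapz(flow, storage)
    return 2.0 * (area_under_curve - 0.5)

k = [250.0, 80.0, 15.0, 3.0]        # permeability, md (illustrative)
phi = [0.22, 0.18, 0.12, 0.08]      # porosity, fraction
h = [2.0, 3.0, 4.0, 1.5]            # thickness, m
print(lorenz_coefficient(k, phi, h))  # 0 = homogeneous, values near 1 = very heterogeneous
```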

Keywords: Carbonate reservoirs, heterogeneity, homogeneous system, Dykstra-Parsons permeability variations (V), Lorenz coefficient (L).

4005 The Performance Analysis of Error Saturation Nonlinearity LMS in Impulsive Noise based on Weighted-Energy Conservation

Authors: T. Panigrahi, G. Panda, Mulgrew

Abstract:

This paper introduces a new approach for the performance analysis of an adaptive filter with error saturation nonlinearity in the presence of impulsive noise. The performance analysis of adaptive filters includes both a transient analysis, which shows how fast a filter learns, and a steady-state analysis, which shows how well it learns. Recursive expressions for the mean-square deviation (MSD) and the excess mean-square error (EMSE) are derived based on weighted-energy-conservation arguments, which provide the transient behavior of the adaptive algorithm. The steady-state analysis is carried out for correlated input regressor data, so this approach leads to new performance results without restricting the input regression data to be white.
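The sketch below shows the kind of adaptive filter being analyzed, in its simplest form: an LMS update in which the raw error is passed through a saturation (clipping) nonlinearity before updating the weights, which limits the influence of impulsive samples. The step size, saturation level and impulsive-noise model are illustrative choices, not the paper's analysis setup.

```python
# LMS with an error saturation nonlinearity, identified on data corrupted by
# occasional large impulses plus background Gaussian noise.
import numpy as np

def saturated_lms(x, d, num_taps=8, mu=0.01, sat=1.0):
    """x: input signal, d: desired signal; returns final weights and error trace."""
    w = np.zeros(num_taps)
    errors = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]        # regressor [x[n], ..., x[n-7]]
        e = d[n] - w @ u
        errors[n] = e
        e_sat = np.clip(e, -sat, sat)              # error saturation nonlinearity
        w = w + mu * e_sat * u
    return w, errors

rng = np.random.default_rng(2)
N = 5000
x = rng.normal(size=N)
w_true = np.array([1.0, -0.5, 0.25, 0.0, 0.1, 0.0, 0.0, 0.0])
d = np.convolve(x, w_true, mode="full")[:N]
impulses = (rng.random(N) < 0.01) * rng.normal(scale=20.0, size=N)   # impulsive noise
w_hat, _ = saturated_lms(x, d + 0.05 * rng.normal(size=N) + impulses)
print(np.round(w_hat, 2))                          # close to w_true despite impulses
```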

Keywords: Error saturation nonlinearity, transient analysis, impulsive noise.

4004 Recognition of Noisy Words Using the Time Delay Neural Networks Approach

Authors: Khenfer-Koummich Fatima, Mesbahi Larbi, Hendel Fatiha

Abstract:

This paper presents a recognition system for isolated words, such as robot commands, built with Time Delay Neural Networks (TDNN). The goal is to teleoperate a robot for specific tasks such as turn, close, etc., in an industrial environment, taking into account the noise coming from the machines. The choice of TDNN is based on its generalization ability in terms of accuracy; moreover, it acts as a filter that passes certain desirable frequency characteristics of speech. The aim is to determine the parameters of this filter so as to make the system adaptable to the variability of the speech signal and, especially, to noise; for this, the back-propagation technique was used in the learning phase. The approach was applied to commands pronounced separately in two languages, French and Arabic. The results for two test sets of 300 spoken words each are 87% and 97.6% in a neutral environment, and 77.67% and 92.67% when white Gaussian noise was added at an SNR of 35 dB.

Keywords: Neural networks, Noise, Speech Recognition.

4003 Union is Strength in Lossy Image Compression

Authors: Mario Mastriani

Abstract:

In this work, we present a comparison between different techniques of image compression. First, the image is divided into blocks, which are organized according to a certain scan. Then several compression techniques are applied, combined or alone; such techniques include wavelets (Haar basis), the Karhunen-Loève Transform, etc. Simulations show that the combined versions are the best, with lower Mean Squared Error (MSE), higher Peak Signal-to-Noise Ratio (PSNR) and better image quality, even in the presence of noise.
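A small worked example of one of the compared building blocks is given below: block-wise compression with an orthonormal Haar basis, where small transform coefficients are zeroed and the reconstruction quality is reported as PSNR. The block size, threshold and random test image are illustrative; the combined schemes and scans from the paper are not reproduced here.

```python
# Block Haar transform, coefficient thresholding (the lossy step) and PSNR.
import numpy as np

def haar_matrix(n=8):
    h = np.array([[1.0]])
    while h.shape[0] < n:
        h = np.vstack([np.kron(h, [1.0, 1.0]),
                       np.kron(np.eye(h.shape[0]), [1.0, -1.0])])
    return h / np.sqrt((h * h).sum(axis=1, keepdims=True))   # orthonormal rows

def compress_block(block, H, threshold):
    coeffs = H @ block @ H.T
    coeffs[np.abs(coeffs) < threshold] = 0.0                  # discard small coefficients
    return H.T @ coeffs @ H

def psnr(original, reconstructed, peak=255.0):
    mse = np.mean((original - reconstructed) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(3)
image = rng.integers(0, 256, size=(64, 64)).astype(float)
H = haar_matrix(8)
out = np.zeros_like(image)
for i in range(0, 64, 8):
    for j in range(0, 64, 8):
        out[i:i + 8, j:j + 8] = compress_block(image[i:i + 8, j:j + 8], H, threshold=20.0)
print(psnr(image, out))
```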

Keywords: Haar's basis, Image compression, Karhunen-Loève Transform, Morton's scan, row-rafter scan.

4002 Methodologies, Systems Development Life Cycle and Modeling Languages in Agile Software Development

Authors: I. D. Arroyo

Abstract:

This article seeks to integrate different concepts from contemporary software engineering with an agile development approach. We clarify some definitions and uses, distinguish between the Systems Development Life Cycle (SDLC) and methodologies, and differentiate types of frameworks, such as methodological, philosophical and behavioral frameworks, standards and documentation. We define relationships based on documenting the development process through formal and ad hoc models, and we describe the usefulness of DevOps and Agile Modeling as methodologies that integrate principles and best practices.

Keywords: Methodologies, SDLC, modeling languages, agile modeling, DevOps, UML, agile software development.

4001 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution can assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model: the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown in the Ricciardi theorem. We then develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse with simulated data the computational problems associated with the parameters, an issue of great importance in its application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler; given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.

Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trends functions, bi-parameters Weibull density function.

4000 An Energy Detection-Based Algorithm for Cooperative Spectrum Sensing in Rayleigh Fading Channel

Authors: H. Bakhshi, E. Khayyamian

Abstract:

Cognitive radios have been recognized as one of the most promising technologies for dealing with the scarcity of the radio spectrum. In cognitive radio systems, secondary users are allowed to utilize the frequency bands of primary users when those bands are idle. Hence, how to accurately detect the idle frequency bands has attracted many researchers' interest. Detection performance is sensitive to noise power and gain fluctuation. Since the signal-to-noise ratio (SNR) between the primary user and the secondary users is not the same and changes over time, SNR and noise power estimation are essential. In this paper, we present a cooperative spectrum sensing algorithm using SNR estimation to improve detection performance in realistic situations.
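A minimal sketch of the sensing chain is shown below: each secondary user compares the energy of its received samples against a threshold computed from its estimated noise power and a target false-alarm probability, and the fusion center combines the binary decisions with an OR rule. Sample counts, channel gains and the false-alarm target are illustrative assumptions.

```python
# Energy detection per secondary user plus OR-rule cooperative fusion.
import numpy as np
from scipy.stats import norm

def energy_detect(samples, noise_power, pfa=0.01):
    """Return True if the band is declared occupied by the primary user."""
    n = len(samples)
    energy = np.sum(np.abs(samples) ** 2)
    # Threshold from the Gaussian approximation of the chi-square statistic.
    threshold = noise_power * (n + norm.isf(pfa) * np.sqrt(2.0 * n))
    return energy > threshold

def cooperative_or_fusion(decisions):
    return any(decisions)

rng = np.random.default_rng(4)
n, noise_power = 1000, 1.0
signal = np.sqrt(0.2) * rng.normal(size=n)                 # weak primary-user signal
decisions = []
for gain in (0.3, 1.0, 1.6):                               # fixed illustrative channel gains
    received = gain * signal + rng.normal(size=n)
    decisions.append(energy_detect(received, noise_power))
print(cooperative_or_fusion(decisions))
```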

Keywords: Cognitive radio, cooperative spectrum sensing, energy detection, SNR estimation, spectrum sensing, Rayleigh fading channel.

3999 Statistical Models of Network Traffic

Authors: Barath Kumar, Oliver Niggemann, Juergen Jasperneite

Abstract:

Model-based approaches have been applied successfully to a wide range of tasks such as specification, simulation, testing, and diagnosis. But one bottleneck often prevents the introduction of these ideas: manual modeling is a non-trivial, time-consuming task. Automatically deriving models by observing and analyzing running systems is one possible way to ease this bottleneck. To derive a model automatically, some a priori knowledge about the model structure, i.e. about the system, must exist. Such a model formalism would be used as follows: (i) by observing the network traffic, a model of the long-term system behavior could be generated automatically, (ii) test vectors can be generated from the model, (iii) while the system is running, the model could be used to diagnose abnormal system behavior. The main contribution of this paper is the introduction of a model formalism called 'probabilistic regression automaton' suitable for the tasks mentioned above.

Keywords: Model-based approach, Probabilistic regression automata, Statistical models and Timed automata.

3998 Modeling and Analysis of SVPWM Based Dynamic Voltage Restorer

Authors: Ahmed Helal, Sherif Zain Elabideen, Ahmed Lotfy

Abstract:

In this paper, the modeling and analysis of a Space Vector Pulse Width Modulation (SVPWM) based Dynamic Voltage Restorer (DVR) using PSCAD/EMTDC software is presented in detail. The simulation includes full modeling of the SVPWM technique used to control the DVR inverter. A test power system composed of a three-phase voltage source, a sag generator, the DVR and a three-phase resistive load is used to demonstrate the restoration capability of the DVR. The simulation results of the presented DVR show excellent voltage-sag mitigation, protecting sensitive loads.

Keywords: Dynamic voltage restorer, power quality, simulation and modeling, voltage sag.

3997 Alternative Methods to Rank the Impact of Object Oriented Metrics in Fault Prediction Modeling using Neural Networks

Authors: Kamaldeep Kaur, Arvinder Kaur, Ruchika Malhotra

Abstract:

The aim of this paper is to rank the impact of object-oriented (OO) metrics in fault prediction modeling using Artificial Neural Networks (ANNs). Past studies on empirical validation of object-oriented metrics as fault predictors using ANNs have focused on the predictive quality of neural networks versus standard statistical techniques. In this empirical study, we turn our attention to the capability of ANNs to rank the impact of these explanatory metrics on fault proneness. In the ANN data analysis approach, there is no obvious method for ranking the impact of individual metrics. Five ANN-based techniques that rank object-oriented metrics in predicting the fault proneness of classes are studied: i) the overall connection weights method, ii) Garson's method, iii) the partial derivatives method, iv) the input perturbation method, and v) the classical stepwise method. We develop and evaluate different prediction models based on the rankings of the metrics produced by the individual techniques. The models based on the overall connection weights and partial derivatives methods were found to be the most accurate.
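As a concrete example of one ranking technique named above, the sketch below implements Garson's method for a single-hidden-layer network: the absolute input-to-hidden weights are scaled by the hidden-to-output weights, normalized per hidden unit, and summed into a relative importance per input metric. The weight matrices are invented for illustration.

```python
# Garson's method: relative importance of input metrics from trained weights.
import numpy as np

def garson_importance(w_ih, w_ho):
    """w_ih: (n_inputs, n_hidden) weights; w_ho: (n_hidden,) weights to the output."""
    contrib = np.abs(w_ih) * np.abs(w_ho)[None, :]           # input-hidden contributions
    contrib = contrib / contrib.sum(axis=0, keepdims=True)   # share within each hidden unit
    importance = contrib.sum(axis=1)
    return importance / importance.sum()

# Illustrative weights for 4 OO metrics (e.g. CBO, WMC, RFC, LCOM) and 3 hidden units.
w_ih = np.array([[0.9, -0.2, 0.4],
                 [0.1,  0.7, -0.3],
                 [0.5,  0.5,  0.6],
                 [-0.1, 0.1,  0.05]])
w_ho = np.array([0.8, -0.6, 0.4])
ranking = garson_importance(w_ih, w_ho)
print(np.argsort(ranking)[::-1], np.round(ranking, 3))        # metrics ordered by impact
```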

Keywords: Artificial Neural Networks (ANNs), Backpropagation, Fault Prediction Modeling.

3996 Object-Oriented Programming for Modeling and Simulation of Systems in Physiology

Authors: J. Fernandez de Canete

Abstract:

Object-oriented modeling is spreading in the simulation of physiological systems through the use of the individual components of the model and their interconnections to define the underlying dynamic equations. In this paper, we describe the use of both the SIMSCAPE and MODELICA simulation environments in the object-oriented modeling of the closed-loop cardiovascular system. The performance of the controlled system was analyzed by simulation in light of the existing hypotheses and validation tests previously performed with physiological data. The described approach represents a valuable tool in the teaching of physiology for graduate medical students.

Keywords: Object-Oriented Modeling, SIMSCAPE Simulation Language, MODELICA Simulation Language, Cardiovascular System.

3995 Active Surface Tracking Algorithm for All-Fiber Common-Path Fourier-Domain Optical Coherence Tomography

Authors: Bang Young Kim, Sang Hoon Park, Chul Gyu Song

Abstract:

A conventional optical coherence tomography (OCT) system has a limited imaging depth of 1-2 mm and suffers from unwanted noise such as speckle noise. A motorized-stage-based OCT system using a common-path Fourier-domain optical coherence tomography (CP-FD-OCT) configuration provides enhanced imaging depth and less noise, so these limitations can be overcome. Using this OCT system, OCT images were obtained from an onion and its subsurface structure was observed. The images obtained with the developed motorized-stage-based system showed greater imaging depth than the conventional system, since it performs accurate real-time depth tracking. Consequently, the developed CP-FD-OCT system and algorithms have good potential for the further development of endoscopic OCT for microsurgery.

Keywords: Common-path OCT, FD-OCT, OCT, Tracking algorithm.

3994 Differential Protection for Power Transformer Using Wavelet Transform and PNN

Authors: S. Sendilkumar, B. L. Mathur, Joseph Henry

Abstract:

A new approach for the protection of power transformers is presented using a time-frequency transform known as the wavelet transform. Different operating conditions, such as inrush, normal, load, external fault and internal fault currents, are sampled and processed to obtain wavelet coefficients. Different operating conditions produce variations in the wavelet coefficients. Features such as energy and standard deviation are calculated using Parseval's theorem. These features are used as inputs to a PNN (probabilistic neural network) for fault classification. The proposed algorithm provides more accurate results even in the presence of noisy inputs and accurately identifies inrush and fault currents. The overall classification accuracy of the proposed method is found to be 96.45%. Simulation of the faults (with and without noise) was done using MATLAB and Simulink, taking a data window of 2 cycles (40 ms) containing 800 samples. The algorithm was evaluated using 10% Gaussian white noise.
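The feature-extraction step can be sketched as below: a two-cycle current window is decomposed with a db9 wavelet, and the energy and standard deviation of the coefficients at each level form the feature vector that would feed the classifier (the PNN itself is not shown). The synthetic inrush-like waveform, sampling rate and decomposition level are illustrative assumptions.

```python
# Wavelet (db9) energy and standard-deviation features from a current window.
import numpy as np
import pywt

def wavelet_features(window, wavelet="db9", level=4):
    coeffs = pywt.wavedec(window, wavelet, level=level)
    features = []
    for c in coeffs:                        # [approximation, detail_L, ..., detail_1]
        features.append(np.sum(c ** 2))     # energy of the sub-band
        features.append(np.std(c))          # spread of the sub-band
    return np.array(features)

fs, f0, cycles = 20000, 50, 2               # 800 samples over 2 cycles of 50 Hz
t = np.arange(cycles * fs // f0) / fs
inrush_like = np.abs(np.sin(2 * np.pi * f0 * t)) * np.exp(-t / 0.05)   # asymmetric, decaying
noisy = inrush_like + 0.1 * np.random.default_rng(5).normal(size=t.size)
print(wavelet_features(noisy).round(3))
```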

Keywords: Power transformer, differential protection, internal fault, inrush current, wavelet energy, db9.

3993 Feature Point Reduction for Video Stabilization

Authors: Theerawat Songyot, Tham Manjing, Bunyarit Uyyanonvara, Chanjira Sinthanayothin

Abstract:

Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and should be performed at a reasonable rate. This paper presents an algorithm for discarding irrelevant feature points and maintaining the rest for future use so as to improve the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling. Corner detection is required only when the feature points are insufficiently accurate for future modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, thus significantly improving the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, further reducing the computational cost. In addition, the feature points remaining after reduction are sufficient for background object tracking, as demonstrated in the simple video stabilizer based on our proposed algorithm.
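The motion-model stage described above can be sketched as follows: a simplified affine model is fit to matched feature points by least squares, points with large studentized-style residuals are discarded as moving-object outliers, and the fit repeats until no outliers remain. The residual test is a crude stand-in for proper studentized residuals, and the threshold and synthetic points are illustrative.

```python
# Iterative least-squares affine fit with residual-based outlier rejection.
import numpy as np

def fit_affine(src, dst):
    """src, dst: (N, 2) matched points; returns 2x3 A with dst ≈ A @ [x, y, 1]."""
    X = np.hstack([src, np.ones((len(src), 1))])          # (N, 3) design matrix
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)           # (3, 2) solution
    return A.T

def robust_affine(src, dst, threshold=2.5):
    keep = np.ones(len(src), dtype=bool)
    while True:
        A = fit_affine(src[keep], dst[keep])
        pred = (A @ np.hstack([src, np.ones((len(src), 1))]).T).T
        r = np.linalg.norm(dst - pred, axis=1)
        sigma = r[keep].std(ddof=1)
        outliers = keep & (r > threshold * sigma)         # crude studentized-residual test
        if not outliers.any():
            return A, keep
        keep &= ~outliers

rng = np.random.default_rng(6)
src = rng.uniform(0, 640, size=(60, 2))
true_A = np.array([[1.0, 0.02, 5.0], [-0.02, 1.0, -3.0]])  # small rotation plus shift
dst = (true_A @ np.hstack([src, np.ones((60, 1))]).T).T + rng.normal(0, 0.5, (60, 2))
dst[:5] += 40.0                                            # five "moving object" outliers
A_hat, inliers = robust_affine(src, dst)
print(np.round(A_hat, 3), int(inliers.sum()))
```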

Keywords: background object tracking, feature point reduction, low cost tracking, video stabilization.

3992 Low-Noise Amplifier Design for Improvement of Communication Range for Wake-up Receiver Based Wireless Sensor Network Application

Authors: Ilef Ketata, Mohamed Khalil Baazaoui, Robert Fromm, Ahmad Fakhfakh, Faouzi Derbel

Abstract:

The integration of wireless communication, e.g. in real- or quasi-real-time applications, faces many challenges such as energy consumption, communication range, latency, quality of service, and reliability. Improving wireless sensor network performance starts with enhancing the capabilities of each sensor node. While consuming less energy, wake-up receiver (WuRx) nodes also help reduce latency. A solution for improving the sensitivity of sensor nodes, and of WuRx nodes in particular, at the expense of energy consumption is a low-noise amplifier (LNA) block placed at the RF antenna. This paper presents a comparative study of improving the communication range and decreasing the energy consumption of WuRx nodes.

Keywords: Wireless sensor network, wake-up receiver, duty-cycled, low-noise amplifier, envelope detector, range study.
