Search results for: collocation errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 487


217 Application of Artificial Neural Network for the Prediction of Pressure Distribution of a Plunging Airfoil

Authors: F. Rasi Maezabadi, M. Masdari, M. R. Soltani

Abstract:

A series of experimental tests was conducted on a section of a 660 kW wind turbine blade to measure the pressure distribution of this model oscillating in plunging motion. In order to minimize the amount of data required to predict the aerodynamic loads of the airfoil, a General Regression Neural Network, GRNN, was trained using the measured experimental data. The network, once proved to be accurate enough, was used to predict the flow behavior of the airfoil for the desired conditions. Results showed that, using only a few of the acquired data points, the trained neural network was able to predict results with minimal errors when compared with the corresponding measured values. Therefore, with this trained network, the aerodynamic coefficients of the plunging airfoil are predicted accurately at different oscillation frequencies, amplitudes, and angles of attack, hence reducing the cost of tests while achieving acceptable accuracy.
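
The abstract does not reproduce the authors' implementation, so the following is only a minimal sketch of how a GRNN (essentially Nadaraya-Watson kernel regression) can map a few measured operating points to a pressure quantity; the input features, target values and smoothing parameter sigma below are hypothetical.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """General Regression Neural Network (Nadaraya-Watson kernel regression):
    each training sample is a pattern unit and the prediction is a
    Gaussian-weighted average of the training targets."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    preds = []
    for x in np.atleast_2d(X_query):
        d2 = np.sum((X_train - x) ** 2, axis=1)       # squared distances to pattern units
        w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))  # weighted average of targets
    return np.array(preds)

# Hypothetical training points: (reduced frequency, amplitude in deg, mean angle of attack)
# mapped to a measured pressure coefficient at one tap.
X = np.array([[0.05, 5.0, 8.0], [0.05, 5.0, 12.0], [0.10, 5.0, 8.0], [0.10, 7.5, 12.0]])
y = np.array([-1.2, -1.8, -1.1, -2.0])
print(grnn_predict(X, y, [[0.08, 6.0, 10.0]]))
```

The width sigma controls how strongly nearby measured conditions are averaged; in practice it would be chosen by cross-validation against held-out experimental data.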

Keywords: Airfoil, experimental, GRNN, Neural Network, Plunging.

216 The Use of Classifiers in Image Analysis of Oil Wells Profiling Process and the Automatic Identification of Events

Authors: Jaqueline M. R. Vieira

Abstract:

Different strategies and tools are available in the oil and gas industry for detecting and analyzing tension and possible fractures in borehole walls. Most of these techniques are based on manual observation of the captured borehole images. While this strategy may be possible and convenient with small images and few data, it becomes difficult and prone to errors when large databases of images must be treated. Moreover, the patterns may differ across the image area, depending on many characteristics (drilling strategy, rock components, rock strength, etc.). In this work we propose the inclusion of data-mining classification strategies in order to create a knowledge database of the segmented curves. These classifiers allow the system, after some time of use and manual marking of the parts of borehole images that correspond to tension regions and breakout areas, to automatically indicate and suggest new candidate regions with higher accuracy. We suggest the use of different classification methods in order to achieve different knowledge dataset configurations.
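
As a rough illustration of this kind of classifier-assisted suggestion (not the authors' pipeline), a generic supervised classifier can be trained on features of manually labelled curve segments and then used to rank new segments; the feature vectors, labels and the random-forest choice below are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical feature vectors extracted from segmented borehole-image curves
# (e.g. orientation, curvature, intensity statistics); labels come from manual marking.
rng = np.random.default_rng(5)
X = rng.normal(size=(500, 8))
y = rng.integers(0, 3, size=500)          # 0 = background, 1 = tension region, 2 = breakout

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # accuracy of the learnt knowledge base

# After fitting on the manually labelled archive, new segments are ranked by predicted
# class probability to suggest candidate tension/breakout regions for review.
clf.fit(X, y)
candidate_scores = clf.predict_proba(rng.normal(size=(10, 8)))[:, 1]
```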

Keywords: Brazil, classifiers, data-mining, Image Segmentation, oil well visualization.

215 Influence of Noise on the Inference of Dynamic Bayesian Networks from Short Time Series

Authors: Frank Emmert Streib, Matthias Dehmer, Gökhan H. Bakır, Max Mühlhauser

Abstract:

In this paper we investigate the influence of external noise on the inference of network structures. The purpose of our simulations is to gain insight into the experimental design of microarray experiments used to infer, e.g., transcription regulatory networks. Here, external noise means that the dynamics of the system under investigation, e.g., temporal changes of mRNA concentration, are affected by measurement errors. In addition to external noise, another problem occurs in the context of microarray experiments: practically, it is not possible to monitor the mRNA concentration over an arbitrarily long time period, as demanded by the statistical methods used to learn the underlying network structure. For this reason, we use only short time series to make our simulations more biologically plausible.

Keywords: Dynamic Bayesian networks, structure learning, gene networks, Markov chain Monte Carlo, microarray data.

214 Mathematical Modeling of Gas Turbine Blade Cooling

Authors: A. Pashayev, C. Ardil, D. Askerov, R. Sadiqov, A. Samedov

Abstract:

In contrast to existing methods, which do not take into account multiconnectivity in the broad sense of this term, we develop mathematical models and a highly effective combined (BIEM and FDM) numerical method for calculating the stationary and quasi-stationary temperature field of the profile part of a blade with convective cooling (from the point of view of realization on a PC). The theoretical substantiation of these methods is proved by appropriate theorems. To this end, converging quadrature processes have been developed and error estimates in terms of A. Zygmund continuity moduli have been obtained. For visualization of the profiles, the following are used: the least squares method with automatic conjugation, spline devices, smooth replenishment, and neural nets. Boundary conditions of heat exchange are determined from the solution of the corresponding integral equations and from empirical relationships. The reliability of the designed methods is proved by computational and experimental investigations of the heat and hydraulic characteristics of the gas turbine first stage nozzle blade.
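
The combined BIEM/FDM formulation itself is not given in the abstract; as a toy illustration of the finite-difference part only, the sketch below iterates the steady heat-conduction (Laplace) equation on a rectangular slab with assumed gas-side and coolant-side boundary temperatures.

```python
import numpy as np

def steady_temperature_fdm(nx=40, ny=40, t_gas=1200.0, t_cool=600.0, iters=5000):
    """Toy FDM illustration: Jacobi iteration for the steady (Laplace) temperature
    field on a rectangular slab, outer edges held at the gas-side temperature and
    inner edges at the cooling-air temperature. The real blade section is multiply
    connected and is coupled with boundary integral equations (BIEM)."""
    T = np.full((ny, nx), 0.5 * (t_gas + t_cool))
    T[0, :], T[-1, :] = t_gas, t_gas      # gas-side boundaries
    T[:, 0], T[:, -1] = t_cool, t_cool    # coolant-side boundaries
    for _ in range(iters):
        # five-point stencil update of the interior nodes
        T[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:])
    return T

print(steady_temperature_fdm()[20, 20])   # interior temperature of the toy slab
```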

Keywords: Mathematical Modeling, Gas Turbine Blade Cooling, Neural Networks, BIEM and FDM.

213 The Contribution of Edgeworth, Bootstrap and Monte Carlo Methods in Financial Data

Authors: Edlira Donefski, Tina Donefski, Lorenc Ekonomi

Abstract:

Edgeworth approximation, bootstrap and Monte Carlo simulations have a considerable impact on achieving certain results related to different problems taken into study. In our paper, we have treated a financial case related to the effect that the components of the cash flow of one of the most successful businesses in the world, namely the financing activity, operating activity and investing activity, have on the cash and cash equivalents at the end of the three-month period. To get a better view of this case we have created a Vector Autoregression model, and after that we have generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulations and residual bootstrap based on the standard errors of every series created. The generated results showed common tendencies for the three methods applied, which consequently verified the advantage of the three methods in the optimization of the model that contains many variants.
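
The data and exact bootstrap scheme are not given in the abstract; the sketch below shows, on synthetic placeholder series, how a VAR can be fitted with statsmodels, its impulse responses computed, and percentile bands obtained from a simplified (fixed-design, non-recursive) residual resampling scheme.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical quarterly cash-flow components; the study's actual series are not reproduced here.
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(60, 3)),
                    columns=["operating", "investing", "financing"])

fit = VAR(data).fit(2)                    # VAR(2) on the three components
irf_point = fit.irf(8).irfs               # point impulse responses (used by the asymptotic analysis)

# Simplified residual bootstrap: resample the fitted residuals, rebuild the series on the
# fixed design, re-estimate the VAR and recompute the impulse responses.
resid = fit.resid.to_numpy()
fitted = fit.fittedvalues.to_numpy()
boot_irfs = []
for _ in range(200):
    idx = rng.integers(0, len(resid), size=len(resid))
    y_star = pd.DataFrame(fitted + resid[idx], columns=data.columns)
    boot_irfs.append(VAR(y_star).fit(2).irf(8).irfs)
lower, upper = np.percentile(boot_irfs, [2.5, 97.5], axis=0)   # percentile bands
```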

Keywords: Autoregression, Bootstrap, Edgeworth Expansion, Monte Carlo Method.

212 Kernel Matching versus Inverse Probability Weighting: A Comparative Study

Authors: Andy Handouyahia, Tony Haddad, Frank Eaton

Abstract:

Recent quasi-experimental evaluation of the Canadian Active Labour Market Policies (ALMP) by Human Resources and Skills Development Canada (HRSDC) has provided an opportunity to examine alternative methods for estimating the incremental effects of Employment Benefits and Support Measures (EBSMs) on program participants. The focus of this paper is to assess the efficiency and robustness of inverse probability weighting (IPW) relative to kernel matching (KM) in the estimation of program effects. To accomplish this objective, the authors compare pairs of 1,080 estimates, along with their associated standard errors, to assess which type of estimate is generally more efficient and robust. In the interest of practicality, the authors also document the computational time it took to produce the IPW and KM estimates, respectively.
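
The EBSM data are not available here; the following sketch only illustrates, on generic synthetic arrays, how the two competing kinds of ATT estimate are typically formed: odds-weighted IPW and Gaussian kernel matching on an estimated propensity score. The logistic propensity model and the bandwidth are placeholders, not the authors' specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ipw_att(X, treat, y):
    """ATT by inverse probability weighting: controls are re-weighted by the
    odds of their estimated propensity score."""
    ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]
    w = ps / (1.0 - ps)
    return y[treat == 1].mean() - np.average(y[treat == 0], weights=w[treat == 0])

def kernel_matching_att(X, treat, y, bandwidth=0.06):
    """ATT by kernel matching: each treated unit is compared with a Gaussian
    kernel-weighted average of controls close in propensity score."""
    ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]
    p1, p0 = ps[treat == 1], ps[treat == 0]
    y1, y0 = y[treat == 1], y[treat == 0]
    effects = [yt - np.average(y0, weights=np.exp(-0.5 * ((p0 - p) / bandwidth) ** 2))
               for p, yt in zip(p1, y1)]
    return float(np.mean(effects))

# Hypothetical usage with synthetic data:
rng = np.random.default_rng(6)
X = rng.normal(size=(1000, 4))
treat = rng.integers(0, 2, size=1000)
y = rng.normal(size=1000) + treat
print(ipw_att(X, treat, y), kernel_matching_att(X, treat, y))
```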

Keywords: Treatment effect, causal inference, observational studies, Propensity score based matching, Kernel Matching, Inverse Probability Weighting, Estimation methods for incremental effect.

211 Automatic Inspection of Percussion Caps by Means of Combined 2D and 3D Machine Vision Techniques

Authors: A. Tellaeche, R. Arana, I. Maurtua

Abstract:

Exhaustive quality control is becoming more and more important when commercializing competitive products in the world's globalized market. Taking this affirmation as an undeniable truth, it becomes critical in certain market sectors that must meet the strictest quality requirements. One of these examples is percussion cap mass production, a critical element assembled in firearm ammunition. These elements, built in great quantities at a very high speed, must achieve a minimum tolerance deviation in their fabrication, due to their vital importance in firing the piece of ammunition into which they are built. This paper outlines a machine vision development for the 100% inspection of percussion caps, obtaining data from simultaneous 2D and 3D images. The acquisition speed and precision of these images of a reflective metallic piece such as a percussion cap, the accuracy of the measurements taken from these images, and the multiple fabrication errors detected constitute the main findings of this work.

Keywords: critical tolerance, high speed decision making, simultaneous 2D/3D machine vision.

210 New Technologies for Modeling of Gas Turbine Cooled Blades

Authors: A. Pashayev, D. Askerov, R. Sadiqov, A. Samedov, C. Ardil

Abstract:

In contrast to existing methods, which do not take into account multiconnectivity in the broad sense of this term, we develop mathematical models and a highly effective combined (BIEM and FDM) numerical method for calculating the stationary and quasi-stationary temperature field of the profile part of a blade with convective cooling (from the point of view of realization on a PC). The theoretical substantiation of these methods is proved by appropriate theorems. To this end, converging quadrature processes have been developed and error estimates in terms of A. Zygmund continuity moduli have been obtained. For visualization of the profiles, the following are used: the least squares method with automatic conjugation, spline devices, smooth replenishment, and neural nets. Boundary conditions of heat exchange are determined from the solution of the corresponding integral equations and from empirical relationships. The reliability of the designed methods is proved by computational and experimental investigations of the heat and hydraulic characteristics of the gas turbine 1st stage nozzle blade.

Keywords: multiconnected systems, method of the boundary integrated equations, splines, neural networks.

209 Numerical Modeling of Gas Turbine Engines

Authors: A. Pashayev, D. Askerov, C. Ardil, R. Sadiqov

Abstract:

In contrast to existing methods, which do not take into account multiconnectivity in the broad sense of this term, we develop mathematical models and a highly effective combined (BIEM and FDM) numerical method for calculating the stationary and quasi-stationary temperature field of the profile part of a blade with convective cooling (from the point of view of realization on a PC). The theoretical substantiation of these methods is proved by appropriate theorems. To this end, converging quadrature processes have been developed and error estimates in terms of A. Zygmund continuity moduli have been obtained. For visualization of the profiles, the following are used: the least squares method with automatic conjugation, spline devices, smooth replenishment, and neural nets. Boundary conditions of heat exchange are determined from the solution of the corresponding integral equations and from empirical relationships. The reliability of the designed methods is proved by computational and experimental investigations of the heat and hydraulic characteristics of the gas turbine first stage nozzle blade.

Keywords: Multiconnected systems, method of the boundary integrated equations, splines, neural networks.

208 Design and Analysis of Gauge R&R Studies: Making Decisions Based on ANOVA Method

Authors: Afrooz Moatari Kazerouni

Abstract:

In a competitive production environment, critical decisions are based on data resulting from random sampling of product units. The efficiency of these decisions depends on data quality and reliability. This point leads to the necessity of a reliable measurement system; therefore, the process of examining and analysing measurement errors constitutes what is known as Measurement System Analysis (MSA). The aim of this research is to determine the necessity of, and assurance for, extensive development in analysing measurement systems, particularly with the use of Repeatability and Reproducibility Gauges (GR&R) to improve physical measurements. Nowadays, repeatability and reproducibility gauges are well established in manufacturing industries, but they are not applied as widely as other measurement system analysis methods. To become familiar with this method and gain feedback for improving measurement systems, this survey focuses on the ANOVA method as the most widespread way of calculating Repeatability and Reproducibility (R&R).
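
As a sketch of the ANOVA method referred to above (not the authors' worked example), the variance components of a balanced, crossed Gauge R&R study can be recovered from a two-way ANOVA table; the column names part, operator and measurement are assumptions.

```python
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def gauge_rr_anova(df, part="part", operator="operator", y="measurement"):
    """Variance components of a balanced crossed Gauge R&R study via the ANOVA method."""
    model = smf.ols(f"{y} ~ C({part}) * C({operator})", data=df).fit()
    tab = anova_lm(model, typ=2)
    ms = lambda term: tab.loc[term, "sum_sq"] / tab.loc[term, "df"]   # mean squares
    ms_p, ms_o = ms(f"C({part})"), ms(f"C({operator})")
    ms_po, ms_e = ms(f"C({part}):C({operator})"), ms("Residual")
    p, o = df[part].nunique(), df[operator].nunique()
    r = len(df) / (p * o)                                 # replicates per part-operator cell
    var_repeat = ms_e                                     # repeatability (equipment variation)
    var_interact = max((ms_po - ms_e) / r, 0.0)
    var_operator = max((ms_o - ms_po) / (p * r), 0.0)     # reproducibility components
    var_part = max((ms_p - ms_po) / (o * r), 0.0)
    grr = var_repeat + var_operator + var_interact
    return {"repeatability": var_repeat,
            "reproducibility": var_operator + var_interact,
            "GRR": grr,
            "part-to-part": var_part,
            "%contribution GRR": 100.0 * grr / (grr + var_part)}
```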

Keywords: Analysis of Variance (ANOVA), Measurement System Analysis (MSA), Part-Operator interaction effect, Repeatability and Reproducibility.

207 Computational Evaluation of a C-A Heat Pump

Authors: Young-Jin Baik, Minsung Kim, Young-Soo Lee, Ki-Chang Chang, Seong-Ryong Park

Abstract:

The compression-absorption heat pump (C-A HP), one of the promising heat recovery technologies that produce process hot water using the low-temperature heat of wastewater, was evaluated by computer simulation. A simulation program was developed based on continuity and the first and second laws of thermodynamics. Both the absorber and desorber were modeled using the UA-LMTD method. In order to prevent an unfeasible temperature profile and to reduce calculation errors arising from the curved temperature profile of a mixture, the heat loads were divided into many segments. A single-stage compressor was considered. The compressor cooling load was also taken into account. The isentropic efficiency was computed from the map data. Simulation conditions were given based on a system consisting of ordinarily designed components. The simulation results show that most of the total entropy generation occurs during the compression and cooling process, thus suggesting the possibility that system performance can be enhanced if a rectifier is introduced.

Keywords: Waste heat recovery, Heat Pump.

206 Discrete Polynomial Moments and Savitzky-Golay Smoothing

Authors: Paul O'Leary, Matthew Harker

Abstract:

This paper presents a unified theory for local (Savitzky-Golay) and global polynomial smoothing. The algebraic framework can represent any polynomial approximation and is seamless from low-degree local to high-degree global approximations. The representation of the smoothing operator as a projection onto orthonormal basis functions enables the computation of: the covariance matrix for noise propagation through the filter; the noise gain; and the frequency response of the polynomial filters. A virtually perfect Gram polynomial basis is synthesized, whereby polynomials of degree d = 1000 can be synthesized without significant errors. The perfect basis ensures that the filters are strictly polynomial preserving. Given n points and a support length ls = 2m + 1, the smoothing operator is strictly linear phase for the points xi, i = m+1, ..., n-m. The method is demonstrated on geometric surface data lying on an invariant 2D lattice.
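
The paper's high-degree Gram basis is built by direct synthesis; the sketch below only illustrates the projection viewpoint on a modest scale, building an orthonormal discrete polynomial basis by QR factorization and applying the resulting projection row as a local smoother (for the classical local variant, scipy.signal.savgol_filter is the standard routine).

```python
import numpy as np

def gram_basis(n, degree):
    """Orthonormal discrete polynomial basis on n points, obtained here by QR
    factorization of a Vandermonde matrix (adequate for modest degrees; the paper
    synthesizes Gram polynomials directly to reach very high degrees)."""
    x = np.linspace(-1.0, 1.0, n)
    Q, _ = np.linalg.qr(np.vander(x, degree + 1, increasing=True))
    return Q                                        # columns are orthonormal basis functions

def local_poly_smooth(y, window=11, degree=3):
    """Savitzky-Golay style smoothing viewed as a projection: each window is
    projected onto the polynomial basis and the centre value of the fit is kept."""
    m = window // 2
    G = gram_basis(window, degree)
    centre_row = (G @ G.T)[m]                       # centre row of the projection operator
    y = np.asarray(y, dtype=float)
    out = y.copy()
    for i in range(m, len(y) - m):                  # strictly linear-phase interior points
        out[i] = centre_row @ y[i - m:i + m + 1]
    return out

noisy = np.sin(np.linspace(0, 6, 200)) + 0.1 * np.random.default_rng(7).normal(size=200)
smooth = local_poly_smooth(noisy, window=21, degree=3)
```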

Keywords: Gram polynomials, Savitzky-Golay Smoothing, Discrete Polynomial Moments

205 DRE - A Quality Metric for Component based Software Products

Authors: K. S. Jasmine, R. Vasantha

Abstract:

The overriding goal of software engineering is to provide a high-quality system, application or product. To achieve this goal, software engineers must apply effective methods coupled with modern tools within the context of a mature software process [2]. In addition, they must also assure that high quality is realized. Although many quality measures can be collected at the project level, the important measures are errors and defects. Deriving a quality measure for reusable components has proven to be a challenging task nowadays. The results obtained from the study are based on the empirical evidence of reuse practices, as emerged from the analysis of industrial projects. Both large and small companies, working in a variety of business domains and using object-oriented and procedural development approaches, contributed to this study. This paper proposes a quality metric that provides benefit at both the project and process level, namely defect removal efficiency (DRE).
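
The abstract does not restate the formula, but defect removal efficiency is conventionally computed as DRE = E / (E + D), where E is the number of errors found before delivery and D the number of defects found after delivery; the sketch below uses hypothetical counts.

```python
def defect_removal_efficiency(errors_before_release, defects_after_release):
    """DRE = E / (E + D): the fraction of problems removed before delivery,
    where E are errors found in-project and D defects reported after release."""
    e, d = errors_before_release, defects_after_release
    return e / (e + d) if (e + d) > 0 else 1.0

# Hypothetical counts: 120 errors found during reviews and testing, 8 defects after release.
print(defect_removal_efficiency(120, 8))   # 0.9375
```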

Keywords: Software Reuse, Defect density, Reuse metrics, Defect Removal efficiency.

204 Supervisory Fuzzy Learning Control for Underwater Target Tracking

Authors: C. Kia, M. R. Arshad, A. H. Adom, P. A. Wilson

Abstract:

This paper presents recent work on the improvement of a robotic vision-based control strategy for an underwater pipeline tracking system. The study focuses on developing image processing algorithms and a fuzzy inference system for the analysis of the terrain. The main goal is to implement the supervisory fuzzy learning control technique to reduce navigation decision errors due to the pipeline occlusion problem. The system developed is capable of interpreting underwater images containing occluded pipeline, seabed and other unwanted noise. The algorithm proposed in previous work does not explore the cooperation between fuzzy controllers, knowledge and learnt data to improve the outputs for underwater pipeline tracking. Computer simulations and prototype simulations demonstrate the effectiveness of this approach. The system accuracy level is also discussed.

Keywords: Fuzzy logic, Underwater target tracking, Autonomous underwater vehicles, Artificial intelligence, Simulations, Robot navigation, Vision system.

203 Planar Tracking Control of an Underactuated Autonomous Underwater Vehicle

Authors: Santhakumar M., Asokan T.

Abstract:

This paper addresses the problem of trajectory tracking control of an underactuated autonomous underwater vehicle (AUV) in the horizontal plane. The underwater vehicle under consideration is not actuated in the sway direction, and the system matrices are not assumed to be diagonal and linear, as often found in the literature. In addition, the effect of constant bias of environmental disturbances is considered. Using backstepping techniques and the tracking error dynamics, the system states are stabilized by forcing the tracking errors to an arbitrarily small neighborhood of zero. The effectiveness of the proposed control method is demonstrated through numerical simulations. Simulations are carried out for an experimental vehicle for smooth, inertial, two dimensional (2D) reference trajectories such as constant velocity trajectory (a circle maneuver – constant yaw rate), and time varying velocity trajectory (a sinusoidal path – sinusoidal yaw rate).

Keywords: autonomous underwater vehicle, system matrices, tracking control, time-varying feedback, underactuated control.

202 Dissolved Oxygen Prediction Using Support Vector Machine

Authors: Sorayya Malek, Mogeeb Mosleh, Sharifah M. Syed

Abstract:

In this study, the Support Vector Machine (SVM) technique was applied to predict the dichotomized value of dissolved oxygen (DO) for two freshwater lakes, namely Chini and Bera Lake (Malaysia). The data sample contained 11 water quality parameters from 2005 until 2009. All data parameters were used to predict the dissolved oxygen concentration, which was dichotomized into 3 different levels (High, Medium, and Low). The input parameters were ranked, and a forward selection method was applied to determine the optimum parameters that yield the lowest errors and highest accuracy. Initial results showed that pH, water temperature, and conductivity are the most important parameters that significantly affect the prediction of DO. Then, an SVM model using the ANOVA kernel with those parameters was applied and yielded a 74% accuracy rate. We concluded that using SVM models to predict DO is feasible, and that using the dichotomized value of DO yields higher prediction accuracy than using the precise DO value.
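
As an illustration only (scikit-learn does not ship the ANOVA kernel used in the paper, so an RBF kernel stands in, and the data below are synthetic placeholders for the ranked parameters such as pH, water temperature and conductivity):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Synthetic placeholders: columns stand in for pH, water temperature and conductivity,
# labels for the three discretized DO levels (0 = Low, 1 = Medium, 2 = High).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = rng.integers(0, 3, size=200)

# The paper uses an ANOVA kernel; RBF is used here as a stand-in.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())
```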

Keywords: Dissolved oxygen, Water quality, DO prediction, Support Vector Machine.

201 Determinants of Profitability in Indian Pharmaceutical Firms in the New Intellectual Property Rights Regime

Authors: Shilpi Tyagi, D. K. Nauriyal

Abstract:

This study investigates the firm-level determinants of profitability in the Indian drug and pharmaceutical industry. The study uses inflation-adjusted panel data for the period 2000-2013 and applies an OLS regression model with Driscoll-Kraay standard errors. It has been found that export intensity, A&M intensity, the firm's market power and a stronger patent regime dummy have exercised a positive influence on profitability. The negative and statistically significant influence of R&D intensity and raw material import intensity points to the need for firms to adopt suitable investment strategies. The study suggests that firms need to pay far more attention to optimizing their operating expenditures, advertisement and marketing expenditures, and to improving their export orientation, as part of their long-term strategy.
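
A minimal sketch of this estimation setup, using the linearmodels package, in which the kernel covariance option provides Driscoll-Kraay-type standard errors; the panel, variable names and specification below are placeholders, not the study's data.

```python
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

# Hypothetical firm-year panel; the variable names only mimic the kind of regressors in the study.
rng = np.random.default_rng(2)
firms, years = 50, 14
idx = pd.MultiIndex.from_product([range(firms), range(2000, 2000 + years)],
                                 names=["firm", "year"])
df = pd.DataFrame({
    "profitability":    rng.normal(size=firms * years),
    "export_intensity": rng.normal(size=firms * years),
    "rd_intensity":     rng.normal(size=firms * years),
    "am_intensity":     rng.normal(size=firms * years),
}, index=idx)

# OLS on the panel with Driscoll-Kraay (kernel) standard errors, which are robust to
# heteroskedasticity, autocorrelation and cross-sectional dependence.
res = PanelOLS.from_formula(
    "profitability ~ 1 + export_intensity + rd_intensity + am_intensity",
    data=df).fit(cov_type="kernel")
print(res.std_errors)
```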

Keywords: Indian drug and pharmaceutical industry, trade related intellectual property rights, research and development, food and drug administration.

200 Real-Time Image Encryption Using a 3D Discrete Dual Chaotic Cipher

Authors: M. F. Haroun, T. A. Gulliver

Abstract:

In this paper, an encryption algorithm is proposed for real-time image encryption. The scheme employs a dual chaotic generator based on a three dimensional (3D) discrete Lorenz attractor. Encryption is achieved using non-autonomous modulation where the data is injected into the dynamics of the master chaotic generator. The second generator is used to permute the dynamics of the master generator using the same approach. Since the data stream can be regarded as a random source, the resulting permutations of the generator dynamics greatly increase the security of the transmitted signal. In addition, a technique is proposed to mitigate the error propagation due to the finite precision arithmetic of digital hardware. In particular, truncation and rounding errors are eliminated by employing an integer representation of the data which can easily be implemented. The simple hardware architecture of the algorithm makes it suitable for secure real-time applications.

Keywords: Chaotic systems, image encryption, 3D Lorenz attractor, non-autonomous modulation, FPGA.

199 Linear Quadratic Gaussian/Loop Transfer Recover Control Flight Control on a Nonlinear Model

Authors: T. Sanches, K. Bousson

Abstract:

As part of the development of a 4D autopilot system for unmanned aerial vehicles (UAVs), i.e. a time-dependent robust trajectory generation and control algorithm, this work addresses the problem of optimal path control based on flight sensor data output that may be unreliable due to noise in data acquisition and/or transmission under certain circumstances. Although several filtering methods, such as the Kalman-Bucy filter or the Linear Quadratic Gaussian/Loop Transfer Recovery (LQG/LTR) control, are available, the utter complexity of the control system, together with the robustness and reliability required of such a system on a UAV for airworthiness-certifiable autonomous flight, required the development of a proper robust filter for a nonlinear system as a way of further mitigating error propagation to the control system and improving its performance. As such, a nonlinear algorithm based upon the LQG/LTR, validated through computational simulation testing, is proposed in this paper.

Keywords: Autonomous flight, LQG/LTR, nonlinear state estimator, robust flight control and stability.

198 A Self Adaptive Genetic Based Algorithm for the Identification and Elimination of Bad Data

Authors: A. A. Hossam-Eldin, E. N. Abdallah, M. S. El-Nozahy

Abstract:

The identification and elimination of bad measurements is one of the basic functions of a robust state estimator, as bad data have the effect of corrupting the results of state estimation obtained with the popular weighted least squares method. However, this is a difficult problem to handle, especially when dealing with multiple errors of the interacting, conforming type. In this paper, a self-adaptive genetic-based algorithm is proposed. The algorithm utilizes the results of the classical linearized normal residuals approach to tune the genetic operators; thus, instead of making a randomized search throughout the whole search space, it performs a more directed search, and the optimum solution is obtained at very early stages (a maximum of 5 generations). The algorithm utilizes accumulating databases of already computed cases to reduce the computational burden to a minimum. Tests are conducted with reference to the standard IEEE test systems. Test results are very promising.

Keywords: Bad Data, Genetic Algorithms, Linearized Normal residuals, Observability, Power System State Estimation.

197 Multipurpose Cadastre, Essential for Urban Development Plans in Iran

Authors: Mehrshad Khalaj, Elham Lashkari

Abstract:

The majority of research conducted on Iranian urban development plans indicates that they have been largely unsuccessful in terms of drafting, execution and goal achievement. A lack or shortage of essential statistics and information can be listed as an important reason for the failure of these plans. The lack of figures and information has become an obvious problem for the country's statistics officials. This problem has forced urban planners themselves to embark on physical surveys, including real estate and land pricing and population and economic censuses of the city. Apart from the problems facing urban developers, the possibility of errors is high in such surveys. In the present article, applying the interview technique, it is argued that utilizing a multipurpose cadastre system as a land information system is essential for urban development plans in Iran. It can minimize or even remove the failures facing urban development plans.

Keywords: Multipurpose Cadastre, Urban Development Plan (UDP), Land Information System (LIS), Interview Technique.

196 Development of a Fiber based Interferometric Sensor for Non-contact Displacement Measurement

Authors: S. Pullteap

Abstract:

In this paper, a fiber-based Fabry-Perot interferometer is proposed and demonstrated for non-contact displacement measurement. A micro-prism attached to the mechanical vibrator serves as the target reflector. The interference signal is generated from the superposition of the sensing beam and the reference beam within the sensing arm of the fiber sensor. This signal is then converted to a displacement value by a program developed in visual Cµ programming, with a resolution of λ/8. A classical function generator is used to control the vibrator. With a fixed excitation frequency of 100 Hz and excitation amplitudes varied over the range 0.1 – 3 V, the output displacements measured by the fiber sensor range from 1.55 μm to 30.225 μm. A reference displacement sensor with a sensitivity of ~0.4 μm is also employed to compare the displacement errors between the two sensors. We found that, over the entire displacement range, maximum and average measurement errors of 0.977% and 0.44%, respectively, are obtained.
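
The λ/8 fringe-subdivision logic of the authors' program is not described in the abstract; the sketch below shows the simpler zero-crossing idea on a synthetic signal, where each full fringe corresponds to λ/2 of target displacement, so each zero crossing contributes λ/4. The 1550 nm wavelength and the motion parameters are assumptions.

```python
import numpy as np

def fringe_displacement(signal, wavelength=1.55e-6):
    """Estimate displacement from a Fabry-Perot fringe signal by counting zero
    crossings of the AC-coupled signal. One full fringe corresponds to a lambda/2
    change in cavity length, so each zero crossing contributes lambda/4
    (the paper subdivides further to reach lambda/8)."""
    s = np.asarray(signal, dtype=float)
    s = s - s.mean()                                     # remove the DC offset
    crossings = np.count_nonzero(np.diff(np.signbit(s)))
    return crossings * wavelength / 4.0

# Synthetic check over a quarter vibration period (monotonic motion) at 100 Hz,
# assuming a 1550 nm source and a 10 um peak displacement.
t = np.linspace(0.0, 0.0025, 5000)
d = 10e-6 * np.sin(2 * np.pi * 100 * t)                  # hypothetical target motion
phase = 4 * np.pi * d / 1.55e-6                          # two-pass optical phase
print(fringe_displacement(np.cos(phase)))                # ~10 um, quantized to lambda/4
```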

Keywords: Non-contact displacement measurement, extrinsic fiber-based Fabry-Perot interferometer, interference signal, zero-crossing fringe counting technique.

195 Control of Commutation of SR Motor Using Its Magnetic Characteristics and Back-of-Core Saturation Effects

Authors: Dr. N.H. Mvungi

Abstract:

The control of commutation of a switched reluctance (SR) motor has normally depended on a physical position detector. The physical rotor position sensor limits robustness and increases the size and inertia of the SR drive system. The paper describes a method to overcome these limitations by using the magnetization characteristics of the motor to indicate the rotor and stator teeth overlap status. The method uses active current probing pulses of the same magnitude to determine the flux linkage in the winding being probed. A microprocessor is used for processing the magnetization data to deduce the rotor-stator teeth overlap status and hence the rotor position. However, back-of-core saturation and mutual coupling introduce overlap detection errors, and hence commutation control errors. This paper presents the concept of the detection scheme and the effects of back-of-core saturation.

Keywords: Microprocessor control, rotor position, sensorless, switched reluctance.

194 Nodal Load Profiles Estimation for Time Series Load Flow Using Independent Component Analysis

Authors: Mashitah Mohd Hussain, Salleh Serwan, Zuhaina Hj Zakaria

Abstract:

This paper presents a method to estimate load profiles from multiple power flow solutions for every minute over 24 hours per day. A method to calculate multiple solutions of the nonlinear profile is introduced. Power System Simulation for Engineering (PSS®E) and Python have been used to solve the load power flow. The results of these power flow solutions have been used to estimate the load profiles for each load bus using Independent Component Analysis (ICA), without any knowledge of the parameters and network topology of the system. The proposed algorithm is tested on the IEEE 69-bus test system, which represents the distribution part, and the ICA method has been programmed in MATLAB R2012b. Simulation results and estimation errors are discussed in this paper.
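
As a sketch of the blind-estimation step only (not the PSS®E/MATLAB workflow), FastICA can recover independent load-profile components, up to scale, sign and ordering, from measurements that are modelled as linear mixtures; the profiles and mixing matrix below are synthetic.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic setup: 1440 one-minute snapshots of six measurements that are modelled as
# linear mixtures of three independent nodal load profiles.
rng = np.random.default_rng(3)
t = np.arange(1440) / 60.0                               # hours over one day
sources = np.column_stack([
    1.0 + 0.5 * np.sin(2 * np.pi * t / 24),              # slow daily profile
    np.clip(rng.normal(1.0, 0.3, size=t.size), 0, None), # noisy industrial-type load
    (t > 18).astype(float),                              # evening step load
])
mixing = rng.uniform(0.2, 1.0, size=(6, 3))              # unknown "mixing" by the network
observations = sources @ mixing.T + 0.01 * rng.normal(size=(t.size, 6))

ica = FastICA(n_components=3, random_state=0)
estimated_profiles = ica.fit_transform(observations)     # recovered up to scale, sign and order
```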

Keywords: Electrical Distribution System, Power Flow Solution, Distribution Network, Independent Component Analysis, Newton Raphson, Power System Simulation for Engineering.

193 Effects of Incident Angle and Distance on Visible Light Communication

Authors: Taegyoo Woo, Jong Kang Park, Jong Tae Kim

Abstract:

Visible Light Communication (VLC) provides wireless communication features in illumination systems. One of the key applications is to recognize the user location by means of indoor illuminators such as light emitting diodes. For localization of individual receivers in these systems, we usually assume that receivers and transmitters are placed in parallel. However, it is difficult to satisfy this assumption because the receivers move randomly in real cases. It is therefore necessary to analyze the case where the transmitter is not placed perfectly parallel to the receiver. It is also important to identify changes in optical gain with the tilt angles and distances of the receivers relative to the illuminators. In this paper, we simulate the optical gain for various cases in which the tilt of the receiver and the distance change. Then, we identify the changing patterns of optical gain according to the tilt angle of a receiver and the distance. These results can help many VLC applications understand the extent of location errors with regard to the optical gains of the receivers and identify the root cause.
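
The abstract does not give the simulation model; the sketch below assumes the standard line-of-sight Lambertian LED channel model, in which the optical gain falls off with the square of the distance, with the irradiance angle raised to the Lambertian order, and with the cosine of the incidence angle up to the receiver field of view. All numeric parameters are illustrative.

```python
import numpy as np

def vlc_dc_gain(d, phi_deg, psi_deg, area=1e-4, half_power_deg=60.0, fov_deg=70.0,
                ts=1.0, g_conc=1.0):
    """Line-of-sight DC gain of a Lambertian LED link:
    H(0) = (m + 1) * A / (2 * pi * d^2) * cos(phi)^m * Ts * g * cos(psi) for psi <= FOV, else 0,
    where phi is the irradiance angle at the LED and psi the incidence angle at the photodiode."""
    m = -np.log(2.0) / np.log(np.cos(np.radians(half_power_deg)))   # Lambertian order
    phi, psi = np.radians(phi_deg), np.radians(psi_deg)
    gain = (m + 1) * area / (2 * np.pi * d ** 2) * np.cos(phi) ** m * ts * g_conc * np.cos(psi)
    return np.where(psi <= np.radians(fov_deg), gain, 0.0)

# Optical gain versus receiver tilt (incidence angle) at a fixed 2.5 m distance.
print(vlc_dc_gain(2.5, phi_deg=20.0, psi_deg=np.arange(0, 91, 10)))
```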

Keywords: Visible light communication, optical channel, indoor positioning, Lambertian radiation.

192 Study of Remote Sensing and Satellite Images Ability in Preparing Agricultural Land Use Map (ALUM)

Authors: Ali Gholami

Abstract:

In this research, the preparation of a land use map from LISS-III scanner satellite data of the IRS for the Aghche region in Isfahan province is studied carefully. For this purpose, IRS satellite images from August 2008 were used, and the various land uses in the region, including rangelands, irrigated farming, dry farming, gardens and urban areas, were separated and identified. GPS and Erdas Imagine software were used, and three methods, Maximum Likelihood, Mahalanobis Distance and Minimum Distance, were analyzed. For each of these methods, the error matrix and Kappa index were calculated, and accuracies of 53.13%, 56.64% and 48.44%, respectively, were obtained. Considering the low accuracy of these methods in separating land uses, visual interpretation of the map was used. Finally, 150 randomly selected points were visited in the field and no error was observed, which shows that the map prepared by visual interpretation has high accuracy. Although errors due to visual interpretation and geometric correction might occur, the desired map accuracy of more than 85 percent is reliable.
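
For reference, the overall accuracy and Kappa index quoted above are the standard quantities computed from a classification error (confusion) matrix; the sketch below uses a hypothetical matrix, not the study's.

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Kappa index from a classification error (confusion) matrix,
    rows = reference classes, columns = mapped classes."""
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    po = np.trace(c) / n                                   # observed (overall) accuracy
    pe = np.sum(c.sum(axis=0) * c.sum(axis=1)) / n ** 2    # chance agreement
    return po, (po - pe) / (1.0 - pe)

# Hypothetical 3-class error matrix (e.g. rangeland, farming, urban) for one classifier.
cm = [[50, 10, 5],
      [12, 40, 8],
      [6,   9, 60]]
print(accuracy_and_kappa(cm))
```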

Keywords: Land use map, Aghche Region, Erdas Imagine, satellite images

191 Assessment of Menus in a Selected Social Welfare Home with Regard to Nutritional Recommendations

Authors: E. Grochowska-Niedworok, K. Brukalo, B. Całyniuk, J. Piekorz, M. Kardas

Abstract:

The aim of the study was to assess the diets of residents of nursing homes. Ten-day menus provided by a social welfare home were entered into the computer program Diet 5 and analyzed with respect to protein, fats, carbohydrates, energy, vitamin D and calcium. The resulting mean values of the 10-day menus were compared with the existing Nutrition Standards for the Polish population. The analysis of the menus showed that the average amount of energy supplied by the food is not sufficient. The carbohydrate supply is too high, representing 257% of the standard. The average amounts of fats and proteins supplied with the food are adequate, at 85.2 g/day and 75.2 g/day, respectively. The calcium content of the diet is 513.9 mg/day. The amount of vitamin D supplied in the age group 51-65 years is 2.3 µg/day. The dietary errors that have been shown are due to the lack of detailed nutritional guidelines for nursing homes, as well as for state-owned care facilities in general.

Keywords: Assessment of diet, essential nutrients, social welfare home, nutrition.

190 A Study of Adaptive Fault Detection Method for GNSS Applications

Authors: Je Young Lee, Hee Sung Kim, Kwang Ho Choi, Joonhoo Lim, Sebum Chun, Hyung Keun Lee

Abstract:

This study aims to develop an efficient fault detection method for Global Navigation Satellite System (GNSS) applications based on adaptive noise covariance estimation. Due to the dependence on radio frequency signals, GNSS measurements are dominated by systematic errors in the receiver's operating environment. In the proposed method, the pseudorange and carrier-phase measurement noise covariances are obtained at the time propagations and measurement updates of Carrier-Smoothed Code (CSC) filtering, respectively. The test statistics for fault detection are generated from the estimated measurement noise covariances. To evaluate the fault detection capability, intentional faults were added to the field-collected measurements. The experimental results show that the proposed method is efficient in detecting unhealthy measurements and improves GNSS positioning accuracy against fault occurrences.
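
The paper's covariance estimator and test statistics are not spelled out in the abstract; the sketch below only illustrates the underlying CSC (Hatch) filter for one satellite together with a simple innovation-based fault flag, with the window length and threshold as assumptions.

```python
import numpy as np

def hatch_filter_with_fault_test(pseudoranges, carrier_phases, window=100, k_sigma=5.0):
    """Carrier-Smoothed Code (Hatch) filtering for one satellite, plus a simple
    innovation test: the code-minus-carrier-predicted innovation is compared with an
    adaptively estimated noise standard deviation (cycle slips are ignored here)."""
    rho = np.asarray(pseudoranges, dtype=float)
    phi = np.asarray(carrier_phases, dtype=float)        # carrier phase converted to metres
    smoothed, innovations, flags = [rho[0]], [0.0], [False]
    for k in range(1, len(rho)):
        n = min(k + 1, window)
        predicted = smoothed[-1] + (phi[k] - phi[k - 1])  # time propagation via the carrier delta
        innov = rho[k] - predicted                        # measurement-update innovation
        sigma = np.std(innovations) if k > 10 else np.inf # adaptive noise estimate
        flags.append(abs(innov) > k_sigma * sigma)        # fault-detection test statistic
        innovations.append(innov)
        smoothed.append(predicted + innov / n)            # Hatch filter update
    return np.array(smoothed), np.array(flags)
```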

Keywords: Adaptive estimation, fault detection, GNSS, residual.

189 A Hybrid Gene Selection Technique Using Improved Mutual Information and Fisher Score for Cancer Classification Using Microarrays

Authors: M. Anidha, K. Premalatha

Abstract:

Feature selection is significant in order to perform constructive classification in the area of cancer diagnosis. However, in microarray gene expression datasets, the large number of features compared to the number of samples makes the task of classification computationally very hard and prone to errors. In this paper, we present an innovative method for selecting highly informative gene subsets of gene expression data that effectively classify the cancer data into tumorous and non-tumorous. The hybrid gene selection technique combines Mutual Information and the Fisher score to select informative genes. The gene selection is validated by classification using a Support Vector Machine (SVM), a supervised learning algorithm capable of solving complex classification problems. The combination of improved Mutual Information and Fisher score with an SVM classifier has produced efficient results.
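
A rough sketch of this kind of hybrid filter ranking (the paper's exact combination rule and improved mutual information estimator are not given in the abstract): mutual information and an ANOVA F statistic, used here as a proxy for the Fisher score, are combined by rank sum, and the selected genes are fed to an SVM. In practice the selection should sit inside the cross-validation loop to avoid selection bias.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic microarray-like data: many genes, few samples.
rng = np.random.default_rng(4)
X = rng.normal(size=(60, 2000))                    # 60 samples x 2000 genes
y = rng.integers(0, 2, size=60)                    # tumorous vs non-tumorous labels

mi = mutual_info_classif(X, y, random_state=0)     # mutual information per gene
f_stat, _ = f_classif(X, y)                        # ANOVA F statistic (Fisher-score proxy)

# Combine the two rankings by rank sum and keep the best-ranked genes.
combined_rank = np.argsort(np.argsort(-mi)) + np.argsort(np.argsort(-f_stat))
top_genes = np.argsort(combined_rank)[:50]

clf = SVC(kernel="linear", C=1.0)
print(cross_val_score(clf, X[:, top_genes], y, cv=5).mean())
```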

Keywords: Gene selection, mutual information, Fisher score, classification, SVM.

188 Filtering and Reconstruction System for Gray Forensic Images

Authors: Ahd Aljarf, Saad Amin

Abstract:

Images are an important source of information used as evidence during any investigation process. Their clarity and accuracy are essential and of the utmost importance for any investigation. Images are vulnerable to losing blocks and having noise added to them, either after alteration or when the image was taken initially; therefore, having a high-performance image processing system, and implementing it, is very important from a forensic point of view. This paper focuses on improving the quality of forensic images. For different reasons, packets that store data can be affected, harmed or even lost because of noise. For example, sending the image through a wireless channel can cause loss of bits. These types of errors can degrade the visual display quality of forensic images. Two of the image problems, noise and block loss, are covered. Information transmitted through any means of communication may suffer alteration from its original state or even lose important data due to channel noise. Therefore, a developed system is introduced to improve the quality and clarity of forensic images.

Keywords: Image Filtering, Image Reconstruction, Image Processing, Forensic Images.
