Search results for: real decisions
2138 Experimental Analysis of a Diesel Hydrotreating Reactor to Develop a Simplified Tool for Process Real-Time Optimization
Authors: S. Shokri, S. Zahedi, M. Ahmadi Marvast, B. Baloochi, H. Ganji
Abstract:
In this research, a systematic investigation was carried out to determine the optimum conditions of an HDS (hydrodesulfurization) reactor, and a suitable model was developed for a rigorous real-time optimization (RTO) loop of the HDS process. A systematic series of experiments was designed based on central composite design (CCD) and carried out in the related pilot plant to tune the developed model. The design variables in the experiments were temperature, LHSV and pressure, while the hydrogen-to-fresh-feed ratio was kept constant. The ranges of these variables were 320-380 °C, 1-2 1/hr and 50-55 bar, respectively. A power-law kinetic model was also developed for further research. The reaction order (power of the reactant concentration), activation energy and frequency factor of this model were 1.4, 92.66 kJ/mol and k0 = 2.7×10^9, respectively.
Keywords: Statistical model, Multiphase Reactors, Gas oil, Hydrodesulfurization, Optimization, Kinetics
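A minimal sketch (not from the paper) of how the reported kinetic parameters could be combined into a power-law rate expression; the sulfur concentration and the evaluation temperatures below are illustrative assumptions.

```python
import numpy as np

R = 8.314e-3          # kJ/(mol*K), universal gas constant
k0 = 2.7e9            # frequency factor reported in the abstract
Ea = 92.66            # kJ/mol, activation energy reported in the abstract
n = 1.4               # reaction order reported in the abstract

def hds_rate(T_celsius, C_sulfur):
    """Power-law HDS rate r = k0 * exp(-Ea / (R*T)) * C^n (illustrative units)."""
    T = T_celsius + 273.15
    k = k0 * np.exp(-Ea / (R * T))
    return k * C_sulfur ** n

# Example: rate at the two ends of the studied temperature range (C_sulfur assumed)
for T in (320.0, 380.0):
    print(f"T = {T:5.1f} C  ->  r = {hds_rate(T, C_sulfur=1.0):.3e}")
```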
2137 A Nodal Transmission Pricing Model based on Newly Developed Expressions of Real and Reactive Power Marginal Prices in Competitive Electricity Markets
Authors: Ashish Saini, A.K. Saxena
Abstract:
In competitive electricity markets all over the world, the adoption of a suitable transmission pricing model is a problem, as the transmission segment still operates as a monopoly. Transmission pricing is an important tool to promote investment in various transmission services in order to provide economic, secure and reliable electricity to bulk and retail customers. Nodal pricing based on the short-run marginal cost (SRMC) is found extremely useful by researchers for sending correct economic signals. The marginal prices must be determined as part of the solution to an optimization problem, i.e., maximizing the social welfare. The need to maximize social welfare subject to a number of system operational constraints is a major challenge from computational and societal points of view. The purpose of this paper is to present a nodal transmission pricing model based on SRMC by developing new mathematical expressions for real and reactive power marginal prices using a GA-Fuzzy based optimal power flow framework. The impacts of selecting different social welfare functions on power marginal prices are analyzed and verified against results reported in the literature. Network revenues for two different power systems are determined using the expressions derived in this paper for real and reactive power marginal prices.
Keywords: Deregulation, electricity markets, nodal pricing, social welfare function, short run marginal cost.
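The short-run marginal cost idea can be illustrated with a toy merit-order dispatch, where the marginal price is approximated by re-solving the dispatch for one extra MW of demand; this is a hedged illustration only, not the GA-Fuzzy optimal power flow framework of the paper, and the generator data are assumptions.

```python
def dispatch_cost(demand, gens):
    """Merit-order dispatch of generators [(capacity_MW, cost_per_MWh), ...]; returns total cost."""
    cost, remaining = 0.0, demand
    for cap, price in sorted(gens, key=lambda g: g[1]):
        used = min(cap, remaining)
        cost += used * price
        remaining -= used
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("demand exceeds available capacity")
    return cost

gens = [(100, 20.0), (80, 35.0), (50, 60.0)]   # illustrative capacities and costs
demand = 150.0
# SRMC = extra cost of serving one more MW (the 35 $/MWh unit is marginal here)
srmc = dispatch_cost(demand + 1.0, gens) - dispatch_cost(demand, gens)
print(f"SRMC at {demand} MW demand: {srmc:.2f} $/MWh")
```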
2136 Noise Performance Optimization of a Fast Wavelength Calibration Algorithm for OSAs
Authors: Thomas Fuhrmann
Abstract:
A new fast correlation algorithm for calibrating the wavelength of optical spectrum analyzers (OSAs) was introduced in [1]. The minima of acetylene gas spectra were measured and correlated with stored theoretical data [2], making it possible to find the correct wavelength calibration data from a noisy reference spectrum. First tests showed good algorithmic performance for gas line spectra with high noise. In this article, extensive performance tests were carried out to validate the noise resistance of this algorithm. The filter and correlation parameters of the algorithm were optimized for improved noise performance. With these parameters, the performance of the wavelength calibration was simulated to predict the resulting wavelength error in real OSA systems. Long-term simulations were made to evaluate the performance of the algorithm over the lifetime of a real OSA.
Keywords: Correlation, gas reference, optical spectrum analyzer, wavelength calibration.
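A minimal sketch of the correlation idea behind such a wavelength calibration: a measured (noisy, shifted) absorption spectrum is cross-correlated with a stored reference to recover the wavelength offset. The synthetic gas-line spectrum, line positions and noise level below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(1510.0, 1540.0, 3000)                              # nm grid
line_centers = np.array([1512.4, 1517.3, 1521.8, 1528.0, 1533.5])   # hypothetical gas lines

def spectrum(grid, centers, width=0.05):
    """Absorption spectrum with Gaussian dips at the given line centers."""
    s = np.zeros_like(grid)
    for c in centers:
        s -= np.exp(-0.5 * ((grid - c) / width) ** 2)
    return s

reference = spectrum(wl, line_centers)
true_shift_nm = 0.37                                 # unknown calibration offset to recover
measured = spectrum(wl, line_centers + true_shift_nm) + 0.2 * rng.standard_normal(wl.size)

# Cross-correlate zero-mean signals and convert the best lag back to nanometres
xc = np.correlate(measured - measured.mean(), reference - reference.mean(), mode="full")
lag = np.argmax(xc) - (wl.size - 1)
step = wl[1] - wl[0]
print(f"estimated shift: {lag * step:.3f} nm (true {true_shift_nm} nm)")
```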
2135 Hybrid Fuzzy Selecting-Control-by-Range Controllers of a Servopneumatic Fatigue System
Authors: Marco Soares dos Santos, Jorge Augusto Ferreira, Camila Nicola Boeri, Fernando Neto da Silva
Abstract:
The present paper proposes high-performance nonlinear force controllers for a servopneumatic real-time fatigue test machine. A CompactRIO® controller was used, fully programmed in the LabVIEW language. Fuzzy logic control algorithms were evaluated to tune the integral and derivative components in the development of hybrid controllers, namely an FLC P and a hybrid FLC PID real-time controller. Their behaviours were described using state diagrams. The main contribution is to ensure a smooth transition between control states, avoiding discrete transitions in the controller outputs. Steady-state errors lower than 1.5 N were reached without retuning the controllers. Good results were also obtained for sinusoidal tracking tasks from 1/π to 8/π Hz.
Keywords: Hybrid Fuzzy Selecting, Control, Range Controllers, Servopneumatic Fatigue System.
2134 Analysis of Acoustic Emission Signal for the Detection of Defective Manufactures in Press Process
Authors: Dong Hun Kim, Won Kyu Lee, Sok Won Kim
Abstract:
Small cracks or chips in a product appear very frequently in the course of continuous production in an automatic press process system. These phenomena cause not only defective products but also damage to the press mold. In order to solve this problem, an acoustic emission (AE) system was introduced. The AE system was expected to be very effective for real-time detection of defective products and for preventing damage to the press molds. In this study, for the acquisition and analysis of AE signals generated in the press process, AE sensors, a pre-amplifier and an analysis and processing board were used, as is frequently found in other similar cases. For analyzing and processing the AE signals picked up in real time from good and bad products, specialized software called cdm8 was used. As a result of this work, it was confirmed that the intensity and shape of the various AE signals differ depending on the weight and thickness of the metal sheet and the process type.
Keywords: Press, acoustic emission, signal processing.
2133 A Comparison of the Sum of Squares in Linear and Partial Linear Regression Models
Authors: Dursun Aydın
Abstract:
In this paper, the linear regression model is estimated by the ordinary least squares method and the partially linear regression model is estimated by the penalized least squares method using a smoothing spline. The differences and similarities between the sums of squares of the linear regression and partial linear regression (semiparametric regression) models are then investigated. It is shown that the sums of squares in linear regression reduce to the corresponding sums of squares in partial linear regression models. Furthermore, we indicate that the various sums of squares in linear regression are analogous to different deviance statements in partial linear regression. In addition, the coefficient of determination derived for the linear regression model is easily generalized to the coefficient of determination of the partial linear regression model. To this end, two different applications are made: a simulated and a real data set are considered to support the claims made here. In this way, the study is supported with a simulation and a real data example.
Keywords: Partial linear regression model, linear regression model, residuals, deviance, smoothing spline.
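A brief numerical sketch, on simulated data (an assumption, not the paper's data sets), of the two fits the abstract contrasts: ordinary least squares for the linear model and a penalized smoothing-spline fit, comparing their residual sums of squares against the total sum of squares.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 200))
y = 2.0 + 0.5 * x + np.sin(x) + rng.normal(0, 0.3, x.size)   # linear trend + smooth nonlinearity

# Ordinary least squares for the purely linear model y = b0 + b1*x
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rss_linear = np.sum((y - X @ beta) ** 2)

# Smoothing-spline (penalized least squares) fit that also captures the nonlinear part
spline = UnivariateSpline(x, y, s=x.size * 0.3 ** 2)   # smoothing level ~ n * noise variance
rss_spline = np.sum((y - spline(x)) ** 2)

sst = np.sum((y - y.mean()) ** 2)
print(f"SST             : {sst:8.2f}")
print(f"RSS (linear OLS): {rss_linear:8.2f}  ->  R^2 = {1 - rss_linear / sst:.3f}")
print(f"RSS (spline)    : {rss_spline:8.2f}  ->  R^2 = {1 - rss_spline / sst:.3f}")
```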
2132 Evidence of the Long-run Equilibrium between Money Demand Determinants in Croatia
Authors: B. Skrabic, N. Tomic-Plazibat
Abstract:
In this paper, the real money demand function is analyzed within a multivariate time-series framework. A cointegration approach (the Johansen procedure) is used, assuming interdependence between the money demand determinants, which are nonstationary variables. This helps us to understand the behavior of money demand in Croatia, revealing the significant influence between the endogenous variables in the vector autoregression (VAR) system, i.e., the vector error correction model (VECM). Exogeneity of the explanatory variables is tested. The long-run money demand function is estimated, indicating a slow speed of adjustment in removing the disequilibrium. Empirical results provide evidence that real industrial production and the exchange rate explain most of the variation in money demand in the long run, while the interest rate is significant only in the short run.
Keywords: Cointegration, long-run equilibrium, money demand function, vector error correction model.
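A hedged sketch of the workflow described (Johansen cointegration test followed by a VECM) using statsmodels on simulated series; the variable names, lag order and deterministic terms are illustrative assumptions, not those of the Croatian data set.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(2)
n = 300
common = np.cumsum(rng.normal(size=n))               # shared stochastic trend -> cointegration
money = common + rng.normal(scale=0.5, size=n)
output = 0.8 * common + rng.normal(scale=0.5, size=n)
fx = np.cumsum(rng.normal(size=n))                   # an extra I(1) series
data = pd.DataFrame({"m1_real": money, "ind_prod": output, "exch_rate": fx})

# Johansen trace test: how many cointegrating relations?
jres = coint_johansen(data, det_order=0, k_ar_diff=2)
print("trace statistics :", np.round(jres.lr1, 2))
print("95% critical vals:", np.round(jres.cvt[:, 1], 2))

# Estimate the VECM with one cointegrating rank (long-run relation + adjustment speeds)
vecm = VECM(data, k_ar_diff=2, coint_rank=1, deterministic="co").fit()
print("adjustment coefficients (alpha):\n", np.round(vecm.alpha, 3))
print("cointegrating vector (beta):\n", np.round(vecm.beta, 3))
```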
2131 The Effects of a Digital Dialogue Game on Higher Education Students’ Argumentation-Based Learning
Authors: Omid Noroozi
Abstract:
Digital dialogue games have opened up opportunities for learning skills by engaging students in complex problem solving that mimics real-world situations, without importing unwanted constraints and risks of the real world. Digital dialogue games can be motivating and engaging for students, supporting fun, creative thinking and learning. This study explored how undergraduate students engage with argumentative discourse activities designed to intensify debate. A pre-test, post-test design was used with students who were assigned to groups of four and asked to debate a controversial topic with the aim of exploring the various pros and cons of genetically modified organisms (GMOs). Findings reveal that the digital dialogue game can facilitate argumentation-based learning. The digital dialogue game was also evaluated positively in terms of students’ satisfaction and learning experiences.
Keywords: Argumentation, dialogue, digital game, learning, motivation.
2130 Failure to Replicate the Unconscious Thought Advantages
Authors: Vladimíra Čavojová, Eva Ballová Mikušková
Abstract:
In this study we tried to replicate the unconscious thought advantage (UTA), which states that complex decisions are better handled by unconscious thinking. We designed an experiment in E-Prime using material similar to the original study (choosing between four different apartments, each described by 12 attributes). A total of 73 participants (52 women, 71.2%; aged 18 to 62, M = 24.63, SD = 8.7) took part in the experiment. We did not replicate the results suggested by UTT. However, from the present study we cannot conclude whether this was due to flaws in the theory or flaws in our experiment, and we discuss several ways in which the issue of UTA could be examined further.
Keywords: Decision making, unconscious thoughts, UTT.
2129 Autonomously Determining the Parameters for SVDD with RBF Kernel from a One-Class Training Set
Authors: Andreas Theissler, Ian Dear
Abstract:
The one-class support vector machine “support vector data description” (SVDD) is an ideal approach for anomaly or outlier detection. However, for the applicability of SVDD in real-world applications, ease of use is crucial. The results of SVDD are largely determined by the choice of the regularisation parameter C and the kernel parameter of the widely used RBF kernel. While for two-class SVMs the parameters can be tuned using cross-validation based on the confusion matrix, for a one-class SVM this is not possible, because only true positives and false negatives can occur during training. This paper proposes an approach to find the optimal set of parameters for SVDD solely based on a training set from one class and without any user parameterisation. Results on artificial and real data sets are presented, underpinning the usefulness of the approach.
Keywords: Support vector data description, anomaly detection, one-class classification, parameter tuning.
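A hedged sketch of one-class parameter selection using scikit-learn's OneClassSVM (closely related to SVDD with an RBF kernel) as a stand-in. The selection criterion below, trading off training rejections against support-vector fraction, is an illustrative assumption and not the criterion proposed in the paper.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)
X_train = rng.normal(size=(300, 2))                     # one-class (normal-only) training data

best = None
for nu in (0.01, 0.05, 0.1):                            # upper bound on training outlier fraction
    for gamma in np.logspace(-2, 1, 8):                 # RBF kernel width
        model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X_train)
        train_err = np.mean(model.predict(X_train) == -1)         # rejected training points
        sv_frac = model.support_vectors_.shape[0] / len(X_train)  # proxy for boundary complexity
        score = train_err + sv_frac                               # heuristic trade-off (assumption)
        if best is None or score < best[0]:
            best = (score, nu, gamma, model)

score, nu, gamma, model = best
print(f"selected nu={nu}, gamma={gamma:.3f} (score={score:.3f})")
print("outliers flagged in a shifted test batch:",
      np.sum(model.predict(rng.normal(loc=4.0, size=(50, 2))) == -1), "/ 50")
```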
2128 Real-Time Implementation of STANAG 4539 High-Speed HF Modem
Authors: S. Saraç, F. Kara, C. Vural
Abstract:
High-frequency (HF) communications have been used by military organizations for more than 90 years. The possibility of very long range communications without the need for advanced equipment makes HF a convenient and inexpensive alternative to satellite communications. Besides these advantages, voice and data transmission over HF is a challenging task, because the HF channel generally suffers from Doppler shift and spread, multipath, co-channel interference, and many other sources of noise. In constructing an HF data modem, all these effects must be taken into account. STANAG 4539 is a NATO standard for high-speed data transmission over HF. It allows data rates up to 12800 bps over an HF channel of 3 kHz. In this work, an efficient implementation of STANAG 4539 on a single Texas Instruments TMS320C6747 DSP chip is described. The state-of-the-art algorithms used in the receiver and the efficiency of the implementation enable real-time high-speed data / digitized voice transmission over poor HF channels.
Keywords: High frequency, modem, STANAG 4539.
2127 Fast Factored DCT-LMS Speech Enhancement for Performance Enhancement of Digital Hearing Aid
Authors: Sunitha S. L., V. Udayashankara
Abstract:
Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for patients with sensorineural loss. Several investigations on speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal-hearing subjects. This paper describes a discrete cosine transform power-normalized least mean square (DCT-LMS) algorithm to improve the SNR and the convergence rate of the LMS for sensorineural loss patients. Since it requires only real arithmetic, it achieves a faster convergence rate than the time-domain LMS, and the transformation improves the eigenvalue distribution of the input autocorrelation matrix of the LMS filter. The DCT has good orthonormal, separable and energy compaction properties. Although the DCT does not separate frequencies, it is a powerful signal decorrelator. It is real-valued and thus can be effectively used in real-time operation. The advantages of DCT-LMS compared with the standard LMS algorithm are shown via SNR and eigenvalue ratio computations. Exploiting the symmetry of the basis functions, the DCT transform matrix [AN] can be factored into a series of ±1 butterflies and rotation angles. This factorization results in one of the fastest DCT implementations. There are different ways to obtain factorizations; this work uses the fast factored DCT algorithm developed by Chen and co-workers. The computer simulation results show superior convergence characteristics of the proposed algorithm, improving the SNR by at least 10 dB for input SNRs less than or equal to 0 dB, with faster convergence speed and better time and frequency characteristics.
Keywords: Hearing impairment, DCT adaptive filter, sensorineural loss patients, convergence rate.
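The transform-domain idea can be sketched as follows: each tap-input vector is transformed by an orthonormal DCT and the LMS update of every transform coefficient is normalized by a running power estimate of that coefficient. This is a generic DCT-LMS sketch, not the fast-factored implementation of the paper, and the toy identification setup is an assumption.

```python
import numpy as np
from scipy.fft import dct

def dct_lms(x, d, M=16, mu=0.5, beta=0.95, eps=1e-6):
    """Power-normalized DCT-LMS: adapt weights w so that w . DCT(tap vector) tracks d."""
    w = np.zeros(M)
    p = np.full(M, eps)                    # running power estimate per DCT bin
    y = np.zeros(len(x))
    xbuf = np.zeros(M)
    for n in range(len(x)):
        xbuf = np.roll(xbuf, 1)
        xbuf[0] = x[n]
        z = dct(xbuf, type=2, norm="ortho")      # decorrelating transform of the tap vector
        y[n] = w @ z
        e = d[n] - y[n]
        p = beta * p + (1.0 - beta) * z * z      # update per-bin power
        w += mu * e * z / (p + eps)              # normalized (whitened) LMS step
    return y, w

# Toy identification example: learn a short FIR channel from noisy observations
rng = np.random.default_rng(4)
x = rng.normal(size=4000)
h = np.array([0.6, -0.3, 0.1])
d = np.convolve(x, h, mode="full")[: len(x)] + 0.05 * rng.normal(size=len(x))
y, w = dct_lms(x, d)
print("final error power:", np.round(np.mean((d[-500:] - y[-500:]) ** 2), 4))
```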
2126 A Discrete-Event-Simulation Approach for Logistic Systems with Real Time Resource Routing and VR Integration
Authors: Gerrit Alves, Jürgen Roßmann, Roland Wischnewski
Abstract:
Today, transport and logistic systems are often tightly integrated into production. Lean production and just-in-time delivery create multiple constraints that have to be fulfilled. As transport networks often have evolved over time, they are very expensive to change. This paper describes a discrete-event simulation system which simulates transportation models using real-time resource routing and collision avoidance. It allows for the specification of custom control algorithms and the validation of new strategies. The simulation is integrated into a virtual reality (VR) environment and can be displayed in 3-D to show the progress. Simulation elements can be selected through VR metaphors. All data gathered during the simulation can be presented as a detailed summary afterwards. The included cost-benefit calculation can help to optimize the financial outcome. The operation of this approach is shown by the example of a timber harvest simulation.
Keywords: Discrete-event simulation, logistics, simulation, virtual reality.
2125 Implementation of Sprite Animation for Multimedia Application
Authors: Ms. Yi Mon Thant
Abstract:
Animation is simply defined as the sequencing of a series of static images to generate the illusion of movement. Most people believe that the actual drawing or creation of the individual images is the animation, when in actuality it is the arrangement of those static images that conveys the motion. To become an animator, it is often assumed that one needs the ability to quickly design masterpiece after masterpiece. Although some semblance of artistic skill is a necessity for the job, the real key to becoming a great animator is the comprehension of timing. This paper uses a combination of sprite animation, frame animation and some other techniques to cause a group of multi-colored static images to slither around in a bounded area. In addition to slithering, the images also change the color of different parts of their body, much like the real-world creatures that have this amazing ability to change the colors on their bodies. The work was implemented using Java 2 Standard Edition (J2SE). It is both time-consuming and expensive to create animations, regardless of whether they are created by hand or by using motion-capture equipment. If animators could reuse old animations and even blend different animations together, a lot of work would be saved in the process. The main objective of this paper is to examine a method for blending several animations together in real time. This paper presents and analyses a solution using Weighted Skeleton Animation (WSA), resulting in limited CPU time and memory waste as well as saving time for the animators. The idea presented is described in detail and implemented. Text animation, vertex animation, sprite part animation and whole sprite animation were tested. The resolution, smoothness and movement of the animated images are evaluated from parameters obtained through the experimental work of this paper.
Keywords: Weighted Skeleton Animation.
2124 Localization of Geospatial Events and Hoax Prediction in the UFO Database
Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi
Abstract:
Unidentified Flying Objects (UFOs) have been an interesting topic for most enthusiasts, and hence people all over the United States report such findings online at the National UFO Report Center (NUFORC). Some of these reports are hoaxes, and among those that seem legitimate, our task is not to establish that these events are indeed related to flying objects from aliens in outer space. Rather, we intend to identify whether a report was a hoax, as identified by the UFO database team with their existing curation criteria. The database, however, provides a wealth of information that can be exploited to provide various analyses and insights, such as social reporting, identifying real-time spatial events and much more. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events happen in geospatial clusters and are also time-based. We use cluster density and data visualization to search the space of various cluster realizations to decide on the most probable clusters that provide information about the proximity of such activity. A random forest classifier is also presented that is used to identify true events and hoax events, using the best possible features available, such as region, week, time period and duration. Lastly, we show the performance of the scheme on various days and correlate it with real-time events, where one of the UFO reports strongly correlates to a missile test conducted in the United States.
Keywords: Time-series clustering, feature extraction, hoax prediction, geospatial events.
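A minimal sketch of the classification step described (a random forest over report features such as region, week, time period and duration). The synthetic data, encodings and label model below are assumptions, since the NUFORC-derived features themselves are not given here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(5)
n = 2000
X = np.column_stack([
    rng.integers(0, 50, n),          # region (encoded)
    rng.integers(0, 52, n),          # week of year
    rng.integers(0, 4, n),           # time period of day (encoded)
    rng.exponential(600, n),         # sighting duration in seconds
])
# Synthetic label: hoaxes made slightly more likely for very long daytime reports (illustrative only)
p_hoax = 1 / (1 + np.exp(-(0.002 * X[:, 3] - 1.5 + 0.3 * (X[:, 2] == 1))))
y = rng.random(n) < p_hoax

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["event", "hoax"]))
```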
2123 Fuzzy Hierarchical Clustering Applied for Quality Estimation in Manufacturing System
Authors: Y. Q. Lv, C.K.M. Lee
Abstract:
This paper develops a quality estimation method based on fuzzy hierarchical clustering. Quality estimation is essential to quality control and quality improvement, as a precise estimate supports the right decision-making and thereby better quality control. Normally, the quality of finished products in a manufacturing system can be differentiated by quality standards. In real-life situations, the collected data may be vague, which is not easy to classify, and they are usually represented in terms of fuzzy numbers. Estimating the quality of a product represented by fuzzy numbers is not easy. In this research, trapezoidal fuzzy numbers are collected from the manufacturing process and the collected data are classified into different clusters so as to obtain the estimate. Since normal hierarchical clustering methods can only be applied to real numbers, fuzzy hierarchical clustering is selected to handle this problem based on quality standards.
Keywords: Quality estimation, fuzzy quality mean, fuzzy hierarchical clustering, fuzzy number, manufacturing system.
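A hedged sketch of clustering trapezoidal fuzzy numbers: each observation is a 4-tuple (a, b, c, d), a simple vertex-wise Euclidean distance is used (one common choice, assumed here rather than taken from the paper), and SciPy's agglomerative clustering groups the observations into quality clusters.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(6)

def trapezoid(center, spread):
    """Trapezoidal fuzzy number (a, b, c, d) around a crisp quality measurement."""
    return np.array([center - 2 * spread, center - spread, center + spread, center + 2 * spread])

# Two quality levels measured with vagueness -> two groups of trapezoidal fuzzy numbers
data = np.array([trapezoid(c, 0.5) for c in np.r_[rng.normal(10, 0.3, 15),
                                                  rng.normal(14, 0.3, 15)]])

dist = pdist(data, metric="euclidean")        # vertex-wise distance between fuzzy numbers
Z = linkage(dist, method="average")           # hierarchical (agglomerative) clustering
labels = fcluster(Z, t=2, criterion="maxclust")

for k in (1, 2):
    members = data[labels == k]
    centroid = members.mean(axis=0)           # cluster-level fuzzy quality estimate
    print(f"cluster {k}: {len(members)} parts, fuzzy mean (a,b,c,d) = {np.round(centroid, 2)}")
```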
2122 Effect of Columns Stiffness's and Number of Floors on the Accuracy of the Tributary Area Method
Authors: Anas M. Fares
Abstract:
The use of finite element programs for analyzing and designing buildings is becoming very popular, but many engineers still use the tributary area method (TAM) to design structural members such as columns. This study is an attempt to investigate the accuracy of the TAM results under different load conditions (gravity and lateral loads), different numbers of floors, and different column stiffnesses. To conduct this study, linear elastic analysis in the ETABS program is used. The results from the finite element method are compared to those obtained from the TAM. According to the analysis of the data obtained, it can be seen that there is a significant difference between the real load carried by the columns and the load calculated using the TAM. Thus, using 3-D models is the best choice for calculating the real loads acting on columns and designing these columns accordingly.
Keywords: Tributary area method, finite element method, ETABS, lateral load, axial loads, reinforced concrete, stiffness, multi-floor buildings.
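For readers unfamiliar with the TAM, a short worked example of the hand calculation being compared against the finite element results; the grid dimensions, floor load and floor count are illustrative assumptions.

```python
# Tributary area method for an interior column on a regular grid (illustrative numbers):
bay_x, bay_y = 6.0, 5.0          # column spacing in each direction, m
trib_area = bay_x * bay_y        # interior column tributary area = 30 m^2
floor_load = 12.0                # total factored floor load, kN/m^2 (assumed)
n_floors = 8

axial_load = trib_area * floor_load * n_floors
print(f"TAM axial load on the interior column: {axial_load:.0f} kN")   # 2880 kN

# The study compares such hand estimates with the column forces from a 3-D ETABS model,
# which additionally reflects relative column stiffnesses and lateral-load effects.
```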
2121 Computer Vision Applied to Flower, Fruit and Vegetable Processing
Authors: Luis Gracia, Carlos Perez-Vidal, Carlos Gracia
Abstract:
This paper presents the theoretical background and the real implementation of an automated computer system that introduces machine vision in flower, fruit and vegetable processing for recollection, cutting, packaging, classification, or fumigation tasks. The considerations and implementation issues presented in this work can be applied to a wide range of varieties of flowers, fruits and vegetables, although some of them are especially relevant due to the great number of units that are manipulated and processed each year around the world. The computer vision algorithms developed in this work are shown in detail and can be easily extended to other applications. Special attention is given to electromagnetic compatibility in order to avoid noisy images. Furthermore, real experimentation has been carried out in order to validate the developed application. In particular, the tests show that the method has good robustness and a high success percentage in object characterization.
Keywords: Image processing, vision system, automation.
2120 Impact of the Real Effective Exchange Rate (REER) on Turkish Agricultural Trade
Authors: Halil Fidan
Abstract:
In this work, vector autoregressions are used to study the dynamics of agricultural exports and imports and the real effective exchange rate (REER). In order to analyze the interactions, the impulse-response function and variance decomposition are used, together with Granger causality and the Johansen methodology for examining cointegration relations. The REER Granger-causes agricultural exports and imports. The influence of REER innovations on agricultural exports and imports is not very great, and the duration of the effects is short. The REER has an immediate positive effect on agricultural exports, and after the tenth year the effect becomes smooth. Evidence of a cointegrating vector exists. In the short run, the REER has smaller effects on exports and imports compared with the long-run effects.
Keywords: Agricultural import, agricultural export, Granger causality, impulse-response function, long run, short run.
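A hedged sketch of the toolchain the abstract names (VAR estimation, Granger causality, impulse responses and variance decomposition) using statsmodels on simulated series; the variable names, data-generating process and lag lengths are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)
n = 200
reer = np.cumsum(rng.normal(size=n))
exports = np.zeros(n)
for t in range(1, n):                       # exports respond to lagged REER (illustrative DGP)
    exports[t] = 0.5 * exports[t - 1] + 0.3 * reer[t - 1] + rng.normal()

# Work with first differences so the series entering the VAR are stationary
df = pd.DataFrame({"d_reer": np.diff(reer), "d_exports": np.diff(exports)})

res = VAR(df).fit(maxlags=4, ic="aic")
print(res.test_causality("d_exports", ["d_reer"], kind="f").summary())  # Granger causality

irf = res.irf(10)                            # impulse-response functions up to 10 periods
i, j = df.columns.get_loc("d_exports"), df.columns.get_loc("d_reer")
print("response of d_exports to a d_reer shock:", np.round(irf.irfs[:, i, j], 3))

fevd = res.fevd(10)                          # forecast-error variance decomposition
print("FEVD of d_exports at horizon 10:", np.round(fevd.decomp[i, -1], 3))
```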
2119 Genetic Algorithm for Feature Subset Selection with Exploitation of Feature Correlations from Continuous Wavelet Transform: a real-case Application
Authors: G. Van Dijck, M. M. Van Hulle, M. Wevers
Abstract:
A genetic algorithm (GA) based feature subset selection algorithm is proposed in which the correlation structure of the features is exploited. The subset of features is validated according to the classification performance. Features derived from the continuous wavelet transform are potentially strongly correlated. GAs that do not take the correlation structure of features into account are inefficient. The proposed algorithm forms clusters of correlated features and searches for a good candidate set of clusters. Secondly, a search within the clusters is performed. Different simulations of the algorithm on a real-case data set with strong correlations between features show the increased classification performance. A comparison is performed with a standard GA that makes no use of the correlation structure.
Keywords: Classification, genetic algorithm, hierarchical agglomerative clustering, wavelet transform.
2118 System Reliability by Prediction of Generator Output and Losses in a Competitive Energy Market
Authors: Perumal Nallagownden, Ravindra N. Mukerjee, Syafrudin Masri
Abstract:
In a competitive energy market, system reliability should be maintained at all times. Since power system operation is online in nature, the energy balance requirements must be satisfied to ensure reliable operation of the system. To achieve this, information regarding the expected status of the system, the scheduled transactions and the relevant inputs necessary to make either a transaction contract or a transmission contract operational has to be made available in real time. The real-time procedure proposed here facilitates this. This paper proposes a quadratic curve learning procedure which enables a generator's contribution to the retailer demand, the power loss of a transaction in a line at the retail end and its associated losses for an oncoming operating scenario to be predicted. A MATLAB program was used to test it on the IEEE 24-bus Reliability Test System, and the results are found to be acceptable.
Keywords: Deregulation, learning coefficients, reliability, prediction, competitive energy market.
2117 Face Detection using Variance based Haar-Like feature and SVM
Authors: Cuong Nguyen Khac, Ju H. Park, Ho-Youl Jung
Abstract:
This paper proposes a new approach to the problem of real-time face detection. The proposed method combines the primitive Haar-like feature and the variance value to construct a new feature, the so-called variance-based Haar-like feature. A face in an image can be represented with a small number of features using this new feature. We used an SVM instead of AdaBoost for training and classification. We built a database containing 5,000 face samples and 10,000 non-face samples extracted from real images for learning purposes. The 5,000 face samples contain many images with large differences in lighting conditions. Experiments showed that a face detection system using the variance-based Haar-like feature and SVM can be much more efficient than a face detection system using the primitive Haar-like feature and AdaBoost. We tested our method on two face databases and one non-face database. We obtained a 96.17% correct detection rate on the YaleB face database, which is 4.21% higher than that obtained using the primitive Haar-like feature and AdaBoost.
Keywords: AdaBoost, Haar-like feature, SVM, variance, variance-based Haar-like feature.
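A hedged sketch of the building blocks: rectangle sums via an integral image, a two-rectangle Haar-like feature, and the variance of the same window computed from integral images of the pixels and their squares. How the paper actually combines the two cues is not specified here, so the product used below is only an illustrative assumption.

```python
import numpy as np

def integral(img):
    """Integral image padded with a leading row/column of zeros."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def rect_sum(ii, r, c, h, w):
    """Sum of img[r:r+h, c:c+w] in O(1) using the integral image."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def variance_haar_feature(img, r, c, h, w):
    ii = integral(img.astype(np.float64))
    ii2 = integral(img.astype(np.float64) ** 2)
    # Two-rectangle (left/right) Haar-like feature over the window
    haar = rect_sum(ii, r, c, h, w // 2) - rect_sum(ii, r, c + w // 2, h, w // 2)
    # Window variance from the integral images: E[x^2] - (E[x])^2
    n = h * w
    mean = rect_sum(ii, r, c, h, w) / n
    var = rect_sum(ii2, r, c, h, w) / n - mean ** 2
    return haar * var          # illustrative combination of the two cues (assumption)

img = np.random.default_rng(8).integers(0, 256, (24, 24))
print(variance_haar_feature(img, r=4, c=4, h=16, w=16))
```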
2116 Real-Time Episodic Memory Construction for Optimal Action Selection in Cognitive Robotics
Authors: Deon de Jager, Yahya Zweiri, Dimitrios Makris
Abstract:
The three most important components in the cognitive architecture for cognitive robotics are memory representation, memory recall, and the action selection performed by the executive. In this paper, action selection, performed by the executive, is defined as a memory quantification and optimization process. The methodology describes the real-time construction of episodic memory through semantic memory optimization. The optimization is performed by set-based particle swarm optimization, using an adaptive entropy memory quantification approach for fitness evaluation. The performance of the approach is experimentally evaluated by simulation, where a UAV is tasked with the collection and delivery of a medical package. The experiments show that the UAV dynamically uses the episodic memory to autonomously control its velocity, while successfully completing its mission.
Keywords: Cognitive robotics, semantic memory, episodic memory, maximum entropy principle, particle swarm optimization.
2115 A Novel Approach of Power Transformer Diagnostic Using 3D FEM Parametrical Model
Authors: M. Brandt, A. Peniak, J. Makarovič, P. Rafajdus
Abstract:
This paper deals with a novel approach to power transformer diagnostics. The approach identifies the exact location and the range of a fault in the transformer and helps to reduce the operation costs related to handling of the faulty transformer, its disassembly and repair. The advantage of the approach is the possibility to simulate a healthy transformer and also all faults which can occur in the transformer during its operation, without disassembling it, which is very expensive in practice. The approach is based on creating the frequency-dependent impedance of the transformer from sweep frequency response analysis measurements and on 3D FE parametrical modeling of the fault in the transformer. The parameters of the 3D FE model are the position and the range of the axial short circuit. Then, by comparing the frequency-dependent impedances of the parametrical models with the measured ones, the location and the range of the fault are identified. The approach was tested on a real transformer and showed high coincidence between the real fault and the simulated one.
Keywords: Fault, finite element method, parametrical model of transformer, sweep frequency response analysis, transformer.
2114 Virtual Reality Classrooms Strategies for Creating a Social Presence
Authors: Elizabeth M. Hodge, M.H.N. Tabrizi, Mary A. Farwell, Karl L. Wuensch
Abstract:
Delivering course material via a virtual environment is beneficial to today's students because it offers the interactivity, real-time interaction and social presence that students of all ages have come to accept in our gaming-rich community. It is essential that the Net Generation, also known as Generation Why, has exposure to learning communities that encompass interactivity to form social and educational connections. As student and professor become interconnected through collaboration and interaction in a virtual learning space, relationships develop and students begin to take on an individual identity. With this in mind, the research project was developed to investigate the effect of virtual environments on student satisfaction and the effectiveness of course delivery. Furthermore, the project was designed to integrate interactive (real-time) classes conducted in the virtual reality (VR) environment while also creating archived VR sessions for student use in retaining and reviewing course content.
Keywords: Virtual reality, social presence, virtual environments, course delivery methods.
2113 Analysis of Electrocardiograph (ECG) Signal for the Detection of Abnormalities Using MATLAB
Authors: Durgesh Kumar Ojha, Monica Subashini
Abstract:
The proposed method is to study and analyze the electrocardiograph (ECG) waveform to detect abnormalities with reference to the P, Q, R and S peaks. The first phase includes the acquisition of real-time ECG data. The next phase covers the generation of signals followed by pre-processing. Thirdly, the procured ECG signal is subjected to feature extraction. The extracted features detect abnormal peaks present in the waveform, so the normal and abnormal ECG signals can be differentiated based on the features extracted. The work is implemented in the familiar multipurpose tool MATLAB. This software efficiently uses algorithms and techniques for the detection of any abnormalities present in the ECG signal. Proper utilization of MATLAB functions (both built-in and user defined) allows working with ECG signals for processing and analysis in real-time applications. The simulation would help in improving the accuracy, and the hardware could be built conveniently.
Keywords: ECG Waveform, Peak Detection, Arrhythmia, Matlab.
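Although the paper works in MATLAB, the peak-detection step can be illustrated in a few lines; here with SciPy on a synthetic ECG-like trace (the waveform, sampling rate and thresholds are assumptions).

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250                                        # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
heart_rate = 72 / 60                            # beats per second
# Synthetic ECG-like trace: sharp R spikes on a small baseline oscillation plus noise
ecg = (np.exp(-((t * heart_rate) % 1.0 - 0.5) ** 2 / 0.0005)
       + 0.1 * np.sin(2 * np.pi * 0.3 * t)
       + 0.05 * np.random.default_rng(9).standard_normal(t.size))

# R peaks: tall and separated by at least 0.4 s (refractory constraint)
r_peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
rr = np.diff(r_peaks) / fs                      # RR intervals in seconds
print(f"detected beats: {len(r_peaks)}, mean HR: {60 / rr.mean():.1f} bpm")
# Abnormality flags (e.g., arrhythmia screening) could then be based on RR variability:
print("irregular RR intervals:", np.sum(np.abs(rr - rr.mean()) > 0.15 * rr.mean()))
```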
2112 Performance Analysis of a Series of Adaptive Filters in Non-Stationary Environment for Noise Cancelling Setup
Authors: Anam Rafique, Syed Sohail Ahmed
Abstract:
Noise cancellation is an essential component of many DSP applications. Changes in real-time signals are quite rapid. In noise cancellation, a reference signal, which is an approximation of the noise signal (that corrupts the original information signal), is obtained and then subtracted from the noise-bearing signal to obtain a noise-free signal. This approximation of the noise signal is obtained through adaptive filters, which are self-adjusting. As the changes in real-time signals are abrupt, this requires an adaptive algorithm that converges fast and is stable. Least mean square (LMS) and normalized LMS (NLMS) are two widely used algorithms because of their simplicity in calculation and implementation, but their convergence rates are small. Adaptive averaging filters (AFA) are also used because they converge quickly, but they are less stable. This paper provides a comparative study of LMS, normalized LMS (NLMS), AFA and the new enhanced average adaptive (average NLMS, ANLMS) filters for a noise-cancelling application using speech signals.
Keywords: AFA, ANLMS, LMS, NLMS.
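A minimal sketch of the noise-cancelling setup the abstract describes, using the NLMS update (the AFA and ANLMS variants are not reproduced); the signals, filter order and step size are assumptions.

```python
import numpy as np

def nlms_cancel(primary, reference, M=32, mu=0.5, eps=1e-6):
    """Adaptive noise canceller: estimate the noise in `primary` from `reference` (NLMS)."""
    w = np.zeros(M)
    out = np.zeros(len(primary))
    for n in range(M, len(primary)):
        x = reference[n - M:n][::-1]            # tap-input vector
        y = w @ x                               # noise estimate
        e = primary[n] - y                      # error = cleaned signal sample
        w += mu * e * x / (x @ x + eps)         # normalized step keeps adaptation stable
        out[n] = e
    return out

rng = np.random.default_rng(10)
n = 8000
speech = np.sin(2 * np.pi * 0.01 * np.arange(n))          # stand-in for the speech signal
noise_src = rng.normal(size=n)
corrupting = np.convolve(noise_src, [0.7, -0.4, 0.2], mode="full")[:n]  # filtered noise path
primary = speech + corrupting                              # noisy speech at the primary mic
cleaned = nlms_cancel(primary, noise_src)

print("input SNR :", round(10 * np.log10(np.var(speech) / np.var(corrupting)), 1), "dB")
print("output SNR:", round(10 * np.log10(np.var(speech) / np.var(cleaned[2000:] - speech[2000:])), 1), "dB")
```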
2111 Eukaryotic Gene Prediction by an Investigation of Nonlinear Dynamical Modeling Techniques on EIIP Coded Sequences
Authors: Mai S. Mabrouk, Nahed H. Solouma, Abou-Bakr M. Youssef, Yasser M. Kadah
Abstract:
Many digital signal processing techniques have been used to automatically distinguish protein coding regions (exons) from non-coding regions (introns) in DNA sequences. In this work, we have characterized these sequences according to their nonlinear dynamical features, such as moment invariants, correlation dimension, and largest Lyapunov exponent estimates. We have applied our model to a number of real sequences encoded into a time series using EIIP sequence indicators. In order to discriminate between coding and non-coding DNA regions, the phase space trajectory was first reconstructed for coding and non-coding regions. Nonlinear dynamical features were extracted from those regions and used to investigate the difference between them. Our results indicate that the nonlinear dynamical characteristics yield significant differences between coding regions (CR) and non-coding regions (NCR) in DNA sequences. Finally, the classifier is tested on real genes where coding and non-coding regions are well known.
Keywords: Gene prediction, nonlinear dynamics, correlation dimension, Lyapunov exponent.
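A small sketch of the first two steps described: EIIP numerical encoding of a DNA sequence and delay-coordinate (phase-space) reconstruction. The EIIP values are the commonly cited ones; the placeholder sequence and the embedding delay and dimension are assumptions.

```python
import numpy as np

# Commonly cited EIIP (electron-ion interaction potential) values per nucleotide
EIIP = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}

def eiip_encode(seq):
    """Map a DNA string onto its EIIP numerical time series."""
    return np.array([EIIP[b] for b in seq.upper() if b in EIIP])

def delay_embed(x, dim=3, tau=2):
    """Phase-space (delay-coordinate) reconstruction of a scalar time series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

rng = np.random.default_rng(11)
seq = "".join(rng.choice(list("ACGT"), size=600))      # placeholder sequence
signal = eiip_encode(seq)
trajectory = delay_embed(signal, dim=3, tau=2)
print("series length:", len(signal), " reconstructed trajectory shape:", trajectory.shape)
# Nonlinear features (correlation dimension, largest Lyapunov exponent) would then be
# estimated from `trajectory` and compared between coding and non-coding regions.
```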
2110 A Real-Time Bayesian Decision-Support System for Predicting Suspect Vehicle’s Intended Target Using a Sparse Camera Network
Authors: Payam Mousavi, Andrew L. Stewart, Huiwen You, Aryeh F. G. Fayerman
Abstract:
We present a decision-support tool to assist an operator in the detection and tracking of a suspect vehicle traveling to an unknown target destination. Multiple data sources, such as traffic cameras, traffic information, weather, etc., are integrated and processed in real-time to infer a suspect’s intended destination chosen from a list of pre-determined high-value targets. Previously, we presented our work in the detection and tracking of vehicles using traffic and airborne cameras. Here, we focus on the fusion and processing of that information to predict a suspect’s behavior. The network of cameras is represented by a directional graph, where the edges correspond to direct road connections between the nodes and the edge weights are proportional to the average time it takes to travel from one node to another. For our experiments, we construct our graph based on the greater Los Angeles subset of the Caltrans’s “Performance Measurement System” (PeMS) dataset. We propose a Bayesian approach where a posterior probability for each target is continuously updated based on detections of the suspect in the live video feeds. Additionally, we introduce the concept of ‘soft interventions’, inspired by the field of Causal Inference. Soft interventions are herein defined as interventions that do not immediately interfere with the suspect’s movements; rather, a soft intervention may induce the suspect into making a new decision, ultimately making their intent more transparent. For example, a soft intervention could be temporarily closing a road a few blocks from the suspect’s current location, which may require the suspect to change their current course. The objective of these interventions is to gain the maximum amount of information about the suspect’s intent in the shortest possible time. Our system currently operates in a human-on-the-loop mode where at each step, a set of recommendations are presented to the operator to aid in decision-making. In principle, the system could operate autonomously, only prompting the operator for critical decisions, allowing the system to significantly scale up to larger areas and multiple suspects. Once the intended target is identified with sufficient confidence, the vehicle is reported to the authorities to take further action. Other recommendations include a selection of road closures, i.e., soft interventions, or to continue monitoring. We evaluate the performance of the proposed system using simulated scenarios where the suspect, starting at random locations, takes a noisy shortest path to their intended target. In all scenarios, the suspect’s intended target is unknown to our system. The decision thresholds are selected to maximize the chances of determining the suspect’s intended target in the minimum amount of time and with the smallest number of interventions. We conclude by discussing the limitations of our current approach to motivate a machine learning approach, based on reinforcement learning in order to relax some of the current limiting assumptions.
Keywords: Autonomous surveillance, Bayesian reasoning, decision-support, interventions, patterns-of-life, predictive analytics, predictive insights.
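A hedged sketch of the core posterior update: a prior over candidate high-value targets is updated each time the suspect is detected at a camera node, using travel-time-based likelihoods on a small directed graph. The graph, likelihood model and detections below are illustrative assumptions, not the PeMS-based network of the paper.

```python
import numpy as np

# Toy road graph: shortest travel times (minutes) from every camera node to each candidate target
targets = ["stadium", "airport", "port"]
travel_time = {                     # node -> [time to stadium, airport, port]
    "n1": np.array([25.0, 40.0, 55.0]),
    "n2": np.array([15.0, 30.0, 50.0]),
    "n3": np.array([10.0, 35.0, 45.0]),
}

def update(posterior, prev_node, node, dt, sigma=4.0):
    """Bayes update: likelihood is high when the observed hop is consistent with heading
    towards a target, i.e. the drop in remaining travel time roughly matches elapsed time."""
    progress = travel_time[prev_node] - travel_time[node]           # per-target progress made
    lik = np.exp(-0.5 * ((progress - dt) / sigma) ** 2)             # Gaussian consistency model
    post = posterior * lik
    return post / post.sum()

posterior = np.full(len(targets), 1.0 / len(targets))               # uniform prior
detections = [("n1", "n2", 11.0), ("n2", "n3", 6.0)]                # (from, to, minutes elapsed)
for prev, cur, dt in detections:
    posterior = update(posterior, prev, cur, dt)
    print(dict(zip(targets, np.round(posterior, 3))))
```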
2109 Automatic Light Control in Domotics using Artificial Neural Networks
Authors: Carlos Machado, José A. Mendes
Abstract:
Home automation is a field that, among other subjects, is concerned with the comfort, security and energy requirements of private homes. The configuration of automatic functions in this type of house is not always simple for its inhabitants, requiring an initial setup and regular adjustments. In this work, the ubiquitous computing vision is used, where the users' action patterns are captured, recorded and used to create the context-awareness that allows the self-configuration of the home automation system. The system tries to free the users from setup adjustments as the home adapts to its inhabitants' real habits. This paper describes a completely automated process to determine the light states and act on them, taking into account the users' daily habits. An artificial neural network (ANN) is used as a pattern recognition method, classifying the light state at each moment. The work presented uses data from a real house where a family is actually living.
Keywords: ANN, home automation, neural systems, pattern recognition, ubiquitous computing.
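A minimal sketch of the pattern-recognition step: an ANN trained on (time of day, ambient light, occupancy) patterns to predict the light state. The features, their encoding and the synthetic habit data are assumptions; the paper uses logged data from a real house.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(12)
n = 3000
hour = rng.uniform(0, 24, n)
ambient = np.clip(np.sin((hour - 6) / 12 * np.pi), 0, 1) + 0.1 * rng.normal(size=n)  # daylight proxy
occupied = rng.random(n) < 0.6
# Synthetic habit: lights on when the room is occupied and ambient light is low (evenings/nights)
light_on = occupied & (ambient < 0.3)

X = np.column_stack([np.sin(2 * np.pi * hour / 24),    # cyclic encoding of the time of day
                     np.cos(2 * np.pi * hour / 24),
                     ambient, occupied.astype(float)])
X_tr, X_te, y_tr, y_te = train_test_split(X, light_on, test_size=0.25, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("light-state prediction accuracy:", round(ann.score(X_te, y_te), 3))
```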