Search results for: automatic clustering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1029

129 A Simple Low-Cost 2-D Optical Measurement System for Linear Guideways

Authors: Wen-Yuh Jywe, Bor-Jeng Lin, Jing-Chung Shen, Jeng-Dao Lee, Hsueh-Liang Huang, Tung-Hsien Hsieh

Abstract:

In this study, a simple 2-D measurement system based on optical design was developed to measure the motion errors of a linear guideway. Compared with traditional methods for measuring the motion errors of a linear guideway, the proposed 2-D optical measurement system can simultaneously measure the horizontal and vertical running straightness errors of the guideway.

The performance of the 2-D optical measurement system is verified by experimental results. The standard deviation of the 2-D optical measurement system is about 0.4μm in the measurement range of 100 mm. The maximum measuring speed of the proposed automatic measurement instrument is 1 m/sec.

Keywords: 2-D measurement, linear guideway, motion errors, running straightness.

128 In Search of an SVD and QRcp Based Optimization Technique of ANN for Automatic Classification of Abnormal Heart Sounds

Authors: Samit Ari, Goutam Saha

Abstract:

Artificial Neural Networks (ANNs) have been extensively used for classification of heart sounds because of their discriminative training ability and easy implementation. However, an ANN suffers from over-parameterization if the number of nodes is not chosen properly. In such cases, when the dataset contains redundancy, the ANN is trained along with this redundant information, which results in poor validation. A larger network also means more computational expense, resulting in higher hardware- and time-related costs. Therefore, an optimum design of the neural network is needed for real-time detection of pathological patterns, if any, from the heart sound signal. The aims of this work are to (i) select a set of input features that are effective for identification of heart sound signals and (ii) make an optimum selection of nodes in the hidden layer for a more effective ANN structure. Here, we present an optimization technique that involves Singular Value Decomposition (SVD) and QR factorization with column pivoting (QRcp) to optimize an empirically chosen, over-parameterized ANN structure. The input nodes of the ANN structure are optimized by SVD followed by QRcp, while only SVD is required to prune undesirable hidden nodes. Results are presented for classifying 12 common pathological cases and normal heart sounds.
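
A minimal sketch of the SVD/QRcp idea described above (not the authors' exact procedure): the numerical rank of the feature matrix is estimated from its singular values, QR factorization with column pivoting then picks that many of the most linearly independent input features, and hidden-node pruning keeps only as many nodes as the rank of the hidden-layer activation matrix suggests. The random feature matrix and the energy threshold are illustrative assumptions.

    import numpy as np
    from scipy.linalg import qr

    def select_inputs_svd_qrcp(X, energy=0.99):
        """Estimate the numerical rank of feature matrix X (samples x features)
        from its singular values, then use QR with column pivoting to pick
        that many of the most linearly independent feature columns."""
        s = np.linalg.svd(X, compute_uv=False)
        r = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), energy)) + 1
        _, _, piv = qr(X, pivoting=True)      # columns ordered by importance
        return np.sort(piv[:r])               # indices of retained input features

    def prune_hidden_nodes(H, energy=0.99):
        """H: hidden-layer activation matrix (samples x hidden nodes).
        Keep only as many hidden nodes as the numerical rank of H suggests."""
        s = np.linalg.svd(H, compute_uv=False)
        return int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), energy)) + 1

    # Illustrative use with random data standing in for heart-sound features
    X = np.random.rand(200, 40)
    H = np.tanh(X @ np.random.rand(40, 25))
    print(select_inputs_svd_qrcp(X), prune_hidden_nodes(H))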

Keywords: ANN, Classification of heart diseases, murmurs, optimization, Phonocardiogram, QRcp, SVD.

127 Multimedia Firearms Training System

Authors: Aleksander Nawrat, Karol Jędrasiak, Artur Ryt, Dawid Sobel

Abstract:

The goal of this article is to present a novel Multimedia Firearms Training System. The system was developed to compensate for major problems of existing shooting training systems. The designed and implemented solution is characterized by five major advantages: an algorithm for automatic geometric calibration, an algorithm for photometric recalibration, firearms hit-point detection using a thermal imaging camera, an IR laser-spot tracking algorithm for after-action review analysis, and the implementation of ballistics equations. The combination of the above-mentioned advantages in a single multimedia firearms training system creates a comprehensive solution for detecting and tracking the target point, usable for shooting training systems and for improving the intervention tactics of uniformed services. The introduced algorithms for geometric and photometric recalibration allow the use of economically viable, commercially available projectors in systems that require long and intensive use, without most of the negative impacts on color mapping found in existing multi-projector multimedia shooting range systems. The article presents the results of the developed algorithms and their application in real training systems.

Keywords: Firearms shot detection, geometric recalibration, photometric recalibration, IR tracking algorithm, thermography, ballistics.

126 Fuzzy Sequential Algorithm for Discrimination and Decision Maker in Sporting Events

Authors: Mourad Moussa, Ali Douik, Hassani Messaoud

Abstract:

Event discrimination and decision making in the sports field are the subject of many interesting studies in computer vision and artificial intelligence. A large volume of research has been conducted on automatic semantic event detection and summarization of sports videos. The results of this research make a significant contribution both to television broadcasts and to football teams, since the outcome of a sporting event has repercussions in the economic field. In this paper, we propose a novel fuzzy sequential technique to discriminate events and to characterize the technico-tactical actions during the game. Neither the fuzzy system nor the sequential one alone is able to answer this question: the fuzzy process by itself is not sufficient, as it does not respect the chronological order of the various events, while the sequential process lacks flexibility with respect to the parameters used in this study. The proposed technique therefore assigns a membership degree to each parameter on the one hand and respects the sequencing of events in each frame on the other. It describes special events such as dribbling, heading, short sprints, rapid acceleration or deceleration, turning, jumping, kicking, ball possession, and tackling according to the velocity vectors of the two players and the ball direction.

Keywords: Sequential process, Event detection, Soccer videos analysis, Fuzzy process, Spatio-temporal parameters.

125 Machine Learning for Music Aesthetic Annotation Using MIDI Format: A Harmony-Based Classification Approach

Authors: Lin Yang, Zhian Mi, Jiacheng Xiao, Rong Li

Abstract:

Swimming with the tide of deep learning, the field of music information retrieval (MIR) has experienced parallel development, and a wide variety of feature-learning models has been applied to music classification and tagging tasks. Among these learning techniques, deep convolutional neural networks (CNNs) have been widely used, with better performance than traditional approaches, especially in music genre classification and prediction. However, in music recommendation there is a large semantic gap between the corresponding audio genres and the various aspects of a song that influence user preference. In our study, aiming to bridge this gap, we construct an automatic music aesthetic annotation model based on the MIDI format for better comparison and measurement of the similarity between music pieces by means of harmonic analysis. We use the qualification matrix converted from MIDI files as input to train two different classifiers, a support vector machine (SVM) and a decision tree (DT). Experimental results on a tag prediction task show that both learning algorithms are capable of extracting high-level properties from music information in an end-to-end manner. The proposed model helps to learn the audience's taste, and the resulting recommendations are likely to appeal to a niche consumer.
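
A minimal sketch of the classification stage described above, assuming the harmony features have already been extracted from the MIDI files into a numeric matrix; the placeholder feature matrix and tag labels are illustrative and not the authors' pipeline.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score

    # X: one row per MIDI piece, columns = harmony-derived features (placeholder data)
    # y: aesthetic tag per piece (placeholder labels)
    X = np.random.rand(300, 24)
    y = np.random.randint(0, 3, size=300)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                      ("Decision Tree", DecisionTreeClassifier(max_depth=8))]:
        clf.fit(X_tr, y_tr)
        print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))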

Keywords: Harmonic analysis, machine learning, music classification and tagging, MIDI.

124 Efficient Boosting-Based Active Learning for Specific Object Detection Problems

Authors: Thuy Thi Nguyen, Nguyen Dang Binh, Horst Bischof

Abstract:

In this work, we present a novel active learning approach for learning a visual object detection system. Our system is composed of an active learning mechanism wrapped around a sub-algorithm that implements an online boosting-based object detector. At its core is a combination of a bootstrap procedure and a semi-automatic learning process based on online boosting. The idea is to exploit the availability of the classifier during learning to automatically label training samples and incrementally improve the classifier. This reduces the labeling effort while obtaining better performance. In addition, we propose a verification process for further improvement of the classifier: re-updates on previously seen data are allowed during learning to stabilize the detector. The main contribution of this empirical study is a demonstration that active learning based on an online boosting approach trained in this manner can achieve results comparable to, or even better than, a framework trained in the conventional manner with much more labeling effort. Empirical experiments on challenging data sets for specific object detection problems show the effectiveness of our approach.
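
A minimal sketch of the kind of semi-automatic labeling loop described above, using scikit-learn's batch AdaBoostClassifier as a stand-in for the online boosting detector. The synthetic data, the confidence-based sample selection and retraining from scratch each round are simplifying assumptions, not the authors' implementation.

    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    rng = np.random.default_rng(0)
    X_pool = rng.normal(size=(1000, 16))                        # unlabeled feature vectors (placeholder)
    y_pool = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)      # hidden ground truth

    labeled = rng.choice(len(X_pool), size=30, replace=False).tolist()  # small manually labeled seed set
    clf = AdaBoostClassifier(n_estimators=50)

    for round_ in range(5):
        clf.fit(X_pool[labeled], y_pool[labeled])
        conf = clf.predict_proba(X_pool).max(axis=1)
        # Bootstrap step: let the current classifier label its most confident
        # unlabeled samples, reducing the manual labeling effort.
        candidates = [i for i in np.argsort(-conf) if i not in labeled][:50]
        y_pool[candidates] = clf.predict(X_pool[candidates])    # self-labeled, may be noisy
        labeled.extend(candidates)
        print(f"round {round_}: {len(labeled)} training samples")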

Keywords: Computer vision, object detection, online boosting, active learning, labeling complexity.

123 Monitoring and Fault-Recovery Capacity with Waveguide Grating-based Optical Switch over WDM/OCDMA-PON

Authors: Yao-Tang Chang, Chuen-Ching Wang, Shu-Han Hu

Abstract:

In order to provide flexibility as well as survivability over a passive optical network (PON), a new automatic random fault-recovery mechanism with an array-waveguide-grating-based (AWG-based) optical switch (OSW) is presented. First, a wavelength-division-multiplexing and optical code-division multiple-access (WDM/OCDMA) scheme is configured to meet the requirements of the various geographical locations between the optical network units (ONUs) and the optical line terminal (OLT). The AWG-based optical switch is designed as a central star-mesh topology to reduce duplicated redundant elements such as fibers and transceivers. Hence, with a simple monitoring and routing switch algorithm, random fault-recovery capability is achieved over the bi-directional (up/downstream) WDM/OCDMA scheme. When a failure of a distribution fiber (DF) occurs or the bit-error rate (BER) exceeds the 10^-9 requirement, the primary/slave AWG-based OSWs are adjusted and controlled dynamically to restore the affected ONU groups immediately via the other working DFs.

Keywords: Random fault-recovery mechanism, array-waveguide-grating-based optical switch (AWG-based OSW), wavelength-division-multiplexing and optical code-division multiple-access (WDM/OCDMA).

122 Studying the Possibility to Weld AA1100 Aluminum Alloy by Friction Stir Spot Welding

Authors: Ahmad K. Jassim, Raheem Kh. Al-Subar

Abstract:

Friction stir welding is a modern and environmentally friendly solid-state joining process used to join relatively light families of materials. Recently, friction stir spot welding has been used instead of resistance spot welding and has received considerable attention from the automotive industry; it is an environmentally friendly process that eliminates heat and pollution. In this research, friction stir spot welding was used to study the possibility of welding AA1100 aluminum alloy sheet of 3 mm thickness by overlapping the edges of the sheets as a lap joint. The process was performed using a drilling machine instead of a milling machine. Different tool rotational speeds of 760, 1065, 1445, and 2000 RPM were applied, with manual and automatic compression, to study their effect on the quality of the welded joints. Heat generation, applied pressure, and depth of tool penetration were measured during the welding process. The results show that it is possible to weld AA1100 sheets; however, some surface defects occurred due to insufficient welding conditions. Moreover, the relationship between rotational speed, pressure, heat generation and tool penetration depth was established.

Keywords: Friction, spot, stir, environmental, sustainable, AA1100 aluminum alloy.

121 A Modified Run Length Coding Technique for Test Data Compression Based on Multi-Level Selective Huffman Coding

Authors: C. Kalamani, K. Paramasivam

Abstract:

Test data compression is an efficient method for reducing the test application cost. The problem of reducing test data has been addressed by researchers in three different ways: test data compression, built-in self-test (BIST) and test set compaction. The latter two methods are capable of enhancing fault coverage at the cost of hardware overhead. The drawback of the conventional methods is that they reduce test storage and test power, but when the test data contain redundant runs, no additional compression is applied. This paper presents a modified Run Length Coding (RLC) technique combined with Multi-Level Selective Huffman Coding (MLSHC) to reduce test data volume, test pattern delivery time and power dissipation in scan test applications; wherever a redundant run length is encountered, the preceding run symbol is replaced with a short codeword. Experimental results show that the presented method not only improves the test data compression but also reduces the overall test data volume compared to recent schemes. Experiments on the six largest ISCAS'89 benchmarks show that our method outperforms most known techniques.
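
A minimal sketch of the run-length stage described above, assuming a binary scan test stream: runs of repeated bits are turned into (bit, length) symbols, and a simple Huffman code (a stand-in for the paper's multi-level selective variant) assigns short codewords to frequent run symbols. The example bit stream is illustrative.

    import heapq
    from collections import Counter
    from itertools import groupby

    def run_length_encode(bits):
        """Turn a bit string like '000011011' into run symbols [('0', 4), ('1', 2), ...]."""
        return [(b, len(list(g))) for b, g in groupby(bits)]

    def huffman_code(symbols):
        """Build a Huffman code over the run symbols; frequent runs get short codewords."""
        freq = Counter(symbols)
        heap = [[w, i, [s, ""]] for i, (s, w) in enumerate(freq.items())]
        heapq.heapify(heap)
        i = len(heap)
        while len(heap) > 1:
            lo, hi = heapq.heappop(heap), heapq.heappop(heap)
            for node in lo[2:]:
                node[1] = "0" + node[1]
            for node in hi[2:]:
                node[1] = "1" + node[1]
            heapq.heappush(heap, [lo[0] + hi[0], i] + lo[2:] + hi[2:])
            i += 1
        return dict(heap[0][2:])

    test_data = "0000000011000000001111111100000000"
    runs = run_length_encode(test_data)
    code = huffman_code(runs)
    encoded = "".join(code[r] for r in runs)
    print(runs, code, f"{len(test_data)} data bits -> {len(encoded)} codeword bits", sep="\n")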

Keywords: Modified run length coding, multi-level selective Huffman coding, built-in self-test, modified selective Huffman coding, automatic test equipment.

120 Design and Development of an Efficient and Cost-Effective Microcontroller-Based Irrigation Control System to Enhance Food Security

Authors: Robert A. Sowah, Stephen K. Armoo, Koudjo M. Koumadi, Rockson Agyeman, Seth Y. Fiawoo

Abstract:

The development of the agricultural sector in Ghana has relied on the use of irrigation systems to ensure food security. However, the manual operation of these systems has not allowed them to reach maximum efficiency, due to human limitations. This paper addresses this problem by designing and implementing an efficient, cost-effective automated system which monitors and controls the water flow of an irrigation system through communication with an authorized operator via text messages. The automatic control component of the system is timer-based, with an ATmega32 microcontroller and a real-time clock from the SM5100B cellular module. For monitoring purposes, the system sends periodic SMS notifications about its performance to the authorized person(s). Moreover, the GSM-based irrigation monitoring and control system saves time and labour and reduces the cost of operating irrigation systems by lowering electricity usage and conserving water. Field tests have proven its operational efficiency and the ease of assessment of farm irrigation equipment, owing to its cost-effectiveness and data-logging capabilities.

Keywords: Agriculture, control system, data logging, food security, irrigation system, microcontroller.

119 Design of a Fuzzy Feed-forward Controller for Monitor HAGC System of Cold Rolling Mill

Authors: S. Khosravi, A. Afshar, F. Barazandeh

Abstract:

In this study we propose a novel monitor hydraulic automatic gauge control (HAGC) system based on a fuzzy feed-forward controller. It is used in the development of a cold rolling mill automation system to improve the quality of the cold strip. According to the features of the entry steel strip, such as its average yield stress, strip width, and desired exit thickness, the controller compensates for the exit thickness error. Traditional methods of adjusting the roller position cannot tolerate the variance in the entry steel strip. The proposed method uses a mathematical model of the system together with expert knowledge to perform this adjustment while minimizing the effect of the stated problem. In order to improve the speed of the controller in rejecting disturbances introduced by entry strip thickness variations, expert knowledge is added as a feed-forward term to the HAGC system. Simulation results for the application of the proposed controller to a real cold mill show that the exit strip quality is greatly improved.

Keywords: Fuzzy feed-forward controller, monitor HAGC system, dynamic mathematical model, entry strip thickness deviation compensation.

118 Automatic Visualization Pipeline Formation for Medical Datasets on Grid Computing Environment

Authors: Aboamama Atahar Ahmed, Muhammad Shafie Abd Latiff, Kamalrulnizam Abu Bakar, Zainul Ahmad Rajion

Abstract:

Distance visualization of large datasets often takes the form of remote viewing and zooming of stored static images. However, the continuous increase in the size of datasets and in visualization operations leads to insufficient performance on traditional desktop computers. Additionally, visualization techniques such as isosurface extraction depend on the available resources of the running machine and on the size of the datasets. Moreover, the continuous demand for powerful computing and the continuous growth of datasets result in an urgent need for a grid computing infrastructure. However, some issues arise in current grids, such as resource availability at client machines, which is not sufficient to process large datasets. On top of that, different output devices and different network bandwidths between the visualization pipeline components often produce output suitable for one machine but not for another. In this paper we investigate how grid services could be used to support remote visualization of large datasets and to break the constraint of physical co-location of the resources by applying grid computing technologies. We show our grid-enabled architecture for visualizing large medical datasets (circa 5 million polygons) for remote interactive visualization on clients with modest resources.

Keywords: Visualization, Grid computing, Medical datasets, visualization techniques, thin clients, Globus toolkit, VTK.

117 Mining Genes Relations in Microarray Data Combined with Ontology in Colon Cancer Automated Diagnosis System

Authors: A. Gruzdz, A. Ihnatowicz, J. Siddiqi, B. Akhgar

Abstract:

The MATCH project [1] entails the development of an automatic diagnosis system that aims to support the treatment of colon cancer by discovering mutations that occur in tumour suppressor genes (TSGs) and contribute to the development of cancerous tumours. The system is based on (a) colon cancer clinical data and (b) biological information derived by data mining techniques from genomic and proteomic sources. The core mining module will consist of popular, well-tested hybrid feature extraction methods and new combined algorithms designed especially for the project. Elements of rough sets, evolutionary computing, cluster analysis, self-organizing maps and association rules will be used to discover the relations between genes and their influence on tumours [2]-[11]. The methods used to process the data have to address its high complexity, potential inconsistency and missing values, and they must integrate all the useful information necessary to solve the expert's question. For this purpose, the system has to learn from data, or allow a domain specialist to interactively specify the part of the knowledge structure it needs to answer a given query. The program should also take into account the importance/rank of the particular parts of the data it analyses and adjust the algorithms used accordingly.

Keywords: Bioinformatics, gene expression, ontology, self-organizing maps.

116 Comparison among Various Question Generations for Decision Tree Based State Tying in Persian Language

Authors: Nasibeh Nasiri, Dawood Talebi Khanmiri

Abstract:

The performance of any continuous speech recognition system is highly dependent on the performance of the acoustic models. Generally, the development of robust spoken language technology relies on the availability of large amounts of data. A common way to cope with limited data for training each state of the Markov models is tree-based state tying, a method that applies contextual questions to tie states. Manual question generation suffers from human error and is time consuming, so various automatically generated questions are used to construct the decision tree. There are three approaches to generating questions for constructing HMMs based on a decision tree: one approach is based on misrecognized phonemes, another basically uses a feature table, and the third is based on state distributions corresponding to context-independent subword units. In this paper, all these methods of automatic question generation are applied to the decision tree on the FARSDAT corpus in the Persian language, and their results are compared with those of manually generated questions. The results show that automatically generated questions yield much better results and can replace manually generated questions in Persian.

Keywords: Decision Tree, Markov Models, Speech Recognition, State Tying.

115 A New Intelligent, Dynamic and Real Time Management System of Sewerage

Authors: R. Tlili Yaakoubi, H. Nakouri, O. Blanpain, S. Lallahem

Abstract:

The current tools for real-time management of sewer systems are based on two software tools: weather forecasting software and hydraulic simulation software. The use of the former is an important source of imprecision and uncertainty, while the use of the latter imposes long decision time steps because of its computation time; as a result, the obtained outcomes generally differ from those expected. The major idea of this project is to change the basic paradigm by approaching the problem from the "automatic control" side rather than from the "hydrology" side. The objective is to make possible a large number of simulations in a very short time (a few seconds), replacing weather forecasts by directly using real-time measured rainfall data. The aim is to reach a system in which decision-making is based on reliable data and error correction is permanent. A first model of control laws was built and tested with rainfalls of different return periods; the gains obtained in discharged volume vary from 19 to 100%. A new algorithm was then developed to optimize the calculation time and thus to overcome the combinatorial problem encountered in our first approach. Finally, this new algorithm was tested with a 16-year rainfall series. The obtained gains are 40% of the total volume discharged to the natural environment and 65% in the number of discharges.

Keywords: Automation, optimization, paradigm, RTC.

114 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6

Authors: M. Moslehpour, S. Khorsandi

Abstract:

Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEND protocol was proposed for automatic protection of the auto-configuration process; it secures the neighbor discovery and address resolution process. To defend against threats to NDP's integrity and identity, Cryptographically Generated Addresses (CGA) and asymmetric cryptography are used by SEND. Despite the advantages of SEND, its disadvantages, such as the computational cost of the CGA algorithm and the sequential nature of CGA generation, are considerable. In this paper, we parallelize this process between network resources in order to improve it. In addition, we compare the CGA generation time between the self-computing and the distributed-computing process, and we focus on the impact of malicious nodes on the CGA generation time in the network. According to the results, even when malicious nodes participate in the generation process, the CGA generation time is less than when it is computed by a single node. With a trust management system, detecting and isolating malicious nodes becomes easier.
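
A minimal, simplified sketch of the CGA hash-extension idea referenced above (not a full RFC 3972 implementation): the modifier is incremented until the SHA-1 digest of (modifier, prefix, public key) has the required number of leading zero bits, and the search range can be split across worker processes to parallelize generation. The security level, prefix, key bytes and range sizes are illustrative assumptions.

    import hashlib
    from multiprocessing import Pool

    SEC = 1                                            # require 16*SEC leading zero bits
    PREFIX = b"\x20\x01\x0d\xb8" + b"\x00" * 4         # example subnet prefix
    PUBKEY = b"example-public-key-bytes"               # placeholder encoded public key

    def leading_zero_bits(digest):
        bits = 0
        for byte in digest:
            if byte == 0:
                bits += 8
            else:
                bits += 8 - byte.bit_length()
                break
        return bits

    def search_modifier(start_stop):
        """Scan a sub-range of modifier values; return the first that satisfies the hash condition."""
        start, stop = start_stop
        for m in range(start, stop):
            modifier = m.to_bytes(16, "big")
            digest = hashlib.sha1(modifier + PREFIX + PUBKEY).digest()
            if leading_zero_bits(digest) >= 16 * SEC:
                return m
        return None

    if __name__ == "__main__":
        chunk = 200_000
        ranges = [(i * chunk, (i + 1) * chunk) for i in range(8)]
        with Pool(4) as pool:                          # distribute the modifier search over workers
            hits = [m for m in pool.map(search_modifier, ranges) if m is not None]
        print("valid modifiers found:", hits[:3])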

Keywords: NDP, IPsec, SEND, CGA, Modifier, Malicious node, Self-Computing, Distributed-Computing.

113 View-Point Insensitive Human Pose Recognition using Neural Network

Authors: Sanghyeok Oh, Yunli Lee, Kwangjin Hong, Kirak Kim, Keechul Jung

Abstract:

This paper proposes a view-point insensitive human pose recognition system using a neural network. The recognition system consists of a silhouette image capturing module, a data-driven database, and a neural network. The advantages of our system are as follows. First, it is possible to capture multiple view-point silhouette images of a 3D human model automatically; this automatic capture module helps to reduce the time-consuming task of database construction. Second, we develop a huge feature database to offer view-point insensitivity in pose recognition. Third, we use a neural network to recognize human pose from multiple views, because every pose from each model has similar feature patterns, even though each model has a different appearance and view-point. To construct the database, we create 3D human models using 3D modelling tools. The contour shape is used to convert each silhouette image into a 12-dimensional feature vector. This extraction task is processed semi-automatically, which has the benefit that capturing images and converting them to silhouette images in a real capturing environment is unnecessary. We demonstrate the effectiveness of our approach with experiments in a virtual environment.

Keywords: Computer vision, neural network, pose recognition, view-point insensitive.

112 Event Information Extraction System (EIEE): FSM vs HMM

Authors: Shaukat Wasi, Zubair A. Shaikh, Sajid Qasmi, Hussain Sachwani, Rehman Lalani, Aamir Chagani

Abstract:

Automatic extraction of event information from social text streams (emails, social network sites, blogs, etc.) is a vital requirement for many applications, such as event planning and management systems and security applications. The key information components needed from event-related text are the event title, location, participants, date and time. Emails have a very distinctive layout, format and conversation style compared with other social text streams and are the most commonly used communication channel for broadcasting and planning events; therefore we have chosen emails as our dataset. In our work, we have employed two statistical NLP methods, Finite State Machines (FSM) and Hidden Markov Models (HMM), for the extraction of event-related contextual information. An application has been developed providing a comparison of the two methods on the event extraction task. It comprises two modules, one for each method, and works for both bulk and direct user input. The results are evaluated using precision, recall and F-score. Experiments show that both methods produce high performance and accuracy; however, HMM performed better on title extraction, while FSM proved to be better for venue, date, and time.

Keywords: Emails, Event Extraction, Event Detection, Finite state machines, Hidden Markov Model.

111 Development of an Indoor Drone Designed for the Needs of the Creative Industries

Authors: V. Santamarina Campos, M. de Miguel Molina, S. Kröner, B. de Miguel Molina

Abstract:

With this contribution, we want to show how the AiRT system could change the future way of working of a part of the creative industries and what new economic opportunities could arise for them. Remotely Piloted Aircraft Systems (RPAS), more commonly known as drones, are now essential tools used by many different companies for their creative outdoor work. However, using this very flexible tool indoors is almost impossible, since safe navigation cannot be guaranteed by the operator due to the lack of a reliable and affordable indoor positioning system which ensures a stable flight, among other issues. Here we present the first results of a European project which consists of developing an indoor drone for professional footage, especially designed for the creative industries. One of the main achievements of this project is the successful involvement of the end-users in the overall design process from the very beginning. To ensure safe flight in confined spaces, our drone incorporates a positioning system based on ultra-wide-band technology, an RGB-D (depth) camera for 3D environment reconstruction and the possibility to fully pre-program automatic flights. Since we also want to offer this tool to inexperienced pilots, we have focused on user-friendly handling of the whole system throughout the entire process.

Keywords: Virtual reality, 3D reconstruction, indoor positioning system, UWB, RPAS, aerial film, intelligent navigation, advanced safety measures, creative industries.

110 ANN Based Currency Recognition System using Compressed Gray Scale and Application for Sri Lankan Currency Notes - SLCRec

Authors: D. A. K. S. Gunaratna, N. D. Kodikara, H. L. Premaratne

Abstract:

Automatic currency note recognition invariably depends on the characteristics of a particular country's currency notes, and the extraction of features directly affects the recognition ability. Sri Lanka has not previously been involved in any research or implementation of this kind. The proposed system, "SLCRec", offers a solution focused on minimizing the false rejection of notes. Sri Lankan currency notes undergo severe changes in image quality during use. Hence, a special linear transformation function is adopted to wipe out noise patterns from the backgrounds without affecting the notes' characteristic images, and to recover the images of interest. The transformation maps the original gray-scale range into a smaller range of 0 to 125. Applying edge detection after the transformation provides better robustness to noise and a fair representation of edges for both new and old damaged notes. A three-layer back-propagation neural network is fed with the number of edges detected in row order of the notes, and classification is performed over four classes of interest: the 100, 500, 1000 and 2000 rupee notes. The experiments showed good classification results and proved that the proposed methodology is capable of separating the classes properly under varying image conditions.
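
A minimal sketch of the pre-processing described above, assuming a note image already loaded as a 2-D gray-scale array: the intensity range is linearly mapped into 0-125, a simple horizontal-gradient threshold stands in for the edge detector, and the per-row edge counts form the feature vector that would be fed to the back-propagation network. The threshold value and the random test image are illustrative assumptions.

    import numpy as np

    def row_edge_features(gray, out_max=125, edge_thresh=20):
        """gray: 2-D uint8 array of a currency note image."""
        g = gray.astype(np.float64)
        # Linear transformation: compress the original gray-scale range into 0..out_max
        g = (g - g.min()) / max(g.max() - g.min(), 1e-9) * out_max
        # Crude edge detection: horizontal intensity differences above a threshold
        edges = np.abs(np.diff(g, axis=1)) > edge_thresh
        # Feature vector: number of detected edges in each row of the note
        return edges.sum(axis=1)

    # Illustrative use with a random image standing in for a scanned note
    note = (np.random.rand(64, 256) * 255).astype(np.uint8)
    features = row_edge_features(note)
    print(features.shape, features[:10])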

Keywords: Artificial intelligence, linear transformation and pattern recognition.

109 Inferring Hierarchical Pronunciation Rules from a Phonetic Dictionary

Authors: Erika Pigliapoco, Valerio Freschi, Alessandro Bogliolo

Abstract:

This work presents a new phonetic transcription system based on a tree of hierarchical pronunciation rules expressed as context-specific grapheme-phoneme correspondences. The tree is automatically inferred from a phonetic dictionary by incrementally analyzing deeper context levels, eventually representing a minimum set of exhaustive rules that pronounce without errors all the words in the training dictionary and that can be applied to out-of-vocabulary words. The proposed approach improves upon existing rule-tree-based techniques in that it uses graphemes, rather than letters, as elementary orthographic units. A new linear algorithm for the segmentation of a word into graphemes is introduced to enable out-of-vocabulary grapheme-based phonetic transcription. Exhaustive rule trees provide a canonical representation of the pronunciation rules of a language that can be used not only to pronounce out-of-vocabulary words, but also to analyze and compare the pronunciation rules inferred from different dictionaries. The proposed approach has been implemented in C and tested on Oxford British English and Basic English. Experimental results show that grapheme-based rule trees represent phonetically sound rules and provide better performance than letter-based rule trees.

Keywords: Automatic phonetic transcription, pronunciation rules, hierarchical tree inference.

108 Recognizing an Individual, Their Topic of Conversation, and Cultural Background from 3D Body Movement

Authors: Gheida J. Shahrour, Martin J. Russell

Abstract:

The 3D body movement signals captured during human-human conversation include clues not only to the content of people's communication but also to their culture and personality. This paper is concerned with the automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects and arranged them into groups according to their culture. We arranged each group into pairs, and each pair communicated about different topics. A state-of-the-art recognition system is applied to the problems of person, culture, and topic recognition, borrowing modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining 77.78%, 55.47%, and 39.06% accuracy for the person, culture, and topic recognition systems respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition respectively. Although direct comparison among these three recognition systems is difficult, it seems that our person recognition system performs best for both GMM and GMM-SVM, suggesting that inter-subject differences (i.e. a subject's personality traits) are a major source of variation. When removing these traits from the culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and the Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from the culture and topic recognition systems respectively.
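
A minimal sketch of the GMM classification stage described above: one Gaussian mixture is fitted per class (person, culture, or topic), and a test sequence of body-movement feature vectors is assigned to the class whose mixture gives the highest average log-likelihood. The data shapes, mixture size and placeholder features are illustrative assumptions.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    n_classes, dim = 3, 12
    # Placeholder training data: a set of feature frames per class
    train = {c: rng.normal(loc=c, size=(500, dim)) for c in range(n_classes)}

    # Fit one GMM per class on that class's feature frames
    models = {c: GaussianMixture(n_components=8, covariance_type="diag",
                                 random_state=0).fit(X) for c, X in train.items()}

    def classify(frames):
        """Pick the class whose GMM gives the highest mean log-likelihood."""
        scores = {c: gmm.score(frames) for c, gmm in models.items()}   # mean log-likelihood per frame
        return max(scores, key=scores.get)

    test_frames = rng.normal(loc=2, size=(200, dim))   # an unseen recording
    print("predicted class:", classify(test_frames))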

Keywords: Person Recognition, Topic Recognition, Culture Recognition, 3D Body Movement Signals, Variability Compensation.

107 A Mathematical Representation for Mechanical Model Assessment: Numerical Model Qualification Method

Authors: Keny Ordaz-Hernandez, Xavier Fischer, Fouad Bennis

Abstract:

This article illustrates a model selection management approach for virtual prototypes in interactive simulations. In these numerical simulations, the virtual prototype and its environment are modelled as a multi-agent system, where every entity (prototype, human, etc.) is modelled as an agent. In particular, we consider virtual prototyping agents that provide mathematical models of mechanical behaviour in the form of computational methods. This work argues that the selection of an appropriate model in a changing environment, supported by the models' characteristics, can be managed by determining a priori specific exploitation and performance measures of the virtual prototype models. As different models exist to represent a single phenomenon, it is not always possible to select the best one under all possible circumstances of the environment; instead, the most appropriate model should be selected according to the use case. The proposed approach consists of identifying relevant metrics or indicators for each group of models (e.g. entity models, global model), formulating their qualification, analysing the performance, and applying the qualification criteria. A model can then be selected based on the performance prediction obtained from its qualification. The authors hope that this approach will not only inform engineers and researchers about another approach for selecting virtual prototype models, but also assist virtual prototype engineers in systematic or automatic model selection.

Keywords: Virtual prototype models, domain, qualification criterion, model qualification, model assessment, environmental modelling.

106 Localization of Geospatial Events and Hoax Prediction in the UFO Database

Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi

Abstract:

Unidentified Flying Objects (UFOs) are an interesting topic for many enthusiasts, and people all over the United States report such sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are hoaxes, and among those that seem legitimate, our task is not to establish that these events are indeed related to flying objects from aliens in outer space. Rather, we intend to identify whether a report was a hoax, as identified by the UFO database team with their existing curation criteria. The database provides a wealth of information that can be exploited for various analyses and insights, such as social reporting, identifying real-time spatial events and much more. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events happen in geospatial clusters and are also time-based. We use cluster density and data visualization to search the space of possible cluster realizations and decide on the most probable clusters that provide information about the proximity of such activity. A random forest classifier is also presented to identify true events and hoax events, using the best available features such as region, week, time period and duration. Lastly, we show the performance of the scheme on various days and correlate it with real-time events; one of the UFO reports strongly correlates with a missile test conducted in the United States.
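
A minimal sketch of the two analysis steps described above, with placeholder data standing in for the NUFORC reports: DBSCAN groups report coordinates into spatial clusters, and a random forest is trained on simple tabular features (region, week, time period, duration) to separate hoax from non-hoax reports. The feature encodings and synthetic values are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import DBSCAN
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)

    # Geospatial clustering of report locations (lat, lon placeholders)
    coords = np.vstack([rng.normal([40.7, -74.0], 0.2, (80, 2)),      # a dense cluster
                        rng.normal([34.0, -118.2], 0.2, (60, 2)),     # another cluster
                        rng.uniform([25, -125], [49, -67], (40, 2))]) # scattered noise
    labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(coords)
    print("clusters found:", len(set(labels)) - (1 if -1 in labels else 0))

    # Hoax prediction from tabular report features (all placeholder values)
    X = np.column_stack([rng.integers(0, 50, 500),     # region code
                         rng.integers(0, 52, 500),     # week of year
                         rng.integers(0, 4, 500),      # time period of day
                         rng.exponential(300, 500)])   # sighting duration in seconds
    y = rng.integers(0, 2, 500)                        # 1 = hoax, 0 = legitimate
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("hoax-classifier accuracy:", clf.score(X_te, y_te))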

Keywords: Time-series clustering, feature extraction, hoax prediction, geospatial events.

105 Total and Partial Factor Productivity Analysis of Irrigated Wheat in Iran by Separate of Exploitation Scales

Authors: Hassan Masoumi, Rashed Alavi

Abstract:

Wheat is one of the strategic crops in Iran, on which the household food basket is highly dependent. Although this crop is cultivated and produced in almost all provinces of the country, its production efficiency is lower than the global and regional averages due to the lack of optimal use of allocated resources. In this research, carried out with a documentary and library method, the total and partial productivity indices of irrigated wheat production were first calculated for large, medium and small exploitation scales in different provinces of the country, and the provinces were then clustered in terms of these indices. The results showed that the total factor productivity was directly correlated with the scale of exploitation, so that the total productivity index increased with the size of the exploitations. On the small exploitation scale, North Khorasan, Zanjan, and Chaharmahal and Bakhtiari; on the medium scale, Chaharmahal and Bakhtiari; and on the large exploitation scale, Zanjan, Chaharmahal and Bakhtiari, Kohkiloyeh and Boyer Ahmad, and North Khorasan made better use of production resources than the other provinces and were placed in the best cluster in terms of the total productivity index. The high total productivity index in Zanjan and Chaharmahal and Bakhtiari is related to the higher productivity of factors such as mechanization and land in these provinces. Finally, the methods of using these factors in the productive provinces, along with technical and specialized regional guidelines, can facilitate the improvement of productivity in the less productive provinces.

Keywords: Clustering, Irrigated wheat, Iran, total productivity.

104 Application of the Total Least Squares Estimation Method for an Aircraft Aerodynamic Model Identification

Authors: Zaouche Mohamed, Amini Mohamed, Foughali Khaled, Aitkaid Souhila, Bouchiha Nihad Sarah

Abstract:

The aerodynamic coefficients are important in the evaluation of an aircraft's performance and stability-and-control characteristics. These coefficients can also be used in automatic flight control systems and in the mathematical model of a flight simulator. The study of the aerodynamic aspects of flying systems is a restricted domain that is largely inaccessible to developers, as performing wind tunnel tests to extract aerodynamic forces and moments requires specific and expensive means. Besides, the glaring lack of published documentation in this field of study makes the determination of the aerodynamic coefficients complicated. This work is devoted to the identification of an aerodynamic model using an aircraft in a virtual simulated environment. We deal with the identification of the system, present an environment framework based on the Software-In-the-Loop (SIL) methodology, and use Microsoft Flight Simulator (FS-2004) as the environment for the aircraft simulation. We propose the Total Least Squares Estimation technique (TLSE) to identify the aerodynamic parameters, which are unknown, variable, classified and used in the expression of the piloting law. In this paper, we define each aerodynamic coefficient as the mean of its numerical values; all other variations are considered as modeling uncertainties that will be compensated by the robustness of the piloting control.
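
A minimal sketch of the total least squares estimation technique named above, applied to a generic linear model A x ≈ b rather than the aircraft model itself: unlike ordinary least squares, TLS perturbs both the regressor matrix and the observations, and the classical SVD-based solution takes the parameters from the right singular vector associated with the smallest singular value of the augmented matrix [A | b]. The "flight data" here are random placeholders.

    import numpy as np

    def total_least_squares(A, b):
        """Classical SVD-based TLS solution of A x ~= b (A: m x n, b: m)."""
        m, n = A.shape
        C = np.hstack([A, b.reshape(-1, 1)])   # augmented matrix [A | b]
        _, _, Vt = np.linalg.svd(C)
        v = Vt[-1]                             # right singular vector for the smallest singular value
        return -v[:n] / v[n]                   # TLS parameter estimate

    # Placeholder "flight data": regressors and measurements both contain noise
    rng = np.random.default_rng(0)
    x_true = np.array([0.8, -1.5, 0.3])
    A_clean = rng.normal(size=(400, 3))
    b = A_clean @ x_true + rng.normal(scale=0.05, size=400)
    A = A_clean + rng.normal(scale=0.05, size=A_clean.shape)

    print("TLS estimate:", total_least_squares(A, b))
    print("OLS estimate:", np.linalg.lstsq(A, b, rcond=None)[0])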

Keywords: Aircraft aerodynamic model, Microsoft flight simulator, MQ-1 Predator, total least squares estimation, piloting the aircraft.

103 ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region

Authors: M. Cuevas-González, O. Monserrat, A. Barra, C. Reyes-Carmona, R. M. Mateos, J. P. Galve, R. Sarro, M. Cantalejo, E. Peña, M. Martínez-Corbella, J. A. Luque, J. M. Azañón, A. Millares, M. Béjar, J. A. Navarro, L. Solari

Abstract:

Geohazard-prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions and prevent disasters. Satellite interferometry (InSAR) has become a trustworthy technique for ground movement detection and monitoring in the last few years. InSAR-based techniques allow large areas to be processed, providing a high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy for non-experienced users to interpret, hampering their use by decision makers. This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use and interpretation of InSAR-based results. It provides a semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will offer reliable and systematic information on natural and anthropogenic ground motion phenomena across Europe.

Keywords: Ground displacements, InSAR, natural hazards, satellite imagery.

102 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference

Authors: Hussein Alahmer, Amr Ahmed

Abstract:

Liver cancer is one of the common diseases that cause death, and early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
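
A minimal sketch of the contrasting feature-difference idea described above, assuming the lesion and its surrounding normal tissue are given as binary masks on a gray-scale slice: simple intensity and texture statistics are computed inside each region, and their difference forms the lesion descriptor used to train a classifier. The specific statistics, the circular masks, the random images and the SVM classifier are illustrative choices, not necessarily those of the paper.

    import numpy as np
    from sklearn.svm import SVC

    def region_features(img, mask):
        """Basic intensity/texture statistics of the pixels selected by mask."""
        vals = img[mask].astype(np.float64)
        grad = np.abs(np.gradient(img.astype(np.float64))[0])[mask]   # crude texture proxy
        return np.array([vals.mean(), vals.std(), np.percentile(vals, 90),
                         grad.mean(), grad.std()])

    def lesion_descriptor(img, lesion_mask, surround_mask):
        """Contrasting feature difference: lesion features minus surrounding-tissue features."""
        return region_features(img, lesion_mask) - region_features(img, surround_mask)

    # Placeholder data: random "slices" with a circular lesion and a ring-shaped surrounding mask
    rng = np.random.default_rng(0)
    yy, xx = np.mgrid[:128, :128]
    lesion_mask = (yy - 64) ** 2 + (xx - 64) ** 2 < 15 ** 2
    surround_mask = ((yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2) & ~lesion_mask

    X = np.array([lesion_descriptor(rng.normal(100, 20, (128, 128)), lesion_mask, surround_mask)
                  for _ in range(60)])
    y = rng.integers(0, 2, 60)                 # 0 = benign, 1 = malignant (placeholder labels)
    clf = SVC().fit(X, y)
    print("training accuracy on placeholder data:", clf.score(X, y))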

Keywords: CAD system, feature difference, fuzzy c-means, liver segmentation.

101 Unbalanced Distribution Optimal Power Flow to Minimize Losses with Distributed Photovoltaic Plants

Authors: Malinwo Estone Ayikpa

Abstract:

Electric power systems are expected to operate with minimum losses and with voltages meeting international standards. This is generally made possible by control actions provided by automatic voltage regulators, capacitors and transformers with on-load tap changers (OLTC). With the development of photovoltaic (PV) system technology, their integration into distribution networks has increased over the last years to the extent of replacing the above-mentioned techniques. The conventional analysis and simulation tools used for electrical networks are no longer able to take into account the control actions necessary for studying the impact of distributed PV generation. This paper presents an unbalanced optimal power flow (OPF) model that minimizes losses by combining the active power generation and the reactive power control of single-phase and three-phase PV systems. Reactive power can be generated or absorbed using the available capacity and the adjustable power factor of the inverter. The unbalanced OPF is formulated with current balance equations and solved by the primal-dual interior point method. Several simulation cases have been carried out, varying the size and location of the PV systems, and the results give a detailed view of the impact of distributed PV generation on distribution systems.
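
A small numerical illustration of the inverter reactive-power capability mentioned above (the apparent-power rating and power-factor limit are illustrative values, not the paper's data): the reactive power a PV inverter can generate or absorb is bounded both by its remaining apparent-power headroom, sqrt(S_rated^2 - P^2), and by its adjustable power-factor limit.

    import math

    def available_reactive_power(p_kw, s_rated_kva, min_power_factor=0.9):
        """Reactive power (kvar) a PV inverter can inject or absorb at active power p_kw."""
        q_capacity = math.sqrt(max(s_rated_kva**2 - p_kw**2, 0.0))   # apparent-power headroom
        q_pf_limit = p_kw * math.tan(math.acos(min_power_factor))    # power-factor constraint
        return min(q_capacity, q_pf_limit)

    for p in (0.0, 2.0, 4.0, 5.0):             # active power output of a 5 kVA inverter
        print(f"P = {p:.1f} kW -> |Q| <= {available_reactive_power(p, 5.0):.2f} kvar")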

Keywords: Distribution system, losses, photovoltaic generation, primal-dual interior point method, reactive power control.

100 Texture Based Weed Detection Using Multi Resolution Combined Statistical and Spatial Frequency (MRCSF)

Authors: R.S.Sabeenian, V.Palanisamy

Abstract:

Texture classification is a popular technique in the field of texture analysis. Textures, i.e. repeated patterns, have different frequency components along different orientations. Our work is based on texture classification and its applications, which are found in various fields such as medical image classification, computer vision, remote sensing, agriculture, and the textile industry. Weed control has a major effect on agriculture: a large amount of herbicide is used for controlling weeds in agricultural fields, lawns, golf courses, sports fields, etc. Random spraying of herbicides does not meet the exact requirements of the field, since certain areas have more weed patches than estimated. So, we need a visual system that can discriminate weeds from the crop in the field image, which will reduce or even eliminate the amount of herbicide used. This would allow farmers not to use any herbicides or to apply them only where they are needed; a machine vision precision automated weed control system could reduce the usage of chemicals in crop fields. In this paper, an intelligent system for an automatic weeding strategy, Multi-Resolution Combined Statistical and Spatial Frequency (MRCSF), is used to discriminate the weeds from the crops and to classify them as narrow, little and broad weeds.
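
A minimal sketch of the kind of combined statistical and spatial-frequency texture features named above, computed at several resolutions of an image patch. The specific statistics, the FFT band energies and the block-averaging scheme are illustrative assumptions, not the exact MRCSF formulation.

    import numpy as np

    def texture_features(patch, levels=3):
        """Statistical + spatial-frequency features of a gray-scale patch at several resolutions."""
        feats = []
        p = patch.astype(np.float64)
        for _ in range(levels):
            # Statistical part: first-order statistics of the intensities
            feats += [p.mean(), p.std(), ((p - p.mean()) ** 3).mean()]
            # Spatial-frequency part: energy in low and high FFT bands
            spec = np.abs(np.fft.fftshift(np.fft.fft2(p))) ** 2
            h, w = spec.shape
            low = spec[h // 4: 3 * h // 4, w // 4: 3 * w // 4].sum()
            feats += [low, spec.sum() - low]
            # Next (coarser) resolution: 2x2 block averaging
            p = p[: p.shape[0] // 2 * 2, : p.shape[1] // 2 * 2]
            p = (p[::2, ::2] + p[1::2, ::2] + p[::2, 1::2] + p[1::2, 1::2]) / 4.0
        return np.array(feats)

    # Illustrative use: a random patch standing in for a field image region
    patch = np.random.rand(64, 64)
    print(texture_features(patch).round(3))

Such feature vectors could then be fed to any standard classifier to label a region as crop or as narrow, little or broad weed.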

Keywords: Crop-weed discrimination, MRCSF, MRFM, weed detection, spatial frequency.
