Search results for: fingerprint classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1166

686 A Multilanguage Source Code Retrieval System Using Structural-Semantic Fingerprints

Authors: Mohamed Amine Ouddan, Hassane Essafi

Abstract:

Source code retrieval is of immense importance in the software engineering field. The complex tasks of retrieving and extracting information from source code documents are vital in the development cycle of large software systems. The two main subtasks which result from these activities are code duplication prevention and plagiarism detection. In this paper, we propose a source code retrieval system based on a two-level fingerprint representation capturing, respectively, the structural and the semantic information within a source code. A sequence alignment technique is applied to these fingerprints in order to quantify the similarity between source code portions. The specific purpose of the system is to detect plagiarism and duplicated code between programs written in different programming languages belonging to the same class, such as C, Cµ, Java and CSharp. These four languages are supported by the current version of the system, which is designed such that it may be easily adapted to any programming language.
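
The alignment step can be illustrated with a minimal sketch (illustrative only; the token names and the use of Python's standard difflib are assumptions, not the authors' implementation): two hypothetical structural fingerprints are aligned and a similarity ratio is reported.

```python
from difflib import SequenceMatcher

# Hypothetical structural fingerprints: each program is abstracted into a
# sequence of language-independent structural tokens.
fingerprint_a = ["FUNC", "LOOP", "ASSIGN", "CALL", "RETURN", "FUNC", "COND", "CALL"]
fingerprint_b = ["FUNC", "LOOP", "ASSIGN", "CALL", "FUNC", "COND", "CALL", "RETURN"]

# SequenceMatcher performs a longest-matching-block alignment and returns a
# similarity ratio in [0, 1]; higher values suggest duplicated or plagiarised code.
matcher = SequenceMatcher(None, fingerprint_a, fingerprint_b)
print(f"similarity = {matcher.ratio():.2f}")

# The aligned blocks show which fingerprint regions correspond to each other.
for block in matcher.get_matching_blocks():
    print(block)
```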

Keywords: Source code retrieval, plagiarism detection, clone detection, sequence alignment.

685 Differential Protection for Power Transformer Using Wavelet Transform and PNN

Authors: S. Sendilkumar, B. L. Mathur, Joseph Henry

Abstract:

A new approach for the protection of power transformers is presented using a time-frequency transform known as the wavelet transform. Different operating conditions, such as inrush, normal, load, external fault and internal fault currents, are sampled and processed to obtain wavelet coefficients. Different operating conditions produce variations in the wavelet coefficients. Features such as energy and standard deviation are calculated using Parseval's theorem. These features are used as inputs to a Probabilistic Neural Network (PNN) for fault classification. The proposed algorithm provides more accurate results even in the presence of noisy inputs and accurately identifies inrush and fault currents. The overall classification accuracy of the proposed method is found to be 96.45%. Simulation of the faults (with and without noise) was done using MATLAB and Simulink software, taking a two-cycle data window (40 ms) containing 800 samples. The algorithm was evaluated using 10% Gaussian white noise.
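
As a rough illustration of the feature-extraction step described above, the following sketch computes per-level wavelet energy and standard deviation with the PyWavelets library and the Db9 wavelet named in the keywords; the window length, decomposition level and synthetic test current are assumptions, and the PNN classifier itself is not reproduced here.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_features(current_window, wavelet="db9", level=4):
    """Energy and standard deviation of the detail coefficients at each level.

    By Parseval's theorem the signal energy is preserved in the wavelet
    coefficients, so per-level coefficient energy is a meaningful feature.
    """
    coeffs = pywt.wavedec(current_window, wavelet, level=level)
    features = []
    for c in coeffs[1:]:                   # detail coefficients, coarsest to finest
        features.append(np.sum(c ** 2))    # energy
        features.append(np.std(c))         # standard deviation
    return np.array(features)

# Hypothetical 800-sample differential-current window (two cycles, 40 ms).
t = np.linspace(0, 0.04, 800, endpoint=False)
window = np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.randn(800)
print(wavelet_features(window))
```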

Keywords: Power transformer, differential protection, internal fault, inrush current, wavelet energy, Db9.

684 Classification of Initial Stripe Height Patterns using Radial Basis Function Neural Network for Proportional Gain Prediction

Authors: Prasit Wonglersak, Prakarnkiat Youngkong, Ittipon Cheowanish

Abstract:

This paper aims to improve the fine lapping process of hard disk drive (HDD) lapping machines by removing material from each slider while controlling the stripe height (SH) variation to a minimum value. The standard deviation is the key parameter used to evaluate the stripe height variation, and hence it is minimized. In this paper, a design of experiments (DOE) with factorial analysis by two-way analysis of variance (ANOVA) is adopted to obtain statistical information. The statistical results reveal that the initial stripe height patterns affect the final SH variation. Therefore, initial SH pattern classification using a radial basis function neural network is implemented to achieve the proportional gain prediction.

Keywords: Stripe height variation, Two-way analysis of variance (ANOVA), Radial basis function neural network, Proportional gain prediction.

683 Discrimination of Seismic Signals Using Artificial Neural Networks

Authors: Mohammed Benbrahim, Adil Daoudi, Khalid Benjelloun, Aomar Ibenbrahim

Abstract:

The automatic discrimination of seismic signals is an important practical goal for earth-science observatories due to the large amount of information that they receive continuously. An essential discrimination task is to allocate the incoming signal to a group associated with the kind of physical phenomenon producing it. In this paper, two classes of seismic signals recorded routinely in the geophysical laboratory of the National Center for Scientific and Technical Research in Morocco are considered. They correspond to signals associated with local earthquakes and chemical explosions. The approach adopted for the development of an automatic discrimination system is a modular system composed of three blocks: 1) representation, 2) dimensionality reduction and 3) classification. The originality of our work lies in the use of a new wavelet called the "modified Mexican hat wavelet" in the representation stage. For the dimensionality reduction, we propose a new algorithm based on random projection and principal component analysis.
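
A minimal sketch of the dimensionality-reduction and classification stages, assuming scikit-learn as a stand-in (the paper's own random-projection/PCA algorithm and the modified Mexican hat representation are not reproduced): a random projection is followed by PCA and an ANN classifier on synthetic two-class data.

```python
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical wavelet-representation vectors for two signal classes
# (local earthquakes vs. chemical explosions); dimensions are illustrative.
X = np.vstack([rng.normal(0.0, 1.0, (100, 512)),
               rng.normal(0.5, 1.0, (100, 512))])
y = np.array([0] * 100 + [1] * 100)

# Random projection first compresses the 512-dimensional representation cheaply;
# PCA then retains the main directions of variance; an ANN does the discrimination.
model = make_pipeline(
    GaussianRandomProjection(n_components=64, random_state=0),
    PCA(n_components=10),
    MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000, random_state=0),
)
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```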

Keywords: Seismic signals, Wavelets, Dimensionality reduction, Artificial neural networks, Classification.

682 Profit and Nonprofit Sports Clubs: Financial and Organizational Comparison in Poland

Authors: Wojciech B. Cieśliński, Igor Perechuda

Abstract:

The paper identifies the features of Polish sports clubs in two particular organizational forms: profit and nonprofit. The identification and description of these features is carried out in terms of the financial efficiency of the given organizational form. In terms of efficiency, the research makes it possible to specify the advantages and limitations of each organizational form of sports club. The paper considers the features of sports clubs under Polish conditions, such as legal regulations. The sources of the functioning efficiency of sports clubs may lie in the organizational forms in which they operate. Each of the available forms can be considered either a for-profit or a nonprofit enterprise. Depending on this classification, there are different possibilities for increasing the organizational and financial efficiency of a given sports club. The authors start with a general classification and the differences between for-profit and non-profit sports clubs, then identify the specific financial and organizational conditions of both organizational forms, and finally show examples of mixed activity forms and their effect on efficiency.

Keywords: Financial efficiency, for-profit, non-profit, sports club.

681 A World Map of Seabed Sediment Based on 50 Years of Knowledge

Authors: T. Garlan, I. Gabelotaud, S. Lucas, E. Marchès

Abstract:

Production of a global sedimentological seabed map was initiated in 1995 to provide the necessary tool for searches of aircraft and boats lost at sea, to give sedimentary information for nautical charts, and to provide input data for acoustic propagation modelling. This original approach had already been initiated one century ago, when the French hydrographic service and the University of Nancy produced maps of the distribution of marine sediments off the French coasts and then sediment maps of the continental shelves of Europe and North America. The current map of ocean sediments presented here was initiated from UNESCO's general map of the deep ocean floor. This map was adapted using a single sediment classification to present all types of sediments: from beaches to the deep seabed and from glacial deposits to tropical sediments. In order to allow good visualization and to be adapted to different applications, only the granularity of the sediments is represented. The published seabed maps are studied; if they are of interest, the nature of the seabed is extracted from them, the sediment classification is transcribed, and the resulting map is integrated into the world map. Data also come from interpretations of Multibeam Echo Sounder (MES) imagery from large deep-ocean hydrographic surveys. These allow a very high-quality mapping of areas that until then were represented as homogeneous. The third and principal source of data comes from the integration of regional maps produced specifically for this project. These regional maps are produced using all the bathymetric and sedimentary data of a region. This step makes it possible to produce a regional synthesis map, with generalizations applied in the case of over-precise data. 86 regional maps of the Atlantic Ocean, the Mediterranean Sea, and the Indian Ocean have been produced and integrated into the world sedimentary map. This work is ongoing and allows a new digital version every two years, with the integration of new maps. This article describes the choices made in terms of sediment classification, the scale of the source data, and the zonation of the variability of the quality. This map is the final step in a system comprising the Shom Sedimentary Database, enriched by more than one million point and surface data items, and four series of coastal seabed maps at 1:10,000, 1:50,000, 1:200,000 and 1:1,000,000. This step-by-step approach makes it possible to take into account the progress in knowledge made in the field of seabed characterization during the last decades. Thus, the arrival of new classification systems for the seafloor has improved the recent seabed maps, and the compilation of these new maps with those previously published allows a gradual enrichment of the world sedimentary map. But there is still a lot of work to do to enhance some regions, which are still based on data acquired more than half a century ago.

Keywords: Marine sedimentology, seabed map, sediment classification, World Ocean.

680 Real-Time Specific Weed Recognition System Using Histogram Analysis

Authors: Irshad Ahmad, Abdul Muhamin Naeem, Muhammad Islam

Abstract:

Information on weed distribution within the field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be a feasible alternative. This paper deals with the development of an algorithm for a real-time specific weed recognition system based on histogram analysis of an image, which is used for weed classification. The algorithm is specifically developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed system has been tested on weeds in the laboratory, and the tests have shown the system to be very effective in weed identification. Furthermore, the results show very reliable performance on images of weeds taken under varying field conditions. The analysis of the results shows over 95 percent classification accuracy on 140 sample images (broad and narrow), with 70 samples from each category of weeds.
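
A minimal sketch of histogram-based classification on invented data (the paper's actual histogram features and decision rule are not specified in the abstract): each image is reduced to a normalized intensity histogram and assigned to the class with the nearest mean training histogram.

```python
import numpy as np

def histogram_feature(gray_image, bins=32):
    """Normalized intensity histogram used as the feature vector for an image."""
    hist, _ = np.histogram(gray_image, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

def classify(feature, prototypes):
    """Assign the class whose mean training histogram is closest (Euclidean)."""
    return min(prototypes, key=lambda c: np.linalg.norm(feature - prototypes[c]))

# Hypothetical stand-in data: two image classes whose intensity histograms
# differ in shape (one skewed bright, one skewed dark).
rng = np.random.default_rng(1)
broad = [rng.beta(5, 2, (64, 64)) for _ in range(70)]
narrow = [rng.beta(2, 5, (64, 64)) for _ in range(70)]

prototypes = {
    "broad": np.mean([histogram_feature(im) for im in broad[:35]], axis=0),
    "narrow": np.mean([histogram_feature(im) for im in narrow[:35]], axis=0),
}
test = broad[35:] + narrow[35:]
labels = ["broad"] * 35 + ["narrow"] * 35
accuracy = np.mean([classify(histogram_feature(im), prototypes) == lab
                    for im, lab in zip(test, labels)])
print(f"accuracy = {accuracy:.2f}")
```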

Keywords: Image processing, real-time recognition, weed detection.

679 Establishment of Air Quality Zones in Italy

Authors: M. G. Dirodi, G. Gugliotta, C. Leonardi

Abstract:

Member States shall establish zones and agglomerations throughout their territory to assess and manage air quality in order to comply with European directives. In Italy, Decree 155/2010, transposing Directive 2008/50/EC on ambient air quality and cleaner air for Europe, merged into a single act the previous provisions on ambient air quality assessment and management, including those resulting from the implementation of Directive 2004/107/EC relating to arsenic, cadmium, nickel, mercury and polycyclic aromatic hydrocarbons in ambient air. Decree 155/2010 introduced stricter rules for identifying zones on the basis of the characteristics of the territory, instead of pollution levels as was done in the past. The implementation of these new criteria has reduced the great variability of the previous zoning, leading to a significant reduction in the total number of zones and to a complete and uniform ambient air quality assessment and management throughout the country. The present document describes the new definition of zones in Italy according to Decree 155/2010. In particular, the paper contains a description and an analysis of the outcome of the zoning and classification.

Keywords: Zones, agglomerations, air quality assessment, classification.

678 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principles in Noisy Environments

Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic

Abstract:

Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in “weight space”, where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal. The arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences: two are used for training and eight for testing. Atomic index probabilities are created for each training sentence and also for each test sentence. Classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and those from the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR). Testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93% averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN, and the method produces ~93% accuracy at 0 dB SNR.
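
The final classification step can be sketched as follows, with hypothetical atomic indices standing in for the matching-pursuit output; the dictionary size and index values are invented for illustration.

```python
import numpy as np

def index_probabilities(atomic_indices, dictionary_size):
    """Empirical probability of each dictionary atom appearing in a sentence."""
    counts = np.bincount(atomic_indices, minlength=dictionary_size)
    return counts / counts.sum()

def identify(test_prob, train_probs):
    """Return the speaker whose training probability vector is closest (Euclidean)."""
    return min(train_probs, key=lambda spk: np.linalg.norm(test_prob - train_probs[spk]))

# Hypothetical atomic indices produced by matching pursuit for two speakers.
rng = np.random.default_rng(0)
dict_size = 100
train_probs = {
    "speaker_A": index_probabilities(rng.integers(0, 40, 500), dict_size),
    "speaker_B": index_probabilities(rng.integers(30, 100, 500), dict_size),
}
test_sentence = rng.integers(0, 40, 200)          # unseen sentence from speaker A
print(identify(index_probabilities(test_sentence, dict_size), train_probs))
```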

Keywords: Time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder.

677 An Intelligent Human-Computer Interaction System for Decision Support

Authors: Chee Siong Teh, Chee Peng Lim

Abstract:

This paper proposes a novel architecture for developing decision support systems. Unlike conventional decision support systems, the proposed architecture endeavors to reveal the decision-making process such that humans' subjectivity can be incorporated into a computerized system and, at the same time, to preserve the capability of the computerized system in processing information objectively. A number of techniques used in developing the decision support system are elaborated to make the decision-making process transparent. These include procedures for high-dimensional data visualization, pattern classification, prediction, and evolutionary computational search. An artificial data set is first employed to compare the proposed approach with other methods. A simulated handwritten data set and a real data set on liver disease diagnosis are then employed to evaluate the efficacy of the proposed approach. The results are analyzed and discussed. The potential of the proposed architecture as a useful decision support system is demonstrated.

Keywords: Interactive evolutionary computation, multivariate data projection, pattern classification, topographic map.

676 Multiple Targets Classification and Fuzzy Logic Decision Fusion in Wireless Sensor Networks

Authors: Ahmad Aljaafreh

Abstract:

This paper proposes a hierarchical hidden Markov model (HHMM) to model the detection of M vehicles in a wireless sensor network (WSN). The HHMM contains an extra level of hidden Markov model to model the temporal transitions of each state of the first HMM. By modeling the temporal transitions, only those hypotheses with nonzero transition probabilities need to be tested. Thus, this method efficiently reduces the computational load, which is preferable in WSN applications. This paper integrates several techniques to optimize the detection performance. The output of the states of the first HMM is modeled as a Gaussian Mixture Model (GMM), where the number of states and the number of Gaussians are experimentally determined, while the other parameters are estimated using Expectation Maximization (EM). The HHMM is used to model the sequence of local decisions, which are based on multiple hypothesis testing with a maximum likelihood approach. The states in the HHMM represent various combinations of vehicles of different types. Due to the statistical advantages of multisensor data fusion, we propose a heuristic based on fuzzy weighted majority voting to enhance cooperative classification of moving vehicles within a region that is monitored by a wireless sensor network. A fuzzy inference system weighs each local decision based on the signal-to-noise ratio of the acoustic signal for target detection and the signal-to-noise ratio of the radio signal for sensor communication. The spatial correlation among the observations of neighboring sensor nodes is efficiently utilized, as well as the temporal correlation. Simulation results demonstrate the efficiency of this scheme.
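
A minimal sketch of the fuzzy weighted majority voting idea, with an invented weighting function standing in for the paper's fuzzy inference system (the SNR-to-weight mapping and the class labels are assumptions):

```python
import numpy as np

def weight(acoustic_snr_db, radio_snr_db):
    """Illustrative stand-in for the fuzzy inference system: map the two SNRs
    to a confidence weight in [0, 1] (higher SNR -> more trusted decision)."""
    trust_acoustic = np.clip(acoustic_snr_db / 30.0, 0.0, 1.0)
    trust_radio = np.clip(radio_snr_db / 30.0, 0.0, 1.0)
    return min(trust_acoustic, trust_radio)   # conservative combination (fuzzy AND / min)

def fuse(local_decisions):
    """Weighted majority voting over (class_label, acoustic_snr, radio_snr) tuples."""
    votes = {}
    for label, a_snr, r_snr in local_decisions:
        votes[label] = votes.get(label, 0.0) + weight(a_snr, r_snr)
    return max(votes, key=votes.get)

# Hypothetical local decisions from four sensor nodes observing the same region.
decisions = [("heavy_vehicle", 25, 28),
             ("heavy_vehicle", 18, 22),
             ("light_vehicle", 6, 10),    # low acoustic SNR -> low weight
             ("heavy_vehicle", 20, 5)]    # poor radio link  -> low weight
print(fuse(decisions))
```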

Keywords: Classification, decision fusion, fuzzy logic, hidden Markov model

675 Justification and Classification of Issues for the Selection and Implementation of Advanced Manufacturing Technologies

Authors: Zahra Banakar, Farzad Tahriri

Abstract:

It has often been said that the strength of any country resides in the strength of its industrial sector, and progress in industrial society has been accomplished through the creation of new technologies. Developments have been facilitated by the increasing availability of advanced manufacturing technology (AMT); in addition, the implementation of AMT requires careful planning at all levels of the organization to ensure that the implementation will achieve the intended goals. Justification and implementation of AMT involve decisions that are crucial for practitioners regarding the survival of business in today's uncertain manufacturing world. This paper assists industrial managers in considering all the important criteria for successful AMT implementation when purchasing new technology. Concurrently, this paper classifies the tangible benefits of a technology, which are evaluated by addressing both cost and time dimensions, and the intangible benefits, which are evaluated by addressing technological, strategic, social and human issues, in order to identify and create awareness of the essential elements in the AMT implementation process and to identify the necessary actions before implementing AMT.

Keywords: Advanced Manufacturing Technology (AMT), Justification and Classification.

674 The Integrated Management of Health Care Strategies and Differential Diagnosis by Expert System Technology: A Single-Dimensional Approach

Authors: A. B. Adehor, P. R. Burrell

Abstract:

The Integrated Management of Child Illnesses (IMCI) and the surveillance Health Information Systems (HIS) are related strategies that are designed to manage child illnesses and community practices of diseases. However, the two strategies do not function well together because of classification incompatibilities and, as such, are difficult to use by health care personnel in rural areas, where a majority of people lack the basic knowledge needed to interpret disease classifications from these methods. This paper discusses a single approach to how a stand-alone expert system can be used as a prompt diagnostic tool for all cases of illness presented. The system combines the action-oriented IMCI and the disease-oriented HIS approaches to diagnose malaria and typhoid fever in the rural areas of the Niger Delta region.

Keywords: Differential diagnosis, Health Information System (HIS), Integrated Management of Child Illnesses (IMCI), Malaria and Typhoid fever.

673 Multi-Sensor Target Tracking Using Ensemble Learning

Authors: Bhekisipho Twala, Mantepu Masetshaba, Ramapulana Nkoana

Abstract:

Multiple classifier systems combine several individual classifiers to deliver a final classification decision. However, an increasingly controversial question is whether such systems can outperform the single best classifier and, if so, what form of multiple classifier system yields the greatest benefit. Multi-target tracking using multiple sensors is also an important research field in mobile techniques and military applications. In this paper, several multiple classifier systems are evaluated in terms of their ability to predict a system’s failure or success on multi-sensor target tracking tasks. The Bristol Eden project dataset is utilised for this task. Experimental and simulation results show that the human activity identification system can fulfil the requirements of target tracking owing to the improved sensor classification performance of multiple classifier systems, with those constructed using boosting achieving the highest accuracy rates.
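
As an illustration of the single-classifier versus boosted-ensemble comparison, the following sketch uses scikit-learn on synthetic data standing in for the Bristol Eden features (the actual dataset and classifier settings are not reproduced):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stand-in for multi-sensor tracking features.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Single classifier: one decision tree.
single = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

# Boosted ensemble: AdaBoost over its default weak learners (decision stumps).
ensemble = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print("single tree accuracy:   ", round(single.score(X_te, y_te), 3))
print("boosted ensemble accuracy:", round(ensemble.score(X_te, y_te), 3))
```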

Keywords: Single classifier, machine learning, ensemble learning, multi-sensor target tracking.

672 Cellulose Nanocrystals Suspensions as Water-Based Lubricants for Slurry Pump Gland Seals

Authors: Mohammad Javad Shariatzadeh, Dana Grecov

Abstract:

Tribological tests were performed on a new tribometer in order to measure the coefficient of friction of a gland seal packing material on stainless steel shafts in the presence of a Cellulose Nanocrystal (CNC) suspension as a sustainable, environmentally friendly, water-based lubricant. To simulate the real situation in slurry pumps, silica sand was used as the slurry particles. The surface profiles after the tests were measured with an interferometer microscope to characterize the surface wear. Moreover, the coefficient of friction and surface wear were measured between a stainless steel shaft and a chrome steel ball to investigate the tribological effects of CNC in the boundary lubrication regime. The alignment of nanoparticles in the CNC suspensions is the main reason for the friction and wear reduction. The homogeneous concentrated suspensions showed fingerprint patterns of a chiral nematic liquid crystal. These properties make CNC a very good lubricant additive in water.

Keywords: Gland seal, lubricant additives, nanocrystalline cellulose, water-based lubricants.

671 Seismic Base Shear Force Depending on Building Fundamental Period and Site Conditions: Deterministic Formulation and Probabilistic Analysis

Authors: S. Dorbani, M. Badaoui, D. Benouar

Abstract:

The aim of this paper is to investigate the effect of the building fundamental period of reinforced concrete buildings of 6, 9, and 12 storeys, with different floor plans: symmetric, mono-symmetric, and unsymmetric. These structures are erected at different epicentral distances. Using the Boumerdes, Algeria (2003) earthquake data, we focus primarily on establishing a deterministic formulation linking the base shear force to two parameters: the first is the fundamental period, which represents the numerical fingerprint of the structure, and the second is the epicentral distance, used to represent the impact of the earthquake on this force. In a second step, with a view to highlighting the effect of uncertainty in these parameters on the analyzed response, the parameters are modeled as random variables with a log-normal distribution. A study of the variability of the coefficients of variation of the chosen uncertain parameters, in terms of the statistics of the seismic base shear force, showed that the effect of the fundamental period uncertainty on these statistics is low compared with the influence of the epicentral distance uncertainty.

Keywords: Base shear force, fundamental period, epicentral distance, uncertainty, lognormal variable, statistics.

670 Fingerprint Compression Using Multiwavelets

Authors: Sudhakar.R, Jayaraman.S

Abstract:

Large volumes of fingerprints are collected and stored every day in a wide range of applications, including forensics and access control. This is evident from the database of the Federal Bureau of Investigation (FBI), which contains more than 70 million fingerprints. Compression of such databases is very important because of their high volume. The performance of existing image coding standards generally degrades at low bit-rates because of the underlying block-based Discrete Cosine Transform (DCT) scheme. Over the past decade, the success of wavelets in solving many different problems has contributed to their unprecedented popularity. Due to implementation constraints, scalar wavelets do not possess all the properties needed for better compression performance. A new class of wavelets called multiwavelets, which possess more than one scaling filter, overcomes this problem. The objective of this paper is to develop an efficient compression scheme and to obtain better quality and a higher compression ratio through the multiwavelet transform and embedded coding of the multiwavelet coefficients using the Set Partitioning In Hierarchical Trees (SPIHT) algorithm. A comparison of the best known multiwavelets is made with the best known scalar wavelets. Both quantitative and qualitative measures of performance are examined for fingerprints.

Keywords: Multiwavelet, Modified SPIHT Algorithm, SPIHT, Wavelet.

669 Unit Selection Algorithm Using Bi-grams Model For Corpus-Based Speech Synthesis

Authors: Mohamed Ali KAMMOUN, Ahmed Ben HAMIDA

Abstract:

In this paper, we present a novel statistical approach to corpus-based speech synthesis. Classically, phonetic information is defined and considered as an acoustic reference to be respected. Many studies have therefore been devoted to acoustic unit classification. This type of classification allows units to be separated according to their symbolic characteristics. Indeed, target cost and concatenation cost have classically been defined for unit selection. In corpus-based speech synthesis systems using large text corpora, cost functions have been limited to a juxtaposition of symbolic criteria, and the acoustic information of the units is not exploited in the definition of the target cost. In this manuscript, we take into consideration the unit phonetic information corresponding to the acoustic information. This is realized by defining a probabilistic linguistic bi-gram model used for unit selection. The selected units are extracted from the English TIMIT corpus.
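
A minimal sketch of a bi-gram model over unit labels, with invented phone sequences standing in for the TIMIT units (the paper's actual cost combination is not reproduced):

```python
from collections import Counter

def train_bigram(unit_sequences):
    """Estimate P(next_unit | unit) from labelled unit sequences."""
    bigrams, unigrams = Counter(), Counter()
    for seq in unit_sequences:
        unigrams.update(seq[:-1])
        bigrams.update(zip(seq, seq[1:]))
    return {pair: count / unigrams[pair[0]] for pair, count in bigrams.items()}

def sequence_score(units, model, floor=1e-6):
    """Product of bigram probabilities; higher means a more natural unit sequence."""
    score = 1.0
    for pair in zip(units, units[1:]):
        score *= model.get(pair, floor)
    return score

# Hypothetical phone-unit sequences extracted from a corpus such as TIMIT.
corpus = [["sil", "dh", "ax", "k", "ae", "t", "sil"],
          ["sil", "dh", "ax", "d", "ao", "g", "sil"]]
model = train_bigram(corpus)
print(sequence_score(["sil", "dh", "ax", "k"], model))   # seen transitions
print(sequence_score(["sil", "k", "dh", "ax"], model))   # unseen transitions score lower
```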

Keywords: Unit selection, Corpus-based Speech Synthesis, Bigram model

668 Early-Warning Lights Classification Management System for Industrial Parks in Taiwan

Authors: Yu-Min Chang, Kuo-Sheng Tsai, Hung-Te Tsai, Chia-Hsin Li

Abstract:

This paper presents the early-warning lights classification management system for industrial parks promoted by the Taiwan Environmental Protection Administration (EPA) since 2011, including the definition of each early-warning light, the objectives, the action program, and the accomplishments. All of the 151 industrial parks in Taiwan were classified into four early-warning lights (red, orange, yellow and green) for carrying out the respective pollution management according to the monitoring data of soil and groundwater quality, regulatory compliance, and regulatory listing as a control site or remediation site. The Taiwan EPA set up a priority list of industrial parks with high pollution potential and investigated their soil and groundwater quality based on the results of the light classification and the pollution potential assessment. In 2011-2013, 44 industrial parks were selected and subjected to different investigations, such as the establishment of early-warning groundwater well networks and pollution investigation/verification for the red- and orange-light industrial parks, and environmental background surveys for the yellow-light industrial parks. Among them, 22 industrial parks were newly or repeatedly confirmed to have pollutant concentrations exceeding the soil or groundwater pollution control standards. Thus, further investigation, groundwater use restrictions, listing as pollution control sites or remediation sites, and pollutant isolation measures were implemented by the local environmental protection and industry competent authorities, and the early-warning lights of those industrial parks were proposed to be adjusted up to orange or red. Up to the present, the preliminary positive effect of the soil and groundwater quality management system for industrial parks has been noticed in several aspects, such as environmental background information collection, early warning of pollution risk, pollution investigation and control, information integration and application, and inter-agency collaboration. Finally, self-initiated quality management of industrial parks will be carried out on the basis of inter-agency collaboration through the classified early-warning and management lights system, as well as the regular announcement of the status of each industrial park.

Keywords: Industrial park, soil and groundwater quality management, early-warning lights classification, SOP for reporting and treatment of monitored abnormal events.

667 A Hybrid Scheme for on-Line Diagnostic Decision Making Using Optimal Data Representation and Filtering Technique

Authors: Hyun-Woo Cho

Abstract:

Early diagnostic decision making in industrial processes is absolutely necessary to produce high-quality final products. It helps to provide early warning of a special event in a process and to find its assignable cause. This work presents a hybrid diagnostic scheme for batch processes. Nonlinear representation of raw process data is combined with classification tree techniques. Nonlinear kernel-based dimension reduction is executed to obtain nonlinear classification decision boundaries for fault classes. In order to enhance diagnosis performance for batch processes, filtering of the data is performed to remove irrelevant information from the process data. To compare the diagnostic performance of several representation, filtering, and future observation estimation methods, four diagnostic schemes are evaluated. In this work, the performance of the presented diagnostic schemes is demonstrated using batch process data.
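
A minimal sketch of the "nonlinear kernel-based dimension reduction followed by a classification tree" idea, using scikit-learn's KernelPCA and a decision tree on synthetic data (the paper's specific kernel, filtering step and batch-process data are not reproduced):

```python
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stand-in for batch-process measurements with two fault classes
# that are not linearly separable in the raw measurement space.
X, y = make_moons(n_samples=400, noise=0.15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Kernel-based dimension reduction followed by a classification tree.
model = make_pipeline(
    KernelPCA(n_components=2, kernel="rbf", gamma=5.0),
    DecisionTreeClassifier(max_depth=4, random_state=0),
)
model.fit(X_tr, y_tr)
print("test accuracy:", round(model.score(X_te, y_te), 3))
```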

Keywords: Diagnostics, batch process, nonlinear representation, data filtering, multivariate statistical approach

666 Unearthing Decisional Patterns of Air Traffic Control Officers from Simulator Data

Authors: Z. Zakaria, S. W. Lye, S. Endy

Abstract:

Despite the continuous advancements in automated conflict resolution tools, there is still a low rate of adoption of automation by Air Traffic Control Officers (ATCOs). Trust in, or acceptance of, these tools and conformance to individual ATCO preferences in strategy execution for conflict resolution are two key factors that impact their use. This paper proposes a methodology to unearth and classify ATCO conflict resolution strategies from simulator data of trained and qualified ATCOs. The methodology involves the extraction of ATCO executive control actions and the establishment of a system of strategy resolution classification based on ATCO radar commands and prevailing flight parameters in deconflicting a pair of aircraft. Six main strategies used to handle various categories of conflict were identified and discussed. It was found that ATCOs were about twice as likely to choose only vertical maneuvers in conflict resolution as horizontal maneuvers or a combination of both vertical and horizontal maneuvers.

Keywords: Air traffic control strategies, conflict resolution, simulator data, strategy classification system.

665 Real-time Laser Monitoring based on Pipe Detective Operation

Authors: Mongkorn Klingajay, Tawatchai Jitson

Abstract:

Pipe inspection is a difficult detection task. Most applications still rely mainly on manual recognition of defective areas, with detection carried out by an engineer. Therefore, an automated process becomes necessary in order to avoid the cost incurred by such a manual process. An automated monitoring method to obtain a complete picture of the sewer condition is proposed in this work. The focus of the research is the automated identification and classification of discontinuities in the internal surface of the pipe. The methodology consists of several processing stages, including segmentation of the image into potential defect regions and extraction of geometrical characteristic features. Automatic recognition and classification of pipe defects are carried out by means of an artificial neural network (ANN) technique based on Radial Basis Functions (RBF). Experiments in a realistic environment have been conducted and the results are presented.

Keywords: Artificial neural network, Radial basis function, Curve fitting, CCTV, Image segmentation, Data acquisition.

664 Functional Near Infrared Spectroscope for Cognition Brain Tasks by Wavelets Analysis and Neural Networks

Authors: Truong Quang Dang Khoa, Masahiro Nakagawa

Abstract:

Research on Brain-Computer Interfaces (BCI) has increased recently. The Functional Near Infrared Spectroscope (fNIRs) is one of the latest technologies which utilize light in the near-infrared range to determine brain activity. Because near-infrared technology allows the design of safe, portable, wearable, non-invasive and wireless monitoring systems, fNIRs monitoring of brain hemodynamics can be valuable in helping to understand brain tasks. In this paper, we present results of fNIRs signal analysis indicating that there exist distinct patterns of hemodynamic responses which allow brain tasks to be recognized, a step toward developing a BCI. We applied two different mathematical tools separately: wavelet analysis for preprocessing, as signal filtering and feature extraction, and neural networks for the cognitive brain tasks, as a classification module. We also discuss and compare with other methods; our proposal performs better, with an average classification accuracy of 99.9%.

Keywords: functional near infrared spectroscope (fNIRs), brain-computer interface (BCI), wavelets, neural networks, brain activity, neuroimaging.

663 Image Spam Detection Using Color Features and K-Nearest Neighbor Classification

Authors: T. Kumaresan, S. Sanjushree, C. Palanisamy

Abstract:

Image spam is a kind of email spam where the spam text is embedded in an image. It is a new spamming technique being used by spammers to send their messages to a large number of internet users. Spam email has become a big problem in the lives of internet users, wasting time and causing economic losses. The main objective of this paper is to detect image spam by using the histogram properties of an image. Though there are many techniques to automatically detect and avoid this problem, spammers employ new tricks to bypass them; as a result, those techniques are ineffective in detecting spam mails. In this paper, we propose a new method to detect image spam. The image features are extracted using the RGB histogram, the HSV histogram, and a combination of both. Based on the optimized image feature set, classification is done using the k-Nearest Neighbor (k-NN) algorithm. Experimental results show that our method achieves better accuracy; the combination of RGB and HSV histograms with the k-NN algorithm gives the best accuracy in spam detection.
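
A minimal sketch of the RGB+HSV histogram features with k-NN classification, using synthetic images in place of a real spam corpus (bin count, neighbour count and the toy data are illustrative assumptions):

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv
from sklearn.neighbors import KNeighborsClassifier

def color_histogram_features(rgb_image, bins=16):
    """Concatenated per-channel RGB and HSV histograms as one feature vector."""
    hsv_image = rgb_to_hsv(rgb_image)          # expects RGB floats in [0, 1]
    feats = []
    for image in (rgb_image, hsv_image):
        for channel in range(3):
            hist, _ = np.histogram(image[..., channel], bins=bins, range=(0.0, 1.0))
            feats.append(hist / hist.sum())
    return np.concatenate(feats)

# Hypothetical stand-ins: "spam" images with flat, text-on-background colour
# statistics versus more varied "ham" photographs.
rng = np.random.default_rng(0)
spam = [np.clip(0.9 + 0.05 * rng.standard_normal((32, 32, 3)), 0, 1) for _ in range(50)]
ham = [rng.random((32, 32, 3)) for _ in range(50)]

X = np.array([color_histogram_features(im) for im in spam + ham])
y = np.array([1] * 50 + [0] * 50)

knn = KNeighborsClassifier(n_neighbors=3).fit(X[::2], y[::2])   # even indices: train
print("held-out accuracy:", knn.score(X[1::2], y[1::2]))        # odd indices: test
```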

Keywords: File Type, HSV Histogram, k-NN, RGB Histogram, Spam Detection.

662 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods

Authors: H. J. Wattimanela, U. S. Passaribu, N. T. Puspito, S. W. Indratno

Abstract:

The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc, to the west of the collision zone, and the Halmahera arc, to the east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone with respect to the distribution of earthquakes in each partitioned region, the type of distribution of earthquake occurrences in each region, the mean occurrence of earthquakes in each region, and the correlation between the regions. We calculate the number of earthquakes using a partitioning method and analyze their behavior using conventional statistical methods. In this research, we used data on shallow earthquakes with magnitudes ≥ 4 SR (period 1964-2013). From the results, we can classify the partitioned regions, based on the correlation, into two classes: strong and very strong. This classification can be used for an early warning system in disaster management.
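
The correlation-based classification can be sketched as follows with invented yearly counts; the thresholds used to label a correlation "strong" or "very strong" are illustrative assumptions, not the paper's values:

```python
import numpy as np

# Hypothetical yearly earthquake counts (1964-2013) for three partitioned regions.
rng = np.random.default_rng(0)
base = rng.poisson(20, 50)
counts = {
    "region_1": base + rng.poisson(2, 50),
    "region_2": base + rng.poisson(5, 50),
    "region_3": rng.poisson(20, 50),
}

def correlation_class(r):
    """Illustrative thresholds for labelling the strength of a correlation."""
    return "very strong" if abs(r) >= 0.8 else "strong" if abs(r) >= 0.6 else "weaker"

regions = list(counts)
for i in range(len(regions)):
    for j in range(i + 1, len(regions)):
        r = np.corrcoef(counts[regions[i]], counts[regions[j]])[0, 1]
        print(regions[i], regions[j], round(r, 2), correlation_class(r))
```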

Keywords: Molluca Collision Zone, partition regions, conventional statistical methods, Earthquakes, classifications, disaster management.

661 Towards Real-Time Classification of Finger Movement Direction Using Encephalography Independent Components

Authors: Mohamed Mounir Tellache, Hiroyuki Kambara, Yasuharu Koike, Makoto Miyakoshi, Natsue Yoshimura

Abstract:

This study explores the practicality of using electroencephalographic (EEG) independent components to predict eight-direction finger movements in pseudo-real-time. Six healthy participants with individual-head MRI images performed finger movements in eight directions with two different arm configurations. The analysis was performed in two stages. The first stage consisted of using independent component analysis (ICA) to separate the signals representing brain activity from non-brain activity signals and to obtain the unmixing matrix. The resulting independent components (ICs) were checked, and those reflecting brain activity were selected. Finally, the time series of the selected ICs were used to predict eight finger-movement directions using Sparse Logistic Regression (SLR). The second stage consisted of using the previously obtained unmixing matrix, the selected ICs, and the model obtained by applying SLR to classify a different EEG dataset. This method was applied in two different settings, namely the single-participant level and the group level. For the single-participant level, the EEG dataset used in the first stage and the EEG dataset used in the second stage originated from the same participant. For the group level, the EEG datasets used in the first stage were constructed by temporally concatenating each combination without repetition of the EEG datasets of five participants out of six, whereas the EEG dataset used in the second stage originated from the remaining participant. The average test classification results across datasets (mean ± S.D.) were 38.62 ± 8.36% for the single-participant level, which was significantly higher than the chance level (12.50 ± 0.01%), and 27.26 ± 4.39% for the group level, which was also significantly higher than the chance level (12.49 ± 0.01%). The classification accuracy within [–45°, 45°] of the true direction was 70.03 ± 8.14% for the single-participant level and 62.63 ± 6.07% for the group level, which may be promising for some real-life applications. Clustering and contribution analyses further revealed the brain regions involved in finger movement and the temporal aspect of their contribution to the classification. These results show the possibility of using the ICA-based method in combination with other methods to build a real-time system to control prostheses.

Keywords: Brain-computer interface, BCI, electroencephalography, EEG, finger motion decoding, independent component analysis, pseudo-real-time motion decoding.

660 Machine Learning Methods for Flood Hazard Mapping

Authors: S. Zappacosta, C. Bove, M. Carmela Marinelli, P. di Lauro, K. Spasenovic, L. Ostano, G. Aiello, M. Pietrosanto

Abstract:

This paper proposes a neural network approach to flood hazard mapping. The core of the model is a machine learning component fed by frequency ratios, namely statistical correlations between flood event occurrences and a selected number of topographic properties. The classification capability was compared with the flood hazard mapping of the River Basin Plans (Piani di Assetto Idrogeologico, abbreviated as PAI) designed by the Italian Institute for Environmental Protection and Research, ISPRA (Istituto Superiore per la Protezione e la Ricerca Ambientale), which encode four different increasing flood hazard levels. The study area of Piemonte, an Italian region, has been considered without loss of generality. The frequency ratios may be used as a standalone block to model the flood hazard mapping. Nevertheless, combining them with a neural network improves the classification power by several percentage points, and the approach may be proposed as a basic tool to model the flood hazard map in a wider scope.
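
A minimal sketch of the frequency-ratio computation that feeds the machine learning component, on an invented reclassified raster (the topographic properties, class counts and thresholds are assumptions):

```python
import numpy as np

def frequency_ratio(class_labels, flood_mask):
    """Frequency ratio per class of a topographic property.

    FR(class) = (share of flooded cells falling in the class)
              / (share of all cells falling in the class).
    Values above 1 indicate classes statistically associated with flooding.
    """
    ratios = {}
    for cls in np.unique(class_labels):
        in_class = class_labels == cls
        flood_share = (flood_mask & in_class).sum() / max(flood_mask.sum(), 1)
        area_share = in_class.sum() / class_labels.size
        ratios[cls] = flood_share / area_share
    return ratios

# Hypothetical raster flattened to 1-D: slope reclassified into 3 classes,
# plus a binary mask of observed flood events (floods favour class 0 here).
rng = np.random.default_rng(0)
slope_class = rng.integers(0, 3, 10_000)
flood_mask = (slope_class == 0) & (rng.random(10_000) < 0.3)
flood_mask |= rng.random(10_000) < 0.02   # plus some noise

print(frequency_ratio(slope_class, flood_mask))
# Per-property ratios like these are the inputs fed to the neural network.
```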

Keywords: flood modeling, hazard map, neural networks, hydrogeological risk, flood risk assessment

659 Spatial Data Mining by Decision Trees

Authors: S. Oujdi, H. Belbachir

Abstract:

Existing data mining methods cannot be applied directly to spatial data because spatial data require the consideration of spatial specificities, such as spatial relationships. This paper focuses on classification with decision trees, one of the data mining techniques. We propose an extension of the C4.5 algorithm for spatial data based on two different approaches: join materialization and querying the different tables on the fly. Similar work has been done on these two main approaches: the first, join materialization, favors processing time at the expense of memory space, whereas the second, querying the tables on the fly, saves memory space at the expense of processing time. The modified C4.5 algorithm requires three input tables: a target table, a neighbor table, and a spatial join index that contains the possible spatial relationships between the objects in the target table and those in the neighbor table. The proposed algorithms are applied to a spatial dataset from the accidentology domain. A comparative study of our approach with other work on classification by spatial decision trees is detailed.

Keywords: C4.5 Algorithm, Decision trees, S-CART, Spatial data mining.

658 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques

Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas

Abstract:

The reason for conducting this research is to develop an algorithm that is capable of classifying news articles from the automobile industry, according to the competitive actions that they entail, with the use of Text Mining (TM) methods. The data need to be properly preprocessed by preparing pipelines that fit each algorithm best. The pipelines are tested along with nine different classification algorithms in the realm of regression, support vector machines, and neural networks. Preliminary testing for identifying the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines. The two algorithms are Logistic Regression (LR) and Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm is capable of reaching an accuracy of 94%.
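
As an illustration of one preprocessing-pipeline-plus-classifier combination, the following scikit-learn sketch pairs TF-IDF features with Logistic Regression on invented headlines (the real dataset, label set and tuned parameters are not reproduced):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical labelled headlines; the real articles and the "competitive action"
# label set are not reproduced here.
texts = ["Automaker cuts prices on its flagship sedan",
         "New electric SUV launched with extended range",
         "Manufacturer expands plant capacity in Europe",
         "Rival slashes prices in response to competitor"]
labels = ["pricing", "new_product", "capacity", "pricing"]

# One preprocessing pipeline (TF-IDF features) feeding a Logistic Regression
# classifier; the ANN variant would swap in a neural network as the final step.
model = Pipeline([
    ("tfidf", TfidfVectorizer(lowercase=True, stop_words="english", ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(texts, labels)
print(model.predict(["Company announces discount campaign on pickups"]))
```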

Keywords: Artificial neural network, competitive dynamics, logistic regression, text classification, text mining.

657 Improved Rare Species Identification Using Focal Loss Based Deep Learning Models

Authors: Chad Goldsworthy, B. Rajeswari Matam

Abstract:

The use of deep learning for species identification in camera trap images has revolutionised our ability to study, conserve and monitor species in a highly efficient and unobtrusive manner, with state-of-the-art models achieving accuracies surpassing that of manual human classification. The high imbalance of camera trap datasets, however, results in poor accuracies for minority (rare or endangered) species, due to their relative insignificance to the overall model accuracy. This paper investigates the use of Focal Loss, in comparison to the traditional Cross Entropy loss function, to improve the identification of minority species in the “255 Bird Species” dataset from Kaggle. The results show that, although Focal Loss slightly decreased the accuracy on the majority species, it was able to increase the F1-score by 0.06 and improve the identification of the bottom two, five and ten (minority) species by 37.5%, 15.7% and 10.8%, respectively, as well as improving the overall accuracy by 2.96%.
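
For reference, a minimal NumPy sketch of the focal loss of Lin et al. compared with cross entropy on a single example's true-class probability (the gamma and alpha values are common defaults, not necessarily those used in the paper):

```python
import numpy as np

def cross_entropy(p_true_class):
    """Standard cross-entropy loss for the probability assigned to the true class."""
    return -np.log(p_true_class)

def focal_loss(p_true_class, gamma=2.0, alpha=1.0):
    """Focal loss: down-weights well-classified examples by (1 - p)^gamma,
    so rare-class examples the model is unsure about dominate the gradient."""
    return -alpha * (1.0 - p_true_class) ** gamma * np.log(p_true_class)

# An easy (majority-class) example and a hard (minority-class) example.
for p in (0.95, 0.30):
    ce, fl = cross_entropy(p), focal_loss(p)
    print(f"p={p:.2f}  cross-entropy={ce:.3f}  focal={fl:.3f}  ratio={fl/ce:.3f}")
# The easy example's loss is suppressed far more strongly than the hard one's,
# which is why focal loss helps under heavy class imbalance.
```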

Keywords: Convolutional neural networks, data imbalance, deep learning, focal loss, species classification, wildlife conservation.
