Search results for: Fault detection and classification
543 Treatment of Inorganic Filler Surface by Silane-Coupling Agent: Investigation of Treatment Condition and Analysis of Bonding State of Reacted Agent
Authors: Hiroshi Hirano, Joji Kadota, Toshiyuki Yamashita, Yasuyuki Agari
Abstract:
It is well known that enhancing the interfacial adhesion between an inorganic filler and the matrix resin in a composite leads to favorable properties such as excellent mechanical properties, high thermal resistance, prominent electrical insulation, a low expansion coefficient, and so on. However, reacting a large excess of coupling agent should be avoided because it negatively affects the properties of the final composite. Apart from investigations of coating layer thickness, no report has classified the bonding state. Therefore, the bonding state of the coupling agent reacted with filler surfaces, such as BN particles with few functional groups and silica particles with many functional groups, was analyzed by thermogravimetric analysis and pyrolysis GC/MS. As a result of the analysis, the number of reacted functional groups on the silane-coupling agent was classified. Thus, we succeeded in classifying the number of reacted functional groups in this study.
Keywords: Inorganic filler, boron nitride, surface treatment, coupling agent, analysis of bonding state
542 Numerical Study of Fiber Bragg Grating Sensor: Longitudinal and Transverse Detection of Temperature and Strain
Authors: K. Khelil, H. Ammar, K. Saouchi
Abstract:
A Fiber Bragg Grating (FBG) is a periodically modulated optical fiber. It acts as a wavelength-selective filter whose reflected peak, called the Bragg wavelength, depends on the grating period and the refractive index. The simulation of the FBG is based on solving the Coupled Mode Theory equations with the Transfer Matrix Method, carried out in MATLAB. It is found that the reflectivity spectrum is shifted when the change of temperature or strain is uniform. Under non-uniform temperature or strain perturbation, the spectrum is both shifted and distorted. In the case of transverse loading, the reflectivity spectrum is split into two peaks, the first specific to the X axis and the second to the Y axis. FBGs are used in civil engineering to detect perturbations applied to buildings.
Keywords: Bragg wavelength, coupled mode theory, optical fiber, temperature measurement.
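A minimal sketch of the piecewise Transfer Matrix Method for a uniform grating is given below, as an illustration in Python rather than the authors' MATLAB code; the grating parameters (n_eff, period, index modulation dn) and the wavelength range are assumed values.

```python
import numpy as np

def fbg_reflectivity(wavelengths, n_eff=1.447, period=535e-9, length=10e-3,
                     dn=1e-4, n_sections=100):
    """Reflection spectrum of a uniform FBG via the piecewise transfer matrix method."""
    lam_bragg = 2 * n_eff * period                       # Bragg wavelength
    dz = length / n_sections
    R = np.zeros_like(wavelengths)
    for i, lam in enumerate(wavelengths):
        delta = 2 * np.pi * n_eff * (1 / lam - 1 / lam_bragg)    # detuning from Bragg condition
        kappa = np.pi * dn / lam                                  # AC coupling coefficient
        gamma = np.sqrt(complex(kappa**2 - delta**2))
        F = np.eye(2, dtype=complex)
        for _ in range(n_sections):                               # cascade the section matrices
            c, s = np.cosh(gamma * dz), np.sinh(gamma * dz)
            T = np.array([[c - 1j * delta / gamma * s, -1j * kappa / gamma * s],
                          [1j * kappa / gamma * s,      c + 1j * delta / gamma * s]])
            F = T @ F
        R[i] = abs(F[1, 0] / F[0, 0]) ** 2                        # power reflectivity
    return R

lam = np.linspace(1548e-9, 1552e-9, 400)
print(fbg_reflectivity(lam).max())
```

A shift of temperature or strain can then be modelled by rescaling the period and effective index before recomputing the spectrum.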
541 Indicator of Small Calcification Detection in Ultrasonography using Decorrelation of Forward Scattered Waves
Authors: Hirofumi Taki, Takuya Sakamoto, Makoto Yamakawa, Tsuyoshi Shiina, Toru Sato
Abstract:
To improve the detection of small calcifications in Ultrasonography (US), we propose a novel indicator of calcifications in an ultrasound B-mode image that does not decrease the frame rate. Since the waveform of an ultrasound pulse changes at a calcification position, decorrelation of adjacent scan lines occurs behind a calcification. We therefore employ the decorrelation of adjacent scan lines as an indicator of a calcification. The proposed indicator depicted wires 0.05 mm in diameter at 2 cm depth with a sensitivity of 86.7% and a specificity of 100%, although these wires were hardly detectable in ultrasound B-mode images. This study shows the potential of the proposed indicator to bring the detectable calcification size of a US device close to that of an X-ray imager, implying the possibility that a US device will become a convenient, safe, and principal clinical tool for the screening of breast cancer.
Keywords: Ultrasonography, calcification, decorrelation, forward scattered wave
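The core of such an indicator, correlating each scan line with its neighbour over a sliding depth window, can be sketched as follows; the RF-frame layout, window length, and normalisation are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def decorrelation_map(rf, window=16):
    """Decorrelation indicator between adjacent scan lines of an RF frame.

    rf: 2-D array (depth samples x scan lines).  For each depth window the
    normalized cross-correlation of neighbouring lines is computed; low
    correlation (high decorrelation) behind a scatterer is taken as evidence
    of a calcification."""
    depth, lines = rf.shape
    out = np.zeros((depth - window, lines - 1))
    for j in range(lines - 1):
        for i in range(depth - window):
            a = rf[i:i + window, j] - rf[i:i + window, j].mean()
            b = rf[i:i + window, j + 1] - rf[i:i + window, j + 1].mean()
            denom = np.sqrt((a @ a) * (b @ b)) + 1e-12
            out[i, j] = 1.0 - (a @ b) / denom      # 1 - correlation = decorrelation
    return out

rf = np.random.randn(256, 64)                      # synthetic RF frame for illustration
print(decorrelation_map(rf).shape)
```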
540 Detection and Analysis of Deficiencies in Groundnut Plant using Geometric Moments
Authors: Sumeet S. Nisale, Chandan J. Bharambe, Vidya N. More
Abstract:
We present our research on geometric moments for detecting mineral deficiencies in the groundnut plant. This plant is prone to many deficiencies as a result of variation in soil nutrients. By analyzing the leaves of the plant, we detect visual symptoms that are not recognizable to the naked eye. We collected about 160 samples of leaves from nearby fields. The images were taken by placing every leaf in a black box to avoid external interference. For the first time, it has been possible to provide the farmer with the stages of the deficiencies. We have applied the algorithms successfully to many other plants such as lady's finger, green bean, lablab bean, chilli, and tomato, but we report predominantly the results for groundnut. The accuracy of our algorithm and method is almost 93%. This will again pioneer a kind of green revolution in the field of agriculture and will be a boon to that field.
Keywords: Component image, geometric moments, average intensity, average affected area, black box
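As an illustration of the geometric moments such a method relies on, a short sketch computing raw and central moments of a segmented leaf mask; the binary-mask input and the moment orders are assumptions, not the paper's exact feature set.

```python
import numpy as np

def central_moments(mask, p_max=2):
    """Raw zeroth moment, centroid, and central moments up to order p_max of a
    binary leaf mask (a hypothetical stand-in for the segmented leaf image)."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                                # area (zeroth raw moment)
    xbar, ybar = xs.mean(), ys.mean()            # centroid
    mu = {}
    for p in range(p_max + 1):
        for q in range(p_max + 1 - p):
            mu[(p, q)] = (((xs - xbar) ** p) * ((ys - ybar) ** q)).sum()
    return m00, (xbar, ybar), mu

mask = np.zeros((64, 64), dtype=bool)
mask[20:50, 10:40] = True                        # toy "leaf" region
print(central_moments(mask))
```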
539 Speech Enhancement of Vowels Based on Pitch and Formant Frequency
Authors: R. Rishma Rodrigo, R. Radhika, M. Vanitha Lakshmi
Abstract:
Numerous signal-processing-based speech enhancement systems have been proposed to improve intelligibility in the presence of noise. Traditionally, studies of neural vowel encoding have focused on the representation of formants (peaks in vowel spectra) in the discharge patterns of the population of auditory-nerve (AN) fibers. A method is presented for mapping high-frequency speech components into a low-frequency region, to increase audibility for listeners with hearing loss. The purpose of the paper is to enhance the formants of speech based on the Kaiser window. The pitch and formants of the signal are estimated using autocorrelation, zero crossing, and the magnitude difference function. The formant enhancement stage aims to restore the representation of formants at the level of the midbrain. A low-complexity implementation of the system is developed in MATLAB.
Keywords: Formant estimation, formant enhancement, pitch detection, speech analysis.
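A minimal sketch of the autocorrelation-based pitch estimation mentioned above; the frame length, search range, and synthetic test tone are assumptions for illustration, not the authors' MATLAB implementation.

```python
import numpy as np

def pitch_autocorr(frame, fs, f_min=60.0, f_max=400.0):
    """Estimate the pitch of a voiced frame by picking the strongest
    autocorrelation peak inside the plausible lag range."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]   # one-sided autocorrelation
    lag_min = int(fs / f_max)
    lag_max = min(int(fs / f_min), len(ac) - 1)
    lag = lag_min + np.argmax(ac[lag_min:lag_max])                  # strongest periodic peak
    return fs / lag

fs = 16000
t = np.arange(0, 0.03, 1 / fs)
print(pitch_autocorr(np.sin(2 * np.pi * 120 * t), fs))   # ~120 Hz
```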
538 Hybrid Honeypot System for Network Security
Authors: Kyi Lin Lin Kyaw
Abstract:
Nowadays, we face network threats that cause enormous damage to the Internet community day by day. In this situation, more and more people try to protect their networks using traditional mechanisms such as firewalls, Intrusion Detection Systems, etc. Among them, the honeypot is a versatile tool for the security practitioner: a system that is meant to be attacked or interacted with in order to gather more information about attackers, their motives, and their tools. In this paper, we describe the usefulness of low-interaction and high-interaction honeypots and compare them. We then propose a hybrid honeypot architecture that combines low- and high-interaction honeypots to mitigate the drawbacks of each. In this architecture, the low-interaction honeypot is used as a traffic filter. Activities like port scanning can be effectively detected by the low-interaction honeypot and stopped there. Traffic that cannot be handled by the low-interaction honeypot is handed over to the high-interaction honeypot. In this case, the low-interaction honeypot is used as a proxy whereas the high-interaction honeypot offers the optimal level of realism. To prevent the high-interaction honeypot from infections, a containment environment (VMware) is used.
Keywords: Low-interaction honeypot, high-interaction honeypot, VMware, proxy
537 Real-time Tracking in Image Sequences based-on Parameters Updating with Temporal and Spatial Neighborhoods Mixture Gaussian Model
Abstract:
The Gaussian mixture background model is widely used for moving-target detection in image sequences. However, the traditional Gaussian mixture background model usually considers only the temporal continuity of the pixels and establishes the background through the statistical distribution of individual pixels, without taking into account the pixels' spatial similarity, which causes noise, imperfections, and other problems. This paper proposes a new Gaussian mixture modeling approach, which combines the color and gradient information of the spatial neighborhood and integrates the spatial information of the pixel sequences to establish the Gaussian mixture background. The experimental results show that the background can be extracted accurately and efficiently, that the algorithm is more robust, and that it can work in real time in tracking applications.
Keywords: Gaussian mixture model, real-time tracking, sequence image, gradient
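For reference, a baseline per-pixel (temporal-only) Gaussian mixture background subtractor as provided by OpenCV is sketched below; the spatial colour/gradient extension proposed in the paper is not shown, and the input file name is a placeholder.

```python
import cv2

# Baseline per-pixel Gaussian mixture background subtraction (temporal only);
# the paper's method additionally mixes in colour and gradient information
# from the spatial neighbourhood, which is not implemented here.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=True)

cap = cv2.VideoCapture("sequence.avi")      # hypothetical input sequence
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)       # 0 = background, 255 = foreground
    fg_mask = cv2.medianBlur(fg_mask, 5)    # suppress isolated noise pixels
cap.release()
```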
536 A Green Method for Selective Spectrophotometric Determination of Hafnium(IV) with Aqueous Extract of Ficus carica Tree Leaves
Authors: A. Boveiri Monji, H. Yousefnia, M. Haji Hosseini, S. Zolghadri
Abstract:
A clean spectrophotometric method for the determination of hafnium using a green reagent, an acidic extract of Ficus carica tree leaves, is developed. In 6 M hydrochloric acid, hafnium reacts with this reagent to form a yellow product. The product shows maximum absorbance at 421 nm with a molar absorptivity of 0.28 × 10⁴ L mol⁻¹ cm⁻¹, and the method is linear in the 2-11 µg mL⁻¹ concentration range. The detection limit was found to be 0.312 µg mL⁻¹. Except for zirconium and iron, the selectivity was good, and most ions did not show any significant spectral interference at concentrations up to several hundred times that of hafnium. The proposed method is green, simple, low-cost, and selective.
Keywords: Spectrophotometric determination, Ficus carica tree leaves, synthetic reagents, hafnium.
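To illustrate how such a linear range is used in practice, a short sketch fitting a calibration line (Beer-Lambert behaviour) and predicting an unknown concentration; the standard concentrations and absorbance values are invented for illustration, and only the 421 nm wavelength and the 2-11 µg mL⁻¹ range come from the abstract.

```python
import numpy as np

# Hypothetical calibration standards within the reported 2-11 ug/mL linear range
# and their absorbances at 421 nm (values invented for illustration).
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])           # ug/mL Hf(IV)
absorbance = np.array([0.031, 0.063, 0.094, 0.125, 0.157])

slope, intercept = np.polyfit(conc, absorbance, 1)     # Beer-Lambert: A = slope*C + intercept
r2 = np.corrcoef(conc, absorbance)[0, 1] ** 2

a_sample = 0.110                                       # absorbance of an unknown sample
c_sample = (a_sample - intercept) / slope
print(f"r^2 = {r2:.4f}, predicted concentration = {c_sample:.2f} ug/mL")
```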
535 Active Contours with Prior Corner Detection
Authors: U.A.A. Niroshika, Ravinda G.N. Meegama
Abstract:
Deformable active contours are widely used in computer vision and image processing applications for image segmentation, especially in biomedical image analysis. The active contour or “snake” deforms towards a target object under the control of internal, image, and constraint forces. However, if the contour is initialized with too few control points, there is a high probability that it will bypass the sharp corners of the object during deformation. In this paper, a new technique is proposed to construct the initial contour by incorporating prior knowledge of significant corners of the object detected using the Harris operator. This reconstructed contour then deforms, attracting the snake towards the targeted object without missing the corners. Experimental results with several synthetic images show the ability of the new technique to handle sharp corners with higher accuracy than traditional methods.
Keywords: Active contours, image segmentation, Harris operator, snakes
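A minimal sketch of the pipeline, Harris corners feeding the initial snake, using scikit-image; the test image, the angular ordering of the corner points, and the snake parameters are assumptions for illustration, not the authors' construction.

```python
import numpy as np
from skimage import data
from skimage.color import rgb2gray
from skimage.feature import corner_harris, corner_peaks
from skimage.filters import gaussian
from skimage.segmentation import active_contour

img = rgb2gray(data.astronaut())                 # placeholder image

# Prior corner detection with the Harris operator
corners = corner_peaks(corner_harris(img), min_distance=10)

# Build the initial contour from the detected corners, ordered by angle around
# their centroid (a simplified stand-in for the contour-construction step
# described in the paper).
centre = corners.mean(axis=0)
angles = np.arctan2(corners[:, 0] - centre[0], corners[:, 1] - centre[1])
init = corners[np.argsort(angles)].astype(float)

# Deform the snake on a smoothed version of the image.
snake = active_contour(gaussian(img, 3), init, alpha=0.015, beta=10, gamma=0.001)
print(snake.shape)
```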
534 High Performance Liquid Chromatographic Method for Determination of Colistin Sulfate and its Application in Medicated Premix and Animal Feed
Authors: S. Choosakoonkriang, S. Supaluknari, P. Puangkaew
Abstract:
The aim of the present study was to develop and validate an inexpensive and simple high performance liquid chromatographic (HPLC) method for the determination of colistin sulfate. Separation of colistin sulfate was achieved on a ZORBAX Eclipse XDB-C18 column using UV detection at λ = 215 nm. The mobile phase was 30 mM sulfate buffer (pH 2.5):acetonitrile (76:24). Excellent linearity (r² = 0.998) was found in the concentration range of 25-400 μg/mL. Intra-day and inter-day precisions of the method (%RSD, n = 3) were less than 7.9%. The developed and validated method was applied to the determination of colistin sulfate content in medicated premix and animal feed samples. The recovery of colistin from animal feed was satisfactory, ranging from 90.92 to 93.77%. The results demonstrate that the HPLC method developed in this work is appropriate for the direct determination of colistin sulfate in commercial medicated premixes and animal feed.
Keywords: Colistin sulfate, HPLC, medicated premix, animal feed
533 ROC Analysis of PVC Detection Algorithm using ECG and Vector-ECG Characteristics
Authors: J. S. Nah, A. Y. Jeon, J. H. Ro, G. R. Jeon
Abstract:
An ECG analysis method was developed using ROC analysis of a PVC detection algorithm. ECG signals from the MIT-BIH arrhythmia database were analyzed in MATLAB. First, the baseline was removed by a median filter to preprocess the ECG signal. R peaks were detected for the ECG analysis, and the normal VCG was extracted for the VCG analysis. Four PVC detection parameters were analyzed by ROC curves: the maximum amplitude of the QRS complex, the width of the QRS complex, the R-R interval, and the geometric mean of the VCG. To set the cut-off value of each parameter, the ROC curve was estimated from the true-positive rate (sensitivity) and the false-positive rate; the sensitivity and specificity were calculated, and the ECG was analyzed using the cut-off value estimated from the ROC curve. As a result, the PVC detection algorithm based on the geometric mean of the VCG showed high availability, and PVCs could be detected more accurately when combined with the amplitude and width of the QRS complex.
Keywords: Vectorcardiogram (VCG), premature ventricular contraction (PVC), ROC (receiver operating characteristic) curve, ECG
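A minimal sketch of estimating a cut-off from a ROC curve with scikit-learn; the beat labels and scores are synthetic, and Youden's J statistic is used here as one common cut-off rule, since the abstract does not state the exact criterion.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical per-beat labels (1 = PVC) and one detection parameter,
# e.g. QRS width, measured for each beat (synthetic values for illustration).
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 500)
qrs_width = rng.random(500) + 0.5 * y_true

fpr, tpr, thresholds = roc_curve(y_true, qrs_width)
j = tpr - fpr                                       # Youden's J statistic
cutoff = thresholds[np.argmax(j)]                   # cut-off maximising sensitivity + specificity - 1
print("AUC:", roc_auc_score(y_true, qrs_width), "cut-off:", cutoff)
```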
532 Nonparametric Control Chart Using Density Weighted Support Vector Data Description
Authors: Myungraee Cha, Jun Seok Kim, Seung Hwan Park, Jun-Geol Baek
Abstract:
In manufacturing industries, advances in measurement have increased the number of monitored variables, and the importance of multivariate control has come to the fore. Statistical process control (SPC) charts are among the most widely used multivariate control charts. Nevertheless, SPC is restricted in its application because it assumes that the data follow a specific distribution. Unfortunately, process data are composed of a mixture of several processes and are hard to model with a single distribution. As an alternative to conventional SPC, nonparametric control charts come into the picture because of their main strength, the absence of parameter estimation. The SVDD-based control chart is one such nonparametric control chart, with the advantage of a flexible control boundary. However, the basic concept of SVDD overlooks an important data characteristic, the density distribution. Therefore, we propose DW-SVDD (Density Weighted SVDD) to compensate for this weakness of conventional SVDD. DW-SVDD makes a new attempt to account for the density of the data by introducing the notion of a density weight. We extend the proposed SVDD to a control chart, and a simulation study on data from various distributions is conducted to demonstrate the improvement in performance.
Keywords: Density estimation, multivariate control chart, one-class classification, support vector data description (SVDD)
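SVDD with an RBF kernel is closely related to the one-class SVM, so a nonparametric control boundary can be sketched with scikit-learn's OneClassSVM as a stand-in; the density-weighting step proposed in the paper is not shown, and the data, kernel width, and nu value are assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

# In-control training data: two correlated process variables (synthetic).
rng = np.random.default_rng(0)
cov = [[1.0, 0.6], [0.6, 1.0]]
X_train = rng.multivariate_normal([0, 0], cov, size=500)

# One-class boundary as a stand-in for the SVDD control boundary;
# nu roughly plays the role of the target false-alarm rate.
chart = OneClassSVM(kernel="rbf", gamma=0.5, nu=0.01).fit(X_train)

X_new = np.vstack([rng.multivariate_normal([0, 0], cov, size=5),
                   [[4.0, -3.0]]])                  # last point is out of control
print(chart.predict(X_new))   # +1 = inside the boundary, -1 = out-of-control signal
```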
531 Adaptive Gaussian Mixture Model for Skin Color Segmentation
Authors: Reza Hassanpour, Asadollah Shahbahrami, Stephan Wong
Abstract:
Skin-color-based tracking techniques often assume a static skin color model obtained either from an offline set of library images or from the first few frames of a video stream. These models can perform poorly in the presence of changing lighting or imaging conditions. We propose an adaptive skin color model based on the Gaussian mixture model to handle changing conditions. Initial estimates of the number and weights of the skin color clusters are obtained using a modified form of the general Expectation-Maximization algorithm. The model adapts to changes in imaging conditions and refines its parameters dynamically using spatial and temporal constraints. Experimental results show that the method can be used to effectively track hand and face regions.
Keywords: Face detection, segmentation, tracking, Gaussian mixture model, adaptation
530 Improvement of Ground Truth Data for Eye Location on Infrared Driver Recordings
Authors: Sorin Valcan, Mihail Găianu
Abstract:
Labeling is a very costly and time-consuming process which aims to generate datasets for training neural networks in several functionalities and projects. For driver monitoring system projects, the need for labeled images has a significant impact on the budget and the distribution of effort. This paper presents the modifications made to a ground truth data generation algorithm for 2D eye location on infrared images of drivers in order to improve the quality of the data and the performance of the trained neural networks. The algorithm's restrictions become stricter, which makes it more accurate but also less consistent. The resulting dataset becomes smaller and is not altered by any kind of manual label adjustment before being used in the neural network training process. These changes resulted in much better performance of the trained neural networks.
Keywords: Labeling automation, infrared camera, driver monitoring, eye detection, Convolutional Neural Networks.
529 Audio Watermarking Based on Compression-expansion Technique
Authors: Say Wei Foo, Qi Dong
Abstract:
A novel robust audio watermarking scheme is proposed in this paper. In the proposed scheme, the host audio signals are segmented into frames. Two consecutive frames are assessed to determine whether they are suitable to represent a watermark bit. If so, a frequency transform is performed on these two frames. The compression-expansion technique is adopted to generate distortion over the two frames, and the distortion is used to represent one watermark bit. A psychoacoustic model is applied to calculate the local auditory mask to ensure that the distortion is not audible. The watermarking schemes for mono and stereo audio signals are designed differently. A correlation-based detection method is used to detect the distortion and extract the embedded watermark bits. The experimental results show that the quality degradation caused by the embedded watermarks is perceptually transparent and that the proposed schemes are very robust against different types of attacks.
Keywords: Audio watermarking, compression-expansion, stereo signals, robustness
528 Massively-Parallel Bit-Serial Neural Networks for Fast Epilepsy Diagnosis: A Feasibility Study
Authors: Si Mon Kueh, Tom J. Kazmierski
Abstract:
About 1% of the world population suffers from the hidden disability known as epilepsy, and major developing countries are not fully equipped to counter this problem. In order to reduce the inconvenience and danger of epilepsy, different methods have been researched that use artificial neural network (ANN) classification to distinguish epileptic waveforms from normal brain waveforms. This paper outlines the aim of achieving massive ANN parallelization through dedicated hardware using bit-serial processing. The design of a bit-serial Neural Processing Element (NPE) is presented which implements the functionality of a complete neuron with variable accuracy. The proposed design has been tested taking into consideration the non-idealities of a hardware ANN. The NPE consists of a bit-serial multiplier which uses only 16 logic elements on an Altera Cyclone IV FPGA, a bit-serial ALU, and a look-up table. Arrays of NPEs can be driven by a single controller which executes the neural processing algorithm. In conclusion, the proposed compact NPE design allows the construction of complex hardware ANNs that can be implemented in portable equipment that suits the needs of a single epileptic patient in his or her daily activities to predict the occurrence of impending tonic-clonic seizures.
Keywords: Artificial neural networks, bit-serial neural processor, FPGA, neural processing element
527 Words Reordering based on Statistical Language Model
Authors: Theologos Athanaselis, Stelios Bakamidis, Ioannis Dologlou
Abstract:
There are multiple reasons to expect that detecting word order errors in a text is a difficult problem, and the detection rates reported in the literature are in fact low. Although grammatical rules constructed by computational linguists improve the performance of grammar checkers in word order diagnosis, the repair task is still very difficult. This paper presents an approach for repairing word order errors in English text by reordering the words in a sentence and choosing the version that maximizes the number of trigram hits according to a language model. The novelty of this method concerns the use of an efficient confusion matrix technique for reordering the words. The comparative advantage of this method is that it works with a large set of words and avoids the laborious and costly process of collecting word order errors to create error patterns.
Keywords: Permutations filtering, statistical language model, N-grams, word order errors
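A toy sketch of scoring candidate reorderings by trigram hits; the tiny trigram table is invented for illustration, and exhaustive permutation search stands in for the confusion-matrix filtering technique that the paper actually uses.

```python
from itertools import permutations

# Toy trigram counts standing in for a language model trained on a large corpus
# (hypothetical values for illustration only).
trigram_counts = {
    ("the", "cat", "sat"): 12,
    ("cat", "sat", "on"): 9,
    ("sat", "on", "the"): 15,
    ("on", "the", "mat"): 11,
}

def trigram_hits(words):
    """Number of trigrams in the candidate sentence found in the language model."""
    return sum(tuple(words[i:i + 3]) in trigram_counts for i in range(len(words) - 2))

def repair_word_order(words, max_words=7):
    """Choose the permutation with the most trigram hits (brute force for short
    sentences; the paper filters permutations instead of enumerating them)."""
    if len(words) > max_words:
        return words
    return max(permutations(words), key=trigram_hits)

print(repair_word_order(["the", "cat", "on", "sat", "the", "mat"]))
```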
526 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014
Authors: Alexiou Dimitra, Fragkaki Maria
Abstract:
The objective of the paper is the study of geographic, economic, and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis, and Factor Analysis have been used. The cross-tabulation tables of data consist of the values of seven variables for the 28 countries for 2014. The data are manipulated using the CHIC Analysis V 1.1 software package. The results of this program, using MFCA and Ascending Hierarchical Classification, are given in numerical and graphical form. For comparison, the Factor procedure of the statistical package IBM SPSS 20 has been applied to the same data. The numerical and graphical results, presented with tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relation between the 28 countries and the position of each country in groups or clouds, which are formed according to the values of the corresponding variables.
Keywords: Multiple factorial correspondence analysis, principal component analysis, factor analysis, E.U.-28 countries, statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu statistics.
525 Expert Witness Testimony in the Battered Woman Syndrome
Authors: Ana Pauna
Abstract:
Expert witness testimony (EWT) is information given by an expert specialized in the field (here, battered woman syndrome, BWS) to the jury in order to help the court better understand the case. EWT does not always work in favor of battered women. Two main decision-making models are discussed in the paper: the mathematical model and the explanation model. In the first model, the jurors calculate “the importance and strength of each piece of evidence,” whereas in the second model they try to integrate the EWT with the evidence and create a coherent story that describes the crime. The jury often misunderstands and misjudges battered women for their action (or, in this case, inaction). They assume that these women are masochists who accept being mistreated, reasoning that if a man abuses a woman constantly, she could and should divorce him or simply leave at any time. Research in the domain has found that expert witness testimony indeed has a powerful influence on jurors' decisions, so its quality needs to be further explored. One of the important factors that needs further study is a bias called the dispositionist worldview (a belief that what happens to people is of their own doing). This kind of attributional bias represents a tendency to think that a person's behavior is due to his or her disposition, even when the behavior is clearly attributable to the situation.
Hypothesis: The hypothesis of this paper is that if a juror has a dispositionist worldview, then he or she will blame the rape victim for triggering the assault. The juror would therefore commit the fundamental attribution error and believe that the victim's disposition, and not the situation she was in, caused the rape.
Methods: The subjects in the study were 500 randomly sampled undergraduate students from McGill, Concordia, Université de Montréal and UQAM. Dispositional worldview was scored on the Dispositionist Worldview Questionnaire. After reading the rape scenarios, each student was asked to play the role of a juror and answer a questionnaire consisting of 7 questions about the responsibility, causality, and fault of the victim.
Results: The results confirm the hypothesis, which states that if a juror has a dispositionist worldview then he or she will blame the rape victim for triggering the assault. By doing so, the juror commits the fundamental attribution error, believing that the victim's disposition, and not the constraints or opportunities of the situation, caused the rape scenario.
Keywords: Bias, expert/witness testimony, attribution error, jury, rape myth
524 Machine Learning Methods for Environmental Monitoring and Flood Protection
Authors: Alexander L. Pyayt, Ilya I. Mokhov, Bernhard Lang, Valeria V. Krzhizhanovskaya, Robert J. Meijer
Abstract:
More and more natural disasters are happening every year: floods, earthquakes, volcanic eruptions, etc. In order to reduce the risk of possible damage, governments all around the world are investing in the development of Early Warning Systems (EWS) for environmental applications. The most important task of an EWS is the identification of the onset of critical situations affecting the environment and population, early enough to inform the authorities and the general public. This paper describes an approach for monitoring flood protection systems based on machine learning methods. An Artificial Intelligence (AI) component has been developed for the detection of abnormal dike behaviour. The AI module has been integrated into the EWS platform of the UrbanFlood project (EU Seventh Framework Programme) and validated on real-time measurements from sensors installed in a dike.
Keywords: Early Warning System, intelligent environmental monitoring, machine learning, flood protection
523 Fast Facial Feature Extraction and Matching with Artificial Face Models
Authors: Y. H. Tsai, Y. W. Chen
Abstract:
Facial features are frequently used to represent local properties of a human face image in computer vision applications. In this paper, we present a fast algorithm that can extract facial features online such that they give a satisfying representation of a face image. It includes one step for a coarse detection of each facial feature by AdaBoost and another to increase the accuracy of the found points by Active Shape Models (ASM) in the regions of interest. The resulting facial features are evaluated by matching them with artificial face models in physiognomy applications. The distance between the features and those of the face models from the database is measured by means of the Hausdorff distance. In the experiments, the proposed method shows efficient performance in facial feature extraction and in an online physiognomy system.
Keywords: Facial feature extraction, AdaBoost, active shape model, Hausdorff distance
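A short sketch of the matching step using the Hausdorff distance from SciPy; the landmark coordinates are hypothetical, and the symmetric max-of-directed-distances form is one common convention, as the abstract does not specify which variant is used.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Hypothetical 2-D landmark sets: features extracted from a face image and the
# corresponding points of an artificial face model from the database.
extracted = np.array([[30, 42], [70, 41], [50, 60], [50, 80]], dtype=float)
model     = np.array([[31, 44], [69, 43], [51, 62], [49, 83]], dtype=float)

# Symmetric Hausdorff distance: maximum of the two directed distances.
d = max(directed_hausdorff(extracted, model)[0],
        directed_hausdorff(model, extracted)[0])
print("Hausdorff distance to this face model:", d)
```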
522 Land Suitability Analysis for Maize Production in Egbeda Local Government Area of Oyo State Using GIS Techniques
Authors: Abegunde Linda, Adedeji Oluwatola, Tope-Ajayi Opeyemi
Abstract:
Maize constitutes a major agrarian product consumed by a vast population, but despite its economic importance, it has not been produced in quantities that meet the needs of the country. Achieving optimum maize yield can meaningfully be supported by land suitability analysis in order to guarantee self-sufficiency and optimize future production. This study examines land suitability for maize production through the analysis of the physicochemical variations in soil properties and other land attributes over space using a Geographic Information System (GIS) framework. The physicochemical parameters selected include slope, land use, physical and chemical properties of the soil, and climatic variables. Landsat imagery was used to categorize land use, Shuttle Radar Topography Mission (SRTM) data were used to generate slope, and soil samples were analyzed for their physical and chemical components. Suitability was categorized into highly, moderately, and marginally suitable classes based on the Food and Agriculture Organization (FAO) classification, using the Analytic Hierarchy Process (AHP) technique within GIS. This result can be used by small-scale farmers for efficient decision making in the allocation of land for maize production.
Keywords: AHP, GIS, MCE, Suitability, Zea mays.
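A minimal sketch of the AHP weighting step; the pairwise comparison values for the four criteria are invented for illustration, and the eigenvector method with Saaty's consistency ratio is the standard AHP procedure rather than the study's specific judgments.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty scale) for four criteria:
# slope, land use, soil properties, climate.  Values are for illustration only.
A = np.array([[1,   3,   2,   4],
              [1/3, 1,   1/2, 2],
              [1/2, 2,   1,   3],
              [1/4, 1/2, 1/3, 1]], dtype=float)

# Criterion weights = normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is conventionally acceptable); RI = 0.90 for n = 4.
n, RI = A.shape[0], 0.90
CI = (eigvals.real[k] - n) / (n - 1)
print("weights:", weights.round(3), "CR:", round(CI / RI, 3))
```

The resulting weights would then be applied to the reclassified criterion layers in the GIS overlay to produce the suitability map.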
521 Local Curvelet Based Classification Using Linear Discriminant Analysis for Face Recognition
Authors: Mohammed Rziza, Mohamed El Aroussi, Mohammed El Hassouni, Sanaa Ghouzali, Driss Aboutajdine
Abstract:
In this paper, an efficient local appearance feature extraction method based on the multi-resolution Curvelet transform is proposed in order to further enhance the performance of the well-known Linear Discriminant Analysis (LDA) method when applied to face recognition. Each face is described by a subset of band-filtered images containing block-based Curvelet coefficients. These coefficients characterize the face texture, and a set of simple statistical measures allows us to form compact and meaningful feature vectors. The proposed method is compared with related feature extraction methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Independent Component Analysis (ICA). Two other multi-resolution transforms, Wavelet (DWT) and Contourlet, were also compared against the block-based Curvelet-LDA algorithm. Experimental results on the ORL, YALE, and FERET face databases convince us that the proposed method provides a better representation of the class information and obtains much higher recognition accuracies.
Keywords: Curvelet, Linear Discriminant Analysis (LDA), Contourlet, Discrete Wavelet Transform (DWT), block-based analysis, face recognition (FR)
520 A Monte Carlo Method to Data Stream Analysis
Authors: Kittisak Kerdprasop, Nittaya Kerdprasop, Pairote Sattayatham
Abstract:
Data stream analysis is the process of computing various summaries and derived values from large amounts of data which are continuously generated at a rapid rate. The nature of a stream does not allow a revisit of each data element. Furthermore, data processing must be fast to produce timely analysis results. These requirements impose constraints on the design of the algorithms to balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. They can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented scheme solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem: the data stream is statistically transformed to a smaller size, and its characteristics are computationally approximated. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed by a series of experiments. We apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
Keywords: Data stream, Monte Carlo, sampling, density estimation
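As a generic illustration of sampling from a stream that can be seen only once, a standard reservoir-sampling sketch is given below; it is not the EMR sampling method proposed in the paper, and the stream and sample size are placeholders.

```python
import random

def reservoir_sample(stream, k, seed=0):
    """Maintain a uniform random sample of k items from a stream seen only once
    (standard reservoir sampling, used here as a generic stand-in for the
    reduction step, not the paper's EMR method)."""
    rng = random.Random(seed)
    reservoir = []
    for n, item in enumerate(stream, start=1):
        if n <= k:
            reservoir.append(item)
        else:
            j = rng.randrange(n)            # keep the new item with probability k/n
            if j < k:
                reservoir[j] = item
    return reservoir

print(reservoir_sample(range(1_000_000), k=10))
```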
519 Vessel Inscribed Trigonometry to Measure the Vessel Progressive Orientations in the Digital Fundus Image
Authors: Pil Un Kim, Yunjung Lee, Gihyoun Lee, Jin Ho Cho, Myoung Nam Kim
Abstract:
In this paper, the vessel inscribed trigonometry (VITM) for the vessel progression orientation (VPO) is proposed for the two-dimensional fundus image. The VPO is a major factor in optic disc (OD) detection, which is a basic process in retina analysis. To measure the VPO, vessel skeletons are used. First, the vessels are classified into three classes: vessel end, vessel branch, and vessel stem (VS). Then the chain code maps of the VS are generated. Next, the two farthest neighborhoods of each point on a VS are searched using the proposed angle restriction. Lastly, the gradient of the straight line between the two farthest neighborhoods is estimated to measure the VPO. VITM is validated by comparison with manual results and 2D Gaussian templates. Experiments applying VITM to detect the OD in fundus images confirm that the VPO obtained by the proposed measurement is accurate enough to detect the OD.
Keywords: Angle measurement, Optic disc, Retina vessel, Vessel progression orientation.
518 Fingerprint Identification using Discretization Technique
Authors: W. Y. Leng, S. M. Shamsuddin
Abstract:
The fingerprint-based identification system is one of the well-known biometric systems in the area of pattern recognition and has long been studied for its important role in forensic science, where it supports the government criminal justice community. In this paper, we propose an identification framework for individuals by means of fingerprints. Unlike most conventional fingerprint identification frameworks, the extracted Geometrical Element Features (GEFs) go through a Discretization process. The intention of Discretization in this study is to obtain unique individual features that reflect individual variability in order to discriminate one person from another. Previously, Discretization has been shown to yield particularly efficient identification on English handwriting with an accuracy of 99.9% and on discrimination of twins' handwriting with an accuracy of 98%. Due to its high discriminative power, this method is adopted into this framework as an independent method to assess the accuracy of fingerprint identification. Finally, the experimental results show that the identification accuracy of the proposed system using Discretization is 100% for FVC2000, 93% for FVC2002, and 89.7% for FVC2004, which is much better than conventional or existing fingerprint identification systems (72% for FVC2000, 26% for FVC2002, and 32.8% for FVC2004). The results indicate that the Discretization approach boosts the classification effectively and therefore proves to be suitable for other biometric features besides handwriting and fingerprints.
Keywords: Discretization, fingerprint identification, geometrical features, pattern recognition
517 Energy Efficient Clustering and Data Aggregation in Wireless Sensor Networks
Authors: Surender Kumar Soni
Abstract:
Wireless Sensor Networks (WSNs) are wireless networks consisting of a number of tiny, low-cost, and low-power sensor nodes that monitor various physical phenomena such as temperature, pressure, vibration, landslides, the presence of objects, etc. The major limitation of these networks is the use of non-rechargeable batteries with a limited power supply, and the main cause of energy consumption in a WSN is the communication subsystem. This paper presents an efficient grid formation/clustering strategy known as Grid based level Clustering and Aggregation of Data (GCAD). The proposed clustering strategy is simple and scalable and uses a low-duty-cycle approach to keep non-cluster-head (non-CH) nodes in sleep mode, thus reducing energy consumption. Simulation results demonstrate that the proposed GCAD protocol performs better on various performance metrics.
Keywords: Ad hoc network, cluster, grid-based clustering, wireless sensor network
516 Long Short-Term Memory Based Model for Modeling Nicotine Consumption Using an Electronic Cigarette and Internet of Things Devices
Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi
Abstract:
In this paper, we want to determine whether an accurate prediction of nicotine concentration can be obtained by using a network of smart objects and an e-cigarette. The approach consists of, first, the recognition of factors influencing smoking cessation, such as physical activity and participants' behaviors (using both a smartphone and a smartwatch), and then the prediction of the configuration of the e-cigarette (in terms of nicotine concentration, power, and resistance). The study uses a network of commonly connected objects: a smartwatch, a smartphone, and an e-cigarette carried by the participants during an uncontrolled experiment. The data obtained from the sensors of the three devices were used to train a Long Short-Term Memory (LSTM) network. Results show that our LSTM-based model allows predicting the configuration of the e-cigarette in terms of nicotine concentration, power, and resistance with root mean square error percentages of 12.9%, 9.15%, and 11.84%, respectively. This study can help to better control nicotine consumption and offer users an intelligent configuration of the e-cigarette.
Keywords: IoT, activity recognition, automatic classification, unconstrained environment, deep neural networks.
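A minimal sketch of an LSTM regressor for this kind of multi-sensor, multi-target prediction, written with TensorFlow/Keras; the window length, feature count, network sizes, and random training data are all assumptions for illustration, not the authors' architecture.

```python
import numpy as np
import tensorflow as tf

# Hypothetical training data: windows of 30 time steps with 6 sensor features
# (accelerometer, heart rate, ...) and 3 regression targets (nicotine
# concentration, power, resistance).  Shapes and values are illustrative only.
X = np.random.rand(256, 30, 6).astype("float32")
y = np.random.rand(256, 3).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(30, 6)),   # sequence encoder
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(3),                        # nicotine concentration, power, resistance
])
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError()])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
```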
515 Post-Compression Consideration in Video Watermarking for Wireless Communication
Authors: Chuen-Ching Wang, Yao-Tang Chang, Yu-Chang Hsu
Abstract:
A simple but effective digital watermarking scheme utilizing a context-adaptive variable length coding (CAVLC) method is presented for wireless communication systems. In the proposed approach, the watermark bits are embedded in the final non-zero quantized coefficient of each DCT block, thereby yielding a potential reduction in the length of the coded block. As a result, the watermarking scheme not only provides the means to check the authenticity and integrity of the video stream, but also improves the compression ratio and therefore reduces both the transmission time and the storage space requirements of the coded video sequence. The results confirm that the proposed scheme enables the detection of malicious tampering attacks and reduces the size of the coded H.264 file. Therefore, the current study is applicable to video applications in wireless communication such as 3G systems.
Keywords: 3G, wireless communication, CAVLC, digital watermarking, motion compensation
514 Machine Learning for Aiding Meningitis Diagnosis in Pediatric Patients
Authors: Karina Zaccari, Ernesto Cordeiro Marujo
Abstract:
This paper presents a Machine Learning (ML) approach to support meningitis diagnosis in patients at a children's hospital in Sao Paulo, Brazil. The aim is to use ML techniques to reduce the use of invasive procedures, such as cerebrospinal fluid (CSF) collection, as much as possible. In this study, we focus on predicting the probability of meningitis given the results of blood and urine laboratory tests, together with the analysis of pain or other complaints from the patient. We tested a number of different ML algorithms, including Adaptive Boosting (AdaBoost), Decision Tree, Gradient Boosting, K-Nearest Neighbors (KNN), Logistic Regression, Random Forest, and Support Vector Machines (SVM). The Decision Tree algorithm performed best, with 94.56% and 96.18% accuracy for training and testing data, respectively. These results represent a significant aid to doctors in diagnosing meningitis as early as possible and in preventing expensive and painful procedures for some children.
Keywords: Machine learning, medical diagnosis, meningitis detection, gradient boosting.
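A minimal sketch of training and evaluating a decision tree classifier with scikit-learn on this kind of tabular data; the synthetic features and labels merely stand in for the blood/urine test results and complaint indicators used in the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical tabular features standing in for laboratory results and symptom
# indicators; labels are 1 = Meningitis, 0 = other (synthetic for illustration).
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)

print("train accuracy:", accuracy_score(y_tr, clf.predict(X_tr)))
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```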