Search results for: Gaussian process classification model with multiclass
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12165

11745 Dynamic Measurement System Modeling with Machine Learning Algorithms

Authors: Changqiao Wu, Guoqing Ding, Xin Chen

Abstract:

In this paper, ways of modeling dynamic measurement systems are discussed. Specifically, a linear single-input single-output system can be modeled with a shallow neural network, and gradient-based optimization algorithms are then used to search for the proper coefficients. In addition, methods based on the normal equation and on second-order gradient descent are proposed to accelerate the modeling process, and ways of obtaining better gradient estimates are discussed. It is shown that the mathematical essence of the learning objective is maximum likelihood under Gaussian noise. For conventional gradient descent, mini-batch learning and gradient descent with momentum contribute to faster convergence and enhance model capability. Lastly, experimental results prove the effectiveness of the second-order gradient descent algorithm and indicate that optimization with the normal equation is the most suitable for linear dynamic models.

Keywords: Dynamic system modeling, neural network, normal equation, second order gradient descent.
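
The normal-equation fit mentioned in the abstract can be illustrated with a minimal sketch (hypothetical data and model order, not the authors' implementation): a linear SISO system is approximated by a regression on lagged inputs, and the coefficients follow in closed form, which coincides with maximum likelihood under Gaussian noise.

```python
import numpy as np

# Hypothetical example: fit y[k] ~ sum_i w[i] * u[k - i] for a linear SISO system.
rng = np.random.default_rng(0)
u = rng.normal(size=500)                      # input signal
true_w = np.array([0.5, 0.3, -0.2])           # assumed FIR coefficients
y = np.convolve(u, true_w)[: len(u)] + 0.01 * rng.normal(size=len(u))  # output + Gaussian noise

order = 3
# Regression matrix of lagged inputs; skip the first rows to avoid wrap-around.
X = np.column_stack([np.roll(u, i) for i in range(order)])[order:]
t = y[order:]

# Normal equation: w = (X^T X)^{-1} X^T t.
w_hat = np.linalg.solve(X.T @ X, X.T @ t)
print("estimated coefficients:", w_hat)
```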

11744 Automatic Flood Prediction Using Rainfall Runoff Model in Moravian-Silesian Region

Authors: B. Sir, M. Podhoranyi, S. Kuchar, T. Kocyan

Abstract:

Rainfall-runoff models play an important role in hydrological predictions. However, the model is only one part of the process of creating a flood prediction. The aim of this paper is to show the process behind a successful prediction for a flood event (May 15 – May 18, 2014). The prediction was performed with the rainfall-runoff model HEC-HMS, one of the models computed within the Floreon+ system. The paper briefly evaluates the results of automatic hydrologic prediction on the Olše river catchment and its gauges Český Těšín and Věřňovice.

Keywords: Flood, HEC-HMS, prediction, rainfall-runoff.

11743 Modeling and Optimization of Abrasive Waterjet Parameters using Regression Analysis

Authors: Farhad Kolahan, A. Hamid Khajavi

Abstract:

Abrasive waterjet is a novel machining process capable of processing a wide range of hard-to-machine materials. This research addresses modeling and optimization of the process parameters for this machining technique. To model the process, a set of experimental data has been used to evaluate the effects of various parameter settings in cutting 6063-T6 aluminum alloy. The process variables considered here include nozzle diameter, jet traverse rate, jet pressure and abrasive flow rate. Depth of cut, as one of the most important output characteristics, has been evaluated based on different parameter settings. The Taguchi method and regression modeling are used to establish the relationships between input and output parameters. The adequacy of the model is evaluated using the analysis of variance (ANOVA) technique. The pairwise effects of process parameter settings on the output responses are also shown graphically. The proposed model is then embedded into a simulated annealing algorithm to optimize the process parameters. The optimization is carried out for any desired value of depth of cut; the objective is to determine proper levels of the process parameters in order to obtain a certain depth of cut. Computational results demonstrate that the proposed solution procedure is quite effective in solving such multi-variable problems.

Keywords: AWJ cutting, Mathematical modeling, Simulated Annealing, Optimization
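
A minimal simulated annealing sketch for choosing parameter levels against a target depth of cut, assuming a hypothetical regression model depth_of_cut() and hypothetical parameter bounds; the real regression coefficients and ranges come from the paper's experiments.

```python
import math
import random

def depth_of_cut(x):
    """Hypothetical regression model: depth of cut as a function of
    (nozzle diameter, traverse rate, pressure, abrasive flow rate)."""
    d, v, p, m = x
    return 0.8 * p + 0.5 * m - 0.6 * v + 0.3 * d   # placeholder coefficients

def anneal(target, bounds, iters=5000, t0=1.0, alpha=0.999):
    x = [random.uniform(lo, hi) for lo, hi in bounds]
    best, best_err = x[:], abs(depth_of_cut(x) - target)
    temp = t0
    for _ in range(iters):
        # Perturb one randomly chosen parameter within its bounds.
        cand = x[:]
        i = random.randrange(len(x))
        lo, hi = bounds[i]
        cand[i] = min(hi, max(lo, cand[i] + random.gauss(0, 0.1 * (hi - lo))))
        err, cur_err = abs(depth_of_cut(cand) - target), abs(depth_of_cut(x) - target)
        # Accept improvements always, worse moves with Boltzmann probability.
        if err < cur_err or random.random() < math.exp((cur_err - err) / temp):
            x = cand
        if err < best_err:
            best, best_err = cand[:], err
        temp *= alpha
    return best, best_err

bounds = [(0.5, 1.5), (50, 300), (100, 400), (0.1, 1.0)]   # assumed parameter ranges
print(anneal(target=2.0, bounds=bounds))
```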

11742 A New Approach for the Fingerprint Classification Based On Gray-Level Co-Occurrence Matrix

Authors: Mehran Yazdi, Kazem Gheysari

Abstract:

In this paper, we propose an approach for the classification of fingerprint databases. It is based on the fact that a fingerprint image is composed of regular texture regions that can be successfully represented by co-occurrence matrices. We first extract features based on certain characteristics of the co-occurrence matrix and then use these features to train a neural network for classifying fingerprints into four common classes. The obtained results, compared with those of existing approaches, demonstrate the superior performance of our proposed approach.

Keywords: Biometrics, fingerprint classification, gray-level co-occurrence matrix, regular texture representation.
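
A minimal numpy sketch of the gray-level co-occurrence matrix used as the texture descriptor (single offset, hypothetical quantization); features such as contrast or energy computed from it would feed the neural network classifier described in the abstract.

```python
import numpy as np

def glcm(image, levels=8, dx=1, dy=0):
    """Normalized co-occurrence counts of gray-level pairs at offset (dx, dy)."""
    q = (image.astype(float) / image.max() * (levels - 1)).astype(int)   # quantize gray levels
    mat = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            mat[q[y, x], q[y + dy, x + dx]] += 1
    return mat / mat.sum()

def glcm_features(mat):
    i, j = np.indices(mat.shape)
    return {
        "contrast": float(((i - j) ** 2 * mat).sum()),
        "energy": float((mat ** 2).sum()),
        "homogeneity": float((mat / (1.0 + np.abs(i - j))).sum()),
    }

img = (np.random.default_rng(0).random((64, 64)) * 255).astype(np.uint8)  # stand-in fingerprint patch
print(glcm_features(glcm(img)))
```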

11741 A Supervised Learning Data Mining Approach for Object Recognition and Classification in High Resolution Satellite Data

Authors: Mais Nijim, Rama Devi Chennuboyina, Waseem Al Aqqad

Abstract:

Advances in the spatial and spectral resolution of satellite images have led to tremendous growth in large image databases. The data we acquire through satellites, radars, and sensors consists of important geographical information that can be used for remote sensing applications such as region planning and disaster management. Spatial data classification and object recognition are important tasks for many applications; however, classifying objects and identifying them manually from images is difficult. Object recognition is often considered a classification problem, and this task can be performed using machine-learning techniques. Although many machine-learning algorithms exist, the classification is done using supervised classifiers such as Support Vector Machines (SVM), as the area of interest is known. We propose a classification method which considers neighboring pixels in a region for feature extraction and evaluates classifications precisely according to neighboring classes for semantic interpretation of a region of interest (ROI). A dataset has been created for training and testing purposes; we generated the attributes by considering pixel intensity values and mean reflectance values. We demonstrate the benefits of using knowledge discovery and data-mining techniques, which can be applied to image data for accurate information extraction and classification from high spatial resolution remote sensing imagery.

Keywords: Remote sensing, object recognition, classification, data mining, waterbody identification, feature extraction.

11740 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting

Authors: Yiannis G. Smirlis

Abstract:

The classification and prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large-scale problems or when new units frequently enter the set under assessment. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals, combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores without applying any DEA models.

Keywords: Data envelopment analysis, interval DEA, efficiency classification, efficiency prediction.
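
A small sketch of the dominance check between hyper-rectangles that such a grid pre-processor could rely on, under the usual DEA convention (smaller inputs and larger outputs dominate); the interval DEA models themselves are not reproduced here, and the cell layout shown is a hypothetical one.

```python
def dominates(cell_a, cell_b):
    """True if every point of hyper-rectangle A dominates every point of B:
    A's worst inputs (upper bounds) do not exceed B's best inputs (lower bounds),
    and A's worst outputs (lower bounds) are not below B's best outputs (upper bounds).
    A cell is ((in_lo, in_hi), (out_lo, out_hi)) with one tuple entry per dimension."""
    (a_in_lo, a_in_hi), (a_out_lo, a_out_hi) = cell_a
    (b_in_lo, b_in_hi), (b_out_lo, b_out_hi) = cell_b
    inputs_ok = all(a <= b for a, b in zip(a_in_hi, b_in_lo))
    outputs_ok = all(a >= b for a, b in zip(a_out_lo, b_out_hi))
    return inputs_ok and outputs_ok

# Hypothetical single-input, single-output grid cells.
cell_a = (((1.0,), (2.0,)), ((8.0,), (9.0,)))   # inputs in [1, 2], outputs in [8, 9]
cell_b = (((3.0,), (4.0,)), ((5.0,), (6.0,)))   # inputs in [3, 4], outputs in [5, 6]
print(dominates(cell_a, cell_b))                 # True: any unit in A dominates any unit in B
```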

11739 Torrefaction of Biomass Pellets: Modeling of the Process in a Fixed Bed Reactor

Authors: Ekaterina Artiukhina, Panagiotis Grammelis

Abstract:

Torrefaction of biomass pellets is considered a useful pretreatment technology for converting them into a high-quality solid biofuel that is more suitable for pyrolysis, gasification, combustion, and co-firing applications. In the course of torrefaction, the temperature varies across the pellet, and therefore chemical reactions proceed unevenly within the pellet; nevertheless, a uniform thermal distribution along the pellet is generally assumed. The torrefaction process of a single cylindrical pellet is modeled here, accounting for heat transfer coupled with chemical kinetics. A drying sub-model is also introduced. The non-stationary process of wood pellet decomposition is described by a system of non-linear partial differential equations for temperature and mass. The model captures well the main features of the experimental data.

Keywords: Torrefaction, biomass pellets, model, heat and mass transfer.
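
A minimal finite-difference sketch of coupled heat conduction and first-order decomposition of the kind described in the abstract, with the pellet approximated as a one-dimensional slab and with hypothetical material and kinetic constants (not the paper's fitted values, and without the drying sub-model).

```python
import numpy as np

# Hypothetical constants.
alpha = 1e-7                      # thermal diffusivity, m^2/s
k0, Ea, R = 1e4, 8e4, 8.314       # Arrhenius pre-factor (1/s), activation energy (J/mol), gas constant
L, n, dt, steps = 0.005, 50, 0.05, 20000

x = np.linspace(0, L, n)
dx = x[1] - x[0]
T = np.full(n, 293.0)             # initial pellet temperature, K
m = np.ones(n)                    # normalized remaining solid mass

for _ in range(steps):
    T[0] = T[-1] = 553.0                         # reactor temperature as boundary condition
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * alpha * lap                  # heat conduction (explicit Euler)
    rate = k0 * np.exp(-Ea / (R * T))            # first-order Arrhenius decomposition
    m = np.clip(m - dt * rate * m, 0.0, 1.0)

print("centre temperature %.1f K, mean residual mass %.3f" % (T[n // 2], m.mean()))
```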

11738 A New Vector Quantization Front-End Process for Discrete HMM Speech Recognition System

Authors: M. Debyeche, J.P Haton, A. Houacine

Abstract:

The paper presents a complete discrete statistical framework based on a novel vector quantization (VQ) front-end process. This new VQ approach performs an optimal distribution of VQ codebook components over HMM states. This technique, which we named the distributed vector quantization (DVQ) of hidden Markov models, succeeds in unifying the acoustic micro-structure and the phonetic macro-structure when the HMM parameters are estimated. The DVQ technique is implemented through two variants. The first variant uses the K-means algorithm (K-means-DVQ) to optimize the VQ, while the second exploits the classification behavior of neural networks (NN-DVQ) for the same purpose. The proposed variants are compared with the HMM-based baseline system in experiments on the recognition of specific Arabic consonants. The results show that the distributed vector quantization technique increases the performance of the discrete HMM system.

Keywords: Hidden Markov Model, Vector Quantization, Neural Network, Speech Recognition, Arabic Language
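
A minimal sketch of the K-means codebook step underlying the K-means-DVQ variant, using a generic feature matrix with hypothetical dimensions; the distribution of codebook components over HMM states and the HMM training itself are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.normal(size=(2000, 13))        # stand-in acoustic feature vectors (e.g. MFCC-like)

codebook_size = 64
kmeans = KMeans(n_clusters=codebook_size, n_init=10, random_state=0).fit(features)

# Each frame is replaced by the index of its nearest codeword; a discrete HMM
# would then be trained on these symbol sequences.
symbols = kmeans.predict(features)
print("codebook shape:", kmeans.cluster_centers_.shape, "first symbols:", symbols[:10])
```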

11737 Gaussian Particle Flow Bernoulli Filter for Single Target Tracking

Authors: Hyeongbok Kim, Lingling Zhao, Xiaohong Su, Junjie Wang

Abstract:

The Bernoulli filter is a precise Bayesian filter for single target tracking based on random finite set theory. The standard Bernoulli filter often underestimates the number of targets. This study proposes a Gaussian particle flow (GPF) Bernoulli filter employing particle flow to migrate particles from prior to posterior positions, improving the performance of the standard Bernoulli filter. By employing the particle flow filter, the computational speed of the Bernoulli filter is significantly improved. In addition, the GPF Bernoulli filter provides more accurate estimation than the standard Bernoulli filter. Simulation results confirm the improved tracking performance and computational speed in two- and three-dimensional scenarios compared with other algorithms.

Keywords: Bernoulli filter, particle filter, particle flow filter, random finite sets, target tracking.

11736 Pattern Recognition of Partial Discharge by Using Simplified Fuzzy ARTMAP

Authors: S. Boonpoke, B. Marungsri

Abstract:

This paper presents the effectiveness of an artificial intelligence technique applied to pattern recognition and classification of partial discharge (PD). Characteristics of the PD signal for pattern recognition and classification are computed from the relation between the voltage phase angle, the discharge magnitude and the repeated occurrence of partial discharges, using statistical and fractal methods. The simplified fuzzy ARTMAP (SFAM) is used as the artificial intelligence technique for pattern recognition and classification. PD quantities, 13 parameters obtained from the statistical and fractal methods, are input to the simplified fuzzy ARTMAP to train the system for pattern recognition and classification. The results confirm the effectiveness of the proposed technique.

Keywords: Partial discharges, PD pattern recognition, PD classification, artificial intelligence, simplified fuzzy ARTMAP

11735 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition

Authors: A. Bayaga

Abstract:

This research provides a technical account of estimating transition probabilities using a time-homogeneous Markov jump process applied to South African HIV/AIDS data from Statistics South Africa. It employs a maximum likelihood estimation (MLE) model to explore the possible influence of transition probabilities on mortality cases, with the data based on actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic. The model was fitted to age-specific HIV prevalence data and recorded death data using MLE. Although previous model results suggest that HIV prevalence in South Africa has declined and AIDS mortality rates have fallen over 2002–2013, our results differ markedly from the generally accepted HIV models (Spectrum/EPP and ASSA2008) for South Africa. Supplementary research is needed to refine the demographic parameters in the model and to apply it to each of the nine provinces of South Africa.

Keywords: AIDS mortality rates, Epidemiological model, Time-homogeneous Markov Jump Process, Transition Probability, Statistics South Africa.
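
For a time-homogeneous Markov jump process, the maximum likelihood estimate of a transition intensity is the number of observed i-to-j transitions divided by the total time spent in state i. A small sketch with hypothetical event records follows; the state names and values are illustrative only, not the study's data.

```python
from collections import defaultdict

# Hypothetical records: (state_from, state_to, years_spent_in_state_from_before_jump).
records = [
    ("HIV+", "AIDS", 4.0), ("HIV+", "AIDS", 2.5), ("HIV+", "dead", 6.0),
    ("AIDS", "dead", 1.5), ("AIDS", "dead", 2.0),
]

transitions = defaultdict(int)
exposure = defaultdict(float)
for s_from, s_to, t in records:
    transitions[(s_from, s_to)] += 1
    exposure[s_from] += t

# MLE: lambda_ij = n_ij / total time at risk in state i.
intensities = {pair: n / exposure[pair[0]] for pair, n in transitions.items()}
print(intensities)
```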

11734 A Business Model Design Process for Social Enterprises: The Critical Role of the Environment

Authors: Hadia Abdel Aziz, Raghda El Ebrashi

Abstract:

Business models are shaped by their design space, the environment they are designed to be implemented in. The rapidly changing economic, technological, political, regulatory and market environment severely affects business logic. This is particularly true for social enterprises, whose core mission is to transform their environments and whose whole business logic therefore revolves around the interchange between the enterprise and the environment. The context in which a social business operates imposes different business design constraints while, at the same time, opening up new design opportunities. It is also affected to a great extent by the impact that successful enterprises generate: a continuous loop of interaction that needs to be managed through a dynamic capability in order to generate a lasting, powerful impact. This conceptual research synthesizes and analyzes literature on social enterprises, social enterprise business models, business model innovation, business model design, and the open systems view to propose a new business model design process for social enterprises that takes into account the critical role of environmental factors. This process would help the social enterprise develop a dynamic capability that ensures the alignment of its business model with its environmental context, thus maximizing its probability of success.

Keywords: Social enterprise, business model, business model design.

11733 Membrane Distillation Process Modeling: Dynamical Approach

Authors: Fadi Eleiwi, Taous Meriem Laleg-Kirati

Abstract:

This paper presents a complete dynamic model of a membrane distillation process. The model comprises two consistent dynamic sub-models: a 2D advection-diffusion equation for the whole process and a modified heat equation for the membrane itself. The complete model describes the temperature diffusion phenomenon across the feed, membrane and permeate containers and the boundary layers of the membrane. It gives an online and complete temperature profile for each point in the domain, explains the heat conduction and convection mechanisms that take place inside the process in terms of mathematical parameters, and justifies the process behavior during transient and steady-state phases. The process can be monitored for any sudden change in performance at any instant of time. In addition, the model assists in maintaining production rates as desired and gives recommendations during membrane fabrication stages. System performance and parameters can be optimized and controlled using this complete dynamic model. The evolution of the membrane boundary temperature with time, the vapor mass transfer along the process, and the temperature difference between the membrane boundary layers are depicted. Simulations were performed on the complete model with real membrane specifications. The plots show consistency between the 2D advection-diffusion model, the expected behavior of the system, and the literature. The evolution of heat inside the membrane, from the transient response until the steady-state response, is illustrated for fixed and varying times.

Keywords: Membrane distillation, Dynamical modeling, Advection-diffusion equation, Thermal equilibrium, Heat equation.

11732 Educating Students in Business Process Management with Simulation Games

Authors: Vesna Bosilj Vuksic, Mirjana Pejic Bach, Tomislav Hernaus

Abstract:

The aim of this paper is to present a framework for empirical investigation of the effectiveness of simulation games for student learning of the BPM concept. A future research methodology is explained, and a normative model is developed that extends the standard TAM model by introducing latent and mediating variables into the relationship between the independent variables and the dependent variable. Future research propositions are defined in order to examine the benefits that can be achieved through the use of BPM simulation games in ERP courses.

Keywords: Business process management, simulation games, education, technology acceptance model.

11731 Distortion Estimation in Digital Image Watermarking using Genetic Programming

Authors: Labiba Gilani, Asifullah Khan, Anwar M. Mirza

Abstract:

This paper introduces a technique for distortion estimation in image watermarking using Genetic Programming (GP). The distortion is estimated by treating the problem of obtaining a distorted watermarked signal from the original watermarked signal as a function regression problem. This function regression problem is solved using GP, where the original watermarked signal is considered the independent variable. The GP-based distortion estimation scheme is evaluated for the Gaussian attack and the JPEG compression attack. We have used Gaussian attacks of different strengths by changing the standard deviation, and the JPEG compression attack is varied by adding various distortions. Experimental results demonstrate that the proposed technique is able to detect the watermark even in the case of strong distortions and is more robust against attacks.

Keywords: Blind Watermarking, Genetic Programming (GP), Fitness Function, Discrete Cosine Transform (DCT).

11730 An Improved QRS Complex Detection for Online Medical Diagnosis

Authors: I. L. Ahmad, M. Mohamed, N. A. Ab. Ghani

Abstract:

This paper presents work on signal discrimination, specifically for the electrocardiogram (ECG) waveform. The ECG signal comprises P, QRS, and T waves in each normal heartbeat, describing the heart rhythm pattern of a specific individual. Further medical diagnosis can be performed to determine any heart-related disease using ECG information. The emphasis on QRS complex classification is further discussed to illustrate its importance. The Pan-Tompkins algorithm, a widely known technique, has been adapted to realize the QRS complex classification process. There are eight steps involved, namely sampling, normalization, low-pass filtering, high-pass filtering (together forming a band-pass filter), differentiation, squaring, averaging and, lastly, QRS detection. The simulation results obtained are presented in a Graphical User Interface (GUI) developed using MATLAB.

Keywords: ECG, Pan Tompkins Algorithm, QRS Complex, Simulation
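
A condensed sketch of the Pan-Tompkins stages named in the abstract (band-pass filtering, differentiation, squaring, moving-window averaging, thresholding) applied to a crude synthetic signal; the paper's MATLAB GUI and exact thresholds are not reproduced, and the sampling rate and cut-offs below are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 360                                             # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63              # crude synthetic train of sharp beats

# Band-pass filter (roughly 5-15 Hz, the QRS energy band).
b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, ecg)

deriv = np.diff(filtered, prepend=filtered[0])       # differentiation
squared = deriv ** 2                                  # squaring emphasises large slopes
window = int(0.15 * fs)                               # ~150 ms moving-window integration
integrated = np.convolve(squared, np.ones(window) / window, mode="same")

# Simple threshold: rising crossings above a fraction of the maximum mark QRS candidates.
threshold = 0.3 * integrated.max()
above = integrated > threshold
qrs_onsets = np.flatnonzero(above[1:] & ~above[:-1])
print("detected beats:", len(qrs_onsets))
```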

11729 Monitoring Patents Using the Statistical Process Control

Authors: Stephanie Russo Fabris, Edmara Thays Neres Menezes, Ruirogeres dos Santos Cruz, Lucio Leonardo Siqueira Santos, Suzana Leitao Russo

Abstract:

Statistical process control (SPC) is one of the most powerful tools developed to assist in the effective control of quality; it involves collecting, organizing and interpreting data during production. This article aims to show how industries can use SPC to control and continuously improve product quality by monitoring production, detecting deviations of the parameters representing the process and thereby reducing the amount of off-specification products and, in turn, production costs. This study also aimed to conduct a technological forecast in order to characterize the research being done related to SPC. The survey was conducted in the Spacenet and WIPO databases and in the National Institute of Industrial Property (INPI). The United States is among the largest depositors, together with deposits via the PCT, and the classification section presented in greatest abundance was F.

Keywords: Statistical Process Control, Industries
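
A minimal X-bar control chart sketch of the kind of SPC monitoring the abstract refers to, with hypothetical subgroup measurements; the three-sigma limits flag subgroups whose mean drifts out of control.

```python
import numpy as np

rng = np.random.default_rng(1)
subgroups = rng.normal(loc=10.0, scale=0.2, size=(25, 5))   # 25 subgroups of 5 measurements
subgroups[20] += 0.8                                         # simulate a process shift

xbar = subgroups.mean(axis=1)
grand_mean = xbar.mean()
# Simplified estimate of the standard error of the subgroup mean.
sigma_xbar = subgroups.std(axis=1, ddof=1).mean() / np.sqrt(subgroups.shape[1])

ucl = grand_mean + 3 * sigma_xbar    # upper control limit
lcl = grand_mean - 3 * sigma_xbar    # lower control limit
out_of_control = np.flatnonzero((xbar > ucl) | (xbar < lcl))
print("limits: [%.3f, %.3f], out-of-control subgroups: %s" % (lcl, ucl, out_of_control))
```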

11728 Hospital Facility Location Selection Using Permanent Analytics Process

Authors: C. Ardil

Abstract:

In this paper, a new MCDMA approach, the permanent analytics process, is proposed to assess the immovable valuation criteria and their significance in the placement of a healthcare facility. Five decision factors are considered for the valuation and selection of immovables. In multiple-factor selection problems, the priority vector of the criteria used to compare several immovables is first determined using the permanent analytics method, a mathematical model for the multiple criteria decision-making process. Then, to demonstrate the viability and efficacy of the suggested approach, twenty potential candidate locations were evaluated using the hospital site selection problem's decision criteria. The ranking accuracy of the estimation was evaluated using composite programming, which took into account both the permanent analytics process and the weighted multiplicative model.

Keywords: Hospital Facility Location Selection, Permanent Analytics Process, Multiple Criteria Decision Making (MCDM)

11727 A Serial Hierarchical Support Vector Machine and 2D Feature Sets Act for Brain DTI Segmentation

Authors: Mohammad Javadi

Abstract:

A serial hierarchical support vector machine (SHSVM) is proposed to discriminate three brain tissues: white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF). SHSVM takes a novel classification approach by repeating the hierarchical classification on the data set iteratively. It uses a radial basis function (RBF) kernel with different tunings to obtain accurate results. As a second approach, segmentation is performed with the DAGSVM method. In this article, eight univariate features are extracted from the raw DTI data, and all possible 2D feature sets are examined within the segmentation process. SHSVM succeeds in obtaining DSI values higher than 0.95 for all three tissues, which is higher than the DAGSVM results.

Keywords: Brain segmentation, DTI, hierarchical, SVM.
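
A small sketch of the hierarchical idea on generic feature vectors, assuming a two-stage cascade: one RBF-kernel SVM first separates CSF from brain parenchyma, and a second separates GM from WM. The iterative repetition, the DAGSVM comparison, and the DTI-specific features of the paper are omitted, and the synthetic data below is illustrative only.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical 2D features per voxel; labels 0=CSF, 1=GM, 2=WM.
X = np.vstack([rng.normal(c, 0.4, size=(200, 2)) for c in ([0, 0], [2, 0], [4, 0])])
y = np.repeat([0, 1, 2], 200)

# Stage 1: CSF vs. the rest.
svm_csf = SVC(kernel="rbf", gamma="scale").fit(X, (y == 0).astype(int))
# Stage 2: GM vs. WM, trained only on non-CSF voxels.
mask = y != 0
svm_gm_wm = SVC(kernel="rbf", gamma="scale").fit(X[mask], (y[mask] == 2).astype(int))

def predict(x):
    x = np.atleast_2d(x)
    is_csf = svm_csf.predict(x) == 1
    gm_wm = np.where(svm_gm_wm.predict(x) == 1, 2, 1)   # 2 = WM, 1 = GM
    return np.where(is_csf, 0, gm_wm)

print("training accuracy:", float((predict(X) == y).mean()))
```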

11726 Analysis of Sonogram Images of Thyroid Gland Based on Wavelet Transform

Authors: M. Bastanfard, B. Jalaeian, S. Jafari

Abstract:

Sonogram images of normal and lymphocytic thyroid tissues have considerable overlap, which makes them difficult to interpret and distinguish. Classification from sonogram images of the thyroid gland is tackled in a semiautomatic way. When making a manual diagnosis from images, some relevant information may not be recognized by the human visual system, so quantitative image analysis can be helpful to the manual diagnostic process carried out by the physician. Two classes are considered: normal tissue and chronic lymphocytic thyroiditis (Hashimoto's thyroiditis). The data structure is analyzed using K-nearest-neighbors classification. This paper shows that, unlike the wavelet sub-bands' energy, histograms and Haralick features are not appropriate for distinguishing between normal tissue and Hashimoto's thyroiditis.

Keywords: Sonogram, thyroid, Haralick feature, wavelet.
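
A minimal sketch of the wavelet sub-band energy features that the paper finds discriminative, followed by a k-nearest-neighbors fit; the image patches, labels, wavelet family and decomposition level below are hypothetical stand-ins for the study's sonogram ROIs.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def subband_energies(patch, wavelet="db4", level=2):
    """Energy of each 2D wavelet sub-band of an image patch."""
    coeffs = pywt.wavedec2(patch, wavelet, level=level)
    feats = [np.sum(coeffs[0] ** 2)]                         # approximation energy
    for (cH, cV, cD) in coeffs[1:]:
        feats.extend([np.sum(cH ** 2), np.sum(cV ** 2), np.sum(cD ** 2)])
    return np.array(feats)

rng = np.random.default_rng(0)
patches = rng.random((40, 64, 64))                            # stand-in sonogram ROIs
labels = rng.integers(0, 2, size=40)                          # 0 = normal, 1 = Hashimoto's

X = np.array([subband_energies(p) for p in patches])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```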

11725 Feature Subset Selection Using Ant Colony Optimization

Authors: Ahmed Al-Ani

Abstract:

Feature selection is an important step in many pattern classification problems. It is applied to select a subset of features, from a much larger set, such that the selected subset is sufficient to perform the classification task. Due to its importance, the problem of feature selection has been investigated by many researchers. In this paper, a novel feature subset search procedure that utilizes the Ant Colony Optimization (ACO) is presented. The ACO is a metaheuristic inspired by the behavior of real ants in their search for the shortest paths to food sources. It looks for optimal solutions by considering both local heuristics and previous knowledge. When applied to two different classification problems, the proposed algorithm achieved very promising results.

Keywords: Ant Colony Optimization, ant systems, feature selection, pattern recognition.

11724 Classification and Resolving Urban Problems by Means of Fuzzy Approach

Authors: F. Habib, A. Shokoohi

Abstract:

Urban problems are problems of organized complexity; thus, many models and scientific methods for resolving urban problems have failed. This study is concerned with proposing a fuzzy-system-driven approach for the classification and solving of urban problems. The study mainly investigated the selection of the inputs and outputs of urban systems for the classification of urban problems. In this research, five categories of urban problems were recognized with respect to the fuzzy system approach: control, polytely, optimizing, open, and decision-making problems. Grounded Theory techniques were then applied to analyze the data and develop a new solving method for each category. The findings indicate that fuzzy system methods are powerful processes and analytic tools for helping planners resolve complex urban problems. These tools can succeed where others have failed because they incorporate and address uncertainty and risk, complexity, and systems interacting with other systems.

Keywords: Classification, complexity, Fuzzy theory, urban problems.

11723 A Theoretical Hypothesis on Ferris Wheel Model of University Social Responsibility

Authors: Le Kang

Abstract:

Owing to the nature of the university as a free and responsible academic community, USR is based on a different foundation, academic responsibility, so the Pyramid and the IC Model of CSR cannot fully explain the most distinguished feature of USR. This paper seeks to put forward a new model, the Ferris Wheel Model, to illustrate the nature of USR and the process of its achievement. The Ferris Wheel Model of USR shows that the university creates a balanced, fair and neutral systemic structure to fulfil its social responsibilities; this allows the organization to obtain a synergistic effect, serving the more extensive interests of stakeholders and wider social responsibilities.

Keywords: USR, Achievement model, Ferris wheel model.

11722 Preparation of Computer Model of the Aircraft for Numerical Aeroelasticity Tests – Flutter

Authors: M. Rychlik, R. Roszak, M. Morzynski, M. Nowak, H. Hausa, K. Kotecki

Abstract:

This article presents the geometry and structure reconstruction procedure of an aircraft model for flutter research (based on the I22-IRYDA aircraft). For the reconstruction, reverse engineering techniques and advanced surface-modeling CAD tools are used. The authors discuss all stages of the data acquisition process and the computation and analysis of the measured data. For acquisition, a three-dimensional structured-light scanner was used. In the following sections, details of the reconstruction process are presented. The geometry reconstruction procedure transforms the measured input data (a point cloud) into a three-dimensional parametric computer model (a NURBS solid model) that is compatible with CAD systems. In parallel with the geometry of the aircraft, the internal structure (structural model) is extracted and modeled. In the last chapter, the evaluation of the obtained models is discussed.

Keywords: computer modeling, numerical simulation, Reverse Engineering, structural model

11721 Classifying Biomedical Text Abstracts based on Hierarchical 'Concept' Structure

Authors: Rozilawati Binti Dollah, Masaki Aono

Abstract:

Classifying biomedical literature is a difficult and challenging task, especially when a large number of biomedical articles should be organized into a hierarchical structure. In this paper, we present an approach for classifying a collection of biomedical text abstracts downloaded from the Medline database with the help of ontology alignment. To accomplish our goal, we construct two types of hierarchies, the OHSUMED disease hierarchy and the Medline abstract disease hierarchies, from the OHSUMED dataset and the Medline abstracts, respectively. Then, we enrich the OHSUMED disease hierarchy before adapting it to the ontology alignment process for finding probable concepts or categories. Subsequently, we compute the cosine similarity between the vector of a probable concept (in the "enriched" OHSUMED disease hierarchy) and the vector in the Medline abstract disease hierarchies. Finally, we assign categories to the new Medline abstracts based on the similarity score. The results obtained from the experiments show that the performance of our proposed approach for hierarchical classification is slightly better than that of multi-class flat classification.

Keywords: Biomedical literature, hierarchical text classification, ontology alignment, text mining.
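
A small sketch of the final assignment step: compute the cosine similarity between an abstract's term vector and each concept vector from the enriched hierarchy, then assign the best-scoring category. The vocabulary, vectors, and category names below are toy values for illustration only.

```python
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical TF-IDF-style vectors over a shared vocabulary.
concept_vectors = {
    "cardiovascular diseases": np.array([0.9, 0.1, 0.0, 0.2]),
    "neoplasms":               np.array([0.1, 0.8, 0.3, 0.0]),
}
abstract_vector = np.array([0.2, 0.7, 0.4, 0.1])

scores = {c: cosine(abstract_vector, v) for c, v in concept_vectors.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```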

11720 Asynchronous Microcontroller Simulation Model in VHDL

Authors: M. Kovac

Abstract:

This article describes the design of an 8-bit asynchronous microcontroller simulation model in VHDL. The model is created in the ISE Foundation design tool and simulated in the ModelSim tool. It is a simple application example of asynchronous systems designed in synchronous design tools. The design process of creating an asynchronous system with a 4-phase bundled-data protocol and matched delays is described in the article. The model is described at the gate-level abstraction. The simulation waveform of the functional design is the result of this article. The described design covers only the simulation model; the next step would be to create a synthesizable model for an FPGA.

Keywords: Asynchronous, Microcontroller, VHDL, FPGA.

11719 Performance Analysis of Genetic Algorithm with kNN and SVM for Feature Selection in Tumor Classification

Authors: C. Gunavathi, K. Premalatha

Abstract:

Tumor classification is a key area of research in the field of bioinformatics. Microarray technology is commonly used in the study of disease diagnosis using gene expression levels. The main drawback of gene expression data is that it contains thousands of genes and very few samples. Feature selection methods are used to select the informative genes from the microarray, and these methods considerably improve the classification accuracy. In the proposed method, a Genetic Algorithm (GA) is used for effective feature selection. Informative genes are identified based on T-statistics, Signal-to-Noise Ratio (SNR) and F-test values. The initial candidate solutions of the GA are obtained from the top-m informative genes. The classification accuracy of the k-Nearest Neighbor (kNN) method is used as the fitness function for the GA. In this work, kNN and Support Vector Machine (SVM) are used as the classifiers. The experimental results show that the proposed work is suitable for effective feature selection. With the help of the selected genes, the GA-kNN method achieves 100% accuracy on 4 of the 10 datasets and the GA-SVM method on 5. The GA with kNN and SVM is demonstrated to be an accurate method for microarray-based tumor classification.

Keywords: F-Test, Gene Expression, Genetic Algorithm, k-Nearest Neighbor, Microarray, Signal-to-Noise Ratio, Support Vector Machine, T-statistics, Tumor Classification.
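
A compact sketch of the GA-kNN loop, assuming a pre-filtered pool of top-m informative genes: each chromosome is a binary gene mask, and cross-validated kNN accuracy serves as its fitness. Selection and mutation are simplified (no crossover), and synthetic data stands in for the microarray.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 30))                  # 60 samples, 30 pre-filtered genes
y = (X[:, 0] + X[:, 3] > 0).astype(int)        # synthetic labels driven by two genes

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.random((20, X.shape[1])) < 0.3       # initial population of gene masks
for generation in range(30):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]    # keep the fittest half
    children = parents[rng.integers(0, 10, 10)].copy()
    flip = rng.random(children.shape) < 0.05   # mutation: flip ~5% of the bits
    children[flip] = ~children[flip]
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected genes:", np.flatnonzero(best), "fitness:", fitness(best))
```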

11718 Classification of Soil Aptness to Establish of Panicum virgatum in Mississippi using Sensitivity Analysis and GIS

Authors: Eduardo F. Arias, William Cooke III, Zhaofei Fan, William Kingery

Abstract:

During the last decade, Panicum virgatum, known as switchgrass, has been broadly studied because of its remarkable attributes as a substitute pasture and as a functional biofuel source. The objective of this investigation was to establish soil suitability for switchgrass in the State of Mississippi. A linear weighted additive model was developed to forecast soil suitability, and multicriteria analysis and sensitivity analysis were utilized to adjust and optimize the model. The model was fit using seven years of field data associated with soil characteristics collected from the Natural Resources Conservation Service of the United States Department of Agriculture (NRCS-USDA). The best model was selected by correlating calculated biomass yield with each model's soils-based output for switchgrass suitability. The coefficient of determination (r²) was the decisive factor used to establish the 'best' soil suitability model. Coefficients associated with the 'best' model were implemented within a Geographic Information System (GIS) to create a map of relative soil suitability for switchgrass in Mississippi. A geodatabase of soil parameters was built and is available for future Geographic Information System use.

Keywords: Aptness, GIS, sensitivity analysis, switchgrass, soil.

11717 Combined Feature Based Hyperspectral Image Classification Technique Using Support Vector Machines

Authors: Mrs. K. Kavitha, S. Arivazhagan

Abstract:

A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying the heterogeneous classes present in hyperspectral images. The classification accuracy can be improved only if both the feature extraction and the classifier selection are proper. As the classes in hyperspectral images are assumed to have different textures, textural classification is adopted. Run-length feature extraction is employed along with principal components and independent components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected. The gray-level run-length matrix (GLRLM) is calculated for the first forty selected bands, and from the GLRLMs the run-length features for individual pixels are calculated. The principal components are calculated for another forty bands, and the independent components for the remaining forty bands. As principal and independent components have the ability to represent the textural content of pixels, they are treated as features. The run-length features, principal components, and independent components together form the combined features, which are used for classification. An SVM with a binary hierarchical tree is used to classify the hyperspectral image. The results are validated with ground truth, and accuracies are calculated.

Keywords: Multi-class, Run Length features, PCA, ICA, classification and Support Vector Machines.

11716 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of the model, such as the explicit expression of the process, its trend functions, and its distribution, by transforming the diffusion process into a Wiener process as shown in Ricciardi's theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameters, an issue of great importance in the application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects a pragmatic decision on the part of the modeler: given the data that are available and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.

Keywords: Diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion equation, trends functions, bi-parameters Weibull density function.
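
A brief Euler-Maruyama sketch of simulating sample paths of a diffusion whose drift is proportional to a two-parameter Weibull density, in the spirit of the abstract; the shape, scale, and volatility values below are hypothetical, and the exact form of the authors' infinitesimal moments is not reproduced.

```python
import numpy as np

def weibull_pdf(t, shape, scale):
    t = np.maximum(t, 1e-12)
    return (shape / scale) * (t / scale) ** (shape - 1) * np.exp(-(t / scale) ** shape)

def simulate_paths(x0=1.0, shape=1.5, scale=2.0, sigma=0.1, T=6.0, n=600, n_paths=5, seed=0):
    """Euler-Maruyama for dX_t = X_t f(t) dt + sigma X_t dW_t,
    with f the Weibull pdf (a hypothetical drift specification)."""
    rng = np.random.default_rng(seed)
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    X = np.empty((n_paths, n + 1))
    X[:, 0] = x0
    for k in range(n):
        drift = X[:, k] * weibull_pdf(t[k], shape, scale)
        dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        X[:, k + 1] = X[:, k] + drift * dt + sigma * X[:, k] * dW
    return t, X

t, X = simulate_paths()
print("terminal values:", np.round(X[:, -1], 3))
```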
