Search results for: Gaussian process classification model with multiclass
11379 Project Management and Software Development Processes: Integrating PMBOK and OPEN
Authors: Maurício Covolan Rosito, Daniel Antonio Callegari, Ricardo Melo Bastos
Abstract:
Software organizations constantly look for better solutions when designing and using well-defined software processes for the development of their products and services. However, while the technical aspects are relatively easy to arrange, many software development processes offer little support for project management issues. When adopting such processes, an organization needs to apply good project management skills along with the technical views provided by those models. This research proposes a new model that integrates the concepts of the PMBOK with those available in the OPEN metamodel, supporting not only process integration but also the steps towards a more comprehensive and automatable model.
Keywords: OPEN metamodel, PMBOK metamodel, Project Management, Software Process
11378 Discovering Complex Regularities: from Tree to Semi-Lattice Classifications
Authors: A. Faro, D. Giordano, F. Maiorana
Abstract:
Data mining uses a variety of techniques, each of which is useful for some particular task. It is important to have a deep understanding of each technique and to be able to perform sophisticated analysis. In this article we describe a tool built to simulate a variation of the Kohonen network in order to perform unsupervised clustering and to support the entire data mining process up to results visualization. A graphical representation helps the user work out a strategy to optimize the classification by adding, moving or deleting a neuron in order to change the number of classes. The tool is able to automatically suggest a strategy to optimize the number of classes, and it also supports both tree classifications and semi-lattice organizations of the classes, giving users the possibility of passing from one class to the ones with which it shares some aspects. Examples of using tree and semi-lattice classifications are given to illustrate advantages and problems. The tool is applied to classify macroeconomic data that report the imports and exports of the most developed countries. It is possible to classify the countries based on their economic behaviour and to use the tool to characterize the commercial behaviour of a country in a selected class from the analysis of the positive and negative features that contribute to class formation. Possible interrelationships between the classes and their meaning are also discussed.
Keywords: Unsupervised classification, Kohonen networks, macroeconomics, Visual data mining, Cluster interpretation.
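The Kohonen-style clustering described in this abstract can be illustrated with a minimal sketch. The following Python example is not the authors' tool; the map size and the "country" feature vectors are placeholder assumptions. It trains a small self-organising map with NumPy and assigns each sample to its best-matching neuron, i.e., its class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: 30 "countries", each described by 6 import/export features.
X = rng.random((30, 6))

# A small 3x3 map of neurons, i.e., up to 9 classes.
rows, cols, dim = 3, 3, X.shape[1]
weights = rng.random((rows * cols, dim))
coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

def train_som(X, weights, coords, epochs=50, lr0=0.5, radius0=1.5):
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                 # decaying learning rate
        radius = radius0 * (1 - epoch / epochs) + 1e-3  # shrinking neighbourhood
        for x in X:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
            # Neighbourhood function on the map grid pulls nearby neurons too.
            dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-dist2 / (2 * radius ** 2))
            weights += lr * h[:, None] * (x - weights)
    return weights

weights = train_som(X, weights, coords)
classes = np.argmin(((X[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2), axis=1)
print("class assignment per sample:", classes)
```

Adding, moving or deleting a neuron, as the tool supports, corresponds here to changing the map grid and retraining.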
11377 Centralized Cooperative Spectrum Sensing with MIMO in the Reporting Network over κ − μ Fading Channel
Authors: S Hariharan, K Chaitanya, P Muthuchidambaranathan
Abstract:
The IEEE 802.22 working group aims to use the Digital Video Broadcasting-Terrestrial (DVB-T) bands for data communication to rural areas without interfering with the TV broadcast. In this paper, we arrive at a closed-form expression for the average detection probability of the fusion center (FC) with multiple antennas over the κ − μ fading channel model. We consider a centralized cooperative multiple-antenna network for reporting. The DVB-T samples forwarded by the secondary user (SU) are combined using a maximum ratio combiner at the FC, and energy detection is performed to make the decision. The fading effects of the channel degrade the detection probability of the FC; a generalized independent and identically distributed (IID) κ − μ channel and an additive white Gaussian noise (AWGN) channel are considered for reporting and sensing, respectively. The proposed system performance is verified through simulation results.
Keywords: IEEE 802.22, Cooperative spectrum sensing, Multiple antenna, κ − μ fading channel.
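To make the sensing chain concrete, here is a hedged Monte Carlo sketch of energy detection after maximum ratio combining at the fusion center. It is not the paper's closed-form derivation, and it uses Rayleigh channels as a stand-in for the κ − μ model; the sample count, SNR and threshold are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def detection_probability(snr_db=0.0, n_samples=200, n_antennas=4,
                          threshold=230.0, trials=5000):
    """Empirical P_d for an energy detector after MRC (Rayleigh stand-in for kappa-mu)."""
    snr = 10 ** (snr_db / 10)
    detections = 0
    for _ in range(trials):
        s = np.sqrt(snr) * rng.standard_normal(n_samples)          # primary-user signal
        h = (rng.standard_normal(n_antennas) + 1j * rng.standard_normal(n_antennas)) / np.sqrt(2)
        # Received copies at the FC antennas plus AWGN, combined by MRC.
        noise = (rng.standard_normal((n_antennas, n_samples))
                 + 1j * rng.standard_normal((n_antennas, n_samples))) / np.sqrt(2)
        r = h[:, None] * s[None, :] + noise
        combined = (np.conj(h)[:, None] * r).sum(axis=0) / np.linalg.norm(h)
        if (np.abs(combined) ** 2).sum() > threshold:               # energy detection
            detections += 1
    return detections / trials

print("estimated P_d:", detection_probability())
```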
11376 Identifying Significant Factors of Brick Laying Process through Design of Experiment and Computer Simulation: A Case Study
Authors: M. H. Zarei, A. Nikakhtar, A. H. Roudsari, N. Madadi, K. Y. Wong
Abstract:
Improving performance measures in construction processes has been a major concern for managers and decision makers in the industry. They seek ways to identify the key factors which have the largest effect on the process. Identifying such factors can guide them to focus on the right parts of the process in order to obtain the best possible result. In the present study, design of experiments (DOE) has been applied to a computer simulation model of the brick laying process to determine significant factors, with productivity chosen as the response of the experiment. To this end, four controllable factors and their interactions have been examined and the best level has been calculated for each factor. The results indicate that three factors, namely the brick labor, the mortar labor and the inter-arrival time of mortar, along with the interaction of brick labor and mortar labor, are significant.
Keywords: Brick laying process, computer simulation, design of experiment, significant factors.
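A hedged sketch of the kind of factorial analysis described in this abstract, using statsmodels to test which factors significantly affect a productivity response; the factor names, levels and response values below are synthetic placeholders, not the authors' simulation output.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Placeholder two-level design: brick labor, mortar labor, mortar inter-arrival time.
levels = [-1, 1]
rows = []
for bl in levels:
    for ml in levels:
        for ia in levels:
            for _ in range(5):      # replicated runs at each factor combination
                productivity = 50 + 4 * bl + 3 * ml - 2 * ia + 1.5 * bl * ml + rng.normal(0, 1)
                rows.append({"brick_labor": bl, "mortar_labor": ml,
                             "inter_arrival": ia, "productivity": productivity})
df = pd.DataFrame(rows)

# Fit main effects plus the brick*mortar interaction, then run an ANOVA on the terms.
model = smf.ols("productivity ~ brick_labor * mortar_labor + inter_arrival", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```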
11375 Analysis of P, d and 3He Elastically Scattered by 11B Nuclei at Different Energies
Authors: Ahmed H. Amer, A. Amar, Sh. Hamada, I. I. Bondouk
Abstract:
The elastic scattering of protons and deuterons from 11B nuclei at different p and d energies has been analyzed within the framework of the optical model using the code ECIS88. The elastic scattering of the 3He+11B nuclear system at different 3He energies has been analyzed using the double folding model with the code FRESCO. The real potential obtained from the folding model was supplemented by a phenomenological imaginary potential, and during the fitting process the real potential was normalized and the imaginary potential optimized. Volume integrals of the real and imaginary potential depths (JR, JW) have been calculated for the 3He+11B system. The agreement between the experimental data and the theoretical calculations over the whole angular range is fairly good. The normalization factor Nr is found to lie between 0.70 and 1.236.
Keywords: Elastic scattering, optical model parameters, double folding model, nuclear density distribution.
11374 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have become a major security threat, so it is important to impede them. Impeding such intrusions relies entirely on their detection, which is the primary concern of any security tool such as an intrusion detection system (IDS). Therefore, it is imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance. The performance of an IDS can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques are limited by their use of the raw dataset for classification; the classifier may become confused by redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and Local Binary Pattern (LBP) can be applied to transform the raw features into a principal feature space and to select the features based on their sensitivity, with eigenvalues used to determine the sensitivity. Greedy search, backward elimination, and Particle Swarm Optimization (PSO) can then be applied to the selected features to obtain a subset with optimal sensitivity and the highest discriminatory power. This optimal feature subset is used to perform the classification. For classification purposes, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used because of their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms the existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP).
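A minimal sketch of the transform-then-classify pipeline described in this abstract, using scikit-learn's PCA and SVM on placeholder data; the actual KDD'99 preprocessing, the LDA/LBP variants and the PSO/greedy feature search are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data standing in for preprocessed KDD'99 connection records.
X, y = make_classification(n_samples=2000, n_features=41, n_informative=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Scale, project the raw features onto principal components, then classify with an SVM.
clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("detection accuracy on held-out data:", clf.score(X_test, y_test))
```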
11373 Wave Atom Transform Based Two Class Motor Imagery Classification
Authors: Nebi Gedik
Abstract:
Electroencephalography (EEG) investigations of brain-computer interfaces are based on the electrical signals resulting from neural activity in the brain. In this paper, a method for classifying motor imagery EEG signals is offered. The suggested method classifies EEG signals into two classes using the wave atom transform, and the transform coefficients are assessed to create the feature set. Classification is done with the SVM and k-NN algorithms with and without feature selection; for feature selection, t-test approaches are utilized. The approach is tested on the BCI Competition III dataset IIIa.
Keywords: motor imagery, EEG, wave atom transform, SVM, k-NN, t-test
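A hedged sketch of t-test-based feature selection followed by SVM and k-NN classification; the random matrix below stands in for the wave atom coefficients of the BCI Competition III dataset IIIa, and the number of kept features is an arbitrary assumption.

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Placeholder feature matrix: 60 trials x 200 transform coefficients, two MI classes.
X = rng.standard_normal((60, 200))
y = np.array([0] * 30 + [1] * 30)
X[y == 1, :10] += 0.8                 # make the first 10 features weakly discriminative

# Keep the 20 features with the smallest two-sample t-test p-values.
_, pvals = ttest_ind(X[y == 0], X[y == 1], axis=0)
selected = np.argsort(pvals)[:20]
X_sel = X[:, selected]

for name, clf in [("SVM", SVC(kernel="linear")), ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    scores = cross_val_score(clf, X_sel, y, cv=5)
    print(name, "mean CV accuracy:", scores.mean().round(3))
```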
11372 Feature-Based Summarizing and Ranking from Customer Reviews
Authors: Dim En Nyaung, Thin Lai Lai Thein
Abstract:
With the rapid growth of the Internet, web opinion sources emerge dynamically; they are useful for both potential customers and product manufacturers for prediction and decision purposes. They consist of user-generated content written in natural language as unstructured free text. Therefore, opinion mining techniques have become popular for automatically processing customer reviews, extracting product features and the user opinions expressed over them. Since customer reviews may contain both opinionated and factual sentences, a supervised machine learning technique is applied for subjectivity classification to improve the mining performance. In this paper, we focus on the task of opinion summarization. Product feature and opinion extraction is critical to opinion summarization, because its effectiveness significantly affects the identification of semantic relationships. The polarity and numeric score of all the features are determined with the SentiWordNet lexicon. The problem of opinion summarization concerns how to relate the opinion words to a certain feature. A probabilistic supervised learning model improves the results and is more flexible and effective.
Keywords: Opinion Mining, Opinion Summarization, Sentiment Analysis, Text Mining.
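The SentiWordNet-based polarity scoring mentioned above can be sketched with NLTK. This is only an illustrative lookup: the corpora must be downloaded first, and the naive first-synset choice and the sample words are assumptions, not the authors' full feature-opinion pipeline.

```python
import nltk
from nltk.corpus import sentiwordnet as swn

# One-time downloads of the lexicons used below.
nltk.download("wordnet", quiet=True)
nltk.download("sentiwordnet", quiet=True)

def opinion_score(word, pos="a"):
    """Return (positive, negative) SentiWordNet scores of the word's first synset, if any."""
    synsets = list(swn.senti_synsets(word, pos))
    if not synsets:
        return 0.0, 0.0
    s = synsets[0]
    return s.pos_score(), s.neg_score()

for opinion_word in ["excellent", "poor", "average"]:
    print(opinion_word, opinion_score(opinion_word))
```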
11371 Virtual Learning Process Environment: Cohort Analytics for Learning and Learning Processes
Authors: Ayodeji Adesina, Derek Molloy
Abstract:
Traditional higher-education classrooms allow lecturers to observe students' behaviours and responses to a particular pedagogy during learning, in a way that can inform changes to the pedagogical approach. Within current e-learning systems it is difficult to perform continuous analysis of the cohort's behavioural tendency, making real-time pedagogical decisions difficult. This paper presents a Virtual Learning Process Environment (VLPE) based on the Business Process Management (BPM) conceptual framework. Within the VLPE, course designers can model various educational pedagogies in the form of learning process workflows using an intuitive flow diagram interface. These diagrams are used to visually track the learning progress of a cohort of students. This helps assess the effectiveness of the chosen pedagogy and provides the information required to improve course design. A case scenario of a cohort of students is presented, and quantitative statistical analysis of their learning process performance is gathered and displayed in real time using dashboards.
Keywords: Business process management, cohort analytics, learning processes, virtual learning environment.
11370 Segmenting Ultrasound B-Mode Images Using RiIG Distributions and Stochastic Optimization
Abstract:
In this paper, we propose a novel algorithm for delineating the endocardial wall from a human heart ultrasound scan. We assume that the gray levels in the ultrasound images are independent and identically distributed random variables with different Rician Inverse Gaussian (RiIG) distributions. Both synthetic and real clinical data will be used for testing the algorithm. Algorithm performance will be evaluated, first, against an expert radiologist's assessment of a soft copy of an ultrasound scan during the scanning process and, second, against the doctor's conclusion after going through a printed copy of the same scan. Successful implementation of this algorithm should make it possible to differentiate normal from abnormal soft tissue and to help identify the disease, what stage it is in, and how best to treat the patient. We hope that an automated system that uses this algorithm will be useful in public hospitals, especially in Third World countries where problems such as a shortage of skilled radiologists and a shortage of ultrasound machines are common. These public hospitals are usually the first and last stop for most patients in these countries.
Keywords: Endocardial Wall, Rician Inverse Gaussian Distributions, Segmentation, Ultrasound Images.
11369 Orthogonal Array Application and Response Surface Method Approach for Optimal Product Values: An Application for Oil Blending Process
Authors: Christopher C. Ihueze, Constance C. Obiuto, Christian E. Okafor, Charles C. Okpala
Abstract:
This paper presents a methodical approach for designing and optimizing process parameters in oil blending industries. Twenty-seven replicated experiments were conducted for the production of A-Z crown super oil (SAE20W/50), employing an L9 orthogonal array to establish the process response parameters. A power law model was fitted to the experimental data, and the obtained model was optimized by applying the central composite design (CCD) of response surface methodology (RSM). A quadratic model was found to be significant for the production of A-Z crown super oil. In the course of analyzing the batch productions of A-Z crown super oil, the study identified and specified four new lubricant formulations that conform to the ISO oil standard: L1: KV = 21.8293 cSt, BS200 = 9430.00 litres, AD102 = 11024.00 litres, PVI = 2520 litres; L2: KV = 22.513 cSt, BS200 = 12430.00 litres, AD102 = 11024.00 litres, PVI = 2520 litres; L3: KV = 22.1671 cSt, BS200 = 9430.00 litres, AD102 = 10481.00 litres, PVI = 2520 litres; L4: KV = 22.8605 cSt, BS200 = 12430.00 litres, AD102 = 10481.00 litres, PVI = 2520 litres. The analysis of variance showed that the quadratic model is significant for kinematic viscosity production, while the R-sq statistic of 0.99936 showed that the variation of kinematic viscosity is explained by its relationship with the control factors. The study therefore established appropriate blending proportions of the lubricant base oil and additives, and recommends the optimal kinematic viscosity of A-Z crown super oil (SAE20W/50) to be 22.86 cSt.
Keywords: Additives, control factors, kinematic viscosity, lubricant, orthogonal array, process parameter.
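A hedged sketch of fitting and exploring a quadratic response surface of the kind used above, with scikit-learn on placeholder blending data; the mixture volumes and viscosity values are made up for illustration, not the study's measurements.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(4)

# Placeholder design: base-oil and additive volumes (litres) vs kinematic viscosity (cSt).
X = rng.uniform([9000, 10000, 2000], [13000, 12000, 3000], size=(27, 3))
kv = (15 + 4e-4 * X[:, 0] + 2e-4 * X[:, 1] + 1e-3 * X[:, 2]
      - 2e-8 * X[:, 0] ** 2 + rng.normal(0, 0.05, 27))

# Fit a full second-order (quadratic) response surface model.
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, kv)
print("R-squared of the quadratic model:", round(rsm.score(X, kv), 4))

# Evaluate the fitted surface on a coarse grid to locate a favourable blend.
grid = np.array(np.meshgrid(np.linspace(9000, 13000, 5),
                            np.linspace(10000, 12000, 5),
                            np.linspace(2000, 3000, 5))).reshape(3, -1).T
best = grid[np.argmax(rsm.predict(grid))]
print("blend with highest predicted KV:", best)
```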
11368 Evolving a Fuzzy Rule-Base for Image Segmentation
Abstract:
A new method for color image segmentation using fuzzy logic is proposed in this paper. Our aim is to automatically produce a fuzzy system for color classification and image segmentation with the least number of rules and minimum error rate. Particle swarm optimization is a subclass of evolutionary algorithms that has been inspired by the social behavior of fish, bees, birds, and other animals that live together in colonies. We use the comprehensive learning particle swarm optimization (CLPSO) technique to find optimal fuzzy rules and membership functions because it discourages premature convergence. Each particle of the swarm encodes a set of fuzzy rules. During evolution, a population member tries to maximize a fitness criterion, which here is a high classification rate and a small number of rules. Finally, the particle with the highest fitness value is selected as the best set of fuzzy rules for image segmentation. Our results, using this method for soccer field image segmentation in RoboCup contests, show 89% performance. Less computational load is needed when using this method compared with other methods such as ANFIS, because it generates a smaller number of fuzzy rules. A large and varied training dataset makes the proposed method invariant to illumination noise.
Keywords: Comprehensive learning particle swarm optimization, fuzzy classification.
11367 A User-Requirements Approach in Medical Devices Maintenance System Development: A Case Study from an Industry Perspective
Authors: Manar AlJazzazi, Mohammed Rawashdeh, Tariq Alshawaheen, Aktham Malkawi
Abstract:
This paper is part of a research effort in which the way biomedical engineers carry out their work is analyzed. The goal of this paper is to present a method for the specification of user requirements in the medical device maintenance process. The data gathering methods, research model phases, and a descriptive analysis are presented. These technology and verification rules can be implemented in the medical device maintenance management process.
Keywords: Quality Function Deployment (QFD), User-requirements approach.
11366 Comparative Study Using Weka for Red Blood Cells Classification
Authors: Jameela Ali Alkrimi, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy
Abstract:
Red blood cells (RBCs) are the most common type of blood cells and are the most intensively studied in cell biology. A lack of RBCs is a condition in which the hemoglobin level is lower than normal and is referred to as "anemia". Abnormalities in RBCs will affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying RBCs as normal or abnormal (anemic) using WEKA. WEKA is an open-source suite of machine learning algorithms for data mining applications. The algorithms tested are the radial basis function neural network, the support vector machine, and the k-nearest neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether the tested blood cell is spherical or non-spherical. The second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, which was obtained from Serdang Hospital, Malaysia, and on measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the support vector machine, the radial basis function neural network, and the k-nearest neighbors algorithm, respectively.
Keywords: K-Nearest Neighbors, Neural Network, Radial Basis Function, Red blood cells, Support vector machine.
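The classifier comparison above can be mirrored with a short scikit-learn sketch (a Python analogue, not the authors' WEKA setup). The feature matrix is a random placeholder, and the MLP is only a stand-in for the RBF network.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(5)

# Placeholder: 300 cells x 12 geometric/textural features, normal (0) vs anemic (1).
X = rng.standard_normal((300, 12))
y = rng.integers(0, 2, 300)
X[y == 1, :4] += 1.0                  # inject a class difference into some features

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "RBF-like network (MLP stand-in)": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}
for name, clf in classifiers.items():
    pipe = make_pipeline(StandardScaler(), clf)
    print(name, "accuracy:", cross_val_score(pipe, X, y, cv=5).mean().round(3))
```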
11365 Phenomenological and Semi-microscopic Analysis for Elastic Scattering of Protons on 6,7Li
Authors: A. Amar, N. Burtebayev, Sh. Hamada, Kerimkulov Zhambul, N. Amangieldy
Abstract:
The elastic scattering of protons on 6,7Li nuclei has been analyzed in the framework of the optical model at beam energies up to 50 MeV. Differential cross sections for the 6,7Li + p scattering were measured over the proton laboratory energy range from 400 to 1050 keV. The elastic scattering 6,7Li+p data at different proton incident energies have also been analyzed using the single-folding model. In each case the real potential obtained from the folding model was supplemented by a phenomenological imaginary potential, and during the fitting process the real potential was normalized and the imaginary potential optimized. The normalization factor NR is found to lie between 0.70 and 0.84.
Keywords: scattering of protons on 6,7Li nuclei, ECIS88 code, single-folding model, phenomenological.
11364 Problems and Possible Solutions with the Development of a Computer Model of Quantum Theory
Authors: Hans H. Diel
Abstract:
A computer model of Quantum Theory (QT) has been developed by the author. The major goal of the computer model was to support and demonstrate as large a scope of QT as possible. This includes simulations of the major QT (Gedanken-) experiments such as, for example, the famous double-slit experiment. Besides the anticipated difficulties with (1) transforming exacting mathematics into a computer program, two further types of problems showed up, namely (2) areas where QT provides a complete mathematical formalism, but where, when it comes to concrete applications, the equations are not solvable at all, or only with extremely high effort; and (3) QT rules which are formulated in natural language and which do not seem to be translatable to precise mathematical expressions, nor to a computer program. The paper lists problems in all three categories and also describes the possible solutions or circumventions developed for the computer model.
Keywords: Computability, Foundation of Quantum Mechanics, Measurement Process, Modeling.
11363 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis
Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya
Abstract:
In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduce a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated; the number of patients for each tumor stage, i.e., I-II, III or IV, was 14. The patients had ~45% adenocarcinoma (ADC) and ~55% squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with the SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with the one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in the automatic classification of tumor stage and subtype.
Keywords: Cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis.
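A hedged sketch of the GLCM texture features mentioned above, using scikit-image on a synthetic patch; in releases before scikit-image 0.19 the functions are named greycomatrix/greycoprops, and the PET-specific preprocessing, GLRLM and Laws' filters are not shown.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # greycomatrix/greycoprops in older skimage

rng = np.random.default_rng(6)

# Placeholder 8-bit "tumor ROI" standing in for an FDG-PET region of interest.
roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Co-occurrence matrices for 4 directions at distance 1, then scalar texture features.
glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                    levels=256, symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ["contrast", "homogeneity", "energy", "correlation"]}
print(features)
```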
11362 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detect student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that using a multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature to the classification of engagement and distraction was shown to be eye gaze. It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability, human observation or reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
Keywords: Affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, Signal Detection Theory, student engagement.
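A hedged sketch of the leave-one-out evaluation of a random forest described in this abstract, on a placeholder multimodal feature table; the eye-gaze/EEG/pose features and engagement labels here are synthetic, not the study's data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(7)

# Placeholder: 59 sessions x 9 handpicked multimodal features, engaged (1) vs not (0).
X = rng.standard_normal((59, 9))
y = rng.integers(0, 2, 59)
X[y == 1, 0] += 1.5                   # pretend the first (eye-gaze) feature is most informative

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print("leave-one-out accuracy:", scores.mean().round(3))

# Feature importances indicate which sensor features drive the classification.
clf.fit(X, y)
print("importances:", clf.feature_importances_.round(2))
```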
11361 Moment Invariants in Image Analysis
Authors: Jan Flusser
Abstract:
This paper aims to present a survey of object recognition/classification methods based on image moments. We review various types of moments (geometric moments, complex moments) and moment-based invariants with respect to various image degradations and distortions (rotation, scaling, affine transform, image blurring, etc.) which can be used as shape descriptors for classification. We explain a general theory of how to construct these invariants and also show a few of them in explicit form. We review efficient numerical algorithms that can be used for moment computation and demonstrate practical examples of using moment invariants in real applications.
Keywords: Object recognition, degraded images, moments, moment invariants, geometric invariants, invariants to convolution, moment computation.
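As a concrete example of the moment invariants surveyed above, the following sketch computes Hu's seven rotation-invariant moments of a simple binary shape with scikit-image; the shape itself is an arbitrary placeholder.

```python
import numpy as np
from skimage.measure import moments, moments_central, moments_hu, moments_normalized

# A simple binary shape (filled rectangle) as the object to describe.
image = np.zeros((64, 64))
image[20:45, 10:50] = 1.0

# Raw -> central -> normalized moments, then Hu's seven invariants.
m = moments(image)
centroid = (m[1, 0] / m[0, 0], m[0, 1] / m[0, 0])
mu = moments_central(image, center=centroid)
nu = moments_normalized(mu)
hu = moments_hu(nu)
print("Hu invariants:", np.round(hu, 5))
```

Because the invariants depend only on normalized central moments, the same values are obtained (up to numerical error) when the shape is translated, scaled or rotated.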
11360 Simulation and Validation of Spur Gear Heated by Induction Using 3D Model
Authors: A. Chebak, N. Barka, A. Menou, J. Brousseau, D. S. Ramdenee
Abstract:
This paper presents a study of the hardness profile of a spur gear heated by the induction heating process as a function of the machine parameters, such as the power (kW), the heating time (s) and the generator frequency (kHz). The overall work is realized by 3D finite-element simulation applied to the process by coupling and solving the electromagnetic field and heat transfer problems, and it was performed in three distinct steps. First, a Comsol 3D model was built using an adequate formulation and taking into account the material properties and the machine parameters. Second, a convergence study was conducted to optimize the mesh. Then, the surface temperatures and the case depths were analyzed in depth as a function of the initial current density and the heating time in the medium frequency (MF) and high frequency (HF) heating modes, and the edge effect was studied. Finally, the simulation results are validated using experimental tests.
Keywords: Induction heating, simulation, experimental validation, 3D model, hardness profile.
11359 Hand Gesture Recognition Based on Combined Features Extraction
Authors: Mahmoud Elmezain, Ayoub Al-Hamadi, Bernd Michaelis
Abstract:
Hand gesture recognition is an active area of research in the vision community, mainly for the purposes of sign language recognition and human-computer interaction. In this paper, we propose a system to recognize alphabet characters (A-Z) and numbers (0-9) in real time from stereo color image sequences using Hidden Markov Models (HMMs). Our system is based on three main stages: automatic segmentation and preprocessing of the hand regions, feature extraction, and classification. In the automatic segmentation and preprocessing stage, color and a 3D depth map are used to detect hands, and the hand trajectory is then tracked using the mean-shift algorithm and a Kalman filter. In the feature extraction stage, 3D combined features of location, orientation and velocity with respect to Cartesian systems are used, and k-means clustering is then employed to build the HMM codewords. In the final stage, so-called classification, the Baum-Welch algorithm is used to fully train the HMM parameters. The gestures of alphabets and numbers are recognized using a left-right banded model in conjunction with the Viterbi algorithm. Experimental results demonstrate that our system can successfully recognize hand gestures with a 98.33% recognition rate.
Keywords: Gesture Recognition, Computer Vision & Image Processing, Pattern Recognition.
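To make the decoding step concrete, here is a small self-contained Viterbi sketch for a left-right banded HMM over discrete codewords (pure NumPy); the transition and emission values are illustrative assumptions, not the trained gesture model.

```python
import numpy as np

def viterbi(obs, start, trans, emis):
    """Most likely state path for a discrete-emission HMM (log domain)."""
    n_states = trans.shape[0]
    log_t, log_e, log_s = np.log(trans + 1e-12), np.log(emis + 1e-12), np.log(start + 1e-12)
    delta = log_s + log_e[:, obs[0]]
    back = np.zeros((len(obs), n_states), dtype=int)
    for t, o in enumerate(obs[1:], start=1):
        scores = delta[:, None] + log_t          # previous state x next state
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_e[:, o]
    path = [int(delta.argmax())]
    for t in range(len(obs) - 1, 0, -1):         # backtrack the best path
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# 4-state left-right banded model: each state moves only to itself or the next state.
start = np.array([1.0, 0.0, 0.0, 0.0])
trans = np.array([[0.6, 0.4, 0.0, 0.0],
                  [0.0, 0.6, 0.4, 0.0],
                  [0.0, 0.0, 0.6, 0.4],
                  [0.0, 0.0, 0.0, 1.0]])
emis = np.array([[0.7, 0.2, 0.1],                # 3 k-means codewords per state
                 [0.2, 0.7, 0.1],
                 [0.1, 0.7, 0.2],
                 [0.1, 0.2, 0.7]])
print(viterbi([0, 0, 1, 1, 2, 2], start, trans, emis))
```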
11358 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidity, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used. The Model for End-stage Liver Disease (MELD) prediction of mortality is used as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM, together with a machine learning technique called an ensemble, further improves the model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
11357 Decision Support System for Flood Crisis Management using Artificial Neural Network
Authors: Muhammad Aqil, Ichiro Kita, Akira Yano, Nishiyama Soichi
Abstract:
This paper presents an alternative approach that uses an artificial neural network to simulate the flood level dynamics in a river basin. The algorithm was developed in a decision support system environment in order to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexibility in approach and evolving graphical features, and it can be adopted for any similar situation to predict the flood level. The main data processing includes the gauging station selection, input generation, lead-time selection/generation, and length of prediction. The program enables users to process the flood level data, to train/test the model using various inputs and to visualize the results. The program code consists of a set of files, which can be modified to match other purposes. The results indicate that the decision support system applied to the flood level has reached encouraging results for the river basin under examination. The comparison of the model predictions with the observed data was satisfactory; the model is able to forecast the flood level up to 5 hours in advance with reasonable prediction accuracy. Finally, this program may also serve as a tool for real-time flood monitoring and process control.
Keywords: Decision Support System, Neural Network, Flood Level
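A hedged sketch of the lead-time forecasting idea above: a small neural network trained on lagged water-level inputs to predict the level several hours ahead. The synthetic series, lag count and scikit-learn MLP are assumptions, not the authors' decision support system.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)

# Synthetic hourly flood-level series standing in for gauging-station records.
t = np.arange(2000)
level = 5 + 2 * np.sin(2 * np.pi * t / 240) + 0.1 * rng.standard_normal(t.size)

def make_dataset(series, n_lags=6, lead_time=5):
    """Inputs: the last n_lags hourly levels; target: the level lead_time hours ahead."""
    X, y = [], []
    for i in range(n_lags, len(series) - lead_time):
        X.append(series[i - n_lags:i])
        y.append(series[i + lead_time])
    return np.array(X), np.array(y)

X, y = make_dataset(level)
split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
model.fit(X[:split], y[:split])
print("R^2 of 5-hour-ahead forecast on held-out data:",
      round(model.score(X[split:], y[split:]), 3))
```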
11356 Left Ventricular Model Using Second Order Electromechanical Coupling: Effects of Viscoelastic Damping
Authors: Elie H. Karam, Antoine B. Abche
Abstract:
It is known that the heart interacts with and adapts to its venous and arterial loading conditions. Various experimental studies and modeling approaches have been developed to investigate the underlying mechanisms. This paper presents a model of the left ventricle derived from nonlinear stress-length myocardial characteristics integrated over a truncated ellipsoidal geometry, and a second-order dynamic mechanism for the excitation-contraction coupling system. The results of the model presented here describe the effects of the viscoelastic damping element of the electromechanical coupling system on the hemodynamic response. Different heart rates are considered in order to study the pacing effects on the performance of the left ventricle against constant preload and afterload conditions under various damping conditions. The results indicate that the pacing process of the left ventricle has to take into account, among other things, the viscoelastic damping conditions of the myofilament excitation-contraction process.
Keywords: Myocardial sarcomere, cardiac pump, excitation-contraction coupling, viscoelasticity
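The role of the damping element in a second-order coupling mechanism can be illustrated with a generic damped second-order system driven by a periodic pacing stimulus. The SciPy sketch below uses arbitrary parameter values and only shows how the damping coefficient shapes the activation response; it is not the paper's full ventricular model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def activation(t, y, omega=8.0, zeta=0.5, heart_rate=75):
    """Second-order coupling dynamics: a'' + 2*zeta*omega*a' + omega^2*a = omega^2*u(t)."""
    a, a_dot = y
    period = 60.0 / heart_rate
    u = 1.0 if (t % period) < 0.2 else 0.0        # periodic excitation (pacing) pulse
    return [a_dot, omega**2 * (u - a) - 2 * zeta * omega * a_dot]

for zeta in (0.2, 0.7, 1.5):                      # weak to strong viscoelastic-like damping
    sol = solve_ivp(activation, (0, 3), [0.0, 0.0], args=(8.0, zeta, 75), max_step=0.01)
    print(f"zeta={zeta}: peak activation {sol.y[0].max():.2f}")
```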
11355 Classification Algorithms in Human Activity Recognition using Smartphones
Authors: Mohd Fikri Azli bin Abdullah, Ali Fahmi Perwira Negara, Md. Shohel Sayeed, Deok-Jai Choi, Kalaiarasi Sonai Muthu
Abstract:
Rapid advances in computing technology are bringing computers and humans towards seamless integration in the future. The emergence of the smartphone has driven the computing era towards ubiquitous and pervasive computing. Recognizing human activity has garnered a lot of interest and has raised significant research concerns about identifying contextual information useful to human activity recognition. Not only is the smartphone unobtrusive to users in daily life, it also embeds built-in sensors capable of sensing contextual information about its users, supported by a wide range of network connections. In this paper, we discuss the classification algorithms used in smartphone-based human activity recognition. Existing technologies pertaining to smartphone-based research in human activity recognition are highlighted and discussed. Our paper also presents our findings and opinions to formulate ideas for improvement in current research trends. Understanding research trends will enable researchers to have a clearer research direction and a common vision of the latest work in smartphone-based human activity recognition.
Keywords: Classification algorithms, Human Activity Recognition (HAR), Smartphones
11354 Texture Feature Extraction using Slant-Hadamard Transform
Authors: M. J. Nassiri, A. Vafaei, A. Monadjemi
Abstract:
The classification of random and natural textures is still one of the biggest challenges in the field of image processing and pattern recognition. In this paper, texture feature extraction using the Slant-Hadamard transform was studied and compared to other signal-processing-based texture classification schemes. A parametric SHT was also introduced and employed for natural texture feature extraction. We show that a subtly modified parametric SHT can outperform the ordinary Walsh-Hadamard transform and the discrete cosine transform. Experiments were carried out on a subset of VisTex random natural texture images using a k-NN classifier.
Keywords: Texture Analysis, Slant Transform, Hadamard, DCT.
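A hedged sketch of transform-based texture features in the spirit of the above, using the ordinary Walsh-Hadamard matrix from SciPy; the parametric Slant-Hadamard variant discussed in the paper is not reproduced, and the patch and sub-band layout are arbitrary assumptions.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(9)

# Placeholder 32x32 texture patch (VisTex-like patches would be used in practice).
patch = rng.random((32, 32))

# 2-D Walsh-Hadamard transform: H * patch * H^T with an orthonormal Hadamard matrix.
H = hadamard(32) / np.sqrt(32)
coeffs = H @ patch @ H.T

# Simple texture descriptors: energy of 16 coefficient sub-bands (8x8 blocks of the spectrum).
energy = coeffs ** 2
features = [energy[r:r + 8, c:c + 8].sum() for r in range(0, 32, 8) for c in range(0, 32, 8)]
print("16-dimensional sub-band energy feature vector:", np.round(features, 3))
```

Feature vectors of this kind would then be fed to a k-NN classifier, as in the experiments described above.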
11353 Extraction of Symbolic Rules from Artificial Neural Networks
Authors: S. M. Kamruzzaman, Md. Monirul Islam
Abstract:
Although backpropagation ANNs generally predict better than decision trees do for pattern classification problems, they are often regarded as black boxes, i.e., their predictions cannot be explained in the way those of decision trees can. In many applications, it is desirable to extract knowledge from trained ANNs so that the users gain a better understanding of how the networks solve the problems. A new rule extraction algorithm, called rule extraction from artificial neural networks (REANN), is proposed and implemented to extract symbolic rules from ANNs. A standard three-layer feedforward ANN is the basis of the algorithm. A four-phase training algorithm is proposed for backpropagation learning. The explicitness of the extracted rules is supported by comparing them to the symbolic rules generated by other methods. The extracted rules are comparable with other methods in terms of the number of rules, the average number of conditions per rule, and predictive accuracy. Extensive experimental studies on several benchmark classification problems, such as the breast cancer, iris, diabetes, and season classification problems, demonstrate the effectiveness of the proposed approach with good generalization ability.
Keywords: Backpropagation, clustering algorithm, constructive algorithm, continuous activation function, pruning algorithm, rule extraction algorithm, symbolic rules.
11352 Numerical Simulation of Investment Casting of Gold Jewelry: Experiments and Validations
Authors: Marco Actis Grande, Somlak Wannarumon
Abstract:
This paper proposes the numerical simulation of the investment casting of gold jewelry. It aims to study the behavior of fluid flow during mould filling and solidification and to optimize the process parameters, which leads to predicting and controlling casting defects such as gas porosity and shrinkage porosity. The finite-difference computer simulation software FLOW-3D was used to simulate the jewelry casting process. A simplified model was designed for both the numerical simulation and real casting production. A set of sensors was allocated at different positions on the wax tree of the model to detect filling times, while a set of thermocouples was allocated to record the temperature during casting and cooling. The detected data were used to validate the results of the numerical simulation against the results of the real casting. The resulting comparisons indicate that numerical simulation can be used as an effective tool in investment-casting-process optimization and casting-defect prediction.
Keywords: Computational fluid dynamics, Investment casting, Jewelry, Mould filling, Simulation.
11351 Superior Performances of the Neural Network on the Masses Lesions Classification through Morphological Lesion Differences
Authors: U. Bottigli, R.Chiarucci, B. Golosio, G.L. Masala, P. Oliva, S.Stumbo, D.Cascio, F. Fauci, M. Glorioso, M. Iacomi, R. Magro, G. Raso
Abstract:
The purpose of this work is to develop an automatic classification system that could be useful to radiologists in breast cancer investigation. The software has been designed in the framework of the MAGIC-5 collaboration. In an automatic classification system, the suspicious regions with a high probability of including a lesion are extracted from the image as regions of interest (ROIs). Each ROI is characterized by a set of features based generally on morphological lesion differences. A study of the feature space representation is made and several classifiers are tested to distinguish the pathological regions from the healthy ones. The results, in terms of sensitivity and specificity, are presented through ROC (Receiver Operating Characteristic) curves. In particular, the best performances are obtained with the neural networks in comparison with the k-nearest neighbours and the support vector machine: the radial basis function network supplies the best results, with an area under the ROC curve of 0.89 ± 0.01, and similar results are obtained with the probabilistic neural network and a multilayer perceptron.
Keywords: Neural Networks, K-Nearest Neighbours, Support Vector Machine, Computer Aided Detection
11350 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features
Authors: Kyi Pyar Zaw, Zin Mar Kyu
Abstract:
Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system, and the character recognition rate may be low or high depending on the extracted features. In the proposed paper, 25 features per character are used for character recognition. Basically, there are three steps in character recognition: character segmentation, feature extraction and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method is used for character segmentation. In the feature extraction step, features are extracted in two ways. First, 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. Second, the input character is divided into 16 blocks; for each block, 8 feature values are obtained through the eight-direction chain code frequency extraction method, and the sum of these 8 values is defined as the feature for that block, so 16 features are extracted from the 16 blocks. We use the number-of-holes feature to cluster similar characters. With these features we can recognize almost all common Myanmar characters in various font sizes. All 25 features are used in both the training part and the testing part. In the classification step, the characters are classified by matching all the features of the input character with the already trained character features.
Keywords: Chain code frequency, character recognition, feature extraction, features matching, segmentation.
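The eight-direction chain code frequency feature described above can be sketched with OpenCV; the synthetic glyph, the direction labelling and the boundary extraction shown here are illustrative assumptions, not the authors' horizontal/vertical cropping pipeline.

```python
import cv2
import numpy as np

# Synthetic binary "character": a filled square standing in for a segmented Myanmar glyph.
glyph = np.zeros((40, 40), dtype=np.uint8)
glyph[8:32, 8:32] = 255

# Ordered boundary points (CHAIN_APPROX_NONE keeps every 8-connected boundary pixel).
# The [-2] index returns the contour list in both OpenCV 3.x and 4.x.
contours = cv2.findContours(glyph, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)[-2]
boundary = contours[0][:, 0, :]                  # (N, 2) array of (x, y) points

# Map each step between consecutive boundary pixels to one of 8 direction codes.
directions = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
              (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}
codes = [directions[tuple(d)] for d in np.diff(boundary, axis=0)]

# Eight-direction chain code frequency: normalized histogram of the codes.
freq = np.bincount(codes, minlength=8) / len(codes)
print("8-dimensional chain code frequency feature:", np.round(freq, 3))
```

The same histogram computed per block, after splitting the character into 16 blocks, would yield the remaining block-wise features described in the abstract.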