Search results for: classification and clustering.
459 Application of Granular Computing Paradigm in Knowledge Induction
Authors: Iftikhar U. Sikder
Abstract:
This paper illustrates an application of the granular computing approach, namely rough set theory, in data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinnings of rough set theory, which has been widely used by the data mining and machine learning communities. A real-world application is illustrated, and the classification performance is compared with that of other contending machine learning algorithms. The predictive performance of the rough set rule induction model is comparable to that of the competing algorithms.
Keywords: Concept approximation, granular computing, reducts, rough set theory, rule induction.
458 An Overview of Construction and Demolition Waste as Coarse Aggregate in Concrete
Authors: S. R. Shamili, J. Karthikeyan
Abstract:
The rapid growth of the world's population and widespread urbanization have greatly expanded the construction industry. As a result of these activities, old structures are being demolished to make way for new buildings. These large-scale demolitions generate a huge amount of debris all over the world, much of which ends up in landfills. Landfilling construction and demolition waste causes groundwater contamination, which is hazardous. Using construction and demolition waste as aggregate can reduce the use of natural aggregates and the problems associated with mining. The objective of this study is to provide a detailed overview of how construction and demolition waste has been used as aggregate in structural concrete. The preparation, classification, and composition of construction and demolition wastes are also discussed.
Keywords: Aggregate, construction and demolition waste, landfill, large scale demolition.
457 Weighted k-Nearest-Neighbor Techniques for High Throughput Screening Data
Authors: K. Kozak, M. Kozak, K. Stapor
Abstract:
The k-nearest neighbors (kNN) algorithm is a simple but effective classification method. In this paper, we present an extended version of this technique for chemical compounds used in high-throughput screening, in which the distances to the nearest neighbors are taken into account. Our algorithm uses kernel weight functions as guidance for the process of defining activity in screening data. The proposed kernel weight function combines properties of the graphical structure and molecular descriptors of screening compounds. We apply the modified kNN method to several experimental datasets from biological screens. The experimental results confirm the effectiveness of the proposed method.
Keywords: biological screening, kernel methods, KNN, QSAR
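As an illustration of the distance-weighted kNN idea described in the abstract above, the sketch below uses scikit-learn's KNeighborsClassifier with a Gaussian kernel weight function. The bandwidth, the kernel choice, and the synthetic descriptor data are illustrative assumptions, not the authors' combined graph-structure/descriptor kernel.

```python
# Minimal sketch of a kernel-weighted kNN classifier (hypothetical bandwidth and data,
# not the authors' combined graph/descriptor kernel).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def gaussian_weights(distances, bandwidth=1.0):
    """Turn neighbor distances into Gaussian kernel weights."""
    return np.exp(-(distances ** 2) / (2.0 * bandwidth ** 2))

# Stand-in for molecular descriptors of screening compounds (active vs. inactive).
X, y = make_classification(n_samples=500, n_features=20, n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=7, weights=gaussian_weights)
knn.fit(X_train, y_train)
print("weighted kNN accuracy:", knn.score(X_test, y_test))
```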
456 On-Road Text Detection Platform for Driver Assistance Systems
Authors: Guezouli Larbi, Belkacem Soundes
Abstract:
Automating the text detection process can assist drivers in their driving task. It can give drivers more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, pseudo-Zernike moments are used to pinpoint areas of the image that may contain text; this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) serves as the classification phase. The results show that the proposed framework achieves approximately 35 fps and an mAP of approximately 90%, i.e., low computational time with competitive accuracy.
Keywords: Text detection, CNN, PZM, deep learning.
455 Eclectic Rule-Extraction from Support Vector Machines
Authors: Nahla Barakat, Joachim Diederich
Abstract:
Support vector machines (SVMs) have shown superior performance compared to other machine learning techniques, especially in classification problems. Yet one limitation of SVMs is their lack of an explanation capability, which is crucial in some applications, e.g., in the medical and security domains. In this paper, a novel approach for eclectic rule-extraction from support vector machines is presented. This approach utilizes the knowledge acquired by the SVM and represented in its support vectors, as well as the parameters associated with them. The approach includes three stages: training, propositional rule-extraction, and rule quality evaluation. Results from four different experiments have demonstrated the value of the approach for extracting comprehensible rules of high accuracy and fidelity.
Keywords: Data mining, hybrid rule-extraction algorithms, medical diagnosis, SVMs.
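The sketch below illustrates one common eclectic strategy, not the paper's exact three-stage algorithm: query the trained SVM on its support vectors, fit a shallow surrogate decision tree to those predictions, and read propositional rules off the tree, with fidelity to the SVM standing in for the rule-quality stage. The dataset and tree depth are illustrative assumptions.

```python
# Sketch of one eclectic rule-extraction strategy (a stand-in for the paper's method):
# learn rules from the SVM's own predictions on its support vectors.
from sklearn.datasets import load_breast_cancer
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_breast_cancer(return_X_y=True)
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Knowledge acquired by the SVM is concentrated in its support vectors.
sv = svm.support_vectors_
sv_labels = svm.predict(sv)

# Propositional rule extraction via a shallow surrogate tree; fidelity to the SVM,
# measured on the full data, stands in for the rule-quality evaluation stage.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(sv, sv_labels)
print(export_text(tree))
print("fidelity to SVM:", (tree.predict(X) == svm.predict(X)).mean())
```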
454 Spatio-Temporal Video Slice Edges Analysis for Shot Transition Detection and Classification
Authors: Aissa Saoudi, Hassane Essafi
Abstract:
In this work, we present a new approach for automatic shot transition detection. Our approach is based on the analysis of Spatio-Temporal Video Slice (STVS) edges extracted from videos. The proposed approach efficiently detects both abrupt shot transitions (cuts) and gradual ones such as fade-in, fade-out, and dissolve. Compared to other techniques, our method is distinguished by its high precision and speed. This performance is obtained by reducing the shot boundary detection problem to a simple 2D image partitioning problem.
Keywords: Boundary shot detection, shot transition detection, video analysis, video indexing.
453 Identification and Classification of Plastic Resins using Near Infrared Reflectance Spectroscopy
Authors: Hamed Masoumi, Seyed Mohsen Safavi, Zahra Khani
Abstract:
In this paper, an automated system is presented for the identification and separation of plastic resins based on near infrared (NIR) reflectance spectroscopy. For identification and separation among resins, a "Two-Filter" identification method is proposed that is capable of distinguishing among polyethylene terephthalate (PET), high density polyethylene (HDPE), polyvinyl chloride (PVC), polypropylene (PP), and polystyrene (PS). By examining the effects of parameters such as surface contamination, sample thickness, and the presence of labels and caps, it is shown that the "Two-Filter" method is highly effective in identifying resins. Accurate identification and separation of the five major resins can be obtained by calculating the relative reflectance at two wavelengths in the NIR region.
Keywords: Identification, Near Infrared, Plastic, Separation, Spectroscopy.
452 Comparison of Machine Learning and Deep Learning Algorithms for Automatic Classification of 80 Different Pollen Species
Authors: Endrick Barnacin, Jean-Luc Henry, Jimmy Nagau, Jack Molinié
Abstract:
Palynology is a field of interest in many disciplines due to its multiple applications: chronological dating, climatology, allergy treatment, and honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. In this context, the automation of this task is urgent. In this work, we compare classical feature extraction methods (shape, GLCM, LBP, and others) and deep learning (CNN and transfer learning) on a recognition task over 80 regional pollen species. Transfer learning was found to be more precise than the other approaches.
Keywords: Image segmentation, stuck particles separation, Sobel operator, thresholding.
451 Kohonen Self-Organizing Maps as a New Method for Determination of Salt Composition of Multi-Component Solutions
Authors: Sergey A. Burikov, Tatiana A. Dolenko, Kirill A. Gushchin, Sergey A. Dolenko
Abstract:
The paper presents the results of clustering by Kohonen self-organizing maps (SOM) applied to the analysis of an array of Raman spectra of multi-component solutions of inorganic salts, in order to determine the types of salts present in the solution. It is demonstrated that SOM is a promising method for solving clustering and classification problems in the spectroscopy of multi-component objects, as attributing a pattern to some cluster may be used to recognize the component composition of the object.
Keywords: Kohonen self-organizing maps, clustering, multi-component solutions, Raman spectroscopy.
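A minimal NumPy sketch of Kohonen SOM training is shown below, run on random vectors standing in for Raman spectra; the grid size, learning rate, and neighborhood schedule are illustrative assumptions rather than values from the paper.

```python
# Minimal Kohonen SOM sketch on synthetic stand-ins for Raman spectra.
import numpy as np

rng = np.random.default_rng(0)
data = rng.random((300, 50))          # 300 "spectra", 50 spectral channels
grid_h, grid_w, dim = 6, 6, data.shape[1]
weights = rng.random((grid_h, grid_w, dim))
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 3000, 0.5, 3.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # Best-matching unit (BMU): node whose weight vector is closest to the sample.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Shrinking learning rate and neighborhood radius.
    lr = lr0 * np.exp(-t / n_iter)
    sigma = sigma0 * np.exp(-t / n_iter)
    # Gaussian neighborhood pulls nearby nodes toward the sample.
    grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None]
    weights += lr * h * (x - weights)

# Each spectrum is attributed to the cluster (map node) of its BMU.
bmus = [np.unravel_index(np.argmin(np.linalg.norm(weights - s, axis=-1)), (grid_h, grid_w))
        for s in data]
print("first five cluster assignments:", bmus[:5])
```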
450 Comparison of MODIS-Based Rice Extent Map and Landsat-Based Rice Classification Map in Determining Biomass Energy Potential of Rice Hull in Nueva Ecija, Philippines
Authors: Klathea Sevilla, Marjorie Remolador, Bryan Baltazar, Imee Saladaga, Loureal Camille Inocencio, Ma. Rosario Concepcion Ang
Abstract:
The underutilization of biomass resources in the Philippines, combined with its growing population and the rise in fossil fuel prices, confirms the demand for alternative energy sources. The goal of this paper is to compare MODIS-based and Landsat-based agricultural land cover maps when used in the estimation of the available energy potential of rice hull. Biomass resource assessment was done using mathematical models and remote sensing techniques employed in a GIS platform.
Keywords: Biomass, geographic information system, GIS, renewable energy.
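A back-of-the-envelope sketch of the kind of estimate such land cover maps feed is given below; the area, yield, residue-to-product ratio, and heating value are placeholder figures for illustration, not values from the study.

```python
# Illustrative rice-hull energy estimate with placeholder coefficients (not study values).
rice_area_ha = 100_000          # rice extent from a land cover map, in hectares
yield_t_per_ha = 4.0            # paddy yield
hull_fraction = 0.20            # rice hull as a fraction of paddy by mass
heating_value_mj_per_kg = 14.0  # heating value of rice hull

hull_t = rice_area_ha * yield_t_per_ha * hull_fraction
energy_tj = hull_t * 1000 * heating_value_mj_per_kg / 1e6  # MJ -> TJ
print(f"rice hull: {hull_t:,.0f} t/yr, energy potential: {energy_tj:,.0f} TJ/yr")
```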
449 On Decomposition of Maximal Prefix Codes
Authors: Nikolai Krainiukov, Boris Melnikov
Abstract:
We study the properties of maximal prefix codes. Such codes have many applications in computer science, the theory of formal languages, data processing, and data classification. Our approach uses finite state automata (so-called flower automata) to represent prefix codes. An important task is the decomposition of prefix codes into prime prefix codes (factors), and we discuss the properties of such decompositions. A linear-time algorithm is designed to find the prime decomposition. We use the GAP computer algebra system, which allows us to perform algebraic operations on free semigroups, monoids, and automata.
Keywords: Maximal prefix code, regular languages, flower automata, prefix code decomposition.
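The short sketch below illustrates two standard checks on a finite set of words over a q-ary alphabet: the prefix condition (no codeword is a prefix of another) and maximality, which for finite prefix codes is equivalent to the Kraft sum equalling 1. It is not the paper's flower-automaton decomposition algorithm, and the example codes are made up.

```python
# Sketch: prefix-code and maximality tests for finite codes (not the flower-automaton
# decomposition of the paper; example codes are illustrative).
from fractions import Fraction

def is_prefix_code(words):
    """No word is a proper prefix of another."""
    return not any(u != v and v.startswith(u) for u in words for v in words)

def is_maximal_prefix_code(words, alphabet_size=2):
    """For a finite prefix code, maximality is equivalent to Kraft equality: sum q^-|w| = 1."""
    kraft = sum(Fraction(1, alphabet_size ** len(w)) for w in words)
    return is_prefix_code(words) and kraft == 1

print(is_maximal_prefix_code({"0", "10", "11"}))   # True: complete binary code
print(is_maximal_prefix_code({"0", "10"}))         # False: words starting with "11" uncovered
print(is_prefix_code({"0", "01"}))                 # False: "0" is a prefix of "01"
```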
448 Automatic Segmentation of Dermoscopy Images Using Histogram Thresholding on Optimal Color Channels
Authors: Rahil Garnavi, Mohammad Aldeen, M. Emre Celebi, Alauddin Bhuiyan, Constantinos Dolianitis, George Varigos
Abstract:
Automatic segmentation of skin lesions is the first step towards the development of a computer-aided diagnosis of melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most discriminative and effective color space for the melanoma application. This paper proposes a novel automatic segmentation algorithm using color space analysis and clustering-based histogram thresholding, which is able to determine the optimal color channel for segmentation of skin lesions. To demonstrate the validity of the algorithm, it is tested on a set of 30 high-resolution dermoscopy images, and a comprehensive evaluation of the results is provided, in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm. The evaluation is carried out by applying three previously used metrics of accuracy, sensitivity, and specificity, and a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which yield an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, which demonstrates the effectiveness and superiority of the proposed segmentation method.
Keywords: Border detection, color space analysis, dermoscopy, histogram thresholding, melanoma, segmentation.
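The sketch below illustrates the clustering-based histogram-thresholding idea with Otsu's method (a two-class split of the intensity histogram) applied to one chosen channel. The synthetic image and the channel choice are placeholders, not the paper's X/XoYoR channels or its exact algorithm.

```python
# Sketch of clustering-based histogram thresholding on one color channel
# (Otsu's two-class split; synthetic data stands in for a dermoscopy image).
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(0)
# Synthetic RGB "dermoscopy" image: dark lesion disk on a brighter skin background.
img = np.full((200, 200, 3), 180, dtype=np.uint8) + rng.integers(0, 20, (200, 200, 3), dtype=np.uint8)
yy, xx = np.mgrid[:200, :200]
lesion = (yy - 100) ** 2 + (xx - 100) ** 2 < 50 ** 2
img[lesion] = img[lesion] // 3

channel = img[..., 0].astype(float)   # the "optimal" channel choice is illustrative
t = threshold_otsu(channel)           # histogram split into lesion / background classes
mask = channel < t                    # lesion is the darker class
print("threshold:", t, "lesion fraction:", mask.mean().round(3))
```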
447 Analyzing Multi-Labeled Data Based on the Role of a Concept against a Semantic Range
Authors: Masahiro Kuzunishi, Tetsuya Furukawa, Ke Lu
Abstract:
Classifying data hierarchically is an efficient approach to analyzing data. Data are usually classified into multiple categories, or annotated with a set of labels. To analyze multi-labeled data, such data must be specified by giving a set of labels as a semantic range. Data are analyzed for certain purposes. This paper shows which multi-labeled data should be the target of analysis for those purposes, and discusses the role of a label against a set of labels by investigating the change that occurs when a label is added to the set. These discussions give methods for the advanced analysis of multi-labeled data, based on the role of a label against a semantic range.
Keywords: Classification hierarchies, data analysis, multi-labeled data, orders of sets of labels.
446 A Two-Stage Expert System for Diagnosis of Leukemia Based on Type-2 Fuzzy Logic
Authors: Ali Akbar Sadat Asl
Abstract:
Diagnosis and decision-making about diseases in medical fields face inherent uncertainty, which can affect the whole process of treatment. Such decisions are made based on expert knowledge and on the way an expert interprets the patient's condition, and the interpretations of different experts may differ. Fuzzy logic can provide mathematical modeling for many concepts, variables, and systems that are unclear and ambiguous, and it can also provide a framework for reasoning, inference, control, and decision-making under conditions of uncertainty. In systems with high uncertainty and high complexity, fuzzy logic is a suitable modeling method. In this paper, we use type-2 fuzzy logic to model the uncertainty in the diagnosis of leukemia. The proposed system uses an indirect-direct approach and consists of two stages. In the first stage, the state of the blood test is inferred; here we use an indirect approach in which the rules are extracted automatically by implementing a clustering approach. In the second stage, the signs of leukemia, the duration of the disease until its progression, and the output of the first stage are combined to obtain the final diagnosis of the system. In this stage, the system uses a direct approach and the final diagnosis is determined by the expert. The obtained results show that the type-2 fuzzy expert system can diagnose leukemia with an average accuracy of about 97%.
Keywords: Expert system, leukemia, medical diagnosis, type-2 fuzzy logic.
445 An Approach for the Prediction of Cardiovascular Diseases
Authors: Nebi Gedik
Abstract:
Regardless of age or gender, cardiovascular illnesses are a serious health concern because of factors such as poor eating habits, stress, a sedentary lifestyle, demanding work schedules, alcohol use, and excess weight. Cardiovascular disease tends to occur suddenly and has a high rate of recurrence. Machine learning models can be implemented to assist healthcare systems in the accurate detection and diagnosis of cardiovascular disease (CVD) in patients. Improved heart failure prediction is one of the primary goals of researchers using the heart disease dataset. The purpose of this study is to identify the feature or features that offer the best classification performance for CVD detection. A support vector machine classifier is used to compare the performance of each feature, and the feature that produces the best results is identified.
Keywords: Cardiovascular disease, feature extraction, supervised learning, support vector machine.
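A compact sketch of the single-feature comparison described above follows: an SVM is trained and cross-validated on each feature alone, and the features are ranked by accuracy. The synthetic data and feature names are placeholders for the heart disease dataset used in the study.

```python
# Sketch: rank each feature by the cross-validated accuracy of an SVM trained on it alone.
# Synthetic data stands in for the heart disease dataset used in the study.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=13, n_informative=5, random_state=1)

scores = {}
for j in range(X.shape[1]):
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores[f"feature_{j}"] = cross_val_score(clf, X[:, [j]], y, cv=5).mean()

best = max(scores, key=scores.get)
print("best single feature:", best, "accuracy:", round(scores[best], 3))
```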
444 Artificial Neural Networks for Classifying Magnetic Measurements in Tokamak Reactors
Authors: A. Greco, N. Mammone, F. C. Morabito, M. Versaci
Abstract:
This paper is mainly concerned with the application of a novel data interpretation technique to the characterization and classification of measurements of plasma columns in Tokamak reactors for nuclear fusion applications. The proposed method exploits several concepts derived from soft computing theory. In particular, artificial neural networks are exploited to classify magnetic variables useful for determining the shape and position of the plasma with reduced computational complexity. The proposed technique is used to analyze simulated databases of plasma equilibria based on the ITER geometry configuration. As well as demonstrating the successful recovery of scalar equilibrium parameters, we show that the technique can yield practical advantages compared with earlier methods.
Keywords: Tokamak, sensors, artificial neural network.
443 Surrogate based Evolutionary Algorithm for Design Optimization
Authors: Maumita Bhattacharya
Abstract:
Optimization is often a critical issue in system design problems. Evolutionary algorithms are population-based, stochastic search techniques widely used as efficient global optimizers. However, finding optimal solutions to complex, high-dimensional, multimodal problems often requires highly computationally expensive function evaluations and hence is practically prohibitive. The Dynamic Approximate Fitness based Hybrid EA (DAFHEA) model presented in our earlier work [14] reduces computation time through the controlled use of meta-models that partially replace actual function evaluations with approximate ones. However, the underlying assumption in DAFHEA is that the training samples for the meta-model are generated from a single uniform model. Situations such as model formation involving variable input dimensions and noisy data certainly cannot be covered by this assumption. In this paper, we present an enhanced version of DAFHEA that incorporates a multiple-model based learning approach for the SVM approximator. DAFHEA-II (the enhanced version of the DAFHEA framework) also overcomes the high computational expense of the additional clustering requirements of the original DAFHEA framework. The proposed framework has been tested on several benchmark functions, and the empirical results illustrate the advantages of the proposed technique.
Keywords: Evolutionary algorithm, fitness function, optimization, meta-model, stochastic method.
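The sketch below shows a generic surrogate-assisted (mu + lambda) loop in which an SVM regressor pre-screens offspring so that only the most promising candidates receive exact evaluations. It is not DAFHEA or DAFHEA-II; the benchmark function, population sizes, mutation scale, and screening ratio are illustrative assumptions.

```python
# Generic surrogate-assisted evolutionary loop (illustrative; not DAFHEA/DAFHEA-II).
# An SVR meta-model pre-screens offspring so only promising ones get exact evaluations.
import numpy as np
from sklearn.svm import SVR

def sphere(x):                      # benchmark objective (minimize)
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim, mu, lam, generations = 10, 20, 60, 30
pop = rng.uniform(-5, 5, (mu, dim))
fit = np.array([sphere(p) for p in pop])
archive_X, archive_y = list(pop), list(fit)

for g in range(generations):
    # Train the meta-model on all exactly evaluated points seen so far.
    surrogate = SVR(kernel="rbf", C=10.0).fit(np.array(archive_X), np.array(archive_y))
    # Generate offspring by Gaussian mutation of randomly chosen parents.
    parents = pop[rng.integers(mu, size=lam)]
    offspring = parents + rng.normal(0, 0.3, parents.shape)
    # Pre-screen with the surrogate; exactly evaluate only the best-predicted quarter.
    predicted = surrogate.predict(offspring)
    chosen = offspring[np.argsort(predicted)[: lam // 4]]
    chosen_fit = np.array([sphere(c) for c in chosen])
    archive_X.extend(chosen); archive_y.extend(chosen_fit)
    # Survivor selection on exact fitness values only.
    pool = np.vstack([pop, chosen]); pool_fit = np.concatenate([fit, chosen_fit])
    keep = np.argsort(pool_fit)[:mu]
    pop, fit = pool[keep], pool_fit[keep]

print("best exact fitness found:", fit.min())
```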
442 Stock Movement Prediction Using Price Factor and Deep Learning
Abstract:
The development of machine learning methods and techniques has opened doors for investigation in many areas such as medicine, economics, and finance. One active research area involving machine learning is stock market prediction. This paper considers multiple techniques and methods for stock movement prediction using historical prices or price factors, and explores the effectiveness of several deep learning frameworks for forecasting stock. Moreover, an architecture (TimeStock) is proposed which takes the representation of time into account in addition to the price information itself. Our model achieves a promising result that shows a potential approach to the stock movement prediction problem.
Keywords: Classification, machine learning, time representation, stock prediction.
441 Mineralogical Characterization and Petrographic Classification of the Soil of Casablanca City
Authors: I. Fahi, T. Remmal, F. El Kamel, B. Ayoub
Abstract:
The treatment of the geotechnical database of the Casablanca region was difficult due to the heterogeneous nomenclature of the lithological formations composing its soil. It therefore appears necessary to harmonize the nomenclature of the facies and to produce cartographic documents useful for construction projects and for studies preceding any investment program. To achieve this, more than 600 surveys carried out by the Public Laboratory for Testing and Studies (LPEE) in the Casablanca agglomeration were studied. Moreover, local observations were made in different places of the metropolis. Each survey was the subject of a sheet containing the lithological succession, a macroscopic and microscopic description of the petrographic facies with photographic illustration, as well as measurements from geomechanical tests. In addition, an X-ray diffraction analysis was performed in order to characterize the surficial formations of the region.
Keywords: Casablanca, guidebook, petrography, soil.
440 Performance Comparison and Evaluation of AdaBoost and SoftBoost Algorithms on Generic Object Recognition
Authors: Doaa Hegazy, Joachim Denzler
Abstract:
SoftBoost is a recently presented boosting algorithm which trades off the size of the achieved classification margin against generalization performance. This paper presents a performance evaluation of the SoftBoost algorithm on the generic object recognition problem. An appearance-based generic object recognition model is used, and the evaluation experiments are performed on a difficult object recognition benchmark. An assessment with respect to different degrees of label noise, as well as a comparison to the well-known AdaBoost algorithm, is performed. The obtained results reveal that SoftBoost should be preferred when the training data are known to have a high degree of noise; otherwise, AdaBoost can achieve better performance.
Keywords: SoftBoost algorithm, AdaBoost algorithm, generic object recognition.
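The sketch below reproduces only the label-noise side of such a comparison: AdaBoost accuracy is measured as an increasing fraction of training labels is flipped. SoftBoost has no scikit-learn implementation, so it is omitted, and the dataset and noise levels are illustrative.

```python
# Sketch: AdaBoost accuracy as training labels are flipped (SoftBoost omitted;
# dataset and noise levels are illustrative).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=30, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
for noise in (0.0, 0.1, 0.2, 0.3):
    y_noisy = y_tr.copy()
    flip = rng.random(len(y_noisy)) < noise        # flip a fraction of training labels
    y_noisy[flip] = 1 - y_noisy[flip]
    clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_noisy)
    print(f"label noise {noise:.0%}: test accuracy {clf.score(X_te, y_te):.3f}")
```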
439 Automatic Detection of Breast Tumors in Sonoelastographic Images Using DWT
Authors: A. Sindhuja, V. Sadasivam
Abstract:
Breast cancer is the most common malignancy in women and the second leading cause of death for women all over the world. The earlier the cancer is detected, the better the treatment. The diagnosis and treatment of the cancer rely on segmentation of sonoelastographic images, for which texture features have not previously been considered. Sonoelastographic images of 15 patients containing both benign and malignant tumors are considered for experimentation. The images are enhanced to remove noise in order to improve contrast and emphasize the tumor boundary. Each image is then decomposed into sub-bands using single-level Daubechies wavelets with one to six coefficients. Grey Level Co-occurrence Matrix (GLCM) and Local Binary Pattern (LBP) features are extracted from each sub-band and then selected by ranking using the Sequential Floating Forward Selection (SFFS) technique. The resulting images undergo k-means clustering followed by a few post-processing steps to remove false spots, and the tumor boundary is detected from the segmented image. It is proposed that the LBP features of the vertical coefficients of the Daubechies wavelet with two coefficients are best suited for segmentation of sonoelastographic breast images among the wavelet members using one to six coefficients for decomposition. The results are also quantified with the help of an expert radiologist. The proposed work can be used in the further diagnostic process to decide whether the segmented tumor is benign or malignant.
Keywords: Breast Cancer, Segmentation, Sonoelastography, Tumor Detection.
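A condensed sketch of the pipeline's core steps follows: single-level db2 decomposition, LBP on the vertical sub-band, and k-means into two regions. The synthetic image, the omission of GLCM features and SFFS selection, and the lack of post-processing are simplifications for illustration.

```python
# Condensed sketch of the pipeline's core: db2 wavelet -> LBP on the vertical sub-band
# -> k-means into two regions. Synthetic image; GLCM features, SFFS selection and
# post-processing from the paper are omitted.
import numpy as np
import pywt
from skimage.feature import local_binary_pattern
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img = rng.random((128, 128))                    # stand-in for a sonoelastographic image
img[40:90, 40:90] += 1.5                        # brighter block standing in for a tumor

cA, (cH, cV, cD) = pywt.dwt2(img, "db2")        # single-level Daubechies-2 decomposition
lbp = local_binary_pattern(cV, P=8, R=1, method="uniform")

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(lbp.reshape(-1, 1))
mask = labels.reshape(lbp.shape)
print("cluster sizes:", np.bincount(mask.ravel()))
```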
438 An AI-Based Dynamical Resource Allocation Calculation Algorithm for Unmanned Aerial Vehicle
Authors: Zhou Luchen, Wu Yubing, Burra Venkata Durga Kumar
Abstract:
As the scale of networks becomes larger and more complex than before, the density of user devices is also increasing. Unmanned aerial vehicle (UAV) networks can collect and transfer data efficiently by using software-defined networking (SDN) technology. This paper proposes a three-layer distributed and dynamic cluster architecture to manage UAVs, using an AI-based resource allocation calculation algorithm to address the network overload problem. By separating the services of each UAV, the UAV hierarchical cluster system performs the main function of reducing the network load and transferring user requests, with three sub-tasks: data collection, communication channel organization, and data relaying. In each cluster, a head node and a vice head node UAV are selected considering the CPU, RAM, and ROM memory of the devices, the battery charge, and the capacity. The vice head node acts as a backup that stores all the data of the head node. The k-means clustering algorithm is used to detect high-load regions and form the layered UAV clusters. The whole process of detecting high-load areas, forming and selecting UAV clusters, and moving the selected UAV cluster to that area is proposed as the traffic offloading algorithm.
Keywords: k-means, resource allocation, SDN, UAV network, unmanned aerial vehicles.
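A small sketch of the clustering and head-node selection steps is given below: k-means over user device positions locates load regions, and a simple resource score over CPU, RAM, ROM, and battery picks the head and vice head UAVs. The scoring weights and all data are illustrative assumptions, not the paper's selection rule.

```python
# Sketch: k-means to locate high-load regions, then head / vice-head UAV selection
# by a simple resource score (weights and synthetic data are illustrative).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
devices = rng.random((500, 2)) * 1000           # user device positions (m)

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(devices)
loads = np.bincount(km.labels_, minlength=5)    # devices per region ~ regional load
hot_region = int(np.argmax(loads))
print("high-load region:", hot_region, "center:", km.cluster_centers_[hot_region].round(1))

# Candidate UAVs for that region: columns = CPU (GHz), RAM (GB), ROM (GB), battery (%).
uavs = np.array([[1.8, 4, 64, 80],
                 [2.4, 8, 128, 60],
                 [2.0, 4, 64, 95],
                 [1.5, 2, 32, 90]], dtype=float)
score = (uavs / uavs.max(axis=0)).mean(axis=1)  # normalized, equally weighted score
order = np.argsort(score)[::-1]
print("head UAV:", order[0], "vice head UAV (backup):", order[1])
```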
437 Non-Overlapping Hierarchical Index Structure for Similarity Search
Authors: Mounira Taileb, Sid Lamrous, Sami Touati
Abstract:
In order to accelerate similarity search in high-dimensional databases, we propose a new hierarchical indexing method composed of offline and online phases. Our contribution concerns both phases. In the offline phase, after gathering the data into clusters and constructing a hierarchical index, the main originality of our contribution is a method for constructing bounding forms of clusters that avoid overlapping. For the online phase, our idea considerably improves the performance of similarity search; for this second phase we have also developed an adapted search algorithm. Our method, named NOHIS (Non-Overlapping Hierarchical Index Structure), uses Principal Direction Divisive Partitioning (PDDP) as the clustering algorithm. The principle of PDDP is to divide data recursively into two sub-clusters; the division is done using the hyperplane orthogonal to the principal direction derived from the covariance matrix and passing through the centroid of the cluster being divided. The data of each of the two resulting sub-clusters are enclosed in a minimum bounding rectangle (MBR). The two MBRs are oriented along the principal direction; consequently, non-overlapping between the two forms is assured. Experiments use databases containing image descriptors. Results show that the proposed method outperforms sequential scan and the SR-tree in processing k-nearest neighbor queries.
Keywords: K-nearest neighbour search, multi-dimensional indexing, multimedia databases, similarity search.
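The sketch below performs a single PDDP split as described above: the cluster is projected onto the principal direction of its covariance and split at the centroid, and the extents of the two sub-clusters are reported in the principal-axes frame. It uses synthetic data and omits the recursion and index construction of NOHIS.

```python
# Sketch of one PDDP split: divide a cluster by the hyperplane orthogonal to its
# principal direction through the centroid (synthetic data; no recursion or indexing).
import numpy as np

rng = np.random.default_rng(0)
cluster = rng.normal(size=(400, 8)) @ rng.normal(size=(8, 8))   # correlated points

centroid = cluster.mean(axis=0)
centered = cluster - centroid
# Principal direction = leading right singular vector of the centered data (SVD).
_, _, vt = np.linalg.svd(centered, full_matrices=False)
principal = vt[0]

proj = centered @ principal
left, right = cluster[proj <= 0], cluster[proj > 0]
print("split sizes:", len(left), len(right))

# Bounding boxes of the two sub-clusters in the principal-axes frame: their extents on
# the splitting axis do not overlap, mirroring the non-overlapping MBR idea.
for name, part in (("left", left), ("right", right)):
    coords = (part - centroid) @ vt.T           # coordinates in the principal-axes frame
    print(name, "extent on principal axis:",
          coords[:, 0].min().round(2), "to", coords[:, 0].max().round(2))
```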
436 Learning of Class Membership Values by Ellipsoidal Decision Regions
Authors: Leehter Yao, Chin-Chin Lin
Abstract:
A novel method for learning complex fuzzy decision regions in the n-dimensional feature space is proposed. Through the fuzzy decision regions, a given pattern's membership value for every class is determined, instead of the conventional crisp class to which the pattern belongs. The n-dimensional fuzzy decision region is approximated by a union of hyperellipsoids. By explicitly parameterizing these hyperellipsoids, the decision regions are determined by estimating the parameters of each hyperellipsoid. A genetic algorithm is applied to estimate the parameters of each region component. With the global optimization ability of the GA, the learned decision region can be arbitrarily complex.
Keywords: Ellipsoid, genetic algorithm, decision regions, classification.
435 Categorical Missing Data Imputation Using Fuzzy Neural Networks with Numerical and Categorical Inputs
Authors: Pilar Rey-del-Castillo, Jesús Cardeñosa
Abstract:
There are many situations where input feature vectors are incomplete, and methods to tackle the problem have been studied for a long time. A commonly used procedure is to replace each missing value with an imputation. This paper presents a method to perform categorical missing data imputation from numerical and categorical variables. The imputations are based on Simpson's fuzzy min-max neural networks, in which the input variables for learning and classification are only numerical. The proposed method extends the input to categorical variables by introducing new fuzzy sets, a new operation, and a new architecture. The procedure is tested and compared with other methods using opinion poll data.
Keywords: Classifier, imputation techniques, fuzzy systems, fuzzy min-max neural networks.
434 A Classical Method of Optimizing Manufacturing Systems Using a Number of Industrial Engineering Techniques
Authors: John M. Ikome, Martha E. Ikome, Therese Van Wyk
Abstract:
Productivity optimization can significantly increase a company's output. It can take the form of corrective actions on ineffective activities, process simplification, and reduction of variation, response time, and set-up time, all of which fall under the classification of waste within the manufacturing environment. Deriving a means to eliminate a number of these issues is of key importance for a manufacturing organization. This paper focuses on a number of industrial engineering techniques, including the cause and effect diagram, to identify and optimize the methods or systems being used. Our results show that there are a number of variations within the production processes that can significantly disrupt the expected output.
Keywords: Optimization, fishbone diagram, productivity.
433 A Study of Classification Models to Predict Drill-Bit Breakage Using Degradation Signals
Authors: Bharatendra Rai
Abstract:
Cutting tools are widely used in manufacturing processes, and drilling is the most commonly used machining process. Although the drill-bits used in drilling may not be expensive, their breakage can damage the expensive workpiece being drilled and, at the same time, has a major impact on productivity. Predicting drill-bit breakage is therefore important for reducing cost and improving productivity. This study uses twenty features extracted from two degradation signals, viz., thrust force and torque. The methodology involves developing and comparing decision tree, random forest, and multinomial logistic regression models for classifying and predicting drill-bit breakage using the degradation signals.
Keywords: Degradation signal, drill-bit breakage, random forest, multinomial logistic regression.
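The sketch below sets up the three-model comparison named above with cross-validated accuracy; the synthetic data stands in for the twenty thrust-force and torque degradation features, and the hyperparameters are illustrative.

```python
# Sketch: compare decision tree, random forest, and multinomial logistic regression on
# synthetic stand-ins for the 20 thrust-force/torque degradation features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "multinomial logistic": make_pipeline(StandardScaler(),
                                          LogisticRegression(max_iter=1000)),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```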
432 Performance Appraisal System using Multifactorial Evaluation Model
Abstract:
Performance appraisal of employees is important in managing the human resources of an organization. With the change towards knowledge-based capitalism, maintaining talented knowledge workers is critical. However, management classification into "outstanding", "poor", and "average" performance may not be an easy decision. Besides that, superiors might also tend to judge the work performance of their subordinates informally and arbitrarily, especially in the absence of an appraisal system. In this paper, we propose a performance appraisal system using a multifactorial evaluation model to deal with appraisal grades, which are often expressed vaguely in linguistic terms. The proposed model evaluates staff performance based on specific performance appraisal criteria. The project was a collaboration with one of the information and communication technology companies in Malaysia, with reference to its performance appraisal process.
Keywords: Multifactorial evaluation model, performance appraisal system, decision support system.
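A minimal sketch of the multifactorial (fuzzy) evaluation step is shown below: a criterion weight vector is combined with a membership matrix of an employee's ratings over linguistic grades. The criteria, weights, and membership degrees are illustrative placeholders, not the company's actual appraisal factors.

```python
# Minimal sketch of multifactorial (fuzzy) evaluation: weight vector x membership matrix.
# Criteria, weights, and membership degrees are illustrative placeholders.
import numpy as np

grades = ["outstanding", "average", "poor"]
criteria = ["quality of work", "teamwork", "punctuality", "initiative"]
weights = np.array([0.4, 0.3, 0.2, 0.1])            # importance of each criterion

# Row i: degree to which the employee's rating on criterion i belongs to each grade.
membership = np.array([[0.7, 0.3, 0.0],
                       [0.5, 0.4, 0.1],
                       [0.2, 0.6, 0.2],
                       [0.6, 0.3, 0.1]])

evaluation = weights @ membership                    # fuzzy evaluation vector over grades
evaluation /= evaluation.sum()                       # normalize
print(dict(zip(grades, evaluation.round(3))))
print("overall appraisal:", grades[int(np.argmax(evaluation))])
```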
431 Scene Adaptive Shadow Detection Algorithm
Authors: Mohammed Ibrahim M, Anupama R.
Abstract:
Robustness is one of the primary performance criteria for an Intelligent Video Surveillance (IVS) system. One of the key factors in enhancing the robustness of dynamic video analysis is providing accurate and reliable means for shadow detection. If left undetected, shadow pixels may result in incorrect object tracking and classification, as they tend to distort localization and measurement information. Most of the algorithms proposed in the literature are computationally expensive, some to the extent of equalling the computational requirement of motion detection. In this paper, the homogeneity property of shadows is explored in a novel way for shadow detection. An adaptive division-image analysis (which highlights the homogeneity property of shadows), followed by a relatively simple projection histogram analysis for penumbra suppression, is the key novelty of our approach.
Keywords: homogeneity, penumbra, projection histogram, shadow correction
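The sketch below illustrates the division-image idea: the current frame is divided pixel-wise by a background model, and pixels that are uniformly darkened (ratio below one) and locally homogeneous are flagged as shadow candidates. The ratio band and homogeneity threshold are illustrative, and the adaptive analysis and projection-histogram penumbra suppression of the paper are omitted.

```python
# Sketch of the division-image idea for shadow candidates (thresholds are illustrative;
# the paper's adaptive analysis and projection-histogram penumbra suppression are omitted).
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(0)
background = np.full((120, 160), 120.0) + rng.normal(0, 2, (120, 160))
frame = background.copy()
frame[60:100, 40:100] *= 0.6        # region uniformly darkened, as a cast shadow would be
frame[20:50, 110:150] = 30.0        # dark foreground object (not a shadow)

division = frame / (background + 1e-6)          # division image: ~1 where nothing changed
local_mean = uniform_filter(division, size=7)
local_sq = uniform_filter(division ** 2, size=7)
local_std = np.sqrt(np.clip(local_sq - local_mean ** 2, 0, None))

# Shadow candidates: darkened by a roughly constant factor and locally homogeneous.
shadow = (division > 0.4) & (division < 0.9) & (local_std < 0.05)
print("shadow pixels flagged:", int(shadow.sum()))
```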
430 Vulnerability of Groundwater Resources Selected for Emergency Water Supply
Authors: Frantisek Bozek, Alena Bumbova, Eduard Bakos
Abstract:
This paper deals with the vulnerability of the elements of hydrogeological structures and of the technological equipment associated with groundwater resources. The vulnerability assessment stems from the application of a register of hazards and the potential threat to individual water source elements within each type of hazard. The proposed procedure is a pattern for assessing the risks of disturbance, damage, or destruction of a water source by the identified natural or technological hazards and, consequently, for classifying these risks in relation to emergency water supply. The use of this procedure was verified on a selected groundwater resource in a particular region, and it appears potentially useful for the crisis planning system.
Keywords: Hazard, Hydrogeological Structure, Elements, Index, Sensitivity, Water Source, Vulnerability