Search results for: Feature extraction/selection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2298


1998 An Adequate Choice of Initial Sample Size for Selection Approach

Authors: Mohammad H. Almomani, Rosmanjawati Abdul Rahman

Abstract:

In this paper, we consider the effect of the initial sample size on the performance of a sequential approach used to select a good enough simulated system when the number of alternatives is very large. We implement the sequential approach on an M/M/1 queuing system under several parameter settings, with different choices of the initial sample size, to explore its impact on the performance of the approach. The results show that the choice of the initial sample size does affect the performance of our selection approach.
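
As a rough illustration of the experiment described above, the following sketch simulates several M/M/1 alternatives, ranks them from an initial sample of n0 replications, and varies n0 to see how the selection changes. It is a minimal stand-in, not the authors' sequential procedure; the arrival rate, candidate service rates, and run lengths are hypothetical.

```python
import random

def mm1_mean_wait(lam, mu, n_customers, rng):
    """Estimate the mean waiting time of an M/M/1 queue via Lindley's recursion."""
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        total += wait
        service = rng.expovariate(mu)
        inter = rng.expovariate(lam)
        wait = max(0.0, wait + service - inter)
    return total / n_customers

def select_best(service_rates, lam, n0, rng):
    """Rank the alternatives by the mean of n0 initial replications (smaller wait is better)."""
    means = [sum(mm1_mean_wait(lam, mu, 200, rng) for _ in range(n0)) / n0
             for mu in service_rates]
    return min(range(len(service_rates)), key=lambda i: means[i])

rng = random.Random(1)
alternatives = [1.2, 1.5, 2.0, 2.5]      # hypothetical candidate service rates
for n0 in (5, 20, 50):                   # different initial sample sizes
    best = select_best(alternatives, 1.0, n0, rng)
    print(f"n0={n0}: selected service rate {alternatives[best]}")
```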

Keywords: Ranking and Selection, Ordinal Optimization, Optimal Computing Budget Allocation, Subset Selection, Indifference-Zone, Initial Sample Size.

1997 Eclectic Rule-Extraction from Support Vector Machines

Authors: Nahla Barakat, Joachim Diederich

Abstract:

Support vector machines (SVMs) have shown superior performance compared to other machine learning techniques, especially in classification problems. Yet one limitation of SVMs is the lack of an explanation capability, which is crucial in some applications, e.g. in the medical and security domains. In this paper, a novel approach for eclectic rule-extraction from support vector machines is presented. This approach utilizes the knowledge acquired by the SVM and represented in its support vectors, as well as the parameters associated with them. The approach includes three stages: training, propositional rule-extraction and rule quality evaluation. Results from four different experiments have demonstrated the value of the approach for extracting comprehensible rules of high accuracy and fidelity.
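
One way to picture the eclectic idea is to let a transparent learner approximate the SVM using its support vectors. The sketch below is an assumption-laden stand-in for the three stages (train, extract propositional rules, evaluate rule quality as fidelity to the SVM), not the authors' exact algorithm; the dataset, scaling, and tree depth are arbitrary choices.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)            # scale features for the RBF kernel
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)

sv = svm.support_vectors_                        # knowledge retained by the SVM
sv_labels = svm.predict(sv)                      # the SVM's own view of those points

tree = DecisionTreeClassifier(max_depth=3).fit(sv, sv_labels)   # propositional rules
print(export_text(tree))

fidelity = (tree.predict(X) == svm.predict(X)).mean()           # rule quality: agreement with the SVM
print(f"fidelity to the SVM: {fidelity:.2%}")
```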

Keywords: Data mining, hybrid rule-extraction algorithms, medical diagnosis, SVMs

1996 Preliminary Evaluation of Feasibility for Wind Energy Production on Offshore Extraction Platforms

Authors: M. Raciti Castelli, S. De Betta, E. Benini

Abstract:

A preliminary evaluation of the feasibility of installing small wind turbines on offshore oil and gas extraction platforms is presented. Some aerodynamic considerations are developed in order to determine the best rotor architecture to exploit the wind potential on such installations, assuming that wind conditions over the platforms are similar to those registered on the roofs of urban buildings. Economic considerations about both the advantages and disadvantages of exploiting wind energy on offshore extraction platforms, with respect to conventional offshore wind plants, are also presented. Finally, wind charts of European offshore winds are presented together with a map of the major offshore installations.

Keywords: Extraction platform, offshore wind energy, vertical-axis wind turbine (VAWT).

1995 Vehicle Detection Method using Haar-like Feature on Real Time System

Authors: Sungji Han, Youngjoon Han, Hernsoo Hahn

Abstract:

This paper presents a robust vehicle detection approach using Haar-like features. Haar-like features yield strong edge responses, which makes them effective for removing the shadow a vehicle casts on the road and for detecting vehicle boundaries accurately. The vehicle detection algorithm can be divided into two main steps: hypothesis generation and hypothesis verification. The first step determines vehicle candidates using features such as shadow, intensity, and vertical edges; the second step determines whether a candidate is a vehicle by using the symmetry of vehicle edge features. In this research, the detector processes more than 15 frames per second on our embedded system.
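
For readers unfamiliar with Haar-like features, the sketch below computes a simple two-rectangle vertical-edge feature over an integral image. It only illustrates the kind of edge response the abstract relies on and is not the paper's detector; the window size and toy image are made up.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero first row/column for easy box sums."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def box_sum(ii, r, c, h, w):
    """Sum of img[r:r+h, c:c+w] in constant time."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_vertical_edge(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    return box_sum(ii, r, c, h, half) - box_sum(ii, r, c + half, h, half)

img = np.zeros((8, 8), dtype=np.int64)
img[:, 4:] = 255                            # a vertical edge at column 4
ii = integral_image(img)
print(haar_vertical_edge(ii, 0, 0, 8, 8))   # strong (negative) response at the edge
```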

Keywords: vehicle detection, Haar-like feature, single camera, real time

1994 Influences of Juice Extraction and Drying Methods on the Chemical Analysis of Lemon Peels

Authors: Azza A. Abou-Arab, Marwa H. Mahmoud, Ferial M. Abu-Salem

Abstract:

This study aimed to determine the influence of different juice extraction methods (a screw-type hand-operated juice extractor and a pressed-squeeze juice extractor) as well as drying methods (microwave, solar and oven drying) on the chemical properties of lemon peels. It could be concluded that extraction of the juice with the screw-type extractor and drying of the peel using the microwave method were the best preparative processing steps for utilizing lemon peel as a food additive.

Keywords: Lemon peel, extraction of juice methods, chemical analysis.

1993 Stroke Extraction and Approximation with Interpolating Lagrange Curves

Authors: Bence Kővári, Zsolt Kertész

Abstract:

This paper proposes a stroke extraction method for use in off-line signature verification. After a brief overview of current research, an algorithm is introduced for detecting and following strokes in static images of signatures. Problems such as the handling of junctions and variations in line width and line intensity are discussed in detail. Results are validated both by using an existing on-line signature database and by employing image registration methods.
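
Once a stroke has been extracted as a sequence of points, approximating it with an interpolating Lagrange curve can look like the sketch below. The sample points are hypothetical and the single-polynomial fit is only meant to show the interpolation step; in practice short windows or splines are preferred, since high-degree Lagrange interpolation is numerically unstable.

```python
import numpy as np
from scipy.interpolate import lagrange

# Hypothetical stroke centre-line samples (x along the stroke, y its offset).
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([0.0, 0.8, 1.1, 0.9, 0.2])

poly = lagrange(xs, ys)                    # interpolating Lagrange polynomial
dense_x = np.linspace(xs[0], xs[-1], 50)
approx = poly(dense_x)                     # smooth curve through the sampled points
print(np.round(approx[:5], 3))
```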

Keywords: Stroke extraction, spline fitting, off-line signature verification, image registration.

1992 Biocompatible Ionic Liquids in Liquid – Liquid Extraction of Lactic Acid: A Comparative Study

Authors: Konstantza Tonova, Ivan Svinyarov, Milen G. Bogdanov

Abstract:

Ionic liquids consisting of a phosphonium cationic moiety and a saccharinate anion are synthesized and compared with their precursors, phosphonium chlorides, with reference to their extraction efficiency towards L-lactic acid. On the basis of measurements of acid and water partitioning in the equilibrium biphasic systems, the molar ratios between acid, water and ionic liquid are estimated, which allows the lactic acid extraction pathway to be deduced. The effect of a salting-out additive that strengthens the hydrophobicity of both phases is studied with a view to identifying the biphasic system that best combines low ionic-liquid toxicity with high extraction efficiency.

Keywords: Biphasic system, Extraction, Ionic liquids, Lactic acid.

1991 Enhanced Bidirectional Selection Sort

Authors: Jyoti Dua

Abstract:

An algorithm is a well-defined procedure that takes some values as input, processes them and produces the desired output. Sorting forms the basis of many other algorithms such as searching, pattern matching and digital filtering, and finds applications in database systems, data statistics and processing, data communications and pattern matching. This paper introduces the “Enhanced Bidirectional Selection” sort, which is bidirectional and stable. It is said to be bidirectional because each pass selects two values, the smallest from the front and the largest from the rear, and assigns them to their appropriate locations, thus reducing the number of passes to half the total number of elements as compared with selection sort.
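
A plain double-ended selection sort conveys the idea of placing the minimum at the front and the maximum at the rear in the same pass. The sketch below is a generic illustration of that strategy, not the paper's exact algorithm (in particular, this simple version is not stable).

```python
def bidirectional_selection_sort(a):
    """Each pass places the smallest value at the left end and the largest at the right end."""
    left, right = 0, len(a) - 1
    while left < right:
        lo, hi = left, left
        for i in range(left, right + 1):
            if a[i] < a[lo]:
                lo = i
            if a[i] > a[hi]:
                hi = i
        a[left], a[lo] = a[lo], a[left]
        # if the largest element was sitting at `left`, it has just moved to `lo`
        if hi == left:
            hi = lo
        a[right], a[hi] = a[hi], a[right]
        left += 1
        right -= 1
    return a

print(bidirectional_selection_sort([5, 1, 4, 2, 8, 0, 2]))   # [0, 1, 2, 2, 4, 5, 8]
```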

Keywords: Bubble sort, cocktail sort, selection sort, heap sort.

1990 A Model for Test Case Selection in the Software-Development Life Cycle

Authors: Adtha Lawanna

Abstract:

Software maintenance is one of the essential processes of the software-development life cycle. Its main concerns are the correction of errors, the revision of code, the prevention of future errors, and improvements in performance and capacity. Once such adjustments have been made, the software has to be retested to increase the level of assurance that it still behaves according to the requirements. Accordingly, test cases must be selected for exercising the revised modules and the whole software. This problem is commonly addressed by regression test selection, such as retest-all selection, random/ad-hoc selection and safe regression test selection. In particular, the traditional techniques rely on a mapping between the test cases in a test suite and the lines of code they execute. However, the lines of code are not the only requirement that can affect the size of the test suite; the number of functions and the number of faulty versions matter as well. Therefore, a model for test case selection is developed to cover those three requirements through an integral technique that produces smaller test suites than the traditional regression selection techniques.
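
The coverage mapping that the traditional techniques rely on can be pictured with a small greedy sketch: each test case maps to the lines it executes, and a covering subset is kept for the changed lines. This illustrates only the baseline idea the abstract contrasts against, not the proposed model; the test names and line numbers are made up.

```python
def select_tests(coverage, changed_lines):
    """coverage: test name -> set of covered lines; returns a small subset covering the changed lines."""
    remaining = set(changed_lines)
    selected = []
    while remaining:
        # pick the test covering the most still-uncovered changed lines
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        if not coverage[best] & remaining:
            break                        # some changed lines are not covered by any test
        selected.append(best)
        remaining -= coverage[best]
    return selected

coverage = {"t1": {1, 2, 3}, "t2": {3, 4}, "t3": {5, 6}, "t4": {2, 6}}
print(select_tests(coverage, changed_lines={2, 4, 6}))   # e.g. ['t4', 't2']
```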

Keywords: Software maintenance, regression test selection, test case.

1988 Towards a Systematic, Cost-Effective Approach for ERP Selection

Authors: Hassan Haghighi, Omid Mafi

Abstract:

Existing experience indicates that one of the most prominent reasons some ERP implementations fail is the selection of an improper ERP package. One important factor leading to inappropriate ERP selections is ignoring the preliminary activities that should be carried out before ERP packages are evaluated. Another factor is that organizations usually employ selection processes so prolonged and costly that sometimes the process is never finalized, or the evaluation team performs many key final activities in an incomplete or inaccurate way due to exhaustion, lack of interest or out-of-date data. In this paper, a systematic approach for choosing an ERP package is introduced that recommends certain activities to be done before and after the main selection phase. The proposed approach also utilizes ideas that accelerate the selection process while reducing the probability of an erroneous final selection.

Keywords: enterprise resource planning, evaluation and selection of ERP packages, organizational readiness for employing ERP, evaluation lists.

1987 Comparison of Different Solvents and Extraction Methods for Isolation of Phenolic Compounds from Horseradish Roots (Armoracia rusticana)

Authors: Lolita Tomsone, Zanda Kruma, Ruta Galoburda

Abstract:

Horseradish (Armoracia rusticana) is a perennial herb belonging to the Brassicaceae family and contains biologically active substances. The aim of the current research was to determine the best method for extracting phenolic compounds with high antiradical activity from horseradish roots. Three genotypes (No. 105, No. 106 and the variety ‘Turku’) of horseradish roots were extracted with eight different solvents: n-hexane, ethyl acetate, diethyl ether, 2-propanol, acetone, ethanol (95%), ethanol / water / acetic acid (80/20/1 v/v/v) and ethanol / water (80/20 by volume), using two extraction methods (conventional and Soxhlet). Ethanol and ethanol / water solutions can be chosen as the best solvents. Although the total phenolic content (TPC) was higher in the Soxhlet extracts, the scavenging activity of DPPH˙ radicals did not increase. It can be concluded that the Soxhlet extraction method extracts more compounds that are not effective antioxidants.

Keywords: DPPH˙, extraction, solvent, Soxhlet, TPC

1986 Personal Authentication Using FDOST in Finger Knuckle-Print Biometrics

Authors: N. B. Mahesh Kumar, K. Premalatha

Abstract:

The inherent skin patterns created at the joints on the finger exterior are referred to as the finger knuckle-print. It can be exploited to identify a person uniquely because the finger knuckle-print is rich in texture. In the biometric system, the region of interest is used by the feature extraction algorithm. In this paper, local and global features are extracted separately. The Fast Discrete Orthonormal Stockwell Transform is exploited to extract the local features, while the global feature is obtained by extending the size of the Fast Discrete Orthonormal Stockwell Transform to infinity. The two features are fused to increase recognition accuracy: a matching distance is calculated for each feature individually, and the two distances are then merged to obtain the final matching distance. The proposed scheme gives better performance in terms of equal error rate and correct recognition rate.
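
The score-fusion step can be pictured as below: two matching distances, one for the local feature and one for the global feature, are merged into a single distance. The Hamming distance and the fusion weights are illustrative assumptions only, not the paper's exact matching rule.

```python
import numpy as np

def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two binary feature codes."""
    return np.count_nonzero(code_a != code_b) / code_a.size

def fused_distance(local_a, local_b, global_a, global_b, w_local=0.6, w_global=0.4):
    """Merge the local and global matching distances into one final distance."""
    return (w_local * hamming_distance(local_a, local_b)
            + w_global * hamming_distance(global_a, global_b))

rng = np.random.default_rng(0)
enrolled_local = rng.integers(0, 2, 256)
enrolled_global = rng.integers(0, 2, 256)
probe_local = enrolled_local ^ (rng.random(256) < 0.05)     # genuine probe: few bit flips
probe_global = enrolled_global ^ (rng.random(256) < 0.05)
print(fused_distance(enrolled_local, probe_local, enrolled_global, probe_global))
```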

Keywords: Hamming distance, Instantaneous phase, Region of Interest, Recognition accuracy.

1985 Development of an Ensemble Classification Model Based on Hybrid Filter-Wrapper Feature Selection for Email Phishing Detection

Authors: R. B. Ibrahim, M. S. Argungu, I. M. Mungadi

Abstract:

In the present time, the Internet has become an indispensable part of human life. It has provided diverse opportunities that make life easier through the adoption of various channels, among them email, internet banking, video conferencing, and the like. Email is one of the easiest means of communication and is hugely accepted among individuals and organizations globally. Over the decades, however, the security integrity of this platform has been challenged by malicious activities such as phishing. Email phishing is designed by phishers to fool the recipient into handing over sensitive personal information such as passwords, credit card numbers, account credentials, social security numbers, etc. This activity has caused a great deal of financial damage to email users globally, resulting in bankruptcy, sudden death of victims, and other health-related illnesses. Although many methods have been proposed to detect email phishing, in this research the results of multiple machine-learning methods for predicting email phishing are compared, using hybrid filter-wrapper feature selection. All three models performed well, but one outperformed the others. The dataset used for these models is obtained from the Kaggle online data repository, and three classifiers (decision tree, Naïve Bayes, and logistic regression) are each ensembled using bagging. Results from the study show that the Decision Tree (CART) bagging ensemble recorded the highest accuracy of 98.13% using PEF (Phishing Essential Features). This result further demonstrates the dependability of the proposed model.
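
A bagged decision-tree classifier combined with a filter-type feature selection step might be sketched as below. This is a schematic stand-in under stated assumptions: the data are synthetic rather than the Kaggle phishing dataset, only the filter half of the hybrid filter-wrapper selection is shown, and BaggingClassifier's default base learner (a decision tree) is used.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=2000, n_features=30, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(
    SelectKBest(f_classif, k=15),                         # filter step (wrapper step omitted here)
    BaggingClassifier(n_estimators=50, random_state=0),   # default base learner is a decision tree
)
model.fit(X_tr, y_tr)
print(f"accuracy: {model.score(X_te, y_te):.3f}")
```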

Keywords: Ensemble, hybrid, filter-wrapper, phishing.

1984 Relay Node Selection Algorithm for Cooperative Communications in Wireless Networks

Authors: Sunmyeng Kim

Abstract:

IEEE 802.11a/b/g standards support multiple transmission rates. Even though the use of multiple transmission rates increases the WLAN capacity, this feature leads to the performance anomaly problem. Cooperative communication was introduced to relieve the performance anomaly problem: data packets are delivered to the destination much faster through a relay node at a high rate than through direct transmission to the destination at a low rate. In the legacy cooperative protocols, a source node chooses a relay node based only on the transmission rate. Therefore, they are not very feasible in multi-flow environments since they do not consider the effect of other flows. To alleviate this effect, we propose a new relay node selection algorithm based on both the transmission rate and the channel contention level. Performance evaluation is conducted using simulation and shows that the proposed protocol significantly outperforms the previous protocol in terms of throughput and delay.
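
The selection rule could be illustrated as follows: score each candidate relay by its two-hop transmission rate discounted by its channel contention level, and fall back to direct transmission if no relay scores better. The rates, contention levels, and the particular scoring formula are hypothetical, not taken from the paper.

```python
def two_hop_rate(r_src_relay, r_relay_dst):
    """Effective rate of source -> relay -> destination (two sequential transmissions)."""
    return 1.0 / (1.0 / r_src_relay + 1.0 / r_relay_dst)

def select_relay(candidates, direct_rate):
    """Return the relay with the best contention-discounted score, or None for direct transmission."""
    best_id, best_score = None, direct_rate
    for node_id, (r1, r2, contention) in candidates.items():
        score = two_hop_rate(r1, r2) * (1.0 - contention)   # penalise busy relays
        if score > best_score:
            best_id, best_score = node_id, score
    return best_id, best_score

candidates = {"A": (54, 48, 0.6), "B": (36, 36, 0.1), "C": (54, 54, 0.3)}
print(select_relay(candidates, direct_rate=11))
```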

Keywords: Cooperative communications, MAC protocol, Relay node, WLAN.

1983 Edge-end Pixel Extraction for Edge-based Image Segmentation

Authors: Mahinda P. Pathegama, Özdemir Göl

Abstract:

Extraction of edge-end pixels is an important step in the edge linking process used to achieve edge-based image segmentation. This paper presents an algorithm that extracts edge-end pixels together with their directional sensitivities as an augmentation to the currently available mathematical models. The algorithm is implemented in the Java environment because of Java's inherent compatibility with web interfaces, since its main use is envisaged to be remote image analysis on a virtual instrumentation platform.
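
One common way to define edge-end pixels, used here purely as an illustration and not necessarily the paper's definition, is as edge pixels with exactly one 8-connected edge neighbour; the sketch below finds them on a toy edge map.

```python
import numpy as np

def edge_end_pixels(edge_map):
    """Edge-end pixels taken here as edge pixels with exactly one 8-connected edge neighbour."""
    e = (edge_map > 0).astype(np.uint8)
    padded = np.pad(e, 1)
    # count the 8 neighbours of every pixel via shifted views
    neighbours = sum(
        padded[1 + dr:1 + dr + e.shape[0], 1 + dc:1 + dc + e.shape[1]]
        for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)
    )
    return np.argwhere((e == 1) & (neighbours == 1))

edges = np.zeros((5, 5), dtype=np.uint8)
edges[2, 1:4] = 1                     # a short horizontal edge segment
print(edge_end_pixels(edges))         # its two end pixels: [2, 1] and [2, 3]
```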

Keywords: edge-end pixels, image processing, image segmentation, pixel extraction

1982 Comparison of MFCC and Cepstral Coefficients as a Feature Set for PCG Biometric Systems

Authors: Justin Leo Cheang Loong, Khazaimatol S Subari, Muhammad Kamil Abdullah, Nurul Nadia Ahmad, Rosli Besar

Abstract:

Heart sound is an acoustic signal, and many techniques used nowadays for human recognition tasks borrow from speech recognition. One popular choice for feature extraction from acoustic signals is the Mel Frequency Cepstral Coefficients (MFCC), which map the signal onto a non-linear Mel scale that mimics human hearing. However, the Mel scale is almost linear in the frequency region of heart sounds and thus should produce results similar to the standard cepstral coefficients (CC). In this paper, MFCC is investigated to see whether it produces superior results for a PCG-based human identification system compared to CC. Results show that the MFCC system is still superior to CC despite the near-linear filter-banks in the lower frequency range, giving a correct recognition rate of up to 95% for MFCC and 90% for CC. Further experiments show that the high recognition rate is due to the implementation of filter-banks and not to Mel scaling.
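
The two feature sets can be computed roughly as below: MFCCs via librosa and plain cepstral coefficients via a log-spectrum followed by a DCT. The synthetic tones merely stand in for a heart sound recording, and the sampling rate, frame length, and number of coefficients are arbitrary assumptions.

```python
import numpy as np
import librosa
from scipy.fft import dct

sr = 4000                                        # heart sounds live in low frequencies
t = np.linspace(0, 1.0, sr, endpoint=False)
y = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)   # toy tones, not real PCG

mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)           # Mel-scaled cepstral coefficients

frame = y[:1024] * np.hanning(1024)
log_spectrum = np.log(np.abs(np.fft.rfft(frame)) + 1e-10)
cc = dct(log_spectrum, norm="ortho")[:13]                    # plain cepstral coefficients

print(mfcc.shape, cc.shape)
```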

Keywords: Biometric, Phonocardiogram, Cepstral Coefficients, Mel Frequency

1981 Dataset Analysis Using Membership-Deviation Graph

Authors: Itgel Bayarsaikhan, Jimin Lee, Sejong Oh

Abstract:

Classification is one of the primary themes in computational biology. The accuracy of classification strongly depends on the quality of the dataset, and we need a method to evaluate this quality. In this paper, we propose a new graphical analysis method using the 'Membership-Deviation Graph (MDG)' for analyzing the quality of a dataset. The MDG represents the degree of membership and the deviations for instances of a class in the dataset. The result of the MDG analysis is used for understanding specific features and for selecting the best feature for classification.

Keywords: feature, classification, machine learning algorithm.

1980 Measuring Text-Based Semantics Relatedness Using WordNet

Authors: Madiha Khan, Sidrah Ramzan, Seemab Khan, Shahzad Hassan, Kamran Saeed

Abstract:

Measuring semantic similarity between texts means calculating their semantic relatedness using various techniques. Our web application (Measuring Relatedness of Concepts, MRC) allows the user to input two text corpora and obtain a semantic similarity percentage between them using WordNet. The application goes through five stages to compute semantic relatedness: Preprocessing (extracts keywords from the content), Feature Extraction (classification of words into parts of speech), Synonym Extraction (retrieves synonyms for each keyword), Measuring Similarity (similarity is measured using keywords and synonyms) and Visualization (graphical representation of the similarity measure). Hence the user can also measure similarity on the basis of features. The end result is a percentage score and the word(s) that form the basis of the similarity between the two texts, obtained with different tools on the same platform. In future work we aim for a Web-as-a-live-corpus application that provides a simpler and more user-friendly tool to compare documents and extract useful information.
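
The WordNet step of measuring similarity between keyword sets could look roughly like the sketch below, which takes the best path similarity over all synset pairs for each keyword pair. It assumes NLTK with the WordNet corpus downloaded (nltk.download('wordnet')) and is not the MRC application's actual scoring.

```python
from nltk.corpus import wordnet as wn

def word_relatedness(w1, w2):
    """Best WordNet path similarity over all synset pairs of the two words."""
    scores = [
        s1.path_similarity(s2)
        for s1 in wn.synsets(w1)
        for s2 in wn.synsets(w2)
        if s1.path_similarity(s2) is not None
    ]
    return max(scores, default=0.0)

def text_relatedness(keywords_a, keywords_b):
    """Average best-match relatedness of keywords in A against keywords in B."""
    best = [max(word_relatedness(a, b) for b in keywords_b) for a in keywords_a]
    return sum(best) / len(best)

print(text_relatedness(["car", "road"], ["automobile", "highway"]))
```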

Keywords: GraphViz representation, semantic relatedness, similarity measurement, WordNet similarity.

1979 Study on Extraction of Ceric Oxide from Monazite Concentrate

Authors: Lwin Thuzar Shwe, Nwe Nwe Soe, Kay Thi Lwin

Abstract:

Cerium oxide is to be recovered from monazite, which contains about 27.35% CeO2. The principal objective of this study is to extract cerium oxide from monazite from the Moemeik Myitsone area. The treatment of monazite in this study involves three main steps: extraction of cerium hydroxide from monazite, solvent extraction of cerium hydroxide, and precipitation with oxalic acid followed by calcination of the cerium oxalate.

Keywords: Calcination, Digestion, Precipitation, Solvent Extraction

1978 Selection Initial modes for Belief K-modes Method

Authors: Sarra Ben Hariz, Zied Elouedi, Khaled Mellouli

Abstract:

The belief K-modes method (BKM) is a new clustering technique that handles uncertainty in the attribute values of objects in both the cluster construction task and the classification task. Like the standard version of this method, the BKM results depend on the chosen initial modes. So, a method for selecting the initial modes is developed in this paper, aiming at improving the performance of the BKM approach. Experiments with several sets of real data show that, with the developed initial-mode selection method, the clustering algorithm produces more accurate results.

Keywords: Clustering, Uncertainty, Belief function theory, Belief K-modes Method, Initial modes selection.

1977 Optimal Classifying and Extracting Fuzzy Relationship from Query Using Text Mining Techniques

Authors: Faisal Alshuwaier, Ali Areshey

Abstract:

Text mining techniques are generally applied to classify text and to find fuzzy relations and structures in data sets. This research provides a range of text mining capabilities. One common application is text classification and event extraction, which encompasses deducing specific knowledge concerning incidents referred to in texts. The main contribution of this paper is the clarification of a concept graph generation mechanism, which is based on text classification and optimal fuzzy relationship extraction. Furthermore, the work presented in this paper explains the application of fuzzy relationship extraction and the branch and bound (BB) method to simplify the texts.

Keywords: Extraction, Max-Prod, Fuzzy Relations, Text Mining, Memberships, Classification.

1976 An Adaptive Dimensionality Reduction Approach for Hyperspectral Imagery Semantic Interpretation

Authors: Akrem Sellami, Imed Riadh Farah, Basel Solaiman

Abstract:

With the development of HyperSpectral Imagery (HSI) technology, the spectral resolution of HSI has become denser, which results in a large number of spectral bands, high correlation between neighboring bands, and high data redundancy. Consequently, semantic interpretation is a challenging task for HSI analysis due to the high dimensionality and the high correlation of the different spectral bands. This work presents a dimensionality reduction approach that overcomes these issues and improves the semantic interpretation of HSI. In order to preserve the spatial information, the Tensor Locality Preserving Projection (TLPP) is applied to transform the original HSI. In the second step, knowledge is extracted based on the adjacency graph to describe the different pixels. Based on the transformation matrix obtained with TLPP, a weighted matrix is constructed to rank the different spectral bands according to their contribution scores, and the relevant bands are adaptively selected from the weighted matrix. The performance of the presented approach has been validated through several experiments, and the obtained results demonstrate its efficiency compared with various existing dimensionality reduction techniques. According to the experimental results, we can conclude that this approach can adaptively select the relevant spectral bands, improving the semantic interpretation of HSI.
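
The band-ranking step can be pictured as follows: given a projection matrix with one row per spectral band, each band is scored by its total absolute contribution and the top-scoring bands are kept. The matrix here is random rather than a TLPP transformation, and the scoring rule and k are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bands, n_components = 100, 10
W = rng.normal(size=(n_bands, n_components))   # stand-in for the TLPP transformation matrix

scores = np.abs(W).sum(axis=1)                 # contribution score per spectral band
top_k = 15
selected_bands = np.argsort(scores)[::-1][:top_k]
print(sorted(selected_bands.tolist()))         # indices of the retained bands
```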

Keywords: Band selection, dimensionality reduction, feature extraction, hyperspectral imagery, semantic interpretation.

1975 TOPSIS Method for Supplier Selection Problem

Authors: Omid Jadidi, Fatemeh Firouzi, Enzo Bagliery

Abstract:

Supplier selection in real situations is affected by several qualitative and quantitative factors and is one of the most important activities of the purchasing department. Since decision makers (DMs) do not have precise, exact and complete information when evaluating suppliers against the criteria or factors, supplier selection becomes more difficult. In this case, grey theory helps us deal with this problem of uncertainty. Here, we apply the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) to evaluate and select the best supplier using interval fuzzy numbers. In this article, we compare TOPSIS with some other approaches and then demonstrate that the concept of TOPSIS is very important for ranking and selecting the right supplier.
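
For reference, the crisp (non-fuzzy) TOPSIS ranking can be written in a few lines; the decision matrix, weights and benefit/cost split below are hypothetical, and the interval-fuzzy extension used in the paper is not shown.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS: rows are suppliers, columns are criteria; benefit[j] marks larger-is-better criteria."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))          # vector normalisation
    v = norm * weights                                # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.sqrt(((v - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((v - anti) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)               # closeness to the ideal solution

scores = topsis(
    matrix=[[7, 9, 9, 8], [8, 7, 8, 7], [9, 6, 8, 9]],   # hypothetical supplier ratings
    weights=np.array([0.3, 0.2, 0.3, 0.2]),
    benefit=np.array([True, True, True, False]),          # last criterion is a cost (e.g. price)
)
print(scores.argsort()[::-1])    # suppliers ranked best to worst
```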

Keywords: TOPSIS, fuzzy number, MADM, Supplier selection

1974 Hospital Facility Location Selection Using Permanent Analytics Process

Authors: C. Ardil

Abstract:

In this paper, a new MCDMA approach, the permanent analytics process, is proposed to assess the immovable valuation criteria and their significance in the placement of the healthcare facility. Five decision factors are considered for the value and selection of immovables. In multiple factor selection problems, the priority vector of the criteria used to compare several immovables is first determined using the permanent analytics method, a mathematical model for the multiple criteria decision-making process. Then, to demonstrate the viability and efficacy of the suggested approach, twenty potential candidate locations were evaluated using the hospital site selection problem's decision criteria. The ranking accuracy of the estimation was evaluated using composite programming, which took into account both the permanent analytics process and the weighted multiplicative model.

Keywords: Hospital Facility Location Selection, Permanent Analytics Process, Multiple Criteria Decision Making (MCDM)

1973 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach

Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar

Abstract:

Uncontrolled growth of abnormal cells in the lung, in the form of a tumor, can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years provided there is early diagnosis, detection and prediction, which reduces the need for treatment options carrying the risk of invasive surgery and increases the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET) and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) is used for image enhancement, giving the best results. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region properties (area, perimeter, diameter, centroid and eccentricity) are measured for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used to determine whether the patient's condition is normal or abnormal, while an Artificial Neural Network (ANN) is used to identify the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technique shows encouraging results for real-time information and online detection in future research.
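
One small piece of such a pipeline, wavelet-based features followed by a KNN classifier, might be sketched as below. The images are synthetic noise rather than CT scans, the feature vector is just sub-band means and standard deviations, and none of the classes or parameters correspond to the paper's actual data.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

def dwt_features(image):
    """Single-level 2-D DWT; mean and std of each sub-band as a feature vector."""
    cA, (cH, cV, cD) = pywt.dwt2(image, "haar")
    return np.array([band.mean() for band in (cA, cH, cV, cD)] +
                    [band.std() for band in (cA, cH, cV, cD)])

rng = np.random.default_rng(0)
normal = [rng.normal(0.2, 0.05, (64, 64)) for _ in range(20)]     # synthetic "normal" images
abnormal = [rng.normal(0.6, 0.20, (64, 64)) for _ in range(20)]   # synthetic "abnormal" images
X = np.array([dwt_features(im) for im in normal + abnormal])
y = np.array([0] * 20 + [1] * 20)

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict([dwt_features(rng.normal(0.6, 0.20, (64, 64)))]))   # expect class 1
```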

Keywords: ANN, DWT, GLCM, KNN, ROI, artificial neural networks, discrete wavelet transform, gray-level co-occurrence matrix, k-nearest neighbor, region of interest.

1972 Image Analysis for Obturator Foramen Based on Marker-Controlled Watershed Segmentation and Zernike Moments

Authors: Seda Sahin, Emin Akata

Abstract:

Obturator Foramen is a specific structure in Pelvic bone images and recognition of it is a new concept in medical image processing. Moreover, segmentation of bone structures such as Obturator Foramen plays an essential role for clinical research in orthopedics. In this paper, we present a novel method to analyze the similarity between the substructures of the imaged region and a hand drawn template as a preprocessing step for computation of Pelvic bone rotation on hip radiographs. This method consists of integrated usage of Marker-controlled Watershed segmentation and Zernike moment feature descriptor and it is used to detect Obturator Foramen accurately. Marker-controlled Watershed segmentation is applied to separate Obturator Foramen from the background effectively. Then, Zernike moment feature descriptor is used to provide matching between binary template image and the segmented binary image for final extraction of Obturator Foramens. Finally, Pelvic bone rotation rate calculation for each hip radiograph is performed automatically to select and eliminate hip radiographs for further studies which depend on Pelvic bone angle measurements. The proposed method is tested on randomly selected 100 hip radiographs. The experimental results demonstrated that the proposed method is able to segment Obturator Foramen with 96% accuracy.

Keywords: Medical image analysis, marker-controlled watershed segmentation, segmentation of bone structures on hip radiographs, pelvic bone rotation rate, Zernike moment feature descriptor.

1971 Applying Fuzzy Decision Making Approach to IT Outsourcing Supplier Selection

Authors: Gülcin Büyüközkan, Mehmet Sakir Ersoy

Abstract:

The decision to outsource information technology (IT) requires close attention to the evaluation of the supplier selection process because the selection decision involves multiple conflicting criteria and is replete with complex decision-making problems. Selecting the most appropriate suppliers is considered an important strategic decision that may impact the performance of outsourcing engagements. The objective of this paper is to aid decision makers in evaluating and assessing possible IT outsourcing suppliers. An axiomatic-design-based fuzzy group decision-making approach is adopted to evaluate supplier alternatives. Finally, a case study is given to demonstrate the potential of the methodology.

Keywords: IT outsourcing, Supplier selection, Multi-criteria decision making, Axiomatic design, Fuzzy logic

1970 Using Data Fusion for Biometric Verification

Authors: Richard A. Wasniowski

Abstract:

A wide spectrum of systems requires reliable personal recognition schemes to either confirm or determine the identity of an individual. This paper considers multimodal biometric systems and their applicability to access control, authentication and security applications. Strategies for feature extraction and sensor fusion are considered and contrasted. Issues related to performance assessment, deployment and standardization are discussed. Finally, future directions of biometric systems development are discussed.

Keywords: Multimodal, biometric, recognition, fusion.

1969 Efficient and Effective Gabor Feature Representation for Face Detection

Authors: Yasuomi D. Sato, Yasutaka Kuriya

Abstract:

We propose an improved version of elastic graph matching (EGM) as a face detector, called the multi-scale EGM (MS-EGM). In this improvement, a Gabor wavelet-based pyramid reduces the computational complexity of the feature representation often used in conventional EGM, while preserving a critical amount of information about the image. The MS-EGM gives higher detection performance than the Viola-Jones object detection algorithm with its AdaBoost Haar-like feature cascade. We also show rapid detection speeds for the MS-EGM, comparable to the Viola-Jones method, and find further benefits of the MS-EGM in terms of topological feature representation of a face.

Keywords: Face detection, Gabor wavelet based pyramid, elastic graph matching, topological preservation, redundancy of computational complexity.
