Search results for: Fault Detection and Diagnosis (FDD)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2032

442 Polyethylenimine Coated Carbon Nanotube for Detecting Rancidity in Frying Oil

Authors: Vincent Lau Chun Fai, Yang Doo Lee, Kyongsoo Lee, Keun-Soo Lee, Shin-Kyung, Byeong-Kwon Ju

Abstract:

Chemical detection is still a continuous challenge when it comes to designing single-walled carbon nanotube (SWCNT) sensors with high selectivity, especially in complex chemical environments. A perfect example of such an environment would be in thermally oxidized soybean oil. At elevated temperatures, oil oxidizes through a series of chemical reactions which results in the formation of monoacylglycerols, diacylglycerols, oxidized triacylglycerols, dimers, trimers, polymers, free fatty acids, ketones, aldehydes, alcohols, esters, and other minor products. In order to detect the rancidity of oxidized soybean oil, carbon nanotube chemiresistor sensors have been coated with polyethylenimine (PEI) to enhance the sensitivity and selectivity. PEI functionalized SWCNTs are known to have a high selectivity towards strong electron withdrawing molecules. The sensors were very responsive to different oil oxidation levels and furthermore, displayed a rapid recovery in ambient air without the need of heating or UV exposure.

Keywords: Carbon nanotubes, polyethylenimine, sensor, oxidized oil

441 An Event Based Approach to Extract the Run Time Execution Path of BPEL Process for Monitoring QoS in the Cloud

Authors: Rima Grati, Khouloud Boukadi, Hanene Ben-Abdallah

Abstract:

Due to the dynamic nature of the Cloud, continuous monitoring of QoS requirements is necessary to manage the Cloud computing environment. The process of QoS monitoring and SLA violation detection consists of: collecting low and high level information pertinent to the service, analyzing the collected information, and taking corrective actions when SLA violations are detected. In this paper, we detail the architecture and the implementation of the first step of this process. More specifically, we propose an event-based approach to obtain run time information of services developed as BPEL processes. By catching particular events (i.e., the low level information), our approach recognizes the run-time execution path of a monitored service and uses the BPEL execution patterns to compute QoS of the composite service (i.e., the high level information).

Keywords: Monitoring of Web service composition, Cloud environment, Run-time extraction of execution path of BPEL.

440 Expert Witness Testimony in the Battered Woman Syndrome

Authors: Ana Pauna

Abstract:

Expert witness testimony (EWT) is information given by an expert specialized in the field (here, battered woman syndrome, BWS) to the jury in order to help the court better understand the case. EWT does not always work in favor of battered women. Two main decision-making models are discussed in the paper: the mathematical model and the explanation model. In the first model, the jurors calculate "the importance and strength of each piece of evidence", whereas in the second model they try to integrate the EWT with the evidence and create a coherent story that would describe the crime. The jury often misunderstands and misjudges battered women for their action (or, in this case, inaction), assuming that these women are masochists who accept being mistreated, on the reasoning that if a man abuses a woman constantly, she could and should divorce him or simply leave at any time. Research in the domain has found that expert witness testimony indeed has a powerful influence on jurors' decisions, so its quality needs to be further explored. One of the important factors that needs further study is a bias called the dispositionist worldview (a belief that what happens to people is of their own doing). This kind of attributional bias is a tendency to think that a person's behavior is due to his or her disposition, even when the behavior is clearly attributable to the situation. Hypothesis: if a juror has a dispositionist worldview, then he or she will blame the rape victim for triggering the assault; the juror would therefore commit the fundamental attribution error and believe that the victim's disposition, and not the situation she was in, caused the rape. Methods: the subjects in the study were 500 randomly sampled undergraduate students from McGill, Concordia, Université de Montréal and UQAM. Dispositionist worldview was scored on the Dispositionist Worldview Questionnaire. After reading the rape scenarios, each student was asked to play the role of a juror and answer a questionnaire consisting of 7 questions about the responsibility, causality and fault of the victim. Results: the results confirm the hypothesis; jurors with a dispositionist worldview blamed the rape victim for triggering the assault, thereby committing the fundamental attribution error by believing that the victim's disposition, and not the constraints or opportunities of the situation, caused the rape.

Keywords: bias, expert witness testimony, attribution error, jury, rape myth

439 Statistical Measures and Optimization Algorithms for Gene Selection in Lung and Ovarian Tumor

Authors: C. Gunavathi, K. Premalatha

Abstract:

Microarray technology is widely used in the study of disease diagnosis using gene expression levels. The main shortcoming of gene expression data is that it includes thousands of genes and a small number of samples. Abundant methods and techniques have been proposed for tumor classification using microarray gene expression data. Feature or gene selection methods can be used to mine the genes that are directly involved in the classification and to eliminate irrelevant genes. In this paper, statistical measures such as T-Statistics, Signal-to-Noise Ratio (SNR) and F-Statistics are used to rank the genes, and the ranked genes are used for further classification. The Particle Swarm Optimization (PSO) algorithm and the Shuffled Frog Leaping (SFL) algorithm are used to find the significant genes from the top-m ranked genes. The Naïve Bayes Classifier (NBC) is used to classify the samples based on the significant genes. The proposed work is applied on lung and ovarian tumor datasets. The experimental results show that the proposed method achieves 100% accuracy in all three datasets, and the results are compared with previous works.

Keywords: Microarray, T-Statistics, Signal-to-Noise Ratio, F-Statistics, Particle Swarm Optimization, Shuffled Frog Leaping, Naïve Bayes Classifier.
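
As an illustration of the rank-then-classify pipeline described in this abstract, the following is a minimal sketch (not the authors' code) of Signal-to-Noise Ratio gene ranking followed by a Naïve Bayes classifier on a synthetic two-class expression matrix; the dataset size, gene count and top_m value are placeholder assumptions.

```python
# Minimal sketch of SNR-based gene ranking followed by Naive Bayes classification.
# Synthetic data only; the real study uses lung and ovarian microarray datasets.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2000))          # 40 samples x 2000 genes (placeholder)
y = np.array([0] * 20 + [1] * 20)        # two tumor classes

# Signal-to-Noise Ratio per gene: |mu0 - mu1| / (sigma0 + sigma1)
mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
sd0, sd1 = X[y == 0].std(axis=0), X[y == 1].std(axis=0)
snr = np.abs(mu0 - mu1) / (sd0 + sd1 + 1e-12)

top_m = 50                                # keep the top-m ranked genes
selected = np.argsort(snr)[::-1][:top_m]

# Naive Bayes classifier on the selected genes, evaluated by cross-validation
scores = cross_val_score(GaussianNB(), X[:, selected], y, cv=5)
print("mean CV accuracy on selected genes:", scores.mean())
```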

438 The Application of Fuzzy Set Theory to Mobile Internet Advertisement Fraud Detection

Authors: Jinming Ma, Tianbing Xia, Janusz R. Getta

Abstract:

This paper presents the application of fuzzy set theory to the implementation of mobile advertisement anti-fraud systems. Mobile anti-fraud aims to identify mobile advertisement fraudsters. One of the main problems in mobile anti-fraud is the lack of evidence to prove that a user is a fraudster. In this paper, we implement an application using fuzzy set theory to demonstrate how to detect cheaters. The advantage of our method is that it avoids the difficulty of detecting fraudsters from small data samples. We achieve this by giving each user a suspicion degree showing how likely the user is to be cheating, and by deciding whether a group of users (such as all users of a certain app) are collectively fraudulent according to the average suspicion degree. This makes the process more accurate, as the data of a single user is too small to be predictive.

Keywords: Mobile internet, advertisement, anti-fraud, fuzzy set theory.
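
A minimal sketch of the group-level decision idea described in this abstract: each user gets a fuzzy suspicion degree in [0, 1] built from simple membership functions, and a group (for example, all users of one app) is flagged when the average degree exceeds a threshold. The membership functions, feature names and threshold below are illustrative assumptions, not the paper's actual rules.

```python
# Sketch: fuzzy suspicion degree per user, aggregated to a group-level decision.
# Membership functions and the 0.6 threshold are illustrative assumptions.

def clicks_membership(clicks_per_hour: float) -> float:
    """Degree to which a click rate looks abnormal (ramp from 50 to 200 clicks/h)."""
    return min(1.0, max(0.0, (clicks_per_hour - 50.0) / 150.0))

def conversion_membership(conversion_rate: float) -> float:
    """Degree to which a conversion rate looks abnormally low (ramp below 2%)."""
    return min(1.0, max(0.0, (0.02 - conversion_rate) / 0.02))

def suspicion_degree(clicks_per_hour: float, conversion_rate: float) -> float:
    """Combine memberships with a fuzzy OR (max) into one suspicion degree."""
    return max(clicks_membership(clicks_per_hour), conversion_membership(conversion_rate))

def group_is_fraudulent(users, threshold: float = 0.6) -> bool:
    """Flag a group (e.g. all users of one app) by its average suspicion degree."""
    degrees = [suspicion_degree(c, r) for c, r in users]
    return sum(degrees) / len(degrees) >= threshold

app_users = [(180.0, 0.001), (90.0, 0.005), (220.0, 0.0)]   # (clicks/h, conversion rate)
print(group_is_fraudulent(app_users))
```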

437 PET/CT Patient Dosage Assay

Authors: Gulten Yilmaz, A. Beril Tugrul, Mustafa Demir, Dogan Yasar, Bayram Demir, Bulent Buyuk

Abstract:

Positron Emission Tomography (PET) is a radioisotope imaging technique that illustrates the organs and the metabolism of the human body. The technique is based on the simultaneous detection of the 511 keV annihilation photons produced when positrons, emitted by positron-emitting radioisotopes incorporated into biologically active molecules in the body, annihilate with electrons. This study was conducted on ten patients as a patient-related experimental study. Dose monitoring of the bladder, the organ that received the highest dose during PET applications, was carried out for 24 hours, and activity measurements of the patients' urine after injection were also part of the study. The MIRD method was used for the dose calculations based on the results of the experimental studies, and the experimental and theoretical results were assessed comparatively.

Keywords: PET/CT, TLD, MIRD, Dose measurement, Patient doses.

436 Multi-Layer Perceptron and Radial Basis Function Neural Network Models for Classification of Diabetic Retinopathy Disease Using Video-Oculography Signals

Authors: Ceren Kaya, Okan Erkaymaz, Orhan Ayar, Mahmut Özer

Abstract:

Diabetes Mellitus (diabetes) is a disease based on insulin hormone disorders and causes high blood glucose. Clinical findings indicate that diabetes can be diagnosed from electrophysiological signals obtained from the vital organs. Diabetic retinopathy is one of the most common eye diseases resulting from diabetes, and it is the leading cause of vision loss due to structural alteration of the retinal layer vessels. In this study, features of horizontal and vertical Video-Oculography (VOG) signals are used to classify non-proliferative and proliferative diabetic retinopathy. Twenty-five features are extracted by applying the discrete wavelet transform to VOG signals taken from 21 subjects. Two models, based on the multi-layer perceptron (MLP) and the radial basis function (RBF), are proposed for the diagnosis of diabetic retinopathy; the proposed models can also detect the level of the disease. We compare the classification performance of the proposed models, and our results show that the RBF model (100%) achieves better classification performance than the MLP model (94%).

Keywords: Diabetic retinopathy, discrete wavelet transform, multi-layer perceptron, radial basis function, video-oculography.
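
A minimal sketch of the feature pipeline described in this abstract: discrete wavelet transform coefficients of a synthetic VOG-like signal are summarized into statistical features and fed to an MLP classifier. The wavelet family, decomposition level and feature set are assumptions; the study's 25-feature set is not reproduced here.

```python
# Sketch: DWT-based features from a VOG-like signal, classified with an MLP.
# Wavelet, level, and feature choices are illustrative assumptions.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def dwt_features(signal: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """Mean, std and energy of each DWT sub-band as a simple feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([f(c) for c in coeffs for f in (np.mean, np.std, lambda x: np.sum(x**2))])

rng = np.random.default_rng(1)
signals = rng.normal(size=(20, 512))            # placeholder horizontal VOG traces
labels = np.array([0] * 10 + [1] * 10)          # non-proliferative vs proliferative

X = np.vstack([dwt_features(s) for s in signals])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```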

435 A Simple Affymetrix Ratio-transformation Method Yields Comparable Expression Level Quantifications with cDNA Data

Authors: Chintanu K. Sarmah, Sandhya Samarasinghe, Don Kulasiri, Daniel Catchpoole

Abstract:

Gene expression profiling is rapidly evolving into a powerful technique for investigating tumor malignancies. Researchers now have a wide choice of microarray-based platforms and methods for conducting large-scale gene expression profiling measurements. At the same time, investigations into cross-platform integration methods have started gaining momentum due to their potential to help address a range of broad biological issues in tumor diagnosis, prognosis, and therapy. However, comparing results from different platforms remains a challenging task, as various inherent technical differences exist between microarray platforms. In this paper, we describe a simple ratio-transformation method which can provide some common ground between the cDNA and Affymetrix platforms for cross-platform integration. The method is based on the characteristic data attributes of the Affymetrix and cDNA platforms. We considered seven childhood leukemia patients and their gene expression levels measured on both platforms. With a dataset of 822 differentially expressed genes from the two platforms, we applied a specific ratio treatment to the Affymetrix data, which subsequently showed an improved relationship with the cDNA data.

Keywords: Gene expression profiling, microarray, cDNA, Affymetrix, childhood leukaemia.

434 Identification of Phenolic Contents in Malaysian Variety of Pummelo (Citrus Grandis L. Osbeck) Fruit Juice Using HPLC-DAD

Authors: N. N. A. K. Shah, R. A. Rahman, R. Shamsuddin, N. M. Adzahan

Abstract:

Pummelo, known to be the largest of all citrus fruits, with an expected juice yield ratio of 2:1 (w/v), presents an attractive opportunity for Malaysia to expand pummelo's influence and marketability in the international juice market. The purpose of this study is to identify and quantify the phenolic compounds in two Malaysian varieties of pummelo fruit juice: Ledang (PO55) and Tambun (PO52). Identification of the polyphenol composition was done using High Performance Liquid Chromatography with Diode Array Detection (HPLC-DAD). The phenolic compounds found in both varieties were hydroxycinnamic acids and flavanones. This study showed that the Tambun variety has higher antioxidant and phenolic compound contents than the Ledang variety. However, consumers' taste preferences and the amount of enzyme needed to clarify the juice have to be considered for its marketability.

Keywords: Antioxidant, HPLC, phenolic contents, pummelo fruit juice

433 Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project

Authors: V. Balanica, W. I. D. Rae, M. Caramihai, S. Acho, C. P. Herbst

Abstract:

Mammographic image and patient data analysis, to facilitate modelling or computer aided diagnostic (CAD) software development, is best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced information extraction for research could be performed from a single dataset. One desired improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the possible future application of the data in CAD processing. Future developments envisaged include an advanced search function to select image files based on descriptor combinations, whose results can be further used for specific CAD processing and other research, and a user-friendly configuration utility for importing the required fields from the DICOM files.

Keywords: Database Integration, Mammogram Classification, Tumour Classification, Computer Aided Diagnosis.
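
A minimal sketch of the DICOM-header integration step described in this abstract, using pydicom to read a few standard header fields and store them in a SQLite table; the file path, chosen tags and schema are placeholder assumptions rather than the project's actual design.

```python
# Sketch: pull selected DICOM header fields into a small SQLite database.
# File paths, chosen tags and table schema are placeholders.
import sqlite3
import pydicom

def index_mammogram(db_path: str, dicom_path: str) -> None:
    ds = pydicom.dcmread(dicom_path, stop_before_pixels=True)   # header only
    row = (
        str(ds.get("PatientID", "")),
        str(ds.get("StudyDate", "")),
        str(ds.get("Modality", "")),
        dicom_path,
    )
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS images "
        "(patient_id TEXT, study_date TEXT, modality TEXT, file_path TEXT)"
    )
    con.execute("INSERT INTO images VALUES (?, ?, ?, ?)", row)
    con.commit()
    con.close()

# index_mammogram("mammo_research.db", "case001.dcm")
```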

432 Shot Transition Detection with Minimal Decoding of MPEG Video Streams

Authors: Mona A. Fouad, Fatma M. Bayoumi, Hoda M. Onsi, Mohamed G. Darwish

Abstract:

Digital libraries are becoming increasingly necessary in order to support users with powerful and easy-to-use tools for searching, browsing and retrieving media information. The starting point for these tasks is the segmentation of video content into shots. To segment MPEG video streams into shots, a fully automatic procedure to detect both abrupt and gradual transitions (dissolves and fade groups) with minimal decoding in real time is developed in this study. Each transition type is handled in two phases: analysis of macro-block types in B-frames, and on-demand analysis of intensity information. The experimental results show remarkable performance in detecting gradual transitions for some kinds of input data and comparable results for the rest of the examined video streams. Almost all abrupt transitions could be detected with very few false positive alarms.

Keywords: Adaptive threshold, abrupt transitions, gradual transitions, MPEG video streams.

431 Performance Analysis of Brain Tumor Detection Based On Image Fusion

Authors: S. Anbumozhi, P. S. Manoharan

Abstract:

Medical image fusion plays a vital role in diagnosing brain tumors, which can be classified as benign or malignant. It is the process of integrating multiple images of the same scene into a single fused image in order to reduce uncertainty and minimize redundancy while extracting all the useful information from the source images. Fuzzy logic is used to fuse two brain MRI images with different views; the fused image is more informative than the source images. Texture and wavelet features are extracted from the fused image, and a multilevel Adaptive Neuro-Fuzzy Classifier classifies the brain tumors based on the trained and tested features. The proposed method achieved 80.48% sensitivity, 99.9% specificity and 99.69% accuracy. Experimental results obtained from the fusion process show that the proposed image fusion approach performs better than conventional fusion methodologies.

Keywords: Image fusion, Fuzzy rules, Neuro-fuzzy classifier.

430 Overview of CARDIOSENSOR Project on the Development of a Nanosensor for Assessing the Risk of Cardiovascular Disease

Authors: A.C. Duarte, C.I.L. Justino, K. Duarte, A.C. Freitas, R. Pereira, P. Chaves, P. Bettencourt, S. Cardoso, T.A.P. Rocha-Santos

Abstract:

This paper overviews the topics of a research project (CARDIOSENSOR) in the field of health sciences (biomaterials and biomedical engineering). The project has focused on the development of a nanosensor for assessing the risk of cardiovascular disease by monitoring C-reactive protein (CRP), currently considered the best-validated inflammatory biomarker associated with cardiovascular diseases. The project involves tasks such as: 1) the development of sensor devices based on field effect transistors (FET): assembly, optimization and validation; 2) application of the sensors to the detection of CRP in standard solutions and comparison with enzyme-linked immunosorbent assay (ELISA); and 3) application of the sensors to real samples such as blood and saliva and evaluation of their ability to predict the risk of cardiovascular disease.

Keywords: Carbon nanotubes field effect transistors, cardiovascular diseases, C-reactive protein, sensor.

429 Single-Camera Basketball Tracker through Pose and Semantic Feature Fusion

Authors: Adrià Arbués-Sangüesa, Coloma Ballester, Gloria Haro

Abstract:

Tracking sports players is a widely challenging scenario, especially in single-feed videos recorded on tight courts, where clutter and occlusions cannot be avoided. This paper presents an analysis of several geometric and semantic visual features to detect and track basketball players. An ablation study is carried out and used to show that a robust tracker can be built with deep learning features, without the need to extract contextual ones, such as proximity or color similarity, or to apply camera stabilization techniques. The presented tracker consists of: (1) a detection step, which uses a pretrained deep learning model to estimate the players' pose, followed by (2) a tracking step, which leverages pose and semantic information from the output of a convolutional layer in a VGG network. Its performance is analyzed in terms of MOTA over a basketball dataset with more than 10k instances.

Keywords: Basketball, deep learning, feature extraction, single-camera, tracking.

428 A Survey of Model Comparison Strategies and Techniques in Model Driven Engineering

Authors: Junaid Rashid, Waqar Mehmood, Muhammad Wasif Nisar

Abstract:

This survey paper presents the recent state of model comparison as it applies to Model Driven Engineering. In Model Driven Engineering, calculating the difference between models is an important and challenging task. Model differencing involves a number of tasks, starting with identifying and matching the elements of the models. In this paper, we discuss how model matching is accomplished, the strategies and techniques involved, and the types of models concerned, and we also discuss future directions. We found that many of the latest model comparison strategies are geared toward enabling metamodel-based and similarity-based matching, and that model versioning is the most dominant application of model comparison. Recently, work on comparison for versioning has begun to decline, giving way to different applications. Finally, the tools vary widely in the amount of user effort needed to perform model comparisons, as some require more effort in exchange for greater generality and expressive power.

Keywords: Model comparison, model clone detection, model versioning, EMF Model, model diff.

427 Automatic Detection and Spatio-temporal Analysis of Commercial Accumulations Using Digital Yellow Page Data

Authors: Yuki. Akiyama, Hiroaki. Sengoku, Ryosuke. Shibasaki

Abstract:

In this study, the locations and areas of commercial accumulations were detected using digital yellow page data. An original buffering method that can accurately create polygons of commercial accumulations is proposed in this paper; using this method, the distribution of commercial accumulations can be easily mapped and monitored over a wide area. The locations, areas, and time-series changes of commercial accumulations in the South Kanto region can be monitored by integrating the polygons of commercial accumulations with the time series of digital yellow page data. The circumstances of commercial accumulations were shown to vary by type of area, that is, highly urbanized regions such as the city center of Tokyo and prefectural capitals, suburban areas near large cities, and suburban and rural areas.

Keywords: Commercial accumulations, Spatio-temporal analysis, Urban monitoring, Yellow page data
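
A minimal sketch of the buffering idea described in this abstract: shop locations (here random points standing in for yellow-page entries) are buffered and the buffers are merged into candidate commercial-accumulation polygons. The buffer radius and minimum shop count are illustrative assumptions, not the paper's calibrated values.

```python
# Sketch: buffer point locations of shops and merge the buffers into
# candidate commercial-accumulation polygons. Radius and thresholds are assumptions.
import numpy as np
from shapely.geometry import Point
from shapely.ops import unary_union

rng = np.random.default_rng(2)
shops = rng.uniform(0, 1000, size=(300, 2))          # placeholder shop coordinates (m)

radius_m = 50.0                                       # assumed buffer radius
buffers = [Point(x, y).buffer(radius_m) for x, y in shops]
merged = unary_union(buffers)                         # dissolve overlapping buffers

polygons = list(merged.geoms) if merged.geom_type == "MultiPolygon" else [merged]
accumulations = []
for poly in polygons:
    n_shops = sum(poly.contains(Point(x, y)) for x, y in shops)
    if n_shops >= 10:                                 # assumed minimum shop count
        accumulations.append((poly.area, n_shops))

print(len(accumulations), "candidate commercial accumulations")
```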

426 Fast and Robust Long-term Tracking with Effective Searching Model

Authors: Thang V. Kieu, Long P. Nguyen

Abstract:

Kernelized Correlation Filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, these trackers are not robust in cases where the object is lost through a sudden change of direction, occlusion, or leaving the field of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method for target-loss warning, based on analyzing the response map of each frame, and a Random-fern classifier for a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the proposed algorithm achieved precision and success rates 2.92 and 2.61 times higher than those of the original KCF algorithm, respectively. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second.

Keywords: Correlation filter, long-term tracking, random fern, real-time tracking.
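
The target-loss warning described in this abstract can be illustrated with a common response-map statistic, the peak-to-sidelobe ratio (PSR): the sketch below flags a frame as anomalous when the PSR drops below a threshold. The threshold and exclusion-window size are assumptions, not the paper's values.

```python
# Sketch: flag possible target loss from a correlation-filter response map
# using the peak-to-sidelobe ratio (PSR). Threshold and window size are assumptions.
import numpy as np

def peak_to_sidelobe_ratio(response: np.ndarray, exclude: int = 5) -> float:
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1, max(0, px - exclude):px + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def target_lost(response: np.ndarray, threshold: float = 6.0) -> bool:
    """Low PSR suggests the tracker response no longer has a clear peak."""
    return peak_to_sidelobe_ratio(response) < threshold

rng = np.random.default_rng(3)
noisy_map = rng.normal(size=(64, 64))        # placeholder KCF response map
print(target_lost(noisy_map))                # likely True: no distinct peak
```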

425 Dynamic Fast Tracing and Smoothing Technique for Geiger-Muller Dosimeter

Authors: M. Ebrahimi Shohani, S. M. Taheri, S. M. Golgoun

Abstract:

An environmental radiation dosimeter is a kind of detector that measures the dose in a radiation area. The dosimeter registers the radiation and converts it to dose according to its calibration parameters. The dose limit differs for each radiation area, and this limit should be notified and reported to the user and the health physics department. The stochastic nature of radiation is the reason for the fluctuations in any gamma-detector dosimetry. In this research, we investigated a Geiger-Muller type dosimeter and tried to improve its dose measurement. A Geiger-Muller dosimeter is a counter that converts registered radiation to dose; therefore, for better data analysis, it is necessary to apply an algorithm that smooths the statistical variations of the registered radiation. We propose a method to smooth these fluctuations further, together with a dynamic way to trace rapid changes in radiation. Results show that our method is fast and reliable in comparison with the traditional method.

Keywords: Geiger-Muller, radiation detection, smoothing algorithms, dosimeter, dose calculation.
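
The general idea of combining smoothing with fast tracing can be sketched as an exponential moving average whose smoothing factor is temporarily increased when a new count rate deviates strongly from the smoothed value. The alpha values and the deviation test below are illustrative assumptions, not the authors' algorithm.

```python
# Sketch: adaptive exponential smoothing of Geiger-Muller count rates.
# A large jump relative to the expected Poisson spread switches to fast tracing.
# The alpha values and the 4-sigma test are illustrative assumptions.
import math

def smooth_counts(counts, alpha_slow=0.05, alpha_fast=0.5, k_sigma=4.0):
    est = float(counts[0])
    smoothed = [est]
    for c in counts[1:]:
        sigma = math.sqrt(max(est, 1.0))           # Poisson spread of the current estimate
        alpha = alpha_fast if abs(c - est) > k_sigma * sigma else alpha_slow
        est = alpha * c + (1.0 - alpha) * est
        smoothed.append(est)
    return smoothed

cps = [30, 28, 33, 31, 29, 120, 118, 123, 119]     # counts per second with a sudden rise
print([round(v, 1) for v in smooth_counts(cps)])
```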

424 Mechanism of Alcohol Related Disruption of the Error Monitoring and Processing System

Authors: M. O. Welcome, Y. E. Razvodovsky, E. V. Pereverzeva, V. A. Pereverzev

Abstract:

The error monitoring and processing system (EMPS), located in the substantia nigra of the midbrain, the basal ganglia and the cortex of the forebrain, plays a leading role in error detection and correction. The main components of the EMPS are the dopaminergic system and the anterior cingulate cortex. Although recent studies show that alcohol disrupts the EMPS, the ways in which alcohol affects this system are poorly understood. Based on current literature data, we suggest here a hypothesis of an alcohol-related, glucose-dependent system of error monitoring and processing, which holds that the disruption of the EMPS is related to the competency of glucose homeostasis regulation, which in turn may determine the dopamine level as a major component of the EMPS. Alcohol may indirectly disrupt the EMPS by affecting the dopamine level through disorders in blood glucose homeostasis regulation.

Keywords: Alcohol related disruption, Error monitoring and processing system, Mechanism.

423 Critical Analysis of Heat Exchanger Cycle for its Maintainability Using Failure Modes and Effect Analysis and Pareto Analysis

Authors: Sayali Vyas, Atharva Desai, Shreyas Badave, Apurv Kulkarni, B. Rajiv

Abstract:

The Failure Modes and Effect Analysis (FMEA) is an efficient evaluation technique for identifying potential failures in products, processes, and services. FMEA is designed to identify and prioritize failure modes, and it proves to be a useful method for identifying and correcting possible failures at the earliest possible stage so that the consequences of poor performance can be avoided. In this paper, the FMEA tool is used to detect failures of various components of a heat exchanger cycle and to identify the critical failures that may hamper the system's performance. Further, a detailed Pareto analysis is done to find the most critical components of the cycle, the causes of their failures, and possible recommended actions. This paper can be used as a checklist which will help in the maintainability of the system.

Keywords: FMEA, heat exchanger cycle, Ishikawa diagram, Pareto analysis, risk priority number.
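
A minimal sketch of the FMEA scoring and Pareto ranking described in this abstract: each failure mode gets a risk priority number RPN = severity x occurrence x detection, and the modes are then sorted to find the components that account for the bulk of the total risk. The example failure modes and 1-10 scores are invented placeholders, not the paper's data.

```python
# Sketch: RPN = severity * occurrence * detection, then a Pareto ranking.
# The failure modes and 1-10 scores below are invented placeholders.
failure_modes = [
    # (component, failure mode, severity, occurrence, detection)
    ("Tubes",  "Fouling",      7, 6, 5),
    ("Gasket", "Leakage",      8, 4, 3),
    ("Pump",   "Bearing wear", 6, 5, 6),
    ("Shell",  "Corrosion",    9, 3, 4),
]

scored = [(c, m, s * o * d) for c, m, s, o, d in failure_modes]
scored.sort(key=lambda row: row[2], reverse=True)

total = sum(rpn for _, _, rpn in scored)
cumulative = 0.0
for component, mode, rpn in scored:
    cumulative += rpn
    print(f"{component:6s} {mode:14s} RPN={rpn:4d}  cumulative {100 * cumulative / total:5.1f}%")
```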

422 An Advanced Hybrid P2P Botnet 2.0

Authors: T. T. Lu, H. Y. Liao, M. F. Chen

Abstract:

Recently, malware attacks carried out over the Internet by e-mail, denial of service (DoS) or distributed denial of service (DDoS) have become more serious, and botnets have become a significant part of Internet malware attacks. Traditional botnets include three parts: the botmaster, command and control (C&C) servers, and bots. The C&C servers receive commands from the botmaster and control the distributed computers remotely, and bots use DNS to find the positions of the C&C servers. In this paper, we propose an advanced hybrid peer-to-peer (P2P) botnet 2.0 (AHP2P botnet 2.0) using web 2.0 technology to hide the botmaster's instructions in social sites, which are regarded as C&C servers. Servent bots are regarded as sub-C&C servers that obtain the instructions from the social sites. The AHP2P botnet 2.0 can evaluate the performance of servent bots, reduce DNS traffic from bots to C&C servers, and make bot actions harder to detect than those of IRC-based botnets on the Internet.

Keywords: Peer-to-peer, Botnets, Botnet 2.0, Hybrid peer-to-peer

421 Clustering Unstructured Text Documents Using Fading Function

Authors: Pallav Roxy, Durga Toshniwal

Abstract:

Clustering unstructured text documents is an important issue in the data mining community and has a number of applications, such as document archive filtering, document organization, topic detection and subject tracing. In the real world, some already clustered documents may lose importance while new, more significant documents may appear. Most of the work done so far in clustering unstructured text documents overlooks this aspect of clustering. This paper addresses the issue by using a fading function. The unstructured text documents are clustered, and for each cluster a statistics structure called a Cluster Profile (CP) is maintained. The cluster profile incorporates the fading function, which keeps account of the time-dependent importance of the cluster. The work proposes a novel algorithm, the Clustering n-ary Merge Algorithm (CnMA), for unstructured text documents that uses the cluster profile and the fading function. Experimental results illustrating the effectiveness of the proposed technique are also included.

Keywords: Clustering, Text Mining, Unstructured Text Documents, Fading Function.
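
The time-dependent importance kept in the cluster profile can be illustrated with a common exponential fading function, w(t) = 2^(-lambda * (t - t_last)). The decay rate and pruning threshold in the sketch below are assumptions, not the paper's parameters.

```python
# Sketch: a cluster profile whose weight fades exponentially with time since
# the last document was added. Decay rate and pruning threshold are assumptions.

class ClusterProfile:
    def __init__(self, decay_rate: float = 0.1):
        self.decay_rate = decay_rate
        self.weight = 0.0
        self.last_update = 0.0

    def fade(self, now: float) -> float:
        """Fading function: halve the weight every 1/decay_rate time units."""
        return self.weight * 2.0 ** (-self.decay_rate * (now - self.last_update))

    def add_document(self, now: float) -> None:
        self.weight = self.fade(now) + 1.0     # decayed old weight plus the new document
        self.last_update = now

    def is_stale(self, now: float, threshold: float = 0.5) -> bool:
        return self.fade(now) < threshold

cp = ClusterProfile()
for t in (0, 1, 2):
    cp.add_document(t)
print(round(cp.fade(30), 3), cp.is_stale(30))   # importance long after the last document
```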

420 Diagnosing Dangerous Arrhythmia of Patients by Automatic Detecting of QRS Complexes in ECG

Authors: Jia-Rong Yeh, Ai-Hsien Li, Jiann-Shing Shieh, Yen-An Su, Chi-Yu Yang

Abstract:

In this paper, an automatic QRS-complex detection algorithm is applied to analyze ECG recordings, and five criteria for diagnosing dangerous arrhythmia are applied in a protocol-type automatic arrhythmia diagnosis system. The detection algorithm identifies the distribution of QRS complexes in the ECG recordings and related information, such as heart rate and RR interval. In this investigation, twenty sampled ECG recordings of patients with different pathologic conditions were collected for off-line analysis. A combination of four digital filters for improving the ECG signals and raising the QRS detection rate was proposed as pre-processing; both hardware filters and digital filters were applied to eliminate the different types of noise mixed with the ECG recordings. Then, the automatic QRS detection algorithm was applied to verify the distribution of QRS complexes. Finally, the quantitative clinical criteria for diagnosing arrhythmia were programmed into a practical application for automatic arrhythmia diagnosis as a post-processor. The diagnoses produced by the automatic dangerous-arrhythmia diagnosis system were compared with off-line diagnoses by experienced clinical physicians, and the comparison showed a matching rate of 95% with an experienced physician's diagnoses.

Keywords: Signal processing, electrocardiography (ECG), QRS complex, arrhythmia.
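
A minimal sketch of QRS detection on a crude synthetic ECG-like trace: band-pass filtering (a generic stand-in for the four digital filters used in the study) followed by peak picking, from which RR intervals and heart rate are derived. The cut-off frequencies and thresholds are common textbook choices, not the authors' settings.

```python
# Sketch: band-pass filter an ECG-like signal, pick R peaks, derive RR intervals.
# The 5-15 Hz band and the peak thresholds are generic choices, not the paper's filters.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 250.0                                               # sampling rate (Hz), placeholder
t = np.arange(0, 10, 1 / fs)
# Crude synthetic trace: one sharp positive pulse per beat plus noise
ecg = np.maximum(0.0, np.sin(2 * np.pi * 1.2 * t)) ** 63
ecg += 0.05 * np.random.default_rng(4).normal(size=t.size)

b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, ecg)

peaks, _ = find_peaks(filtered, height=0.3 * filtered.max(), distance=int(0.3 * fs))
rr_intervals = np.diff(peaks) / fs                       # seconds between beats
print(f"{len(peaks)} QRS complexes, mean heart rate {60.0 / rr_intervals.mean():.1f} bpm")
```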

419 Hot-Spot Blob Merging for Real-Time Image Segmentation

Authors: K. Kraus, M. Uiberacker, O. Martikainen, R. Reda

Abstract:

One of the major, difficult tasks in automated video surveillance is the segmentation of relevant objects in the scene. Current implementations often yield inconsistent results from frame to frame when trying to differentiate partly occluding objects. This paper presents an efficient block-based segmentation algorithm which is capable of separating partly occluding objects and detecting shadows. It has been proven to perform in real time with a maximum duration of 47.48 ms per frame (for 8x8 blocks on a 720x576 image) and a true positive rate of 89.2%. The flexible structure of the algorithm enables adaptations and improvements with little effort. Most of the parameters correspond to relative differences between quantities extracted from the image and should therefore not depend on scene and lighting conditions. The result is a performance-oriented segmentation algorithm which is applicable in critical real-time scenarios.

Keywords: Image segmentation, Model-based, Region growing, Blob Analysis, Occlusion, Shadow detection, Intelligent video surveillance.

418 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing

Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed

Abstract:

Classically, an energy detector is implemented in the time domain (TD); however, frequency domain (FD) based energy detectors have demonstrated improved performance. This paper presents a comparison between the two approaches in order to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and the probability of detection (Pd) are derived for both approaches. The derived expressions naturally lead to analytical as well as intuitive reasoning for the improved performance of Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on the buffer sizes: Pf is improved in FD-based energy detectors, whereas Pd is enhanced in TD-based ones. Finally, Monte Carlo simulation results confirm the analysis reached through the derived expressions.

Keywords: Cognitive radio, energy detector, periodogram, spectrum sensing.
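
The Monte Carlo evaluation mentioned in this abstract can be sketched as follows for the classical time-domain energy detector: estimate Pf from noise-only trials and Pd from signal-plus-noise trials at a fixed threshold. The buffer size, SNR and threshold rule below are placeholder choices, not the paper's settings.

```python
# Sketch: Monte Carlo estimate of Pf and Pd for a time-domain energy detector.
# Buffer size N, SNR, and the threshold setting are placeholder choices.
import numpy as np

rng = np.random.default_rng(5)
N, trials = 128, 20000
noise_var, snr_linear = 1.0, 10 ** (-5 / 10)          # -5 dB SNR

# Threshold chosen from the noise-only energy distribution for a 5% target Pf
noise_energy = np.sum(rng.normal(0, np.sqrt(noise_var), (trials, N)) ** 2, axis=1)
threshold = np.quantile(noise_energy, 0.95)

# Probability of false alarm: noise-only trials exceeding the threshold
pf = np.mean(np.sum(rng.normal(0, np.sqrt(noise_var), (trials, N)) ** 2, axis=1) > threshold)

# Probability of detection: signal-plus-noise trials exceeding the threshold
signal = rng.normal(0, np.sqrt(snr_linear * noise_var), (trials, N))
noise = rng.normal(0, np.sqrt(noise_var), (trials, N))
pd = np.mean(np.sum((signal + noise) ** 2, axis=1) > threshold)

print(f"estimated Pf = {pf:.3f}, Pd = {pd:.3f}")
```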

417 Learning Example of a Biomedical Project from a Real Problem of Muscle Fatigue

Authors: M. Rezki, A. Belaidi

Abstract:

This paper deals with a method of learning to solve a real problem in biomedical engineering through a technical study of muscle fatigue. Electromyography (EMG) is a technique for evaluating and recording the electrical activity produced by skeletal muscles (from an anatomical and physiological viewpoint). EMG is used as a diagnostic tool for identifying neuromuscular diseases and for assessing low-back pain and muscle fatigue in general. In order to study the EMG signal for detecting fatigue in a muscle, we took a real problem concerning the tram driver and the handlebar. For the study, we used a typical autonomous platform in order to acquire signals in real time. In our case study, we were confronted with the complex problem of carrying out our experiments in a tram; this type of problem is recurrent among students. To teach our students the method for solving this kind of problem, we built a similar system. Through this study, we achieved several objectives, such as building the equipment for simulation, studying the detection of muscle fatigue and, especially, learning how to manage a biomedical study.

Keywords: EMG, health platform, conductor’s tram, muscle fatigue.
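
One common way to quantify the fatigue such a platform is meant to detect is the downward drift of the EMG median frequency; the sketch below computes it from a Welch power spectrum. This metric and the synthetic signal are illustrative assumptions, not the system built in the paper.

```python
# Sketch: median frequency of an EMG segment from its Welch power spectrum;
# a decreasing median frequency over time is a common fatigue indicator.
# The synthetic signal and segment length are placeholders.
import numpy as np
from scipy.signal import welch

def median_frequency(emg: np.ndarray, fs: float) -> float:
    freqs, psd = welch(emg, fs=fs, nperseg=256)
    cumulative = np.cumsum(psd)
    return freqs[np.searchsorted(cumulative, 0.5 * cumulative[-1])]

fs = 1000.0
rng = np.random.default_rng(6)
segment = rng.normal(size=2000)                      # placeholder 2 s EMG segment
print(f"median frequency: {median_frequency(segment, fs):.1f} Hz")
```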

416 Performance Analysis of Genetic Algorithm with kNN and SVM for Feature Selection in Tumor Classification

Authors: C. Gunavathi, K. Premalatha

Abstract:

Tumor classification is a key area of research in the field of bioinformatics. Microarray technology is commonly used in the study of disease diagnosis using gene expression levels. The main drawback of gene expression data is that it contains thousands of genes and very few samples. Feature selection methods are used to select the informative genes from the microarray, and these methods considerably improve the classification accuracy. In the proposed method, a Genetic Algorithm (GA) is used for effective feature selection. Informative genes are identified based on T-Statistics, Signal-to-Noise Ratio (SNR) and F-Test values, and the initial candidate solutions of the GA are obtained from the top-m informative genes. The classification accuracy of the k-Nearest Neighbor (kNN) method is used as the fitness function for the GA. In this work, kNN and Support Vector Machine (SVM) are used as the classifiers. The experimental results show that the proposed work is suitable for effective feature selection. With the help of the selected genes, the GA-kNN method achieves 100% accuracy on 4 datasets and the GA-SVM method on 5 out of 10 datasets. The GA with kNN and SVM is demonstrated to be an accurate method for microarray-based tumor classification.

Keywords: F-Test, Gene Expression, Genetic Algorithm, k-Nearest Neighbor, Microarray, Signal-to-Noise Ratio, Support Vector Machine, T-Statistics, Tumor Classification.
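
The fitness evaluation described in this abstract can be sketched as follows: a candidate solution is a subset of the top-m informative genes, and its fitness is the cross-validated kNN accuracy on those genes. The data, subset size and the single mutation step shown are placeholders; only the fitness part of a GA is illustrated, not the full selection and crossover loop.

```python
# Sketch: kNN cross-validated accuracy as the GA fitness of a gene subset.
# Data, subset size and the random "mutation" shown here are placeholders;
# a full GA would add selection and crossover around this fitness function.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 500))                # 60 samples x 500 top-m candidate genes
y = rng.integers(0, 2, size=60)               # placeholder tumor labels

def fitness(gene_subset: np.ndarray) -> float:
    clf = KNeighborsClassifier(n_neighbors=3)
    return cross_val_score(clf, X[:, gene_subset], y, cv=5).mean()

subset = rng.choice(X.shape[1], size=20, replace=False)     # one candidate solution
print("fitness of candidate subset:", round(fitness(subset), 3))

mutated = subset.copy()
mutated[0] = rng.integers(0, X.shape[1])                    # a simple mutation step
print("fitness after mutation:", round(fitness(mutated), 3))
```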

415 Tracking Objects in Color Image Sequences: Application to Football Images

Authors: Mourad Moussa, Ali Douik, Hassani Messaoud

Abstract:

In this paper, we present a comparative study of two computer vision systems for object recognition and tracking. The two algorithms describe different approaches based on regions, each constituted by a set of pixels, which parameterize the objects in shot sequences. For image segmentation and object detection, the FCM technique is used; the overlap between cluster distributions is minimized by the use of a suitable color space (other than RGB). The first technique takes into account a priori probabilities governing the computation of the various clusters used to track objects. A Parzen kernel method is described which allows identifying the players in each frame, and we also show the importance of the choice of the standard deviation of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.

Keywords: Image segmentation, objects tracking, Parzen window, singular value decomposition, target recognition.
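
The region-matching step described above can be illustrated with a simplified sketch: Mahalanobis distances between region descriptors in consecutive frames, followed by a greedy one-to-one assignment that enforces exclusion. The greedy assignment is a simpler stand-in for the SVD-based correspondence method used in the paper, and the descriptors are synthetic placeholders.

```python
# Sketch: Mahalanobis distance between region descriptors of two frames,
# then a greedy one-to-one matching (a simple stand-in for the SVD-based
# correspondence method of the paper). Descriptors are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(8)
frame_a = rng.normal(size=(5, 4))                  # 5 regions x 4 descriptor values
frame_b = frame_a + 0.1 * rng.normal(size=(5, 4))  # slightly moved regions in next frame

cov_inv = np.linalg.inv(np.cov(np.vstack([frame_a, frame_b]).T))

def mahalanobis(u: np.ndarray, v: np.ndarray) -> float:
    d = u - v
    return float(np.sqrt(d @ cov_inv @ d))

dist = np.array([[mahalanobis(a, b) for b in frame_b] for a in frame_a])

matches, used = [], set()
for i in np.argsort(dist.min(axis=1)):             # most confident regions first
    costs = dist[i].copy()
    costs[list(used)] = np.inf                     # exclusion: each region matched once
    j = int(np.argmin(costs))
    matches.append((int(i), j))
    used.add(j)
print(matches)
```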

414 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces

Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet

Abstract:

In the present work, we developed an image processing algorithm to measure water droplet characteristics during dropwise condensation on pillared surfaces. The main problem in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies the corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometrical and intensity properties. The information related to droplet evolution over time, including the mean radius and the number of drops per unit area, is then extracted from the binary images. The developed image processing algorithm is verified against manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.

Keywords: Dropwise condensation, textured surface, image processing, watershed.

413 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder

Authors: D. Hişam, S. İkizoğlu

Abstract:

Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study has considerably enhanced the development of artificial intelligence (AI) algorithms by applying multiple machine learning (ML) models to gait sensory data collected from humans in order to classify normal people and those suffering from Vestibular System (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not been used to perform feature extraction and identify VS disorders through training on raw data. In this study, three ML models, the Random Forest Classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbor (KNN), have been trained to detect VS disorder, and the performance of the algorithms has been compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest (RF) classifier was the most accurate model.

Keywords: Vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting.
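
A minimal sketch of the model comparison described in this abstract, with scikit-learn's GradientBoostingClassifier standing in for XGBoost; the synthetic gait features, train/test split and metric layout are placeholders, not the study's data or results.

```python
# Sketch: compare RF, KNN and a gradient-boosting stand-in for XGBoost on
# synthetic "gait" features, reporting accuracy, precision, recall and F1.
# Data and the train/test split are placeholders, not the study's dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

rng = np.random.default_rng(9)
X = rng.normal(size=(200, 12))                    # placeholder gait sensor features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
    "GB":  GradientBoostingClassifier(random_state=0),     # stand-in for XGBoost
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(name,
          f"acc={accuracy_score(y_te, pred):.2f}",
          f"prec={precision_score(y_te, pred):.2f}",
          f"rec={recall_score(y_te, pred):.2f}",
          f"f1={f1_score(y_te, pred):.2f}")
```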
