Search results for: fault detection and classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5524

5014 Classification of Cochannel Signals Using Cyclostationary Signal Processing and Deep Learning

Authors: Bryan Crompton, Daniel Giger, Tanay Mehta, Apurva Mody

Abstract:

The task of classifying radio frequency (RF) signals has seen recent success in employing deep neural network models. In this work, we present a combined signal processing and machine learning approach to signal classification for cochannel anomalous signals. The power spectral density and cyclostationary signal processing features of a captured signal are computed and fed into a neural net to produce a classification decision. Our combined signal preprocessing and machine learning approach allows for simpler neural networks with fast training times and small computational resource requirements for inference, at the cost of longer preprocessing time.
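
As a rough illustration of this preprocessing-plus-classifier idea (not the authors' pipeline), the sketch below computes Welch PSD features for synthetic captures and feeds them to a small scikit-learn MLP; the sample rate, signal models and network size are assumptions made for the example.

```python
# Minimal sketch: PSD features of a captured signal fed to a small neural
# network, with synthetic signals standing in for real RF captures.
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
fs = 1e6  # assumed sample rate

def capture(kind, n=4096):
    t = np.arange(n) / fs
    if kind == 0:   # single BPSK-like carrier
        sig = np.sign(rng.standard_normal(n)) * np.cos(2*np.pi*1e5*t)
    else:           # cochannel mixture of two carriers
        sig = (np.sign(rng.standard_normal(n)) * np.cos(2*np.pi*1e5*t)
               + np.sign(rng.standard_normal(n)) * np.cos(2*np.pi*2.3e5*t))
    return sig + 0.5 * rng.standard_normal(n)

def psd_features(sig):
    _, pxx = welch(sig, fs=fs, nperseg=256)   # 129-bin PSD used as the feature vector
    return 10 * np.log10(pxx)

X = np.array([psd_features(capture(k % 2)) for k in range(400)])
y = np.array([k % 2 for k in range(400)])
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```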

Keywords: signal processing, machine learning, cyclostationary signal processing, signal classification

Procedia PDF Downloads 89
5013 Estimation of the Length and Location of Ground Surface Deformation Caused by the Reverse Faulting

Authors: Nader Khalafian, Mohsen Ghaderi

Abstract:

Field observations have revealed many examples of structures that were damaged by ground surface deformation caused by faulting. In this paper, an effort is made to estimate the length and location of the ground surface over which large displacements are created by reverse faulting. The research was conducted in two steps. (1) In the first step, a 2D explicit finite element model was developed using ABAQUS software. A subroutine implementing the Mohr-Coulomb failure criterion with a strain-softening model was developed by the authors in order to properly model the stress-strain behavior of the soil in the fault rupture zone. The results of the numerical analysis were verified against available centrifuge experiments, and reasonable agreement was found between the numerical and experimental data. (2) In the second step, the effects of the fault dip angle (δ), depth of the soil layer (H), dilation and friction angles of the sand (ψ and φ), and the amount of fault offset (d) on the soil surface displacement and fault rupture path were investigated. An artificial neural network (ANN) model, as a powerful prediction tool, was developed to provide a general model for predicting faulting characteristics, and a properly sized database was created to train and test the network. It was found that the length and location of the zone of displaced ground surface can be accurately estimated using the proposed model.
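
A minimal sketch of the surrogate-model step, assuming a synthetic placeholder for the FE-generated database: a small network maps (δ, H, ψ, φ, d) to illustrative length and location targets. The ranges and target function below are invented for demonstration only.

```python
# Illustrative ANN surrogate: inputs (dip angle, soil depth, dilation,
# friction, fault offset) -> [length, location] of the displaced surface zone.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.uniform(30, 90, n),    # fault dip angle (deg)
    rng.uniform(5, 40, n),     # soil layer depth H (m)
    rng.uniform(0, 20, n),     # dilation angle (deg)
    rng.uniform(25, 45, n),    # friction angle (deg)
    rng.uniform(0.1, 3.0, n),  # fault offset d (m)
])
# placeholder targets standing in for FE results: [zone length, zone location]
y = np.column_stack([
    0.5 * X[:, 1] + 2.0 * X[:, 4] + 0.05 * X[:, 0],
    0.3 * X[:, 1] / np.tan(np.radians(X[:, 0])),
]) + rng.normal(0, 0.2, (n, 2))

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=1))
model.fit(X, y)
print(model.predict([[60, 20, 10, 35, 1.5]]))  # predicted [length, location] for one scenario
```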

Keywords: reverse faulting, surface deformation, numerical, neural network

Procedia PDF Downloads 409
5012 Shark Detection and Classification with Deep Learning

Authors: Jeremy Jenrette, Z. Y. C. Liu, Pranav Chimote, Edward Fox, Trevor Hastie, Francesco Ferretti

Abstract:

Suitable shark conservation depends on well-informed population assessments. Direct methods such as scientific surveys and fisheries monitoring are adequate for defining population statuses, but species-specific indices of abundance and distribution coming from these sources are rare for most shark species. We can rapidly fill these information gaps by boosting media-based remote monitoring efforts with machine learning and automation. We created a database of shark images by sourcing 24,546 images covering 219 species of sharks from the web application spark pulse and the social network Instagram. We used object detection to extract shark features and inflate this database to 53,345 images. We packaged object-detection and image classification models into a Shark Detector bundle. We developed the Shark Detector to recognize and classify sharks from videos and images using transfer learning and convolutional neural networks (CNNs). We applied these models to common data-generation approaches of sharks: boosting training datasets, processing baited remote camera footage and online videos, and data-mining Instagram. We examined the accuracy of each model and tested genus and species prediction correctness as a result of training data quantity. The Shark Detector located sharks in baited remote footage and YouTube videos with an average accuracy of 89%, and classified located subjects to the species level with 69% accuracy (n = 8 species). The Shark Detector sorted heterogeneous datasets of images sourced from Instagram with 91% accuracy and classified species with 70% accuracy (n = 17 species). Data-mining Instagram can inflate training datasets and increase the Shark Detector’s accuracy as well as facilitate archiving of historical and novel shark observations. Base accuracy of genus prediction was 68% across 25 genera. The average base accuracy of species prediction within each genus class was 85%. The Shark Detector can classify 45 species. All data-generation methods were processed without manual interaction. As media-based remote monitoring strives to dominate methods for observing sharks in nature, we developed an open-source Shark Detector to facilitate common identification applications. Prediction accuracy of the software pipeline increases as more images are added to the training dataset. We provide public access to the software on our GitHub page.

Keywords: classification, data mining, Instagram, remote monitoring, sharks

Procedia PDF Downloads 98
5011 Using Data Mining Technique for Scholarship Disbursement

Authors: J. K. Alhassan, S. A. Lawal

Abstract:

This work addresses decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used in order to determine the generic rules by which scholarships are disbursed. Based on the rules derived from the tree, the system is able to determine the class (status) to which an applicant belongs, either Granted or Not Granted. Applicants that fall into the Granted class have successfully obtained the scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that can be used to classify applicants based on the rules from the tree-based classification was also developed. Tree-based classification was adopted because of its efficiency, effectiveness, and ease of comprehension. The system was tested with data from the National Information Technology Development Agency (NITDA) Abuja, a parastatal of the Federal Ministry of Communication Technology that is mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.
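
A hedged sketch of the decision-tree idea is shown below: a tree is fitted on mock applicant records and the generic rules it induces are printed. The feature names and data are invented for illustration and are not the NITDA dataset.

```python
# Fit a decision tree on mock applicant records, print its rules, and
# classify a new applicant as Granted / Not Granted.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 300
cgpa   = rng.uniform(1.0, 5.0, n)
income = rng.uniform(50, 500, n)      # household income (thousands), assumed feature
level  = rng.integers(1, 5, n)        # year of study, assumed feature
X = np.column_stack([cgpa, income, level])
# mock labelling rule: strong CGPA and low income => Granted
y = ((cgpa >= 3.5) & (income <= 250)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["cgpa", "income", "level"]))
print("new applicant:", "Granted" if tree.predict([[4.1, 180, 2]])[0] else "Not Granted")
```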

Keywords: classification, data mining, decision tree, scholarship

Procedia PDF Downloads 353
5010 Internet of Things Networks: Denial of Service Detection in Constrained Application Protocol Using Machine Learning Algorithm

Authors: Adamu Abdullahi, On Francisca, Saidu Isah Rambo, G. N. Obunadike, D. T. Chinyio

Abstract:

The paper discusses the potential threat of Denial of Service (DoS) attacks on the Constrained Application Protocol (CoAP) in Internet of Things (IoT) networks. As billions of IoT devices are expected to be connected to the internet in the coming years, these devices are vulnerable to attacks that disrupt their functioning. This research aims to tackle the issue by applying mixed qualitative and quantitative methods for feature selection, feature extraction, and clustering algorithms to detect DoS attacks on CoAP using machine learning algorithms (MLA). The main objective of the research is to enhance the security scheme for CoAP in the IoT environment by analyzing the nature of DoS attacks and identifying a new set of features for detecting them in the IoT network environment. The aim is to demonstrate the effectiveness of the MLA in detecting DoS attacks and to compare it with conventional intrusion detection systems for securing CoAP in the IoT environment. Findings: the research identifies the appropriate node for detecting DoS attacks in the IoT network environment and demonstrates how to detect the attacks through the MLA. The detection accuracy in both the classification and network simulation environments shows that the k-means algorithm scored the highest percentage in the training and testing of the evaluation, and the network simulation platform achieved the highest overall accuracy of 99.93%. This work also reviews conventional intrusion detection systems for securing CoAP in the IoT environment and discusses the DoS security issues associated with CoAP.
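
A minimal sketch of the clustering step follows: k-means over per-source traffic features separating a high-rate attack cluster from normal CoAP traffic. The feature set (request rate, mean payload size, distinct URIs contacted) and the synthetic traffic are assumptions, not the paper's dataset.

```python
# k-means over synthetic per-source traffic features; the smaller, high-rate
# cluster is flagged as the suspected DoS source set.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
normal = np.column_stack([rng.normal(5, 2, 200),     # requests per second
                          rng.normal(40, 10, 200),   # mean payload bytes
                          rng.normal(3, 1, 200)])    # distinct URIs contacted
attack = np.column_stack([rng.normal(400, 50, 40),
                          rng.normal(15, 5, 40),
                          rng.normal(1, 0.3, 40)])
X = StandardScaler().fit_transform(np.vstack([normal, attack]))

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
sizes = np.bincount(km.labels_)
print("suspected DoS cluster size:", sizes.min(), "of", len(X), "sources")
```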

Keywords: algorithm, CoAP, DoS, IoT, machine learning

Procedia PDF Downloads 53
5009 Multi-Sensor Target Tracking Using Ensemble Learning

Authors: Bhekisipho Twala, Mantepu Masetshaba, Ramapulana Nkoana

Abstract:

Multiple classifier systems combine several individual classifiers to deliver a final classification decision. However, an increasingly controversial question is whether such systems can outperform the single best classifier, and if so, what form of multiple classifier system yields the most significant benefit. Multi-target tracking detection using multiple sensors is also an important research field in mobile techniques and military applications. In this paper, several multiple classifier systems are evaluated in terms of their ability to predict a system’s failure or success for multi-sensor target tracking tasks. The Bristol Eden project dataset is utilised for this task. Experimental and simulation results show that the human activity identification system can fulfil the requirements of target tracking, thanks to the improved sensor classification performance of multiple classifier systems, with ensembles constructed using boosting achieving the highest accuracy rates.
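
The comparison below is only illustrative (it does not use the Bristol Eden data): a single weak classifier versus a boosted ensemble on a synthetic detection task, echoing the finding that boosting-based multiple classifier systems improve accuracy.

```python
# Single decision stump vs. AdaBoost ensemble, compared by cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, n_features=20, n_informative=8, random_state=0)

single = DecisionTreeClassifier(max_depth=1, random_state=0)   # one weak learner
boosted = AdaBoostClassifier(n_estimators=100, random_state=0) # boosted stumps

print("single classifier :", cross_val_score(single, X, y, cv=5).mean())
print("boosted ensemble  :", cross_val_score(boosted, X, y, cv=5).mean())
```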

Keywords: single classifier, ensemble learning, multi-target tracking, multiple classifiers

Procedia PDF Downloads 243
5008 Conformance to Spatial Planning between the Kampala Physical Development Plan of 2012 and the Existing Land Use in 2021

Authors: Brendah Nagula, Omolo Fredrick Okalebo, Ronald Ssengendo, Ivan Bamweyana

Abstract:

The Kampala Physical Development Plan (KPDP) was developed in 2012 and projected both long-term and short-term developments within the city. The purpose of the plan was not only to shape the city into a spatially planned area but also to control the urban sprawl trends that had expanded with pronounced instances of informal settlements. This plan was approved by the National Physical Planning Board and a signature was appended by the Minister in 2013. Although the KPDP has been implemented using different approaches such as detailed planning, development control, subdivision planning, construction inspections, greening and beautification, there is still limited knowledge of the level of conformance to this plan. Therefore, it is yet to be determined whether it has been effective in shaping the city into an ideal spatially planned area. To attain a clear picture of the level of conformance to the KPDP 2012, an evaluation between the planned and the existing land use in Kampala City was performed. Methods such as supervised classification and post-classification change detection were adopted to perform this evaluation. Scrutiny of the findings revealed that Central Division registered the lowest level of conformance to the planning standards specified in the KPDP 2012, followed by Nakawa, Rubaga, Kawempe, and Makindye. Furthermore, mixed-use development was identified as the land use with the highest level of non-conformity, at 25.11%, while institutional land use registered the highest level of conformance, at 84.45%. The results show that the aspect of location was not carefully considered while allocating uses in the KPDP, whereby areas located near the Central Business District have higher land rents and hence require uses that ensure profit maximization. Also, the prominence of development towards mixed use denotes an increased demand for land for compact development that was not catered for in the plan. Therefore, in order to transform Kampala City into a spatially planned area, there is a need to carefully develop detailed plans, especially for all the Central Division planning precincts, indicating considerations for land use densification.
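
A sketch of the post-classification comparison idea follows: the planned land use raster is cross-tabulated against the classified existing land use, pixel by pixel, and per-class conformance is reported. The tiny rasters and class list here are invented stand-ins, not the KPDP data.

```python
# Cross-tabulate a "planned" raster against an "existing" classification
# raster and report the per-class conformance (diagonal of the matrix).
import numpy as np

classes = ["residential", "mixed-use", "institutional", "open space"]
rng = np.random.default_rng(0)
planned  = rng.integers(0, 4, size=(100, 100))                # plan raster (assumed)
existing = np.where(rng.random((100, 100)) < 0.75,            # classified raster with
                    planned, rng.integers(0, 4, (100, 100)))  # ~75% agreement by construction

conf = np.zeros((4, 4), dtype=int)
np.add.at(conf, (planned.ravel(), existing.ravel()), 1)       # confusion / cross-tab matrix

for i, name in enumerate(classes):
    conformance = conf[i, i] / conf[i].sum()
    print(f"{name:13s} conformance: {conformance:.1%}")
```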

Keywords: spatial plan, post classification change detection, Kampala city, land use

Procedia PDF Downloads 72
5007 Synthetic Aperture Radar Remote Sensing Classification Using the Bag of Visual Words Model to Land Cover Studies

Authors: Reza Mohammadi, Mahmod R. Sahebi, Mehrnoosh Omati, Milad Vahidi

Abstract:

Classification of high-resolution polarimetric Synthetic Aperture Radar (PolSAR) images plays an important role in land cover and land use management. Recently, classification algorithms based on the Bag of Visual Words (BOVW) model have attracted significant interest among scholars and researchers both within and outside the field of remote sensing. In this paper, a BOVW model with pixel-based low-level features has been implemented to classify a subset of a San Francisco Bay PolSAR image acquired by RADARSAT-2 in C-band. We used a segment-based decision-making strategy and compared the result with that of a traditional Support Vector Machine (SVM) classifier. The 90.95% overall classification accuracy obtained with the proposed algorithm shows that it is comparable with state-of-the-art methods. In addition to the increase in classification accuracy, the proposed method reduces the undesirable speckle effect of SAR images.
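
A compact bag-of-visual-words sketch is given below: local descriptors are clustered into a codebook, each image is encoded as a visual-word histogram, and an SVM classifies the histograms. Random descriptors stand in for the PolSAR features, so the numbers are illustrative only.

```python
# BOVW pipeline: k-means codebook -> per-image word histogram -> SVM.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_images, n_desc, dim, k = 200, 50, 8, 32

# per-image local descriptors; class-1 descriptors are shifted so that the
# resulting histograms become separable
labels = rng.integers(0, 2, n_images)
descs = [rng.normal(loc=labels[i], size=(n_desc, dim)) for i in range(n_images)]

codebook = KMeans(n_clusters=k, n_init=10, random_state=0).fit(np.vstack(descs))

def encode(d):
    words = codebook.predict(d)
    hist = np.bincount(words, minlength=k).astype(float)
    return hist / hist.sum()          # normalized visual-word histogram

X = np.array([encode(d) for d in descs])
Xtr, Xte, ytr, yte = train_test_split(X, labels, random_state=0)
print("BOVW + SVM accuracy:", SVC(kernel="rbf").fit(Xtr, ytr).score(Xte, yte))
```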

Keywords: Bag of Visual Words (BOVW), classification, feature extraction, land cover management, Polarimetric Synthetic Aperture Radar (PolSAR)

Procedia PDF Downloads 191
5006 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression

Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras

Abstract:

In Industry 4.0, it is common to have a lot of sensor data. In this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources, so any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach: Isolation Forest, One-Class SVM and an auto-encoder were explored, and for symbolic regression the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method, since, after training, the only requirement is the calculation of a polynomial, a useful feature in the digital twin context.
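
The sketch below mirrors the comparison on synthetic sensor data: Isolation Forest and One-Class SVM flag anomalies directly, while the regression route flags points whose residual from a learned sensor relationship is large. The paper uses PySR for that relationship; here an ordinary polynomial fit stands in, and all data and thresholds are assumptions.

```python
# Compare direct anomaly detectors with a residual-threshold rule built on a
# fitted polynomial (a stand-in for a symbolic-regression model).
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 300)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, 300)      # normal sensor relationship
y[::40] += 8.0                                    # injected faults
X = np.column_stack([x, y])

iso_flags = IsolationForest(random_state=0).fit_predict(X) == -1
svm_flags = OneClassSVM(nu=0.05).fit_predict(X) == -1

coeffs = np.polyfit(x, y, deg=2)                  # learned relationship
residual = np.abs(y - np.polyval(coeffs, x))
reg_flags = residual > 3 * np.median(residual)    # cheap residual threshold

for name, flags in [("IsolationForest", iso_flags), ("One-Class SVM", svm_flags),
                    ("regression residual", reg_flags)]:
    print(f"{name:20s} flagged {flags.sum()} of {len(X)} samples")
```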

Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression

Procedia PDF Downloads 103
5005 Novel Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification

Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park

Abstract:

In this paper, we propose a novel inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can simultaneously derive both the posterior distribution of a latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework and is performed in two steps, called the expectation and maximization steps. First, in the expectation step, using the Bayesian formula and the LA technique, we derive an approximation to the posterior distribution of the latent function indicating the possibility that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix needed to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely, the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
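
For readers who want a runnable baseline of the model family (not the authors' variational EM inference), scikit-learn's GaussianProcessClassifier also uses a Laplace approximation and handles multi-class problems via one-vs-rest; the small example below applies it to a standard multi-class dataset.

```python
# Baseline only: Laplace-approximation GP classification on a multi-class task.
from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gpc.fit(Xtr, ytr)          # kernel hyper-parameters optimized via marginal likelihood
print("accuracy:", gpc.score(Xte, yte))
print("class probabilities for one sample:", gpc.predict_proba(Xte[:1]))
```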

Keywords: bayesian rule, gaussian process classification model with multiclass, gaussian process prior, human action classification, laplace approximation, variational EM algorithm

Procedia PDF Downloads 317
5004 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada

Authors: Bilel Chalghaf, Mathieu Varin

Abstract:

Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR), have the potential to overcome the limitations of aerial imagery. To date, few studies have used that data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual high tree species ( > 17m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For the individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77). With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OA of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.
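
The model-comparison step can be sketched as below: the five classifier families named in the abstract are evaluated with cross-validation. A synthetic stand-in dataset is used, since the WorldView-3/LiDAR variables are not reproduced here.

```python
# Compare SVM, CART, RF, k-NN and LDA by 5-fold cross-validated overall accuracy.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = make_classification(n_samples=600, n_features=16, n_informative=10,
                           n_classes=5, n_clusters_per_class=1, random_state=0)

models = {
    "SVM":  make_pipeline(StandardScaler(), SVC()),
    "CART": DecisionTreeClassifier(random_state=0),
    "RF":   RandomForestClassifier(n_estimators=200, random_state=0),
    "k-NN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "LDA":  LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    print(f"{name:5s} OA: {cross_val_score(model, X, y, cv=5).mean():.2f}")
```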

Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR

Procedia PDF Downloads 115
5003 An Image Processing Scheme for Skin Fungal Disease Identification

Authors: A. A. M. A. S. S. Perera, L. A. Ranasinghe, T. K. H. Nimeshika, D. M. Dhanushka Dissanayake, Namalie Walgampaya

Abstract:

Nowadays, skin fungal diseases are mostly found in people of tropical countries like Sri Lanka. A skin fungal disease is a particular kind of illness caused by fungus. These diseases have various harmful effects on the skin and keep spreading over time, so it is important to identify them at an early stage in order to prevent them from spreading. This paper presents an automated skin fungal disease identification system implemented to speed up the diagnosis process by identifying skin fungal infections in digital images. An image of the diseased skin lesion is acquired, and a comprehensive computer vision and image processing scheme is used to process the image for disease identification. This includes colour analysis using the RGB and HSV colour models, texture classification using the Grey Level Run Length Matrix, Grey Level Co-occurrence Matrix and Local Binary Pattern, object detection, shape identification and more. This paper presents the approach and its outcome for the identification of four of the most common skin fungal infections, namely Tinea Corporis, Sporotrichosis, Malassezia and Onychomycosis. The main intention of this research is to provide an automated skin fungal disease identification system that increases diagnostic quality, shortens the time to diagnosis and improves the efficiency of detection and successful treatment of skin fungal diseases.
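
A sketch of the texture-feature step only (no diagnosis) is given below: GLCM statistics and an LBP histogram are computed for a greyscale patch with scikit-image. A random patch stands in for a real lesion image, and the function names follow skimage >= 0.19 (older releases spell them "grey...").

```python
# GLCM and LBP texture descriptors for a greyscale patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in lesion patch

glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2], levels=256,
                    symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

lbp = local_binary_pattern(patch, P=8, R=1, method="uniform")
lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

print(features)
print("LBP histogram:", np.round(lbp_hist, 3))
```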

Keywords: Circularity Index, Grey Level Run Length Matrix, Grey Level Co-Occurrence Matrix, Local Binary Pattern, Object detection, Ring Detection, Shape Identification

Procedia PDF Downloads 212
5002 Brain Tumor Detection and Classification Using Pre-Trained Deep Learning Models

Authors: Aditya Karade, Sharada Falane, Dhananjay Deshmukh, Vijaykumar Mantri

Abstract:

Brain tumours pose a significant challenge in healthcare due to their complex nature and impact on patient outcomes. The application of deep learning (DL) algorithms in medical imaging has shown promise for accurate and efficient brain tumour detection. This paper explores the performance of various pre-trained DL models, namely ResNet50, Xception, InceptionV3, EfficientNetB0, DenseNet121, NASNetMobile, VGG19, VGG16, and MobileNet, on a brain tumour dataset sourced from Figshare. The dataset consists of MRI scans categorizing different types of brain tumours, including meningioma, pituitary, glioma, and no tumour. The study involves a comprehensive evaluation of these models’ accuracy and effectiveness in classifying brain tumour images. Data preprocessing, augmentation, and fine-tuning techniques are employed to optimize model performance. Among the evaluated deep learning models for brain tumour detection, ResNet50 emerges as the top performer with an accuracy of 98.86%. Following closely is Xception, exhibiting a strong accuracy of 97.33%. These models showcase robust capabilities in accurately classifying brain tumour images. On the other end of the spectrum, VGG16 trails with the lowest accuracy at 89.02%.
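
A hedged Keras sketch of the transfer-learning setup follows: a frozen ResNet50 backbone with a new classification head for the four classes. The input shape, head size and training settings are illustrative choices, not the paper's configuration.

```python
# Frozen ImageNet ResNet50 backbone + small dense head for 4 tumour classes.
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.ResNet50(weights="imagenet", include_top=False,
                                       input_shape=(224, 224, 3))
base.trainable = False                      # freeze pre-trained features first

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(4, activation="softmax"),  # meningioma, pituitary, glioma, no tumour
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # image datasets not shown here
```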

Keywords: brain tumour, MRI image, detecting and classifying tumour, pre-trained models, transfer learning, image segmentation, data augmentation

Procedia PDF Downloads 52
5001 Functional Variants Detection by RNAseq

Authors: Raffaele A. Calogero

Abstract:

RNAseq represents an attractive methodology for the detection of functional genomic variants. RNAseq results obtained from polyA+ RNA selection protocol (POLYA) and from exonic regions capturing protocol (ACCESS) indicate that ACCESS detects 10% more coding SNV/INDELs with respect to POLYA. ACCESS requires less reads for coding SNV detection with respect to POLYA. However, if the analysis aims at identifying SNV/INDELs also in the 5’ and 3’ UTRs, POLYA is definitively the preferred method. No particular advantage comes from ACCESS or POLYA in the detection of fusion transcripts.

Keywords: fusion transcripts, INDEL, RNA-seq, WES, SNV

Procedia PDF Downloads 271
5000 Polarimetric Synthetic Aperture Radar Data Classification Using Support Vector Machine and Mahalanobis Distance

Authors: Najoua El Hajjaji El Idrissi, Necip Gokhan Kasapoglu

Abstract:

Polarimetric Synthetic Aperture Radar-based imaging is a powerful technique used for earth observation and classification of surfaces. Forest evolution has been one of the vital areas of attention for remote sensing experts. Information about forest areas can be obtained by remote sensing, whether by using active radars or optical instruments. However, due to several weather constraints, such as cloud cover, limited information can be recovered from optical data, and for that reason Polarimetric Synthetic Aperture Radar (PolSAR) is used as a powerful tool for forestry inventory. In this paper, we applied a support vector machine (SVM) and the Mahalanobis distance to fully polarimetric AIRSAR P-, L-, and C-band data from the Nezer forest areas; the classification is based on the separation of different tree ages. The classification results were evaluated and show that the SVM performs better than the Mahalanobis distance, achieving approximately 75% accuracy. This result demonstrates that SVM classification can be used as a useful method to evaluate fully polarimetric SAR data with a sufficient level of accuracy.
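
The sketch below illustrates the two classifiers being compared: a Mahalanobis-distance classifier (each class summarized by its mean and covariance, samples assigned to the nearest class) alongside an SVM on the same synthetic features; the data are not the AIRSAR bands.

```python
# Mahalanobis-distance classifier vs. SVM on a synthetic multi-class problem.
import numpy as np
from scipy.spatial.distance import mahalanobis
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=6, n_informative=4,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

stats = {}
for c in np.unique(ytr):
    Xc = Xtr[ytr == c]
    stats[c] = (Xc.mean(axis=0), np.linalg.inv(np.cov(Xc, rowvar=False)))

def classify(x):
    # assign to the class with the smallest Mahalanobis distance
    return min(stats, key=lambda c: mahalanobis(x, stats[c][0], stats[c][1]))

maha_acc = np.mean([classify(x) == t for x, t in zip(Xte, yte)])
svm_acc = SVC().fit(Xtr, ytr).score(Xte, yte)
print(f"Mahalanobis: {maha_acc:.2f}  SVM: {svm_acc:.2f}")
```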

Keywords: classification, synthetic aperture radar, SAR polarimetry, support vector machine, mahalanobis distance

Procedia PDF Downloads 116
4999 Calculation of Detection Efficiency of Horizontal Large Volume Source Using Exvol Code

Authors: M. Y. Kang, Euntaek Yoon, H. D. Choi

Abstract:

To calculate the full energy (FE) absorption peak efficiency for an arbitrary volume sample, we developed and verified the EXVol (Efficiency calculator for EXtended Voluminous source) code, which is based on the effective solid angle method. EXVol can describe the source area as a non-uniform three-dimensional (x, y, z) source and decompose it into several sets of volume units. Users can equally divide the (x, y, z) coordinate system to calculate the detection efficiency at a specific position of a cylindrical volume source. By determining the detection efficiency for differential volume units, the total radiative absolute distribution and the correction factor of the detection efficiency can be obtained from a nondestructive measurement of the source. In order to check the performance of the EXVol code, a Si ingot of 20 cm in diameter and 50 cm in height was used as a source. The detector was moved in the collimation geometry to calculate the detection efficiency at a specific position, and the results were compared with the experimental values. In this study, the performance of the EXVol code was extended to obtain the detection efficiency distribution at a specific position in a large volume source.

Keywords: attenuation, EXVol, detection efficiency, volume source

Procedia PDF Downloads 170
4998 An Earth Mover’s Distance Algorithm Based DDoS Detection Mechanism in SDN

Authors: Yang Zhou, Kangfeng Zheng, Wei Ni, Ren Ping Liu

Abstract:

Software-defined networking (SDN) provides a solution for a scalable network framework with decoupled control and data planes. However, this architecture also induces a particular distributed denial-of-service (DDoS) attack that can affect or even overwhelm the SDN network. The DDoS attack detection problem has to date been mostly researched as an entropy comparison problem; however, that formulation makes little use of SDN itself, and the results are not accurate. In this paper, we propose a DDoS attack detection method which interprets DDoS detection as a signature matching problem and is formulated as an Earth Mover’s Distance (EMD) model. Considering feasibility and accuracy, we further propose to define the cost function of the EMD to be a generalized Kullback-Leibler divergence. Simulation results show that our proposed method can detect DDoS attacks by comparing EMD values with those computed in the case without attacks. Moreover, our method can significantly increase the true positive rate of detection.
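
The signature-matching idea can be sketched as follows: the distribution of a live traffic feature is compared against a stored attack-free baseline with the Earth Mover's Distance, and an alarm is raised when the distance grows. scipy's one-dimensional Wasserstein distance stands in for the paper's EMD model with the generalized KL cost, and the traffic samples and threshold are invented.

```python
# Alarm when the EMD between a live traffic window and an attack-free
# baseline exceeds a threshold.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
baseline = rng.poisson(20, 5000)                  # attack-free per-source packet counts
normal_window = rng.poisson(21, 500)              # a benign observation window
ddos_window = np.concatenate([rng.poisson(20, 400), rng.poisson(300, 100)])  # flooding sources

threshold = 5.0                                   # would be tuned on attack-free traffic
for name, window in [("normal", normal_window), ("attack", ddos_window)]:
    emd = wasserstein_distance(baseline, window)
    print(f"{name:7s} EMD={emd:6.2f}  ->", "ALERT" if emd > threshold else "ok")
```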

Keywords: DDoS detection, EMD, relative entropy, SDN

Procedia PDF Downloads 318
4997 Subjective Evaluation of Mathematical Morphology Edge Detection on Computed Tomography (CT) Images

Authors: Emhimed Saffor

Abstract:

In this paper, the problem of edge detection in digital images is considered. Three methods of edge detection based on mathematical morphology were applied to two sets of CT images (brain and chest): a 3x3 filter for the first method, a 5x5 filter for the second method, and a 7x7 filter for the third method, under the MATLAB programming environment. The results of the above-mentioned methods were subjectively evaluated. The results show that these methods are efficient and suitable for medical images and can be used for various other applications.
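
A minimal sketch of the morphological edge detector is shown below: the morphological gradient (dilation minus erosion) computed with 3x3, 5x5 and 7x7 structuring elements, mirroring the three filter sizes compared in the paper; a synthetic image stands in for a CT slice.

```python
# Morphological gradient edge detection with three structuring-element sizes.
import numpy as np
from scipy import ndimage

image = np.zeros((128, 128), dtype=np.uint8)      # stand-in for a CT slice
image[32:96, 32:96] = 200                          # bright square "organ"

for size in (3, 5, 7):
    se = np.ones((size, size))
    edges = (ndimage.grey_dilation(image, footprint=se)
             - ndimage.grey_erosion(image, footprint=se))
    print(f"{size}x{size} structuring element: edge pixels =", int((edges > 0).sum()))
```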

Keywords: CT images, Matlab, medical images, edge detection

Procedia PDF Downloads 312
4996 Classification of Opaque Exterior Walls of Buildings from a Sustainable Point of View

Authors: Michelle Sánchez de León Brajkovich, Nuria Martí Audi

Abstract:

The envelope is one of the most important elements when one analyzes the operation of a building in terms of sustainability. Taking this into consideration, this research focuses on setting up a classification system for opaque envelope systems, crossing the knowledge and parameters of construction systems with the requirements they may have in terms of sustainability, in order to better understand how these systems contribute to the sustainability of the building. The paper therefore evaluates the importance of envelope design for building sustainability and analyses the parameters that make construction systems behave differently in terms of sustainability. At the same time, it explains the classification process generated from this analysis, which results in a classification encompassing all opaque vertical envelope construction systems.

Keywords: sustainable, exterior walls, envelope, facades, construction systems, energy efficiency

Procedia PDF Downloads 553
4995 Modified CUSUM Algorithm for Gradual Change Detection in a Time Series Data

Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin

Abstract:

The main objective in a change detection problem is to develop algorithms for efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series data. In this paper, we present a modified cumulative sum (MCUSUM) algorithm to detect the start and end of a time-varying linear drift in the mean value of a time series based on a likelihood ratio test procedure. The design, implementation and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures. An approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart designed for abrupt shift detection using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart for detection of a gradual change in mean. The algorithm is then applied and tested on a randomly generated time series with a gradual linear trend in mean to demonstrate its usefulness.
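
For context, the standard one-sided CUSUM recursion that the MCUSUM builds on can be sketched as below, run on a series with a gradual linear drift in mean; the reference value and decision threshold are illustrative, not the paper's tuned values.

```python
# Standard upper CUSUM recursion applied to a series with a gradual drift.
import numpy as np

rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 200),
                         rng.normal(0, 1, 200) + np.linspace(0, 3, 200)])  # drift starts at t=200

mu0, k, h = 0.0, 0.25, 8.0     # in-control mean, reference value, decision threshold
s, alarm = 0.0, None
for t, x in enumerate(series):
    s = max(0.0, s + (x - mu0) - k)     # accumulate evidence of an upward shift
    if s > h:
        alarm = t
        break
print("drift signalled at sample:", alarm)
```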

Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test

Procedia PDF Downloads 273
4994 Multi-Classification Deep Learning Model for Diagnosing Different Chest Diseases

Authors: Bandhan Dey, Muhsina Bintoon Yiasha, Gulam Sulaman Choudhury

Abstract:

Chest diseases are among the most problematic ailments in everyday life, and many distinct chest diseases are known. Diagnosing them correctly plays a vital role in the process of treatment. Many methods have been developed explicitly for particular chest diseases, but the most common approach for diagnosing these diseases is through X-ray. In this paper, we propose a multi-classification deep learning model for diagnosing COVID-19, lung cancer, pneumonia, tuberculosis, and atelectasis from chest X-rays. In the present work, we used the transfer learning method for better accuracy and a fast training phase. The performance of three architectures is considered: InceptionV3, VGG-16, and VGG-19. We evaluated these deep learning architectures using public digital chest X-ray datasets with six classes (i.e., COVID-19, lung cancer, pneumonia, tuberculosis, atelectasis, and normal). The experiments are conducted on six-class classification, and we found that VGG16 outperforms the other proposed models with an accuracy of 95%.

Keywords: deep learning, image classification, X-ray images, Tensorflow, Keras, chest diseases, convolutional neural networks, multi-classification

Procedia PDF Downloads 71
4993 Application of Envelope Spectrum Analysis and Spectral Kurtosis to Diagnose Debris Fault in Bearing Using Acoustic Signals

Authors: Henry Ogbemudia Omoregbee, Mabel Usunobun Olanipekun

Abstract:

Acoustic signals from rolling element bearings running at low speed and under high radial loads have low amplitudes, particularly in the case of debris faults, whose signals necessitate highly sensitive analysis. As the rollers in the bearing roll over debris trapped in the grease used to lubricate the bearings, the envelope signal created by amplitude demodulation carries additional diagnostic information that is not available through ordinary spectrum analysis of the raw signal. The kurtosis values obtained for three different scenarios (debris-induced, outer-crack-induced, and a normal good bearing) could not by themselves be used to easily identify whether the used bearings were defective or not. It was established in this work that envelope spectrum analysis detected the fault signature and its harmonics induced in the debris bearings when bandpass filtering of the raw signal was applied with the frequency band specified by the kurtogram and spectral kurtosis.
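
The envelope-analysis chain can be sketched as below: band-pass the raw signal around the resonance band (in the paper chosen via the kurtogram/spectral kurtosis), take the Hilbert envelope, and inspect its spectrum for the fault frequency and harmonics. The fault rate, resonance frequency and band edges are illustrative assumptions.

```python
# Band-pass -> Hilbert envelope -> envelope spectrum on a simulated impact signal.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs, dur = 20000, 2.0
t = np.arange(0, dur, 1 / fs)
fault_hz, resonance_hz = 37.0, 3000.0
# short impacts at the fault rate exciting a decaying structural resonance, plus noise
impacts = (np.sin(2 * np.pi * fault_hz * t) > 0.995).astype(float)
ring = np.sin(2 * np.pi * resonance_hz * np.arange(300) / fs) * np.exp(-np.arange(300) / 60)
signal = np.convolve(impacts, ring, mode="same")
signal += 0.5 * np.random.default_rng(0).standard_normal(len(t))

b, a = butter(4, [2000, 4000], btype="bandpass", fs=fs)   # band assumed from the kurtogram
envelope = np.abs(hilbert(filtfilt(b, a, signal)))

spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
print("dominant envelope-spectrum line near (Hz):", freqs[np.argmax(spec[freqs < 200])])
```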

Keywords: rolling bearings, rolling element bearing noise, bandpass filtering, harmonics, envelope spectrum analysis, spectral kurtosis

Procedia PDF Downloads 66
4992 Normalizing Scientometric Indicators of Individual Publications Using Local Cluster Detection Methods on Citation Networks

Authors: Levente Varga, Dávid Deritei, Mária Ercsey-Ravasz, Răzvan Florian, Zsolt I. Lázár, István Papp, Ferenc Járai-Szabó

Abstract:

One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for caution. Here we present a new tool that relies on the structural properties of citation networks in order to perform cross-field normalization of the scientometric indicators of individual publications. Due to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested on different benchmark and real-world networks. Then, using this algorithm, the mechanism of the scientometric indicator normalization process is shown for a few indicators, such as the citation count, the P-index and a local version of the PageRank indicator. The fat-tailed trend of the article indicator distribution enables us to successfully perform the indicator normalization process.

Keywords: citation networks, cross-field normalization, local cluster detection, scientometric indicators

Procedia PDF Downloads 182
4991 The Role of the Injured Party's Fault in the Apportionment of Damages in Tort Law: A Comparative-Historical Study between Common Law and Islamic Law

Authors: Alireza Tavakoli Nia

Abstract:

In order to understand the role of the injured party's fault in dividing liability, we studied its historical background. In common law, the traditional contributory negligence rule was a complete defense; the legislature and judicial practice later modified that rule to one of apportionment. In Islamic law, too, the Action rule was at first used when the injured party was the sole cause, but jurists expanded the scope of this rule so that it was applied in cases where both the injured party's fault and that of the other party were involved. There are several common approaches to the apportionment of damages. Some common law countries, like Britain, have chosen ‘the causal potency approach’ and ‘fixed apportionment’, while Islamic countries, like Iran, have chosen both ‘the relative blameworthiness’ and ‘equal apportionment’ approaches. The article concludes that both common law and Islamic law believe in the division of responsibility between a wrongdoer claimant and the defendant. In the apportionment of responsibility, however, Islamic law mostly favours equal apportionment, which is simpler and saves time and money, whereas common law legal systems have chosen the causal potency approach, which is more complicated than the rival approach but is fairer.

Keywords: contributory negligence, tort law, damage apportionment, common law, Islamic law

Procedia PDF Downloads 128
4990 Experimental Study of Hyperparameter Tuning a Deep Learning Convolutional Recurrent Network for Text Classification

Authors: Bharatendra Rai

Abstract:

The sequence of words in text data has long-term dependencies and is known to suffer from vanishing gradient problems when developing deep learning models. Although recurrent networks such as long short-term memory networks help to overcome this problem, achieving high text classification performance is a challenging problem. Convolutional recurrent networks that combine the advantages of long short-term memory networks and convolutional neural networks can be useful for text classification performance improvements. However, arriving at suitable hyperparameter values for convolutional recurrent networks is still a challenging task where fitting a model requires significant computing resources. This paper illustrates the advantages of using convolutional recurrent networks for text classification with the help of statistically planned computer experiments for hyperparameter tuning.
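
The statistically planned comparison can be sketched as follows: accuracies collected from repeated runs at different hyperparameter settings are compared with Tukey's honest significant difference test in statsmodels. The accuracy values and setting names below are fabricated placeholders, not results from the paper.

```python
# Tukey HSD comparison of repeated-run accuracies across hyperparameter settings.
import numpy as np
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
settings = {"filters=32": 0.860, "filters=64": 0.885, "filters=128": 0.890}  # assumed means
rows = [(name, rng.normal(mean, 0.01))
        for name, mean in settings.items() for _ in range(8)]   # 8 replicate runs each
df = pd.DataFrame(rows, columns=["setting", "accuracy"])

print(pairwise_tukeyhsd(endog=df["accuracy"], groups=df["setting"], alpha=0.05))
```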

Keywords: long short-term memory networks, convolutional recurrent networks, text classification, hyperparameter tuning, Tukey honest significant differences

Procedia PDF Downloads 102
4989 A Novel Method for Face Detection

Authors: H. Abas Nejad, A. R. Teymoori

Abstract:

Facial expression recognition is one of the open problems in computer vision. Robust neutral face recognition in real time is a major challenge for various supervised learning-based facial expression recognition methods. This is due to the fact that supervised methods cannot accommodate all appearance variability across faces with respect to race, pose, lighting, facial biases, etc., in the limited amount of training data. Moreover, processing each and every frame to classify emotions is not required, as the user stays neutral for the majority of the time in usual applications like video chat or photo album/web browsing. Detecting the neutral state at an early stage, thereby bypassing those frames from emotion classification, would save computational power. In this work, we propose a light-weight neutral vs. emotion classification engine, which acts as a preprocessor to traditional supervised emotion classification approaches. It dynamically learns the neutral appearance at Key Emotion (KE) points using a textural statistical model constructed from a set of reference neutral frames for each user. The proposed method is made robust to various types of user head motion by accounting for affine distortions based on the textural statistical model. Robustness to dynamic shifts of the KE points is achieved by evaluating similarities on a subset of neighborhood patches around each KE point, using prior information regarding the directionality of the specific facial action units acting on the respective KE point. The proposed method, as a result, improves emotion recognition (ER) accuracy and simultaneously reduces the computational complexity of the ER system, as validated on multiple databases.

Keywords: neutral vs. emotion classification, Constrained Local Model, procrustes analysis, Local Binary Pattern Histogram, statistical model

Procedia PDF Downloads 325
4988 Object Oriented Fault Tree Analysis Methodology

Authors: Yi Xiong, Tao Kong

Abstract:

Traditional safety, risk and reliability analysis approaches are problem-oriented, which creates a great workload when analyzing complicated and large systems; moreover, much repetitive work has to be done if the analyzed system is composed of many similar components. There is a pressing need for an object- and function-oriented approach that maintains high consistency with the problem domain. A new approach is proposed to overcome these shortcomings of traditional approaches: the concepts of class, abstraction, inheritance, polymorphism and encapsulation are introduced into fault tree analysis (FTA), and a professional class library is established containing abstractions of physical objects in the real world, with four areas of relevant information proposed as a guide for establishing it. The interaction between classes is completed by internal or external methods that map attributes to basic events through a full search of the knowledge base, which provides good encapsulation. On this basis, an object-oriented fault tree analysis system is set up that analyzes and evaluates system safety and reliability according to the original appearance of the problem, where the fault tree can be mapped directly from the classes and objects of the problem domain. All system failure situations can be analyzed through this bottom-up fault tree construction approach. Under this architecture, an FTA approach is developed that avoids the influence of the analyst on the analysis results. It reveals the inherent safety problems of the analyzed system itself and provides a new way of thinking for safety analysis, so that applying object-oriented technology in the safety field is conducive to innovation in safety theory.
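
A toy sketch of the object-oriented modelling style described here (not the authors' tool) is given below: events and gates as a small class hierarchy with polymorphic probability evaluation, and a reusable component subtree instantiated twice to illustrate the class-library idea.

```python
# Minimal object-oriented fault tree: basic events, AND/OR gates, and a
# reusable "pump" subtree, with top-event probability evaluated recursively.
from dataclasses import dataclass, field
from typing import List

@dataclass
class BasicEvent:
    name: str
    probability: float
    def prob(self) -> float:
        return self.probability

@dataclass
class Gate:
    name: str
    children: List = field(default_factory=list)

class AndGate(Gate):
    def prob(self) -> float:            # all children must fail (independence assumed)
        p = 1.0
        for c in self.children:
            p *= c.prob()
        return p

class OrGate(Gate):
    def prob(self) -> float:            # at least one child fails
        p = 1.0
        for c in self.children:
            p *= 1.0 - c.prob()
        return 1.0 - p

def pump(tag):                          # reusable component class instance
    return OrGate(f"pump {tag} fails", [BasicEvent(f"{tag} motor", 0.01),
                                        BasicEvent(f"{tag} seal", 0.005)])

top = AndGate("loss of cooling", [pump("A"), pump("B")])
print(f"top event probability: {top.prob():.6f}")
```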

Keywords: FTA, knowledge base, object-oriented technology, reliability analysis

Procedia PDF Downloads 234
4987 Cervical Cell Classification Using Random Forests

Authors: Dalwinder Singh, Amandeep Verma, Manpreet Kaur, Birmohan Singh

Abstract:

The detection of pre-cancerous changes using a Pap smear test of cervical cells is an important step in the early diagnosis of cervical cancer. The Pap smear test consists of a sample of human cells taken from the cervix, which is analysed to detect the cancerous and pre-cancerous stages of the given subject. The manual analysis of these cells is a labor-intensive and time-consuming process that relies on expert cytotechnologists. In this paper, a computer-assisted system for the automated analysis of cervical cells is proposed. We propose a morphology-based approach to nucleus detection and segmentation of the cytoplasmic region of a given single or multiply-overlapped cell. Further, various texture and region-based features are calculated from these cells to classify them into normal and abnormal cells. Experimental results on a publicly available dataset show that our system has achieved a satisfactory success rate.

Keywords: cervical cancer, cervical tissue, mathematical morphology, texture features

Procedia PDF Downloads 505
4986 Seismic Behavior of Steel Moment-Resisting Frames for Uplift Permitted in Near-Fault Regions

Authors: M. Tehranizadeh, E. Shoushtari Rezvani

Abstract:

The seismic performance of steel moment-resisting frame structures is investigated considering nonlinear soil-structure interaction (SSI) effects. 10-, 15-, and 20-story planar building frames with an aspect ratio of 3 are designed in accordance with current building codes. Inelastic seismic demands of the superstructure are considered using a concentrated plasticity model. The raft foundation system is designed for different soil types. A beam-on-nonlinear-Winkler-foundation (BNWF) model is used to represent the dynamic impedance of the underlying soil. Two sets of pulse-like as well as non-pulse near-fault earthquakes are used as input ground motions. The results show that the reduction in drift demands due to nonlinear SSI is characterized by a more uniform distribution pattern along the height when compared to the fixed-base and linear SSI conditions. It is also concluded that the beneficial effects of nonlinear SSI on displacement demands are more significant in the case of pulse-like ground motions, and the performance level of steel moment-resisting frames can be enhanced.

Keywords: soil-structure interaction, uplifting, soil plasticity, near-fault earthquake, tall building

Procedia PDF Downloads 538
4985 An Architectural Model for APT Detection

Authors: Nam-Uk Kim, Sung-Hwan Kim, Tai-Myoung Chung

Abstract:

Typical security management systems are not suitable for detecting APT attacks, because they cannot draw the big picture from the trivial events reported by individual security solutions. Although SIEM solutions have security analysis engines for that purpose, their security analysis mechanisms still need to be verified in the academic field. Although this paper proposes merely an architectural model for APT detection, we will keep studying the correlation analysis mechanism in future work.

Keywords: advanced persistent threat, anomaly detection, data mining

Procedia PDF Downloads 508