Search results for: random forest classifier
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1007


467 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients

Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi

Abstract:

Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time until death. A frailty model is a random-effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of a gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were drawn from the records of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years at Imam Khomeini Hospital in Iran. In order to determine the factors affecting cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, namely the Exponential and Weibull distributions, were considered for the survival time. Data analysis was performed using the R software, and an error level of 0.05 was used for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second factor. Overall, the mean seven-year survival was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Using progressive and regressive multi-parametric survival models, Exponential and Weibull models with the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the factors affecting the time to death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions appears desirable.

Keywords: Frailty model, latent variables, liver cirrhosis, parametric distribution.

466 Blood Glucose Level Measurement from Breath Analysis

Authors: Tayyab Hassan, Talha Rehman, Qasim Abdul Aziz, Ahmad Salman

Abstract:

The constant monitoring of blood glucose levels is necessary for maintaining the health of patients and for alerting medical specialists to take preemptive measures before the onset of any complication resulting from diabetes. Current clinical monitoring of blood glucose uses invasive methods repeatedly, which are uncomfortable and may result in infections in diabetic patients. Several attempts have been made to develop non-invasive techniques for blood glucose measurement. In this regard, the existing methods are not reliable and are less accurate. Other approaches claiming high accuracy have not been tested on extended datasets, and thus their results are not statistically significant. It is a well-known fact that acetone concentration in breath has a direct relation with blood glucose level. In this paper, we have developed a first-of-its-kind, reliable, high-accuracy breath analyzer for non-invasive blood glucose measurement. The acetone concentration in breath was measured using an MQ 138 sensor in samples collected from local hospitals in Pakistan, involving one hundred patients. The blood glucose levels of these patients were determined using the conventional invasive clinical method. We propose a linear regression model that is trained to map breath acetone level to the collected blood glucose level, achieving high accuracy.
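A minimal sketch of the regression step described above, assuming scikit-learn; the acetone readings, glucose values, and coefficients below are made-up placeholders, not data from the paper.

```python
# Hypothetical sketch: fit a linear regression mapping breath acetone (ppm)
# to blood glucose (mg/dL). All numbers here are simulated placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
acetone_ppm = rng.uniform(0.3, 5.0, size=100).reshape(-1, 1)   # simulated sensor readings
glucose = 60 + 45 * acetone_ppm.ravel() + rng.normal(0, 10, size=100)

X_train, X_test, y_train, y_test = train_test_split(
    acetone_ppm, glucose, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)
print("MAE (mg/dL):", mean_absolute_error(y_test, pred))
```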

Keywords: Blood glucose level, breath acetone concentration, diabetes, linear regression.

465 Certain Data Dimension Reduction Techniques for Application with ANN-Based MCS for the Study of High Energy Showers

Authors: Gitanjali Devi, Kandarpa Kumar Sarma, Pranayee Datta, Anjana Kakoti Mahanta

Abstract:

Cosmic ray showers, after entering the Earth's atmosphere from their places of origin in space, generate secondary particles called Extensive Air Showers (EAS). Detection and analysis of EAS and similar high-energy particle showers involve a plethora of experimental setups with certain constraints, for which soft-computational tools like Artificial Neural Networks (ANNs) can be adopted. The optimality of ANN classifiers can be further enhanced by the use of a Multiple Classifier System (MCS) and certain data-dimension reduction techniques. This work describes the performance of data-dimension reduction techniques such as Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Self-Organizing Map (SOM) approximators for application with an MCS formed using a Multi-Layer Perceptron (MLP), a Recurrent Neural Network (RNN) and a Probabilistic Neural Network (PNN). The data inputs are obtained from an array of detectors placed in a circular arrangement resembling a practical detector grid; they have a high dimension and a strong correlation among themselves. The PCA, ICA and SOM blocks reduce this correlation and generate a form suitable for real-time practical applications for predicting the primary energy and location of an EAS from density values captured using detectors in a circular grid.
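A short sketch of the PCA stage of such a pipeline, assuming scikit-learn; the 64-detector grid, sample count, and number of retained components are illustrative assumptions, not values from the paper.

```python
# Sketch: compress correlated detector-density readings with PCA before an ANN.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Simulated density samples from 64 correlated detectors (1000 showers).
base = rng.normal(size=(1000, 8))
densities = base @ rng.normal(size=(8, 64)) + 0.1 * rng.normal(size=(1000, 64))

pca = PCA(n_components=8)
reduced = pca.fit_transform(densities)     # decorrelated, low-dimensional ANN inputs
print(reduced.shape, pca.explained_variance_ratio_.sum())
```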

Keywords: EAS, Shower, Core, ANN, Location.

464 Dispersed Error Control based on Error Filter Design for Improving Halftone Image Quality

Authors: Sang-Chul Kim, Sung-Il Chien

Abstract:

The error diffusion method generates worm artifacts and weakens the edges of the halftone image when a continuous gray-scale image is reproduced as a binary image. First, to enhance the edges, we propose an edge-enhancing filter that considers the quantization error information and the gradient of the neighboring pixels. Furthermore, to remove the worm artifacts that often appear in a halftone image, we adaptively add random noise to the weights of the error filter.
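An illustrative sketch only: classic Floyd-Steinberg error diffusion with small random perturbations of the error-filter weights, to show the general idea of breaking up worm artifacts. The paper's actual edge-enhancing filter and adaptive noise scheme are not reproduced here.

```python
# Floyd-Steinberg error diffusion with randomly perturbed (renormalized) weights.
import numpy as np

def noisy_error_diffusion(gray, noise_scale=0.05, seed=0):
    rng = np.random.default_rng(seed)
    img = gray.astype(float).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    base = np.array([7, 3, 5, 1]) / 16.0        # standard Floyd-Steinberg weights
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255.0 if old >= 128 else 0.0
            out[y, x] = new
            err = old - new
            w4 = np.clip(base + rng.normal(0, noise_scale, 4), 0, None)
            w4 /= w4.sum()                       # keep the total diffused error constant
            if x + 1 < w:               img[y, x + 1]     += err * w4[0]
            if y + 1 < h and x > 0:     img[y + 1, x - 1] += err * w4[1]
            if y + 1 < h:               img[y + 1, x]     += err * w4[2]
            if y + 1 < h and x + 1 < w: img[y + 1, x + 1] += err * w4[3]
    return out

halftone = noisy_error_diffusion(np.tile(np.linspace(0, 255, 64), (64, 1)))
```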

Keywords: Artifact suppression, Edge enhancement, Error diffusion method, Halftone image

463 A Neural Network Classifier for Estimation of the Degree of Infestation by Late Blight on Tomato Leaves

Authors: Gizelle K. Vianna, Gabriel V. Cunha, Gustavo S. Oliveira

Abstract:

Foliage diseases in plants can cause a reduction in both the quality and quantity of agricultural production. Intelligent detection of plant diseases is an essential research topic, as it may help in monitoring large fields of crops by automatically detecting the symptoms of foliage diseases. This work investigates ways to recognize the late blight disease from the analysis of digital images of tomatoes collected directly in the field. A pair of multilayer perceptron neural networks analyzes the digital images, using data from both the RGB and HSL color models, and classifies each image pixel. One neural network is responsible for identifying healthy regions of the tomato leaf, while the other identifies the injured regions. The outputs of both networks are combined to generate the final classification of each pixel in the image, and the pixel classes are used to repaint the original tomato images with a color representation that highlights the injuries on the plant. The new images contain only green, red or black pixels, according to whether they come from healthy portions of the leaf, injured portions, or the image background, respectively. The system presented an accuracy of 97% in the detection and estimation of the level of damage on tomato leaves caused by late blight.

Keywords: Artificial neural networks, digital image processing, pattern recognition.

462 Wavelet-Based Classification of Outdoor Natural Scenes by Resilient Neural Network

Authors: Amitabh Wahi, Sundaramurthy S.

Abstract:

Natural outdoor scene classification is an active and promising research area around the globe. In this study, the classification is carried out in two phases. In the first phase, features are extracted from the images by the wavelet decomposition method and stored in a database as feature vectors. In the second phase, neural classifiers such as the back-propagation neural network (BPNN) and the resilient back-propagation neural network (RPNN) are employed for the classification of scenes. Four hundred color images from the MIT database, belonging to the two classes forest and street, are considered. A comparative study has been carried out on the performance of the two neural classifiers BPNN and RPNN for an increasing number of test samples. RPNN showed better classification results than BPNN on the larger test sets.
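A hedged sketch of the two-phase pipeline, assuming PyWavelets and scikit-learn. scikit-learn's MLPClassifier (standard back-propagation) stands in for the BPNN; resilient propagation (RPROP) is not available there, and the images below are random placeholders, not the MIT forest/street data.

```python
# Phase 1: wavelet-energy features per image; Phase 2: a neural classifier.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_energy_features(img, wavelet="db2", level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    feats = [np.mean(np.abs(coeffs[0]))]                      # approximation energy
    for detail in coeffs[1:]:
        feats += [np.mean(np.abs(band)) for band in detail]   # H/V/D sub-band energies
    return np.array(feats)

rng = np.random.default_rng(2)
images = rng.random((40, 64, 64))
labels = rng.integers(0, 2, 40)                               # 0 = forest, 1 = street
X = np.array([wavelet_energy_features(im) for im in images])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```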

Keywords: BPNN, Classification, Feature extraction, RPNN, Wavelet.

461 Developing NAND Flash-Memory SSD-Based File System Design

Authors: Jaechun No

Abstract:

This paper focuses on I/O optimizations of N-hybrid (New-Form of hybrid), which provides a hybrid file system space constructed on SSD and HDD. Although the promising potential of SSDs, such as the absence of mechanical moving overhead and high random I/O throughput, has drawn a lot of attention from IT enterprises, the high cost-to-capacity ratio makes it less desirable to build a large-scale data storage subsystem composed of only SSDs. In this paper, we present N-hybrid, which attempts to integrate the strengths of SSD and HDD to offer a single, large hybrid file system space. Several experiments were conducted to verify the performance of N-hybrid.

Keywords: SSD, data section, I/O optimizations.

460 Texture Feature Extraction of Infrared River Ice Images using Second-Order Spatial Statistics

Authors: Bharathi P. T, P. Subashini

Abstract:

Ice cover has a significant impact on rivers, as it affects the ice-melting capacity, which can result in flooding, restrict navigation, and modify the ecosystem and microclimate. River ice is made up of different ice types with varying thickness, so surveillance of river ice plays an important role. River ice types are captured using an infrared imaging camera, which captures images even during the night. In this paper, the river ice infrared texture images are analysed using first-order and second-order statistical methods. The second-order statistical methods considered are the spatial gray level dependence method, the gray level run length method and the gray level difference method. The performance of the feature extraction methods is evaluated using a Probabilistic Neural Network classifier, and it is found that the first-order statistical method and the second-order statistical methods each yield low accuracy on their own. Therefore, the features extracted from the first-order and second-order statistical methods are combined, and it is observed that these combined features (first-order statistical method + gray level run length method) provide higher accuracy than the features from the first-order or second-order statistical methods alone.
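A minimal sketch of the spatial gray level dependence (co-occurrence) features mentioned above, assuming scikit-image version 0.19 or later (where the functions are named graycomatrix/graycoprops). A Probabilistic Neural Network is not part of scikit-learn, so only the feature-extraction step is shown, on a placeholder patch rather than real infrared river-ice imagery.

```python
# Co-occurrence (SGLDM) texture features for one image patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
ice_patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # placeholder IR patch

glcm = graycomatrix(ice_patch, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)
features = [graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")]
print(features)
```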

Keywords: Gray Level Difference Method, Gray Level Run Length Method, Kurtosis, Probabilistic Neural Network, Skewness, Spatial Gray Level Dependence Method.

459 Active Segment Selection Method in EEG Classification Using Fractal Features

Authors: Samira Vafaye Eslahi

Abstract:

A BCI (Brain Computer Interface) is a communication machine that translates brain messages into computer commands. With the help of computer programs, these machines can recognize the tasks that are imagined. Feature extraction is an important stage of the EEG classification process that can affect both the accuracy and the computation time of signal processing. In this study, we process the signal in three steps: active segment selection, fractal feature extraction, and classification. One of the great challenges in BCI applications is to improve classification accuracy and computation time together. In this paper, we have used Student's 2D sample t-statistics on continuous wavelet transforms for active segment selection to reduce the computation time. In the next stage, features are extracted from well-known fractal dimension estimates of the signal, namely the Katz and Higuchi fractal dimensions. In the classification stage, we used the ANFIS (Adaptive Neuro-Fuzzy Inference System) classifier, FKNN (Fuzzy K-Nearest Neighbors), LDA (Linear Discriminant Analysis), and SVM (Support Vector Machines). We found that the active segment selection method reduces the computation time, and that fractal dimension features with ANFIS analysis on the selected active segments perform best among the investigated methods for EEG classification.
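A sketch of the two fractal-dimension features named above (Katz and Higuchi), implemented directly from their standard definitions; the EEG segment is a synthetic placeholder, and the segment-selection and classification stages are not reproduced.

```python
# Katz and Higuchi fractal dimensions of a 1D signal segment.
import numpy as np

def katz_fd(x):
    dists = np.abs(np.diff(x))
    L = dists.sum()                              # total curve length
    d = np.max(np.abs(x - x[0]))                 # max distance from the first sample
    n = len(x) - 1
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

def higuchi_fd(x, kmax=8):
    N = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, N, k)
            diff = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((len(idx) - 1) * k)
            lengths.append(diff * norm / k)
        lk.append(np.mean(lengths))
    # slope of log(L(k)) against log(1/k) estimates the fractal dimension
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return slope

eeg = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.3 * np.random.default_rng(4).normal(size=1000)
print("Katz FD:", katz_fd(eeg), "Higuchi FD:", higuchi_fd(eeg))
```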

Keywords: EEG, Student's t-statistics, BCI, Fractal Features, ANFIS, FKNN.

458 Fused Structure and Texture (FST) Features for Improved Pedestrian Detection

Authors: Hussin K. Ragb, Vijayan K. Asari

Abstract:

In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on the combination of local phase information with texture features. Since the phase of a signal conveys more structural information than the magnitude, the phase congruency concept is used to capture the structural features. On the other hand, the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless quantity of the phase congruency and the robustness of the CSLBP operator on flat images, as well as to blur and illumination changes, make the proposed descriptor more robust and less sensitive to light variations. The proposed descriptor is formed by extracting the phase congruency and the CSLBP value of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for the local regions of the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and the low-resolution DaimlerChrysler datasets to evaluate the detection performance of a pedestrian detection system based on the FST descriptor. A linear Support Vector Machine (SVM) is used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.
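A rough sketch of the Center-Symmetric LBP (CSLBP) texture code used in the FST descriptor: each pixel is encoded by comparing the four center-symmetric pairs of its 8-neighborhood against a small threshold, giving a 4-bit (0-15) code. Phase congruency and the full FST concatenation are not reproduced here, and the image patch and threshold are illustrative assumptions.

```python
# CSLBP codes and a per-region 16-bin histogram.
import numpy as np

def cslbp(gray, T=0.01):
    g = gray.astype(float)
    c = g[1:-1, 1:-1]                       # interior pixels
    # 8-neighborhood, ordered so that index i and i+4 form a center-symmetric pair
    n = [g[0:-2, 1:-1], g[0:-2, 2:], g[1:-1, 2:], g[2:, 2:],
         g[2:, 1:-1], g[2:, 0:-2], g[1:-1, 0:-2], g[0:-2, 0:-2]]
    code = np.zeros(c.shape, dtype=int)
    for i in range(4):
        code += ((n[i] - n[i + 4]) > T).astype(int) << i
    return code

rng = np.random.default_rng(5)
patch = rng.random((32, 32))
codes = cslbp(patch)
hist, _ = np.histogram(codes, bins=16, range=(0, 16))   # local-region CSLBP histogram
print(hist)
```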

Keywords: Pedestrian detection, phase congruency, local phase, LBP features, CSLBP features, FST descriptor.

457 A Trainable Neural Network Ensemble for ECG Beat Classification

Authors: Atena Sajedin, Shokoufeh Zakernejad, Soheil Faridi, Mehrdad Javadi, Reza Ebrahimpour

Abstract:

This paper illustrates the use of a combined neural network model for the classification of electrocardiogram (ECG) beats. We present a trainable neural network ensemble approach to develop a customized electrocardiogram beat classifier, in an effort to further improve the performance of ECG processing and to offer individualized health care. We propose a three-stage technique for the detection of premature ventricular contractions (PVC) among normal beats and other heart diseases; the method comprises denoising, feature extraction and classification. First, we investigate the application of the stationary wavelet transform (SWT) for noise reduction of the ECG signals. The feature extraction module then extracts 10 ECG morphological features and one timing-interval feature. Next, a number of multilayer perceptron (MLP) neural networks with different topologies are designed. The performance of the different combination methods, as well as the efficiency of the whole system, is presented. Among them, stacked generalization, as the proposed trainable combined neural network model, possesses the highest recognition rate of around 95%. Therefore, this network proves to be a suitable candidate for ECG signal diagnosis systems. ECG samples attributed to the different ECG beat types were extracted from the MIT-BIH arrhythmia database for the study.
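A hedged sketch of the first (denoising) stage: a stationary wavelet transform of an ECG-like signal with soft-thresholding of the detail coefficients, assuming PyWavelets. The signal, wavelet, level, and threshold are placeholders; the MLP ensemble stages are not reproduced.

```python
# SWT denoising: decompose, soft-threshold the detail coefficients, reconstruct.
import numpy as np
import pywt

rng = np.random.default_rng(6)
t = np.linspace(0, 2, 1024)                     # length chosen to suit the SWT level
clean = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 8 * t)
noisy = clean + 0.2 * rng.normal(size=t.size)

coeffs = pywt.swt(noisy, "db4", level=3)        # list of (cA, cD) pairs
thr = 0.2
denoised_coeffs = [(cA, pywt.threshold(cD, thr, mode="soft")) for cA, cD in coeffs]
denoised = pywt.iswt(denoised_coeffs, "db4")
print("residual RMS:", np.sqrt(np.mean((denoised - clean) ** 2)))
```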

Keywords: ECG beat classification, combining classifiers, premature ventricular contraction (PVC), multilayer perceptrons, wavelet transform.

456 MPPT Operation for PV Grid-connected System using RBFNN and Fuzzy Classification

Authors: A. Chaouachi, R. M. Kamel, K. Nagasaka

Abstract:

This paper presents a novel methodology for Maximum Power Point Tracking (MPPT) of a grid-connected 20 kW photovoltaic (PV) system using a neuro-fuzzy network. The proposed method predicts the reference PV voltage that guarantees optimal power transfer between the PV generator and the main utility grid. The neuro-fuzzy network is composed of a fuzzy rule-based classifier and three Radial Basis Function Neural Networks (RBFNN). The network inputs (irradiance and temperature) are classified before being fed into the appropriate RBFNN for either the training or the estimation process, while the output is the reference voltage. The main advantage of the proposed methodology, compared to a conventional single neural network approach, is its superior generalization ability with respect to the nonlinear and dynamic behavior of a PV generator. In fact, the neuro-fuzzy network is a neural-network-based multi-model machine learning scheme that defines a set of local models emulating the complex and non-linear behavior of a PV generator under a wide range of operating conditions. Simulation results under several rapid irradiance variations show that the proposed MPPT method achieves the highest efficiency compared to a conventional single neural network.

Keywords: MPPT, neuro-fuzzy, RBFN, grid-connected, photovoltaic.

455 Urban Environmental Challenges in Developing Cities: The Case of Ethiopian Capital Addis Ababa

Authors: Dubbale Daniel A., Tsutsumi J., Michael J. Bendewald

Abstract:

Addis Ababa is the seat of the African Union (AU), the United Nations Economic Commission for Africa (UN-ECA) and hundreds of embassies and consular representatives. It is one of the highest capitals in the world, with an average elevation of 2,400 meters above sea level. It is a dichotomous city, a blend of modern high-rises and deteriorating slum quarters. Water supply and sanitation, waste management and housing continue to be serious problems. Forest-wood-based domestic energy use, as well as uncontrolled emissions from mobile and fixed sources, has endangered the state of the urban environment. Analysis based on satellite imagery has revealed the deterioration of the urban environment within the last three decades. The recently restructured city administration has brought improvements in the condition of the urban environment; however, the overwhelming size of the challenges faced by the city has dwarfed its fairly good results.

Keywords: Addis Ababa, Urban environment, Slum, Housing, Relocation

454 The Response Relation between Climate Change and NDVI over the Qinghai-Tibet plateau

Authors: Shen Weishou, Ji Di, Zhang Hui, Yan Shouguang, Li Haidong, Lin Naifeng

Abstract:

Based on a long-term NDVI dataset and meteorological data from 68 meteorological stations in the Qinghai-Tibet Plateau, the relations between vegetation and the major climate factors were analyzed. The results show the following: 1) The linear trends of temperature in the Qinghai-Tibet Plateau indicate that the temperature in the plateau generally increased, but rose faster in the last 20 years. 2) The most significant NDVI increase occurred in the eastern and southern plateau, whereas the western and northern plateau show a decreasing trend. 3) There is a significant positive linear correlation between NDVI and temperature and a negative correlation between NDVI and mean wind speed; however, no significant statistical relationship was found between NDVI and relative humidity, precipitation or sunshine duration. 4) The changes in NDVI over the plateau as a whole are driven by temperature and precipitation, but for the desert and forest areas the dominant factors change to precipitation-temperature-wind velocity and wind velocity-temperature-precipitation, respectively.

Keywords: Qinghai-Tibet plateau, NDVI, climate warming.

453 Night-Time Traffic Light Detection Based On SVM with Geometric Moment Features

Authors: Hyun-Koo Kim, Young-Nam Shin, Sa-gong Kuk, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective traffic light detection method for night-time. First, candidate blobs of traffic lights are extracted from the RGB color image. The input image is represented in a dominant color domain using the color transform proposed by Ruta, and red- and green-dominant regions are selected as candidates. After candidate blob selection, we apply a shape filter for noise reduction using blob information such as length, area, and bounding-box area. A multi-class classifier based on an SVM (Support Vector Machine) is then applied to the candidates. Three kinds of features are used: basic features such as blob width, height, center coordinates and area; brightness-based stochastic features; and, in particular, geometric moment values computed between a candidate region and its adjacent region, which are proposed to improve the detection performance. The proposed system was implemented on an Intel Core CPU with 2.80 GHz and 4 GB RAM and tested with urban and rural road videos. Through these tests, we show that the proposed method using PF, BMF, and GMF reaches a detection rate of up to 93% with an average computation time of 15 ms/frame.

Keywords: Night-time traffic light detection, multi-class classification, driving assistance system.

452 Effective Traffic Lights Recognition Method for Real Time Driving Assistance System in the Daytime

Authors: Hyun-Koo Kim, Ju H. Park, Ho-Youl Jung

Abstract:

This paper presents an effective traffic light recognition method for the daytime. First, the Potential Traffic Lights Detector (PTLD) uses the full color information of the YCbCr image and produces binary images of the green and red traffic lights. After the PTLD step, a Shape Filter (SF) is used to remove noise such as traffic signs, street trees, vehicles, and buildings. The noise-removal criteria consist of blob properties of the binary image, such as length, area, and bounding-box area. Finally, after an intermediate association step whose goal is to define relevant candidate regions from the previously detected traffic lights, an Adaptive Multi-class Classifier (AMC) is executed. The classification method uses Haar-like features and the AdaBoost algorithm. For the evaluation, the method was implemented on an Intel Core CPU with 2.80 GHz and 4 GB RAM and tested on urban and rural roads. Through these tests, we compared our method with standard object-recognition learning processes and showed that it reaches a detection rate of up to 94%, which is better than the results achieved with cascade classifiers. The computation time of our proposed method is 15 ms.

Keywords: Traffic Light Detection, Multi-class Classification, Driving Assistance System, Haar-like Feature, Color Segmentation Method, Shape Filter

451 Self Watermarking based on Visual Cryptography

Authors: Mahmoud A. Hassan, Mohammed A. Khalili

Abstract:

We propose a simple watermarking method based on visual cryptography. The method is based on the selection of specific pixels from the original image instead of the random selection of pixels as in Hwang's scheme [1]. Verification information is generated and later used to verify the ownership of the image, without the need to embed the watermark pattern into the original digital data. Experimental results show that the proposed method can recover the watermark pattern from the marked data even if some changes are made to the original digital data.

Keywords: Watermarking, visual cryptography, visual threshold.

450 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images

Authors: Jameela Ali Alkrimi, Loay E. George, Azizah Suliman, Abdul Rahim Ahmad, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. Anemia, a lack of RBCs, is characterized by a hemoglobin level below normal. In this study, an image-processing-based methodology was developed to localize and extract RBCs from microscopic images, and a machine learning approach was adopted to classify the localized anemic RBC images. Several textural and geometrical features are calculated for each extracted RBC. The training set of features was analyzed using principal component analysis (PCA). With the proposed method, RBCs were isolated in 4.3 seconds from an image containing 18 to 27 cells. The reasons for using PCA are its low computational complexity and its suitability for finding the most discriminating features, which can lead to accurate classification decisions. Our classification yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, the support vector machine (SVM), and the RBFNN neural network, respectively. Classification was also evaluated in terms of sensitivity, specificity, and the kappa statistic. In conclusion, the classification results were obtained within a short time period, and the results improved when PCA was used.
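A sketch of the classification stage described above: PCA on the extracted features followed by K-NN and SVM classifiers, assuming scikit-learn. The feature matrix, class count, and number of retained components are placeholders, not the paper's RBC features.

```python
# PCA + K-NN / SVM classification on a placeholder feature matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(300, 20))                  # 20 features per extracted cell
y = rng.integers(0, 3, 300)                     # e.g. three RBC classes

for name, clf in [("K-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf"))]:
    pipe = make_pipeline(StandardScaler(), PCA(n_components=8), clf)
    score = cross_val_score(pipe, X, y, cv=5).mean()
    print(name, "CV accuracy:", round(score, 3))
```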

Keywords: Red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC.

449 Combined Feature Based Hyperspectral Image Classification Technique Using Support Vector Machines

Authors: Mrs. K. Kavitha, S. Arivazhagan

Abstract:

A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying the heterogeneous classes present in hyperspectral images. The classification accuracy can be improved only if both the feature extraction and the classifier selection are proper. As the classes in hyperspectral images are assumed to have different textures, a textural classification is adopted. Run-length feature extraction is employed along with principal components and independent components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected. The Gray Level Run Length Matrix (GLRLM) is calculated for the first forty of the selected bands, and from the GLRLMs the run-length features for individual pixels are calculated. The principal components are calculated for another forty bands, and the independent components for the remaining forty bands. As the principal and independent components have the ability to represent the textural content of pixels, they are treated as features. The run-length features, principal components, and independent components together form the combined features used for classification. An SVM with a binary hierarchical tree is used to classify the hyperspectral image. Results are validated against ground truth, and accuracies are calculated.

Keywords: Multi-class, Run Length features, PCA, ICA, classification and Support Vector Machines.

448 Influence of Deficient Materials on the Reliability of Reinforced Concrete Members

Authors: Sami W. Tabsh

Abstract:

The strength of reinforced concrete depends on the member dimensions and material properties. The properties of concrete and steel materials are not constant but random variables. The variability of concrete strength is due to batching errors, variations in mixing, cement quality uncertainties, differences in the degree of compaction and disparity in curing. Similarly, the variability of steel strength is attributed to the manufacturing process, rolling conditions, characteristics of the base material, uncertainties in chemical composition, and the microstructure-property relationships. To account for such uncertainties, codes of practice for reinforced concrete design impose resistance factors to ensure structural reliability over the useful life of the structure. In this investigation, the effects of reductions in concrete and reinforcing steel strengths from the nominal values, beyond those accounted for in the structural design codes, on the structural reliability are assessed. The considered limit states are flexure, shear and axial compression based on the ACI 318-11 structural concrete building code. Structural safety is measured in terms of a reliability index. Probabilistic resistance and load models are compiled from the available literature. The study showed that there is a wide variation in the reliability index for reinforced concrete members designed for flexure, shear or axial compression, especially when the live-to-dead load ratio is low. Furthermore, variations in concrete strength have a minor effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a severe effect on the reliability of columns in axial compression. On the other hand, changes in steel yield strength have a great effect on the reliability of beams in flexure, a moderate effect on the reliability of beams in shear, and a mild effect on the reliability of columns in axial compression. Based on the outcome, it can be concluded that the reliability of beams is sensitive to changes in the yield strength of the steel reinforcement, whereas the reliability of columns is sensitive to variations in the concrete strength. Since the embedded target reliability in structural design codes results in lower structural safety in beams than in columns, large reductions in material strengths compromise the structural safety of beams much more than they affect columns.
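A toy sketch of the reliability-index idea discussed above: the failure probability P(R < Q) is estimated by Monte Carlo for assumed resistance and load-effect distributions and converted to a reliability index. The distributions and parameters are illustrative only and are not taken from the paper.

```python
# Monte Carlo estimate of the failure probability and reliability index beta.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
n = 1_000_000
R = rng.lognormal(mean=np.log(500), sigma=0.10, size=n)   # member resistance (kN*m), assumed
Q = rng.normal(loc=300, scale=45, size=n)                 # load effect (kN*m), assumed

pf = np.mean(R < Q)                                        # failure probability
beta = -norm.ppf(pf)                                       # reliability index
print(f"Pf = {pf:.2e}, beta = {beta:.2f}")
```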

Keywords: Code, flexure, limit states, random variables, reinforced concrete, reliability, reliability index, shear, structural safety.

447 Intelligent Transport System: Classification of Traffic Signs Using Deep Neural Networks in Real Time

Authors: Anukriti Kumar, Tanmay Singh, Dinesh Kumar Vishwakarma

Abstract:

Traffic control has been one of the most common and irritating problems since automobiles first hit the roads. Problems like traffic congestion have led to a significant time burden around the world, and one significant solution to these problems can be the proper implementation of an Intelligent Transport System (ITS). It involves the integration of various tools like smart sensors, artificial intelligence, positioning technologies and mobile data services to manage traffic flow, reduce congestion and enhance drivers' ability to avoid accidents during adverse weather. Road and traffic sign recognition is an emerging field of research in ITS. The traffic sign classification problem needs to be solved, as it is a major step in the journey towards building semi-autonomous/autonomous driving systems. This work focuses on implementing an approach to solve the traffic sign classification problem by developing a Convolutional Neural Network (CNN) classifier using the GTSRB (German Traffic Sign Recognition Benchmark) dataset. Rather than using hand-crafted features, our model addresses the concern of an exploding number of parameters and relies on data augmentation methods. Our model achieved an accuracy of around 97.6%, which is comparable to various state-of-the-art architectures.
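A minimal sketch of a CNN classifier for the 43-class GTSRB task, assuming TensorFlow/Keras is available; the layer sizes and 32x32 input are illustrative choices, not the paper's exact architecture. Only the model definition is shown; data loading, augmentation, and training are omitted.

```python
# Small CNN for 43-class traffic-sign classification (architecture only).
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 3)),             # resized traffic-sign patches
    layers.Conv2D(32, 3, activation="relu"),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Dropout(0.25),
    layers.Conv2D(64, 3, activation="relu"),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Dropout(0.25),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(43, activation="softmax"),       # 43 GTSRB sign classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```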

Keywords: Multiclass classification, convolution neural network, OpenCV, Data Augmentation.

446 A 2D-3D Hybrid Vision System for Robotic Manipulation of Randomly Oriented Objects

Authors: Moulay A. Akhloufi

Abstract:

This paper presents a new vision technique for the robotic manipulation of randomly oriented objects in industrial applications. The proposed approach uses 2D and 3D vision for efficiently extracting the 3D pose of an object in the presence of multiple randomly positioned objects. 2D vision is used to quickly select the objects of interest for 3D processing with a new modified ICP algorithm (FaR-ICP), thus significantly reducing the processing time. The extracted 3D pose is then sent to the robot manipulator for picking. The tests show that the proposed system achieves high performance.

Keywords: 3D vision, Hand-Eye calibration, robot visual servoing, random bin picking.

445 Long-Term Simulation of Digestive Sound Signals by CEPSTRAL Technique

Authors: Einalou Z., Najafi Z., Maghooli K., Zandi Y., Sheibeigi A.

Abstract:

In this study, an investigation of digestive diseases has been carried out in which sound acts as the detection medium. After preprocessing, the extracted signal is registered in the cepstrum domain. After classification of the digestive diseases, the system selects random samples based on their features and generates the nonstationary, long-term signals of interest via an inverse transform in the cepstral domain, which are presented in digital and sonic form as the output. This structure is updatable; in other words, when a new signal is received, the corresponding disease classification is updated in the feature domain.
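A short sketch of the cepstral representation mentioned above: the real cepstrum of a synthetic bowel-sound-like signal computed via FFT, log-magnitude, and inverse FFT. The signal parameters are placeholders, and the classification and long-term synthesis stages are not reproduced.

```python
# Real cepstrum of a synthetic signal.
import numpy as np

rng = np.random.default_rng(9)
fs = 4000
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 180 * t) * np.exp(-3 * t) + 0.05 * rng.normal(size=t.size)

spectrum = np.fft.rfft(signal)
real_cepstrum = np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))
print(real_cepstrum[:10])
```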

Keywords: Cepstrum, databank, digestive disease, acoustic signal.

444 Path Planning of a Robot Manipulator using Retrieval RRT Strategy

Authors: K. Oh, J. P. Hwang, E. Kim, H. Lee

Abstract:

This paper presents an algorithm which extends the rapidly-exploring random tree (RRT) framework to deal with changes in the task environment. This algorithm, called the Retrieval RRT Strategy (RRS), combines a support vector machine (SVM) and RRT and plans the robot motion in the presence of changes in the surrounding environment. The algorithm consists of two levels. At the first level, the SVM is built and selects a proper path from the bank of RRTs for a given environment. At the second level, a real path is planned by the RRT planners for the given environment. The suggested method is applied to the control of a KUKA™ commercial 6-DOF robot manipulator, and its feasibility and efficiency are demonstrated via the co-simulation of MATLAB™ and RecurDyn™.

Keywords: Path planning, RRT, 6 DOF manipulator, SVM.

443 Performance Comparison of Situation-Aware Models for Activating Robot Vacuum Cleaner in a Smart Home

Authors: Seongcheol Kwon, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

We assume an IoT-based smart-home environment where the on-off status of each of the electrical appliances, including the room lights, can be recognized in real time by monitoring and analyzing the smart meter data. At any moment in such an environment, we can recognize what the household or the user is doing by referring to the status data of the appliances. In this paper, we focus on a smart-home service that activates a robot vacuum cleaner at the right time by recognizing the user situation, which requires a situation-aware model that can distinguish the situations that allow vacuum cleaning (Yes) from those that do not (No). We learn as our candidate models a few classifiers such as naïve Bayes, decision tree, and logistic regression that can map the appliance-status data into Yes and No situations. Our training and test data are obtained from simulations of user behaviors, in which a sequence of user situations such as cooking, eating, dish washing, and so on is generated with the status of the relevant appliances changed in accordance with the situation changes. During the simulation, both the situation transition and the resulting appliance status are determined stochastically. To compare the performances of the aforementioned classifiers we obtain their learning curves for different types of users through simulations. The result of our empirical study reveals that naïve Bayes achieves a slightly better classification accuracy than the other compared classifiers.
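A hedged sketch of the classifier comparison described above, on simulated binary appliance-status vectors labelled Yes/No for "vacuuming allowed", assuming scikit-learn. The generative labelling rule, appliance count, and noise level are invented purely for illustration.

```python
# Compare naive Bayes, a decision tree, and logistic regression on simulated
# on/off appliance-status data.
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(10)
n_samples, n_appliances = 2000, 12
X = rng.integers(0, 2, size=(n_samples, n_appliances))     # on/off status of appliances
# Toy rule: cleaning is allowed when the TV (col 0) and the stove (col 1) are off.
y = ((X[:, 0] == 0) & (X[:, 1] == 0)).astype(int)
flip = rng.random(n_samples) < 0.05                         # 5% label noise
y = np.where(flip, 1 - y, y)

for name, clf in [("naive Bayes", BernoulliNB()),
                  ("decision tree", DecisionTreeClassifier(max_depth=5)),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```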

Keywords: Situation-awareness, Smart home, IoT, Machine learning, Classifier.

442 Simulated Annealing Application for Structural Optimization

Authors: Farhad Kolahan, M. Hossein Abolbashari, Samaeddin Mohitzadeh

Abstract:

Several methods are available for the weight and shape optimization of structures, among which Evolutionary Structural Optimization (ESO) is one of the most widely used. In ESO, however, the optimization criterion is completely case-dependent, and only improving solutions are accepted during the search. In this paper, a Simulated Annealing (SA) algorithm is used for a structural optimization problem. This algorithm differs from other random search methods by also accepting non-improving solutions. The implementation of the SA algorithm focuses on reducing the number of finite element analyses (function evaluations). Computational results show that SA can efficiently and effectively solve such optimization problems within a short search time.
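A generic simulated annealing sketch illustrating the acceptance of non-improving solutions mentioned above. It minimizes a simple test function rather than a finite-element compliance objective, which is outside the scope of a short example; the cooling schedule and step size are assumed values.

```python
# Simulated annealing on a toy multimodal function.
import math
import random

def objective(x):
    return x * x + 10 * math.sin(3 * x)          # toy multimodal objective

def simulated_annealing(x0, T0=10.0, alpha=0.95, steps=2000, seed=0):
    rnd = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    T = T0
    for _ in range(steps):
        cand = x + rnd.gauss(0, 0.5)              # random neighbor
        fc = objective(cand)
        # accept improving moves always, worsening moves with Boltzmann probability
        if fc < fx or rnd.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        T *= alpha                                # geometric cooling schedule
    return best, fbest

print(simulated_annealing(5.0))
```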

Keywords: Simulated annealing, Structural optimization, Compliance, C.V. product.

441 Comparison of Frequency Estimation Methods for Reflected Signals in Mobile Platforms

Authors: Kathrin Reinhold

Abstract:

Precise frequency estimation methods for pulse-shaped echoes are a prerequisite for determining the relative velocity between sensor and reflector. Signal frequencies are analysed using three different methods: the Fourier Transform, the Chirp Z-Transform and the MUSIC algorithm. Simulations of echoes are performed varying both the noise level and the number of reflecting points. The superposition of echoes with a random initial phase is found to influence the precision of the frequency estimation severely for the FFT and MUSIC. The standard deviation of the estimated frequency is larger for the FFT than for MUSIC; however, MUSIC is more noise-sensitive. The distorting effect of superpositions is less pronounced in experimental data.
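A small sketch of FFT-based frequency estimation for a noisy pulse-shaped echo, the simplest of the three compared methods; MUSIC and the Chirp Z-Transform are not reproduced here, and the sample rate, carrier frequency, and envelope are illustrative placeholders.

```python
# Estimate the carrier frequency of a noisy, pulse-shaped echo from the FFT peak.
import numpy as np

rng = np.random.default_rng(11)
fs = 1.0e6                                       # sample rate (Hz), assumed
f_true = 40_000.0                                # echo carrier frequency (Hz), assumed
t = np.arange(0, 2e-3, 1 / fs)
envelope = np.exp(-((t - 1e-3) ** 2) / (2 * (2e-4) ** 2))
echo = envelope * np.sin(2 * np.pi * f_true * t) + 0.1 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(echo * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
f_est = freqs[np.argmax(spectrum)]
print(f"true {f_true:.0f} Hz, estimated {f_est:.0f} Hz")
```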

Keywords: Frequency estimation, pulse-echo-method, superposition, echoes.

440 Travel Time Model for Cylinder Type Parking System

Authors: Jing Zhang, Jie Chen

Abstract:

In this paper, we analyze an automated parking system in which the storage and retrieval requests are performed by a tower crane. In this parking system, the S/R crane, located at the middle of the bottom of the cylindrical parking area, can rotate both clockwise and counterclockwise, and three kinds of movements can be performed simultaneously. We develop mathematical travel time models for the single-command cycle under the random storage assignment, using the characteristics of this system. Finally, we compare these travel time models with the discrete case and show that they display satisfactory performance.

Keywords: Parking system, travel time model, tower crane.

439 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding-window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.

Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.

438 Performance Assessment of Multi-Level Ensemble for Multi-Class Problems

Authors: Rodolfo Lorbieski, Silvia Modesto Nassar

Abstract:

Many supervised machine learning tasks require decision making across numerous different classes. Multi-class classification has several applications, such as face recognition, text recognition and medical diagnostics. The objective of this article is to analyze an adapted method of Stacking for multi-class problems, which combines ensembles within the ensemble itself. For this purpose, a training scheme similar to Stacking was used, but with three levels, where the final decision-maker (level 2) performs its training by combining outputs from the pair of meta-classifiers (level 1) drawn from tree-based and Bayesian families. These are in turn trained by pairs of base classifiers (level 0) of the same family. This strategy seeks to promote diversity among the ensembles forming the level-2 meta-classifier. Three performance measures were used: (1) accuracy, (2) area under the ROC curve, and (3) time, for three factors: (a) datasets, (b) experiments and (c) levels. To compare the factors, a three-way ANOVA test was executed for each performance measure, considering 5 datasets by 25 experiments by 3 levels. A triple interaction between factors was observed only for time. The accuracy and area under the ROC curve presented similar results, showing a double interaction between level and experiment, as well as with the dataset factor. It was concluded that level 2 had an average performance above the other levels and that the proposed method is especially efficient for multi-class problems when compared to binary problems.
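A hedged sketch of stacking ensembles inside an ensemble, in the spirit of the three-level scheme above: level-0 base classifiers feed level-1 stacked meta-classifiers, whose outputs are combined by a level-2 decision maker. The specific classifier families, dataset, and final estimator are illustrative choices, not the paper's exact setup.

```python
# Nested stacking with scikit-learn: level-1 stacks are themselves estimators
# inside a level-2 StackingClassifier.
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def level1_stack(seed):
    # one level-1 meta-classifier trained on a pair of level-0 base classifiers
    return StackingClassifier(
        estimators=[("tree", DecisionTreeClassifier(max_depth=3, random_state=seed)),
                    ("nb", GaussianNB())],
        final_estimator=LogisticRegression(max_iter=1000))

level2 = StackingClassifier(
    estimators=[("s1", level1_stack(0)), ("s2", level1_stack(1))],
    final_estimator=LogisticRegression(max_iter=1000))

X, y = load_iris(return_X_y=True)               # stand-in multi-class dataset
print("level-2 CV accuracy:", cross_val_score(level2, X, y, cv=5).mean())
```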

Keywords: Stacking, multi-layers, ensemble, multi-class.
