Search results for: Global Classifier.
998 Fused Structure and Texture (FST) Features for Improved Pedestrian Detection
Authors: Hussin K. Ragb, Vijayan K. Asari
Abstract:
In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on combining local phase information with texture features. Since the phase of a signal conveys more structural information than its magnitude, the phase congruency concept is used to capture structural features, while the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless nature of phase congruency and the robustness of the CSLBP operator to flat regions, blur, and illumination changes make the proposed descriptor more robust and less sensitive to lighting variations. The descriptor is formed by extracting the phase congruency and CSLBP values of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for local regions of the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and the low-resolution DaimlerChrysler datasets to evaluate the detection performance of a pedestrian detection system based on the FST descriptor, with a linear Support Vector Machine (SVM) used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor outperforms a set of state-of-the-art feature extraction methodologies.
Keywords: Pedestrian detection, phase congruency, local phase, LBP features, CSLBP features, FST descriptor.
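As an illustration of the texture half of the FST descriptor, the following minimal NumPy sketch computes per-pixel CSLBP codes and concatenates cell histograms; the comparison threshold, cell size, neighbor ordering, and wrap-around border handling are assumptions, not the authors' implementation (the phase congruency half is omitted).

```python
import numpy as np

def cslbp(image, threshold=0.01):
    """Center-Symmetric LBP: compare the 4 center-symmetric neighbor pairs
    of every pixel, giving a 4-bit code (0-15) per pixel."""
    img = image.astype(np.float64)
    # 8-neighborhood offsets listed so that pair i is offsets[i] vs offsets[i + 4]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    shifted = [np.roll(np.roll(img, dy, axis=0), dx, axis=1) for dy, dx in offsets]
    code = np.zeros_like(img, dtype=np.uint8)
    for i in range(4):
        code |= ((shifted[i] - shifted[i + 4]) > threshold).astype(np.uint8) << i
    return code

def cslbp_histograms(image, cell=16):
    """Concatenate normalized 16-bin CSLBP histograms over non-overlapping cells."""
    code = cslbp(image)
    h, w = code.shape
    feats = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            hist, _ = np.histogram(code[y:y + cell, x:x + cell], bins=16, range=(0, 16))
            feats.append(hist / max(hist.sum(), 1))
    return np.concatenate(feats)

# Example: a 64x128 pedestrian window -> texture part of the FST feature vector
window = np.random.rand(128, 64)
print(cslbp_histograms(window).shape)
```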
997 A New Stability Analysis and Stabilization of Discrete-Time Switched Linear Systems Using Vector Norms Approach
Authors: Marwen Kermani, Anis Sakly, Faouzi M'sahli
Abstract:
In this paper, we aim to investigate a new stability analysis for discrete-time switched linear systems based on comparison, the overvaluing principle, the application of the Borne-Gentina criterion, and the Kotelyanski conditions. These stability conditions, issued from vector norms, correspond to a vector Lyapunov function. In fact, the switched system to be controlled is represented in companion form. A comparison system relative to a regular vector norm is used in order to obtain the simple arrow form of the state matrix, which allows a suitable use of the Borne-Gentina criterion for establishing sufficient conditions for global asymptotic stability. The proposed approach could be a constructive solution to the state and static output feedback stabilization problems.
Keywords: Discrete-time switched linear systems, Global asymptotic stability, Vector norms, Borne-Gentina criterion, Arrow form state matrix, Arbitrary switching, State feedback controller, Static output feedback controller.
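As a numerical illustration of the vector-norms approach, the sketch below checks Kotelyanski-type conditions for a discrete-time comparison system z(k+1) = M z(k) with a nonnegative comparison matrix M: asymptotic stability is equivalent to all leading principal minors of I - M being positive. The 3x3 matrix is a toy example, not the paper's arrow-form system.

```python
import numpy as np

def kotelyanski_stable(M):
    """For a nonnegative comparison (overvaluing) matrix M of the discrete-time
    comparison system z(k+1) = M z(k), asymptotic stability holds iff I - M is a
    nonsingular M-matrix, i.e. all leading principal minors of I - M are positive."""
    A = np.eye(M.shape[0]) - np.asarray(M, dtype=float)
    minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
    return all(m > 0 for m in minors), minors

# Toy nonnegative comparison matrix (assumed values, not taken from the paper)
M = np.array([[0.30, 0.20, 0.10],
              [0.05, 0.40, 0.25],
              [0.10, 0.15, 0.35]])
stable, minors = kotelyanski_stable(M)
print(stable, [round(m, 4) for m in minors])
# Cross-check: the spectral radius of M must be below one
print(np.max(np.abs(np.linalg.eigvals(M))) < 1)
```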
996 Carbon Dioxide Capture and Storage: A General Review on Adsorbents
Authors: Mohammad Songolzadeh, Maryam Takht Ravanchi, Mansooreh Soleimani
Abstract:
CO2 is the primary anthropogenic greenhouse gas, accounting for 77% of the human contribution to the greenhouse effect in 2004. In recent years, the global concentration of CO2 in the atmosphere has been increasing rapidly, and CO2 emissions have an impact on global climate change. Anthropogenic CO2 is emitted primarily from fossil fuel combustion. Carbon capture and storage (CCS) is one option for reducing CO2 emissions. There are three major approaches to CCS: post-combustion capture, pre-combustion capture, and the oxyfuel process. Post-combustion capture offers some advantages, as existing combustion technologies can still be used without radical changes to them. Several post-combustion gas separation and capture technologies are being investigated, namely (a) absorption, (b) cryogenic separation, (c) membrane separation, (d) microalgal biofixation, and (e) adsorption. Apart from establishing new techniques, the exploration of capture materials with high separation performance and low capital cost is of paramount importance. The application of adsorption, in particular, requires easily regenerable and durable adsorbents with a high CO2 adsorption capacity, and it has recently been reported that the cost of CO2 capture can be reduced by using this technology. In this paper, the research progress (from experimental results) in adsorbents for CO2 adsorption, storage, and separation is reviewed, and future research directions are suggested as well.
Keywords: Carbon capture and storage, pre-combustion, post-combustion, adsorption.
995 Analysis of a Spatiotemporal Phytoplankton Dynamics: Higher Order Stability and Pattern Formation
Authors: Randhir Singh Baghel, Joydip Dhar, Renu Jain
Abstract:
In this paper, for the understanding of phytoplankton dynamics in a marine ecosystem, a susceptible and an infected class of phytoplankton population are considered in a spatiotemporal domain. Here, the susceptible phytoplankton grows logistically, and the growth of the infected phytoplankton is governed by an instantaneous Holling type-II infection response function. The dynamics are studied in terms of the local and global stability of the system, and the possibility of a Hopf bifurcation is further explored, taking the half-saturation period as the bifurcation parameter in the temporal domain. It is also observed that the reaction-diffusion system exhibits spatiotemporal chaos and pattern formation in the phytoplankton dynamics, which plays a particularly important role for the spatially extended phytoplankton system. The effect of the diffusion coefficient on the spatial system is also obtained for both the one- and two-dimensional cases. Furthermore, we explore the higher-order stability analysis of the spatial phytoplankton system for both the linear and the non-linear system. Finally, a few numerical simulations are carried out for pattern formation.
Keywords: Phytoplankton dynamics, Reaction-diffusion system, Local stability, Hopf-bifurcation, Global stability, Chaos, Pattern Formation, Higher-order stability analysis.
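A minimal one-dimensional sketch of a susceptible-infected phytoplankton reaction-diffusion model with logistic growth and a Holling type-II infection term; the functional form, parameter values, and explicit Euler discretization are illustrative assumptions rather than the exact model of the paper.

```python
import numpy as np

# Assumed parameters: logistic growth (r, K), Holling type-II infection (beta, a),
# mortality of infected cells (mu), and diffusion coefficients (Ds, Di).
r, K = 1.0, 1.0
beta, a = 0.8, 0.4
mu = 0.3
Ds, Di = 0.05, 0.05
L_dom, N, dt, steps = 50.0, 200, 0.01, 20000
dx = L_dom / N

rng = np.random.default_rng(0)
S = 0.5 + 0.01 * rng.standard_normal(N)   # susceptible phytoplankton density
I = 0.1 + 0.01 * rng.standard_normal(N)   # infected phytoplankton density

def laplacian(u):
    # Second-order finite difference with periodic boundary conditions
    return (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2

for _ in range(steps):
    infection = beta * S * I / (a + S)                 # Holling type-II response
    dS = r * S * (1 - S / K) - infection + Ds * laplacian(S)
    dI = infection - mu * I + Di * laplacian(I)
    S, I = S + dt * dS, I + dt * dI

print(S.mean(), I.mean())   # spatial patterns can be inspected by plotting S and I
```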
994 The Impact of Corporate Governance Regulation in the Nigerian Banking Sector
Authors: Simisola I. Akintoye, Sunday K. Iyaniwura
Abstract:
Recent global corporate failures have increased the need to regulate corporate governance across the world. In Nigeria, weak corporate governance regulation in the banking sector has reached epidemic levels, contributing to the country's economic depression. This study critically evaluates Nigeria's corporate governance regime and explores how weak regulation has impacted the banking sector. By adopting a socio-legal methodology, the study analyses both theoretical and empirical works from a socio-scientific point of view to examine the role of Nigeria's legal, cultural, and social arrangements in corporate governance regulation. The study reveals that Nigeria's institutional arrangements have contributed to its weak system of corporate governance regulation, with adverse effects on the banking sector. The research mainly contributes to current global corporate governance literature on sub-Saharan Africa by adding to knowledge of the peculiarities of corporate governance regulation in different institutional jurisdictions. The particular focus on emerging economies such as Nigeria underlines the need for countries to develop a bespoke system of corporate governance regulation that takes into consideration the peculiarities of individual countries, devoid of external influence.
Keywords: Banks, corporate governance, emerging economies, Nigeria.
993 Impact of Terrorism as an Asymmetrical Threat on the State's Conventional Security Forces
Authors: Igor Pejic
Abstract:
The main focus of this research is on analyzing the correlative links between terrorism as an asymmetrical threat and the consequences it leaves on conventional security forces. The methodology behind the research includes qualitative research methods focusing on comparative analysis of books, scientific papers, documents and other sources, in order to deduce, explore and formulate the results of the research. With the coming of the 21st century and the rise of a multi-polar world, new threats quickly emerged. The realist approach in international relations deems that relations among nations are in a constant state of anarchy, since there are no definitive rules and the distribution of power varies widely. International relations are further characterized by egoistic and self-oriented human nature, anarchy or the absence of a higher government, security concerns, and a lack of morality. The asymmetry of power is also reflected in countries' security capabilities and their ability to project power. With the coming of the new millennium and the rising multi-polar world order, the asymmetry of power can also be added as an important trait of global society, one which consequently brought new threats. Among various others, terrorism is probably the most well-known, well-based and well-spread asymmetric threat. In today's global political arena, terrorism is used by state and non-state actors to fulfill their political agendas. Terrorism is used as an all-inclusive tool for regime change, subversion or revolution. Although the nature of terrorist groups is somewhat inconsistent, terrorism as a security and social phenomenon has one constant, which is reflected in its political dimension. The state's security apparatus, embodied in the form of conventional armed forces, is now becoming fragile, unable to tackle new threats and, to a certain extent, outdated. Conventional security forces were designed to defend against or engage an exterior threat that is more or less symmetric and visible. On the other hand, terrorism as an asymmetrical threat is a part of hybrid, special or asymmetric warfare, in which specialized units, institutions or facilities represent the primary pillars of security. In today's global society, terrorism is probably the most acute problem, one which can paralyze entire countries and their political systems. This problem, however, cannot be engaged on an open field of battle; rather, it requires a different approach in which conventional armed forces cannot be used traditionally and their role must be adjusted. The research will try to shed light on the phenomenon of modern-day terrorism and to prove its correlation with the state's conventional armed forces. States are obliged to adjust their security apparatus to the new realities of global society and to terrorism as an asymmetrical threat, which is a side-product of an unbalanced world.
Keywords: Asymmetrical warfare, conventional forces, security, terrorism.
992 Assessment of Water Resources and Inculcation of Controlled Water Consumption System
Authors: Vakhtang Geladze, Nana Bolashvili, Tamazi Karalashvili, Nino Machavariani, Vajha Neidze, Nana Kvirkvelia, Tamar Chichinadze
Abstract:
Deficiency of fresh water is a vital global problem today. It must be taken into consideration that in the near future the fresh water crisis will become even more acute owing to global climate warming and fast desertification processes in the world. Georgia signed the association agreement with the European Union last year, in which the priority spheres of cooperation are the management of water resources, the development of a trans-boundary approach to the problem, and active participation in the "European Union water initiative" component for Eastern Europe, the Caucasus and Central Asia. Fresh water resources are the main natural wealth of Georgia. According to the average water layer height, Georgia is behind only such European countries as Norway, Switzerland and Austria. The annual average water provision of Georgia is 4-8 times higher than in its neighboring countries Armenia and Azerbaijan. Despite abundant water resources in Georgia, there is a considerable discrepancy between their volume and their use in some regions because of the uneven territorial distribution. In East Georgia, the water supply of the territory and population is four times less than in West Georgia.
Keywords: GIS, sociological survey, water consumption, water resources.
991 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers that combines sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score, by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: Anomaly detection, autoencoder, data centers, deep learning.
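A minimal sketch of the per-sensor reconstruction idea using tf.keras and scikit-learn: one LSTM autoencoder per sensor is trained on normal windows, and a random forest classifies samples from reconstruction-error features. The window length, layer sizes, and synthetic data are assumptions, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf
from sklearn.ensemble import RandomForestClassifier

WIN = 32   # assumed window length (samples per sequence)

def make_autoencoder(win=WIN):
    inp = tf.keras.Input(shape=(win, 1))
    z = tf.keras.layers.LSTM(16)(inp)                          # encoder
    z = tf.keras.layers.RepeatVector(win)(z)
    out = tf.keras.layers.LSTM(16, return_sequences=True)(z)   # decoder
    out = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1))(out)
    model = tf.keras.Model(inp, out)
    model.compile(optimizer="adam", loss="mse")
    return model

# Synthetic "normal" windows for two sensors (e.g. temperature and power)
rng = np.random.default_rng(0)
normal = {s: rng.normal(0, 1, size=(500, WIN, 1)) for s in ("temp", "power")}
autoencoders = {s: make_autoencoder() for s in normal}
for s, ae in autoencoders.items():
    ae.fit(normal[s], normal[s], epochs=3, batch_size=64, verbose=0)

def error_features(windows_by_sensor):
    """Mean and max absolute reconstruction error per sensor -> one feature row per window."""
    feats = []
    for s, ae in autoencoders.items():
        err = np.abs(windows_by_sensor[s] - ae.predict(windows_by_sensor[s], verbose=0))
        feats += [err.mean(axis=(1, 2)), err.max(axis=(1, 2))]
    return np.column_stack(feats)

# Labeled evaluation windows; anomalies are simulated here with a level shift
X_eval = {s: rng.normal(0, 1, size=(200, WIN, 1)) for s in normal}
y_eval = np.r_[np.zeros(150, int), np.ones(50, int)]
for s in X_eval:
    X_eval[s][150:] += 3.0
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(error_features(X_eval), y_eval)   # in practice, split train/assessment over time
```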
990 A Trainable Neural Network Ensemble for ECG Beat Classification
Authors: Atena Sajedin, Shokoufeh Zakernejad, Soheil Faridi, Mehrdad Javadi, Reza Ebrahimpour
Abstract:
This paper illustrates the use of a combined neural network model for the classification of electrocardiogram (ECG) beats. We present a trainable neural network ensemble approach to develop a customized ECG beat classifier, in an effort to further improve the performance of ECG processing and to offer individualized health care. We present a three-stage technique for the detection of premature ventricular contractions (PVC) among normal beats and other heart diseases, consisting of denoising, feature extraction and classification. At first, we investigate the application of the stationary wavelet transform (SWT) for noise reduction of the ECG signals. The feature extraction module then extracts 10 ECG morphological features and one timing interval feature. A number of multilayer perceptron (MLP) neural networks with different topologies are then designed. The performance of the different combination methods as well as the efficiency of the whole system is presented. Among them, stacked generalization, as the proposed trainable combined neural network model, possesses the highest recognition rate of around 95%. Therefore, this network proves to be a suitable candidate for ECG signal diagnosis systems. ECG samples attributed to the different ECG beat types were extracted from the MIT-BIH arrhythmia database for the study.
989 MPPT Operation for PV Grid-connected System using RBFNN and Fuzzy Classification
Authors: A. Chaouachi, R. M. Kamel, K. Nagasaka
Abstract:
This paper presents a novel methodology for Maximum Power Point Tracking (MPPT) of a grid-connected 20 kW Photovoltaic (PV) system using a neuro-fuzzy network. The proposed method predicts the reference PV voltage guaranteeing optimal power transfer between the PV generator and the main utility grid. The neuro-fuzzy network is composed of a fuzzy rule-based classifier and three Radial Basis Function Neural Networks (RBFNN). The inputs of the network (irradiance and temperature) are classified before they are fed into the appropriate RBFNN for either the training or the estimation process, while the output is the reference voltage. The main advantage of the proposed methodology, compared to a conventional single neural network-based approach, is its distinct generalization ability with regard to the nonlinear and dynamic behavior of a PV generator. In fact, the neuro-fuzzy network is a neural network-based multi-model machine learning approach that defines a set of local models emulating the complex and non-linear behavior of a PV generator under a wide range of operating conditions. Simulation results under several rapid irradiance variations proved that the proposed MPPT method achieves the highest efficiency compared to a conventional single neural network.
Keywords: MPPT, neuro-fuzzy, RBFN, grid-connected, photovoltaic.
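A minimal sketch of the neuro-fuzzy structure: a crisp rule-based classifier (standing in for the fuzzy rule base) routes each (irradiance, temperature) input to one of three local RBF networks, each trained to predict the reference voltage. The thresholds, network sizes, and synthetic data are illustrative assumptions, not the paper's design.

```python
import numpy as np
from sklearn.cluster import KMeans

class RBFNN:
    """Tiny radial basis function network: K-means centers, Gaussian units, linear output."""
    def __init__(self, n_centers=8, sigma=150.0):
        self.n_centers, self.sigma = n_centers, sigma

    def _phi(self, X):
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * self.sigma ** 2))

    def fit(self, X, y):
        self.centers = KMeans(self.n_centers, n_init=10, random_state=0).fit(X).cluster_centers_
        Phi = np.c_[self._phi(X), np.ones(len(X))]          # add bias column
        self.w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return self

    def predict(self, X):
        return np.c_[self._phi(X), np.ones(len(X))] @ self.w

def crisp_class(irradiance, temperature):
    """Crisp stand-in for the fuzzy classifier: 0 = low, 1 = medium, 2 = high irradiance."""
    return 0 if irradiance < 300 else (1 if irradiance < 700 else 2)

# Synthetic data: [irradiance W/m^2, temperature C] -> assumed reference voltage
rng = np.random.default_rng(0)
X = np.column_stack([rng.uniform(50, 1000, 2000), rng.uniform(10, 60, 2000)])
y = 30 + 0.005 * X[:, 0] - 0.1 * X[:, 1] + rng.normal(0, 0.2, 2000)

models = {c: RBFNN().fit(X[np.array([crisp_class(*r) == c for r in X])],
                         y[np.array([crisp_class(*r) == c for r in X])]) for c in range(3)}

sample = np.array([[850.0, 35.0]])
print(models[crisp_class(*sample[0])].predict(sample))       # predicted reference voltage
```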
988 Migrating Words and Voices in Joseph O'Neill's Netherland and The Dog
Authors: Masami Usui
Abstract:
The 21st century has already witnessed the rapid globalization of catastrophes caused by layered political, social, religious, cultural, and environmental conflicts. The post-9/11 literature that reflects these characteristics retells the experiences of those who are, whether directly or indirectly, involved in globalized catastrophes that enlarge and endanger their boundaries and consequences. With an Irish-Turkish origin, a Dutch and British educational background, and as an American green-card holder, Joseph O'Neill challenges these changing circumstances of the expanding crisis. In his controversial novel Netherland (2008), O'Neill embodies the deeply rooted compromises, the transplanted conflicts, and the human internalized crisis in post-9/11 New York City. O'Neill presents to us the transition between Netherland and New York with a post-colonial perspective. These internalized conflicts are revisited in The Dog (2014), in which Dubai, a newly constructed and expanding global city of gold, represents the transitional location from New York City. Through these two novels, words and voices migrate beyond cultural and political boundaries, discussing what a collective mind embodies in this globalized society.
Keywords: American literature, global literature, cultural studies, political science.
987 Researches on Simulation and Validation of Airborne Enhanced Ground Proximity Warning System
Authors: Ma Shidong, He Yuncheng, Wang Zhong, Yang Guoqing
Abstract:
In this paper, an enhanced ground proximity warning simulation and validation system is designed and implemented. First, based on a square-grid and sub-grid structure, the global digital terrain database is designed and constructed. Terrain data searching is implemented by querying the latitude and longitude bands and separated zones of the global terrain database with the current aircraft position. A combination of dynamic scheduling and hierarchical scheduling is adopted to schedule the terrain data, so that terrain data can be read and deleted dynamically in memory. Secondly, according to the scope, distance and approach speed of the dangerous terrain ahead, and using a security-profile calculation method, collision threat detection is executed in real time and provides caution and warning alarms. According to this scheme, the enhanced ground proximity warning simulation system is realized. Simulations are carried out to verify good real-time performance in terrain display and alarm triggering, and the results show that the simulation system works correctly, reasonably and stably.
Keywords: enhanced ground proximity warning system, digital terrain, look-ahead terrain alarm, terrain display, simulation and validation
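A minimal sketch of indexing terrain tiles by latitude/longitude bands and dynamically scheduling them in memory around the current aircraft position; the one-degree band size, tile keys, and cache policy are assumptions, not the paper's database layout.

```python
import math

BAND_DEG = 1.0   # assumed: each tile covers a 1 x 1 degree cell

def tile_key(lat, lon):
    """Key of the grid cell (latitude band, longitude band) containing a point."""
    return (math.floor(lat / BAND_DEG), math.floor(lon / BAND_DEG))

def tiles_ahead(lat, lon, look_ahead_deg=0.5):
    """All tile keys intersecting a look-ahead box around the aircraft position."""
    keys = set()
    for dlat in (-look_ahead_deg, 0, look_ahead_deg):
        for dlon in (-look_ahead_deg, 0, look_ahead_deg):
            keys.add(tile_key(lat + dlat, lon + dlon))
    return keys

cache = {}   # dynamically scheduled tiles kept in memory

def schedule(lat, lon, load_tile):
    """Load newly needed tiles and drop tiles no longer needed (dynamic scheduling)."""
    needed = tiles_ahead(lat, lon)
    for key in needed - set(cache):
        cache[key] = load_tile(key)
    for key in set(cache) - needed:
        del cache[key]

# Usage with a dummy loader that returns a flat elevation grid
schedule(39.98, 116.31, load_tile=lambda key: [[0.0] * 60] * 60)
print(sorted(cache.keys()))
```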
986 Night-Time Traffic Light Detection Based On SVM with Geometric Moment Features
Authors: Hyun-Koo Kim, Young-Nam Shin, Sa-gong Kuk, Ju H. Park, Ho-Youl Jung
Abstract:
This paper presents an effective traffic light detection method for night-time. First, candidate blobs of traffic lights are extracted from the RGB color image. The input image is represented in a dominant-color domain using the color transform proposed by Ruta, and red and green color-dominant regions are selected as candidates. After candidate blob selection, we carry out shape filtering for noise reduction using blob information such as length, area, and bounding-box area. A multi-class classifier based on SVM (Support Vector Machine) is applied to the candidates. Three kinds of features are used. We use basic features such as blob width, height, center coordinates, and area. Brightness-based stochastic features are also used. In particular, geometric moment values between the candidate region and adjacent regions are proposed and used to improve the detection performance. The proposed system is implemented on an Intel Core CPU with 2.80 GHz and 4 GB RAM and tested on urban and rural road videos. Through the test, we show that the proposed method using PF, BMF, and GMF reaches a detection rate of up to 93% with an average computation time of 15 ms/frame.
Keywords: Night-time traffic light detection, multi-class classification, driving assistance system.
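A minimal OpenCV/scikit-learn sketch of the candidate-blob and moment-feature idea: color thresholding yields red/green candidate blobs, each blob is described with simple geometric and Hu-moment features, and a multi-class SVM classifies it. The color thresholds, feature set, and placeholder training data are assumptions (a plain HSV threshold stands in for the Ruta color transform).

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def candidate_blobs(bgr):
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    red = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))
    green = cv2.inRange(hsv, (45, 120, 120), (90, 255, 255))
    mask = cv2.bitwise_or(red, green)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > 20]   # shape filter: drop tiny noise

def blob_features(contour):
    x, y, w, h = cv2.boundingRect(contour)
    area = cv2.contourArea(contour)
    hu = cv2.HuMoments(cv2.moments(contour)).flatten()
    hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)          # log-scaled Hu moments
    return np.r_[w, h, area, area / (w * h + 1e-6), hu]       # 11-dimensional feature vector

# Training features of labeled blobs (0 = not a light, 1 = red light, 2 = green light);
# random values are placeholders for features extracted from annotated frames.
X_train = np.random.rand(90, 11)
y_train = np.repeat([0, 1, 2], 30)
clf = SVC(kernel="rbf", C=10.0).fit(X_train, y_train)

frame = np.zeros((480, 640, 3), dtype=np.uint8)               # stand-in night frame
for c in candidate_blobs(frame):
    print(clf.predict(blob_features(c).reshape(1, -1)))
```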
985 Effective Traffic Lights Recognition Method for Real Time Driving Assistance System in the Daytime
Authors: Hyun-Koo Kim, Ju H. Park, Ho-Youl Jung
Abstract:
This paper presents an effective traffic light recognition method for the daytime. First, the Potential Traffic Lights Detector (PTLD) uses the whole color source of the YCbCr channel image and produces binary images of green and red traffic lights. After the PTLD step, a Shape Filter (SF) is used to remove noise such as traffic signs, street trees, vehicles, and buildings. Here, the noise-removal properties consist of blob information from the binary image: length, area, bounding-box area, etc. Finally, after an intermediate association step whose goal is to define relevant candidate regions from the previously detected traffic lights, the Adaptive Multi-class Classifier (AMC) is executed. The classification method uses Haar-like features and the Adaboost algorithm. For simulation, the method was implemented on an Intel Core CPU with 2.80 GHz and 4 GB RAM and tested on urban and rural roads. Through the test, we compared our method with standard object-recognition learning processes and showed that it reaches a detection rate of up to 94%, which is better than the results achieved with cascade classifiers. The computation time of our proposed method is 15 ms.
Keywords: Traffic Light Detection, Multi-class Classification, Driving Assistance System, Haar-like Feature, Color Segmentation Method, Shape Filter
984 Multi-Hazard Risk Assessment and Management in Tourism Industry- A Case Study from the Island of Taiwan
Authors: Chung-Hung Tsai
Abstract:
Global environmental changes lead to an increased frequency and scale of natural disasters, and Taiwan is under the influence of global warming and extreme weather. Therefore, vulnerability has increased, and the variability and complexity of disasters are relatively enhanced. The purpose of this study is to consider the source and magnitude of hazard characteristics for the tourism industry. Using modern risk management concepts and integrating related domestic and international basic research, this work goes beyond the Taiwan typhoon disaster risk assessment model and the evaluation of loss. The loss evaluation index system considers the impact of extreme weather, in particular heavy rain, on the tourism industry in Taiwan. Considering the compound impact of extreme climate disasters on the tourism industry, we attempt to build a multi-hazard risk assessment model and to offer strategies and suggestions. The risk analysis results are expected to provide government departments, tourism industry asset owners, insurance companies and banks with the tourist disaster risk information necessary for effective natural disaster risk management in the tourism industry.
Keywords: Tourism industry, extreme weather, multi-hazard, vulnerability analysis, loss exceeding probability, risk management.
983 Investigating Ultra Violet (UV) Strength against Different Level of Altitude using New Environmental Data Management System
Authors: M. Amir Abas, M. Dahlui
Abstract:
This paper presents the results of UV measurements at different altitudes and the development of a new portable instrument for measuring UV. The rapid growth of industrial sectors in developing countries, including Malaysia, brings not only income to the nation but also causes pollution in various forms. Air pollution is one of the significant contributors to global warming by depleting the ozone layer, which would reduce the filtration of UV rays. Long exposure to high levels of UV rays has many devastating health effects on mankind, directly or indirectly, through the destruction of natural resources. This study aimed to show the correlation between UV and altitude, which can indirectly help predict ozone depletion. An instrument was designed to measure and monitor the level of UV. The instrument comprises two main blocks, namely a data logger and a Graphical User Interface (GUI). Three sensors were used in the data logger to detect changes in temperature, humidity and ultraviolet. The system underwent experimental measurements to capture data under two different conditions: an industrial area and a high-altitude area. The performance of the instrument showed consistency in the data captured, and the results of the experiment showed significantly high readings of UV at high altitudes.
Keywords: Ozone Layer, Monitoring, Global Warming, Measurement, Ultraviolet
982 Isolation and Classification of Red Blood Cells in Anemic Microscopic Images
Authors: Jameela Ali Alkrimi, Loay E. George, Azizah Suliman, Abdul Rahim Ahmad, Karim Al-Jashamy
Abstract:
Red blood cells (RBCs) are among the most commonly and intensively studied types of blood cells in cell biology. Anemia, a lack of RBCs, is characterized by a hemoglobin level below the normal level. In this study, an image processing-based methodology was developed to localize and extract RBCs from microscopic images. A machine learning approach is also adopted to classify the localized anemic RBC images. Several textural and geometrical features are calculated for each extracted RBC. The training set of features was analyzed using principal component analysis (PCA). With the proposed method, RBCs were isolated in 4.3 seconds from an image containing 18 to 27 cells. The reasons behind using PCA are its low computational complexity and its suitability for finding the most discriminating features, which can lead to accurate classification decisions. Our classifier algorithms yielded accuracy rates of 100%, 99.99%, and 96.50% for the K-nearest neighbor (K-NN) algorithm, the support vector machine (SVM), and the RBFNN neural network, respectively. Classification was evaluated in terms of sensitivity, specificity, and kappa statistical parameters. In conclusion, the classification results were obtained within a short time period, and the results became better when PCA was used.
Keywords: Red blood cells, pre-processing image algorithms, classification algorithms, principal component analysis PCA, confusion matrix, kappa statistical parameters, ROC.
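A minimal sketch of the classification stage: feature vectors of extracted RBCs are projected with PCA and classified with K-NN and SVM, reporting accuracy and kappa. The synthetic feature matrix stands in for the real textural and geometrical features.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Synthetic stand-in for textural/geometrical RBC features (3 assumed classes)
X, y = make_classification(n_samples=600, n_features=30, n_informative=10,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=0.95).fit(X_tr)       # keep components explaining 95% of variance
X_tr_p, X_te_p = pca.transform(X_tr), pca.transform(X_te)

for name, clf in [("K-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf", C=10.0))]:
    y_pred = clf.fit(X_tr_p, y_tr).predict(X_te_p)
    print(name, accuracy_score(y_te, y_pred), cohen_kappa_score(y_te, y_pred))
```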
981 Combined Feature Based Hyperspectral Image Classification Technique Using Support Vector Machines
Authors: Mrs. K. Kavitha, S. Arivazhagan
Abstract:
A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying the heterogeneous classes present in hyperspectral images. The classification accuracy can be improved only if both feature extraction and classifier selection are proper. As the classes in hyperspectral images are assumed to have different textures, textural classification is entertained. Run-length feature extraction is employed along with principal components and independent components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected. The Gray Level Run Length Matrix (GLRLM) is calculated for forty of the selected bands, and from the GLRLMs the run-length features for individual pixels are calculated. The principal components are calculated for another forty bands, and the independent components for the remaining forty bands. As the principal and independent components are able to represent the textural content of pixels, they are treated as features. The run-length features, principal components, and independent components are combined to form the Combined Features, which are used for classification. An SVM with a binary hierarchical tree is used to classify the hyperspectral image. Results are validated with ground truth, and accuracies are calculated.
Keywords: Multi-class, Run Length features, PCA, ICA, classification and Support Vector Machines.
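A minimal sketch of the combined-feature idea on synthetic per-pixel spectra: PCA and ICA components are extracted from separate band groups, concatenated, and classified with a multi-class SVM; the GLRLM run-length features and the binary hierarchical tree are omitted, and the band counts and data are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_pixels, n_bands, n_classes = 3000, 120, 5
y = rng.integers(0, n_classes, n_pixels)
X = rng.normal(0, 1, (n_pixels, n_bands)) + y[:, None] * 0.5   # class-dependent spectra

group_a, group_b = X[:, :40], X[:, 40:80]                      # two 40-band groups
pca_feats = PCA(n_components=10).fit_transform(group_a)
ica_feats = FastICA(n_components=10, random_state=0).fit_transform(group_b)
combined = np.hstack([pca_feats, ica_feats])                    # combined feature vector

X_tr, X_te, y_tr, y_te = train_test_split(combined, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0).fit(X_tr, y_tr)                 # multi-class SVM (one-vs-one)
print(accuracy_score(y_te, clf.predict(X_te)))
```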
980 Intelligent Transport System: Classification of Traffic Signs Using Deep Neural Networks in Real Time
Authors: Anukriti Kumar, Tanmay Singh, Dinesh Kumar Vishwakarma
Abstract:
Traffic control has been one of the most common and irritating problems since automobiles first hit the roads. Problems like traffic congestion impose a significant time burden around the world, and one significant solution can be the proper implementation of the Intelligent Transport System (ITS). It involves the integration of various tools like smart sensors, artificial intelligence, positioning technologies and mobile data services to manage traffic flow, reduce congestion and enhance drivers' ability to avoid accidents during adverse weather. Road and traffic sign recognition is an emerging field of research in ITS. The traffic sign classification problem needs to be solved, as it is a major step towards building semi-autonomous/autonomous driving systems. This work focuses on implementing an approach to traffic sign classification by developing a Convolutional Neural Network (CNN) classifier using the GTSRB (German Traffic Sign Recognition Benchmark) dataset. Rather than using hand-crafted features, our model addresses the concern of an exploding number of parameters and uses data augmentation methods. Our model achieved an accuracy of around 97.6%, which is comparable to various state-of-the-art architectures.
Keywords: Multiclass classification, convolution neural network, OpenCV, Data Augmentation.
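A minimal tf.keras sketch of a CNN classifier for the 43 GTSRB classes on 32x32 RGB crops with simple data augmentation; the architecture and training settings are illustrative assumptions, not the network of the paper.

```python
import tensorflow as tf

NUM_CLASSES = 43   # GTSRB has 43 traffic sign classes

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Simple on-the-fly data augmentation (shifts and zoom), as hinted at in the abstract
augment = tf.keras.preprocessing.image.ImageDataGenerator(
    width_shift_range=0.1, height_shift_range=0.1, zoom_range=0.1)
# model.fit(augment.flow(x_train, y_train, batch_size=64), epochs=20,
#           validation_data=(x_val, y_val))   # x_*, y_* loaded from the GTSRB dataset
```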
979 New Insights for Soft Skills Development in Vietnamese Business Schools: Defining Essential Soft Skills for Maximizing Graduates’ Career Success
Authors: Hang T. T. Truong, Ronald S. Laura, Kylie Shaw
Abstract:
Within Vietnam's system of higher education, its schools of business play a vital role in supporting the country's economic objectives. However, the crucial contribution of soft skills to maximal success within the business sector has to date not been adequately recognized by its business schools. This being so, the development of the business school curriculum in Vietnam has not been able to catch up with the burgeoning need of students for a comprehensive soft skills program designed to meet the national and global business objectives of their potential employers. The burden of the present paper is first to reveal the results of our survey in Vietnam, which makes explicit the extent to which major Vietnamese industrial employers value the potential role that soft-skill competencies can play in maximizing business success. Our final task is to determine which soft skills employers discern as best serving to maximize the economic interests of Vietnam within the global marketplace. Semi-structured telephone interviews were conducted with 15 representative head employers of Vietnam's reputedly largest and most successful business enterprises across the country. The findings of the study indicate that all respondents highly value the increasing importance of soft skills in business success. Our critical analysis of the respondent data reveals that 19 essential soft skills are deemed by employers as integral to business workplace efficacy and should thus be integrated into the formal business curriculum. We are confident that our study represents the first comprehensive and specific survey yet undertaken within the business sector in Vietnam that accesses and analyses the opinions of representative employers from major companies across the country in regard to the growing importance of 19 specific soft skills essential for maximizing overall business success. Our research findings also reveal that integrating the soft skills we have identified into business school curricula nationwide is of paramount importance to advance the national and global economic interests of Vietnam.
Keywords: Business curriculum, business graduates, employers’ perception, soft skills.
978 Performance Comparison of Situation-Aware Models for Activating Robot Vacuum Cleaner in a Smart Home
Authors: Seongcheol Kwon, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
We assume an IoT-based smart-home environment where the on-off status of each electrical appliance, including the room lights, can be recognized in real time by monitoring and analyzing smart meter data. At any moment in such an environment, we can recognize what the household or the user is doing by referring to the status data of the appliances. In this paper, we focus on a smart-home service that activates a robot vacuum cleaner at the right time by recognizing the user's situation, which requires a situation-aware model that can distinguish the situations that allow vacuum cleaning (Yes) from those that do not (No). As candidate models, we learn a few classifiers, such as naïve Bayes, decision tree, and logistic regression, that map the appliance-status data to Yes and No situations. Our training and test data are obtained from simulations of user behaviors, in which a sequence of user situations such as cooking, eating, dish washing, and so on is generated, with the status of the relevant appliances changed in accordance with the situation changes. During the simulation, both the situation transitions and the resulting appliance status are determined stochastically. To compare the performance of the aforementioned classifiers, we obtain their learning curves for different types of users through simulations. The result of our empirical study reveals that naïve Bayes achieves a slightly better classification accuracy than the other compared classifiers.
Keywords: Situation-awareness, Smart home, IoT, Machine learning, Classifier.
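A minimal scikit-learn sketch of the comparison: binary appliance on/off vectors are mapped to a Yes/No cleaning label by naïve Bayes, a decision tree, and logistic regression; the simulated appliance data and labeling rule are assumptions.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_appliances = 2000, 12
X = rng.integers(0, 2, size=(n_samples, n_appliances))   # simulated on/off status vectors
# Assumed rule: cleaning is allowed when the TV (col 0) and the stove (col 1) are off
y = ((X[:, 0] == 0) & (X[:, 1] == 0)).astype(int)
y ^= rng.random(n_samples) < 0.05                        # 5% label noise

for name, clf in [("naive Bayes", BernoulliNB()),
                  ("decision tree", DecisionTreeClassifier(max_depth=5)),
                  ("logistic regression", LogisticRegression(max_iter=1000))]:
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f}")
```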
977 Integrated Drunken Driving Prevention System
Authors: T. Shyam Ramanath, A. Sudharsan, A. Kavitha
Abstract:
Needless to say, a majority of accidents are due to drunk driving, and as yet there is no effective mechanism to prevent this. Here, we have designed an integrated system for this purpose. Alcohol content in the driver's body is detected by means of an infrared breath analyzer placed at the steering wheel. An infrared cell directs infrared energy through the sample, and any unabsorbed energy at the other side is detected. The higher the concentration of ethanol, the more infrared absorption occurs (in much the same way that a sunglass lens absorbs visible light, alcohol absorbs infrared light). Thus the alcohol level of the driver is continuously monitored and calibrated on a scale. When it exceeds a particular limit, the fuel supply is cut off. If the device is removed, the fuel supply will likewise be automatically cut off or an alarm sounded, depending upon the requirement. This does not happen abruptly, and special indicators are fixed at the back to avoid inconvenience to other drivers on the highway. A framework for the integration of the sensors and the control module in a scalable multi-agent system is provided. An SMS containing the current GPS location of the vehicle is sent via a GSM module to the police control room to alert the police. The system is foolproof, and the driver cannot tamper with it easily. Thus it provides an effective and cost-effective solution to the problem of drunk driving in vehicles.
Keywords: Global system monitoring, global positioning system.
976 Development of a Technology Assessment Model by Patents and Customers' Review Data
Authors: Kisik Song, Sungjoo Lee
Abstract:
Recent years have seen an increasing number of patent disputes due to excessive competition in the global market and a reduced technology life-cycle; this has increased the risk of investment in technology development. While many global companies have started developing methodologies to identify and assess promising technologies for decision making, the existing methodologies still have some limitations. Post hoc assessments of new technologies are not being performed, especially to determine whether the suggested technologies turned out to be promising. For example, in existing quantitative patent analysis, a patent's citation information has served as an important metric for quality assessment, but this analysis cannot be applied to recently registered patents because such information accumulates over time. Therefore, we propose a new technology assessment model that can replace citation information and positively affect technological development, based on post hoc analysis of the patents for promising technologies. Additionally, we collect customer reviews on a target technology to extract keywords that show the customers' needs, and we determine how many keywords are covered by the new technology. Finally, we construct a portfolio (based on a technology assessment from patent information) and a customer-based marketability assessment (based on review data), and we use them to visualize the characteristics of the new technologies.
Keywords: Technology assessment, patents, citation information, opinion mining.
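A minimal sketch of the review-based step: the top TF-IDF keywords are extracted from customer reviews and their coverage in a patent abstract of the target technology is counted; the texts and the coverage rule are illustrative assumptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy review corpus and patent abstract (assumed, for illustration only)
reviews = [
    "battery life is too short and charging is slow",
    "love the camera but the battery drains fast",
    "fast charging would make this device perfect",
]
patent_abstract = "a fast charging circuit that extends battery life of mobile devices"

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(reviews)
scores = tfidf.sum(axis=0).A1                        # aggregate keyword importance
terms = vec.get_feature_names_out()
top_keywords = [t for t, _ in sorted(zip(terms, scores), key=lambda p: -p[1])[:5]]

covered = [kw for kw in top_keywords if kw in patent_abstract.lower()]
print(top_keywords, f"coverage = {len(covered)}/{len(top_keywords)}")
```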
975 Constructing a New World Order through a Narrative of Infrastructural Development: The Case of the BRICS
Authors: Carolijn Van Noort
Abstract:
The aim of this research is to understand how the emerging power bloc BRICS employs infrastructure development narratives to construct a new world order. BRICS is an international body consisting of five emerging countries that collaborate on economic and political issues: Brazil, Russia, India, China, and South Africa. This study explores the projection of infrastructure development narratives through an analysis of BRICS’ attention to infrastructure investment and financing, its support of the New Partnership on African Development and the establishment of the New Development Bank in Shanghai. The theory of Strategic Narratives is used to explore BRICS’ commitment to infrastructure development and to distinguish three layers: system narratives (BRICS as a global actor to propose development reform), identity narratives (BRICS as a collective identity joining efforts to act upon development aspirations) and issue narratives (BRICS committed to a range of issues of which infrastructure development is prominent). The methodology that is employed is a narrative analysis of BRICS’ official documents, media statements, and website imagery. A comparison of these narratives illuminates tensions at the three layers and among the five member states. Identifying tensions among development infrastructure narratives provides an indication of how policymaking for infrastructure development could be improved. Subsequently, it advances BRICS’ ability to act as a global actor to construct a new world order.
Keywords: BRICS, emerging powers, infrastructural development, strategic narratives.
974 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is done manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm and by using a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.
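A minimal sketch of the pipeline: sliding windows over an EEG channel, a few statistical and energy features per window, and a multilayer perceptron classifier; the window length, feature set, and synthetic signal are assumptions, not the Training Builder configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

FS, WIN = 256, 512                       # sampling rate (Hz) and 2-second windows (assumed)

def window_features(signal, win=WIN):
    feats, starts = [], []
    for start in range(0, len(signal) - win, win):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(),
                      np.abs(np.diff(w)).mean(),   # line length
                      (w ** 2).mean()])            # mean energy
        starts.append(start)
    return np.array(feats), np.array(starts)

# Synthetic EEG: background noise with a higher-amplitude oscillatory "seizure" segment
rng = np.random.default_rng(0)
eeg = rng.normal(0, 1, FS * 600)
seizure_mask = np.zeros_like(eeg, dtype=bool)
seizure_mask[FS * 200: FS * 230] = True
eeg[seizure_mask] += 4 * np.sin(2 * np.pi * 7 * np.arange(seizure_mask.sum()) / FS)

X, starts = window_features(eeg)
y = seizure_mask[starts + WIN // 2].astype(int)      # label each window by its midpoint
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
print(accuracy_score(y_te, clf.fit(X_tr, y_tr).predict(X_te)))
```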
973 Dynamic Features Selection for Heart Disease Classification
Authors: Walid MOUDANI
Abstract:
The healthcare environment is generally perceived as being information rich yet knowledge poor, and there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered from the application of data mining techniques in healthcare systems. In this study, a proficient methodology is presented for the extraction of significant patterns from Coronary Heart Disease data warehouses for heart attack prediction, which unfortunately continues to be a leading cause of mortality in the whole world. For this purpose, we propose to dynamically enumerate the optimal subsets of the reduced features of high interest by using the rough sets technique associated with dynamic programming. We then propose to validate the classification using a Random Forest (RF) decision tree to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration in order to define the disease and its risk factors, and to establish significant knowledge relationships among the medical factors. A computer-aided system is developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated based on a set of benchmark techniques applied to this classification problem.
Keywords: Multi-Classifier Decisions Tree, Features Reduction, Dynamic Programming, Rough Sets.
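A minimal scikit-learn sketch of reducing the feature set before classification and validating with a Random Forest; since rough sets with dynamic programming are not available in scikit-learn, a greedy sequential selector stands in for the reduct search, and a public dataset stands in for the clinical data.

```python
from sklearn.datasets import load_breast_cancer   # stand-in for the clinical dataset
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

rf = RandomForestClassifier(n_estimators=50, random_state=0)
# Greedy forward selection of a small feature subset (a stand-in for the rough-set reduct)
selector = SequentialFeatureSelector(rf, n_features_to_select=8, direction="forward", cv=3)
selector.fit(X, y)
X_reduced = selector.transform(X)

print("selected feature indices:", selector.get_support().nonzero()[0])
print("cross-validated accuracy:", cross_val_score(rf, X_reduced, y, cv=5).mean())
```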
972 Performance Assessment of Multi-Level Ensemble for Multi-Class Problems
Authors: Rodolfo Lorbieski, Silvia Modesto Nassar
Abstract:
Many supervised machine learning tasks require decision making across numerous different classes. Multi-class classification has several applications, such as face recognition, text recognition and medical diagnostics. The objective of this article is to analyze an adapted method of Stacking for multi-class problems, which combines ensembles within the ensemble itself. For this purpose, a training procedure similar to Stacking was used, but with three levels, where the final decision-maker (level 2) performs its training by combining the outputs of the pair of meta-classifiers (level 1) from the tree-based and Bayesian families. These are, in turn, trained by pairs of base classifiers (level 0) of the same family. This strategy seeks to promote diversity among the ensembles forming the level-2 meta-classifier. Three performance measures were used, (1) accuracy, (2) area under the ROC curve, and (3) time, across three factors: (a) datasets, (b) experiments and (c) levels. To compare the factors, a three-way ANOVA test was executed for each performance measure, considering 5 datasets by 25 experiments by 3 levels. A triple interaction between factors was observed only for time. The accuracy and the area under the ROC curve presented similar results, showing a double interaction between level and experiment, as well as for the dataset factor. It was concluded that level 2 had an average performance above the other levels and that the proposed method is especially efficient for multi-class problems when compared to binary problems.
Keywords: Stacking, multi-layers, ensemble, multi-class.
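A minimal scikit-learn sketch of a three-level stacking scheme on a multi-class dataset: level 0 holds pairs of base classifiers of the same family, level 1 holds one meta-classifier per family, and level 2 is the final decision-maker; the specific estimators and dataset are illustrative assumptions.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier, RandomForestClassifier, ExtraTreesClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

tree_meta = StackingClassifier(                    # level 1, tree-based family
    estimators=[("dt", DecisionTreeClassifier(max_depth=3)),
                ("et", ExtraTreesClassifier(n_estimators=50))],
    final_estimator=RandomForestClassifier(n_estimators=50))

bayes_meta = StackingClassifier(                   # level 1, Bayesian family
    estimators=[("nb1", GaussianNB()),
                ("nb2", GaussianNB(var_smoothing=1e-6))],
    final_estimator=GaussianNB())

level2 = StackingClassifier(                       # level 2, final decision-maker
    estimators=[("trees", tree_meta), ("bayes", bayes_meta)],
    final_estimator=LogisticRegression(max_iter=1000))

print(cross_val_score(level2, X, y, cv=5).mean())
```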
971 Pre-Operative Tool for Facial-Post-Surgical Estimation and Detection
Authors: Ayat E. Ali, Christeen R. Aziz, Merna A. Helmy, Mohammed M. Malek, Sherif H. El-Gohary
Abstract:
Goal: The purpose of the project was to make a plastic surgery prediction using pre-operative images of plastic surgery patients and to show this prediction on a screen, allowing a comparison between the current case and the appearance after the surgery. Methods: To this aim, we implemented software which used data from the internet for facial skin diseases, skin burns, and pre- and post-surgery images; the post-surgical prediction is then made using the K-nearest neighbor (KNN) algorithm. We also designed and fabricated a smart mirror divided into two parts, a screen and a reflective mirror, so that the patient's pre- and post-surgery appearance can be shown at the same time. Results: We worked on some skin conditions such as vitiligo, skin burns and wrinkles. We classified the three degrees of burns using a KNN classifier with 60% accuracy. We also succeeded in segmenting the area of vitiligo. Our future work will include working on more skin diseases, classifying them, and giving a prediction of the post-surgery look. We will also go deeper into facial deformities and plastic surgeries like nose reshaping and face slimming. Conclusion: Our project will give a prediction that relates strongly to the real post-surgery look and will reduce diagnostic disagreement among doctors. Significance: The mirror may have broad societal appeal, as it will narrow the distance between patients' satisfaction and medical standards.
Keywords: K-nearest neighbor, face detection, vitiligo, bone deformity.
970 Analysis of Precipitation Time Series of Urban Centers of Northeastern Brazil using Wavelet Transform
Authors: Celso A. G. Santos, Paula K. M. M. Freire
Abstract:
The urban centers of northeastern Brazil are strongly influenced by intense rainfall, which can occur after long periods of drought and during which flood events can be observed. Thus, this paper aims to study the rainfall frequencies in this region through the wavelet transform. Wavelet analysis is applied to long time series of the total monthly rainfall amount at the capital cities of northeastern Brazil. The main frequency components in the time series are studied through the global wavelet spectrum, and the modulation in separate periodicity bands is examined in order to extract additional information; e.g., the 8-16 month band was examined by averaging over all scales in the band, giving a measure of the average annual variance versus time, from which periods with low or high variance could be identified. Important increases in the average variance were identified for some periods, e.g., 1947 to 1952 at Teresina city, which can be considered high wet periods. Although the precipitation at those sites showed similar global wavelet spectra, the wavelet spectra revealed particular features. The methods used in this study can be considered an important tool for time series analysis, which can support studies concerning flood control, mainly when applied together with rainfall-runoff simulations.
Keywords: rainfall data, urban center, wavelet transform.
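A minimal PyWavelets sketch of the analysis: a continuous wavelet transform of a monthly rainfall series with a Morlet wavelet, the global wavelet spectrum, and the scale-averaged variance in an 8-16 month band; the synthetic series and scale grid are assumptions.

```python
import numpy as np
import pywt

months = np.arange(12 * 60)                          # 60 years of monthly totals (synthetic)
rainfall = (100 + 60 * np.sin(2 * np.pi * months / 12)       # annual cycle
            + 20 * np.random.default_rng(0).standard_normal(len(months)))
anomaly = rainfall - rainfall.mean()

scales = np.arange(2, 128)
coeffs, freqs = pywt.cwt(anomaly, scales, "morl", sampling_period=1.0)  # 1 month sampling
power = np.abs(coeffs) ** 2
periods = 1.0 / freqs                                 # periods in months

global_spectrum = power.mean(axis=1)                  # time-averaged power at each period
band = (periods >= 8) & (periods <= 16)
band_variance = power[band].mean(axis=0)              # scale-averaged variance versus time

print("dominant period (months):", periods[np.argmax(global_spectrum)])
print("band variance series length:", band_variance.shape[0])
```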
969 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks
Authors: Naghmeh Moradpoor Sheykhkanloo
Abstract:
Thousands of organisations store important and confidential information related to them, their customers, and their business partners in databases all across the world. The stored data ranges from less sensitive (e.g. first name, last name, date of birth) to more sensitive data (e.g. password, pin code, and credit card information). Losing data, disclosing confidential information or even changing the value of data are the severe damages that a Structured Query Language injection (SQLi) attack can cause to a given database. It is a code injection technique where malicious SQL statements are inserted into a given SQL database by simply using a web browser. In this paper, we propose an effective pattern recognition neural network model for the detection and classification of SQLi attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, which generates thousands of malicious and benign URLs; a URL classifier, which 1) classifies each generated URL as either benign or malicious and 2) classifies the malicious URLs into different SQLi attack categories; and an NN model, which 1) detects whether a given URL is malicious or benign and 2) identifies the type of SQLi attack for each malicious URL. The model is first trained and then evaluated by employing thousands of benign and malicious URLs. The results of the experiments are presented in order to demonstrate the effectiveness of the proposed approach.
Keywords: Neural Networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection.
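A minimal sketch of the detection stage: simple lexical features of a URL query string are fed to a multilayer perceptron separating benign from SQLi-like URLs; the toy URL generator, feature set, and labels are illustrative assumptions, not the paper's URL generator or feature scheme.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

SQLI_TOKENS = ["'", "--", " or ", " union ", " select ", ";", "1=1"]

def url_features(url):
    u = url.lower()
    return [len(u), u.count("'"), u.count("="), u.count("%"),
            sum(tok in u for tok in SQLI_TOKENS)]

def generate(n, malicious, rng):
    """Toy URL generator: benign parameterized URLs vs. URLs with a simple SQLi payload."""
    base = "http://example.com/item?id="
    urls = []
    for _ in range(n):
        if malicious:
            urls.append(base + str(rng.integers(100)) + "' or 1=1 --")
        else:
            urls.append(base + str(rng.integers(100)) + "&page=" + str(rng.integers(10)))
    return urls

rng = np.random.default_rng(0)
urls = generate(500, False, rng) + generate(500, True, rng)
y = np.r_[np.zeros(500, int), np.ones(500, int)]
X = np.array([url_features(u) for u in urls])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
print(classification_report(y_te, clf.fit(X_tr, y_tr).predict(X_te)))
```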