Search results for: Replacement Algorithms
354 Improved Feature Extraction Technique for Handling Occlusion in Automatic Facial Expression Recognition
Authors: Khadijat T. Bamigbade, Olufade F. W. Onifade
Abstract:
The field of automatic facial expression analysis has been an active research area over the last two decades. Its wide applicability in various domains has drawn considerable attention to developing techniques and datasets that mirror real-life scenarios. Many techniques, such as Local Binary Patterns and its variants (CLBP, LBP-TOP) and, more recently, deep learning techniques, have been used for facial expression recognition. However, the problem of occlusion has not been sufficiently handled, making their results inapplicable in real-life situations. This paper develops a simple yet highly efficient method, termed Local Binary Pattern-Histogram of Gradient (LBP-HOG), with occlusion detection in the face image, using a multi-class SVM for Action Unit and, in turn, expression recognition. Our method was evaluated on three publicly available datasets: JAFFE, CK and SFEW. Experimental results showed that our approach performed considerably well compared with state-of-the-art algorithms and gave insight into occlusion detection as a key step in handling expressions in the wild.
Keywords: Automatic facial expression analysis, local binary pattern, LBP-HOG, occlusion detection.
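As a rough illustration of how such a combined descriptor can be built, the sketch below concatenates an LBP histogram with a HOG vector and hands the result to a multi-class SVM; the occlusion-detection step is omitted and all parameter choices are assumptions, not the authors' exact pipeline.

```python
# Minimal sketch (not the authors' exact pipeline): concatenating LBP and HOG
# descriptors of a grayscale face crop for a multi-class SVM.
# Parameter values (radius, cell sizes, SVM kernel) are illustrative assumptions.
import numpy as np
from skimage.feature import local_binary_pattern, hog
from sklearn.svm import SVC

def lbp_hog_features(gray_face, P=8, R=1):
    """Concatenate a uniform-LBP histogram with a HOG descriptor."""
    lbp = local_binary_pattern(gray_face, P, R, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    hog_vec = hog(gray_face, orientations=9, pixels_per_cell=(8, 8),
                  cells_per_block=(2, 2), feature_vector=True)
    return np.concatenate([lbp_hist, hog_vec])

# X_train: list of grayscale face crops, y_train: expression labels (user-provided)
# clf = SVC(kernel="rbf", decision_function_shape="ovr")
# clf.fit([lbp_hog_features(f) for f in X_train], y_train)
```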
353 Optimal Placement and Sizing of Energy Storage System in Distribution Network with Photovoltaic Based Distributed Generation Using Improved Firefly Algorithms
Authors: Ling Ai Wong, Hussain Shareef, Azah Mohamed, Ahmad Asrul Ibrahim
Abstract:
The installation of photovoltaic-based distributed generation (PVDG) in an active distribution system can lead to voltage fluctuation due to the intermittent and unpredictable PVDG output power. This paper presents a method for mitigating the voltage rise by optimally locating and sizing the battery energy storage system (BESS) in a PVDG-integrated distribution network. The improved firefly algorithm is used to perform the optimal placement and sizing. Three objective functions are presented, considering the voltage deviation and BESS off-time, with state of charge as the constraint. The performance of the proposed method is compared with other optimization methods, namely the original firefly algorithm and the gravitational search algorithm. Simulation results show that the proposed optimum BESS location and size improve the voltage stability.
Keywords: BESS, PVDG, firefly algorithm, voltage fluctuation.
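For readers unfamiliar with the metaheuristic, a generic firefly-algorithm minimizer might look like the sketch below; the paper's improved variant and its actual BESS siting/sizing objective are not reproduced here.

```python
# A generic firefly-algorithm minimizer (illustrative only; the paper's
# "improved" variant and its distribution-network objective are not shown).
import numpy as np

def firefly_minimize(objective, bounds, n_fireflies=25, n_iter=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(n_fireflies, len(lo)))
    f = np.array([objective(xi) for xi in x])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:  # move firefly i toward the brighter firefly j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * rng.uniform(-0.5, 0.5, len(lo))
                    x[i] = np.clip(x[i], lo, hi)
                    f[i] = objective(x[i])
    best = np.argmin(f)
    return x[best], f[best]

# Example: minimize a toy objective over a (location, size) pair
# x_best, f_best = firefly_minimize(lambda z: (z[0] - 3) ** 2 + (z[1] - 1) ** 2,
#                                   bounds=[(0, 10), (0, 5)])
```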
352 On the Reduction of Side Effects in Tomography
Authors: V. Masilamani, C. Vanniarajan, Kamala Krithivasan
Abstract:
Since Computed Tomography (CT) normally requires hundreds of projections to reconstruct the image, patients are exposed to more X-ray energy, which may cause side effects such as cancer. Even when the variability of the particles in the object is very low, Computed Tomography requires many projections for good-quality reconstruction. In this paper, the low variability of the particles in an object has been exploited to obtain good-quality reconstruction. Although the reconstructed image and the original image have the same projections, in general, they need not be the same. In addition to projections, if a priori information about the image is known, it is possible to obtain a good-quality reconstructed image. In this paper, it has been shown by experimental results why conventional algorithms fail to reconstruct from a few projections, and an efficient polynomial-time algorithm has been given to reconstruct a bi-level image from its projections along row and column, and a known sub-image of the unknown image with smoothness constraints, by reducing the reconstruction problem to an integral max flow problem. This paper also discusses the necessary and sufficient conditions for uniqueness and the extension of 2D bi-level image reconstruction to 3D bi-level image reconstruction.
Keywords: Discrete Tomography, Image Reconstruction, Projection, Computed Tomography, Integral Max Flow Problem, Smooth Binary Image.
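The core reduction to an integral max-flow problem can be sketched as follows; this toy version reconstructs a binary matrix from its row and column sums only, without the smoothness or known sub-image constraints described in the paper.

```python
# A minimal sketch of the reduction: reconstructing a binary matrix from its
# row and column sums as an integral max-flow problem.
import networkx as nx
import numpy as np

def reconstruct_binary(row_sums, col_sums):
    m, n = len(row_sums), len(col_sums)
    G = nx.DiGraph()
    for i, r in enumerate(row_sums):
        G.add_edge("s", ("r", i), capacity=int(r))      # source -> row node
    for j, c in enumerate(col_sums):
        G.add_edge(("c", j), "t", capacity=int(c))      # column node -> sink
    for i in range(m):
        for j in range(n):
            G.add_edge(("r", i), ("c", j), capacity=1)  # one unit per pixel (i, j)
    flow_value, flow = nx.maximum_flow(G, "s", "t")
    if flow_value != sum(row_sums):
        return None  # no binary image has these projections
    img = np.zeros((m, n), dtype=int)
    for i in range(m):
        for j in range(n):
            img[i, j] = flow[("r", i)][("c", j)]
    return img

# reconstruct_binary([2, 1], [1, 1, 1]) -> a 2x3 binary matrix with those sums
```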
351 Morphing Human Faces: Automatic Control Points Selection and Color Transition
Authors: Stephen Karungaru, Minoru Fukumi, Norio Akamatsu
Abstract:
In this paper, we propose a morphing method by which face color images can be freely transformed. The main focus of this work is the transformation of one face image to another. The method is fully automatic in that it can morph two face images by automatically detecting all the control points necessary to perform the morph. A face detection neural network, edge detection and median filters are employed to detect the face position and features. Five control points, for both the source and target images, are then extracted based on the facial features. A triangulation method is then used to match and warp the source image to the target image using the control points. Finally, color interpolation is performed using a color Gaussian model that calculates the color for each particular frame depending on the number of frames used. A real-coded genetic algorithm is used in both the image warping and color blending steps to assist in step-size decisions and to speed up the morphing. This method results in very smooth morphs and is fast to process.
Keywords: color transition, genetic algorithms, morphing, warping
350 Site Selection of Traffic Camera based on Dempster-Shafer and Bagging Theory
Authors: S. Rokhsari, M. Delavar, A. Sadeghi-Niaraki, A. Abed-Elmdoust, B. Moshiri
Abstract:
Traffic incidents have a negative effect on all parts of society, so controlling road networks with enough traffic devices can help to decrease the number of accidents, and using the best method for optimum site selection of these devices helps to implement a good monitoring system. This paper considers important criteria for the optimum site selection of traffic cameras based on aggregation methods such as Bagging and Dempster-Shafer concepts. In the first step, important criteria, such as annual traffic flow and distance from critical places such as parks that need more traffic control, were identified for the selection of important road links for traffic camera installation. Then, classification methods such as artificial neural network and decision tree algorithms were employed for the classification of road links based on their importance for camera installation. Finally, to improve the results of the classifiers, aggregation methods such as Bagging and Dempster-Shafer theories were used.
Keywords: Aggregation, Bagging theory, Dempster-Shafer theory, Site selection
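The bagging step can be illustrated with a short sketch along the following lines; the features, labels and ensemble settings are placeholders rather than the paper's dataset.

```python
# Illustrative sketch of the bagging step: an ensemble of decision trees
# (the default base estimator of BaggingClassifier) classifying road links by
# camera-installation priority. Feature values and labels are made-up examples.
from sklearn.ensemble import BaggingClassifier

# X: one row per road link, e.g. [annual_traffic_flow, distance_to_critical_place_m]
X = [[52000, 120.0], [8000, 900.0], [31000, 300.0], [61000, 80.0]]
y = [1, 0, 0, 1]  # 1 = high priority for a traffic camera

bagged = BaggingClassifier(n_estimators=50, bootstrap=True, random_state=0)
bagged.fit(X, y)
print(bagged.predict([[45000, 150.0]]))  # predicted priority for a new road link
```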
349 Interfacing Photovoltaic Systems to the Utility Grid: A Comparative Simulation Study to Mitigate the Impact of Unbalanced Voltage Dips
Authors: Badr M. Alshammari, A. Rabeh, A. K. Mohamed
Abstract:
This paper presents the modeling and control of a grid-connected photovoltaic system (PVS). Firstly, the MPPT control of the PVS and its associated DC/DC converter has been analyzed in order to extract the maximum available power. Secondly, the control system of the grid side converter (GSC), which is a three-phase voltage source inverter (VSI), has been presented. Special attention has been paid to the control algorithms of the GSC converter during grid voltage imbalances. In particular, three different control objectives are to be achieved: the mitigation of the adverse effects of the grid imbalance, at the point of common coupling (PCC), on the injected currents; the elimination of double-frequency oscillations in the active power flow; and the elimination of double-frequency oscillations in the reactive power flow. Simulation results for two control strategies have been obtained via MATLAB software in order to demonstrate the particularities of each control strategy according to power quality standards.
Keywords: Renewable energies, photovoltaic systems, DC link, voltage source inverter, space vector SVPWM, unbalanced voltage dips, symmetrical components.
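The symmetrical-component decomposition that underlies the analysis of unbalanced dips can be sketched as follows; the converter control loops themselves are not reproduced here.

```python
# A small sketch of the symmetrical-component (Fortescue) decomposition used to
# characterize unbalanced grid voltages at the PCC.
import numpy as np

def symmetrical_components(va, vb, vc):
    """Return (zero, positive, negative) sequence phasors from phase phasors."""
    a = np.exp(1j * 2 * np.pi / 3)
    A = np.array([[1, 1, 1],
                  [1, a, a ** 2],
                  [1, a ** 2, a]]) / 3.0
    return A @ np.array([va, vb, vc])

# Example: a dip on phase a only (per-unit phasors)
v0, v1, v2 = symmetrical_components(0.5 * np.exp(0j),
                                    1.0 * np.exp(-1j * 2 * np.pi / 3),
                                    1.0 * np.exp(1j * 2 * np.pi / 3))
print(abs(v1), abs(v2))  # positive- and negative-sequence magnitudes
```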
348 FIR Filter Design via Linear Complementarity Problem, Messy Genetic Algorithm, and Ising Messy Genetic Algorithm
Authors: A.M. Al-Fahed Nuseirat, R. Abu-Zitar
Abstract:
In this paper, the design of maximally flat linear-phase finite impulse response (FIR) filters is considered. The problem is handled with two entirely different approaches. The first is a completely deterministic numerical approach where the problem is formulated as a Linear Complementarity Problem (LCP). The other is based on a combination of a Markov Random Field (MRF) approach with a messy genetic algorithm (MGA). Markov Random Fields (MRFs) are a class of probabilistic models that have been applied for many years to the analysis of visual patterns or textures. Our objective is to establish MRFs as an interesting approach to modeling messy genetic algorithms. We establish a theoretical result that every genetic algorithm problem can be characterized in terms of an MRF model. This allows us to construct an explicit probabilistic model of the MGA fitness function and introduce the Ising MGA. Experiments done with the Ising MGA are less costly than those done with the standard MGA, since much less computation is involved. The LCP requires the least computation of all. Results of the LCP, random search, random seeded search, MGA, and Ising MGA are discussed.
Keywords: Filter design, FIR digital filters, LCP, Ising model, MGA, Ising MGA.
347 Approximating Maximum Speed on Road from Curvature Information of Bezier Curve
Authors: M. Y. Misro, A. Ramli, J. M. Ali
Abstract:
Bezier curves have useful properties for the path generation problem; for instance, they can generate the reference trajectory for vehicles to satisfy the path constraints. The algorithms join cubic Bezier curve segments smoothly to generate the path. One of the useful properties of Bezier curves is curvature. In mathematics, curvature is the amount by which a geometric object deviates from being flat, or straight in the case of a line. Another extrinsic example of curvature is a circle, where the curvature is equal to the reciprocal of its radius at any point on the circle. The smaller the radius, the higher the curvature, and thus the more sharply the vehicle needs to turn. In this study, we use a Bezier curve to fit a highway-like curve. We use a different approach to find the best approximation for the curve so that it resembles a highway-like curve. We compute the curvature value by analytical differentiation of the Bezier curve. We then compute the maximum speed for driving using the curvature information obtained. Our research rests on some assumptions; first, the Bezier curve estimates the real shape of the curve, which can be verified visually. Even though the fitting process of the Bezier curve does not interpolate exactly on the curve of interest, we believe that the estimation of speed is acceptable. We verified our result with a manual calculation of the curvature from the map.
Keywords: Speed estimation, path constraints, reference trajectory, Bezier curve.
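A possible form of the curvature-to-speed computation is sketched below; the paper's exact speed formula is not stated here, so the sketch assumes a simple lateral-acceleration limit v^2 * kappa <= a_lat.

```python
# Curvature of a planar cubic Bezier curve and a speed bound derived from it.
# The lateral-acceleration limit and control points are illustrative assumptions.
import numpy as np

def cubic_bezier_curvature(P, t):
    """P: (4, 2) control points; t: parameter values in [0, 1]."""
    P = np.asarray(P, dtype=float)
    t = np.atleast_1d(t)[:, None]
    # First and second derivatives of the cubic Bezier curve
    d1 = 3 * ((1 - t) ** 2 * (P[1] - P[0]) + 2 * (1 - t) * t * (P[2] - P[1])
              + t ** 2 * (P[3] - P[2]))
    d2 = 6 * ((1 - t) * (P[2] - 2 * P[1] + P[0]) + t * (P[3] - 2 * P[2] + P[1]))
    cross = d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]
    return np.abs(cross) / np.linalg.norm(d1, axis=1) ** 3

P = [(0, 0), (40, 0), (60, 30), (100, 30)]          # metres, illustrative
kappa = cubic_bezier_curvature(P, np.linspace(0, 1, 200))
a_lat = 2.0                                          # m/s^2, assumed comfort limit
v_max = np.sqrt(a_lat / np.maximum(kappa, 1e-9))     # m/s along the curve
print(v_max.min())                                   # speed limit at the sharpest point
```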
346 Energy-Efficient Clustering Protocol in Wireless Sensor Networks for Healthcare Monitoring
Authors: Ebrahim Farahmand, Ali Mahani
Abstract:
Wireless sensor networks (WSNs) can facilitate continuous monitoring of patients and increase early detection of emergency conditions and diseases. High-density WSNs help us to accurately monitor a remote environment by intelligently combining the data from the individual nodes. Due to the energy capacity limitation of sensors, enhancing the lifetime and the reliability of WSNs are important factors in the design of these networks. Clustering strategies are verified as effective and practical algorithms for reducing energy consumption in WSNs and can tackle WSN limitations. In this paper, an Energy-efficient Weight-based Clustering Protocol (EWCP) is presented. An artificial retina is selected as a case study of WSNs applied in body sensors. Cluster head (CH) selection is based on energy-efficient parameters. Moreover, cluster members are selected based on their distance to the selected CHs. Compared with the other benchmark protocols, the lifetime of EWCP is improved significantly.
Keywords: Clustering of WSNs, healthcare monitoring, weight-based clustering, wireless sensor networks.
345 A Two-Phase Mechanism for Agent's Action Selection in Soccer Simulation
Authors: Vahid Salmani, Mahmoud Naghibzadeh, Farid Seifi, Amirhossein Taherinia
Abstract:
Soccer simulation is an effort to motivate researchers and practitioners to do artificial and robotic intelligence research and, at the same time, to put into practice and test the results. Many researchers and practitioners throughout the world are continuously working to polish their ideas and improve their implemented systems. At the same time, new groups are forming and bringing bright new ideas to the field. The research includes designing and executing robotic soccer simulation algorithms. In our research, a soccer simulation player is considered to be an intelligent agent that is capable of receiving information from the environment, analyzing it, and choosing the best action from a set of possible ones for its next move. We concentrate on developing a two-phase method for the soccer player agent to choose its best next move. The method is then implemented in our software system called Nexus, the simulation team of Ferdowsi University. This system is based on the TsinghuAeolus [1] team that was the champion of the world RoboCup soccer simulation contest in 2001 and 2002.
Keywords: RoboCup, Soccer simulation, multi-agent environment, intelligent soccer agent, ball controller agent.
344 A Similarity Function for Global Quality Assessment of Retinal Vessel Segmentations
Authors: Arturo Aquino, Manuel Emilio Gegundez, Jose Manuel Bravo, Diego Marin
Abstract:
Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The employment of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms and, concretely, accuracy has mostly been used as the measure of global performance in this topic. However, this metric shows very poor agreement with human perception, as well as other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. This similarity function is based on characterizing the vascular tree as a connected structure with a measurable area and length. Tests made indicate that this new approach behaves better than the current one. Generalizing, this concept of measuring descriptive properties may be used for designing functions that measure the segmentation quality of other complex structures more successfully.
Keywords: Retinal vessel segmentation, quality assessment, performance evaluation, similarity function.
343 Detection and Correction of Ectopic Beats for HRV Analysis Applying Discrete Wavelet Transforms
Authors: Desmond B. Keenan
Abstract:
The clinical usefulness of heart rate variability (HRV) is limited by the range of Holter monitoring software available. These software algorithms require a normal sinus rhythm to accurately acquire HRV measures in the frequency domain. Premature ventricular contractions (PVCs), more commonly referred to as ectopic beats, which are frequent in heart failure, hinder this analysis and introduce ambiguity. This investigation demonstrates an algorithm to automatically detect ectopic beats by analyzing discrete wavelet transform coefficients. Two techniques for filtering and replacing the ectopic beats in the RR signal are compared. One technique applies wavelet hard thresholding and the other applies linear interpolation to replace ectopic cycles. The results demonstrate, through simulation and signals acquired from a 24-hour ambulatory recorder, that these techniques can accurately detect PVCs and remove the noise and leakage effects produced by ectopic cycles, retaining smooth spectra with minimal error.
Keywords: Heart rate variability, vagal tone, sympathetic, parasympathetic, wavelets, ectopic beats, spectral analysis.
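The interpolation-based replacement technique can be sketched as follows; a local-median rule stands in for the wavelet-based detector, and the threshold value is an assumption.

```python
# A simplified sketch of the replacement step: flag RR intervals that deviate
# strongly from a local median (a stand-in for the wavelet-based detector) and
# replace them by linear interpolation. Threshold values are assumptions.
import numpy as np

def correct_ectopic_rr(rr, win=5, tol=0.3):
    """rr: RR intervals in seconds; returns corrected series and ectopic mask."""
    rr = np.asarray(rr, dtype=float)
    med = np.array([np.median(rr[max(0, i - win):i + win + 1])
                    for i in range(len(rr))])
    ectopic = np.abs(rr - med) > tol * med          # e.g. >30% off the local median
    idx = np.arange(len(rr))
    corrected = rr.copy()
    corrected[ectopic] = np.interp(idx[ectopic], idx[~ectopic], rr[~ectopic])
    return corrected, ectopic

rr = [0.80, 0.82, 0.79, 0.45, 1.15, 0.81, 0.80]      # a PVC and compensatory pause
clean, mask = correct_ectopic_rr(rr)
```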
342 A Text Clustering System based on k-means Type Subspace Clustering and Ontology
Authors: Liping Jing, Michael K. Ng, Xinhua Yang, Joshua Zhexue Huang
Abstract:
This paper presents a text clustering system developed based on a k-means type subspace clustering algorithm to cluster large, high-dimensional and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster so that the important words of a cluster can be identified by the weight values. For understanding and interpretation of clustering results, a few keywords that can best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words. The candidate words are first selected according to their weights calculated by our new algorithm. Then, the candidates are fed to WordNet to identify the set of noun words and consolidate the synonymy and hyponymy words. Experimental results have shown that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to the k-means type algorithm Bisecting-KMeans. Furthermore, the word extraction method is effective in selecting the words to represent the topics of the clusters.
Keywords: Subspace Clustering, Text Mining, Feature Weighting, Cluster Interpretation, Ontology
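A minimal stand-in for the keyword-weighting idea, using plain TF-IDF weights and k-means centroids instead of the authors' subspace weighting and WordNet consolidation, might look like this:

```python
# Illustrative sketch: k-means over TF-IDF vectors with the highest-weighted
# centroid terms taken as candidate topic words for each cluster.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["stocks fall as markets react", "team wins the championship game",
        "central bank raises interest rates", "player scores twice in final"]
vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for c, centroid in enumerate(km.cluster_centers_):
    top = centroid.argsort()[::-1][:3]               # highest-weight words per cluster
    print(c, [terms[i] for i in top])
```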
341 Manufacturing of Full Automatic Carwash Using with Intelligent Control Algorithms
Authors: Amir Hossein Daei Sorkhabi, Bita Khazini
Abstract:
In this paper, the intelligent control of a full automatic carwash using a programmable logic controller (PLC) has been investigated and designed to perform all steps of carwashing. The intelligent control of the full automatic carwash has the ability to identify and profile the geometrical dimensions of the vehicle chassis. Vehicle dimension identification is an important point in this control system for adjusting the washing brush position and time duration. The study also designs a control set for simulating and building the automatic carwash. The main purpose of the simulation is to develop criteria for designing and building this type of carwash at actual size to overcome the challenges of automation. The results of this research indicate that the proposed method of process control not only increases productivity, speed, accuracy and safety but also reduces the time and cost of washing based on a dynamic model of the vehicle. A laboratory prototype based on advanced intelligent control has been built to study the validity of the design and simulation, and its appropriate performance confirms the validity of this study.
Keywords: Automatic Carwash, Dimension, PLC.
340 Optimal Control Strategy for High Performance EV Interior Permanent Magnet Synchronous Motor
Authors: Mehdi Karbalaye Zadeh, Ehsan M. Siavashi
Abstract:
The controllable electrical loss, which consists of the copper loss and iron loss, can be minimized by optimal control of the armature current vector. A control algorithm for the current vector minimizing the electrical loss is proposed, and the optimal current vector can be decided according to the operating speed and the load conditions. The proposed control algorithm is applied to an experimental PM motor drive system, and this paper presents a modern approach to speed control for a permanent magnet synchronous motor (PMSM) applied to an electric vehicle using nonlinear control. The regulation algorithms are based on the feedback linearization technique. The direct component of the current is controlled to be zero, which ensures maximum torque operation. Near-unity power factor operation is also achieved. Moreover, among EV motor electric propulsion features, energy efficiency is a basic characteristic that is influenced by vehicle dynamics and system architecture. For this reason, the EV dynamics are taken into account.
Keywords: PMSM, Electric Vehicle, Optimal control, Traction.
339 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-Time
Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl
Abstract:
In recent years, SQL injection attacks have been identified as being prevalent against web applications. They affect network security and user data, which leads to a considerable loss of money and data every year. This paper presents the use of classification algorithms in machine learning, using a method to classify login data by filtering inputs into "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of results in terms of deciding whether an operation is an attack or a valid operation. A method, as a web app, is developed for auto-generated data replication to provide a twin of the targeted data structure. A shield against SQLi attacks (WebAppShield) has been developed that verifies all users and prevents attackers (SQLi attacks) from entering and/or accessing the database, allowing only the operations that the machine learning module predicts as "Non-SQLi". A special login form has been developed with a special instance of data validation; this verification process secures the web application from its early stages. The system has been tested and validated, and up to 99% of SQLi attacks have been prevented.
Keywords: SQL injection, attacks, web application, accuracy, database, WebAppShield.
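The classification idea can be sketched as below; the training strings, features and classifier are illustrative and do not correspond to WebAppShield's actual module.

```python
# A hedged sketch of the classification idea: label raw login inputs as "SQLi"
# or "Non-SQLi" using character n-gram features. All data and model choices
# here are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

inputs = ["alice", "bob42", "' OR '1'='1", "admin'--", "carol_smith",
          "1; DROP TABLE users", "dave", "' UNION SELECT password FROM users--"]
labels = ["Non-SQLi", "Non-SQLi", "SQLi", "SQLi", "Non-SQLi", "SQLi", "Non-SQLi", "SQLi"]

clf = make_pipeline(TfidfVectorizer(analyzer="char", ngram_range=(1, 3)),
                    LogisticRegression(max_iter=1000))
clf.fit(inputs, labels)
print(clf.predict(["eve", "x' OR 1=1--"]))           # expected: Non-SQLi, SQLi
```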
338 Sequential Straightforward Clustering for Local Image Block Matching
Authors: Mohammad Akbarpour Sekeh, Mohd. Aizaini Maarof, Mohd. Foad Rohani, Malihe Motiei
Abstract:
Duplicated region detection is a technical method to expose copy-paste forgeries in digital images. Copy-paste is one of the common types of forgery, cloning a portion of an image in order to conceal or duplicate a particular object. In this type of forgery detection, extracting robust block features and the high time complexity of the matching step are two main open problems. This paper concentrates on computational time and proposes a local block matching algorithm based on block clustering to reduce the time complexity. The time complexity of the proposed algorithm is formulated, and the effects of two parameters, block size and number of clusters, on the efficiency of this algorithm are considered. The experimental results and mathematical analysis demonstrate that this algorithm is more cost-effective than lexicographic algorithms in terms of time complexity when the image is complex.
Keywords: Copy-paste forgery detection, Duplicated region, Time complexity, Local block matching, Sequential block clustering.
337 Performance Improvement in the Bivariate Models by using Modified Marginal Variance of Noisy Observations for Image-Denoising Applications
Authors: R. Senthilkumar
Abstract:
Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, wavelet coefficients of natural images have significant dependencies. This paper attempts to give a recipe for selecting one of the popular image-denoising algorithms based on VisuShrink, SureShrink, OracleShrink, BayesShrink and BiShrink, and also compares different bivariate models used for image-denoising applications. The first part of the paper compares different shrinkage functions used for image denoising. The second part compares different bivariate models, and the third part uses the bivariate model with modified marginal variance, which is based on a Laplacian assumption. This paper gives an experimental comparison on six commonly used 512x512 images: Lenna, Barbara, Goldhill, Clown, Boat and Stonehenge. Noise at powers of 25 dB, 26 dB, 27 dB, 28 dB and 29 dB is added to the six standard images, and the corresponding Peak Signal to Noise Ratio (PSNR) values are calculated for each noise level.
Keywords: BiShrink, Image-Denoising, PSNR, Shrinkage function
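For reference, the classical bivariate shrinkage rule and the PSNR measure used in the comparison can be sketched as follows; the paper's modified marginal variance is not reproduced here.

```python
# A sketch of the bivariate shrinkage rule (Sendur-Selesnick form) applied to a
# wavelet coefficient w1 with parent w2, plus the standard PSNR measure.
# sigma_n is the noise standard deviation and sigma the marginal signal std;
# the paper's modified marginal variance is not implemented here.
import numpy as np

def bivariate_shrink(w1, w2, sigma_n, sigma):
    r = np.sqrt(w1 ** 2 + w2 ** 2)
    gain = np.maximum(r - np.sqrt(3) * sigma_n ** 2 / sigma, 0) / np.maximum(r, 1e-12)
    return gain * w1

def psnr(clean, denoised, peak=255.0):
    mse = np.mean((np.asarray(clean, float) - np.asarray(denoised, float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

# Example: shrink a child coefficient using its parent at the coarser scale
print(bivariate_shrink(w1=12.0, w2=5.0, sigma_n=4.0, sigma=6.0))
```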
336 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology has entered the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend. It is able to gather more and more computing ability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of an increasingly large amount of data. Data-parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example. We demonstrate a model for constructing a data-parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects; collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unparalleled with respect to traditional algorithms.
Keywords: Data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability.
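The data-parallel flavour of the approach can be illustrated with a vectorized bounding-box test, where one overlap check is evaluated across many simple objects at once; the BIM decomposition itself is not shown.

```python
# A small data-parallel illustration: testing many axis-aligned bounding boxes
# of decomposed simple objects against one query box in a single vectorized
# (SIMD-style) operation.
import numpy as np

def aabb_overlaps(boxes, query):
    """boxes: (N, 6) array of [xmin, ymin, zmin, xmax, ymax, zmax]; query: (6,)."""
    boxes = np.asarray(boxes, float)
    q = np.asarray(query, float)
    # Overlap on every axis simultaneously for all N boxes (one lane per box)
    return np.all((boxes[:, :3] <= q[3:]) & (boxes[:, 3:] >= q[:3]), axis=1)

boxes = np.array([[0, 0, 0, 1, 1, 1],
                  [2, 2, 2, 3, 3, 3],
                  [0.5, 0.5, 0.5, 2.5, 2.5, 2.5]])
print(aabb_overlaps(boxes, [0.8, 0.8, 0.8, 1.2, 1.2, 1.2]))  # [True, False, True]
```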
335 Jobs Scheduling and Worker Assignment Problem to Minimize Makespan using Ant Colony Optimization Metaheuristic
Authors: Mian Tahir Aftab, Muhammad Umer, Riaz Ahmad
Abstract:
This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize the total makespan for scheduling a set of jobs and assigning workers to uniformly related parallel machines. An algorithm based on ACO has been developed and coded in Matlab® to solve this problem. The paper explains the various steps in applying the Ant Colony approach to the problem of minimizing makespan for the worker assignment and job scheduling problem in a parallel machine model, and is aimed at evaluating the strength of ACO compared to other conventional approaches. One data set containing 100 problems (12 jobs, 3 machines and 10 workers), which is available on the internet, has been taken and solved with this ACO algorithm. Our ACO-based algorithm has shown drastically improved results, especially in terms of the negligible computational effort of the CPU needed to reach the optimal solution. In our case, the time taken to solve all 100 problems is even less than the average time taken to solve one problem in the data set by other conventional approaches such as the GA algorithm and SPT-A/LMC heuristics.
Keywords: Ant Colony Optimization (ACO), Genetic algorithms (GA), Makespan, SPT-A/LMC heuristic.
334 DCBOR: A Density Clustering Based on Outlier Removal
Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan
Abstract:
Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single link clustering algorithm. We refer to this algorithm as DCBOR. The proposed algorithm alleviates the chain effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it depends on merging the k-nearest objects in one step, and a cluster continues to grow as long as possible under a specified condition. The algorithm consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. The algorithm discovers clusters of different shapes, sizes and densities and requires only one input parameter, which represents a threshold for outlier points. The value of this input parameter ranges from 0 to 1, and the algorithm supports the user in determining an appropriate value for it. We have tested this algorithm on different datasets containing outliers and clusters connected by chains of density points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and efficiency of DCBOR.
Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.
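A rough two-phase stand-in for the idea, using a k-nearest-neighbour distance cutoff for outlier removal followed by single-link clustering, might look like this (the handling of the 0-1 threshold parameter differs from DCBOR's and is only illustrative):

```python
# A minimal sketch of the two phases described above: remove points whose
# k-nearest-neighbour distance is unusually large, then cluster the remaining
# points with single linkage.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import cdist

def dcbor_like(points, k=4, outlier_quantile=0.95, cut_distance=1.0):
    points = np.asarray(points, float)
    d = cdist(points, points)
    knn_dist = np.sort(d, axis=1)[:, k]              # distance to the k-th neighbour
    keep = knn_dist <= np.quantile(knn_dist, outlier_quantile)   # phase 1: outliers out
    Z = linkage(points[keep], method="single")       # phase 2: single-link clustering
    labels = fcluster(Z, t=cut_distance, criterion="distance")
    return keep, labels

rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(5, 0.3, (30, 2)),
                 [[2.5, 10.0]]])                      # two clusters plus one outlier
keep, labels = dcbor_like(pts)
```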
333 Predicting the Three Major Dimensions of the Learner's Emotions from Brainwaves
Authors: Alicia Heraz, Claude Frasson
Abstract:
This paper investigates how the use of machine learning techniques can significantly predict the three major dimensions of a learner's emotions (pleasure, arousal and dominance) from brainwaves. This study adopted an experiment in which participants were exposed to a set of pictures from the International Affective Picture System (IAPS) while their electrical brain activity was recorded with an electroencephalogram (EEG). The pictures had already been rated in a previous study via the affective rating system Self-Assessment Manikin (SAM) to assess the three dimensions of pleasure, arousal, and dominance. For each picture, we took the mean of these values for all subjects used in that previous study and associated them with the recorded brainwaves of the participants in our study. Correlation and regression analyses confirmed the hypothesis that brainwave measures could significantly predict the emotional dimensions. This can be very useful in the case of impassive, taciturn or disabled learners. Standard classification techniques were used to assess the reliability of the automatic detection of learners' three major dimensions from the brainwaves. We discuss the results and the pertinence of such a method to assess learners' emotions and integrate it into a brainwave-sensing Intelligent Tutoring System.
Keywords: Algorithms, brainwaves, emotional dimensions, performance.
332 Tape-Shaped Multiscale Fiducial Marker: A Design Prototype for Indoor Localization
Authors: Marcell S. A. Martins, Benedito S. R. Neto, Gerson L. Serejo, Carlos G. R. Santos
Abstract:
Indoor positioning systems use sensors such as Bluetooth, ZigBee, and Wi-Fi, as well as cameras for image capture, which can be fixed or mobile. These computer-vision-based positioning approaches are low-cost to implement, especially when a mobile camera is used. The present study aims to create a design for a fiducial marker for a low-cost indoor localization system. The marker is tape-shaped to allow continuous reading, employing two detection algorithms, one for greater distances and another for smaller distances. Therefore, the location service is always operational, even with variations in capture distance. A minimal localization and reading algorithm was implemented for the proposed marker design in order to validate it. The accuracy tests consider readings varying the capture distance between 0.5 and 10 meters, comparing the proposed marker with others. The tests showed that the proposed marker has a broader capture range than the ArUco and QRCode markers while maintaining the same size, thereby reducing visual pollution and maximizing tracking, since the environment can be covered entirely.
Keywords: Multiscale recognition, indoor localization, tape-shaped marker, Fiducial Marker.
331 Weld Defect Detection in Industrial Radiography Based Digital Image Processing
Authors: N. Nacereddine, M. Zelmat, S. S. Belaïfa, M. Tridi
Abstract:
Industrial radiography is a well-known technique for the identification and evaluation of discontinuities, or defects, such as cracks, porosity and foreign inclusions found in welded joints. Although this technique has been well developed, improving both the inspection process and operating time, it does suffer from several drawbacks. The poor quality of radiographic images is due to the physical nature of radiography as well as the small size of the defects and their poor orientation relative to the size and thickness of the evaluated parts. Digital image processing techniques allow the interpretation of the image to be automated, avoiding the presence of human operators and making the inspection system more reliable, reproducible and faster. This paper describes our attempt to develop and implement digital image processing algorithms for the purpose of automatic defect detection in radiographic images. Because of the complex nature of the considered images, and in order that the detected defect region represents the real defect as accurately as possible, the choice of global and local preprocessing and segmentation methods must be appropriate.
Keywords: Digital image processing, global and local approaches, radiographic film, weld defect.
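The pairing of global and local approaches mentioned above can be illustrated with standard thresholding operators; the file name and block size below are placeholders, not the paper's chosen methods.

```python
# An illustrative pairing of global and local segmentation: Otsu's global
# threshold versus a local (adaptive) threshold on a radiographic film image.
# "weld_radiograph.png" is a hypothetical input file.
from skimage import filters, io

film = io.imread("weld_radiograph.png", as_gray=True)

t_global = filters.threshold_otsu(film)                  # one threshold for the whole image
seg_global = film < t_global                             # defect regions assumed darker

t_local = filters.threshold_local(film, block_size=51, method="gaussian")
seg_local = film < t_local                               # threshold varies across the film

print(seg_global.sum(), seg_local.sum())                 # candidate defect pixel counts
```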
330 Syntactic Recognition of Distorted Patterns
Authors: Marek Skomorowski
Abstract:
In syntactic pattern recognition, a pattern can be represented by a graph. Given an unknown pattern represented by a graph g, the problem of recognition is to determine whether the graph g belongs to a language L(G) generated by a graph grammar G. The so-called IE graphs have been defined in [1] for the description of patterns. The IE graphs are generated by so-called ETPL(k) graph grammars, also defined in [1]. An efficient parsing algorithm for ETPL(k) graph grammars for the syntactic recognition of patterns represented by IE graphs has been presented in [1]. In practice, structural descriptions may contain pattern distortions, so that the assignment of a graph g, representing an unknown pattern, to a graph language L(G) generated by an ETPL(k) graph grammar G is rejected by the ETPL(k)-type parsing. Therefore, there is a need for constructing effective parsing algorithms for the recognition of distorted patterns. The purpose of this paper is to present a new approach to the syntactic recognition of distorted patterns. To take into account all variations of a distorted pattern under study, a probabilistic description of the pattern is needed. A random IE graph approach is proposed here for such a description [2].
Keywords: Syntactic pattern recognition, Distorted patterns, Random graphs, Graph grammars.
329 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement
Authors: Pogula Rakesh, T. Kishore Kumar
Abstract:
Speech enhancement is a long-standing problem with numerous applications such as teleconferencing, VoIP, hearing aids and speech recognition. The motivation behind this research work is to obtain a clean speech signal of higher quality by applying the optimal noise cancellation technique. Real-time adaptive filtering algorithms seem to be the best candidates among all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on Recursive Least Squares (RLS) adaptive filtering of speech signals. Experiments were performed on noisy data which was prepared by adding AWGN, Babble and Pink noise to clean speech samples at -5 dB, 0 dB, 5 dB and 10 dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal to Noise Ratio (SNR) and SNR Loss. Based on the performance evaluation, the proposed RLS algorithm was found to be a better optimal noise cancellation technique for speech signals.
Keywords: Adaptive filter, Adaptive Noise Canceller, Mean Squared Error, Noise reduction, NLMS, RLS, SNR, SNR Loss.
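A compact sketch of an RLS adaptive noise canceller is given below; the filter order and forgetting factor are illustrative choices, not the values used in the paper.

```python
# RLS adaptive noise canceller sketch: the filter estimates the noise component
# from a reference input and subtracts it from the primary (noisy speech) signal.
import numpy as np

def rls_canceller(primary, reference, order=8, lam=0.999, delta=0.01):
    w = np.zeros(order)
    P = np.eye(order) / delta
    out = np.zeros(len(primary))
    for n in range(order, len(primary)):
        x = reference[n - order:n][::-1]             # most recent reference samples
        k = P @ x / (lam + x @ P @ x)                # RLS gain vector
        e = primary[n] - w @ x                       # error = enhanced speech sample
        w = w + k * e                                # coefficient update
        P = (P - np.outer(k, x @ P)) / lam           # inverse-correlation update
        out[n] = e
    return out

# Example usage (noise and clean_speech are user-provided 1-D arrays):
# rng = np.random.default_rng(0); noise = rng.standard_normal(4000)
# enhanced = rls_canceller(clean_speech + noise, reference=noise)
```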
328 Low Cost Real Time Robust Identification of Impulsive Signals
Authors: R. Biondi, G. Dys, G. Ferone, T. Renard, M. Zysman
Abstract:
This paper describes an automated, implementable system for impulsive signal detection and recognition. The system uses a digital signal processing device for the detection and identification process, analysing the signals in real time in order to produce a specific output if needed. Detection is achieved by normalizing the inputs and comparing the read signals to a dynamic threshold, thus avoiding detections linked to loud or fluctuating surrounding noise. Identification is done through neural network algorithms. As a setup, our system can receive signals to "learn" certain patterns. Through "learning", the system can recognize signals faster, giving it flexibility toward new patterns similar to those it already knows. Sound is captured through a simple jack input and could be changed for an enhanced recording surface such as a wide-area recorder. Furthermore, a communication module can be added to the apparatus to send alerts to another interface if needed.
Keywords: Sound Detection, Impulsive Signal, Background Noise, Neural Network.
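The detection stage can be sketched as follows, with a running noise estimate feeding a dynamic threshold; the neural-network identification stage is not shown and all constants are assumptions.

```python
# A sketch of the detection stage only: normalize the input and flag samples
# exceeding a dynamic threshold tied to a running estimate of the background
# noise level. Window length and the factor k are illustrative assumptions.
import numpy as np

def detect_impulses(signal, fs, win_s=0.05, k=5.0):
    x = np.asarray(signal, float)
    x = x / (np.max(np.abs(x)) + 1e-12)              # normalize the input
    win = max(1, int(win_s * fs))
    # Running RMS of the surrounding noise (simple moving average of x^2)
    rms = np.sqrt(np.convolve(x ** 2, np.ones(win) / win, mode="same"))
    threshold = k * np.median(rms)                   # dynamic threshold
    return np.flatnonzero(np.abs(x) > threshold)     # indices of impulsive samples

fs = 16000
noise = 0.05 * np.random.default_rng(1).standard_normal(fs)
noise[8000] += 1.0                                   # an impulsive event
print(detect_impulses(noise, fs))
```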
327 Prediction of the Solubility of Benzoic Acid in Supercritical CO2 Using the PC-SAFT EoS
Authors: Hamidreza Bagheri, Alireza Shariati
Abstract:
There are many difficulties in the purification of raw components and products, and researchers are seeking better ways to carry out purification. One of the recent methods is extraction using supercritical fluids. In this study, the phase equilibria of the benzoic acid-supercritical carbon dioxide system were investigated. Regarding the phase equilibria of this system, the modeling of solid-supercritical fluid behavior was performed using the Perturbed-Chain Statistical Associating Fluid Theory (PC-SAFT) and Peng-Robinson (PR) equations of state (EoS). For this purpose, five PC-SAFT EoS parameters for pure benzoic acid were obtained using its experimental vapor pressure. Benzoic acid has association sites, and the behavior of the benzoic acid-supercritical fluid system was well predicted using both equations of state, while the binary interaction parameter values for the PR EoS were negative. A genetic algorithm, which is one of the most accurate global optimization algorithms, was also used to optimize the pure benzoic acid parameters and the binary interaction parameters. The AAD% value for the PC-SAFT EoS was 0.22 for the carbon dioxide-benzoic acid system.
Keywords: Supercritical fluids, Solubility, Solid, PC-SAFT EoS, Genetic algorithm.
326 Artificial Intelligence in Penetration Testing of a Connected and Autonomous Vehicle Network
Authors: Phillip Garrad, Saritha Unnikrishnan
Abstract:
The increase in connected and autonomous vehicles (CAVs) creates more opportunities for cyber-attacks. Cyber-attacks can be performed with malicious intent or for research and testing purposes. As connected vehicles approach full autonomy, the possible impact of these cyber-attacks also grows. This review analyses the challenges faced in CAV cybersecurity testing, including the access to and cost of a representative test setup and the lack of experts in the field. A review of potential solutions to overcome these challenges is presented. Studies have demonstrated Artificial Intelligence (AI) to be a promising technique for reducing runtime, enhancing effectiveness and comprehensively covering all the standard test aspects of penetration testing in other industries. However, this review has identified a significant gap in the systematic implementation of AI for penetration testing in the CAV cybersecurity domain. The expectation from this review is to investigate potential AI algorithms which can demonstrate similar improvements in runtime and efficiency for a CAV model. If proven to be an effective means of penetration testing for CAVs, this methodology may be used on a full CAV test network.
Keywords: Cybersecurity, connected vehicles, software simulation, artificial intelligence, penetration testing.
325 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision making, such as selecting the members in a game and the strategy of the game, based on the analysis of accumulated sports data has been widely attempted. In fact, in the NBA basketball league, where the world's highest-level players gather, teams analyze the data using various statistical techniques in order to win games. However, it is difficult to analyze the game data for each play, such as the ball tracking or the motion of the players in the game, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, which is considered to be difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for the replacement of players are "whether or not the lineup should be changed" and "whether or not a Small Ball lineup should be adopted". Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, we can accumulate scoring data for each play, which indicates a player's contribution to the game, and the scoring data can be considered as time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, and an NN (Neural Network) model, which can analyze the situation on the court, to build a prediction model of the score. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected all the accumulated NBA data from the 2019-2020 season. We then apply the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: Recurrent Neural Network, players lineup, basketball data, decision making model.
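A hedged sketch of the combined RNN + NN structure described above is given below; the dimensions, feature meanings and training loop are assumptions rather than the authors' setup.

```python
# Sketch of the model structure: an LSTM encodes the recent play-by-play
# scoring sequence and is combined with a feed-forward branch over
# situation/lineup features to predict the next score contribution.
import torch
import torch.nn as nn

class LineupScoreModel(nn.Module):
    def __init__(self, play_dim=6, situation_dim=12, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(play_dim, hidden, batch_first=True)   # time-series branch
        self.situation = nn.Sequential(nn.Linear(situation_dim, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, 1)                      # predicted score

    def forward(self, plays, situation):
        _, (h, _) = self.rnn(plays)              # plays: (batch, time, play_dim)
        s = self.situation(situation)            # situation: (batch, situation_dim)
        return self.head(torch.cat([h[-1], s], dim=1)).squeeze(-1)

model = LineupScoreModel()
pred = model(torch.randn(4, 20, 6), torch.randn(4, 12))   # 4 samples, 20 plays each
print(pred.shape)                                          # torch.Size([4])
```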