Search results for: graph algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1777

337 An Efficient Data Mining Approach on Compressed Transactions

Authors: Jia-Yu Dai, Don-Lin Yang, Jungpin Wu, Ming-Chuan Hung

Abstract:

In an era of knowledge explosion, the volume of data grows rapidly day by day. Since data storage is a limited resource, reducing the space required during processing becomes a challenging issue. Data compression provides a good solution that can lower the required space. Data mining has found many useful applications in recent years because it can help users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process; however, both lack the ability to decompress the data to their original state and to improve data mining performance. In this research, a new approach called Mining Merged Transactions with the Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships among transactions to merge related transactions and builds a quantification table to prune the candidate itemsets that cannot become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.
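
As a rough illustration of the kind of merging-plus-pruning step described above (the abstract does not give M2TQT's actual rules), the following Python sketch merges duplicate transactions into weighted records and uses a per-item count table as an upper bound to discard hopeless candidate itemsets before support counting; all names and the pruning rule are illustrative assumptions, not the authors' method.

```python
from collections import Counter
from itertools import combinations

# Hypothetical illustration: merge identical transactions into weighted records and
# keep a per-item quantification table used to prune candidate itemsets early.
transactions = [
    ["bread", "milk"], ["bread", "milk"], ["bread", "beer"],
    ["milk", "beer"], ["bread", "milk", "beer"],
]
min_support = 2

# Merge duplicate transactions: each merged record carries a weight (its multiplicity).
merged = Counter(frozenset(t) for t in transactions)

# Quantification table: an upper bound on each item's support (here, its total weight).
quant = Counter()
for itemset, weight in merged.items():
    for item in itemset:
        quant[item] += weight

# Prune 2-item candidates whose weakest member already falls below min_support,
# then count the survivors against the merged (weighted) transactions.
items = sorted(quant)
candidates = [c for c in combinations(items, 2)
              if min(quant[i] for i in c) >= min_support]
support = {c: sum(w for s, w in merged.items() if set(c) <= s) for c in candidates}
frequent = {c: s for c, s in support.items() if s >= min_support}
print(frequent)
```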

Keywords: Association rule, data mining, merged transaction, quantification table.

336 Semantic Spatial Objects Data Structure for Spatial Access Method

Authors: Kalum Priyanath Udagepola, Zuo Decheng, Wu Zhibo, Yang Xiaozong

Abstract:

Modern spatial database management systems require a unique Spatial Access Method (SAM) in order to solve complex spatial queries efficiently. In this context, the spatial data structure takes a prominent place in the SAM. An inadequate data structure leads to poor algorithmic choices and a deficient understanding of algorithm behavior on the spatial database. A key step in developing a better semantic spatial object data structure is to quantify the performance effects of the semantic and outlier detection that are not reflected in previous tree structures (the R-Tree and its variants). This paper explores a novel SSRO-Tree, a SAM based on the topo-semantic approach. The paper shows how to identify and handle semantic spatial objects, together with outlier objects, during page overflow/underflow, using gain/loss metrics. We introduce a new SSRO-Tree algorithm which, for selection queries, achieves better performance in practice than the R*-Tree and RO-Tree.

Keywords: Outlier, semantic spatial object, spatial objects, SSRO-Tree, topo-semantic.

335 Specific Emitter Identification Based on Refined Composite Multiscale Dispersion Entropy

Authors: Shaoying Guo, Yanyun Xu, Meng Zhang, Weiqing Huang

Abstract:

Wireless communication networks are developing rapidly, so wireless security is becoming more and more important. Specific emitter identification (SEI), a technique for identifying unique transmitters, is a vital part of wireless communication security. In this paper, an SEI method based on multiscale dispersion entropy (MDE) and refined composite multiscale dispersion entropy (RCMDE) is proposed. The MDE and RCMDE algorithms are used to extract features for the identification of five wireless devices, and a cross-validation support vector machine (CV-SVM) is used as the classifier. The experimental results show that the total identification accuracy is 99.3%, even at a low signal-to-noise ratio (SNR) of 5 dB, which proves that MDE and RCMDE can describe the communication signal series well. In addition, compared with other methods, the proposed method is effective and provides better accuracy and stability for SEI.
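
For readers unfamiliar with the feature, the following Python sketch computes a standard formulation of dispersion entropy together with the coarse-graining used for the multiscale (MDE) extension; the refined composite (RCMDE) averaging over shifted coarse-grainings and the paper's parameter choices are omitted, and all values shown are illustrative only.

```python
import numpy as np
from collections import Counter
from math import erf, sqrt

def dispersion_entropy(x, classes=6, m=2, delay=1):
    """Plain dispersion entropy: map samples to classes via the normal CDF,
    form m-length dispersion patterns, and take the Shannon entropy of the
    pattern distribution (a standard formulation, not necessarily the exact
    variant used in the paper)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    y = np.array([0.5 * (1 + erf((v - mu) / (sigma * sqrt(2)))) for v in x])  # normal CDF
    z = np.clip(np.round(classes * y + 0.5).astype(int), 1, classes)          # class labels
    patterns = [tuple(z[i:i + m * delay:delay]) for i in range(len(z) - (m - 1) * delay)]
    counts = Counter(patterns)
    p = np.array(list(counts.values()), dtype=float) / len(patterns)
    return float(-(p * np.log(p)).sum())

def coarse_grain(x, scale):
    """Non-overlapping averaging used for the multiscale (MDE) extension."""
    n = len(x) // scale
    return np.asarray(x[:n * scale]).reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(0)
signal = rng.standard_normal(2000)
mde_features = [dispersion_entropy(coarse_grain(signal, s)) for s in range(1, 6)]
print(np.round(mde_features, 3))
```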

Keywords: Cross-validation support vector machine, refined composite multiscale dispersion entropy, specific emitter identification, transient signal, wireless communication device.

334 Multiclass Support Vector Machines with Simultaneous Multi-Factors Optimization for Corporate Credit Ratings

Authors: Hyunchul Ahn, William X. S. Wong

Abstract:

Corporate credit rating prediction is one of the most important topics studied by researchers over the last decade. During that time, researchers have pushed the limits of prediction accuracy by applying several data-driven tools, including statistical and artificial intelligence methods. Among them, the multiclass support vector machine (MSVM) has been widely applied due to its good predictability. However, its reliance on heuristics, for example for the parameters of the kernel function and for the choice of an appropriate feature and instance subset, has become the main source of criticism of MSVM, as these heuristics dictate the MSVM architectural variables. This study presents a hybrid MSVM model that is intended to optimize all of these design parameters: the feature selection, the instance selection, and the kernel parameters. Our model adopts a genetic algorithm (GA) to simultaneously optimize these multiple heterogeneous design factors of MSVM.
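
The abstract does not give the encoding, so the following Python fragment is only a hypothetical sketch of how a chromosome could jointly carry a feature mask, an instance mask, and RBF kernel parameters for a GA of this kind; the GA loop and the cross-validated MSVM fitness evaluation are left out.

```python
import numpy as np

# Hypothetical chromosome layout for the simultaneous optimization described above:
# one bit per feature, one bit per training instance, plus real-valued genes for the
# RBF kernel parameters C and gamma.
n_features, n_instances = 20, 500
rng = np.random.default_rng(42)

def random_chromosome():
    return {
        "feature_mask": rng.integers(0, 2, n_features).astype(bool),
        "instance_mask": rng.integers(0, 2, n_instances).astype(bool),
        "log2_C": rng.uniform(-5, 15),      # decoded as C = 2 ** log2_C
        "log2_gamma": rng.uniform(-15, 3),  # decoded as gamma = 2 ** log2_gamma
    }

def crossover(a, b):
    """Uniform crossover on the masks, arithmetic crossover on the kernel genes."""
    return {
        "feature_mask": np.where(rng.integers(0, 2, n_features).astype(bool),
                                 a["feature_mask"], b["feature_mask"]),
        "instance_mask": np.where(rng.integers(0, 2, n_instances).astype(bool),
                                  a["instance_mask"], b["instance_mask"]),
        "log2_C": 0.5 * (a["log2_C"] + b["log2_C"]),
        "log2_gamma": 0.5 * (a["log2_gamma"] + b["log2_gamma"]),
    }

parent1, parent2 = random_chromosome(), random_chromosome()
child = crossover(parent1, parent2)
print(child["feature_mask"].sum(), 2 ** child["log2_C"])
```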

Keywords: Corporate credit rating prediction, feature selection, genetic algorithms, instance selection, multiclass support vector machines.

333 Using Jumping Particle Swarm Optimization for Optimal Operation of Pump in Water Distribution Networks

Authors: R. Rajabpour, N. Talebbeydokhti, M. H. Ahmadi

Abstract:

Careful scheduling of pump operations can result in significant energy savings. Schedules can be defined either implicitly, in terms of other elements of the network such as tank levels, or explicitly, by specifying the times during which each pump is on or off. In this study, two new explicit representations based on time-controlled triggers were analyzed, in which the maximum number of pump switches is established beforehand and the schedule may contain fewer switches than this maximum. The optimal operation of the pumping stations was determined using a Jumping Particle Swarm Optimization (JPSO) algorithm to achieve the minimum energy cost. The model integrates the JPSO optimizer and the EPANET hydraulic network solver. The optimal pump operation schedule of the VanZyl water distribution system was determined using the proposed model and compared with those obtained from genetic and ant colony algorithms. The results indicate that the proposed model utilizing the JPSO algorithm is a versatile management model for the operation of real-world water distribution systems.
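
As an illustration of the explicit, time-controlled-trigger representation mentioned above, the sketch below (not the authors' code) converts a list of (on, off) trigger pairs into an hourly pump status vector and prices it against an assumed two-rate tariff; hydraulic feasibility, which EPANET would supply, is not modeled.

```python
import numpy as np

# Hypothetical sketch: each pump is described by at most a fixed number of (on, off)
# time pairs over a 24 h horizon, and the energy cost is the tariff-weighted energy
# drawn while the pump runs.
hours = np.arange(24)
tariff = np.where((hours >= 7) & (hours < 23), 0.20, 0.08)   # $/kWh, assumed day/night rates
pump_power_kw = 75.0

def schedule_to_status(switch_pairs):
    """Convert (on_hour, off_hour) trigger pairs into a 24-element on/off vector."""
    status = np.zeros(24, dtype=bool)
    for on, off in switch_pairs:
        status[on:off] = True
    return status

def energy_cost(switch_pairs):
    status = schedule_to_status(switch_pairs)
    return float((status * pump_power_kw * tariff).sum())

# A candidate particle with two switch pairs: run overnight and briefly at midday.
candidate = [(0, 6), (12, 14)]
print(energy_cost(candidate))  # cost of this candidate schedule
```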

Keywords: JPSO, operation, optimization, water distribution system.

332 A Medical Resource Forecasting Model for Emergency Room Patients with Acute Hepatitis

Authors: R. J. Kuo, W. C. Cheng, W. C. Lien, T. J. Yang

Abstract:

Taiwan is a hyperendemic area for the hepatitis B virus (HBV). The estimated total number of HBsAg carriers in the general population over 20 years of age is more than 3 million. Therefore, a case record review was conducted from January 2003 to June 2007 for all patients with a diagnosis of acute hepatitis who were admitted to the Emergency Department (ED) of a well-known teaching hospital. The cost of the use of medical resources is defined as the total medical fee. In this study, principal component analysis (PCA) is first employed to reduce the number of dimensions. Support vector regression (SVR) and an artificial neural network (ANN) are then used to develop the forecasting model. A total of 117 patients met the inclusion criteria, and 61% of the patients involved in this study were hepatitis B related. The computational results show that the proposed PCA-SVR model has superior performance compared with the other algorithms. In conclusion, the Child-Pugh score and echogram can both be used to predict the cost of medical resources for patients with acute hepatitis in the ED.
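
A minimal sketch of a PCA-plus-SVR pipeline of the kind described, written with scikit-learn on synthetic stand-in data; the real clinical variables, the number of retained components, and the SVR settings are not given in the abstract and are assumed here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the clinical features (e.g., Child-Pugh score, echogram
# findings, labs) and the total medical fee.
rng = np.random.default_rng(0)
X = rng.normal(size=(117, 12))                    # 117 patients, 12 hypothetical features
y = X[:, :3].sum(axis=1) * 1000 + rng.normal(scale=200, size=117)

pca_svr = make_pipeline(
    StandardScaler(),                # put features on a common scale before PCA
    PCA(n_components=5),             # dimensionality reduction step
    SVR(kernel="rbf", C=10.0),       # regression on the principal components
)
scores = cross_val_score(pca_svr, X, y, cv=5, scoring="neg_mean_absolute_error")
print(-scores.mean())
```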

Keywords: Acute hepatitis, Medical resource cost, Artificial neural network, Support vector regression.

331 Improved Feature Extraction Technique for Handling Occlusion in Automatic Facial Expression Recognition

Authors: Khadijat T. Bamigbade, Olufade F. W. Onifade

Abstract:

The field of automatic facial expression analysis has been an active research area over the last two decades. Its broad applicability in various domains has drawn much attention to developing techniques and datasets that mirror real-life scenarios. Many techniques, such as Local Binary Patterns and its variants (CLBP, LBP-TOP) and, lately, deep learning techniques, have been used for facial expression recognition. However, the problem of occlusion has not been sufficiently handled, making their results inapplicable to real-life situations. This paper develops a simple yet highly efficient method, tagged Local Binary Pattern-Histogram of Gradient (LBP-HOG), with occlusion detection in the face image, using a multi-class SVM for Action Unit and, in turn, expression recognition. Our method was evaluated on three publicly available datasets: JAFFE, CK, and SFEW. Experimental results showed that our approach performed considerably well when compared with state-of-the-art algorithms and gave insight into occlusion detection as a key step to handling expressions in the wild.
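
As background for the LBP part of the descriptor, the following Python sketch implements the textbook 8-neighbour LBP and its normalized histogram; the exact LBP variant, the HOG component, and the occlusion-detection step used in the paper are not reproduced here.

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour Local Binary Pattern: each pixel's neighbours are thresholded
    against the centre and packed into an 8-bit code (textbook operator, not
    necessarily the exact variant combined with HOG in the paper)."""
    g = np.asarray(gray, dtype=float)
    c = g[1:-1, 1:-1]
    # Neighbour offsets in clockwise order starting at the top-left pixel.
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.int32)
    for bit, (dy, dx) in enumerate(shifts):
        neighbour = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        codes |= ((neighbour >= c).astype(np.int32) << bit)
    return codes

def lbp_histogram(gray, bins=256):
    """Normalized histogram of LBP codes, a typical texture feature vector."""
    codes = lbp_image(gray)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / hist.sum()

face = np.random.default_rng(1).integers(0, 256, size=(64, 64))
print(lbp_histogram(face)[:8])
```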

Keywords: Automatic facial expression analysis, local binary pattern, LBP-HOG, occlusion detection.

330 Optimal Placement and Sizing of Energy Storage System in Distribution Network with Photovoltaic Based Distributed Generation Using Improved Firefly Algorithms

Authors: Ling Ai Wong, Hussain Shareef, Azah Mohamed, Ahmad Asrul Ibrahim

Abstract:

The installation of photovoltaic-based distributed generation (PVDG) in an active distribution system can lead to voltage fluctuations due to the intermittent and unpredictable PVDG output power. This paper presents a method for mitigating the voltage rise by optimally locating and sizing the battery energy storage system (BESS) in a PVDG-integrated distribution network. An improved firefly algorithm is used to perform the optimal placement and sizing. Three objective functions are presented, considering the voltage deviation and BESS off-time, with the state of charge as the constraint. The performance of the proposed method is compared with other optimization methods, namely the original firefly algorithm and the gravitational search algorithm. Simulation results show that the proposed optimum BESS location and size improve the voltage stability.
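
For reference, the sketch below shows the standard firefly movement rule on a generic objective; the paper's improvement, the power-flow-based objective functions, and the BESS encoding (bus index and capacity per candidate) are assumptions not detailed in the abstract.

```python
import numpy as np

# Minimal sketch of the standard firefly movement rule on a placeholder objective.
rng = np.random.default_rng(0)

def objective(x):
    return float(np.sum(x ** 2))   # stand-in for a voltage-deviation / BESS off-time cost

n_fireflies, dim, iterations = 20, 2, 100
alpha, beta0, gamma = 0.2, 1.0, 1.0
x = rng.uniform(-5, 5, size=(n_fireflies, dim))

for _ in range(iterations):
    brightness = np.array([objective(xi) for xi in x])   # lower cost = brighter
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if brightness[j] < brightness[i]:             # move i toward the brighter j
                r2 = np.sum((x[i] - x[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
    alpha *= 0.97                                         # gradually damp the random walk

best = x[np.argmin([objective(xi) for xi in x])]
print(best, objective(best))
```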

Keywords: BESS, PVDG, firefly algorithm, voltage fluctuation.

329 On the Reduction of Side Effects in Tomography

Authors: V. Masilamani, C. Vanniarajan, Kamala Krithivasan

Abstract:

Since computed tomography (CT) normally requires hundreds of projections to reconstruct an image, patients are exposed to more X-ray energy, which may cause side effects such as cancer. Even when the variability of the particles in the object is very low, computed tomography requires many projections for good quality reconstruction. In this paper, the low variability of the particles in an object is exploited to obtain good quality reconstruction. Although the reconstructed image and the original image have the same projections, in general they need not be the same. If, in addition to the projections, a priori information about the image is known, it is possible to obtain a good quality reconstructed image. This paper shows, through experimental results, why conventional algorithms fail to reconstruct from a few projections, and gives an efficient polynomial-time algorithm to reconstruct a bi-level image from its projections along rows and columns, together with a known sub-image of the unknown image and smoothness constraints, by reducing the reconstruction problem to an integral max-flow problem. The paper also discusses the necessary and sufficient conditions for uniqueness, and the extension of 2D bi-level image reconstruction to 3D bi-level image reconstruction.
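
The max-flow reduction for the basic case (row and column sums only, without the known sub-image or smoothness constraints the paper also handles) can be sketched as follows using networkx; the node names and the small example projections are illustrative.

```python
import numpy as np
import networkx as nx

# Classical flow reduction for reconstructing a binary image from its row and column
# sums: pixel (i, j) is 1 exactly when one unit of flow crosses the edge row_i -> col_j.
row_sums = [2, 1, 2]
col_sums = [1, 2, 2]
assert sum(row_sums) == sum(col_sums)

G = nx.DiGraph()
for i, r in enumerate(row_sums):
    G.add_edge("s", ("row", i), capacity=r)             # source supplies each row's count
for j, c in enumerate(col_sums):
    G.add_edge(("col", j), "t", capacity=c)             # each column drains its count
for i in range(len(row_sums)):
    for j in range(len(col_sums)):
        G.add_edge(("row", i), ("col", j), capacity=1)  # at most one "1" per pixel

flow_value, flow = nx.maximum_flow(G, "s", "t")
assert flow_value == sum(row_sums), "no binary image matches these projections"

image = np.array([[flow[("row", i)][("col", j)] for j in range(len(col_sums))]
                  for i in range(len(row_sums))])
print(image, image.sum(axis=1), image.sum(axis=0))
```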

Keywords: Discrete Tomography, Image Reconstruction, Projection, Computed Tomography, Integral Max Flow Problem, Smooth Binary Image.

328 Morphing Human Faces: Automatic Control Points Selection and Color Transition

Authors: Stephen Karungaru, Minoru Fukumi, Norio Akamatsu

Abstract:

In this paper, we propose a morphing method by which face color images can be freely transformed. The main focus of this work is the transformation of one face image into another. The method is fully automatic in that it can morph two face images by automatically detecting all the control points necessary to perform the morph. A face detection neural network, edge detection and median filters are employed to detect the face position and features. Five control points, for both the source and target images, are then extracted based on the facial features. A triangulation method is then used to match and warp the source image to the target image using the control points. Finally, color interpolation is done using a color Gaussian model that calculates the color for each particular frame depending on the number of frames used. A real-coded genetic algorithm is used in both the image warping and color blending steps to assist in step-size decisions and to speed up the morphing. This method results in very smooth morphs and is fast to process.

Keywords: Color transition, genetic algorithms, morphing, warping.

327 Site Selection of Traffic Camera based on Dempster-Shafer and Bagging Theory

Authors: S. Rokhsari, M. Delavar, A. Sadeghi-Niaraki, A. Abed-Elmdoust, B. Moshiri

Abstract:

Traffic incidents have adverse effects on all parts of society, so controlling road networks with enough traffic devices can help decrease the number of accidents, and using the best method for the optimum site selection of these devices can help implement a good monitoring system. This paper considers the important criteria for the optimum site selection of traffic cameras, based on aggregation methods such as Bagging and Dempster-Shafer concepts. In the first step, important criteria, such as annual traffic flow and distance from critical places such as parks that need more traffic control, were identified for the selection of important road links for traffic camera installation. Then, classification methods such as artificial neural network and decision tree algorithms were employed to classify road links based on their importance for camera installation. Finally, to improve the results of the classifiers, aggregation methods such as Bagging and Dempster-Shafer theories were used.
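
Since Dempster-Shafer combination is central to the aggregation step, the following sketch implements Dempster's rule for two basic probability assignments; the frame of discernment and the example masses are hypothetical, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments given as
    dicts mapping frozenset hypotheses to masses. Conflicting mass (empty
    intersections) is discarded and the rest renormalized."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Hypothetical example: two classifiers give evidence about whether a road link is an
# "important" (I) or "unimportant" (U) camera site; {I, U} represents ignorance.
I, U = frozenset("I"), frozenset("U")
m_ann  = {I: 0.6, U: 0.1, I | U: 0.3}   # e.g., from a neural network
m_tree = {I: 0.5, U: 0.2, I | U: 0.3}   # e.g., from a decision tree
print(dempster_combine(m_ann, m_tree))
```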

Keywords: Aggregation, Bagging theory, Dempster-Shafer theory, Site selection

326 Interfacing Photovoltaic Systems to the Utility Grid: A Comparative Simulation Study to Mitigate the Impact of Unbalanced Voltage Dips

Authors: Badr M. Alshammari, A. Rabeh, A. K. Mohamed

Abstract:

This paper presents the modeling and control of a grid-connected photovoltaic system (PVS). Firstly, the MPPT control of the PVS and its associated DC/DC converter is analyzed in order to extract the maximum available power. Secondly, the control system of the grid-side converter (GSC), which is a three-phase voltage source inverter (VSI), is presented. Special attention is paid to the control algorithms of the GSC converter during grid voltage imbalances. In particular, three different control objectives are to be achieved: the mitigation of the adverse effects of the grid imbalance on the currents injected at the point of common coupling (PCC), the elimination of double-frequency oscillations in the active power flow, and the elimination of double-frequency oscillations in the reactive power flow. Simulation results for two control strategies have been obtained with MATLAB in order to demonstrate the particularities of each control strategy with respect to power quality standards.

Keywords: Renewable energies, photovoltaic systems, DC link, voltage source inverter, space vector SVPWM, unbalanced voltage dips, symmetrical components.

325 FIR Filter Design via Linear Complementarity Problem, Messy Genetic Algorithm, and Ising Messy Genetic Algorithm

Authors: A.M. Al-Fahed Nuseirat, R. Abu-Zitar

Abstract:

In this paper, the design of maximally flat linear-phase finite impulse response (FIR) filters is considered. The problem is handled with two entirely different approaches. The first is a completely deterministic numerical approach in which the problem is formulated as a Linear Complementarity Problem (LCP). The second is based on a combination of a Markov Random Field (MRF) approach with a messy genetic algorithm (MGA). Markov Random Fields (MRFs) are a class of probabilistic models that have been applied for many years to the analysis of visual patterns or textures. Our objective is to establish MRFs as an interesting approach to modeling messy genetic algorithms. We establish a theoretical result that every genetic algorithm problem can be characterized in terms of an MRF model. This allows us to construct an explicit probabilistic model of the MGA fitness function and to introduce the Ising MGA. Experiments done with the Ising MGA are less costly than those done with the standard MGA, since far fewer computations are involved; the LCP requires the fewest computations of all. Results of the LCP, random search, random seeded search, MGA, and Ising MGA are discussed.

Keywords: Filter design, FIR digital filters, LCP, Ising model, MGA, Ising MGA.

324 Approximating Maximum Speed on Road from Curvature Information of Bezier Curve

Authors: M. Y. Misro, A. Ramli, J. M. Ali

Abstract:

Bezier curves have useful properties for the path generation problem; for instance, they can generate a reference trajectory for vehicles that satisfies the path constraints. Both algorithms join cubic Bezier curve segments smoothly to generate the path. One of the useful properties of a Bezier curve is its curvature. In mathematics, curvature is the amount by which a geometric object deviates from being flat, or straight in the case of a line. Another example is a circle, whose curvature at any point is equal to the reciprocal of its radius: the smaller the radius, the higher the curvature, and thus the more sharply the vehicle needs to bend. In this study, we use a Bezier curve to fit a highway-like curve. We use a different approach to find the best approximation for the curve so that it resembles a highway-like curve. We compute the curvature value by analytical differentiation of the Bezier curve, and we then compute the maximum driving speed using the curvature information obtained. Our research rests on some assumptions: first, that the Bezier curve estimates the real shape of the curve, which can be verified visually. Even though the Bezier fitting process does not interpolate exactly on the curve of interest, we believe that the speed estimates are acceptable. We verified our result against a manual calculation of the curvature from the map.
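
A worked sketch of the curvature computation and the resulting speed bound, assuming the usual lateral-friction limit v_max = sqrt(mu * g / kappa); the control points, friction coefficient, and the curve-fitting step are illustrative and not taken from the paper.

```python
import numpy as np

# Curvature of a cubic Bezier and the corresponding speed limit along it.
P = np.array([[0.0, 0.0], [40.0, 0.0], [80.0, 30.0], [120.0, 30.0]])  # control points [m]
mu, g = 0.7, 9.81                                                      # assumed friction, gravity

def bezier_derivatives(P, t):
    """First and second derivatives of a cubic Bezier at parameter t."""
    t = np.atleast_1d(t)[:, None]
    d1 = (3 * (1 - t) ** 2 * (P[1] - P[0])
          + 6 * (1 - t) * t * (P[2] - P[1])
          + 3 * t ** 2 * (P[3] - P[2]))
    d2 = 6 * (1 - t) * (P[2] - 2 * P[1] + P[0]) + 6 * t * (P[3] - 2 * P[2] + P[1])
    return d1, d2

t = np.linspace(0.0, 1.0, 200)
d1, d2 = bezier_derivatives(P, t)
kappa = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]) / (d1[:, 0] ** 2 + d1[:, 1] ** 2) ** 1.5
v_max = np.sqrt(mu * g / np.maximum(kappa, 1e-9))      # m/s along the curve
print(f"max curvature = {kappa.max():.4f} 1/m, speed limit there = {3.6 * v_max.min():.1f} km/h")
```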

Keywords: Speed estimation, path constraints, reference trajectory, Bezier curve.

323 Energy-Efficient Clustering Protocol in Wireless Sensor Networks for Healthcare Monitoring

Authors: Ebrahim Farahmand, Ali Mahani

Abstract:

Wireless sensor networks (WSNs) can facilitate the continuous monitoring of patients and increase the early detection of emergency conditions and diseases. High-density WSNs help us accurately monitor a remote environment by intelligently combining the data from the individual nodes. Due to the limited energy capacity of the sensors, enhancing the lifetime and the reliability of WSNs are important factors in the design of these networks. Clustering strategies have been verified as effective and practical algorithms for reducing energy consumption in WSNs and can tackle WSN limitations. In this paper, an Energy-efficient Weight-based Clustering Protocol (EWCP) is presented. An artificial retina is selected as a case study of WSNs applied to body sensors. Cluster head (CH) selection is equipped with energy-efficient parameters, and cluster members are selected based on their distance to the selected CHs. Compared with the other benchmark protocols, the lifetime of EWCP is improved significantly.

Keywords: Clustering of WSNs, healthcare monitoring, weight-based clustering, wireless sensor networks.

322 A Two-Phase Mechanism for Agent's Action Selection in Soccer Simulation

Authors: Vahid Salmani, Mahmoud Naghibzadeh, Farid Seifi, Amirhossein Taherinia

Abstract:

Soccer simulation is an effort to motivate researchers and practitioners to do artificial and robotic intelligence research and, at the same time, to put the results into practice and test them. Many researchers and practitioners throughout the world are continuously working to polish their ideas and improve their implemented systems, while new groups are forming and bringing bright new thoughts to the field. The research includes designing and executing robotic soccer simulation algorithms. In our research, a soccer simulation player is considered to be an intelligent agent that is capable of receiving information from the environment, analyzing it, and choosing the best action from a set of possible ones for its next move. We concentrate on developing a two-phase method for the soccer player agent to choose its best next move. The method is then implemented in our software system, the Nexus simulation team of Ferdowsi University. This system is based on the TsinghuAeolus [1] team, which was the champion of the RoboCup soccer simulation world contest in 2001 and 2002.

Keywords: RoboCup, Soccer simulation, multi-agent environment, intelligent soccer agent, ball controller agent.

321 A Similarity Function for Global Quality Assessment of Retinal Vessel Segmentations

Authors: Arturo Aquino, Manuel Emilio Gegundez, Jose Manuel Bravo, Diego Marin

Abstract:

Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms and, specifically, accuracy has mostly been used as the measure of global performance in this topic. However, this metric shows very poor agreement with human perception, as well as other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. This similarity function is based on characterizing the vascular tree as a connected structure with a measurable area and length. The tests performed indicate that this new approach behaves better than the current one. More generally, this concept of measuring descriptive properties may be used for designing functions that more successfully measure the segmentation quality of other complex structures.

Keywords: Retinal vessel segmentation, quality assessment, performance evaluation, similarity function.

320 Detection and Correction of Ectopic Beats for HRV Analysis Applying Discrete Wavelet Transforms

Authors: Desmond B. Keenan

Abstract:

The clinical usefulness of heart rate variability (HRV) is limited by the range of Holter monitoring software available. These software algorithms require a normal sinus rhythm to accurately acquire HRV measures in the frequency domain. Premature ventricular contractions (PVCs), more commonly referred to as ectopic beats and frequent in heart failure, hinder this analysis and introduce ambiguity. This investigation demonstrates an algorithm to automatically detect ectopic beats by analyzing discrete wavelet transform coefficients. Two techniques for filtering and replacing the ectopic beats in the RR signal are compared: one applies wavelet hard thresholding, and the other applies linear interpolation to replace the ectopic cycles. The results demonstrate, through simulation and through signals acquired from a 24-hour ambulatory recorder, that these techniques can accurately detect PVCs and remove the noise and leakage effects produced by ectopic cycles, retaining smooth spectra with minimal error.
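
A minimal sketch of the interpolation-based correction described above; the ectopic-beat detector here is a simple deviation-from-local-median rule standing in for the paper's wavelet-coefficient analysis, and the RR series is synthetic.

```python
import numpy as np

# Flag ectopic RR intervals and replace them by linear interpolation.
rng = np.random.default_rng(3)
rr = 0.8 + 0.05 * np.sin(np.linspace(0, 6 * np.pi, 300)) + 0.01 * rng.standard_normal(300)
rr[[50, 51, 180]] = [0.45, 1.25, 0.40]        # injected ectopic couplet and single PVC

def correct_ectopics(rr, window=11, tolerance=0.2):
    """Replace RR intervals deviating more than `tolerance` (fractional) from the
    local median with linearly interpolated values."""
    rr = np.asarray(rr, dtype=float)
    pad = window // 2
    padded = np.pad(rr, pad, mode="edge")
    local_median = np.array([np.median(padded[i:i + window]) for i in range(len(rr))])
    ectopic = np.abs(rr - local_median) > tolerance * local_median
    corrected = rr.copy()
    good = ~ectopic
    corrected[ectopic] = np.interp(np.flatnonzero(ectopic), np.flatnonzero(good), rr[good])
    return corrected, ectopic

corrected, flags = correct_ectopics(rr)
print(np.flatnonzero(flags), corrected[[50, 51, 180]].round(3))
```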

Keywords: Heart rate variability, vagal tone, sympathetic, parasympathetic, wavelets, ectopic beats, spectral analysis.

319 A Text Clustering System based on k-means Type Subspace Clustering and Ontology

Authors: Liping Jing, Michael K. Ng, Xinhua Yang, Joshua Zhexue Huang

Abstract:

This paper presents a text clustering system developed on the basis of a k-means type subspace clustering algorithm to cluster large, high-dimensional and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster, so that the important words of a cluster can be identified by their weight values. For the understanding and interpretation of clustering results, a few keywords that best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words. The candidate words are first selected according to their weights, calculated by our new algorithm. Then, the candidates are fed to WordNet to identify the set of noun words and to consolidate synonyms and hyponyms. Experimental results have shown that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to the k-means type algorithm Bisecting-KMeans. Furthermore, the word extraction method is effective in selecting words to represent the topics of the clusters.
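
To make the extra weighting step concrete, the sketch below runs a k-means variant with per-cluster feature weights updated by a common entropy-regularized rule; it illustrates the idea but is not claimed to be the authors' exact update, and the toy data and parameters are assumptions.

```python
import numpy as np

# k-means with per-cluster feature weights: w[l, j] is proportional to exp(-D[l, j] / gamma),
# where D[l, j] is the within-cluster dispersion of feature j, so low-dispersion
# (discriminative) features get large weights.
def weighted_kmeans(X, k, gamma=1.0, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    centers = X[rng.choice(n, k, replace=False)]
    weights = np.full((k, d), 1.0 / d)
    labels = np.zeros(n, dtype=int)
    for _ in range(iters):
        # Assign each document to the cluster with the smallest weighted distance.
        dist = np.stack([((X - centers[l]) ** 2 * weights[l]).sum(axis=1) for l in range(k)])
        labels = dist.argmin(axis=0)
        for l in range(k):
            members = X[labels == l]
            if len(members) == 0:
                continue
            centers[l] = members.mean(axis=0)
            D = ((members - centers[l]) ** 2).sum(axis=0)        # per-feature dispersion
            w = np.exp(-D / gamma)
            weights[l] = w / w.sum()                              # large weight = low dispersion
    return labels, centers, weights

rng = np.random.default_rng(1)
informative = np.vstack([rng.normal(m, 0.3, size=(30, 2)) for m in (0.0, 2.0)])
noise = rng.normal(0.0, 2.0, size=(60, 3))       # high-dispersion, uninformative features
X = np.hstack([informative, noise])
labels, centers, weights = weighted_kmeans(X, k=2)
print(weights.round(3))   # heavily weighted features indicate each cluster's keywords
```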

Keywords: Subspace Clustering, Text Mining, Feature Weighting, Cluster Interpretation, Ontology

318 Manufacturing of Full Automatic Carwash Using Intelligent Control Algorithms

Authors: Amir Hossein Daei Sorkhabi, Bita Khazini

Abstract:

In this paper, the intelligent control of a fully automatic carwash using a programmable logic controller (PLC) has been investigated and designed to perform all steps of car washing. The intelligent control of the fully automatic carwash has the ability to identify and profile the geometrical dimensions of the vehicle chassis. Vehicle dimension identification is an important point in this control system for adjusting the washing brush positions and time durations. The study also designs a control set for simulating and building the automatic carwash. The main purpose of the simulation is to develop criteria for designing and building this type of carwash at actual size and to overcome the challenges of automation. The results of this research indicate that the proposed process control method not only increases productivity, speed, accuracy and safety but also reduces the time and cost of washing, based on a dynamic model of the vehicle. A laboratory prototype based on advanced intelligent control has been built to study the validity of the design and simulation, and its appropriate performance confirms the validity of this study.

Keywords: Automatic Carwash, Dimension, PLC.

317 Optimal Control Strategy for High Performance EV Interior Permanent Magnet Synchronous Motor

Authors: Mehdi Karbalaye Zadeh, Ehsan M. Siavashi

Abstract:

The controllable electrical loss, which consists of the copper loss and the iron loss, can be minimized by optimal control of the armature current vector. A control algorithm for the current vector that minimizes the electrical loss is proposed, and the optimal current vector can be decided according to the operating speed and the load conditions. The proposed control algorithm is applied to an experimental PM motor drive system. This paper presents a modern approach to speed control of a permanent magnet synchronous motor (PMSM) for an electric vehicle using nonlinear control. The regulation algorithms are based on the feedback linearization technique. The direct component of the current is controlled to be zero, which ensures maximum torque operation, and near-unity power factor operation is also achieved. Moreover, among the features of EV electric propulsion motors, energy efficiency is a basic characteristic that is influenced by vehicle dynamics and system architecture. For this reason, the EV dynamics are taken into account.
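
A small numeric illustration of the d-q torque expression behind the zero d-axis current strategy; the machine parameters are arbitrary example values, not those of the drive in the paper.

```python
# d-q torque of a PM synchronous machine: forcing the direct-axis current to zero
# removes the reluctance term and leaves torque proportional to i_q.
p_pairs = 4               # pole pairs (example value)
lam = 0.10                # permanent-magnet flux linkage [Wb] (example value)
Ld, Lq = 2.0e-3, 3.5e-3   # d- and q-axis inductances [H] (example values)

def torque(i_d, i_q):
    """Electromagnetic torque T = 1.5 * p * (lam * i_q + (Ld - Lq) * i_d * i_q)."""
    return 1.5 * p_pairs * (lam * i_q + (Ld - Lq) * i_d * i_q)

i_q = 50.0
print(torque(0.0, i_q))      # i_d = 0: pure magnet torque, T = 1.5 * p * lam * i_q
print(torque(-20.0, i_q))    # negative i_d adds reluctance torque when Ld < Lq
```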

Keywords: PMSM, Electric Vehicle, Optimal control, Traction.

316 WebAppShield: An Approach Exploiting Machine Learning to Detect SQLi Attacks in an Application Layer in Run-Time

Authors: Ahmed Abdulla Ashlam, Atta Badii, Frederic Stahl

Abstract:

In recent years, SQL injection attacks have been identified as being prevalent against web applications. They affect network security and user data, which leads to a considerable loss of money and data every year. This paper presents the use of classification algorithms in machine learning, using a method to classify login input data as "SQLi" or "Non-SQLi", thus increasing the reliability and accuracy of the results when deciding whether an operation is an attack or a valid operation. A method, implemented as a web app, is developed for auto-generated data replication to provide a twin of the targeted data structure. A shield against SQLi attacks (WebAppShield) has been developed that verifies all users and prevents attackers (SQLi attacks) from entering or accessing the database, admitting only the inputs that the machine learning module predicts as "Non-SQLi". A special login form has been developed with a special instance of data validation; this verification process secures the web application from its earliest stages. The system has been tested and validated, and up to 99% of SQLi attacks were prevented.

Keywords: SQL injection, attacks, web application, accuracy, database, WebAppShield.

315 Sequential Straightforward Clustering for Local Image Block Matching

Authors: Mohammad Akbarpour Sekeh, Mohd. Aizaini Maarof, Mohd. Foad Rohani, Malihe Motiei

Abstract:

Duplicated region detection is a technique to expose copy-paste forgeries in digital images. Copy-paste is one of the common types of forgery, cloning a portion of an image in order to conceal or duplicate a particular object. In this type of forgery detection, extracting a robust block feature and the high time complexity of the matching step are the two main open problems. This paper concentrates on the computational time and proposes a local block matching algorithm based on block clustering to reduce the time complexity. The time complexity of the proposed algorithm is formulated, and the effects of two parameters, the block size and the number of clusters, on the efficiency of the algorithm are considered. The experimental results and mathematical analysis demonstrate that this algorithm is more cost-effective, in terms of time complexity, than lexicographic-sorting algorithms when the image is complex.

Keywords: Copy-paste forgery detection, Duplicated region, Time complexity, Local block matching, Sequential block clustering.

314 Performance Improvement in the Bivariate Models by using Modified Marginal Variance of Noisy Observations for Image-Denoising Applications

Authors: R. Senthilkumar

Abstract:

Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, the wavelet coefficients of natural images have significant dependencies. This paper attempts to give a recipe for selecting one of the popular image-denoising algorithms based on VisuShrink, SureShrink, OracleShrink, BayesShrink and BiShrink, and it also compares different bivariate models used for image-denoising applications. The first part of the paper compares different shrinkage functions used for image denoising. The second part compares different bivariate models, and the third part uses the bivariate model with a modified marginal variance based on the Laplacian assumption. The paper gives an experimental comparison on six commonly used 512x512 images: Lenna, Barbara, Goldhill, Clown, Boat and Stonehenge. Noise at powers of 25 dB, 26 dB, 27 dB, 28 dB and 29 dB is added to the six standard images, and the corresponding Peak Signal to Noise Ratio (PSNR) values are calculated for each noise level.
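
For context, the sketch below applies the standard bivariate shrinkage rule, in which a coefficient is shrunk jointly with its parent at the next coarser scale; the paper's modified marginal variance would change how the signal deviation is estimated, and the subband data here are synthetic.

```python
import numpy as np

# Standard bivariate shrinkage of a wavelet coefficient y1 given its parent y2:
#   w1_hat = y1 * max(sqrt(y1^2 + y2^2) - sqrt(3) * sigma_n^2 / sigma, 0) / sqrt(y1^2 + y2^2)
def bivariate_shrink(y1, y2, sigma_n, sigma):
    mag = np.sqrt(y1 ** 2 + y2 ** 2)
    gain = np.maximum(mag - np.sqrt(3.0) * sigma_n ** 2 / np.maximum(sigma, 1e-12), 0.0)
    return np.where(mag > 0, y1 * gain / np.maximum(mag, 1e-12), 0.0)

rng = np.random.default_rng(0)
child = rng.laplace(scale=2.0, size=(8, 8)) + rng.normal(scale=0.5, size=(8, 8))   # noisy subband
parent = rng.laplace(scale=2.0, size=(8, 8))                                        # parent subband
sigma_n = 0.5                                                                        # noise std (assumed known)
sigma = np.sqrt(np.maximum(child.var() - sigma_n ** 2, 1e-12))                       # marginal signal std
print(bivariate_shrink(child, parent, sigma_n, sigma).round(2))
```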

Keywords: BiShrink, Image-Denoising, PSNR, Shrinkage function

313 Collision Detection Algorithm Based on Data Parallelism

Authors: Zhen Peng, Baifeng Wu

Abstract:

Modern computing technology is entering the era of parallel computing, with a trend toward sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to follow this trend: it gathers more and more computing capability by increasing the number of processor cores, without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of increasingly large amounts of data, and data-parallel computing will be an important way to further improve their performance. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data-parallel algorithm. According to this model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unmatched by traditional algorithms.
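
A small data-parallel sketch of the decomposition idea: each complex object becomes an array of axis-aligned bounding boxes (its simple parts) and all pairwise overlap tests are evaluated at once with vectorized, SIMD-friendly operations; the exact-geometry tests of a real BIM clash check are omitted.

```python
import numpy as np

# Vectorized all-pairs AABB overlap test between the parts of two complex objects.
rng = np.random.default_rng(0)

def random_boxes(n, offset):
    lo = rng.uniform(0, 10, size=(n, 3)) + offset
    return np.concatenate([lo, lo + rng.uniform(0.1, 1.0, size=(n, 3))], axis=1)  # [xmin..zmax]

A = random_boxes(200, offset=0.0)     # parts of complex object A
B = random_boxes(300, offset=0.5)     # parts of complex object B

# Broadcast A against B: overlap on every axis simultaneously => the boxes intersect.
a_lo, a_hi = A[:, None, :3], A[:, None, 3:]
b_lo, b_hi = B[None, :, :3], B[None, :, 3:]
overlap = np.all((a_lo <= b_hi) & (b_lo <= a_hi), axis=2)   # shape (200, 300)

print(overlap.sum(), "colliding part pairs; objects collide:", bool(overlap.any()))
```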

Keywords: Data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability.

312 Jobs Scheduling and Worker Assignment Problem to Minimize Makespan using Ant Colony Optimization Metaheuristic

Authors: Mian Tahir Aftab, Muhammad Umer, Riaz Ahmad

Abstract:

This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize the total makespan for scheduling a set of jobs and assigning workers on uniformly related parallel machines. An algorithm based on ACO has been developed and coded in Matlab® to solve this problem. The paper explains the steps of applying the ant colony approach to the problem of minimizing the makespan for the worker assignment and job scheduling problem in a parallel machine model, and it aims to evaluate the strength of ACO compared with other conventional approaches. One data set containing 100 problems (12 jobs, 3 machines and 10 workers), which is available on the internet, has been taken and solved with this ACO algorithm. Our ACO-based algorithm has shown drastically improved results, especially in terms of the negligible computational effort required to reach the optimal solution. In our case, the time taken to solve all 100 problems is even less than the average time taken to solve one problem in the data set by other conventional approaches such as the GA algorithm and SPT-A/LMC heuristics.

Keywords: Ant Colony Optimization (ACO), Genetic algorithms (GA), Makespan, SPT-A/LMC heuristic.

311 DCBOR: A Density Clustering Based on Outlier Removal

Authors: A. M. Fahim, G. Saake, A. M. Salem, F. A. Torkey, M. A. Ramadan

Abstract:

Data clustering is an important data exploration technique with many applications in data mining. We present an enhanced version of the well-known single-link clustering algorithm, which we refer to as DCBOR. The proposed algorithm alleviates the chaining effect by removing the outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update the distance matrix, since it depends on merging the k nearest objects in one step, and a cluster continues to grow as long as possible under a specified condition. The algorithm therefore consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. The algorithm discovers clusters of different shapes, sizes and densities, and it requires only one input parameter, a threshold for outlier points whose value ranges from 0 to 1; the algorithm supports the user in determining an appropriate value for it. We have tested this algorithm on different datasets containing outliers and clusters connected by chains of dense points, and the algorithm discovers the correct clusters. The results of our experiments demonstrate the effectiveness and the efficiency of DCBOR.
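
A compact sketch of the two-phase idea using SciPy: points with unusually large k-nearest-neighbour distances are dropped first, then single-link clustering runs on the remainder; the outlier score and thresholds are illustrative assumptions, not DCBOR's own rules.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, fcluster

# Phase 1: remove likely outliers; Phase 2: single-link clustering of the rest.
rng = np.random.default_rng(0)
blob1 = rng.normal([0, 0], 0.3, size=(50, 2))
blob2 = rng.normal([4, 0], 0.3, size=(50, 2))
outliers = rng.uniform(-2, 6, size=(5, 2))
X = np.vstack([blob1, blob2, outliers])

# Phase 1: outlier removal by k-nearest-neighbour distance.
k = 4
D = squareform(pdist(X))
knn_dist = np.sort(D, axis=1)[:, k]                 # distance to the k-th neighbour
keep = knn_dist < knn_dist.mean() + 2 * knn_dist.std()

# Phase 2: single-link clustering of the retained points.
Z = linkage(X[keep], method="single")
labels = fcluster(Z, t=1.0, criterion="distance")   # cut the dendrogram at distance 1.0
print(f"removed {np.count_nonzero(~keep)} candidate outliers, found {labels.max()} clusters")
```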

Keywords: Data Clustering, Clustering Algorithms, Handling Noise, Arbitrary Shape of Clusters.

310 Predicting the Three Major Dimensions of the Learner's Emotions from Brainwaves

Authors: Alicia Heraz, Claude Frasson

Abstract:

This paper investigates how the use of machine learning techniques can significantly predict the three major dimensions of a learner's emotions (pleasure, arousal and dominance) from brainwaves. The study adopted an experiment in which participants were exposed to a set of pictures from the International Affective Picture System (IAPS) while their electrical brain activity was recorded with an electroencephalogram (EEG). The pictures had already been rated in a previous study via the affective rating system Self-Assessment Manikin (SAM) to assess the three dimensions of pleasure, arousal, and dominance. For each picture, we took the mean of these values over all subjects used in that previous study and associated them with the recorded brainwaves of the participants in our study. Correlation and regression analyses confirmed the hypothesis that brainwave measures can significantly predict the emotional dimensions. This can be very useful in the case of impassive, taciturn or disabled learners. Standard classification techniques were used to assess the reliability of the automatic detection of the learners' three major dimensions from the brainwaves. We discuss the results and the pertinence of such a method for assessing a learner's emotions and integrating it into a brainwave-sensing Intelligent Tutoring System.

Keywords: Algorithms, brainwaves, emotional dimensions, performance.

309 Tape-Shaped Multiscale Fiducial Marker: A Design Prototype for Indoor Localization

Authors: Marcell S. A. Martins, Benedito S. R. Neto, Gerson L. Serejo, Carlos G. R. Santos

Abstract:

Indoor positioning systems use sensors such as Bluetooth, ZigBee, and Wi-Fi, as well as cameras for image capture, which can be fixed or mobile. Such computer-vision-based positioning approaches are low-cost to implement, especially when a mobile camera is used. The present study aims to create a design for a fiducial marker for a low-cost indoor localization system. The marker is tape-shaped to allow continuous reading by two detection algorithms, one for greater distances and another for smaller distances, so that the location service remains operational even as the capture distance varies. A minimal localization and reading algorithm was implemented for the proposed marker design, aiming to validate it. The accuracy tests consider readings with the capture distance varying between 0.5 and 10 meters, comparing the proposed marker with others. The tests showed that the proposed marker has a broader capture range than ArUco and QR Code markers of the same size, thereby reducing visual pollution and maximizing tracking, since the environment can be covered entirely.

Keywords: Multiscale recognition, indoor localization, tape-shaped marker, Fiducial Marker.

308 Weld Defect Detection in Industrial Radiography Based Digital Image Processing

Authors: N. Nacereddine, M. Zelmat, S. S. Belaïfa, M. Tridi

Abstract:

Industrial radiography is a well-known technique for the identification and evaluation of discontinuities, or defects, such as cracks, porosity and foreign inclusions found in welded joints. Although this technique has been well developed, improving both the inspection process and the operating time, it does suffer from several drawbacks. The poor quality of radiographic images is due to the physical nature of radiography as well as to the small size of the defects and their poor orientation relative to the size and thickness of the evaluated parts. Digital image processing techniques allow the interpretation of the image to be automated, avoiding the presence of human operators and making the inspection system more reliable, reproducible and faster. This paper describes our attempt to develop and implement digital image processing algorithms for the purpose of automatic defect detection in radiographic images. Because of the complex nature of the considered images, and in order that the detected defect region represent the real defect as accurately as possible, the choice of global and local preprocessing and segmentation methods must be appropriate.

Keywords: Digital image processing, global and local approaches, radiographic film, weld defect.
