Search results for: K-means clustering algorithm
3556 The Use of Appeals in Green Printed Advertisements: A Case of Product Orientation and Organizational Image Orientation Ads
Authors: Chutima Ruanguttamanun
Abstract:
Despite the relatively large number of studies that have examined the use of appeals in advertisements, research on the use of appeals in green advertisements is still underdeveloped and needs to be investigated further, as appeals are a key tool for marketers in creating striking ads. In this study, content analysis was employed to examine the nature of green advertising appeals and to match the appeals with the green advertisements. Two different types of green print advertisements, product orientation and organizational image orientation, were used. Thirty highly educated participants with different backgrounds were asked individually to ascertain three appeals out of thirty-four given appeals found among forty real green advertisements. To analyze participant responses and to group them based on common appeals, two-step K-means clustering was used. The clustering solution indicates that eye-catching graphics and imaginative appeals are highly notable in both types of green ads. Depressed, meaningful and sad appeals are found to be highly used in organizational image orientation ads, whereas corporate image, informative and natural appeals are found to be essential for product orientation ads.
Keywords: advertising appeals, green marketing, green advertisement, printed advertisement
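A minimal sketch of the grouping step, using a plain scikit-learn k-means pass on a hypothetical participants-by-appeals selection matrix rather than the two-step procedure used in the study; the data and cluster count are illustrative assumptions only.

```python
# Minimal sketch (not the study's two-step procedure): plain k-means on a
# hypothetical binary matrix of which appeals each participant selected.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_participants, n_appeals = 30, 34
X = np.zeros((n_participants, n_appeals))
for row in X:                                   # each participant marks 3 of 34 appeals
    row[rng.choice(n_appeals, size=3, replace=False)] = 1

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for c in range(2):
    top = np.argsort(km.cluster_centers_[c])[::-1][:3]
    print(f"cluster {c}: most frequent appeal indices {top.tolist()}")
```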
Procedia PDF Downloads 279
3555 Efficiency of Grover’s Search Algorithm Implemented on Open Quantum System in the Presence of Drive-Induced Dissipation
Authors: Nilanjana Chanda, Rangeet Bhattacharyya
Abstract:
Grover’s search algorithm is the fastest possible quantum mechanical algorithm to search for a certain element in an unstructured set of N items. The algorithm can determine the desired result in only O(√N) steps. It has long been demonstrated theoretically and experimentally on two-qubit systems. In this work, we investigate the fidelity of Grover’s search algorithm by implementing it on an open quantum system. In particular, we study with what accuracy one can estimate that the algorithm would deliver the searched state. In reality, every system has some influence on its environment. We include the environmental effects on the system dynamics by using a recently reported fluctuation-regulated quantum master equation (FRQME). We consider that the environment experiences thermal fluctuations, which leave their signature in the second-order term of the master equation through their appearance as a regulator. The FRQME indicates that in addition to the regular relaxation due to system-environment coupling, the applied drive also causes dissipation in the system dynamics. As a result, the fidelity is found to depend on both the drive-induced dissipative terms and the relaxation terms, and we find that there exists a competition between them, leading to an optimum drive amplitude for which the fidelity becomes maximum. For efficient implementation of the search algorithm, precise knowledge of this optimum drive amplitude is essential.
Keywords: dissipation, fidelity, quantum master equation, relaxation, system-environment coupling
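A small worked example of the O(√N) scaling using the standard textbook formulas for the ideal (closed-system) case; this illustrates the iteration count only and does not reproduce the FRQME fidelity analysis of the abstract.

```python
# Ideal Grover statistics: optimal iteration count and success probability for N items.
import math

def grover_stats(n_items, n_marked=1):
    theta = math.asin(math.sqrt(n_marked / n_items))
    k_opt = math.floor(math.pi / (4 * theta))            # ~ (pi/4)*sqrt(N) for one marked item
    p_success = math.sin((2 * k_opt + 1) * theta) ** 2   # probability of measuring the marked state
    return k_opt, p_success

for n in (4, 64, 1024):
    k, p = grover_stats(n)
    print(f"N={n:5d}: {k:3d} iterations, ideal success probability {p:.4f}")
```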
Procedia PDF Downloads 106
3554 An Improved Cuckoo Search Algorithm for Voltage Stability Enhancement in Power Transmission Networks
Authors: Reza Sirjani, Nobosse Tafem Bolan
Abstract:
Many optimization techniques available in the literature have been developed in order to solve the problem of voltage stability enhancement in power systems. However, there are a number of drawbacks in the use of previous techniques aimed at determining the optimal location and size of reactive compensators in a network. In this paper, an Improved Cuckoo Search algorithm is applied as an appropriate optimization algorithm to determine the optimum location and size of a Static Var Compensator (SVC) in a transmission network. The main objectives are voltage stability improvement and total cost minimization. The results of the presented technique are then compared with other available optimization techniques.
Keywords: cuckoo search algorithm, optimization, power system, var compensators, voltage stability
Procedia PDF Downloads 553
3553 Bioinformatic Approaches in Population Genetics and Phylogenetic Studies
Authors: Masoud Sheidai
Abstract:
Biologists working in population genetics and phylogeny face different research tasks such as populations’ genetic variability and divergence, species relatedness, the evolution of genetic and morphological characters, and identification of DNA SNPs with adaptive potential. To tackle these problems and reach concise conclusions, they must use proper and efficient statistical and bioinformatic methods as well as suitable genetic and morphological characteristics. In recent years, bioinformatic and statistical methods based on various well-documented assumptions have become the proper analytical tools in the hands of researchers. Species delineation is usually carried out with clustering methods such as K-means, based on distance measures appropriate to the studied features of the organisms. A well-defined species is assumed to be separated from the other taxa by molecular barcodes. Species relationships are studied using molecular markers, which are analyzed by methods such as multidimensional scaling (MDS) and principal coordinate analysis (PCoA). Population structuring and genetic divergence are usually investigated with PCoA, PCA and network diagrams, supported by bootstrapping of the data. The association of genes and DNA sequences with ecological and geographical variables is determined by latent factor mixed models (LFMM) and redundancy analysis (RDA), which are based on Bayesian and distance methods. Molecular and morphological differentiating characters in the studied species may be identified by linear discriminant analysis (DA) and discriminant analysis of principal components (DAPC). We shall illustrate these methods and related conclusions by giving examples from different edible and medicinal plant species.
Keywords: GWAS analysis, K-Means clustering, LFMM, multidimensional scaling, redundancy analysis
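A minimal sketch of the k-means species-delineation step on simulated marker data; the data, cluster range and silhouette criterion are illustrative assumptions rather than the authors' workflow.

```python
# Cluster individuals with k-means and use the silhouette score to pick a plausible group count.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
# Hypothetical data: 60 individuals x 20 marker/morphological variables, three latent groups.
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(20, 20)) for c in (0.0, 2.0, 4.0)])

for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: silhouette={silhouette_score(X, labels):.3f}")
```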
Procedia PDF Downloads 126
3552 Meta-Learning for Hierarchical Classification and Applications in Bioinformatics
Authors: Fabio Fabris, Alex A. Freitas
Abstract:
Hierarchical classification is a special type of classification task where the class labels are organised into a hierarchy, with more generic class labels being ancestors of more specific ones. Meta-learning for classification-algorithm recommendation consists of recommending to the user a classification algorithm, from a pool of candidate algorithms, for a dataset, based on the past performance of the candidate algorithms on other datasets. Meta-learning is normally used in conventional, non-hierarchical classification. By contrast, this paper proposes a meta-learning approach for the more challenging task of hierarchical classification, and evaluates it on a large number of bioinformatics datasets. Hierarchical classification is especially relevant for bioinformatics problems, as protein and gene functions tend to be organised into a hierarchy of class labels. This work proposes a meta-learning approach for recommending the best hierarchical classification algorithm for a given hierarchical classification dataset. This work’s contributions are: 1) proposing an algorithm for splitting hierarchical datasets into new datasets to increase the number of meta-instances, 2) proposing meta-features for hierarchical classification, and 3) interpreting decision-tree meta-models for hierarchical classification algorithm recommendation.
Keywords: algorithm recommendation, meta-learning, bioinformatics, hierarchical classification
Procedia PDF Downloads 314
3551 A Clustering-Based Approach for Weblog Data Cleaning
Authors: Amine Ganibardi, Cherif Arab Ali
Abstract:
This paper addresses the data cleaning issue as a part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users’ clicks and underlying user-agents’ hits. As Web Usage Mining is interested in end-users’ behavior, user-agents’ hits are regarded as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: a server records requests interlaced in sequential order regardless of their source or type, and website resources may be requestable interchangeably by end-users and user-agents. Current methods are content-centric, based on filtering heuristics that label items as relevant or irrelevant in terms of cleaning attributes such as resource filetype extensions, resources pointed to by hyperlinks/URIs, HTTP methods, user-agents, etc. These methods require exhaustive extra-weblog data and prior knowledge of which items should be assumed to be clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive web for three reasons: resources may be clickable by end-users regardless of their type, resources are indexed by frame names without filetype extensions, and web content is generated and discarded differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the level of the requested and referring resources attributes. It is insensitive to logging content and does not need extra-weblog data. The exploited statistical property is the structure of the logging generated by webpage requests in terms of clicks and hits. Since a webpage consists of a single URI and several components, this feature results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. Thus, the clustering-based method is meant to identify two clusters by applying an appropriate distance to the frequency matrix of the requested and referring resources. As the clicks-to-hits ratio is one to many, the clicks cluster is the smaller one in number of requests. Hierarchical agglomerative clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose click-to-hits ratios range from 1/2 to 1/15. The optimal clustering, selected on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. Evaluating the smaller cluster, referred to as the clicks cluster, in terms of confusion-matrix indicators yields a true positive rate of 97%, whereas the content-centric cleaning methods (conventional and advanced cleaning) yield a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods within dynamic and responsive web design without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine downstream analyses.
Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data
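A minimal sketch of the clustering step described above, assuming a precomputed request/referrer frequency matrix and the third-party `gower` package for the pairwise distance; the input data are hypothetical, and the smaller of the two clusters is taken as the clicks cluster.

```python
# Average-linkage agglomerative clustering on Gower distances over a toy frequency matrix.
import numpy as np
import pandas as pd
import gower                                   # assumption: the 'gower' package is installed
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
# Hypothetical frequencies of requested/referring resources per log entry.
freq = pd.DataFrame(rng.integers(1, 20, size=(200, 2)),
                    columns=["requested_freq", "referring_freq"])

dist = gower.gower_matrix(freq)                # pairwise Gower distances
condensed = squareform(dist, checks=False)     # condensed form for scipy linkage
tree = linkage(condensed, method="average")    # average-linkage agglomerative clustering
labels = fcluster(tree, t=2, criterion="maxclust")

sizes = {lab: int((labels == lab).sum()) for lab in set(labels)}
clicks_label = min(sizes, key=sizes.get)       # clicks cluster = smaller cluster
print("cluster sizes:", sizes, "-> clicks cluster:", clicks_label)
```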
Procedia PDF Downloads 170
3550 Ischemic Stroke Detection in Computed Tomography Examinations
Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina
Abstract:
Stroke is a worldwide concern; in Brazil alone it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used due to its wide availability and rapid diagnosis. Detection depends on the size and severity of lesions and the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with window width between 80 and 100 Hounsfield units. We used different image processing techniques such as morphological filters, the discrete wavelet transform and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region. These subjective results were compared with objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters actually improve the visibility of ischemic areas for subjective evaluation. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tended to be smaller than those obtained by the algorithm. These results show the importance of computer-aided diagnosis software to assist neuroradiological decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means
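A minimal numpy sketch of the general fuzzy c-means algorithm applied to hypothetical pixel intensities; it illustrates the clustering idea only, not the paper's full pipeline with wavelets and morphological filters.

```python
# Fuzzy c-means on 1-D intensity features (hypothetical Hounsfield-unit values).
import numpy as np

def fuzzy_c_means(x, c=3, m=2.0, n_iter=100, seed=0):
    """x: (n_samples, n_features); returns cluster centers and membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(x)))
    u /= u.sum(axis=0)                               # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(x[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))
        u /= u.sum(axis=0)                           # standard FCM membership update
    return centers, u

rng = np.random.default_rng(1)
hu = np.concatenate([rng.normal(25, 3, 300),   # hypodense (ischemic-like) voxels
                     rng.normal(35, 3, 300),   # normal parenchyma
                     rng.normal(80, 5, 100)])  # hyperdense voxels
centers, u = fuzzy_c_means(hu.reshape(-1, 1))
print("cluster centers (HU):", np.round(centers.ravel(), 1))
```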
Procedia PDF Downloads 369
3549 An Exact Algorithm for Location–Transportation Problems in Humanitarian Relief
Authors: Chansiri Singhtaun
Abstract:
This paper proposes a mathematical model and examines the performance of an exact algorithm for a location–transportation problem in humanitarian relief. The model determines the number and location of distribution centers in a relief network, the amount of relief supplies to be stocked at each distribution center, and the vehicles needed to take the supplies to meet the needs of disaster victims under capacity restrictions, transportation and budgetary constraints. Computational experiments are conducted on generated problems of various sizes, and a branch and bound algorithm is applied to them. The results show that this algorithm can solve problem sizes of up to three candidate locations with five demand points, and one candidate location with up to twenty demand points, without premature termination.
Keywords: disaster response, facility location, humanitarian relief, transportation
Procedia PDF Downloads 451
3548 A New Sign Subband Adaptive Filter Based on Dynamic Selection of Subbands
Authors: Mohammad Shams Esfand Abadi, Mehrdad Zalaghi, Reza Ebrahimpour
Abstract:
In this paper, we propose a sign adaptive filter algorithm with the ability to dynamically select subband filters, which leads to low computational complexity compared with the conventional sign subband adaptive filter (SSAF) algorithm. The dynamic selection criterion is based on the largest reduction of the mean square deviation at each adaptation. We demonstrate that this simple proposed algorithm has the same performance as the conventional SSAF and is somewhat faster than it. In the presence of impulsive interferences, the proposed algorithm is as robust as the conventional SSAF, and both outperform the conventional normalized subband adaptive filter (NSAF) algorithm. Therefore, it is preferred for environments with impulsive interferences. Simulation results are presented to verify that these considerations are achieved.
Keywords: acoustic echo cancellation (AEC), normalized subband adaptive filter (NSAF), dynamic selection subband adaptive filter (DS-NSAF), sign subband adaptive filter (SSAF), impulsive noise, robust filtering
Procedia PDF Downloads 601
3547 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering
Authors: Yunus Doğan, Ahmet Durap
Abstract:
Coastal regions are among the most intensively used areas, shaped by both the natural balance and a growing population. In coastal engineering, the most valuable data concern wave behavior. The amount of such data becomes very large because observations run for hours, days and months. In this study, statistical methods such as wave spectrum analysis and standard statistical methods have been used. The goal of this study is to discover profiles of different coastal areas by using these statistical methods and thus to obtain an instance-based data set from the big data for analysis with data mining algorithms. In the experimental studies, six sample data sets of wave behavior obtained from 20-minute observations in Mersin Bay, Turkey, were converted to an instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, the study discusses how this summarization approach can be used in other branches that collect big data, such as medicine.
Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods
Procedia PDF Downloads 361
3546 A New Evolutionary Algorithm for Multi-Objective Cylindrical Spur Gear Design Optimization
Authors: Hammoudi Abderazek
Abstract:
The present paper introduces a modified adaptive mixed differential evolution (MAMDE) algorithm to select the main geometry parameters of a specific cylindrical spur gear. The developed algorithm uses a self-adaptive mechanism to update the values of the mutation and crossover factors. Feasibility rules are used in the selection phase to improve the search exploration of MAMDE. Moreover, elitism is performed to keep the best individual found in each generation. For constraint handling, the normalization method is used to treat each design constraint equally. Finite element analysis is used to confirm the optimization results for the maximum bending resistance. The simulation results reached in this paper indicate clearly that the proposed algorithm is very competitive in precision gear design optimization.
Keywords: evolutionary algorithm, spur gear, tooth profile, meta-heuristics
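A minimal sketch of a DE/rand/1/bin loop with jDE-style self-adaptive mutation (F) and crossover (CR) factors and greedy (elitist) replacement; this is a generic illustration on a toy objective, not the authors' MAMDE with feasibility rules and gear constraints.

```python
# Generic self-adaptive differential evolution on a toy two-variable objective.
import numpy as np

def self_adaptive_de(obj, bounds, pop_size=20, generations=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)
    pop = rng.uniform(lo, hi, (pop_size, dim))
    F = np.full(pop_size, 0.5)
    CR = np.full(pop_size, 0.9)
    fit = np.array([obj(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Self-adaptation: occasionally resample this individual's F and CR.
            Fi = rng.uniform(0.1, 1.0) if rng.random() < 0.1 else F[i]
            CRi = rng.random() if rng.random() < 0.1 else CR[i]
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
            mutant = np.clip(a + Fi * (b - c), lo, hi)
            cross = rng.random(dim) < CRi
            cross[rng.integers(dim)] = True          # at least one gene from the mutant
            trial = np.where(cross, mutant, pop[i])
            f_trial = obj(trial)
            if f_trial <= fit[i]:                    # greedy selection keeps the best individuals
                pop[i], fit[i], F[i], CR[i] = trial, f_trial, Fi, CRi
    best = np.argmin(fit)
    return pop[best], fit[best]

bounds = np.array([[-5.0, 5.0], [-5.0, 5.0]])
x_best, f_best = self_adaptive_de(lambda x: float(np.sum(x ** 2)), bounds)
print("best solution:", np.round(x_best, 4), "objective:", round(f_best, 6))
```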
Procedia PDF Downloads 132
3545 UAV’s Enhanced Data Collection for Heterogeneous Wireless Sensor Networks
Authors: Kamel Barka, Lyamine Guezouli, Assem Rezki
Abstract:
In this article, we propose a protocol called DataGA-DRF (a protocol for data collection using a genetic algorithm through dynamic reference points) that collects data from heterogeneous wireless sensor networks. This protocol is based on DGA (destination selection according to a genetic algorithm) to control the movement of the UAV (unmanned aerial vehicle) between dynamic reference points that virtually represent the sensor node deployment. The dynamics of these points ensure an even distribution of energy consumption among the sensors and also improve network performance. To determine the best points, DataGA-DRF uses a clustering algorithm such as K-Means.
Keywords: heterogeneous wireless networks, unmanned aerial vehicles, reference point, collect data, genetic algorithm
Procedia PDF Downloads 84
3544 The Selection of the Nearest Anchor Using Received Signal Strength Indication (RSSI)
Authors: Hichem Sassi, Tawfik Najeh, Noureddine Liouane
Abstract:
Localization information is crucial for the operation of WSNs. There are principally two types of localization algorithms. Range-based localization algorithms have strict hardware requirements and are thus expensive to implement in practice. Range-free localization algorithms reduce the hardware cost but can only achieve high accuracy in ideal scenarios. In this paper, we locate unknown nodes by combining the advantages of these two types of methods. The proposed algorithm makes each unknown node select the nearest anchor using the Received Signal Strength Indicator (RSSI) and choose two other anchors which are the most accurate to achieve the estimated location. Our algorithm improves the localization accuracy compared with previous algorithms, as demonstrated by the simulation results.
Keywords: WSN, localization, DV-Hop, RSSI
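A minimal sketch of the nearest-anchor step under the common log-distance path-loss assumption; the model parameters and RSSI readings are hypothetical and not taken from the paper.

```python
# Convert RSSI readings to distance estimates and pick the nearest anchor.
import numpy as np

def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exponent=2.7):
    """Log-distance model: RSSI = RSSI(1 m) - 10 * n * log10(d)."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

# Hypothetical anchors and the RSSI (dBm) measured from each by one unknown node.
anchor_rssi = {"A1": -55.0, "A2": -48.0, "A3": -62.0}
distances = {name: rssi_to_distance(rssi) for name, rssi in anchor_rssi.items()}
nearest = min(distances, key=distances.get)
print({k: round(v, 2) for k, v in distances.items()}, "-> nearest anchor:", nearest)
```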
Procedia PDF Downloads 363
3543 A Graph Theoretic Algorithm for Bandwidth Improvement in Computer Networks
Authors: Mehmet Karaata
Abstract:
Given two distinct vertices (nodes), a source s and a target t of a graph G = (V, E), the two node-disjoint paths problem is to identify two node-disjoint paths between s ∈ V and t ∈ V. Two paths are node-disjoint if they have no common intermediate vertices. In this paper, we present an algorithm with O(m) time complexity for finding two node-disjoint paths between s and t in arbitrary graphs, where m is the number of edges. The proposed algorithm has a wide range of applications in ensuring the reliability and security of sensor, mobile and fixed communication networks.
Keywords: disjoint paths, distributed systems, fault-tolerance, network routing, security
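A quick reference computation of node-disjoint s–t paths on a small example graph using networkx's flow-based routine; this is not the authors' O(m) algorithm, only a way to check results on toy instances.

```python
# Enumerate internally node-disjoint paths between a source and a target.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("s", "a"), ("a", "t"), ("s", "b"), ("b", "c"), ("c", "t"), ("a", "c")])

paths = list(nx.node_disjoint_paths(G, "s", "t"))   # flow-based, not linear-time
print("node-disjoint s-t paths:", paths[:2])        # two paths with no shared intermediate nodes
```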
Procedia PDF Downloads 444
3542 Radial Basis Surrogate Model Integrated to Evolutionary Algorithm for Solving Computation Intensive Black-Box Problems
Authors: Abdulbaset Saad, Adel Younis, Zuomin Dong
Abstract:
For design optimization with high-dimensional expensive problems, an effective and efficient optimization methodology is desired. This work proposes a series of modifications to the differential evolution (DE) algorithm for solving computation-intensive black-box problems. The proposed methodology is called Radial Basis Meta-Model Assisted Differential Evolution (RBF-DE), a global optimization algorithm based on meta-modeling techniques. A meta-modeling-assisted DE is proposed to solve computationally expensive optimization problems. The radial basis function (RBF) model is used as a surrogate model to approximate the expensive objective function, while DE employs a mechanism to dynamically select the best-performing combination of parameters such as differential rate, crossover probability, and population size. The proposed algorithm is tested on benchmark functions and real-life practical applications and problems. The test results demonstrate that the proposed algorithm is promising and performs well compared to other optimization algorithms. The proposed algorithm is capable of converging to acceptable and good solutions in terms of accuracy, number of evaluations, and time needed to converge.
Keywords: differential evolution, engineering design, expensive computations, meta-modeling, radial basis function, optimization
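A minimal sketch of surrogate-assisted screening, assuming scipy's RBFInterpolator as the radial basis surrogate: the model is fitted on already-evaluated points and used to pre-rank candidates so that only the most promising one is sent to the expensive objective. The objective and sampling here are placeholders, not the paper's benchmarks.

```python
# RBF surrogate screening of candidate points before calling the expensive objective.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_objective(x):                 # stand-in for a costly simulation
    return float(np.sum(x ** 2) + np.sin(5 * x[0]))

rng = np.random.default_rng(0)
X_seen = rng.uniform(-2, 2, (30, 2))        # design points evaluated so far
y_seen = np.array([expensive_objective(x) for x in X_seen])

surrogate = RBFInterpolator(X_seen, y_seen, kernel="thin_plate_spline")

candidates = rng.uniform(-2, 2, (200, 2))   # e.g., a DE trial population
predicted = surrogate(candidates)           # cheap surrogate predictions
best_candidate = candidates[np.argmin(predicted)]
print("surrogate pick:", np.round(best_candidate, 3),
      "true value:", round(expensive_objective(best_candidate), 4))
```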
Procedia PDF Downloads 397
3541 Reduction of Impulsive Noise in OFDM System using Adaptive Algorithm
Authors: Alina Mirza, Sumrin M. Kabir, Shahzad A. Sheikh
Abstract:
Orthogonal Frequency Division Multiplexing (OFDM), with its high data rate, high spectral efficiency and ability to mitigate the effects of multipath, is most suitable for wireless applications. Impulsive noise distorts the OFDM transmission, and therefore methods must be investigated to suppress this noise. In this paper, a State Space Recursive Least Squares (SSRLS) algorithm-based adaptive impulsive noise suppressor for an OFDM communication system is proposed, and a comparison with another adaptive algorithm is conducted. The state-space model-dependent recursive parameters of the proposed scheme enable it to achieve a low steady-state mean squared error (MSE), a low bit error rate (BER), and faster convergence than some existing algorithms.
Keywords: OFDM, impulsive noise, SSRLS, BER
Procedia PDF Downloads 458
3540 Channel Estimation for Orthogonal Frequency Division Multiplexing Systems over Doubly Selective Channels Based on DCS-DCSOMP Algorithm
Authors: Linyu Wang, Furui Huo, Jianhong Xiang
Abstract:
The Doppler shift generated by high-speed movement and multipath effects in the channel are the main causes of a time-frequency doubly-selective (DS) channel, in which there is severe inter-carrier interference (ICI). Channel estimation for an orthogonal frequency division multiplexing (OFDM) system over a DS channel is therefore very difficult. The simultaneous orthogonal matching pursuit algorithm under distributed compressive sensing theory (DCS-SOMP) has been used for channel estimation in OFDM systems over DS channels. However, the reconstruction accuracy of the DCS-SOMP algorithm is not high enough in the low-SNR regime. To solve this problem, in this paper, we propose an improved DCS-SOMP algorithm based on an inner-product difference comparison operation (DCS-DCSOMP). The reconstruction accuracy is improved by increasing the number of candidate indexes and designing the comparison conditions on the inner-product differences. We combine the DCS-DCSOMP algorithm with the basis expansion model (BEM) to reduce the complexity of channel estimation. Simulation results show the effectiveness of the proposed algorithm and its advantages over other algorithms.
Keywords: OFDM, doubly selective, channel estimation, compressed sensing
Procedia PDF Downloads 98
3539 Robot Operating System-Based SLAM for a Gazebo-Simulated Turtlebot2 in 2d Indoor Environment with Cartographer Algorithm
Authors: Wilayat Ali, Li Sheng, Waleed Ahmed
Abstract:
The ability of a robot to simultaneously build a map of the environment and localize itself with respect to that environment is the most important capability of mobile robots. Many algorithms can be utilized to build up the SLAM process, and SLAM is a developing area in robotics research. The Robot Operating System (ROS) is one of the frameworks which provide multiple algorithm nodes to work with and provide a transmission layer to robots. Among the algorithms extensively in use are Hector SLAM, Gmapping and Cartographer SLAM. This paper describes a ROS-based simultaneous localization and mapping (SLAM) library, Google Cartographer, which is an open-source algorithm. The algorithm was applied to create a map using laser and pose data from a 2D LiDAR placed on a mobile robot. The robot model uses the Gazebo package and is simulated in Rviz. Our research work’s primary goal is to obtain mapping through the Cartographer SLAM algorithm in a static indoor environment. From our research, it is shown that Cartographer is an applicable algorithm for generating 2D maps of indoor environments with a LiDAR placed on a mobile robot, because it uses both odometry and pose estimation. The algorithm has been evaluated, and maps are constructed against the SLAM algorithms presented by Turtlebot2 in the static indoor environment.
Keywords: SLAM, ROS, navigation, localization and mapping, gazebo, Rviz, Turtlebot2, slam algorithms, 2d indoor environment, cartographer
Procedia PDF Downloads 146
3538 Comparison of Back-Projection with Non-Uniform Fast Fourier Transform for Real-Time Photoacoustic Tomography
Authors: Moung Young Lee, Chul Gyu Song
Abstract:
Photoacoustic imaging is an imaging technology that combines optics and ultrasound, providing high contrast from the optical component and high resolution from the ultrasound component. We developed a real-time photoacoustic tomography (PAT) system using a linear ultrasound transducer and a digital acquisition (DAQ) board. There are two types of algorithms for reconstructing the photoacoustic signal: the back-projection algorithm and the FFT algorithm; in particular, we used the non-uniform FFT algorithm. To evaluate the performance of our system and algorithms, we imaged two wires placed at intervals of 2.89 mm and 0.87 mm and compared the images reconstructed by the two algorithms. Finally, we imaged two crossed hairs and compared the algorithms again.
Keywords: back-projection, image comparison, non-uniform FFT, photoacoustic tomography
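A minimal delay-and-sum back-projection sketch for a linear-array photoacoustic geometry, written in the generic textbook form; the array parameters and RF data below are placeholders, and this is not the authors' exact implementation or their non-uniform FFT method.

```python
# Delay-and-sum back-projection onto a 2-D (z, x) image grid.
import numpy as np

def backproject(rf_data, element_x, fs, sound_speed, grid_x, grid_z):
    """rf_data: (n_elements, n_samples) A-lines; returns an image on the (z, x) grid."""
    image = np.zeros((len(grid_z), len(grid_x)))
    n_samples = rf_data.shape[1]
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            # Time of flight from pixel (x, z) to every transducer element.
            dist = np.sqrt((element_x - x) ** 2 + z ** 2)
            idx = np.round(dist / sound_speed * fs).astype(int)
            valid = idx < n_samples
            image[iz, ix] = rf_data[valid, idx[valid]].sum()
    return image

# Hypothetical setup: 128-element linear array, 40 MHz sampling, 1540 m/s sound speed.
fs, c = 40e6, 1540.0
element_x = np.linspace(-0.019, 0.019, 128)
rf = np.random.randn(128, 2048) * 0.01            # placeholder RF data
img = backproject(rf, element_x, fs, c,
                  np.linspace(-0.01, 0.01, 64), np.linspace(0.005, 0.03, 64))
print("reconstructed image shape:", img.shape)
```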
Procedia PDF Downloads 434
3537 Symmetric Arabic Language Encryption Technique Based on Modified Playfair Algorithm
Authors: Fairouz Beggas
Abstract:
Due to the large number of exchanges over networks, the security of communications is essential. Most ways of keeping communication secure rely on encryption. In this work, a symmetric encryption technique is offered to encrypt and decrypt simple Arabic scripts based on multi-level security. The proposed technique uses the idea of Playfair encryption with a larger table size and an additional layer of encryption to ensure more security. The proposed algorithm aims to generate a dynamic table that depends on a secret key. The same secret key is also used to create other secret keys to over-encrypt the plaintext in three steps. The obtained results show that the proposed algorithm is faster in terms of encryption/decryption speed and can resist many types of attacks.
Keywords: arabic data, encryption, playfair, symmetric algorithm
Procedia PDF Downloads 90
3536 A Matheuristic Algorithm for the School Bus Routing Problem
Authors: Cagri Memis, Muzaffer Kapanoglu
Abstract:
The school bus routing problem (SBRP) is a variant of the vehicle routing problem (VRP) classified as a location-allocation-routing problem. In this study, the SBRP is decomposed into two sub-problems, (1) bus route generation and (2) bus stop selection, in order to solve large instances of the SBRP in reasonable computational times. To solve the first sub-problem, we propose a genetic algorithm to generate bus routes. Once the routes have been fixed, a sub-problem remains of allocating students to stops considering the capacity of the buses and the walkability constraints of the students. While the exact method solves small-scale problems, treating large-scale problems with the exact method becomes complex due to computational problems, a deficiency that the genetic algorithm can overcome. Results obtained from the proposed approach on 150 instances with up to 250 stops show that the matheuristic algorithm provides better solutions in reasonable computational times with respect to benchmark algorithms.
Keywords: genetic algorithm, matheuristic, school bus routing problem, vehicle routing problem
Procedia PDF Downloads 71
3535 Algorithm for Path Recognition in-between Tree Rows for Agricultural Wheeled-Mobile Robots
Authors: Anderson Rocha, Pedro Miguel de Figueiredo Dinis Oliveira Gaspar
Abstract:
Machine vision has been widely used in agriculture in recent years as a tool to promote the automation of processes and increase levels of productivity. The aim of this work is the development of a path recognition algorithm based on image processing to guide a terrestrial robot in-between tree rows. The proposed algorithm was developed using the software MATLAB, and it uses several image processing operations, such as threshold detection, morphological erosion, histogram equalization and the Hough transform, to find edge lines along tree rows in an image and to create a path to be followed by a mobile robot. To develop the algorithm, a set of images of different types of orchards was used, which made possible the construction of a method capable of identifying paths between trees of different heights and aspects. The algorithm was evaluated using several images with different quality characteristics, and the results showed that the proposed method can successfully detect a path in different types of environments.
Keywords: agricultural mobile robot, image processing, path recognition, hough transform
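A rough OpenCV analogue of the described processing chain (the paper's implementation is in MATLAB): equalize, threshold, erode, then detect row edge lines with the probabilistic Hough transform. The thresholds, kernel size and image file name are illustrative assumptions.

```python
# Equalization -> Otsu threshold -> erosion -> Canny -> probabilistic Hough line detection.
import cv2
import numpy as np

def detect_row_lines(bgr_image):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)                                    # histogram equalization
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    mask = cv2.erode(mask, np.ones((5, 5), np.uint8), iterations=1)  # morphological erosion
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                            minLineLength=80, maxLineGap=20)
    return lines if lines is not None else []

img = cv2.imread("orchard_row.jpg")        # hypothetical orchard image
if img is not None:
    lines = detect_row_lines(img)
    print("detected line segments:", len(lines))
```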
Procedia PDF Downloads 147
3534 Improving the Security of Internet of Things Using Encryption Algorithms
Authors: Amirhossein Safi
Abstract:
The Internet of Things (IoT) is a kind of advanced information technology which has drawn societies’ attention. Sensors and actuators are usually recognized as the smart devices of our environment. At the same time, IoT security brings up new issues. Internet connection and the possibility of interaction with smart devices cause those devices to be more involved in human life. Therefore, safety is a fundamental requirement in designing the IoT. The IoT has three remarkable features: overall perception, reliable transmission, and intelligent processing. Because of the IoT’s span, the security of conveyed data is an essential factor for system security. Hybrid encryption is a new model that can be used in the IoT. This type of encryption provides strong security with low computational cost. In this paper, we have proposed a hybrid encryption algorithm designed to reduce security risks while enhancing encryption speed and reducing computational complexity. The purpose of this hybrid algorithm is information integrity, confidentiality, and non-repudiation in data exchange for the IoT. Eventually, the suggested encryption algorithm has been simulated in MATLAB, and its speed and safety efficiency were evaluated in comparison with a conventional encryption algorithm.
Keywords: internet of things, security, hybrid algorithm, privacy
Procedia PDF Downloads 469
3533 Study on the Self-Location Estimate by the Evolutional Triangle Similarity Matching Using Artificial Bee Colony Algorithm
Authors: Yuji Kageyama, Shin Nagata, Tatsuya Takino, Izuru Nomura, Hiroyuki Kamata
Abstract:
In a previous study, a technique to estimate self-location by using a lunar image was proposed. In this paper, we consider improving the conventional method with FPGA implementation in mind. Specifically, we introduce the Artificial Bee Colony algorithm to reduce the search time. In addition, we use fixed-point arithmetic to enable high-speed operation on the FPGA.
Keywords: SLIM, Artificial Bee Colony Algorithm, location estimate, evolutional triangle similarity
Procedia PDF Downloads 519
3532 An Integrated Label Propagation Network for Structural Condition Assessment
Authors: Qingsong Xiong, Cheng Yuan, Qingzhao Kong, Haibei Xiong
Abstract:
Deep-learning-driven approaches based on vibration responses have attracted growing attention in rapid structural condition assessment, while obtaining sufficient measured training data with corresponding labels is relatively costly and even inaccessible in practical engineering. This study proposes an integrated label propagation network for structural condition assessment, which is able to diffuse labels from continuously generated measurements of the intact structure to those with missing labels from damage scenarios. The integrated network embeds damage-sensitive feature extraction by a deep autoencoder and pseudo-label propagation by optimized fuzzy clustering, whose architecture and mechanism are elaborated. With a sophisticated network design and specified strategies for improving performance, the present network extends the strengths of self-supervised representation learning, unsupervised fuzzy clustering and supervised classification algorithms into an integration aimed at assessing damage conditions. Both numerical simulations and full-scale laboratory shaking table tests of a two-story building structure were conducted to validate its capability of detecting post-earthquake damage. The identification accuracy of the present network was 0.95 in the numerical validations and 0.86 on average in the laboratory case studies. It should be noted that the whole training procedure of all models involved in the network does not rely on any labeled data from damage scenarios but only on several samples of the intact structure, which indicates a significant superiority in model adaptability and feasible applicability in practice.
Keywords: autoencoder, condition assessment, fuzzy clustering, label propagation
Procedia PDF Downloads 98
3531 FPGA Implementation of Novel Triangular Systolic Array Based Architecture for Determining the Eigenvalues of Matrix
Authors: Soumitr Sanjay Dubey, Shubhajit Roy Chowdhury, Rahul Shrestha
Abstract:
In this paper, we have presented a novel approach to calculating the eigenvalues of any matrix, for the first time, on a Field Programmable Gate Array (FPGA) using a Triangular Systolic Array (TSA) architecture. Conventionally, an additional computation unit compliant with the eigenvalue algorithm is required in the architecture, and this in return increases the delay and power consumption. Moreover, recently reported works are dedicated only to symmetric matrices or some specific cases of matrices. This work presents an architecture to calculate the eigenvalues of any matrix based on the QR algorithm, which is fully implementable on an FPGA. For the implementation of the QR algorithm we have used the TSA architecture, which further utilizes the CORDIC (COordinate Rotation DIgital Computer) algorithm to calculate the various trigonometric and arithmetic functions involved in the procedure. The proposed architecture gives an error in the range of 10⁻⁴. The power consumption of the design is 0.598 W, and it can work at a frequency of 900 MHz.
Keywords: coordinate rotation digital computer, three angle complex rotation, triangular systolic array, QR algorithm
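A plain numpy sketch of the unshifted QR iteration that underlies the approach, without the TSA/CORDIC hardware mapping described in the abstract; the test matrix is a hypothetical example, and the basic iteration converges for many (though not all) matrices.

```python
# Unshifted QR iteration: A_{k+1} = R_k Q_k; the diagonal approaches the eigenvalues.
import numpy as np

def qr_eigenvalues(A, n_iter=500):
    Ak = np.array(A, dtype=float)
    for _ in range(n_iter):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
    return np.diag(Ak)

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print("QR iteration :", np.sort(qr_eigenvalues(A)))
print("numpy eigvals:", np.sort(np.linalg.eigvals(A).real))
```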
Procedia PDF Downloads 415
3530 Development and Implementation of Curvature Dependent Force Correction Algorithm for the Planning of Forced Controlled Robotic Grinding
Authors: Aiman Alshare, Sahar Qaadan
Abstract:
A curvature-dependent force correction algorithm for planning a force-controlled grinding process with off-line programming flexibility is designed for an ABB industrial robot, in order to avoid manual interfacing during the process. The machining path utilizes a spline curve fit that is constructed from the CAD data of the workpiece. The fitted spline has second-order continuity to assure path smoothness. The implemented algorithm computes uniform forces normal to the grinding surface of the workpiece by constructing a curvature path in spatial coordinates using the spline method.
Keywords: ABB industrial robot, grinding process, offline programming, CAD data extraction, force correction algorithm
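A minimal sketch of the path-geometry step, using scipy's C2-continuous cubic spline as a stand-in for the spline fit of CAD points; the planar waypoints are hypothetical, and only the tangent, normal and curvature evaluation is shown, not the robot force controller.

```python
# Fit a C2 cubic spline through planar waypoints and evaluate tangents, normals and curvature.
import numpy as np
from scipy.interpolate import CubicSpline

t = np.linspace(0.0, 1.0, 8)                                   # path parameter
points = np.column_stack([np.cos(np.pi * t), 0.5 * np.sin(np.pi * t)])  # hypothetical contour

spline = CubicSpline(t, points)            # second-order (C2) continuity between segments
ts = np.linspace(0.0, 1.0, 100)
d1 = spline(ts, 1)                         # first derivative (tangent direction)
d2 = spline(ts, 2)                         # second derivative
speed = np.linalg.norm(d1, axis=1)
tangent = d1 / speed[:, None]
normal = np.column_stack([-tangent[:, 1], tangent[:, 0]])      # unit normal to the path
curvature = np.abs(d1[:, 0] * d2[:, 1] - d1[:, 1] * d2[:, 0]) / speed ** 3

# A force command normal to the surface could then be scheduled against this curvature profile.
print("max curvature along the path:", round(float(curvature.max()), 3))
```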
Procedia PDF Downloads 364
3529 Frequent-Pattern Tree Algorithm Application to S&P and Equity Indexes
Authors: E. Younsi, H. Andriamboavonjy, A. David, S. Dokou, B. Lemrabet
Abstract:
Software and time optimization are very important factors in financial markets, which are highly competitive fields, and the emergence of new computational tools further stresses this challenge. In this context, any improvement of technical indicators which generate a buy or sell signal is a major issue, and many tools have been created to make them more effective. This concern for efficiency leads, in the present paper, to seeking the best (and most innovative) way of obtaining the largest improvement in these indicators. The approach consists in attaching a signature to frequent market configurations by applying a frequent-pattern extraction method, which is here most appropriate for optimizing investment strategies. The goal of the proposed trading algorithm is to find the most accurate signatures using a back-testing procedure applied to technical indicators in order to improve their performance. The problem is then to determine the signatures which, combined with an indicator, outperform this indicator alone. To do this, the FP-Tree algorithm has been preferred, as it appears to be the most efficient algorithm to perform this task.
Keywords: quantitative analysis, back-testing, computational models, apriori algorithm, pattern recognition, data mining, FP-tree
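A minimal sketch of frequent-pattern extraction over market "configurations", assuming the third-party mlxtend package, whose fpgrowth routine is built on an FP-tree; the boolean indicator states below are hypothetical and not the paper's actual signatures.

```python
# FP-growth over one-hot encoded daily market conditions.
import pandas as pd
from mlxtend.frequent_patterns import fpgrowth   # assumption: mlxtend is installed

# Each row is one trading day described by boolean market conditions.
days = pd.DataFrame({
    "rsi_oversold":  [True, False, True, True, False, True],
    "above_sma200":  [True, True, False, True, True, True],
    "volume_spike":  [False, True, True, True, False, True],
    "macd_cross_up": [True, False, True, True, False, False],
})

patterns = fpgrowth(days, min_support=0.5, use_colnames=True)
print(patterns.sort_values("support", ascending=False).to_string(index=False))
```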
Procedia PDF Downloads 363
3528 A Comprehensive Study and Evaluation on Image Fashion Features Extraction
Authors: Yuanchao Sang, Zhihao Gong, Longsheng Chen, Long Chen
Abstract:
Clothing fashion represents a human’s aesthetic appreciation towards everyday outfits and appetite for fashion, and it reflects the development of status in society, humanity, and economics. However, modelling fashion by machine is extremely challenging because fashion is too abstract to be efficiently described by machines. Even human beings can hardly reach a consensus about fashion. In this paper, we are dedicated to answering a fundamental fashion-related question: which image feature best describes clothing fashion? To address this issue, we have designed and evaluated various image features, ranging from traditional low-level hand-crafted features, to mid-level style-awareness features, to various currently popular deep neural network-based features, which have shown state-of-the-art performance in various vision tasks. In summary, we tested the following 9 feature representations: color, texture, shape, style, convolutional neural networks (CNNs), CNNs with distance metric learning (CNNs&DML), AutoEncoder, CNNs with multiple layer combination (CNNs&MLC) and CNNs with dynamic feature clustering (CNNs&DFC). Finally, we validated the performance of these features on two publicly available datasets. Quantitative and qualitative experimental results on both intra-domain and inter-domain fashion clothing image retrieval showed that deep-learning-based feature representations far outperform traditional hand-crafted feature representations. Additionally, among all deep learning based methods, CNNs with explicit feature clustering performs best, which shows that feature clustering is essential for discriminative fashion feature representation.
Keywords: convolutional neural network, feature representation, image processing, machine modelling
Procedia PDF Downloads 141
3527 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques
Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas
Abstract:
The purpose of this research is to develop an algorithm that is capable of classifying news articles from the automobile industry, according to the competitive actions that they entail, with the use of text mining (TM) methods. It is necessary to test how to properly preprocess the data for this research by preparing pipelines that fit each algorithm best. The pipelines are tested along with nine different classification algorithms in the realm of regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines: Logistic Regression (LR) and an Artificial Neural Network (ANN). These algorithms are optimized further, where several parameters of each algorithm are tested. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm is capable of reaching an accuracy of 94%.
Keywords: Artificial Neural network, Competitive dynamics, Logistic Regression, Text classification, Text mining
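A minimal sketch of one such text-classification pipeline, pairing a TF-IDF preprocessing step with a logistic regression classifier; the tiny labeled examples and class names are purely hypothetical, not the study's dataset or tuned pipelines.

```python
# TF-IDF + logistic regression pipeline for classifying short news snippets by competitive action.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = [
    "Automaker cuts prices on its electric lineup to gain market share",
    "New SUV model launched with an upgraded driver-assistance package",
    "Supplier partnership announced to secure battery production capacity",
    "Plant expansion adds capacity for next-generation pickup trucks",
]
labels = ["pricing", "new_product", "partnership", "capacity"]   # hypothetical action classes

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), stop_words="english")),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipeline.fit(texts, labels)
print(pipeline.predict(["Discounts announced across the hatchback range"]))
```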
Procedia PDF Downloads 122