Search results for: Chaos Optimization Algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4636

4636 Neural Network Learning Based on Chaos

Authors: Truong Quang Dang Khoa, Masahiro Nakagawa

Abstract:

Chaos and fractals are relatively new fields of physics and mathematics that offer a fresh view of the universe and have inspired ideas for solving many current problems. In this paper, a novel algorithm based on a chaotic sequence generator, with a strong ability to adapt and to reach the global optimum, is proposed. The adaptive behavior of the proposed algorithm unfolds in two steps: the first is a breadth-first search and the second is a depth-first search. The proposed algorithm is examined on two test functions, the Camel function and the Schaffer function, and is then applied to optimize the training of multilayer neural networks.
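
As an illustration of the general idea behind chaos optimization (a minimal sketch, not the authors' exact procedure), the snippet below drives the search with a logistic-map chaotic sequence in two phases: a coarse scan of the whole domain followed by a refined search around the incumbent best point. The Schaffer function shown is the common F6 variant; the two-phase schedule, shrink factor and seeds are assumptions.

```python
import numpy as np

def schaffer(x):
    # Schaffer F6-style test function (2-D); global minimum 0 at the origin.
    r2 = x[0] ** 2 + x[1] ** 2
    return 0.5 + (np.sin(np.sqrt(r2)) ** 2 - 0.5) / (1 + 0.001 * r2) ** 2

def chaos_optimize(f, lo, hi, n_coarse=2000, n_fine=2000, shrink=0.9, seed=0.345):
    """Two-phase chaos optimization: a coarse chaotic scan of the whole box,
    then a fine chaotic search in a shrinking neighbourhood of the best point."""
    dim = len(lo)
    z = (seed + 0.17 * np.arange(dim)) % 1.0   # chaotic states in (0,1); avoid 0, 0.25, 0.5, 0.75
    best_x, best_f = None, np.inf
    for _ in range(n_coarse):                  # phase 1: breadth-first (coarse) scan
        z = 4.0 * z * (1.0 - z)                # logistic map in the chaotic regime
        x = lo + z * (hi - lo)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    radius = 0.1 * (hi - lo)
    for _ in range(n_fine):                    # phase 2: depth-first (fine) search
        z = 4.0 * z * (1.0 - z)
        x = np.clip(best_x + (2.0 * z - 1.0) * radius, lo, hi)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
        radius *= shrink                       # gradually narrow the neighbourhood
    return best_x, best_f

x_best, f_best = chaos_optimize(schaffer, np.array([-10.0, -10.0]), np.array([10.0, 10.0]))
print(x_best, f_best)
```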

Keywords: learning and evolutionary computing, Chaos Optimization Algorithm, Artificial Neural Networks, nonlinear optimization, intelligent computational technologies.

4635 Application of Hybrid Genetic Algorithm Based on Simulated Annealing in Function Optimization

Authors: Panpan Xu, Shulin Sui, Zongjie Du

Abstract:

The genetic algorithm is widely used in optimization for its strong global search capability and highly parallel structure; in practice, however, it tends to converge prematurely and has poor local search capability. The simulated annealing algorithm can keep the search from becoming trapped in local optima. A hybrid genetic algorithm based on simulated annealing is designed by combining the advantages of the two algorithms. Numerical experiments show that the hybrid genetic algorithm can solve function optimization problems efficiently.
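
One common way to combine the two (a minimal sketch of the general idea under assumed parameters, not necessarily the authors' exact design) is to let the GA generate offspring and then apply a simulated-annealing Metropolis test, so that worse offspring are occasionally accepted while a temperature parameter cools:

```python
import random, math

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def hybrid_ga_sa(f, dim=2, pop_size=30, gens=200, lo=-5.12, hi=5.12,
                 t0=1.0, cooling=0.97):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    temp = t0
    for _ in range(gens):
        new_pop = []
        for parent in pop:
            mate = min(random.sample(pop, 3), key=f)          # tournament selection
            cut = random.randrange(dim)                        # one-point crossover
            child = parent[:cut] + mate[cut:]
            i = random.randrange(dim)                          # mutate one gene
            child[i] = min(hi, max(lo, child[i] + random.gauss(0, 0.3)))
            # SA-style acceptance: keep worse children with probability exp(-d/T)
            d = f(child) - f(parent)
            if d < 0 or random.random() < math.exp(-d / temp):
                new_pop.append(child)
            else:
                new_pop.append(parent)
        pop = new_pop
        temp *= cooling                                        # cool down
    best = min(pop, key=f)
    return best, f(best)

print(hybrid_ga_sa(rastrigin))
```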

Keywords: Genetic algorithm, Simulated annealing, Hybrid genetic algorithm, Function optimization.

4634 Particle Swarm Optimization with Reduction for Global Optimization Problems

Authors: Michiharu Maeda, Shinya Tsuda

Abstract:

This paper presents a particle swarm optimization algorithm with particle reduction for global optimization problems. Particle swarm optimization is inspired by the collective motion of birds or fish and is a multi-point search algorithm that finds the best solution using multiple particles. It is flexible enough to adapt to a wide range of optimization problems, but when an objective function has many intricate local minima, particles may become trapped in one of them. To avoid this, a large number of particles is prepared initially and their positions are updated by particle swarm optimization. The particles are then reduced sequentially, based on their evaluation values, until a predetermined number remains, and particle swarm optimization continues until the termination condition is met. To show the effectiveness of the proposed algorithm, we examine the minima obtained on test functions and compare them with existing algorithms. The influence of the initial number of particles on the best value found by our algorithm is also discussed.
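
A minimal sketch of the reduction idea described above (parameter values and the reduction schedule are assumptions): a standard global-best PSO loop that periodically discards the particle with the worst personal-best evaluation until a predetermined number of particles remains.

```python
import numpy as np

def sphere(x):
    return float(np.sum(x * x))

def pso_with_reduction(f, dim=10, n_init=60, n_final=15, iters=300,
                       w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_init, dim))            # positions
    v = np.zeros((n_init, dim))                       # velocities
    pbest, pbest_f = x.copy(), np.array([f(xi) for xi in x])
    g = pbest[np.argmin(pbest_f)].copy()              # global best
    # remove one worst particle every few iterations until n_final remain
    drop_every = max(1, iters // max(1, n_init - n_final))
    for t in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(xi) for xi in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()
        if len(x) > n_final and t % drop_every == 0:
            worst = np.argmax(pbest_f)                # discard by evaluation value
            keep = np.arange(len(x)) != worst
            x, v, pbest, pbest_f = x[keep], v[keep], pbest[keep], pbest_f[keep]
    return g, f(g)

print(pso_with_reduction(sphere))
```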

Keywords: Particle swarm optimization, Global optimization, Metaheuristics, Reduction.

4633 A Hybrid Multi-Objective Firefly-Sine Cosine Algorithm for Multi-Objective Optimization Problem

Authors: Gaohuizi Guo, Ning Zhang

Abstract:

Firefly algorithm (FA) and Sine Cosine algorithm (SCA) are two very popular and advanced metaheuristic algorithms. However, when applied to multi-objective optimization problems, each has its own shortcomings, such as premature convergence and limited exploration capability. Combining the strengths of FA and SCA while avoiding their deficiencies may improve the accuracy and efficiency of the algorithm. This paper proposes a hybridization of the FA and SCA algorithms, named the multi-objective firefly-sine cosine algorithm (MFA-SCA), to develop a more efficient meta-heuristic algorithm than FA and SCA.

Keywords: Firefly algorithm, hybrid algorithm, multi-objective optimization, Sine Cosine algorithm.

4632 Application of Adaptive Genetic Algorithm in Function Optimization

Authors: Panpan Xu, Shulin Sui

Abstract:

The crossover probability and mutation probability are two important parameters of the genetic algorithm. The adaptive genetic algorithm improves the convergence performance of the genetic algorithm by adjusting the crossover and mutation probabilities adaptively as the fitness values change. We apply the adaptive genetic algorithm to a function optimization problem. The numerical experiment shows that the adaptive genetic algorithm improves the convergence speed and avoids premature convergence to local optima.
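
One widely used adaptive rule, in the spirit of Srinivas and Patnaik's adaptive GA (the paper's exact formula is not given here, so the constants and form are assumptions), lowers the crossover and mutation probabilities for above-average individuals and raises them for below-average ones:

```python
def adaptive_probabilities(f_ind, f_mate, f_max, f_avg,
                           k1=1.0, k2=0.5, k3=1.0, k4=0.5):
    """Return (p_crossover, p_mutation) for one mating pair.

    Individuals with above-average fitness get smaller probabilities
    (to preserve good solutions); below-average ones get larger values
    (to keep exploring). Assumes a maximization problem.
    """
    eps = 1e-12                      # guard against f_max == f_avg
    f_pair = max(f_ind, f_mate)      # fitness of the better parent
    if f_pair >= f_avg:
        pc = k1 * (f_max - f_pair) / (f_max - f_avg + eps)
    else:
        pc = k3
    if f_ind >= f_avg:
        pm = k2 * (f_max - f_ind) / (f_max - f_avg + eps)
    else:
        pm = k4
    return pc, pm

# Example: the best individual in the population is crossed/mutated gently.
print(adaptive_probabilities(f_ind=9.8, f_mate=7.0, f_max=10.0, f_avg=6.0))
```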

Keywords: Genetic algorithm, Adaptive genetic algorithm, Function optimization.

4631 A New Tool for Global Optimization Problems- Cuttlefish Algorithm

Authors: Adel Sabry Eesa, Adnan Mohsin Abdulazeez Brifcani, Zeynep Orman

Abstract:

This paper presents a new meta-heuristic bio-inspired optimization algorithm called the Cuttlefish Algorithm (CFA). The algorithm mimics the color-changing behavior of the cuttlefish to solve numerical global optimization problems. The colors and patterns of the cuttlefish are produced by light reflected from three different layers of cells. The proposed algorithm considers mainly two processes: reflection and visibility. The reflection process simulates the light-reflection mechanism used by these layers, while the visibility process simulates the visibility of matching patterns to the cuttlefish. To show the effectiveness of the algorithm, it is tested against other popular bio-inspired optimization algorithms previously proposed in the literature, such as Genetic Algorithms (GA), Particle Swarm Optimization (PSO) and the Bees Algorithm (BA). Simulations and the obtained results indicate that the proposed CFA is superior when compared with these algorithms.

Keywords: Cuttlefish Algorithm, bio-inspired algorithms, optimization.

4630 Investigation of a Transition from Steady Convection to Chaos in Porous Media Using Piecewise Variational Iteration Method

Authors: Mohamed M. Mousa, Aidarkhan Kaltayev, Shahwar F. Ragab

Abstract:

In this paper, a new dependable algorithm based on an adaptation of the standard variational iteration method (VIM) is used for analyzing the transition from steady convection to chaos for low-to-intermediate Rayleigh number convection in porous media. The solution trajectories show the transition from steady convection to chaos occurring at a slightly subcritical value of the Rayleigh number, the critical value being associated with the loss of linear stability of the steady convection solution. The VIM is applied as an algorithm over a sequence of intervals for finding accurate approximate solutions to the considered model and other dynamical systems; we call this technique the piecewise VIM. Numerical comparisons between the piecewise VIM and classical fourth-order Runge-Kutta (RK4) numerical solutions reveal that the proposed technique is a promising tool for nonlinear chaotic and non-chaotic systems.

Keywords: Variational iteration method, free convection, Chaos, Lorenz equations.

4629 Simulated Annealing Application for Structural Optimization

Authors: Farhad Kolahan, M. Hossein Abolbashari, Samaeddin Mohitzadeh

Abstract:

Several methods are available for the weight and shape optimization of structures, among which Evolutionary Structural Optimization (ESO) is one of the most widely used. In ESO, however, the optimization criterion is completely case-dependent, and only improving solutions are accepted during the search. In this paper, a Simulated Annealing (SA) algorithm is used for the structural optimization problem. This algorithm differs from other random search methods by accepting non-improving solutions. The SA algorithm is implemented so as to reduce the number of finite element analyses (function evaluations). Computational results show that SA can efficiently and effectively solve such optimization problems within a short search time.

Keywords: Simulated annealing, Structural optimization, Compliance, C.V. product.

4628 Optimal DG Allocation in Distribution Network

Authors: A. Safari, R. Jahani, H. A. Shayanfar, J. Olamaei

Abstract:

This paper shows the results obtained in an analysis of the impact of distributed generation (DG) on distribution losses and presents a new algorithm for the optimal allocation of distributed generation resources in distribution networks. The optimization is based on a hybrid of the Genetic Algorithm and Particle Swarm Optimization (HGAPSO) aimed at optimal DG allocation in the distribution network. Through this algorithm, a significant improvement in the optimization goal is achieved. A numerical example demonstrates the superiority of the proposed algorithm in comparison with the simple genetic algorithm.

Keywords: Distributed Generation, Distribution Networks, Genetic Algorithm, Particle Swarm Optimization.

4627 Optimization of a Three-Term Backpropagation Algorithm Used for Neural Network Learning

Authors: Yahya H. Zweiri

Abstract:

The back-propagation algorithm calculates the weight changes of an artificial neural network, and a two-term algorithm with a dynamically optimal learning rate and a momentum factor is commonly used. Recently, the addition of an extra term, called a proportional factor (PF), to the two-term BP algorithm was proposed. The third term increases the speed of the BP algorithm. However, the PF term also reduces the convergence of the BP algorithm, and optimization approaches for evaluating the learning parameters are required to facilitate the application of the three-term BP algorithm. This paper considers the optimization of the new back-propagation algorithm by using derivative information. A family of approaches exploiting the derivatives with respect to the learning rate, momentum factor and proportional factor is presented. These approaches autonomously compute the derivatives in the weight space, using information gathered from the forward and backward procedures. The three-term BP algorithm and the optimization approaches are evaluated on the benchmark XOR problem.
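
A schematic sketch of the three-term update described above: the usual learning-rate and momentum terms plus a term driven by the current output error through the proportional factor. Variable names, constants, and the exact way the error enters each weight's update are assumptions; the paper derives the learning parameters from derivative information rather than fixing them.

```python
import numpy as np

def three_term_update(w, grad, prev_delta, output_error,
                      eta=0.1, alpha=0.9, gamma=0.05):
    """One three-term backpropagation weight update (schematic form).

    delta_w = -eta * dE/dw            (learning-rate term)
              + alpha * prev_delta    (momentum term)
              + gamma * e             (proportional-factor term, driven
                                       by the current output error)
    Returns the new weights and the step just taken.
    """
    delta_w = -eta * grad + alpha * prev_delta + gamma * output_error
    return w + delta_w, delta_w

# Toy usage on a single weight vector (values are illustrative only).
w = np.array([0.2, -0.4])
prev = np.zeros_like(w)
grad = np.array([0.05, -0.10])        # dE/dw from backpropagation
err = np.array([0.3, 0.3])            # proportional term input
w, prev = three_term_update(w, grad, prev, err)
print(w)
```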

Keywords: Neural Networks, Backpropagation, Optimization.

4626 IBFO_PSO: Evaluating the Performance of Bio-Inspired Integrated Bacterial Foraging Optimization Algorithm and Particle Swarm Optimization Algorithm in MANET Routing

Authors: K. Geetha, P. Thangaraj, C. Rasi Priya, C. Rajan, S. Geetha

Abstract:

This paper presents the performance of the Integrated Bacterial Foraging Optimization and Particle Swarm Optimization (IBFO_PSO) technique in MANET routing. BFO is a bio-inspired algorithm that simulates the foraging behavior of bacteria and is effectively applied to improve routing performance in MANETs. The results show that PSO integrated with BFO reduces routing delay, energy consumption and communication overhead.

Keywords: Ant Colony Optimization, Bacterial Foraging Optimization, Hybrid Routing Intelligent Algorithm, Naturally inspired algorithms, Particle Swarm Optimization.

4625 Application of a New Hybrid Optimization Algorithm on Cluster Analysis

Authors: T. Niknam, M. Nayeripour, B.Bahmani Firouzi

Abstract:

Clustering techniques have received attention in many areas including engineering, medicine, biology and data mining. The purpose of clustering is to group together data points which are close to one another. The K-means algorithm is one of the most widely used techniques for clustering. However, K-means has two shortcomings: it depends on the initial state and converges to local optima, and global solutions of large problems cannot be found with a reasonable amount of computational effort. Many studies in clustering have tried to overcome the local-optima problem. This paper presents an efficient hybrid evolutionary optimization algorithm that combines Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO), called PSO-ACO, for optimally clustering N objects into K clusters. The new PSO-ACO algorithm is tested on several data sets, and its performance is compared with those of ACO, PSO and K-means clustering. The simulation results show that the proposed evolutionary optimization algorithm is robust and suitable for handling data clustering.
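
In PSO/ACO-style clustering, each candidate solution typically encodes the K cluster centres, and the quantity being minimized is the total distance of the data points to their nearest centre. A minimal sketch of that cost function (the encoding and distance measure are assumptions):

```python
import numpy as np

def clustering_cost(particle, data, k):
    """Cost of one candidate solution in metaheuristic clustering.

    `particle` encodes K cluster centres as a flat vector of length K*d;
    the cost is the total distance of each point to its nearest centre,
    which is the quantity PSO/ACO-style clustering methods minimize.
    """
    d = data.shape[1]
    centres = particle.reshape(k, d)
    dists = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
    return float(dists.min(axis=1).sum())

# Toy usage: 2 clusters in 2-D.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
good = np.array([0.0, 0.0, 3.0, 3.0])      # centres near the true clusters
bad = np.array([1.5, 1.5, 1.6, 1.4])       # both centres in the middle
print(clustering_cost(good, data, 2), clustering_cost(bad, data, 2))
```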

Keywords: Ant Colony Optimization (ACO), Data clustering, Hybrid evolutionary optimization algorithm, K-means clustering, Particle Swarm Optimization (PSO).

4624 The Whale Optimization Algorithm and Its Implementation in MATLAB

Authors: S. Adhirai, R. P. Mahapatra, Paramjit Singh

Abstract:

Optimization is an important tool in making decisions and in analysing physical systems. In mathematical terms, an optimization problem is the problem of finding the best solution from among the set of all feasible solutions. The paper discusses the Whale Optimization Algorithm (WOA) and its applications in different fields. The algorithm is tested using MATLAB because of its unique and powerful features. The benchmark functions used with the WOA algorithm are grouped as unimodal (F1-F7), multimodal (F8-F13), and fixed-dimension multimodal (F14-F23). Out of these benchmark functions, we show the experimental results for F7, F11, and F19 for different numbers of iterations. The search space and objective space for the selected functions are drawn, and finally, the best solution as well as the best optimal value of the objective function found by WOA are presented. The results demonstrate that the WOA performs better than state-of-the-art meta-heuristic and conventional algorithms.
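
For reference, a compact sketch of the standard WOA update cycle (shrinking encircling, spiral bubble-net move, and random search). It follows the usual published update rules in a slightly simplified per-agent form with a scalar A coefficient; parameter values are assumptions rather than those used in the paper's MATLAB runs.

```python
import numpy as np

def woa(f, dim=10, n=30, iters=500, lo=-10.0, hi=10.0, b=1.0, seed=0):
    """Compact Whale Optimization Algorithm loop for minimization."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n, dim))
    best = min(X, key=f).copy()
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters                     # decreases linearly from 2 to 0
        for i in range(n):
            A = 2.0 * a * rng.random() - a            # scalar coefficient here
            C = 2.0 * rng.random(dim)
            if rng.random() < 0.5:
                if abs(A) < 1:                        # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                                 # explore: move relative to a random whale
                    rand = X[rng.integers(n)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                     # spiral (bubble-net) move
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
        cand = min(X, key=f)
        if f(cand) < f(best):
            best = cand.copy()
    return best, f(best)

print(woa(lambda x: float(np.sum(x ** 2))))
```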

Keywords: Optimization, optimal value, objective function, optimization problems, meta-heuristic optimization algorithms, Whale Optimization Algorithm, Implementation, MATLAB.

4623 Reformulations of Big Bang-Big Crunch Algorithm for Discrete Structural Design Optimization

Authors: O. Hasançebi, S. Kazemzadeh Azad

Abstract:

In the present study, the efficiency of the Big Bang-Big Crunch (BB-BC) algorithm is investigated in discrete structural design optimization. It is shown that a standard version of the BB-BC algorithm is sometimes unable to produce reasonable solutions to problems from discrete structural design optimization. Two reformulations of the algorithm, referred to as modified BB-BC (MBB-BC) and exponential BB-BC (EBB-BC), are introduced to enhance the capability of the standard algorithm in locating good solutions for steel truss and frame type structures, respectively. The performance of the proposed algorithms is evaluated and compared with the standard version, as well as with some other algorithms, over several practical design examples. In these examples, steel structures are sized for minimum weight subject to stress, stability and displacement limitations according to the provisions of AISC-ASD.
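
For context, a sketch of the standard continuous BB-BC cycle that the reformulations build on, following Erol and Eksin's formulation: a Big Crunch step contracts the population to a fitness-weighted centre of mass, and a Big Bang step scatters new candidates around it with a spread that shrinks over the iterations. Parameter values are assumptions, and the discrete sizing machinery of the MBB-BC/EBB-BC variants is not reproduced.

```python
import numpy as np

def bb_bc(f, dim=5, n=40, iters=200, lo=-10.0, hi=10.0, alpha=1.0, seed=0):
    """Standard Big Bang-Big Crunch cycle for continuous minimization
    (assumes strictly non-negative objective values for the 1/f weights)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n, dim))                  # initial "big bang"
    best, best_f = None, np.inf
    for k in range(1, iters + 1):
        fit = np.array([f(x) for x in X])
        if fit.min() < best_f:
            best, best_f = X[np.argmin(fit)].copy(), fit.min()
        # Big Crunch: fitness-weighted centre of mass (1/f weights, minimization)
        w = 1.0 / (fit + 1e-12)
        centre = (w[:, None] * X).sum(axis=0) / w.sum()
        # Big Bang: scatter new points around the centre, shrinking with k
        spread = alpha * (hi - lo) / k
        X = np.clip(centre + rng.standard_normal((n, dim)) * spread, lo, hi)
    return best, best_f

print(bb_bc(lambda x: float(np.sum(x ** 2))))
```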

Keywords: Structural optimization, discrete optimization, metaheuristics, big bang-big crunch (BB-BC) algorithm, design optimization of steel trusses and frames.

4622 Chemical Reaction Algorithm for Expectation Maximization Clustering

Authors: Li Ni, Pen ManMan, Li KenLi

Abstract:

Clustering has been an intensive research topic for years because of its multifaceted applications in areas such as biology, information retrieval, medicine and business. Expectation maximization (EM) is an algorithmic framework for clustering and one of the ten classic algorithms of machine learning. Traditionally, optimization of an objective function has been the standard approach in EM, and research has therefore investigated the utility of evolutionary computing and related techniques in this regard. Chemical Reaction Optimization (CRO) is a recently established method whose properties can be exploited to solve optimization problems. This paper presents an algorithm framework (EM-CRO) with modified CRO operators for EM clustering problems. The hybrid algorithm mainly addresses the sensitivity to initial values of objective-function-based clustering algorithms. Our experiments take the classic EM-type algorithms k-means and fuzzy k-means as examples and use the CRO algorithm to optimize their initial values, obtaining the K-means-CRO and FKM-CRO algorithms. Their experimental results show improved efficiency in solving objective-function optimization clustering problems.

Keywords: Chemical reaction optimization, expectation maximization, initialization, objective function clustering.

4621 Exponential Particle Swarm Optimization Approach for Improving Data Clustering

Authors: Neveen I. Ghali, Nahed El-Dessouki, Mervat A. N., Lamiaa Bakrawi

Abstract:

In this paper we use exponential particle swarm optimization (EPSO) to cluster data. We then compare the EPSO clustering algorithm, which uses an exponentially varying inertia weight, with the PSO clustering algorithm, which uses a linearly varying inertia weight. The comparison is evaluated on five data sets. The experimental results show that the EPSO clustering algorithm increases the possibility of finding the optimal positions, as it decreases the number of failures, and that it has a smaller quantization error than the PSO clustering algorithm, i.e., the EPSO clustering algorithm is more accurate than the PSO clustering algorithm.
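
The comparison hinges on how the inertia weight is varied. A minimal sketch of the two schedules (the exact exponential form and decay constant used in the paper are not given, so they are assumptions here):

```python
import numpy as np

def linear_inertia(t, t_max, w_start=0.9, w_end=0.4):
    """Standard linearly decreasing inertia weight."""
    return w_start - (w_start - w_end) * t / t_max

def exponential_inertia(t, t_max, w_start=0.9, w_end=0.4, gamma=4.0):
    """Exponentially decaying inertia weight: decays quickly at first,
    then levels off near w_end (gamma controls the decay rate)."""
    return w_end + (w_start - w_end) * np.exp(-gamma * t / t_max)

t = np.arange(0, 101)
print(linear_inertia(t, 100)[:5], exponential_inertia(t, 100)[:5])
```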

Keywords: Particle swarm optimization, data clustering, exponential PSO.

4620 A Simple Adaptive Algorithm for Norm-Constrained Optimization

Authors: Hyun-Chool Shin

Abstract:

In this paper we propose a simple adaptive algorithm that iteratively solves the unit-norm constrained optimization problem. Instead of the conventional normalization by the parameter norm, the proposed algorithm incorporates a scalar normalization that is computationally much simpler. An analysis of the stationary points is presented to show that the proposed algorithm indeed solves the constrained optimization problem. The simulation results illustrate that the proposed algorithm performs as well as conventional ones while being computationally simpler.

Keywords: constrained optimization, unit-norm, LMS, principal component analysis.

4619 An Optimization Algorithm Based on Dynamic Schema with Dissimilarities and Similarities of Chromosomes

Authors: Radhwan Yousif Sedik Al-Jawadi

Abstract:

Optimization is necessary for finding appropriate solutions to a range of real-life problems. In particular, genetic (or more generally, evolutionary) algorithms have proved very useful in solving many problems for which analytical solutions are not available. In this paper, we present an optimization algorithm called Dynamic Schema with Dissimilarity and Similarity of Chromosomes (DSDSC), which is a variant of the classical genetic algorithm. This approach constructs new chromosomes from a schema and pairs of existing ones by exploring their dissimilarities and similarities. To show the effectiveness of the algorithm, it is tested and compared with the classical GA on 15 two-dimensional optimization problems taken from the literature. We have found that, in most cases, our method is better than the classical genetic algorithm.

Keywords: Genetic algorithm, similarity and dissimilarity, chromosome injection, dynamic schema.

4618 Choosing Search Algorithms in Bayesian Optimization Algorithm

Authors: Hao Wu, Jonathan L. Shapiro

Abstract:

The Bayesian Optimization Algorithm (BOA) is an algorithm based on the estimation of distributions. It uses techniques from data modeling with Bayesian networks to estimate the joint distribution of promising solutions. Different search algorithms can be used to obtain the structure of the Bayesian network. The key point that BOA addresses is whether the constructed Bayesian network can generate new and useful solutions (strings) that lead the algorithm in the right direction to solve the problem. Undoubtedly, this ability is a crucial factor in the efficiency of BOA. Various search algorithms can be used in BOA, but their performance differs, and a suitable method for quantifying this difference is needed to choose among them. In this paper, a greedy search algorithm and a stochastic search algorithm are used in BOA to solve a certain optimization problem, and a method using the Kullback-Leibler (KL) divergence to reflect their difference is described.
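
The comparison metric itself is standard. A minimal sketch of the KL divergence between two discrete distributions, e.g. the distribution of strings generated by the learned Bayesian network versus a target distribution (the support and smoothing constant are assumptions):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) between two discrete
    distributions over the same support (in nats)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

# Distribution of strings generated by the model vs. the target distribution.
model = [0.70, 0.20, 0.05, 0.05]
target = [0.40, 0.30, 0.20, 0.10]
print(kl_divergence(target, model))
```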

Keywords: Bayesian optimization algorithm, greedy search, KL divergence, stochastic search.

4617 Application of Soft Computing Methods for Economic Dispatch in Power Systems

Authors: Jagabondhu Hazra, Avinash Sinha

Abstract:

The economic dispatch problem is an optimization problem whose objective function is highly nonlinear, non-convex and non-differentiable and may have multiple local minima. Therefore, classical optimization methods may not converge or may get trapped in a local minimum. This paper presents a comparative study of four different evolutionary algorithms, i.e. the genetic algorithm, bacteria foraging optimization, ant colony optimization and particle swarm optimization, for solving the economic dispatch problem. All the methods are tested on the IEEE 30-bus test system. Simulation results are presented to show the comparative performance of these methods.

Keywords: Ant colony optimization, bacteria foraging optimization, economic dispatch, evolutionary algorithm, genetic algorithm, particle swarm optimization.

4616 Dense Chaos in Coupled Map Lattices

Authors: Tianxiu Lu, Peiyong Zhu

Abstract:

This paper is mainly concerned with a kind of coupled map lattices (CMLs). New definitions of dense δ-chaos and dense chaos (which is a special case of dense δ-chaos with δ = 0) in discrete spatiotemporal systems are given and sufficient conditions for these systems to be densely chaotic or densely δ-chaotic are derived.
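
For orientation, the usual metric-space definitions behind these notions (the paper adapts them to discrete spatiotemporal systems, so take this as a standard-textbook sketch rather than the paper's exact statement): for a continuous map f on a metric space (X, d),

```latex
\[
  LY_{\delta}(f) \;=\; \Bigl\{ (x,y) \in X \times X \;:\;
    \limsup_{n\to\infty} d\bigl(f^{n}(x), f^{n}(y)\bigr) > \delta, \;
    \liminf_{n\to\infty} d\bigl(f^{n}(x), f^{n}(y)\bigr) = 0 \Bigr\}.
\]
% f is densely \delta-chaotic when LY_{\delta}(f) is dense in X \times X;
% dense chaos is the special case \delta = 0 (ordinary Li-Yorke pairs).
```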

Keywords: Discrete spatiotemporal systems, coupled map lattices, dense δ-chaos, Li-Yorke pairs.

4615 A Multi-objective Fuzzy Optimization Method of Resource Input Based on Genetic Algorithm

Authors: Tao Zhao, Xin Wang

Abstract:

With the increasing complexity of engineering problems, the traditional single-objective, deterministic optimization method cannot meet people's requirements. A multi-objective fuzzy optimization model of resource input is built in this paper for the M chlor-alkali chemical eco-industrial park. First, the model is transformed, using fuzzy theory, into a form that can be solved by a genetic algorithm. Then, a fitness function is constructed for the genetic algorithm. Finally, a numerical example is presented to show that the method is more practical and efficient than the traditional single-objective optimization method.
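
A common way to carry out the fuzzification step is to give each objective a linear satisfaction-degree membership function and let the GA maximize the smallest satisfaction degree (max-min aggregation). The sketch below illustrates that construction; the objective names, bounds and aggregation rule are assumptions, not the paper's exact model:

```python
def satisfaction(value, worst, best):
    """Linear satisfaction-degree membership for a minimization objective:
    1 at (or below) the best acceptable value, 0 at (or above) the worst."""
    if value <= best:
        return 1.0
    if value >= worst:
        return 0.0
    return (worst - value) / (worst - best)

def fuzzy_fitness(objectives, bounds):
    """Max-min aggregation: the GA maximizes the smallest satisfaction
    degree over all objectives, a common fuzzy multi-objective fitness."""
    return min(satisfaction(v, worst, best)
               for v, (worst, best) in zip(objectives, bounds))

# Toy usage: two objectives (e.g. cost and resource input), assumed ranges.
print(fuzzy_fitness([120.0, 8.0], [(200.0, 100.0), (10.0, 5.0)]))
```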

Keywords: Fitness function, genetic algorithm, multi-objective fuzzy optimization, satisfaction degree membership function.

4614 Voltage Stability Enhancement Using Cat Swarm Optimization Algorithm

Authors: P. Suryakumari, P. Kantarao

Abstract:

The Optimal Power Flow (OPF) problem in an electrical power system is a static, non-linear optimization problem that may be formulated with a single objective or multiple objectives. This paper presents an algorithm for solving the voltage-stability-oriented reactive power dispatch (RPD) problem in a power system. The proposed approach employs the cat swarm optimization algorithm to find optimal settings of the RPD control variables. Generator terminal voltages, reactive power generation of the capacitor banks and tap-changing transformer settings are taken as the optimization variables. The CSO algorithm is tested on the standard IEEE 30-bus system and the results are compared with other methods to prove the effectiveness of the new algorithm. The results indicate that the proposed method is the best of those compared for solving the optimal reactive power dispatch problem.

Keywords: RPD problem, voltage stability enhancement, CSO algorithm.

4613 Comparing the Performance of the Particle Swarm Optimization and the Genetic Algorithm on the Geometry Design of Longitudinal Fin

Authors: Hassan Azarkish, Said Farahat, S.Masoud H. Sarvari

Abstract:

In the present work, the performance of particle swarm optimization and the genetic algorithm is compared on a typical geometry design problem. The design maximizes the heat transfer rate from a given fin volume. The analysis presumes a linear temperature distribution along the fin. The fin profile is generated using B-spline curves and controlled by changing the control point coordinates. An inverse method is applied to find the fin geometry for which the linear temperature distribution along the fin corresponds to the optimum design. The population size, the number of iterations and the time to convergence are used to measure efficiency. Results show that particle swarm optimization is the more efficient method for this geometry optimization.

Keywords: Genetic Algorithm, Geometry Optimization, longitudinal Fin, Particle Swarm Optimization

4612 Investigation on Novel Based Naturally-Inspired Swarm Intelligence Algorithms for Optimization Problems in Mobile Ad Hoc Networks

Authors: C. Rajan, K. Geetha, C. Rasi Priya, S. Geetha

Abstract:

Nature is an immensely gifted source for solving complex problems and often helps to find the optimal solution. A Mobile Ad Hoc NETwork (MANET) is a broad research area in networking, consisting of a set of independent nodes. MANETs are characterized by dynamic topology, high mobility and independence from any fixed infrastructure or centralized network. Bio-inspired algorithms mimic nature to solve optimization problems, opening a new era in MANETs. Typical Swarm Intelligence (SI) algorithms are Ant Colony Optimization (ACO), Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), the Modified Termite Algorithm, the Bat Algorithm (BA), the Wolf Search Algorithm (WSA) and so on. This work mainly concentrates on the nature of MANETs and the behavior of nodes, and analyses various performance metrics such as throughput, QoS and end-to-end delay.

Keywords: Ant Colony Algorithm, Artificial Bee Colony algorithm, Bio-Inspired algorithm, Modified Termite Algorithm.

4611 Nature Inspired Metaheuristic Algorithms for Multilevel Thresholding Image Segmentation - A Survey

Authors: C. Deepika, J. Nithya

Abstract:

Segmentation is one of the essential tasks in image processing. Thresholding is one of the simplest techniques for performing image segmentation, and multilevel thresholding is simple and effective. The primary objective of bi-level or multilevel thresholding for image segmentation is to determine the best threshold values. Various techniques have been proposed to achieve multilevel thresholding. Here, a study of some nature-inspired metaheuristic algorithms for multilevel thresholding in image segmentation is conducted, covering the Particle Swarm Optimization (PSO) algorithm, Artificial Bee Colony optimization (ABC), the Ant Colony Optimization (ACO) algorithm and the Cuckoo Search (CS) algorithm.
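
Whichever metaheuristic is used, the thresholds are usually scored by a criterion such as Otsu's between-class variance (Kapur's entropy is the other common choice). A minimal sketch of the Otsu-style objective that PSO/ABC/ACO/CS would maximize (the histogram size and the synthetic example are assumptions):

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu-style objective for multilevel thresholding.

    `hist` is a 256-bin grey-level histogram; `thresholds` is a list of
    threshold values. A metaheuristic (PSO, ABC, ACO, CS, ...) searches
    for the thresholds that maximize this value.
    """
    p = hist / hist.sum()                      # grey-level probabilities
    levels = np.arange(len(hist))
    mu_total = float((p * levels).sum())
    edges = [0] + [int(t) for t in sorted(thresholds)] + [len(hist)]
    sigma_b = 0.0
    for a, b in zip(edges[:-1], edges[1:]):    # one term per class
        w = p[a:b].sum()
        if w <= 0:
            continue
        mu = (p[a:b] * levels[a:b]).sum() / w
        sigma_b += w * (mu - mu_total) ** 2
    return float(sigma_b)

# Toy usage with a synthetic bimodal histogram and two thresholds.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)])
hist, _ = np.histogram(np.clip(pixels, 0, 255), bins=256, range=(0, 256))
print(between_class_variance(hist, [120, 200]))
```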

Keywords: Ant colony optimization, Artificial bee colony optimization, Cuckoo search algorithm, Image segmentation, Multilevel thresholding, Particle swarm optimization.

4610 Performance Comparison of Prim’s and Ant Colony Optimization Algorithm to Select Shortest Path in Case of Link Failure

Authors: Rimmy Yadav, Avtar Singh

Abstract:

Ant Colony Optimization (ACO) is a promising modern approach to combinatorial optimization. Here, ACO is applied to finding the shortest path during a communication link failure. In this paper, the performance of Prim's algorithm and the ACO algorithm is compared. Using time complexity and program execution time as parameters, we demonstrate the good performance of ACO in finding an excellent solution to the shortest-path problem during a communication link failure.

Keywords: Ant colony optimization, link failure, prim’s algorithm.

4609 Combined Simulated Annealing and Genetic Algorithm to Solve Optimization Problems

Authors: Younis R. Elhaddad

Abstract:

Combinatorial optimization problems arise in many scientific and practical applications, so many researchers try to find or improve methods that solve these problems with high-quality results in less time. The Genetic Algorithm (GA) and Simulated Annealing (SA) have both been used to solve optimization problems. GA and SA each search a solution space through a sequence of iterative states, but there are also significant differences between them: the GA mechanism works in parallel on a set of solutions and exchanges information through the crossover operation, whereas SA works on a single solution at a time. In this work, SA and GA are combined using a new technique in order to overcome the disadvantages of both algorithms.

Keywords: Genetic Algorithm, Optimization problems, Simulated Annealing, Traveling Salesman Problem

4608 A New Evolutionary Algorithm for Cluster Analysis

Authors: B.Bahmani Firouzi, T. Niknam, M. Nayeripour

Abstract:

Clustering is a very well known technique in data mining. One of the most widely used clustering techniques is the k-means algorithm. Solutions obtained from this technique depend on the initialization of the cluster centers, and the final solution converges to local minima. In order to overcome the shortcomings of the K-means algorithm, this paper proposes a hybrid evolutionary algorithm based on the combination of PSO, SA and K-means, called PSO-SA-K, which can find better cluster partitions. The performance is evaluated on several benchmark data sets. The simulation results show that the proposed algorithm outperforms previous approaches, such as PSO, SA and K-means, for the partitional clustering problem.

Keywords: Data clustering, Hybrid evolutionary optimization algorithm, K-means algorithm, Simulated Annealing (SA), Particle Swarm Optimization (PSO).

4607 Genetic Algorithm Optimization of the Economical, Ecological and Self-Consumption Impact of the Energy Production of a Single Building

Authors: Ludovic Favre, Thibaut M. Schafer, Jean-Luc Robyr, Elena-Lavinia Niederhäuser

Abstract:

This paper presents an optimization method based on a genetic algorithm for energy management inside buildings, developed in the frame of the Smart Living Lab (SLL) project in Fribourg (Switzerland). The algorithm optimizes the interaction between renewable energy production, storage systems and energy consumers. In comparison with standard algorithms, the innovative aspect of this project is the extension of the smart regulation over three simultaneous criteria: energy self-consumption, the reduction of greenhouse gas emissions, and operating costs. The genetic algorithm approach was chosen because of the large number of optimization variables and the non-linearity of the optimization function. The optimization process also includes real-time data from the building as well as weather forecasts and user habits. This information is used by a physical model of the building's energy resources to predict future energy production and needs, to select the best energy strategy, and to combine production and storage of energy in order to meet the demand for electrical and thermal energy. The principle of operation of the algorithm as well as a typical output example are presented.
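
The abstract names three simultaneous criteria. One simple way to present them to a genetic algorithm is a weighted scalarized fitness, sketched below; the weights, normalization constants, and the `simulate` building-model interface are purely illustrative assumptions, not the project's actual implementation.

```python
def energy_fitness(schedule, simulate, w_self=0.4, w_co2=0.3, w_cost=0.3):
    """Scalarized fitness over the three criteria named in the abstract.

    `schedule` is a candidate control strategy (GA individual); `simulate`
    is assumed to be a building model returning (self_consumption in [0,1],
    co2_kg, cost_chf) for that schedule. Weights and normalization
    constants are illustrative assumptions.
    """
    self_consumption, co2_kg, cost_chf = simulate(schedule)
    co2_norm = co2_kg / 50.0        # assumed daily reference values
    cost_norm = cost_chf / 100.0
    # Maximize self-consumption, minimize emissions and cost.
    return w_self * self_consumption - w_co2 * co2_norm - w_cost * cost_norm

# Toy usage with a stub model standing in for the building simulation.
stub = lambda s: (0.62, 18.0, 35.0)
print(energy_fitness([0.0] * 24, stub))
```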

Keywords: Building’s energy, control system, energy management, modelling, genetic optimization algorithm, renewable energy, greenhouse gases, energy storage.
