Search results for: multidimensional compromise optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2008


1138 Experimental Implementation of Model Predictive Control for Permanent Magnet Synchronous Motor

Authors: Abdelsalam A. Ahmed

Abstract:

Fast speed response is a crucial performance requirement for Permanent Magnet Synchronous Motor (PMSM) drives in electric traction systems. In this paper, the PMSM is driven with a Model-based Predictive Control (MPC) technique. Fast speed tracking is achieved through optimization of the DC source utilization using MPC. The technique is based on predicting the optimum voltage vector applied to the drive. The control technique is evaluated by comparison with cascaded PI control based on Space Vector Pulse Width Modulation (SVPWM). MPC and SVPWM-based field-oriented control (FOC) are implemented on the TMS320F2812 DSP and its power driver circuits. The designed MPC for a PMSM drive is experimentally validated on a laboratory test bench. The performances are compared with those obtained by a conventional PI-based system in order to highlight the improvements, especially regarding the speed tracking response.

Keywords: Permanent magnet synchronous motor, model predictive control, optimization of DC source utilization, cascaded PI control, space vector pulse width modulation, TMS320F2812 DSP.
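
Illustrative sketch (not the paper's implementation): a finite-control-set flavour of MPC selects, at every sampling instant, the inverter switching state whose one-step-ahead current prediction minimizes a tracking cost. The machine parameters, the simplified dq model, and the cost below are assumptions chosen only to show the mechanism.

```python
import numpy as np

# Illustrative PMSM parameters (assumed, not taken from the paper)
Rs, Ld, Lq = 0.5, 8e-3, 8e-3      # stator resistance [ohm], d/q inductances [H]
psi_f = 0.1                       # permanent-magnet flux linkage [Wb]
Ts = 100e-6                       # sampling period [s]

def predict_dq_currents(id_k, iq_k, vd, vq, omega_e):
    """One-step Euler prediction of the dq currents for a candidate voltage."""
    did = (vd - Rs * id_k + omega_e * Lq * iq_k) / Ld
    diq = (vq - Rs * iq_k - omega_e * (Ld * id_k + psi_f)) / Lq
    return id_k + Ts * did, iq_k + Ts * diq

def select_switching_state(id_k, iq_k, id_ref, iq_ref, omega_e, vdc):
    """Enumerate the 8 basic inverter states and return the one with the lowest cost."""
    best_cost, best_state = np.inf, None
    for sw in range(8):                              # switching states 000..111
        sa, sb, sc = (sw >> 2) & 1, (sw >> 1) & 1, sw & 1
        # Clarke transform of the phase voltages produced by this switching state
        v_alpha = vdc * (2 * sa - sb - sc) / 3.0
        v_beta = vdc * (sb - sc) / np.sqrt(3.0)
        # Rotor angle taken as zero here, so dq coincides with alpha-beta (simplification)
        id_p, iq_p = predict_dq_currents(id_k, iq_k, v_alpha, v_beta, omega_e)
        cost = (id_ref - id_p) ** 2 + (iq_ref - iq_p) ** 2
        if cost < best_cost:
            best_cost, best_state = cost, sw
    return best_state, best_cost

state, cost = select_switching_state(id_k=0.0, iq_k=1.0, id_ref=0.0,
                                     iq_ref=2.0, omega_e=100.0, vdc=310.0)
print(f"selected switching state: {state:03b}, predicted cost: {cost:.6f}")
```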

1137 Order Optimization of a Telecommunication Distribution Center through Service Lead Time

Authors: Tamás Hartványi, Ferenc Tóth

Abstract:

The performance of a European telecommunication distribution center is measured by service lead time and quality. The operation model is CTO (customized to order), i.e., high-mix customization of telecommunication network equipment and parts. CTO operation comprises material receiving, warehousing, and assembly and configuration of network and server equipment to order, based on customer specifications. The variety of products and orders does not support a mass-production structure. One of the success factors in satisfying customers is a proper aggregate planning method that yields optimized human resources and highly efficient asset utilization. The research investigates several methods and identifies a suitable way to build an order book simulation; the practical optimization problem may contain thousands of variables, and the running times of the developed algorithms were therefore taken into account with high importance. Two operations research models were developed: in the first, customer demand is given in orders and there is no changeover time; in the second, customer demand is given for product types and changeover time is constant.

Keywords: CTO, aggregated planning, demand simulation, changeover time.

1136 Optimization of HALO Structure Effects in 45nm p-type MOSFETs Device Using Taguchi Method

Authors: F. Salehuddin, I. Ahmad, F. A. Hamid, A. Zaharim, H. A. Elgomati, B. Y. Majlis, P. R. Apte

Abstract:

In this study, the Taguchi method was used to optimize the effect of HALO structure (halo implant) variations on threshold voltage (VTH) and leakage current (ILeak) in a 45 nm p-type Metal Oxide Semiconductor Field Effect Transistor (MOSFET) device. Besides the halo implant dose, the other process parameters used were Source/Drain (S/D) implant dose, oxide growth temperature and silicide anneal temperature. This work was done using a TCAD simulator consisting of a process simulator, ATHENA, and a device simulator, ATLAS. These two simulators were combined with the Taguchi method to aid in designing and optimizing the process parameters. In this research, the most effective process parameters with respect to VTH and ILeak are halo implant dose (40%) and S/D implant dose (52%), respectively, while the second-ranked factors affecting VTH and ILeak are oxide growth temperature (32%) and halo implant dose (34%), respectively. The results show that, after optimization, VTH approaches -0.157 V at ILeak = 0.195 mA/μm.

Keywords: Optimization, p-type MOSFETs device, HALO Structure, Taguchi Method.
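
Illustrative sketch (hypothetical data, not the paper's TCAD results): the percentage contributions quoted above are the kind of figures produced by a Taguchi analysis, i.e. smaller-the-better signal-to-noise ratios followed by factor-level sums of squares. The factor levels and ILeak readings below are invented.

```python
import numpy as np

# Hypothetical results: rows = experiments, columns = repeated ILeak readings [mA/um]
ileak = np.array([
    [0.21, 0.23], [0.30, 0.28], [0.26, 0.25],
    [0.19, 0.20], [0.24, 0.26], [0.31, 0.33],
    [0.22, 0.21], [0.27, 0.29], [0.18, 0.19],
])
# Factor level (0, 1, 2) of each experiment for two illustrative factors
halo_dose_level = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
sd_dose_level   = np.array([0, 1, 2, 0, 1, 2, 0, 1, 2])

# Smaller-the-better S/N ratio: -10 log10(mean(y^2))
sn = -10.0 * np.log10((ileak ** 2).mean(axis=1))

def percent_contribution(levels, sn):
    """Sum of squares of the factor-level means, as a share of the total variation."""
    grand = sn.mean()
    ss_factor = sum((sn[levels == l].mean() - grand) ** 2 * (levels == l).sum()
                    for l in np.unique(levels))
    ss_total = ((sn - grand) ** 2).sum()
    return 100.0 * ss_factor / ss_total

print("halo dose contribution: %.1f%%" % percent_contribution(halo_dose_level, sn))
print("S/D dose contribution:  %.1f%%" % percent_contribution(sd_dose_level, sn))
```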

1135 A New Hybrid Optimization Method for Optimum Distribution Capacitor Planning

Authors: A. R. Seifi

Abstract:

This work presents a new algorithm based on a combination of fuzzy logic (FUZ), Dynamic Programming (DP), and Genetic Algorithm (GA) approaches for capacitor allocation in distribution feeders. The problem formulation considers two distinct objectives related to the total cost of power loss and the total cost of capacitors, including purchase and installation costs. The novel formulation is a multi-objective and non-differentiable optimization problem. The proposed method uses fuzzy reasoning for siting of capacitors in radial distribution feeders, DP for sizing, and finally GA for finding the optimum shape of the membership functions used in the fuzzy reasoning stage. The proposed method has been implemented in a software package and its effectiveness has been verified on a 9-bus radial distribution feeder to support the conclusions. A comparison between the proposed method and similar methods from other research works shows the effectiveness of the proposed method for solving the optimum capacitor planning problem.

Keywords: Capacitor planning, Fuzzy logic method, Genetic Algorithm, Dynamic programming, Radial Distribution feeder

1134 An Analysis of the Optimization Condition of Plasma Generator for Air Conditioner System

Authors: Arunrungrusmi S., Chaokamnerd W., Tanitteerapan T., Mungkung N., Yuji T.

Abstract:

This research aimed to develop a plasma system for use in air conditioners. The developed plasma system can be installed in all split-type air conditioners, improving air quality to a level equal to that of existing plasma systems. The development process was as follows: 1) study the plasma systems used in air conditioners, 2) design a plasma generator, 3) develop the plasma generator, and 4) test its performance in several types of air conditioners. The plasma system was driven by a 14 kV AC high voltage at a frequency of 50 kHz. Carbon was used as the conductor to generate the arc in the air purification system. The research was tested by installing the plasma generator in wall-type air conditioners at three installation positions: air flow out, air flow in, and room center. The results for the plasma generator installed in split-type air conditioners revealed that the air flow out installation provided the highest average ozone output at 223 mg/h and thus the highest efficiency of air quality improvement. The air flow in installation and the room center installation provided average ozone outputs of 163 mg/h and 64 mg/h, respectively.

Keywords: Air Conditioner, Plasma generator, High voltage, Optimization, Installation position.

1133 Cash Flow Optimization on Synthetic CDOs

Authors: Timothée Bligny, Clément Codron, Antoine Estruch, Nicolas Girodet, Clément Ginet

Abstract:

Collateralized Debt Obligations (CDOs) are not as widely used nowadays as they were before the 2007 subprime crisis. Nonetheless, there remains an enthralling challenge to optimize the cash flows associated with synthetic CDOs. A Gaussian-based model is used here, in which default correlation and unconditional probabilities of default are highlighted. Numerous simulations are then performed based on this model for different scenarios, in order to evaluate the associated cash flows given a specific number of defaults at different periods of time. Cash flows are not calculated on a single bought or sold tranche but rather on a combination of bought and sold tranches. Under some assumptions, the simplex algorithm gives a way to find the maximum cash flow according to the correlation of defaults and the maturities. The Gaussian model used is not realistic in crisis situations, and the present system does not handle buying or selling a portion of a tranche, only the whole tranche. However, the work provides the investor with relevant elements on what to buy and sell, and when.

Keywords: Synthetic Collateralized Debt Obligation (CDO), Credit Default Swap (CDS), Cash Flow Optimization, Probability of Default, Default Correlation, Strategies, Simulation, Simplex.
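
Illustrative sketch (assumptions throughout): once a simulation has produced expected cash flows per unit notional for each candidate bought or sold tranche position, the final step reduces to a linear program. The snippet uses scipy's linprog as a stand-in for the simplex step, with invented cash flows and a simple notional budget, and it relaxes the whole-tranche restriction mentioned in the abstract.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical expected net cash flows per unit notional for six candidate
# positions (three bought tranches, three sold tranches), assumed to come from
# a prior Monte Carlo simulation of defaults.
expected_cf = np.array([0.012, 0.007, 0.002, -0.004, 0.005, 0.003])

# Maximize expected_cf @ x  <=>  minimize -expected_cf @ x
c = -expected_cf

# Simple budget constraints: at most 10 units of bought and 10 units of sold notional
A_ub = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]], dtype=float)
b_ub = np.array([10.0, 10.0])

# The paper trades whole tranches only; here each position is relaxed to [0, 5]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 5)] * 6, method="highs")
print("optimal notionals per position:", res.x)
print("maximum expected cash flow    :", -res.fun)
```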

1132 Simplified Models to Determine Nodal Voltages in Problems of Optimal Allocation of Capacitor Banks in Power Distribution Networks

Authors: A. Pereira, S. Haffner, L. V. Gasperin

Abstract:

This paper presents two simplified models to determine nodal voltages in power distribution networks. These models allow estimating the impact of installing reactive power compensation equipment such as fixed or switched capacitor banks. The procedure used to develop the models is similar to that used to develop linear power flow models of transmission lines, which have been widely used in optimization problems of operation planning and system expansion. The steady-state non-linear load flow equations are approximated by linear equations relating the voltage amplitudes and currents. The approximations are based on the high ratio of line resistance to line reactance (R/X), which is typical of power distribution networks. The performance and accuracy of the models are evaluated through comparisons with the exact results obtained from the load flow solution, using two test networks: a hypothetical network with 23 nodes and a real network with 217 nodes.

Keywords: Distribution network models, distribution systems, optimization, power system planning.
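
Illustrative sketch (toy feeder, not the paper's 23- or 217-node networks): with a high R/X ratio, the drop on each branch of a radial feeder can be approximated linearly from the downstream active and reactive power, which is the kind of simplification the abstract describes. All feeder data below are invented.

```python
import numpy as np

# Toy 4-node radial feeder: node 0 is the substation, branches feed nodes 1..3 in series.
V0 = 1.0                              # substation voltage [p.u.]
r = np.array([0.02, 0.03, 0.025])     # branch resistances [p.u.]
x = np.array([0.01, 0.012, 0.010])    # branch reactances  [p.u.] (R/X is high)
p_load = np.array([0.10, 0.08, 0.05]) # nodal active loads   [p.u.]
q_load = np.array([0.04, 0.03, 0.02]) # nodal reactive loads [p.u.]

def linear_voltages(q_comp):
    """Approximate nodal voltage magnitudes; q_comp = capacitor injections per node."""
    q_net = q_load - q_comp
    v = [V0]
    for k in range(len(r)):
        # Power flowing through branch k is the sum of all downstream net loads.
        p_flow, q_flow = p_load[k:].sum(), q_net[k:].sum()
        # Linearized drop: dV ~ (R*P + X*Q) / V_nominal
        v.append(v[-1] - (r[k] * p_flow + x[k] * q_flow) / V0)
    return np.array(v)

print("voltages without capacitors      :", linear_voltages(np.zeros(3)).round(4))
print("voltages with 0.03 p.u. at node 3:",
      linear_voltages(np.array([0.0, 0.0, 0.03])).round(4))
```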

1131 Optimal Maintenance Clustering for Rail Track Components Subject to Possession Capacity Constraints

Authors: Cuong D. Dao, Rob J.I. Basten, Andreas Hartmann

Abstract:

This paper studies the optimal planning of preventive maintenance and renewal activities for components in a single railway track when the available time for maintenance is limited. The rail-track system consists of several types of components, such as rail, ballast, and switches, with different preventive maintenance and renewal intervals. To perform maintenance or renewal on the track, a train-free period, called a possession, is required. Since a major possession directly affects the regular train schedule, maintenance and renewal activities are clustered as much as possible. In a highly dense and utilized railway network, the possession time on the track is critical, since the demand for train operations is very high and a long possession has a severe impact on the regular train schedule. We present an optimization model and investigate the maintenance schedules with and without the possession capacity constraint. In addition, we integrate into the optimization model the socio-economic cost related to the effects of the maintenance time, as part of the variable possession cost. A numerical example is provided to illustrate the model.

Keywords: Rail-track components, maintenance, optimal clustering, possession capacity.

1130 Model Reduction of Linear Systems by Conventional and Evolutionary Techniques

Authors: S. Panda, S. K. Tomar, R. Prasad, C. Ardil

Abstract:

The reduction of Single Input Single Output (SISO) continuous systems into a Reduced Order Model (ROM), using a conventional and an evolutionary technique, is presented in this paper. In the conventional technique, the advantages of the Mihailov stability criterion and the continued fraction expansions (CFE) technique are combined: the reduced denominator polynomial is derived using the Mihailov stability criterion, and the numerator is obtained by matching the quotients of the Cauer second form of the continued fraction expansion. In the evolutionary technique, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model. The PSO method is based on the minimization of the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through a numerical example.

Keywords: Reduced Order Modeling, Stability, Continued Fraction Expansions, Mihailov Stability Criterion, Particle Swarm Optimization, Integral Squared Error.
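
Illustrative sketch of the evolutionary part (the example system and PSO settings are assumptions, and the swarm loop is a bare-bones implementation rather than the authors' code): a second-order model is fitted to a higher-order transfer function by minimizing the ISE between their unit step responses.

```python
import numpy as np
from scipy import signal

# Original higher-order system (illustrative, not from the paper)
num_full, den_full = [8.0], [1.0, 6.0, 11.0, 6.0, 8.0]
t = np.linspace(0.0, 10.0, 500)
_, y_full = signal.step(signal.TransferFunction(num_full, den_full), T=t)

def ise(params):
    """Integral squared error between full and reduced-model step responses."""
    b0, a1, a0 = params
    if a1 <= 0 or a0 <= 0:            # keep the reduced model stable
        return 1e6
    _, y_red = signal.step(signal.TransferFunction([b0], [1.0, a1, a0]), T=t)
    return np.trapz((y_full - y_red) ** 2, t)

# Minimal PSO: 20 particles, 3 parameters (b0, a1, a0)
rng = np.random.default_rng(0)
pos = rng.uniform(0.1, 5.0, size=(20, 3))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([ise(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(100):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([ise(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("reduced model parameters (b0, a1, a0):", gbest.round(3))
print("final ISE:", ise(gbest))
```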

1129 Multi-Objective Multi-Mode Resource-Constrained Project Scheduling Problem by Preemptive Fuzzy Goal Programming

Authors: Phruksaphanrat B.

Abstract:

This research proposes a preemptive fuzzy goal programming model for the multi-objective multi-mode resource-constrained project scheduling problem. The objectives of the problem are minimization of the total time and the total cost of the project. The objective in a multi-mode resource-constrained project scheduling problem is often the minimization of makespan; however, both time and cost should be considered at the same time, with different priority levels. Moreover, not all cost elements of a project are included in the conventional cost objective function, and an incomplete total project cost causes errors in the resulting project schedule. In this research, preemptive fuzzy goal programming is presented to solve the multi-objective multi-mode resource-constrained project scheduling problem. It can find a compromise solution to the problem and is also flexible enough to be adjusted to find a variety of alternative solutions.

Keywords: Multi-mode resource constrained project scheduling problem, Fuzzy set, Goal programming, Preemptive fuzzy goal programming.

1128 Proposing a Pareto-based Multi-Objective Evolutionary Algorithm to Flexible Job Shop Scheduling Problem

Authors: Seyed Habib A. Rahmati

Abstract:

During the last decades, developing multi-objective evolutionary algorithms for optimization problems has received considerable attention. The flexible job shop scheduling problem, as an important scheduling optimization problem, has received this attention too. However, most of the multi-objective algorithms developed for this problem combine their objectives and then solve the multi-objective problem through single-objective approaches, with the exception of a few studies that use Pareto-based algorithms. Therefore, in this paper, a new Pareto-based algorithm called the controlled elitism non-dominated sorting genetic algorithm (CENSGA) is proposed for the multi-objective FJSP (MOFJSP). The considered objectives are makespan, critical machine workload, and total workload of machines. The proposed algorithm is also statistically compared with one of the best Pareto-based algorithms in the literature on several multi-objective criteria.

Keywords: Scheduling, Flexible job shop scheduling problem, controlled elitism non-dominated sorting genetic algorithm

1127 Optimum Time Coordination of Overcurrent Relays using Two Phase Simplex Method

Authors: Prashant P. Bedekar, Sudhir R. Bhide, Vijay S. Kale

Abstract:

Overcurrent (OC) relays are the major protection devices in a distribution system. The operating times of the OC relays are to be coordinated properly to avoid mal-operation of the backup relays. OC relay time coordination in ring-fed distribution networks is a highly constrained optimization problem which can be stated as a linear programming problem (LPP). The purpose is to find an optimum relay setting that minimizes the operating times of the relays and, at the same time, keeps the relays properly coordinated to avoid mal-operation. This paper presents the two-phase simplex method for optimum time coordination of OC relays. The method is based on the simplex algorithm, which is used to find the optimum solution of an LPP. It introduces artificial variables to obtain an initial basic feasible solution (IBFS). The artificial variables are removed in the iterative process of the first phase, which minimizes the auxiliary objective function. The second phase minimizes the original objective function and gives the optimum time coordination of the OC relays.

Keywords: Constrained optimization, LPP, Overcurrent relay coordination, Two-phase simplex method.
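
Illustrative sketch (invented relay constants and topology, and scipy's linprog instead of a hand-coded two-phase simplex): the decision variables are the relay time-dial settings, the objective is the total primary operating time, and each backup/primary pair must respect a coordination time interval.

```python
import numpy as np
from scipy.optimize import linprog

# Operating time of relay i for a given fault is modeled as t_i = a_i * TDS_i,
# where a_i lumps the inverse-time characteristic evaluated at the fault current.
a = np.array([2.0, 2.5, 3.0])        # illustrative constants for relays R1, R2, R3
CTI = 0.3                            # coordination time interval [s]

# Objective: minimize total primary operating time  sum_i a_i * TDS_i
c = a

# Coordination constraints (backup, primary): R2 backs up R1, R3 backs up R2
#   a_b*TDS_b - a_p*TDS_p >= CTI   ->   a_p*TDS_p - a_b*TDS_b <= -CTI
A_ub = np.array([
    [ a[0], -a[1],   0.0],
    [  0.0,  a[1], -a[2]],
])
b_ub = np.array([-CTI, -CTI])

bounds = [(0.05, 1.1)] * 3           # typical time-dial setting range
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("optimal TDS settings:", res.x.round(3))
print("operating times [s] :", (a * res.x).round(3))
```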

1126 Seismic Control of Tall Building Using a New Optimum Controller Based on GA

Authors: A. Shayeghi, H. Eimani Kalasar, H. Shayeghi

Abstract:

This paper emphasizes the application of a genetic algorithm (GA) to optimize the parameters of a Tuned Mass Damper (TMD) for achieving the best results in the reduction of the building response under earthquake excitations. The Integral of the Time multiplied Absolute value of the Error (ITAE), based on the relative displacements of all floors in the building, is taken as the performance index of the optimization criterion. The problem of robust TMD controller design is formulated as an optimization problem based on the ITAE performance index, to be solved using a GA, which has a strong ability to find near-optimal results. An 11-story realistic building, located in the city of Rasht, Iran, is considered as a test system to demonstrate the effectiveness of the proposed GA-based TMD (GATMD) controller without specifying which mode should be controlled. The results of the proposed GATMD controller are compared with the uncontrolled structure through time-domain simulation and several performance indices. The results analysis reveals that the designed GA-based TMD controller has an excellent capability in reducing the response of the seismically excited example building, and that the ITAE performance index, so far little explored in this context, can be introduced as a new criterion for structural dynamic design.

Keywords: Tuned Mass Damper, Genetic Algorithm, Tall Buildings, Structural Dynamics.

1125 Dynamic Correlations and Portfolio Optimization between Islamic and Conventional Equity Indexes: A Vine Copula-Based Approach

Authors: Imen Dhaou

Abstract:

This study examines conditional Value at Risk by applying the GJR-EVT-Copula model and finds the optimal portfolio for eight Dow Jones Islamic-conventional pairs. Our methodology consists of modeling the data by a bivariate GJR-GARCH model, from which we extract the filtered residuals, and then applying the Peak Over Threshold (POT) model to fit the residual tails in order to model the marginal distributions. After that, we use pair-copulas to model the portfolio risk dependence structure and find the optimal portfolio. Finally, with Monte Carlo simulations, we estimate the Value at Risk (VaR) and the conditional Value at Risk (CVaR). The empirical results show the VaR and CVaR values for an equally weighted portfolio of Dow Jones Islamic-conventional pairs. In sum, we find that the optimal investment focuses on the Islamic-conventional US market index pair, which receives a high investment proportion, whereas all other index pairs receive low investment proportions. These results have practical implications for portfolio managers and policymakers concerning optimal asset allocation, portfolio risk management and the diversification advantages of these markets.

Keywords: CVaR, Dow Jones Islamic index, GJR-GARCH-EVT-pair copula, portfolio optimization.
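
Illustrative sketch (arbitrary simulated returns, not the fitted GJR-GARCH-EVT-pair-copula output): once portfolio returns have been simulated, VaR is an empirical quantile of the loss distribution and CVaR is the mean loss beyond it.

```python
import numpy as np

rng = np.random.default_rng(42)
# Placeholder for simulated returns of an equally weighted Islamic/conventional pair;
# in the paper these would come from the fitted GJR-GARCH-EVT-pair-copula model.
simulated_returns = rng.standard_t(df=5, size=100_000) * 0.01

def var_cvar(returns, alpha=0.95):
    """Empirical Value at Risk and Conditional Value at Risk at level alpha."""
    losses = -returns                          # losses are negative returns
    var = np.quantile(losses, alpha)           # alpha-quantile of the loss distribution
    cvar = losses[losses >= var].mean()        # expected loss in the tail beyond VaR
    return var, cvar

var95, cvar95 = var_cvar(simulated_returns, alpha=0.95)
print(f"95% VaR : {var95:.4%}")
print(f"95% CVaR: {cvar95:.4%}")
```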

1124 Optimization of Process Parameters using Response Surface Methodology for the Removal of Zinc(II) by Solvent Extraction

Authors: B. Guezzen, M.A. Didi, B. Medjahed

Abstract:

A factorial design of experiments and a response surface methodology were implemented to investigate the liquid-liquid extraction of zinc (II) from an acetate medium using 1-butyl-imidazolium di(2-ethylhexyl) phosphate [BIm+][D2EHP-]. The extraction parameters, namely initial pH (2.5, 4.5, and 6.6), ionic liquid concentration (1, 5.5, and 10 mM) and salt concentration (0.01, 5, and 10 mM), were optimized using a three-level full factorial design (3^3). The results of the factorial design demonstrate that all these factors are statistically significant, including the quadratic effects of pH and ionic liquid concentration, with the following order of significance: ionic liquid concentration > salt effect > initial pH. Analysis of variance (ANOVA), showing a high coefficient of determination (R2 = 0.91) and low probability values (P < 0.05), confirms the validity of the predicted second-order quadratic model for Zn (II) extraction. The optimum conditions for the extraction of zinc (II), at constant temperature (20 °C), initial Zn (II) concentration (1 mM) and an A/O ratio of unity, were: initial pH 4.8, extractant concentration 9.9 mM, and NaCl concentration 8.2 mM. Under the optimized conditions, the metal ion could be quantitatively extracted.

Keywords: Ionic liquid, response surface methodology, solvent extraction, zinc acetate.

1123 Real Power Generation Scheduling to Improve Steady State Stability Limit in the Java-Bali 500kV Interconnection Power System

Authors: Indar Chaerah Gunadin, Adi Soeprijanto, Ontoseno Penangsang

Abstract:

This paper discusses an active power generation scheduling method for increasing the steady-state stability limit of a power system. Several generation scheduling optimization methods, such as Lagrange, PLN (Indonesian electricity company) operation, and the proposed Z-Thevenin-based method, are studied and compared with respect to their steady-state behavior. The method proposed in this paper is built upon the Thevenin equivalent impedance values between each load and each generator, and the steady-state stability index is obtained with the REI-Dimo method. This research reviews the Java-Bali 500 kV interconnection system. The simulation results show that the proposed method yields the highest steady-state stability limit compared to the other optimization techniques, such as Lagrange and PLN operation. Thus, the proposed method can be used to improve the steady-state stability limit of the system, especially under peak load conditions.

Keywords: generation scheduling, steady-state stability limit, REI Dimo, margin stability

1122 Scheduling Maintenance Actions for Gas Turbines Aircraft Engines

Authors: Anis Gharbi

Abstract:

This paper considers the problem of scheduling maintenance actions for identical aircraft gas turbine engines. Each of the turbines consists of parts which frequently require replacement. A finite inventory of spare parts is available, and all parts are ready for replacement at any time. The inventory consists of both new and refurbished parts; hence, these parts have different field lives. The goal is to find a replacement part sequencing that maximizes the time that the aircraft will keep functioning before the inventory is replenished. The problem is formulated as an identical parallel machine scheduling problem in which the minimum completion time has to be maximized. Two models have been developed. The first is an optimization model based on a 0-1 linear programming formulation, while the second is an approximate procedure which consists of decomposing the problem into several two-machine subproblems, each of which is optimally solved using the first model. Both models have been implemented using Lingo and tested on two sets of randomly generated data with up to 150 parts and 10 turbines. Experimental results show that the optimization model is able to solve only instances with no more than 4 turbines, while the decomposition procedure often provides near-optimal solutions within a maximum CPU time of 3 seconds.

Keywords: Aircraft turbines, Scheduling, Identical parallel machines, 0-1 linear programming, Heuristic.
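
Illustrative sketch (invented part lives and a tiny instance; the paper uses a 0-1 linear programming model in Lingo): the core question is how to assign spare parts with different field lives to identical turbines so that the minimum total life per turbine, i.e. the time before the inventory must be replenished, is maximized.

```python
from itertools import product

# Field lives [flight hours] of the available spare parts (new and refurbished), invented
part_lives = [120, 95, 80, 60, 150, 110]
n_turbines = 2

def best_assignment(lives, m):
    """Exhaustively assign each part to one of m turbines, maximizing the minimum
    total life per turbine (the time the whole aircraft keeps functioning)."""
    best_value, best_assign = -1, None
    for assign in product(range(m), repeat=len(lives)):
        totals = [0] * m
        for part, turbine in zip(lives, assign):
            totals[turbine] += part
        if min(totals) > best_value:
            best_value, best_assign = min(totals), assign
    return best_value, best_assign

value, assign = best_assignment(part_lives, n_turbines)
print("maximized minimum turbine life:", value)
print("part -> turbine assignment    :", assign)
```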

1121 Cost Analysis of Hybrid Wind Energy Generating System Considering CO2 Emissions

Authors: M. A. Badr, M.N. El Kordy, A. N. Mohib, M. M. Ibrahim

Abstract:

The basic objective of this research is to study the effect of a hybrid wind energy system on the cost of generated electricity, considering the cost of reducing CO2 emissions. The system consists of small wind turbine(s), a storage battery bank and a diesel generator (W/D/B). Using an optimization software package, different system configurations are investigated to reach the optimum configuration based on the net present cost (NPC) and cost of energy (COE) as economic optimization criteria, with the cost of avoided CO2 taken into consideration. The system is intended to supply the electrical load of a small community (six families) in a remote Egyptian area. The investigated system is not connected to the electricity grid and may replace an existing conventional diesel-powered supply system, reducing fuel consumption and CO2 emissions. The simulation results showed that the W/D/B energy system is more economical than diesel alone. The estimated COE is 0.308 $/kWh; deducting the cost of avoided CO2, the COE reaches 0.226 $/kWh, which represents an external benefit of the wind turbine, as there are no pollutant emissions during the operational phase.

Keywords: Hybrid wind turbine systems, remote areas electrification, simulation of hybrid energy systems, techno-economic study.

1120 Object-Centric Process Mining Using Process Cubes

Authors: Anahita Farhang Ghahfarokhi, Alessandro Berti, Wil M.P. van der Aalst

Abstract:

Process mining provides ways to analyze business processes. Common process mining techniques consider the process as a whole. However, in real-life business processes, different behaviors exist that make the overall process too complex to interpret. Process comparison is a branch of process mining that isolates different behaviors of the process from each other by using process cubes. Process cubes organize event data along different dimensions; each cell contains a set of events that can be used as input to process mining techniques. Existing work on process cubes assumes a single case notion. However, in real processes, several case notions (e.g., order, item, package) are intertwined. Object-centric process mining is a new branch of process mining that addresses multiple case notions in a process. To build a bridge between object-centric process mining and process comparison, we propose a process cube framework which supports process cube operations such as slice and dice on object-centric event logs. To facilitate the comparison, the framework is integrated with several object-centric process discovery approaches.

Keywords: Process mining, multidimensional process mining, multi-perspective business processes, OLAP, process cubes, process discovery.
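
Illustrative sketch (hypothetical column names and data, not the framework's actual API): an object-centric event table carries several case notions per event, and slice/dice operations restrict the cube's dimensions before a cell is handed to a discovery algorithm.

```python
import pandas as pd

# Hypothetical object-centric event table: one row per event, with references
# to several case notions (order, item) instead of a single case id.
events = pd.DataFrame({
    "activity":  ["create order", "pick item", "pick item", "ship order"],
    "timestamp": pd.to_datetime(["2021-01-03", "2021-01-04", "2021-02-01", "2021-02-02"]),
    "order":     ["o1", "o1", "o2", "o2"],
    "item":      [None, "i1", "i2", None],
    "customer":  ["gold", "gold", "silver", "silver"],
})
events["month"] = events["timestamp"].dt.to_period("M")

def slice_cube(df, dimension, value):
    """Slice: fix one dimension to a single value, keeping all other dimensions."""
    return df[df[dimension] == value]

def dice_cube(df, filters):
    """Dice: restrict several dimensions at once, e.g. {'month': [...], 'customer': [...]}."""
    mask = pd.Series(True, index=df.index)
    for dim, allowed in filters.items():
        mask &= df[dim].isin(allowed)
    return df[mask]

gold_cell = slice_cube(events, "customer", "gold")
jan_gold_cell = dice_cube(events, {"customer": ["gold"],
                                   "month": [pd.Period("2021-01", freq="M")]})
print(gold_cell[["activity", "order", "item"]])
print(jan_gold_cell[["activity", "order", "item"]])
# Each resulting cell can then be flattened on a chosen case notion (e.g. 'order')
# and passed to an object-centric or classical process discovery algorithm.
```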

1119 Optimization of Control Parameters for MRR in Injection Flushing Type of EDM on Stainless Steel 304 Workpiece

Authors: M. S. Reza, M. Hamdi, A.S. Hadi

Abstract:

The operating control parameters of the injection flushing type of electrical discharge machining (EDM) process on a stainless steel 304 workpiece with copper tools are optimized according to an individual machining characteristic, the material removal rate (MRR). A low MRR during the EDM process decreases machining productivity; hence, the quality characteristic for MRR is set to higher-the-better to achieve optimum machining productivity. The Taguchi method has been used for the construction, layout and analysis of the experiment for the MRR machining characteristic. The use of the Taguchi method saves a lot of time and cost in preparing and machining the experimental samples. An L18 orthogonal array, a fundamental component in the statistical design of experiments, has been used to plan the experiments, and Analysis of Variance (ANOVA) is used to determine the optimum machining parameters for this machining characteristic. The control parameters selected for these optimization experiments are polarity, pulse-on duration, discharge current, discharge voltage, machining depth, machining diameter and dielectric liquid pressure. The results show that the higher the discharge voltage, the higher the MRR.

Keywords: ANOVA, EDM, Injection Flushing, L18 Orthogonal Array, MRR, Stainless Steel 304

1118 Genetic Algorithm Parameters Optimization for Bi-Criteria Multiprocessor Task Scheduling Using Design of Experiments

Authors: Sunita Dhingra, Satinder Bal Gupta, Ranjit Biswas

Abstract:

Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has been revealed as an excellent technique for finding an optimal solution. In the past, several GA-based methods have been considered for the solution of this problem, but all of them consider a single criterion. In the present work, the bi-criteria multiprocessor task scheduling problem is considered, minimizing a weighted sum of makespan and total completion time. The efficiency and effectiveness of a genetic algorithm can be improved by optimizing its different parameters, such as the crossover and mutation operators, crossover probability, and selection function. The effects of the GA parameters on the minimization of the bi-criteria fitness function, and the subsequent setting of the parameters, have been accomplished using the central composite design (CCD) approach of response surface methodology (RSM) in Design of Experiments. The experiments have been performed with different levels of the GA parameters, and analysis of variance has been performed to identify the parameters significant for minimizing makespan and total completion time simultaneously.

Keywords: Multiprocessor task scheduling, Design of experiments, Genetic Algorithm, Makespan, Total completion time.

1117 Statistical Optimization of Medium Components for Biomass Production of Chlorella pyrenoidosa under Autotrophic Conditions and Evaluation of Its Biochemical Composition under Stress Conditions

Authors: N. P. Dhull, K. Gupta, R. Soni, D. K. Rahi, S. K. Soni

Abstract:

The aim of the present work was to statistically design an autotrophic medium for maximum biomass production by Chlorella pyrenoidosa using response surface methodology. After evaluating a one-factor-at-a-time approach, K2HPO4, KNO3, MgSO4.7H2O and NaHCO3 were identified as the most critical components of Fog's autotrophic medium. The study showed that the maximum biomass yield was achieved when the concentrations of MgSO4.7H2O, K2HPO4, KNO3 and NaHCO3 were 0.409 g/L, 0.24 g/L, 1.033 g/L, and 3.265 g/L, respectively. The biomass productivity of C. pyrenoidosa improved from 0.14 g/L in the defined Fog's medium to 1.40 g/L in the modified Fog's medium, a 10-fold increase. The biochemical composition of C. pyrenoidosa was altered using nitrogen-limitation stress, bringing about a 5.23-fold increase in lipid content compared with the control (cells without stress), as analyzed by the FTIR integration method.

Keywords: Autotrophic condition, Chlorella pyrenoidosa, FTIR, Response Surface Methodology, Optimization.

1116 Common Sense Leadership in the Example of Turkish Political Leader Devlet Bahçeli

Authors: B. Gültekin, T. Gültekin

Abstract:

Peace diplomacy is the most important international tool for maintaining peace all over the world. This study consists of three parts. In the first part, the leadership of Devlet Bahçeli, leader of the Nationalist Movement Party, is introduced as a tool of peace communication and peace management. Also in this part, peace communication is explained through the peace leadership traits of Devlet Bahçeli, one of the political leaders representing the concepts of compromise and agreement across different sides of politics. The second part of the study analyzes Devlet Bahçeli's leadership within the frame of peace communication, and the final part is about creating an original public communication model for public diplomacy based on Devlet Bahçeli as an example. The main purpose of this study is thus to develop an original peace communication model, including peace modules, peace management projects, and the original dialogue procedures and protocols exhibited in the policies of Devlet Bahçeli. The political leadership represented by Devlet Bahçeli can inspire political leaders to provide peace communication. In this study, the principles and policies of Devlet Bahçeli's peace leadership are explained as an original model on a peace communication platform.

Keywords: Dialogue management, public diplomacy, peace diplomacy, peace leadership.

1115 Evolving a Fuzzy Rule-Base for Image Segmentation

Authors: A. Borji, M. Hamidi

Abstract:

A new method for color image segmentation using fuzzy logic is proposed in this paper. Our aim is to automatically produce a fuzzy system for color classification and image segmentation with the least number of rules and the minimum error rate. Particle swarm optimization is a subclass of evolutionary algorithms inspired by the social behavior of fish, bees, birds, and other animals that live together in colonies. We use the comprehensive learning particle swarm optimization (CLPSO) technique to find optimal fuzzy rules and membership functions because it discourages premature convergence. Each particle of the swarm encodes a set of fuzzy rules. During evolution, a population member tries to maximize a fitness criterion, which here combines a high classification rate and a small number of rules. Finally, the particle with the highest fitness value is selected as the best set of fuzzy rules for image segmentation. Our results, using this method for soccer field image segmentation in RoboCup contests, show 89% performance. Less computational load is needed with this method compared with other methods such as ANFIS, because it generates a smaller number of fuzzy rules. A large and varied training dataset makes the proposed method invariant to illumination noise.

Keywords: Comprehensive learning particle swarm optimization, fuzzy classification.

1114 A Particle Swarm Optimal Control Method for DC Motor by Considering Energy Consumption

Authors: Yingjie Zhang, Ming Li, Ying Zhang, Jing Zhang, Zuolei Hu

Abstract:

In the actual start-up process of DC motors, the DC drive system often faces a conflict between energy consumption and acceleration performance. To resolve this conflict, this paper proposes a comprehensive performance index in which an energy consumption index is added to the classical control performance index for the DC motor starting process. Taking the comprehensive performance index as the cost function, a particle swarm optimization algorithm is designed to optimize the comprehensive performance. Simulations of the comprehensive performance of the DC motor are then conducted, provided that the weight coefficient of the energy consumption index is properly designed. The simulation results show that, as the weight of energy consumption increases, the energy efficiency is significantly improved at the expense of a slight sacrifice in the speed-of-response indicators under the comprehensive performance index method. Compared with a traditional proportional-integral-derivative controller, the energy efficiency was increased from 63.18% to 68.48% while the response time was simultaneously reduced from 0.2875 s to 0.1736 s.

Keywords: Comprehensive performance index, energy consumption, acceleration performance, particle swarm optimal control.

1113 Optimizing Logistics for Courier Organizations with Considerations of Congestions and Pickups: A Courier Delivery System in Amman as Case Study

Authors: Nader A. Al Theeb, Zaid Abu Manneh, Ibrahim Al-Qadi

Abstract:

The traveling salesman problem (TSP) is a combinatorial integer optimization problem that asks: "What is the optimal route for a vehicle to traverse in order to deliver requests to a given set of customers?" It is widely used by package carrier companies' distribution centers. The main goal of applying the TSP in courier organizations is to minimize the time it takes the courier on each trip to deliver or pick up the shipments during a day. In this article, an optimization model is constructed to create a new TSP variant that optimizes the routing in a courier organization with a consideration of congestion in Amman, the capital of Jordan. Real data were collected by different methods and analyzed. Then, CPLEX Concert Technology was used to solve the proposed model for some randomly generated data instances and for the real collected data. In the end, the results showed a great improvement in time compared with the current trip times, and an economic study was conducted afterwards to determine the impact of using such models.

Keywords: Traveling salesman problem, congestion, pick-up, integer programming, package carriers, service engineering.
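
Illustrative sketch (invented travel times; the paper's instances are solved with CPLEX): congestion makes the travel-time matrix asymmetric, and for a toy instance the best delivery route starting and ending at the depot can be found by enumerating permutations.

```python
from itertools import permutations

# Asymmetric travel-time matrix [minutes]: entry [i][j] is location i -> j.
# Asymmetry stands in for congestion on certain directions; values are invented.
travel_time = [
    [0, 12, 18, 25],   # 0 = depot
    [14, 0, 9, 20],
    [17, 11, 0, 8],
    [28, 22, 10, 0],
]

def best_route(times):
    """Enumerate all delivery orders of customers 1..n-1, starting and ending at the depot."""
    n = len(times)
    best_cost, best_order = float("inf"), None
    for order in permutations(range(1, n)):
        route = (0,) + order + (0,)
        cost = sum(times[a][b] for a, b in zip(route, route[1:]))
        if cost < best_cost:
            best_cost, best_order = cost, route
    return best_cost, best_order

cost, route = best_route(travel_time)
print("best route:", " -> ".join(map(str, route)), f"({cost} minutes)")
```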

1112 Near-Field Robust Adaptive Beamforming Based on Worst-Case Performance Optimization

Authors: Jing-ran Lin, Qi-cong Peng, Huai-zong Shao

Abstract:

The performance of adaptive beamforming degrades substantially in the presence of steering vector mismatches. This degradation is especially severe in the near field, because the 3-dimensional source location is more difficult to estimate than the 2-dimensional direction of arrival in far-field cases. As a solution, a novel approach to near-field robust adaptive beamforming (RABF) is proposed in this paper. It is a natural extension of traditional far-field RABF and belongs to the class of diagonal loading approaches, with the loading level determined based on worst-case performance optimization. However, unlike methods that solve for the optimal loading by iteration, it offers a simple closed-form solution after some approximations, and consequently the optimal weight vector can be expressed in closed form. Besides simplicity and low computational cost, the proposed approach reveals how different factors affect the optimal loading as well as the weight vector. Its excellent performance in the near field is confirmed via a number of numerical examples.

Keywords: Robust adaptive beamforming (RABF), near-field, steering vector mismatches, diagonal loading, worst-case performance optimization.
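
Illustrative sketch (assumed array, signals, and a fixed loading level, rather than the paper's worst-case-optimal near-field loading): diagonal loading adds a scaled identity to the sample covariance before the MVDR weight formula is applied.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_snapshots = 8, 200

# Presumed steering vector of a source at broadside (far-field, for simplicity)
a = np.ones((n_sensors, 1), dtype=complex)

# Simulated snapshots: desired signal plus noise (illustrative only)
s = (rng.standard_normal(n_snapshots) + 1j * rng.standard_normal(n_snapshots)) / np.sqrt(2)
noise = (rng.standard_normal((n_sensors, n_snapshots))
         + 1j * rng.standard_normal((n_sensors, n_snapshots))) / np.sqrt(2)
x = a @ s[None, :] + 0.5 * noise

R = x @ x.conj().T / n_snapshots            # sample covariance matrix

def diagonally_loaded_weights(R, a, loading):
    """MVDR weights with diagonal loading: w = (R + eps*I)^-1 a / (a^H (R + eps*I)^-1 a)."""
    Rl = R + loading * np.eye(R.shape[0])
    Rinv_a = np.linalg.solve(Rl, a)
    return Rinv_a / (a.conj().T @ Rinv_a)

w = diagonally_loaded_weights(R, a, loading=0.1 * np.trace(R).real / n_sensors)
num = float(abs(w.conj().T @ a).item()) ** 2
den = float((w.conj().T @ R @ w).real.item())
print("output power ratio |w^H a|^2 / (w^H R w):", num / den)
```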

1111 Effect of Process Parameters on the Proximate Composition, Functional and Sensory Properties

Authors: C. I. Omohimi, O. P. Sobukola, K. O. Sarafadeen, L.O. Sanni

Abstract:

Flour from Mucuna beans (Mucuna pruriens) was used to produce a texturized meat analogue using a single screw extruder, to monitor modifications in the proximate composition and the functional properties at high moisture levels. Response surface methodology based on a Box-Behnken design with three levels of barrel temperature (110, 120, 130 °C), screw speed (100, 120, 140 rpm) and feed moisture (44, 47, 50%) was used in 17 runs. Regression models describing the effect of the variables on the product responses were obtained. Descriptive profile analyses and a consumer acceptability test were carried out on the optimized flavoured extruded meat analogue. The responses were mostly affected by barrel temperature and moisture level, and to a lesser extent by screw speed. Optimization results based on the desirability concept indicated that a barrel temperature of 120.15 °C, feed moisture of 47% and screw speed of 119.19 rpm would produce a meat analogue with preferable proximate composition, functional and sensory properties, and the consumer test revealed consumers' liking of the product.

Keywords: Functional properties, mucuna bean flour, optimization, proximate composition, texturized meat analogue.

1110 Mathematical Modeling of the AMCs Cross-Contamination Removal in the FOUPs: Finite Element Formulation and Application in FOUP’s Decontamination

Authors: N. Santatriniaina, J. Deseure, T.Q. Nguyen, H. Fontaine, C. Beitia, L. Rakotomanana

Abstract:

Nowadays, with increasing wafer sizes and decreasing critical dimensions in integrated circuit manufacturing, the modern high-tech microelectronics industry must pay maximum attention to contamination control. The move to 300 mm wafers is accompanied by the use of Front Opening Unified Pods (FOUPs) for wafer transport and storage. In these pods, airborne cross-contamination may occur between the wafers and the pods. A predictive approach using modeling and computational methods is a very powerful way to understand and qualify the AMC cross-contamination processes. This work investigates the numerical tools required to study the AMC cross-contamination transfer phenomena between wafers and FOUPs. Numerical optimization and a finite element formulation in transient analysis were established. The analytical solution of the one-dimensional problem was developed, and the physical constants were calibrated by minimizing the least-squares distance between the model (the analytical 1D solution) and the experimental data. The behavior of the AMCs in transient analysis was determined. The model framework preserves the classical forms of the diffusion and convection-diffusion equations and yields a consistent form of Fick's law. The adsorption process and the surface roughness effect were also translated into boundary conditions, using a Dirichlet-to-Neumann switch condition and an interface condition. The methodology is applied, first using optimization methods with the analytical solution to determine the physical constants, and second using the finite element method, including the adsorption kinetics and the Dirichlet-to-Neumann switch condition.

Keywords: AMCs, FOUP, cross-contamination, adsorption, diffusion, numerical analysis, wafers, Dirichlet to Neumann, finite elements methods, Fick’s law, optimization.
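
Illustrative sketch (placeholder diffusivity and geometry; the paper's full FEM model also includes convection, adsorption kinetics and the Dirichlet-to-Neumann switch): the transport core is Fickian diffusion, integrated here with an explicit finite-difference scheme in one dimension.

```python
import numpy as np

# 1D diffusion of an AMC concentration c(x, t):  dc/dt = D * d2c/dx2
D = 1.0e-9        # diffusivity [m^2/s], placeholder value
L = 1.0e-3        # domain length [m] (e.g. a polymer wall thickness)
nx, nt = 51, 20000
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D            # respects the explicit stability limit dt <= dx^2 / (2D)

c = np.zeros(nx)                # initially clean material
c[0] = 1.0                      # Dirichlet condition: fixed concentration at the exposed face
for _ in range(nt):
    lap = (c[:-2] - 2 * c[1:-1] + c[2:]) / dx**2
    c[1:-1] += dt * D * lap
    c[0] = 1.0                  # exposed face held at the ambient concentration
    c[-1] = c[-2]               # zero-flux (Neumann) condition at the far face

print("simulated time: %.1f s" % (nt * dt))
print("concentration profile (5 samples):", c[::12].round(3))
```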

1109 Mining Correlated Bicluster from Web Usage Data Using Discrete Firefly Algorithm Based Biclustering Approach

Authors: K. Thangavel, R. Rathipriya

Abstract:

Over the past decade, biclustering has become a popular data mining technique, not only in the field of biological data analysis but also in other applications with high-dimensional two-way datasets, such as text mining and market data analysis. Biclustering clusters both rows and columns of a dataset simultaneously, as opposed to traditional clustering, which clusters either rows or columns; it retrieves subgroups of objects that are similar in one subgroup of variables and different in the remaining variables. The Firefly Algorithm (FA) is a recently proposed metaheuristic inspired by the collective behavior of fireflies. This paper provides a preliminary assessment of a discrete version of FA (DFA) for the task of mining coherent, large-volume biclusters from web usage data. The experiments were conducted on two web usage datasets from a public dataset repository, and the performance of DFA was compared with that of another population-based metaheuristic, binary Particle Swarm Optimization (PSO). The results achieved demonstrate the usefulness of DFA in tackling the biclustering problem.

Keywords: Biclustering, Binary Particle Swarm Optimization, Discrete Firefly Algorithm, Firefly Algorithm, Usage profile, Web usage mining.
