Search results for: production process optimization
22372 Hexagonal Honeycomb Sandwich Plate Optimization Using Gravitational Search Algorithm
Authors: A. Boudjemai, A. Zafrane, R. Hocine
Abstract:
Honeycomb sandwich panels are increasingly used in the construction of space vehicles because of their outstanding strength, stiffness and light weight properties. However, the use of honeycomb sandwich plates comes with difficulties in the design process as a result of the large number of design variables involved, including composite material design, shape and geometry. Hence, this work presents an optimal design of hexagonal honeycomb sandwich structures subjected to the space environment. The optimization process is performed using a set of algorithms including the gravitational search algorithm (GSA). Numerical results are obtained and presented for the full set of algorithms. The results obtained by the GSA are considerably better than those of the other algorithms used in this study.
Keywords: optimization, gravitational search algorithm, genetic algorithm, honeycomb plate
Procedia PDF Downloads 377
22371 Model of Optimal Centroids Approach for Multivariate Data Classification
Authors: Pham Van Nha, Le Cam Binh
Abstract:
Particle swarm optimization (PSO) is a population-based stochastic optimization algorithm. PSO was inspired by the natural behavior of birds and fish during migration and foraging. PSO is considered a multidisciplinary optimization model that can be applied to various optimization problems. PSO's ideas are simple and easy to understand, but PSO has mostly been applied to simple model problems. We argue that, in order to expand the applicability of PSO to complex problems, PSO should be described more explicitly in the form of a mathematical model. In this paper, we represent PSO as a mathematical model and apply it to multivariate data classification. First, PSO's general mathematical model (MPSO) is analyzed as a universal optimization model. Then, the Model of Optimal Centroids (MOC) is proposed for multivariate data classification. Experiments were conducted on several benchmark data sets to demonstrate the effectiveness of MOC compared with several previously proposed schemes.
Keywords: analysis of optimization, artificial intelligence based optimization, optimization for learning and data analysis, global optimization
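For readers unfamiliar with the mechanics this abstract alludes to, the canonical PSO velocity and position update can be sketched as follows. This is a generic textbook implementation, not the authors' MPSO/MOC model, and all parameter values (inertia `w`, coefficients `c1`/`c2`, bounds) are illustrative assumptions:

```python
import random

def pso(f, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimize f over [lo, hi]^dim with a basic particle swarm."""
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                      # personal best positions
    pbest_val = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity pulled toward personal and global bests
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            val = f(X[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = X[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = X[i][:], val
    return gbest, gbest_val
```

For example, `pso(lambda x: sum(t * t for t in x), dim=2)` drives the swarm toward the origin of a simple sphere function.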
Procedia PDF Downloads 208
22370 Medium Design and Optimization for High β-Galactosidase Producing Microbial Strains from Dairy Waste through Fermentation
Authors: Ashish Shukla, K. P. Mishra, Pushplata Tripathi
Abstract:
This paper investigates the production and optimization of the β-galactosidase enzyme in a synthetic medium by isolated wild strains (S1, S2) and mutated strains (M1, M2) through SSF and SmF. Among the different cell disintegration methods used, the highest specific activity was obtained when the cells were permeabilized using isoamyl alcohol. Wet lab experiments were performed to investigate the effects of the carbon and nitrogen substrates present in Vogel’s medium on β-galactosidase enzyme activity using the S1, S2, M1 and M2 strains through SSF. SmF experiments were performed to study the effects of carbon and nitrogen sources in YLK2Mg medium on β-galactosidase enzyme activity using the S1, S2, M1 and M2 strains. The effect of pH on β-galactosidase enzyme production was also studied using these strains. Appreciable results were obtained in all cases.
Keywords: β-galactosidase, cell disintegration, permeabilized, SSF, SmF
Procedia PDF Downloads 272
22369 High Titer Cellulosic Ethanol Production Achieved by Fed-Batch Prehydrolysis Simultaneous Enzymatic Saccharification and Fermentation of Sulfite Pretreated Softwood
Authors: Chengyu Dong, Shao-Yuan Leu
Abstract:
Cellulosic ethanol production from lignocellulosic biomass can reduce our reliance on fossil fuel, mitigate climate change, and stimulate rural economic development. The relatively low ethanol titer (60 g/L) limits the economic viability of lignocellulose-based biorefineries. The ethanol titer can be increased up to 80 g/L by removing nearly all the non-cellulosic materials, but the capital cost of the pretreatment process then increases significantly. In this study, a fed-batch prehydrolysis simultaneous saccharification and fermentation (PSSF) process was designed to convert sulfite-pretreated softwood (~30% residual lignin) to high concentrations of ethanol (80 g/L). The liquefaction time of the hydrolysis step was shortened to 24 h by employing the fed-batch strategy. Washing the spent liquor out with water could eliminate the inhibition caused by the pretreatment spent liquor; however, the ethanol yield of the lignocellulose was reduced because fermentable sugars were also lost during the washing. Fed-batch prehydrolysis of the whole slurry (i.e. liquid plus solid fractions) of pretreated softwood for 24 h, followed by simultaneous saccharification and fermentation at 28 °C, generated 80 g/L of ethanol. The fed-batch strategy is very effective at eliminating the “solid effect” of high-gravity saccharification, so concentrating the cellulose to nearly 90% in the pretreatment process is not a necessary step for high ethanol titers. Detoxification of the pretreatment spent liquor caused sugar loss and consequently reduced the ethanol yield. The tolerance of the yeast to inhibitors was better at 28 °C; therefore, reducing the temperature of the subsequent fermentation step is a simple and valid way to achieve a high ethanol titer.
Keywords: cellulosic ethanol, sulfite pretreatment, fed-batch PSSF, temperature
Procedia PDF Downloads 367
22368 Statistical Optimization of Vanillin Production by Pycnoporus Cinnabarinus 1181
Authors: Swarali Hingse, Shraddha Digole, Uday Annapure
Abstract:
The present study investigates the biotransformation of ferulic acid to vanillin by Pycnoporus cinnabarinus and its optimization using the one-factor-at-a-time method as well as a statistical approach. The effect of various physicochemical parameters and medium components was studied using the one-factor-at-a-time method. Screening of the significant factors was carried out using an L25 Taguchi orthogonal array, and the selected significant factors were then further optimized using response surface methodology (RSM). The significant media components obtained using the Taguchi L25 orthogonal array were glucose, KH2PO4 and yeast extract. Further, a Box-Behnken design was used to investigate the interactive effects of the three most significant media components. The final medium obtained after optimization using RSM, containing glucose (34.89 g/L), diammonium tartrate (1 g/L), yeast extract (1.47 g/L), MgSO4•7H2O (0.5 g/L), KH2PO4 (0.15 g/L), and CaCl2•2H2O (20 mg/L), amplified vanillin production from 30.88 mg/L to 187.63 mg/L.
Keywords: ferulic acid, Pycnoporus cinnabarinus, response surface methodology, vanillin
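The RSM step described in this abstract amounts to fitting a second-order polynomial response surface to the screened factors. A minimal sketch with synthetic data (the factor settings and responses below are hypothetical, not the paper's measurements) might look like:

```python
import numpy as np

def fit_quadratic_response(X, y):
    """Fit y ≈ b0 + Σ bi·xi + Σ bii·xi² + Σ bij·xi·xj by least squares.
    X: (n, k) matrix of factor settings; y: (n,) responses."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]            # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]       # squared terms
    cols += [X[:, i] * X[:, j]                     # two-factor interactions
             for i in range(k) for j in range(i + 1, k)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta, A

# Hypothetical data: a response over two coded factors in [-1, 1]
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(15, 2))
y = 50 + 3 * X[:, 0] - 2 * X[:, 1] - 4 * X[:, 0] ** 2 + X[:, 0] * X[:, 1]
beta, A = fit_quadratic_response(X, y)
```

The fitted coefficients `beta` can then be inspected (or differentiated) to locate the stationary point of the response surface.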
Procedia PDF Downloads 383
22367 Titania Assisted Metal-Organic Framework Matrix for Elevated Hydrogen Generation Combined with the Production of Graphene Sheets through Water-Splitting Process
Authors: Heba M. Gobara, Ahmed A. M. El-Naggar, Rasha S. El-Sayed, Amal A. AlKahlawy
Abstract:
In this study, a metal-organic framework (Cr-MIL-101) and TiO₂ nanoparticles were utilized as two semiconductors for the water splitting process. The two semiconductors were coupled in order to improve the photocatalytic reactivity for hydrogen production in the presence of methanol as a hole scavenger under visible light (sunlight). The aforementioned semiconductors and the samples collected after the water splitting application were characterized by several techniques, viz. XRD, N₂ adsorption-desorption, TEM, ED, EDX, Raman spectroscopy and total carbon content. The results revealed an efficient H₂ production yield with a maximum purity of 99.3%, along with the in-situ formation of graphene oxide nanosheets and multiwalled carbon nanotubes coated over the surface of the physically mixed Cr-MIL-101–TiO₂ system. The H₂ gas produced was stored when the Cr-MIL-101 catalyst was used individually. The data obtained in this work point to promising candidate materials for the production of pure hydrogen as a clean fuel from the water splitting process. In addition, the in-situ production of graphene nanosheets and carbon nanotubes counts as a promising advance for the presented process.
Keywords: hydrogen production, water splitting, photocatalysts, graphene
Procedia PDF Downloads 188
22366 Numerical Design and Characterization of SiC Single Crystals Obtained with PVT Method
Authors: T. Wejrzanowski, M. Grybczuk, E. Tymicki, K. J. Kurzydlowski
Abstract:
In the present study, numerical simulations of heat and mass transfer in a Physical Vapor Transport reactor during silicon carbide single crystal growth are addressed. Silicon carbide is a wide-bandgap material with unique properties, making it highly applicable in high-power electronics. Because of high manufacturing costs, improvements to the SiC production process are required. In this study, numerical simulations were used as a tool for process optimization. Computer modeling allows for cost- and time-effective analysis of the processes occurring during SiC single crystal growth and provides essential information for improving the process. The quantitative relationships between process conditions, such as temperature or pressure, and the crystal growth rate and the shape of the crystallization front have been studied and verified using experimental data. Based on the modeling results, several process improvements were proposed and implemented.
Keywords: Finite Volume Method, semiconductors, Physical Vapor Transport, silicon carbide
Procedia PDF Downloads 498
22365 Optimization of End Milling Process Parameters for Minimization of Surface Roughness of AISI D2 Steel
Authors: Pankaj Chandna, Dinesh Kumar
Abstract:
The present work analyses different end milling parameters to minimize the surface roughness of AISI D2 steel. D2 steel is generally used for stamping or forming dies, punches, forming rolls, knives, slitters, shear blades, tools, scrap choppers, tyre shredders, etc. Surface roughness is one of the main indices that determine the quality of machined products and is influenced by various cutting parameters. In machining operations, achieving the desired surface quality through optimization of machining parameters is a challenging job. For mating components, surface roughness becomes even more essential; these quality characteristics are highly correlated and are expected to be influenced directly or indirectly by the process parameters or their interactive effects (i.e. the process environment). In this work, the effects of the selected process parameters on surface roughness, and the subsequent setting of the parameter levels, have been accomplished by Taguchi’s parameter design approach. The experiments were performed according to the combinations of process parameter levels suggested by an L9 orthogonal array. The end milling of AISI D2 steel with a carbide tool was investigated experimentally by varying feed, speed and depth of cut, and the surface roughness was measured using a surface roughness tester. Analyses of variance were performed for the mean and the signal-to-noise ratio to estimate the contribution of the different process parameters to the process.
Keywords: D2 steel, orthogonal array, optimization, surface roughness, Taguchi methodology
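The signal-to-noise analysis mentioned in this abstract uses, for a response such as surface roughness, Taguchi's "smaller the better" ratio SN = −10·log₁₀(mean(y²)); the parameter level with the highest SN is preferred. A minimal sketch (the roughness replicates below are hypothetical, not the study's measurements):

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for a 'smaller the better' response
    such as surface roughness: SN = -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical roughness replicates (µm) for two trial settings
sn_a = sn_smaller_is_better([0.8, 0.9, 0.85])   # lower roughness -> higher SN
sn_b = sn_smaller_is_better([1.4, 1.5, 1.45])
```

Comparing `sn_a` and `sn_b` would identify the first setting as the better one, since its roughness values are smaller.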
Procedia PDF Downloads 544
22364 A Holistic Approach for Technical Product Optimization
Authors: Harald Lang, Michael Bader, A. Buchroithner
Abstract:
Holistic methods covering the development process as a whole – e.g. systems engineering – have established themselves in product design. However, technical product optimization, representing improvements in efficiency and/or minimization of losses, usually applies only to single components of a system. Here, a holistic approach is defined based on the hierarchical point of view of systems engineering and subsequently illustrated using the example of an electromechanical flywheel energy storage system for automotive applications.
Keywords: design, product development, product optimization, systems engineering
Procedia PDF Downloads 624
22363 Scheduling in a Single-Stage, Multi-Item Compatible Process Using Multiple Arc Network Model
Authors: Bokkasam Sasidhar, Ibrahim Aljasser
Abstract:
The problem of finding optimal schedules for each piece of equipment in a production process is considered. The process consists of a single manufacturing stage that can handle different types of products, where changing over from one product type to another incurs certain costs. The machine capacity is determined by the upper limit on the quantity that can be processed for each product in a set-up. The changeover costs increase with the number of set-ups; hence, to minimize the costs associated with product changeovers, the planning should process similar product types successively so that the total number of changeovers, and in turn the associated set-up costs, are minimized. The cost minimization problem is equivalent to minimizing the number of set-ups, or equivalently maximizing the capacity utilization between set-ups, i.e. maximizing the total capacity utilization. Further, production is usually planned against customers’ orders, and different customers’ orders are generally assigned one of two priorities – “normal” or “priority”. The production planning problem in such a situation can be formulated as a Multiple Arc Network (MAN) model and solved sequentially using the algorithm for maximizing flow along a MAN and the algorithm for maximizing flow along a MAN with priority arcs. The model aims to provide an optimal production schedule with the objective of maximizing capacity utilization, so that customer-wise delivery schedules are fulfilled, keeping the customer priorities in view. Algorithms are presented for solving the MAN formulation of production planning with customer priorities, and the application of the model is demonstrated through numerical examples.
Keywords: scheduling, maximal flow problem, multiple arc network model, optimization
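The flow-maximization step underlying the MAN formulation can be illustrated with a standard Edmonds–Karp maximum-flow routine. This is a generic textbook algorithm, not the authors' priority-arc variant, and the capacity graph in the usage example is made up:

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp maximum flow on a capacity dict {u: {v: capacity}}."""
    # Build residual capacities, adding zero-capacity reverse edges
    res = {u: dict(nbrs) for u, nbrs in cap.items()}
    for u, nbrs in cap.items():
        for v in nbrs:
            res.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for a shortest augmenting path s -> t
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        # Trace the path, find its bottleneck, and update residuals
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
        flow += bottleneck

# Illustrative network: source 's', sink 't', two intermediate nodes
cap = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}, 't': {}}
```

Here `max_flow(cap, 's', 't')` saturates both source arcs, analogous to maximizing capacity utilization across set-ups.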
Procedia PDF Downloads 402
22362 Multi-Objective Optimization of an Aerodynamic Feeding System Using Genetic Algorithm
Authors: Jan Busch, Peter Nyhuis
Abstract:
Considering the challenges of short product life cycles and growing variant diversity, cost minimization and manufacturing flexibility increasingly gain importance to maintain a competitive edge in today’s global and dynamic markets. In this context, an aerodynamic part feeding system for high-speed industrial assembly applications has been developed at the Institute of Production Systems and Logistics (IFA), Leibniz Universitaet Hannover. The aerodynamic part feeding system outperforms conventional systems with respect to its process safety, reliability, and operating speed. In this paper, a multi-objective optimization of the aerodynamic feeding system regarding the orientation rate, the feeding velocity and the required nozzle pressure is presented.
Keywords: aerodynamic feeding system, genetic algorithm, multi-objective optimization, workpiece orientation
Procedia PDF Downloads 577
22361 Cuckoo Search (CS) Optimization Algorithm for Solving Constrained Optimization
Authors: Sait Ali Uymaz, Gülay Tezel
Abstract:
This paper presents comparison results on the performance of the Cuckoo Search (CS) algorithm for constrained optimization problems. For constraint handling, the CS algorithm uses the penalty method. The CS algorithm is tested on thirteen well-known test problems, and the results obtained are compared to those of the Particle Swarm Optimization (PSO) algorithm. Mean, best, median and worst values were employed for the performance analyses.
Keywords: cuckoo search, particle swarm optimization, constrained optimization problems, penalty method
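The static penalty method referred to here adds a large cost for constraint violations so that an unconstrained metaheuristic can be applied directly. A minimal sketch (the objective, constraint and penalty weight are illustrative assumptions, not the paper's test problems):

```python
def penalized(f, constraints, penalty=1e6):
    """Wrap objective f with a static penalty for constraint handling.
    Each constraint g must satisfy g(x) <= 0; violations are penalized
    quadratically, so feasible points keep their original objective value."""
    def fp(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + penalty * violation
    return fp

# Minimize x^2 subject to x >= 1, i.e. g(x) = 1 - x <= 0; the optimum is x = 1
fp = penalized(lambda x: x * x, [lambda x: 1.0 - x])
```

Any optimizer (CS, PSO, etc.) minimizing `fp` is steered away from infeasible points, since their penalized value dwarfs that of any feasible point.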
Procedia PDF Downloads 557
22360 The Effect of Immobilization Conditions on Hydrogen Production from Palm Oil Mill Effluent
Authors: A. W. Zularisam, Lakhveer Singh, Mimi Sakinah Abdul Munaim
Abstract:
In this study, the optimization of hydrogen production using polyethylene glycol (PEG) immobilized sludge was investigated in batch tests. Palm oil mill effluent (POME) was used as a substrate that can act as a carbon source. The experiments focused on the effect of some important factors affecting fermentative hydrogen production. Results showed that the immobilized sludge demonstrated a maximum hydrogen production rate of 340 mL/L-POME/h under the following optimal conditions: amount of biomass 10 mg VSS/g bead, PEG concentration 10%, and cell age 24 h or 40 h. More importantly, the immobilized sludge not only enhanced hydrogen production but could also tolerate harsh environments and produce hydrogen over a wide range of pH. The present results indicate the potential of PEG-immobilized sludge for large-scale operations as well; these factors play an important role in stable and continuous hydrogen production.
Keywords: biohydrogen, immobilization, polyethylene glycol, palm oil mill effluent, dark fermentation
Procedia PDF Downloads 342
22359 Techno-Economic Optimization and Evaluation of an Integrated Industrial Scale NMC811 Cathode Active Material Manufacturing Process
Authors: Usama Mohamed, Sam Booth, Aliysn J. Nedoma
Abstract:
As part of the transition to electric vehicles, there has been a recent increase in demand for battery manufacturing. Cathodes typically account for approximately 50% of the total lithium-ion battery cell cost and are a pivotal factor in determining the viability of new industrial infrastructure. Cathodes which offer lower costs whilst maintaining or increasing performance, such as nickel-rich layered cathodes, have a significant competitive advantage when scaling up the manufacturing process. This project evaluates the techno-economic value proposition of an integrated industrial scale cathode active material (CAM) production process, closing the mass and energy balances and optimizing the operating conditions using a sensitivity analysis. This is done by developing a process model of a co-precipitation synthesis route in Aspen Plus software, validated against experimental data. The mechanism chemistry and equilibrium conditions were established based on previous literature and HSC-Chemistry software. This is followed by integrating the energy streams, adding waste recovery and treatment processes, and testing the effect of key parameters (temperature, pH, reaction time, etc.) on CAM production yield and emissions. Finally, an economic analysis estimates the fixed and variable costs (including capital expenditure, labor costs, raw materials, etc.) to calculate the cost of CAM ($/kg and $/kWh), the total plant cost ($) and the net present value (NPV). This work sets the foundational blueprint for future research into sustainable industrial scale processes for CAM manufacturing.
Keywords: cathodes, industrial production, nickel-rich layered cathodes, process modelling, techno-economic analysis
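The NPV figure mentioned in this abstract discounts each year's cash flow back to the present. A minimal sketch (the plant cost, cash flows and discount rate below are hypothetical, not the project's figures):

```python
def npv(rate, cash_flows):
    """Net present value of a cash-flow series, where cash_flows[0] is the
    initial (year-0) outlay, typically negative."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical plant: $10M CAPEX, $3M net cash inflow for 5 years
flows = [-10e6] + [3e6] * 5
project_npv = npv(0.08, flows)   # positive NPV at an 8% discount rate
```

A positive `project_npv` indicates the discounted inflows exceed the initial investment; at a high enough discount rate the same flows turn negative.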
Procedia PDF Downloads 100
22358 Students’ Experiential Knowledge Production in the Teaching-Learning Process of Universities
Authors: Didiosky Benítez-Erice, Frederik Questier, Dalgys Pérez-Luján
Abstract:
This paper aims to present two models of the production of students’ experiential knowledge in the teaching-learning process of higher education: the teacher-centered production model and the student-centered production model. Drawing on a range of knowledge management and experiential learning theories, the paper elaborates on the nature of students’ experiential knowledge and proposes further adjustments of existing second-generation knowledge management theories, taking into account the particularities of higher education. Despite its theoretical nature, the paper can be relevant for future studies that stress student-driven improvement and innovation at higher education institutions.
Keywords: experiential knowledge, higher education, knowledge management, teaching-learning process
Procedia PDF Downloads 445
22357 A Sensitivity Analysis on the Production of Potable Water, Green Hydrogen and Derivatives from South-West African Seawater
Authors: Shane David van Zyl, A. J. Burger
Abstract:
The global green energy shift has placed significant value on the production of green hydrogen and its derivatives. This study examines the impact of various production capacities of potable water, green hydrogen, and green ammonia, and the relationships between them, on capital expenditure (CAPEX), operational expenditure (OPEX), levelized cost, and environmental impact. A model-based sensitivity analysis was used to determine the relevance of the various process parameters in the production of potable water combined with green hydrogen or green ammonia production, and the effects of parameter changes on CAPEX, OPEX and the levelized costs of the products were determined. Furthermore, a qualitative environmental impact analysis was performed. The findings indicate each process unit’s contribution to the overall CAPEX and OPEX and identify the major contributors to changes in the levelized costs of the products. The results emphasize the difference in costs associated with potable water, green hydrogen, and green ammonia production, indicating the extent to which potable water production costs become insignificant in the complete process. Increased potable water production can therefore have a large social benefit by decreasing water scarcity in the south-west African region.
Keywords: CAPEX and OPEX, desalination, green hydrogen and green ammonia, sensitivity analysis
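A one-at-a-time sensitivity analysis of a levelized cost, as described in this abstract, can be sketched generically as follows (all plant figures are illustrative assumptions, not values from the study):

```python
def levelized_cost(capex, opex_per_year, output_per_year, rate, years):
    """Levelized cost per unit of output: discounted lifetime cost divided
    by discounted lifetime production."""
    disc = [(1.0 + rate) ** -t for t in range(1, years + 1)]
    cost = capex + opex_per_year * sum(disc)
    production = output_per_year * sum(disc)
    return cost / production

# Hypothetical plant figures ($, kg/year), purely for illustration
base = levelized_cost(capex=50e6, opex_per_year=4e6,
                      output_per_year=2e6, rate=0.08, years=20)
# One-at-a-time sensitivity: vary CAPEX by ±20% while holding the rest fixed
low  = levelized_cost(40e6, 4e6, 2e6, 0.08, 20)
high = levelized_cost(60e6, 4e6, 2e6, 0.08, 20)
```

Repeating the ±20% sweep for each input in turn ranks the parameters by how strongly they move the levelized cost, which is the essence of the sensitivity analysis.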
Procedia PDF Downloads 39
22356 Metareasoning Image Optimization Q-Learning
Authors: Mahasa Zahirnia
Abstract:
The purpose of this paper is to explore new and effective ways of optimizing satellite images using artificial intelligence, implementing reinforcement learning to enhance the quality of the data captured within the image. Through our implementation of Bellman’s reinforcement learning equations, the associated state diagrams, and multi-stage image processing, we were able to enhance image quality and to detect and define objects. Reinforcement learning is the differentiator in the area of artificial intelligence, and Q-Learning relies on trial and error to achieve its goals. The reward system embedded in Q-Learning allows the agent to self-evaluate its performance and decide on the best possible course of action based on the current and future environment. Results show that within a simulated environment, built on commercially available images, the detection rate was 40-90%. Reinforcement learning through the Q-Learning algorithm is not just a desired but a required design criterion for image optimization and enhancement. The proposed methods are a cost-effective way of resolving uncertainty in the data because reinforcement learning finds ideal policies to manage the process using a smaller sample of images.
Keywords: Q-learning, image optimization, reinforcement learning, Markov decision process
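The Bellman-style Q-Learning update this abstract describes can be sketched in tabular form as follows. The toy chain environment below is a generic illustration, not the paper's satellite-image setting, and all hyperparameters are assumptions:

```python
import random

def q_learning(n_states, n_actions, step, episodes=500,
               alpha=0.1, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning. `step(s, a)` returns (next_state, reward, done)."""
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if random.random() < epsilon:           # explore
                a = random.randrange(n_actions)
            else:                                    # exploit current estimate
                a = max(range(n_actions), key=lambda i: Q[s][i])
            s2, r, done = step(s, a)
            # Bellman update: move toward reward + discounted best future value
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q

# Toy 1-D chain: action 1 moves right (reward 1 on reaching the last state),
# action 0 stays in place
def step(s, a):
    s2 = min(s + 1, 4) if a == 1 else s
    return s2, (1.0 if s2 == 4 and s != 4 else 0.0), s2 == 4
```

After training, the greedy policy at each state prefers the rewarded action, which is the trial-and-error convergence the abstract relies on.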
Procedia PDF Downloads 215
22355 Design and Optimisation of 2-Oxoglutarate Dioxygenase Expression in Escherichia coli Strains for Production of Bioethylene from Crude Glycerol
Authors: Idan Chiyanzu, Maruping Mangena
Abstract:
Crude glycerol, a major by-product of the transesterification of triacylglycerides with alcohol to biodiesel, is known to have a broad range of applications. For example, its bioconversion can afford a wide range of chemicals including alcohols, organic acids, hydrogen, solvents and intermediate compounds. In bacteria, 2-oxoglutarate dioxygenase (2-OGD) enzymes are widely found among Pseudomonas syringae species and have been recognized as having emerging importance in ethylene formation. However, the use of optimized enzyme function in recombinant systems for crude glycerol conversion to ethylene has still not been reported. The present study investigated the production of ethylene from crude glycerol using engineered E. coli MG1655 and JM109 strains. Ethylene production with an optimized expression system for 2-OGD in E. coli, using a codon-optimized construct of the ethylene-forming gene, was studied. The codon optimization resulted in a 20-fold increase in protein production and thus enhanced production of ethylene gas. For reliable bioreactor performance, the effects of temperature, fermentation time, pH, substrate concentration, methanol concentration, potassium hydroxide concentration and media supplements on ethylene yield were investigated. The results demonstrate that the recombinant enzyme can be used in future studies to exploit the conversion of low-priced crude glycerol into value-added products such as light olefins, and that tools including DNA recombineering, molecular biology and bioengineering techniques can be used to allow unlimited production of ethylene directly from the fermentation of crude glycerol. It can be concluded that recombinant E. coli production systems represent a significantly more secure, renewable and environmentally safe alternative to the thermochemical approach to ethylene production.
Keywords: crude glycerol, bioethylene, recombinant E. coli, optimization
Procedia PDF Downloads 279
22354 Continuous Catalytic Hydrogenation and Purification for Synthesis Non-Phthalate
Authors: Chia-Ling Li
Abstract:
The scope of this article includes the production of 10,000 metric tons of non-phthalate per annum. The production process includes hydrogenation, separation, purification, and recycling of unprocessed feedstock. Based on experimental data, conversion and selectivity were chosen as reaction model parameters. The synthesis and separation processes for non-phthalate and phthalate were established using Aspen Plus software. The article is divided into six parts: estimation of physical properties, integration of production processes, a purification case study, utility consumption, an economic feasibility study, and identification of bottlenecks. The purities of the products were higher than 99.9 wt. %. The process parameters provide important guidance for the commercialization of phthalate hydrogenation.
Keywords: economic analysis, hydrogenation, non-phthalate, process simulation
Procedia PDF Downloads 277
22353 Safety Approach Highway Alignment Optimization
Authors: Seyed Abbas Tabatabaei, Marjan Naderan Tahan, Arman Kadkhodai
Abstract:
An efficient optimization approach, called the feasible gate (FG) approach, is developed to enhance the computational efficiency and solution quality of the previously developed highway alignment optimization (HAO) model. This approach seeks to realistically represent various user preferences and environmentally sensitive areas and to consider them, along with geometric design constraints, in the optimization process. This is done by avoiding the generation of infeasible solutions that violate the various constraints, thereby focusing the search on feasible solutions. The proposed method is simple but significantly improves the model’s computation time and solution quality. On the other hand, highway alignment optimization through feasible gates yields a purely economic model that considers only minimum design constraints, including minimum radii of circular curves, minimum lengths of vertical curves and the maximum road gradient. Such modelling can reduce passenger comfort and road safety. In most highway optimization models, a penalty function is added for each constraint so that the final result merely satisfies the minimum constraints. In this paper, we propose a safety-oriented solution by introducing a gift function.
Keywords: safety, highway geometry, optimization, alignment
Procedia PDF Downloads 409
22352 Seat Assignment Model for Student Admissions Process at Saudi Higher Education Institutions
Authors: Mohammed Salem Alzahrani
Abstract:
In this paper, the student admission process is studied to optimize the assignment of vacant seats with three main objectives: utilizing all vacant seats, satisfying the admission requirements of every program of study, and maintaining fairness among all candidates. The Seat Assignment Method (SAM) is used to build the model and solve the optimization problem with the help of the Northwest Corner Method and the Least Cost Method. A closed formula is derived for applying the priority of assigning a seat to a candidate based on SAM.
Keywords: admission process model, assignment problem, Hungarian Method, Least Cost Method, Northwest Corner Method, SAM
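The Northwest Corner Method mentioned here builds an initial feasible allocation for a balanced transportation-style problem by filling the table from its top-left cell. A minimal sketch (the supply and demand figures are hypothetical, not from the paper):

```python
def northwest_corner(supply, demand):
    """Initial basic feasible solution for a balanced transportation problem.
    Returns an allocation matrix of shape (len(supply), len(demand))."""
    supply, demand = list(supply), list(demand)
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])   # allocate as much as possible here
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1                       # move down when this supply is exhausted
        else:
            j += 1                       # move right when this demand is met
    return alloc

# Hypothetical balanced instance: program seat capacities vs candidate groups
alloc = northwest_corner([20, 30, 25], [10, 25, 15, 25])
```

The resulting allocation exhausts every supply row and satisfies every demand column; the Least Cost Method differs only in the order cells are filled.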
Procedia PDF Downloads 498
22351 Solving the Set Covering Problem Using the Binary Cat Swarm Optimization Metaheuristic
Authors: Broderick Crawford, Ricardo Soto, Natalia Berrios, Eduardo Olguin
Abstract:
In this paper, we present a binary cat swarm optimization for solving the set covering problem. The set covering problem is a well-known NP-hard problem with many practical applications, including those involving scheduling, production planning and location problems. Binary cat swarm optimization is a recent swarm metaheuristic technique based on the behavior of discrete cats. Domestic cats show the ability to hunt and are curious about moving objects; accordingly, the cats have two modes of behavior: seeking mode and tracing mode. We illustrate this approach with 65 instances of the problem from the OR-Library. Moreover, we solve this problem with 40 new binarization techniques and select the technique with the best results, namely the V3 transfer function combined with roulette wheel as the discretization technique. Finally, we compare the results obtained in previous studies with those of the new binarization technique.
Keywords: binary cat swarm optimization, binarization methods, metaheuristic, set covering problem
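A V-shaped transfer function such as V3 maps a cat's continuous velocity to a bit-flip probability. The sketch below uses the common V3 form from the binarization literature, T(v) = |v / √(1 + v²)|, and its pairing with a simple flip rule is an illustration rather than the authors' exact scheme:

```python
import math, random

def v3_transfer(v):
    """V3 transfer function: maps a continuous velocity to a probability
    in [0, 1): T(v) = |v / sqrt(1 + v^2)|."""
    return abs(v / math.sqrt(1.0 + v * v))

def binarize(velocities, bits):
    """Flip each bit with the probability given by the transfer function;
    large |velocity| means a likely flip, zero velocity means no change."""
    return [1 - b if random.random() < v3_transfer(v) else b
            for v, b in zip(velocities, bits)]
```

A separate discretization step (e.g. roulette-wheel selection among candidate update rules) then decides how the resulting bits are committed to the solution.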
Procedia PDF Downloads 396
22350 Hybridized Approach for Distance Estimation Using K-Means Clustering
Authors: Ritu Vashistha, Jitender Kumar
Abstract:
Clustering using the K-means algorithm is a very common way to understand and analyze output data; grouping similar objects is the basis of clustering. Given C objects to be grouped into K clusters, K is always supposed to be less than C, and each cluster has its own centroid; the major problem is how to identify whether a cluster is correct based on the data. Forming the clusters is not a one-shot task for every tuple, row, record or entity, but is done by an iterative process in which each record, tuple and entity is checked and examined and its similarity and dissimilarity are evaluated. This iterative process tends to be very lengthy and unable to give optimal output for the clusters, or an optimal time to find the clusters. To overcome this drawback, we propose a formula to find the clusters at run time, so that the approach can give optimal results. The proposed approach uses the Euclidean distance formula, as well as the Mahalanobis distance, to find the minimum distance between slots, which we technically call clusters; the same approach has also been applied to the Ant Colony Optimization (ACO) algorithm, resulting in the production of two- and multi-dimensional matrices.
Keywords: ant colony optimization, data clustering, centroids, data mining, k-means
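The iterative assignment-and-update loop described in this abstract is Lloyd's classic k-means with Euclidean distance; a minimal generic sketch (not the proposed hybridized run-time formula) looks like:

```python
import math, random

def euclidean(a, b):
    """Euclidean distance between two points given as coordinate tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=100):
    """Basic Lloyd's k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: euclidean(p, centroids[i]))
            clusters[idx].append(p)
        new = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:    # converged: assignments no longer change
            break
        centroids = new
    return centroids, clusters
```

On two well-separated blobs of points, the loop typically converges in a handful of iterations to one centroid per blob.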
Procedia PDF Downloads 128
22349 Real-Time Episodic Memory Construction for Optimal Action Selection in Cognitive Robotics
Authors: Deon de Jager, Yahya Zweiri, Dimitrios Makris
Abstract:
The three most important components of a cognitive architecture for cognitive robotics are memory representation, memory recall, and action selection performed by the executive. In this paper, action selection, performed by the executive, is defined as a memory quantification and optimization process. The methodology describes the real-time construction of episodic memory through semantic memory optimization. The optimization is performed by set-based particle swarm optimization, using an adaptive entropy memory quantification approach for fitness evaluation. The performance of the approach is evaluated experimentally by simulation, where a UAV is tasked with the collection and delivery of a medical package. The experiments show that the UAV dynamically uses the episodic memory to autonomously control its velocity while successfully completing its mission.
Keywords: cognitive robotics, semantic memory, episodic memory, maximum entropy principle, particle swarm optimization
Procedia PDF Downloads 15622348 The Application of Line Balancing Technique and Simulation Program to Increase Productivity in Hard Disk Drive Components
Authors: Alonggot Limcharoen, Jintana Wannarat, Vorawat Panich
Abstract:
This study aims to balance the number of operators (line balancing technique) in the production line of hard disk drive components in order to increase efficiency. At present, the trend of using hard disk drives has continuously declined, limiting the company's revenue potential, so it is important to improve and develop the production process to create market share and to be able to compete with competitors on value and quality. An effective tool is needed to support such matters; in this research, the Arena simulation program was applied to analyze the results both before and after the improvement, and the model was validated before proceeding with changes to the real process. The RA production process studied here comprised 14 work stations with 35 operators in total. In the actual process, the average production time was 84.03 seconds per piece (obtained by timing each work station 30 times), with a performance rating of 123% assessed by the Westinghouse principles and an assumed allowance of 5%; consequently, the standard time was 108.53 seconds per piece. The takt time, calculated as customer demand divided by the working duration of one day, was 3.66 seconds per piece. From these figures, the proper number of operators was 30, meaning five operators could be removed to improve the efficiency of the production process. A simulation model of the actual process was then built in the Arena program to confirm its reliability; the simulated outputs were compared with those of the actual process, and their agreement showed the model to be reliable. Worker numbers and their job responsibilities were then remodeled in the Arena program. Lastly, the efficiency of the production process was enhanced from 70.82% to 82.63%, meeting the target. Keywords: hard disk drive, line balancing, ECRS, simulation, arena program
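The standard-time and operator-count arithmetic in this abstract can be reproduced as a short sketch (values taken from the text; small rounding differences against the reported 108.53 s are expected):

```python
import math

observed_time = 84.03   # s per piece, average of the timed observations
rating = 1.23           # Westinghouse performance rating (123 %)
allowance = 0.05        # 5 % allowance time

normal_time = observed_time * rating            # rated (normal) time
standard_time = normal_time * (1 + allowance)   # ≈ 108.5 s per piece

takt_time = 3.66        # s per piece, customer demand / daily working time
operators = math.ceil(standard_time / takt_time)

print(round(standard_time, 2), operators)  # → 108.52 30
```

Dividing standard time by takt time gives 29.65, rounded up to the 30 operators the study settles on, i.e. five fewer than the original 35.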
Procedia PDF Downloads 22522347 Vibration Analysis and Optimization Design of Ultrasonic Horn
Authors: Kuen Ming Shu, Ren Kai Ho
Abstract:
An ultrasonic horn amplifies amplitude and reduces resonant impedance in an ultrasonic system. Its primary function is to amplify deformation or velocity during vibration and to focus ultrasonic energy on a small area, making it a crucial component in the design of an ultrasonic vibration system. There are five common design methods for ultrasonic horns: the analytical method, the equivalent circuit method, equal mechanical impedance, the transfer matrix method, and the finite element method. In addition, the usual optimization procedure is to change geometric parameters to improve a single performance measure, so the relationship between parameters and objectives cannot be established. A good optimization design, however, must establish the relationship between input parameters and output parameters, so that the designer can trade off parameters against different performance objectives and obtain the optimization results. In this study, an ultrasonic horn provided by Maxwide Ultrasonic Co., Ltd. was used as the reference for the optimized horn. The ANSYS finite element analysis (FEA) software was used to simulate the distribution of horn amplitudes and the natural frequency. The simulated frequencies agreed closely with the measured values, verifying the accuracy of the simulation. ANSYS DesignXplorer was then used to perform response surface optimization, which reveals the relationship between parameters and objectives. This method can therefore substitute for the traditional experience-based or trial-and-error design approach, reducing material costs and design cycles. Keywords: horn, natural frequency, response surface optimization, ultrasonic vibration
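The idea behind response surface optimization — fit a smooth surrogate to a few simulated design points, then locate its optimum analytically — can be sketched in one dimension with a quadratic surrogate (the horn-length samples and amplitude gains below are hypothetical, not from the ANSYS study):

```python
def quadratic_through(p1, p2, p3):
    # Fit y = a*x^2 + b*x + c exactly through three (x, y) samples
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    denom = (x1 - x2) * (x1 - x3) * (x2 - x3)
    a = (x3 * (y2 - y1) + x2 * (y1 - y3) + x1 * (y3 - y2)) / denom
    b = (x3**2 * (y1 - y2) + x2**2 * (y3 - y1) + x1**2 * (y2 - y3)) / denom
    c = (x2 * x3 * (x2 - x3) * y1 + x3 * x1 * (x3 - x1) * y2
         + x1 * x2 * (x1 - x2) * y3) / denom
    return a, b, c

# Hypothetical samples: horn-length parameter (mm) vs. simulated amplitude gain
a, b, c = quadratic_through((60.0, 2.1), (70.0, 2.6), (80.0, 2.3))
x_opt = -b / (2 * a)   # stationary point of the surrogate response surface
print(round(x_opt, 2))  # → 71.25
```

DesignXplorer does the multi-dimensional version of this: the fitted surface makes the parameter-objective relationship explicit, which is exactly what single-parameter trial-and-error cannot provide.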
Procedia PDF Downloads 11622346 Evaluation of the Effect of Lactose Derived Monosaccharide on Galactooligosaccharides Production by β-Galactosidase
Authors: Yenny Paola Morales Cortés, Fabián Rico Rodríguez, Juan Carlos Serrato Bermúdez, Carlos Arturo Martínez Riascos
Abstract:
Numerous benefits of galactooligosaccharides (GOS) as prebiotics have motivated the study of enzymatic processes for their production. These processes involve particular complexities, since several factors make high productivity difficult, including the enzyme type, the pH of the reaction medium, the substrate concentrations, and the presence of inhibitors. In the present work, the production of galactooligosaccharides with different degrees of polymerization (two, three, and four) from lactose was studied. The study formulates a mathematical model that predicts the production of GOS from lactose using the enzyme β-galactosidase. The effect of pH on the reaction was studied using phosphate buffer at three pH values (6.0, 6.5, and 7.0). At pH 6.0 the enzymatic activity was insignificant, while at pH 7.0 it was approximately 27 times greater than at pH 6.5; this last result differs from previously reported results. Therefore, pH 7.0 was chosen as the working pH. The enzyme concentration was then analyzed, which showed that its effect depends on the pH; it was set at 0.272 mM for the following studies. Afterwards, experiments were performed varying the lactose concentration to evaluate its effects on the process and to generate data for fitting the model parameters. The mathematical model considers the lactose hydrolysis and transgalactosylation reactions for the production of disaccharides and trisaccharides, together with their inverse reactions; the production of tetrasaccharides was negligible and was therefore not included in the model.
The reaction was monitored by HPLC, and the quantitative analysis of the experimental data was performed in Matlab, using solvers for the integration of differential equation systems (ode15s) and for nonlinear optimization (fminunc). The results confirm that the transgalactosylation and hydrolysis reactions are reversible, and inhibition by glucose and galactose is observed in the production of GOS. Regarding the production process, the results show that high initial lactose concentrations are necessary, since they favor the transgalactosylation reaction, while low concentrations favor the hydrolysis reactions. Keywords: β-galactosidase, galactooligosaccharides, inhibition, lactose, Matlab, modeling
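The reversible-kinetics idea can be illustrated with a toy explicit-Euler integration of the hydrolysis step alone (rate constants and concentrations are illustrative, not fitted values; the paper's full model also includes transgalactosylation and uses Matlab's ode15s rather than Euler stepping):

```python
def simulate(k_h, k_r, L0, dt=0.001, t_end=5.0):
    # Reversible hydrolysis: L -> Glc + Gal (k_h), Glc + Gal -> L (k_r)
    L, glc, gal = L0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        rate = k_h * L - k_r * glc * gal   # net forward rate
        L -= rate * dt
        glc += rate * dt
        gal += rate * dt
    return L, glc, gal

L, glc, gal = simulate(k_h=1.0, k_r=0.5, L0=1.0)
print(round(L + glc, 4))   # mass balance: L + Glc stays at L0 → 1.0
```

Because the reverse rate grows with the glucose and galactose concentrations, the system settles at an equilibrium with residual lactose, mirroring the product-inhibition behavior the abstract reports.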
Procedia PDF Downloads 35822345 Particle Swarm Optimization Based Method for Minimum Initial Marking in Labeled Petri Nets
Authors: Hichem Kmimech, Achref Jabeur Telmoudi, Lotfi Nabli
Abstract:
The estimation of the minimum initial marking (MIM) is a crucial problem in labeled Petri nets. In the case of multiple choices, the search for the initial marking leads to an optimization problem of minimum resource allocation under two constraints. The first requires that the firing sequence be legal on the initial marking with respect to the firing vector; the second requires that the total number of tokens be minimal. In this article, the MIM problem is solved by the particle swarm optimization (PSO) meta-heuristic. The proposed approach exploits the advantages of PSO to satisfy the two previous constraints and to find all possible combinations of minimum initial markings with the best computing time. This method, more efficient than conventional ones, has an excellent impact on the resolution of the MIM problem. We prove the effectiveness of our approach through a set of definitions, lemmas, and examples. Keywords: marking, production system, labeled Petri nets, particle swarm optimization
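For readers unfamiliar with the meta-heuristic, here is a minimal sketch of the standard continuous PSO update (the sphere objective stands in for the token-count cost; the paper adapts the same personal-best/global-best scheme to markings, with constraint handling not shown here):

```python
import random

random.seed(0)

def pso_minimize(f, dim=2, n_particles=15, iters=60, w=0.7, c1=1.5, c2=1.5):
    # Standard PSO: inertia w plus cognitive (c1) and social (c2) pulls
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):       # update personal best
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):    # update global best
                    gbest = pbest[i][:]
    return gbest, f(gbest)

sphere = lambda x: sum(v * v for v in x)   # stand-in for the marking cost
best, best_val = pso_minimize(sphere)
print(best_val)
```

The swarm converges toward the minimum of the cost; in the MIM setting, each particle would instead encode a candidate initial marking and the cost would count tokens subject to the legal-firing-sequence constraint.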
Procedia PDF Downloads 17822344 Experimental Investigation, Analysis and Optimization of Performance and Emission Characteristics of Composite Oil Methyl Esters at 160 bar, 180 bar and 200 bar Injection Pressures by Multifunctional Criteria Technique
Authors: Yogish Huchaiah, Chandrashekara Krishnappa
Abstract:
This study considers the optimization and validation of experimental results using the Multi-Functional Criteria Technique (MFCT), which is concerned with structuring and solving decision and planning problems involving multiple variables. Biodiesel was produced from Composite Oil Methyl Esters (COME) of Jatropha and Pongamia oils mixed in various proportions, using a two-step transesterification process, and was tested for various physico-chemical properties, which were ascertained to be within the limits proposed by ASTM. These methyl esters were then blended with petrodiesel in various proportions and coded. The blends were used as fuels in a computerized CI DI engine to investigate performance and emission characteristics. From the analysis of the results, the 180MEM4B20 blend had the maximum performance and minimum emissions. To validate the experimental results, MFCT was used, with characteristics such as fuel consumption (FC), brake power (BP), brake specific fuel consumption (BSFC), brake thermal efficiency (BTE), carbon dioxide (CO2), carbon monoxide (CO), hydrocarbons (HC), and nitrogen oxides (NOx) considered as dependent variables. The application of this method showed that the optimized combination of injection pressure (IP), mix, and blend is 178MEM4.2B24. The overall variation between the optimization and experimental results was found to be 7.45%. Keywords: COME, IP, MFCT, optimization, PI, PN, PV
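The details of MFCT are in the paper itself; as a generic illustration of ranking blends over multiple conflicting criteria, here is a simple weighted-sum scoring sketch (the blend data, weights, and criterion directions below are entirely hypothetical):

```python
def weighted_score(values, weights, maximize):
    # Normalize each criterion to [0, 1], invert the minimization criteria,
    # then combine with a weighted sum (weights chosen for illustration only)
    lo = {k: min(v[k] for v in values) for k in weights}
    hi = {k: max(v[k] for v in values) for k in weights}
    scores = []
    for v in values:
        s = 0.0
        for k, w in weights.items():
            norm = (v[k] - lo[k]) / (hi[k] - lo[k]) if hi[k] > lo[k] else 0.0
            s += w * (norm if maximize[k] else 1.0 - norm)
        scores.append(s)
    return scores

# Hypothetical blend responses: BTE should be high, NOx and BSFC low
blends = [
    {"BTE": 28.0, "NOx": 420.0, "BSFC": 0.34},
    {"BTE": 30.5, "NOx": 450.0, "BSFC": 0.31},
    {"BTE": 29.0, "NOx": 400.0, "BSFC": 0.33},
]
weights = {"BTE": 0.5, "NOx": 0.3, "BSFC": 0.2}
maximize = {"BTE": True, "NOx": False, "BSFC": False}
scores = weighted_score(blends, weights, maximize)
print(scores.index(max(scores)))   # index of the best-ranked blend
```

Any multi-criteria technique must make such trade-offs explicit; MFCT's particular weighting and validation scheme should be taken from the paper, not from this sketch.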
Procedia PDF Downloads 21122343 Analysis of the Level of Production Failures by Implementing New Assembly Line
Authors: Joanna Kochanska, Dagmara Gornicka, Anna Burduk
Abstract:
The article examines the process of implementing a new assembly line in a manufacturing enterprise in the household appliances industry. At the initial stages of the project, it was decided that one of its foundations should be the concept of lean management; because of that, eliminating as many errors as possible in the first phases of operation was emphasized. During the start-up of the line, all production losses were identified and documented, from serious machine failures, through any unplanned downtime, to micro-stops and quality defects. Over the six-week line start-up period, all errors resulting from problems in various areas were analyzed; these areas included, among others, production, logistics, quality, and organization. The aim of the work was to analyze the occurrence of production failures during the initial phase of starting up the line and to propose a method for determining their critical level once the line is fully functional. The repeatability of production losses in various areas and at different levels at this early stage of implementation was examined using statistical process control methods. Based on a Pareto analysis, the weakest points were identified in order to focus improvement actions on them. The next step was to examine the effectiveness of the actions undertaken to reduce the level of recorded losses. Based on the obtained results, a method for determining the critical failure level in the studied areas was proposed. The developed coefficient can be used as an alarm in case of production imbalance caused by an increased failure level in production and production-support processes during the standardized operation of the line. Keywords: production failures, level of production losses, new production line implementation, assembly line, statistical process control
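The Pareto step — rank loss categories by frequency and prioritize those covering the bulk of all losses — can be sketched as follows (the loss counts per area are hypothetical, not the study's data):

```python
def pareto_ranking(failure_counts):
    # Sort failure categories by frequency and accumulate their share;
    # categories covering roughly 80 % of losses get improvement priority
    total = sum(failure_counts.values())
    ranked = sorted(failure_counts.items(), key=lambda kv: kv[1], reverse=True)
    cumulative, result = 0.0, []
    for cause, count in ranked:
        cumulative += count / total
        result.append((cause, count, round(cumulative, 3)))
    return result

# Hypothetical start-up losses by area (counts over a six-week period)
losses = {"micro-stops": 140, "quality defects": 90, "logistics": 40,
          "machine failures": 20, "organization": 10}
for cause, count, cum in pareto_ranking(losses):
    print(cause, count, cum)
```

In this illustrative data the top two categories already account for over three-quarters of all losses, which is where the improvement actions would be focused.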
Procedia PDF Downloads 128