Search results for: optimization/inverse mapping
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4686

3636 Time-Domain Simulations of the Coupled Dynamics of Surface Riding Wave Energy Converter

Authors: Chungkuk Jin, Moo-Hyun Kim, HeonYong Kang

Abstract:

A surface riding (SR) wave energy converter (WEC) is designed, and its feasibility and performance are numerically simulated by an author-developed, fully coupled floater-mooring-magnet-electromagnetics dynamic analysis computer program. The biggest advantages of the SR-WEC are that its performance remains effective even in low sea states and that its structural robustness is greatly improved, compared to other existing WECs, by simply riding along the wave surface. The numerical simulations and actuator testing clearly demonstrate that the concept works and that its efficiency can be improved through the optimization process.

Keywords: computer simulation, electromagnetics fully-coupled dynamics, floater-mooring-magnet, optimization, performance evaluation, surface riding, WEC

Procedia PDF Downloads 140
3635 Optimization of Spatial Light Modulator to Generate Aberration Free Optical Traps

Authors: Deepak K. Gupta, T. R. Ravindran

Abstract:

Holographic optical tweezers (HOTs) generally use iterative algorithms such as weighted Gerchberg-Saxton (WGS) to generate multiple traps, which theoretically produce traps with 99% uniformity. In experiments, however, it is the phase response of the spatial light modulator (SLM) that ultimately determines the efficiency, uniformity, and quality of the trap spots. In general, SLMs show a nonlinear phase response behavior, and they may even have an asymmetric phase modulation depth before and after π. This affects the resolution with which the gray levels are addressed before and after π, leading to degraded trap performance. We present a method to optimize the SLM for a linear phase response along with a symmetric phase modulation depth around π. Further, we optimize the SLM for its varying phase response over different spatial regions by optimizing the brightness/contrast and gamma of the hologram in different subsections. We show the effect of the optimization on an array of trap spots, resulting in improved efficiency and uniformity. We also calculate the spot sharpness metric and trap performance metric and show a tightly focused spot with reduced aberration. The trap performance is compared by calculating the trap stiffness of a trapped particle in a given trap spot before and after aberration correction. The trap stiffness is found to improve by 200% after the optimization.
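For readers unfamiliar with the WGS step, a minimal sketch of the iteration is given below; the grid size, trap positions, and weighting rule are illustrative assumptions, not the authors' implementation or SLM calibration procedure.

```python
import numpy as np

def weighted_gs(trap_xy, shape=(512, 512), iters=30):
    """Minimal weighted Gerchberg-Saxton sketch: compute an SLM phase hologram
    that sends light into several trap spots with uniform intensities.
    trap_xy: list of (row, col) far-field pixel positions of the desired traps."""
    ny, nx = shape
    phase = 2 * np.pi * np.random.rand(ny, nx)      # random initial SLM phase
    weights = np.ones(len(trap_xy))                 # per-trap weights
    for _ in range(iters):
        field = np.exp(1j * phase)                  # unit-amplitude SLM field
        far = np.fft.fft2(field)                    # far-field (trap plane)
        amps = np.array([np.abs(far[r, c]) for r, c in trap_xy])
        weights *= amps.mean() / amps               # boost weak traps (WGS step)
        target = np.zeros_like(far)
        for w, (r, c) in zip(weights, trap_xy):
            target[r, c] = w * np.exp(1j * np.angle(far[r, c]))
        phase = np.angle(np.fft.ifft2(target))      # keep only the phase at the SLM
    return phase

holo = weighted_gs([(100, 100), (100, 400), (400, 256)])
```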

Keywords: spatial light modulator, optical trapping, aberration, phase modulation

Procedia PDF Downloads 179
3634 Optimization of a Flux Switching Permanent Magnet Machine Using Laminated Segmented Rotor

Authors: Seyedmilad Kazemisangdehi, Seyedmehdi Kazemisangdehi

Abstract:

Flux switching permanent magnet machines are considered for a wide range of applications because of their outstanding merits, including high torque/power densities, high efficiency, and a simple, robust rotor structure. Therefore, several topologies have been proposed, such as the PM-excited flux switching machine, the hybrid excited flux switching type, and so on. Recently, a novel laminated segmented rotor flux switching permanent magnet machine was introduced. It features flux barriers in the rotor structure to enhance the performance of the machine, reducing torque ripple while improving torque and efficiency at the same time. However, the design of the barriers was not optimized by its authors. Therefore, in this paper, three coefficients regarding the position of the barriers are considered for optimization. The effect of each coefficient on the performance of this machine is investigated by the finite element method, and finally an optimized design of the flux barriers based on these three coefficients is proposed from different points of view, including electromagnetic torque maximization and cogging torque/torque ripple minimization. At the optimum design from the maximum developed torque aspect, this machine generates 0.65 Nm more torque than the non-optimized design, with an almost 0.4% improvement in efficiency.

Keywords: finite element analysis, FSPM, laminated segmented rotor flux switching permanent magnet machine, optimization

Procedia PDF Downloads 223
3633 A Firefly Based Optimization Technique for Optimal Planning of Voltage Controlled Distributed Generators

Authors: M. M. Othman, Walid El-Khattam, Y. G. Hegazy, A. Y. Abdelaziz

Abstract:

This paper presents a method for finding the optimal location and capacity of dispatchable DGs connected to distribution feeders for optimal planning at a specified power loss without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage controlled nodes with the flexibility to be converted to constant power nodes in case of reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results, validated by comparison with results obtained from other competing methods, show the effectiveness, accuracy, and speed of the proposed method.
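A generic firefly search of the kind named in the title can be sketched as follows; the objective function below is a toy placeholder rather than the actual power-loss calculation on the IEEE 37-node feeder, and all parameter values are assumptions.

```python
import numpy as np

def firefly(objective, bounds, n_fireflies=20, iters=100,
            alpha=0.25, beta0=1.0, gamma=1.0, seed=0):
    """Minimal firefly algorithm: dimmer fireflies move toward brighter ones
    (lower objective value), with a small random walk controlled by alpha."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(n_fireflies, len(lo)))
    f = np.array([objective(p) for p in x])
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if f[j] < f[i]:                        # j is brighter than i
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2) # attractiveness decays with distance
                    x[i] += beta * (x[j] - x[i]) + alpha * rng.uniform(-0.5, 0.5, len(lo))
                    x[i] = np.clip(x[i], lo, hi)
                    f[i] = objective(x[i])
        alpha *= 0.97                                  # slowly reduce the random walk
    best = np.argmin(f)
    return x[best], f[best]

# Toy stand-in objective over (DG size in MW, DG location index): a simple quadratic "loss".
best_x, best_f = firefly(lambda p: (p[0] - 1.5) ** 2 + (p[1] - 20.0) ** 2,
                         bounds=[(0.0, 5.0), (1.0, 37.0)])
```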

Keywords: distributed generators, firefly technique, optimization, power loss

Procedia PDF Downloads 529
3632 Wind Speed Prediction Using Passive Aggregation Artificial Intelligence Model

Authors: Tarek Aboueldahab, Amin Mohamed Nassar

Abstract:

Unlike conventional power plants, wind energy is a fluctuating energy source; thus, it is necessary to accurately predict short-term wind speed in order to integrate wind energy into the electricity supply structure. To do so, we present a hybrid artificial intelligence model for short-term wind speed prediction based on passive aggregation of particle swarm optimization and neural networks. As a result, a clear improvement in prediction accuracy is obtained compared to the standard artificial intelligence method.
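One way to realize such a hybrid, shown only as a minimal sketch, is to let particle swarm optimization search the weight vector of a small feedforward network that maps recent wind speeds to the next value; the synthetic data, network size, and PSO settings below are assumptions, not the authors' passive-aggregation configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in series; a real study would use measured wind speeds.
t = np.arange(500)
wind = 8 + 2 * np.sin(0.1 * t) + 0.5 * rng.standard_normal(500)

lag = 4                                          # predict w[t] from the previous 4 values
X = np.stack([wind[i:i + lag] for i in range(len(wind) - lag)])
y = wind[lag:]

n_hidden = 6
n_w = lag * n_hidden + n_hidden + n_hidden + 1   # weights + biases of a 4-6-1 network

def predict(w, X):
    W1 = w[:lag * n_hidden].reshape(lag, n_hidden)
    b1 = w[lag * n_hidden:lag * n_hidden + n_hidden]
    W2 = w[-(n_hidden + 1):-1]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2

def mse(w):
    return np.mean((predict(w, X) - y) ** 2)

# Basic global-best PSO over the network weight vector
n_part, iters = 30, 200
pos = rng.uniform(-1, 1, (n_part, n_w))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n_part, n_w))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("training MSE:", mse(gbest))
```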

Keywords: artificial intelligence, neural networks, particle swarm optimization, passive aggregation, wind speed prediction

Procedia PDF Downloads 443
3631 Effect of Nanoparticles Concentration, pH and Agitation on Bioethanol Production by Saccharomyces cerevisiae BY4743: An Optimization Study

Authors: Adeyemi Isaac Sanusi, Gueguim E. B. Kana

Abstract:

Nanoparticles have received the attention of the scientific community due to their biotechnological potential. They exhibit advantageous size-, shape- and concentration-dependent catalytic, stabilizing, immunoassay and immobilization properties. This study investigates the impact of metallic oxide nanoparticles (NPs) on ethanol production by Saccharomyces cerevisiae BY4743. Nine different nanoparticles were synthesized using a precipitation method and microwave treatment. The synthesized nanoparticles were characterized by Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM), and transmission electron microscopy (TEM). Fermentation processes were carried out at varied NP concentrations (0 – 0.08 wt%). The highest ethanol concentrations were achieved after 24 h using cobalt NPs (5.07 g/l), copper NPs (4.86 g/l), and manganese NPs (4.74 g/l) at 0.01 wt% NP concentration, which represent 13%, 8.7%, and 5.4% increases, respectively, over the control (4.47 g/l). The lowest ethanol concentration (0.17 g/l) was obtained when 0.08 wt% of silver NPs was used, and lower ethanol concentrations were generally observed at higher NP concentrations. Ethanol concentration decreased after 24 h for all processes. In all setups with NPs, the pH was observed to be stable, and the stability was directly proportional to nanoparticle concentration. These findings suggest that the presence of some of the NPs in the bioprocesses has catalytic and pH-stabilizing potential. Ethanol production by Saccharomyces cerevisiae BY4743 was enhanced in the presence of cobalt, copper, and manganese NPs. An optimization study using response surface methodology (RSM) will further elucidate the impact of these nanoparticles on bioethanol production.

Keywords: agitation, bioethanol, nanoparticles concentration, optimization, pH value

Procedia PDF Downloads 181
3630 Multi-Temporal Mapping of Built-up Areas Using Daytime and Nighttime Satellite Images Based on Google Earth Engine Platform

Authors: S. Hutasavi, D. Chen

Abstract:

The built-up area is a significant proxy for measuring regional economic growth and reflects the Gross Provincial Product (GPP). However, an up-to-date and reliable database of built-up areas is not always available, especially in developing countries. Cloud-based geospatial analysis platforms such as Google Earth Engine (GEE) provide the accessibility and computational power for those countries to generate built-up data. Therefore, this study aims to extract the built-up areas in the Eastern Economic Corridor (EEC), Thailand, using daytime and nighttime satellite imagery based on GEE facilities. The normalized indices were generated from the Landsat 8 surface reflectance dataset, including the Normalized Difference Built-up Index (NDBI), the Built-up Index (BUI), and the Modified Built-up Index (MBUI). These indices were applied to identify built-up areas in the EEC. The result shows that MBUI performs better than BUI and NDBI, with the highest accuracy of 0.85 and a Kappa of 0.82. Moreover, the overall classification accuracy was improved from 79% to 90%, and the error in total built-up area was decreased from 29% to 0.7%, after adding nighttime light data from the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB). The results suggest that MBUI with nighttime light imagery is appropriate for built-up area extraction and can be utilized for further studies of the socioeconomic impacts of regional development policy over the EEC region.
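For reference, the two standard indices named above can be computed per pixel as sketched below; the MBUI definition and the final thresholding follow the paper and are not reproduced here, and the band arrays and threshold are placeholders (in the study the computation runs on the GEE Landsat 8 surface reflectance dataset).

```python
import numpy as np

def built_up_indices(nir, swir1, red):
    """NDBI and BUI from Landsat 8 surface-reflectance bands
    (NIR = B5, SWIR1 = B6, RED = B4), given as float reflectance arrays."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    ndbi = (swir1 - nir) / (swir1 + nir + 1e-9)
    bui = ndbi - ndvi                      # high where built-up, low where vegetated
    return ndbi, bui

# Placeholder reflectance tiles standing in for real Landsat 8 imagery.
rng = np.random.default_rng(0)
nir, swir1, red = rng.uniform(0.05, 0.5, (3, 100, 100))
ndbi, bui = built_up_indices(nir, swir1, red)
built_mask = bui > 0.0                     # assumed threshold, for illustration only
```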

Keywords: built-up area extraction, google earth engine, adaptive thresholding method, rapid mapping

Procedia PDF Downloads 118
3629 Finding Optimal Solutions to Management Problems with the use of Econometric and Multiobjective Programming

Authors: M. Moradi Dalini, M. R. Talebi

Abstract:

This research presents a technical method that combines econometrics and multiobjective programming to select and obtain optimal solutions to management problems. It is taken for granted that it is important to analyze which combination of values of the explanatory variables, in an econometric model, would point to the simultaneous achievement of the best values of the response variables. In this case, if a certain degree of conflict is observed among the response variables, we suggest applying a multiobjective method to the results obtained from a regression analysis. In fact, with the use of a multiobjective method, we obtain the best decision about the conflicting relationship between the response variables and the optimal solution. The benefit of combining multiobjective programming and econometrics is an assessment of a balanced 'optimal' situation among the response variables, because this kind of information can hardly be extracted by econometric techniques alone.

Keywords: econometrics, multiobjective optimization, management problem, optimization

Procedia PDF Downloads 76
3628 Mapping Thermal Properties Using Resistivity, Lithology and Thermal Conductivity Measurements

Authors: Riccardo Pasquali, Keith Harlin, Mark Muller

Abstract:

The ShallowTherm project is focussed on developing and applying a methodology for extrapolating relatively sparsely sampled thermal conductivity measurements across Ireland using mapped Litho-Electrical (LE) units. The primary data used consist of electrical resistivities derived from the Geological Survey Ireland Tellus airborne electromagnetic dataset, GIS-based maps of Irish geology, and rock thermal conductivities derived from both the current Irish Ground Thermal Properties (IGTP) database and a new programme of sampling and laboratory measurement. The workflow has been developed across three case-study areas that sample a range of different calcareous, arenaceous, argillaceous, and volcanic lithologies. Statistical analysis of resistivity data from individual geological formations has been assessed and integrated with detailed lithological descriptions to define distinct LE units. Thermal conductivity measurements from core and hand samples have been acquired for every geological formation within each study area. The variability and consistency of thermal conductivity measurements within each LE unit is examined with the aim of defining a characteristic thermal conductivity (or range of thermal conductivities) for each LE unit. Mapping of LE units, coupled with characteristic thermal conductivities, provides a method of defining thermal conductivity properties at a regional scale and facilitating the design of ground source heat pump closed-loop collectors.

Keywords: thermal conductivity, ground source heat pumps, resistivity, heat exchange, shallow geothermal, Ireland

Procedia PDF Downloads 172
3627 Applications of Space Technology in Flood Risk Mapping in Parts of Haryana State, India

Authors: B. S. Chaudhary

Abstract:

The severity and frequency of disasters across the globe have been increasing in recent years. India is also facing disasters in the form of droughts, cyclones, earthquakes, landslides, and floods. One of the major causes of disasters in northern India is flooding, which brings great losses and extensive damage to agricultural crops, property, and human and animal life, and causes environmental imbalances in places. The annual global figure for losses due to floods runs to over 2 billion dollars. India is a vast country with wide variations in climate and topography. Due to widespread and heavy rainfall during the monsoon months, floods of varying magnitude occur all over the country from June to September. The magnitude depends upon the intensity of rainfall, its duration, and the ground conditions at the time of rainfall. Haryana, one of the agriculturally dominated northern states, also suffers from a number of disasters such as floods, desertification, soil erosion, and land degradation. Earthquakes also occur frequently but are of small magnitude, so they do not cause much concern or damage. Most of the damage in Haryana is due to floods. Floods in Haryana occurred in 1978, 1988, 1993, 1995, 1998, and 2010, to mention a few. The present paper deals with remote sensing and GIS applications in preparing flood risk maps in parts of Haryana State, India. Satellite data of various years have been used for mapping the flood-affected areas. The flooded areas have been interpreted both visually and digitally, and two classes, flooded and receded water/wet areas, have been identified for each year. These have been analyzed in a GIS environment to prepare the risk maps, which show the areas of high, moderate, and low risk depending on the frequency of flooding witnessed. The floods leave a trail of suffering in the form of unhygienic conditions due to improper sanitation, water logging, filth littered in the area, degradation of materials, and unsafe drinking water, making people prone to many types of diseases in the short and long run. Attempts have also been made to enumerate the causes of floods. Suggestions are given for mitigating the fury of floods and for proper management of issues related to evacuation and nearby safe places.
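The risk-zoning step described above amounts to counting, per pixel, how many of the interpreted years show flooding and binning that count into classes; a minimal raster sketch follows, with the class breaks and stand-in masks assumed for illustration only.

```python
import numpy as np

def flood_risk(flood_masks, bins=(1, 2, 4)):
    """flood_masks: list of 2-D boolean arrays, one per mapped year (True = flooded).
    Returns per-pixel risk classes: 0 = none, 1 = low, 2 = moderate, 3 = high."""
    freq = np.sum(flood_masks, axis=0)          # number of years each pixel was flooded
    return np.digitize(freq, bins=list(bins))   # bin the frequency into risk classes

# Stand-in yearly flood masks; real ones come from interpreted satellite scenes.
years = [np.random.rand(200, 200) > 0.7 for _ in range(6)]
risk_map = flood_risk(years)
```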

Keywords: flood mapping, GIS, Haryana, India, remote sensing, space technology

Procedia PDF Downloads 208
3626 Changes in Textural Properties of Zucchini Slices Under Effects of Partial Predrying and Deep-Fat-Frying

Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner

Abstract:

Changes in the textural properties of any food material during processing are significant for consumers' subsequent evaluation and directly affect their decisions. Thus, any food material should be assessed in terms of textural properties after any process. In the present study, zucchini slices were partially predried to control and reduce the product's final oil content. A conventional oven was used for partial dehydration of the zucchini slices. The subsequent frying was carried out in an industrial fryer equipped with a temperature controller. This study was based on the effect of this predrying process on the textural properties of fried zucchini slices. Texture profile analysis was performed. Hardness, elasticity, chewiness, and cohesiveness were the studied texture parameters of the fried zucchini slices. Temperature and weight loss were the monitored parameters of the predrying process, whereas, in frying, oil temperature and process time were controlled. Optimization of the two successive processes was done by response surface methodology, one of the commonly used statistical process optimization tools. The models developed for each texture parameter were highly successful in predicting its value as a function of the studied process conditions. Process optimization was performed according to target values for each property, determined for directly fried zucchini slices that took the highest score in sensory evaluation. Results indicated that the textural properties of predried and then fried zucchini slices could be controlled by well-established equations. This is thought to be significant for the fried-food industry, where control of sensorial properties is crucial for guiding consumer perception, and texture-related properties are foremost among them. This project (113R015) has been supported by TUBITAK.

Keywords: optimization, response surface methodology, texture profile analysis, conventional oven, modelling

Procedia PDF Downloads 430
3625 Updating Stochastic Hosting Capacity Algorithm for Voltage Optimization Programs and Interconnect Standards

Authors: Nicholas Burica, Nina Selak

Abstract:

The ADHCAT (Automated Distribution Hosting Capacity Assessment Tool) was designed to run hosting capacity analysis on the ComEd system via stochastic DER (Distributed Energy Resource) placement on multiple power flow simulations against a set of violation criteria. The violation criteria in the initial version of the tool captured a limited number of the issues that individual departments design against for DER interconnections. Enhancements were made to the tool to further align with individual department violation and operation criteria, as well as to add new modules for future load profile analysis. A reporting engine was created for future analytical use based on the simulations and observations in the tool.

Keywords: distributed energy resources, hosting capacity, interconnect, voltage optimization

Procedia PDF Downloads 187
3624 Evaluation of the Effect of Lactose Derived Monosaccharide on Galactooligosaccharides Production by β-Galactosidase

Authors: Yenny Paola Morales Cortés, Fabián Rico Rodríguez, Juan Carlos Serrato Bermúdez, Carlos Arturo Martínez Riascos

Abstract:

The numerous benefits of galactooligosaccharides (GOS) as prebiotics have motivated the study of enzymatic processes for their production. These processes have particular complexities due to several factors that make high productivity difficult, such as enzyme type, reaction medium pH, substrate concentrations, and the presence of inhibitors, among others. In the present work, the production of galactooligosaccharides (with different degrees of polymerization: two, three and four) from lactose was studied. The study considers the formulation of a mathematical model that predicts the production of GOS from lactose using the enzyme β-galactosidase. The effect of pH on the reaction was studied: phosphate buffer was used, and three pH values (6.0, 6.5, and 7.0) were evaluated. It was observed that at pH 6.0 the enzymatic activity was insignificant, while at pH 7.0 the enzymatic activity was approximately 27 times greater than at 6.5. The latter result differs from previously reported results. Therefore, pH 7.0 was chosen as the working pH. Additionally, the enzyme concentration was analyzed; its effect was found to depend on the pH, and the concentration was set at 0.272 mM for the following studies. Afterwards, experiments were performed varying the lactose concentration to evaluate its effect on the process and to generate the data for the adjustment of the mathematical model parameters. The mathematical model considers the reactions of lactose hydrolysis and transgalactosylation for the production of disaccharides and trisaccharides, together with their inverse reactions. The production of tetrasaccharides was negligible and, because of that, was not included in the model. The reaction was monitored by HPLC, and for the quantitative analysis of the experimental data the Matlab programming language was used, including solvers for the integration of systems of differential equations (ode15s) and for nonlinear optimization (fminunc). The results confirm that the transgalactosylation and hydrolysis reactions are reversible; additionally, inhibition of GOS production by glucose and galactose is observed. Regarding the production process of galactooligosaccharides, the results show that high initial lactose concentrations are necessary, since these favor the transgalactosylation reaction, while low concentrations favor the hydrolysis reactions.
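A compact Python analogue of that workflow is sketched below: a simplified reversible hydrolysis/transgalactosylation scheme is integrated and its rate constants refitted to (here synthetic) concentration data. The reaction scheme, constants, and initial conditions are illustrative assumptions, not the fitted model of the paper, and scipy's solve_ivp and minimize stand in for Matlab's ode15s and fminunc.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def rates(t, c, k):
    """Simplified scheme: lactose (L) <-> glucose (G) + galactose (Ga)   [hydrolysis]
                          L + Ga     <-> trisaccharide GOS3              [transgalactosylation]"""
    L, G, Ga, GOS3 = c
    k1, k1r, k2, k2r = k
    r_hyd = k1 * L - k1r * G * Ga
    r_trans = k2 * L * Ga - k2r * GOS3
    return [-r_hyd - r_trans, r_hyd, r_hyd - r_trans, r_trans]

def simulate(k, c0, t_eval):
    sol = solve_ivp(rates, (0, t_eval[-1]), c0, t_eval=t_eval, args=(k,), rtol=1e-8)
    return sol.y.T

# Synthetic "experimental" data generated with known constants, then refitted
t_exp = np.linspace(0, 8, 20)                    # hours
c0 = [250.0, 0.0, 0.0, 0.0]                      # initial lactose concentration, mM (assumed)
k_true = [0.30, 0.002, 0.004, 0.05]
data = simulate(k_true, c0, t_exp)

def sse(log_k):
    pred = simulate(np.exp(log_k), c0, t_exp)
    return np.sum((pred - data) ** 2)

fit = minimize(sse, x0=np.log([0.1, 0.01, 0.01, 0.01]), method="Nelder-Mead")
print("fitted rate constants:", np.exp(fit.x))
```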

Keywords: β-galactosidase, galactooligosaccharides, inhibition, lactose, Matlab, modeling

Procedia PDF Downloads 350
3623 Design Optimization and Thermoacoustic Analysis of Pulse Tube Cryocooler Components

Authors: K. Aravinth, C. T. Vignesh

Abstract:

The usage of pulse tube cryocoolers has increased significantly, mainly due to the advantage of having no moving parts. The underlying idea of this project is to optimize the design of the pulse tube, regenerator, and resonator in the cryocooler and to analyze the thermoacoustic oscillations with respect to the design parameters. A computational fluid dynamics (CFD) model with time-dependent validation is developed to predict its performance. The continuity, momentum, and energy equations are solved for the various porous media regions. The effect of changing the geometries and orientation on performance will be investigated and validated. The pressure, temperature, and velocity fields in the regenerator and pulse tube are evaluated. The performance results of this optimized design will be compared with the existing pulse tube cryocooler design. The sinusoidal behavior and acoustic streaming patterns in the pulse tube cryocooler will also be evaluated.

Keywords: acoustics, cryogenics, design, optimization

Procedia PDF Downloads 168
3622 Optimization of Flexible Job Shop Scheduling Problem with Sequence-Dependent Setup Times Using Genetic Algorithm Approach

Authors: Sanjay Kumar Parjapati, Ajai Jain

Abstract:

This paper presents makespan optimization for an 'n'-job, 'm'-machine flexible job shop scheduling problem with sequence-dependent setup times using a genetic algorithm (GA) approach. A restart scheme has also been applied to prevent premature convergence. Two case studies are taken into consideration. Results are obtained by considering a crossover probability of pc = 0.85 and a mutation probability of pm = 0.15. Five simulation runs are performed for each case study, and the minimum value among them is taken as the optimal makespan. Results indicate that the optimal makespan can be achieved with more than one sequence of jobs in a production order.
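To illustrate the encoding, the sequence-dependent setup accounting, and the restart scheme, here is a heavily simplified single-machine sketch (the full flexible job shop with machine assignment is not reproduced); the instance data are invented, and only the crossover/mutation probabilities follow the abstract.

```python
import random

# Toy instance: processing times and sequence-dependent setup times (assumed values).
proc = [5, 3, 6, 2, 4, 7]
n = len(proc)
setup = [[0 if i == j else (i * 3 + j * 2) % 5 + 1 for j in range(n)] for i in range(n)]

def makespan(seq):
    t, prev = 0, None
    for j in seq:
        t += (setup[prev][j] if prev is not None else 0) + proc[j]
        prev = j
    return t

def crossover(a, b):                        # order crossover (OX) for permutations
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def ga(pop_size=30, gens=200, pc=0.85, pm=0.15, restart_after=40):
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    best, stall = min(pop, key=makespan), 0
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)
            child = crossover(a, b) if random.random() < pc else a[:]
            if random.random() < pm:                      # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = sorted(nxt + [best], key=makespan)[:pop_size]   # keep the elite
        if makespan(pop[0]) < makespan(best):
            best, stall = pop[0], 0
        else:
            stall += 1
        if stall >= restart_after:                        # restart to escape premature convergence
            pop = [random.sample(range(n), n) for _ in range(pop_size)]
            stall = 0
    return best, makespan(best)

print(ga())
```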

Keywords: flexible job shop, genetic algorithm, makespan, sequence dependent setup times

Procedia PDF Downloads 325
3621 Geophysical Mapping of the Groundwater Aquifer System in Gode Area, Northeastern Hosanna, Ethiopia

Authors: Esubalew Yehualaw Melaku

Abstract:

In this study, two basic geophysical methods are applied for mapping the groundwater aquifer system in the Gode area along the Guder River, northeast of Hosanna town, near the western margin of the Central Main Ethiopian Rift. The main target of the study is to map the potential aquifer zone and investigate the groundwater potential for current and future development of the resource in the Gode area. The geophysical methods employed in this study include vertical electrical sounding (VES) and magnetic survey techniques. Electrical sounding was used to examine and map the depth to the potential aquifer zone of the groundwater and its distribution over the area. A magnetic survey, on the other hand, was used to delineate contacts between lithologic units and geological structures. The 2D magnetic modeling and the geoelectric sections are used for the identification of weak zones, which control the groundwater flow and storage system. The geophysical survey comprises twelve VES readings collected using a Schlumberger array along six profile lines, and more than four hundred (400) magnetic readings at about 10 m station intervals along four profiles and 20 m intervals along three random profiles. The study results reveal that the potential aquifer in the area lies at a depth range of 45 m to 92 m. This is the response of the highly weathered/fractured ignimbrite and pumice layer with sandy soil, which is the main water-bearing horizon. Overall, the neighborhoods of four VES points (VES-2, VES-3, VES-10, and VES-11) show good water-bearing zones in the study area.

Keywords: vertical electrical sounding, magnetic survey, aquifer, groundwater potential

Procedia PDF Downloads 120
3620 Optimization of Black Grass Jelly Formulation to Reduce Leaching and Increase Floating Rate

Authors: M. M. Nor, H. I. Sheikh, M. F. H. Hassan, S. Mokhtar, A. Suganthi, A. Fadhlina

Abstract:

Black grass jelly (BGJ) is a popular black jelly used in preparing various drinks and desserts. Food industries often use preservatives to maintain the physicochemical properties of foods, such as color and texture. These preservatives (e.g., phosphoric acid) are linked with deleterious health effects such as kidney disease. Using the gelling agents carrageenan and gelatin to make BGJ could improve its physicochemical and textural properties. This study was designed to optimize selected physicochemical and textural properties of BGJ using carrageenan and gelatin. Various black grass jelly formulations (BGJF) were designed using an I-optimal mixture design in Design Expert® software. Data from commercial BGJ were used as a reference during the optimization process. The combined amount of carrageenan and gelatin added to the formulations was up to 14.38 g (~5%). The results showed that adding 2.5 g carrageenan and 2.5 g gelatin, approximately 5 g in total (~5%), effectively maintained most of the physicochemical properties, with an overall desirability function of 0.81. This formulation was selected as the optimum black grass jelly formulation (OBGJF). The leaching properties and floating duration of the OBGJF and commercial grass jelly were measured over 20 min and 40 min, respectively. The results indicated that the OBGJF showed a significantly lower leaching rate (p<0.0001) and floating time (p<0.05). Hence, further optimization is needed to increase the floating duration of carrageenan- and gelatin-based BGJ.

Keywords: cincau, Mesona chinensis, black grass jelly, carrageenan, gelatin

Procedia PDF Downloads 77
3619 An Insight into Early Stage Detection of Malignant Tumor by Microwave Imaging

Authors: Muhammad Hassan Khalil, Xu Jiadong

Abstract:

Detection of malignant tumors inside the female breast is a challenging field for researchers. Microwave imaging (MWI) for breast cancer diagnosis has been of interest for the last two decades and has recently been suggested for finding cancerous tissue in the breast. A simple, basic mathematical model is used throughout this paper for imaging of a malignant tumor. The authors explain the inverse scattering method in microwave imaging and present some simulation results.

Keywords: breast cancer detection, microwave imaging, tomography, tumor

Procedia PDF Downloads 405
3618 Iterative Replanning of Diesel Generator and Energy Storage System for Stable Operation of an Isolated Microgrid

Authors: Jiin Jeong, Taekwang Kim, Kwang Ryel Ryu

Abstract:

The target microgrid in this paper is isolated from the large central power system and is assumed to consist of wind generators, photovoltaic power generators, an energy storage system (ESS), a diesel power generator, the community load, and a dump load. The operation of such a microgrid can be hazardous because of the uncertain prediction of power supply and demand, and especially due to the high fluctuation of the output from the wind generators. In this paper, we propose an iterative replanning method for determining the appropriate level of diesel generation and the charging/discharging cycles of the ESS for the upcoming one-hour horizon. To cope with the uncertainty in the estimation of supply and demand, the one-hour plan is rebuilt repeatedly at regular intervals of one minute by rolling the one-hour horizon. Since the plan should be built with a sufficiently large safety margin to avoid any possible blackout, some energy waste through the dump load is inevitable. In our approach, the level of the safety margin is optimized through learning from past experience. The simulation experiments show that our method, combined with the margin optimization, can reduce the dump load compared to the method without such optimization.
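The rolling-horizon idea can be sketched in a few lines; the forecast, dispatch rule, and all numeric limits below are placeholders for illustration, not the authors' planner or its learned margin.

```python
import numpy as np

rng = np.random.default_rng(0)

ESS_CAP, ESS_MAX_RATE, DIESEL_MAX = 500.0, 100.0, 150.0   # kWh, kW, kW (assumed limits)
soc, margin = 250.0, 20.0                                 # ESS state of charge (kWh), safety margin (kW)

def forecast_surplus(minutes=60):
    """Placeholder one-hour forecast of (renewables - load) in kW, per minute."""
    return 30 * np.sin(np.linspace(0, np.pi, minutes)) + rng.normal(0, 10, minutes)

for minute in range(240):                                 # re-plan every minute, rolling the horizon
    surplus = forecast_surplus()
    deficit = np.maximum(0.0, margin - surplus)           # plan against the forecast plus safety margin
    plan_diesel = np.minimum(deficit, DIESEL_MAX)         # diesel schedule for the coming hour
    # execute only the first minute of the plan, then re-plan with fresh forecasts
    realized = surplus[0] + rng.normal(0, 5) + plan_diesel[0]        # realized net power incl. diesel
    ess_power = np.clip(realized, -ESS_MAX_RATE, ESS_MAX_RATE)
    ess_power = np.clip(ess_power, -soc * 60, (ESS_CAP - soc) * 60)  # respect stored-energy limits
    soc += ess_power / 60.0                               # one minute of charging/discharging
    dump = max(0.0, realized - ess_power)                 # surplus the ESS cannot absorb is dumped
    # a learning step could adapt `margin` here from past dump / shortage statistics
```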

Keywords: microgrid, operation planning, power efficiency optimization, supply and demand prediction

Procedia PDF Downloads 430
3617 The Impact of Transaction Costs on Rebalancing an Investment Portfolio in Portfolio Optimization

Authors: B. Marasović, S. Pivac, S. V. Vukasović

Abstract:

Constructing a portfolio of investments is one of the most significant financial decisions facing individuals and institutions. In accordance with modern portfolio theory, maximization of return at minimal risk should be the investment goal of any successful investor. In addition, the costs incurred when setting up a new portfolio or rebalancing an existing portfolio must be included in any realistic analysis. In this paper, rebalancing an investment portfolio in the presence of transaction costs on the Croatian capital market is analyzed. The model applied in the paper is an extension of the standard portfolio mean-variance optimization model in which transaction costs are incurred to rebalance an investment portfolio. This model allows different costs for different securities, and different costs for buying and selling. In order to find an efficient portfolio using this model, first the solution of a quadratic programming problem of similar size to the Markowitz model, and then the solution of a linear programming problem, have to be found. Furthermore, the paper investigates the impact of transaction costs on the efficient frontier. Moreover, it is shown that the global minimum variance portfolio on the efficient frontier always has the same level of risk regardless of the amount of transaction costs. Although the position of the efficient frontier depends on both the amount of transaction costs and the initial portfolio, it can be concluded that the extreme right portfolio on the efficient frontier always contains only one stock, with the highest expected return and the highest risk.
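A small numerical sketch of mean-variance rebalancing with proportional, asymmetric transaction costs is given below; it uses a single general-purpose solver rather than the paper's quadratic-then-linear programming sequence, and the return, covariance, and cost figures are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: expected returns, covariance, current holdings, and buy/sell cost rates
mu = np.array([0.08, 0.11, 0.06, 0.09])
cov = np.diag([0.04, 0.09, 0.02, 0.06])
w0 = np.array([0.25, 0.25, 0.25, 0.25])
c_buy, c_sell = 0.005, 0.008
lam = 2.0                                   # risk-aversion weight on expected return
n = len(mu)

def objective(x):
    w, b, s = x[:n], x[n:2 * n], x[2 * n:]  # weights, buys, sells
    return w @ cov @ w - lam * mu @ w + c_buy * b.sum() + c_sell * s.sum()

constraints = [
    {"type": "eq", "fun": lambda x: x[:n].sum() - 1.0},                    # fully invested
    {"type": "eq", "fun": lambda x: x[:n] - w0 - x[n:2 * n] + x[2 * n:]},  # w = w0 + buy - sell
]
bounds = [(0, 1)] * (3 * n)                 # long-only weights, non-negative trades
x0 = np.concatenate([w0, np.zeros(2 * n)])

res = minimize(objective, x0, method="SLSQP", bounds=bounds, constraints=constraints)
print("rebalanced weights:", np.round(res.x[:n], 3))
```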

Keywords: Croatian capital market, Markowitz model, fractional quadratic programming, portfolio optimization, transaction costs

Procedia PDF Downloads 381
3616 Process Optimization for Albanian Crude Oil Characterization

Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici

Abstract:

Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have exact information on crude oil quality. This includes the crude oil TBP curve as the main data for the correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized based on a distillation assay. This procedure is reasonably well defined and is based on representing the mixture of actual components that boil within a boiling point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400 °C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the model and the experimental data. Most commercial simulators use different numbers of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is to draw true boiling point curves for different crude oil resources in Albania and to compare the differences between the model and the experimental data for optimal characterization of the crude oil.

Keywords: TBP distillation curves, crude oil, optimization, simulation

Procedia PDF Downloads 298
3615 Multi-Objective Optimization of an Aerodynamic Feeding System Using Genetic Algorithm

Authors: Jan Busch, Peter Nyhuis

Abstract:

Considering the challenges of short product life cycles and growing variant diversity, cost minimization and manufacturing flexibility increasingly gain importance to maintain a competitive edge in today’s global and dynamic markets. In this context, an aerodynamic part feeding system for high-speed industrial assembly applications has been developed at the Institute of Production Systems and Logistics (IFA), Leibniz Universitaet Hannover. The aerodynamic part feeding system outperforms conventional systems with respect to its process safety, reliability, and operating speed. In this paper, a multi-objective optimisation of the aerodynamic feeding system regarding the orientation rate, the feeding velocity and the required nozzle pressure is presented.

Keywords: aerodynamic feeding system, genetic algorithm, multi-objective optimization, workpiece orientation

Procedia PDF Downloads 570
3614 Optimization of Technical and Technological Solutions for the Development of Offshore Hydrocarbon Fields in the Kaliningrad Region

Authors: Pavel Shcherban, Viktoria Ivanova, Alexander Neprokin, Vladislav Golovanov

Abstract:

Currently, LLC «Lukoil-Kaliningradmorneft» is implementing a comprehensive program for the development of the offshore fields of the Kaliningrad region. This is largely associated with the depletion of the onshore resource base of the region, as well as with the positive results of geological investigation of the surrounding Baltic Sea area and the data on the volume of hydrocarbon recovery from the single offshore field currently operating in the Kaliningrad region, D-6 «Kravtsovskoye». The article analyzes the main stages of LLC «Lukoil-Kaliningradmorneft»'s program for the development of the hydrocarbon resources of the region's shelf and suggests an optimization algorithm that allows managing the multi-criteria process of developing shelf deposits. The algorithm is formed on the basis of the problem of sequential decision making, which is a branch of dynamic programming. Application of the algorithm during the consolidation of the initial data, the elaboration of project documentation, and the further exploration and development of offshore fields will make it possible to optimize the complex of technical and technological solutions and increase the economic efficiency of the field development project implemented by LLC «Lukoil-Kaliningradmorneft».
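Sequential decision making by backward induction, the dynamic-programming formulation referred to above, can be sketched generically as a staged budget-allocation problem; the stages, values, and budget below are invented placeholders, not field-development data.

```python
from functools import lru_cache

# Value of assigning 0..3 budget units to each successive project stage (placeholder numbers).
value = [
    [0, 2, 3, 4],      # stage 0 (e.g., data consolidation)
    [0, 1, 4, 5],      # stage 1 (e.g., project documentation)
    [0, 3, 5, 6],      # stage 2 (e.g., exploration and development)
]
BUDGET = 5

@lru_cache(maxsize=None)
def best(stage, remaining):
    """Maximum total value achievable from `stage` onward with `remaining` budget."""
    if stage == len(value):
        return 0.0, ()
    options = []
    for spend in range(min(remaining, len(value[stage]) - 1) + 1):
        tail_value, tail_plan = best(stage + 1, remaining - spend)
        options.append((value[stage][spend] + tail_value, (spend,) + tail_plan))
    return max(options)                       # best (value, allocation plan) pair

total, plan = best(0, BUDGET)
print("best total value:", total, "allocation per stage:", plan)
```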

Keywords: offshore fields of hydrocarbons of the Baltic Sea, development of offshore oil and gas fields, optimization of the field development scheme, solution of multicriteria tasks in oil and gas complex, quality management in oil and gas complex

Procedia PDF Downloads 195
3613 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of the traditional gradient-based optimization process. QAOAs were first introduced in 2014, when the algorithm performed better than the best known classical algorithm of the time for Max-Cut graphs. Whilst classical algorithms have improved since then and have returned to being faster and more efficient, this was a huge milestone for quantum computing, and that work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. This, alongside other famous algorithms like Grover's or Shor's, highlights to the world the potential that quantum computing holds. It also presents the prospect of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization. The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using a COBYLA optimizer, which is a linear-approximation-based method, and in some instances it can even produce a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
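A minimal, self-contained illustration of the idea, simulating a small QAOA circuit by brute-force statevector arithmetic and optimizing its angles with a simple (μ+λ) evolution strategy instead of a gradient-based or COBYLA optimizer, is sketched below; the graph, depth, and EA settings are assumptions and do not reproduce the author's implementation.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Small random Max-Cut instance (assumed, for illustration)
n_qubits = 8
edges = [e for e in combinations(range(n_qubits), 2) if rng.random() < 0.4]

# Cut value of every computational basis state (the diagonal cost "Hamiltonian")
states = np.arange(2 ** n_qubits)
bits = (states[:, None] >> np.arange(n_qubits)) & 1
cut = np.zeros(2 ** n_qubits)
for i, j in edges:
    cut += bits[:, i] ^ bits[:, j]

def qaoa_expectation(angles, p):
    """Brute-force statevector simulation of depth-p QAOA for Max-Cut."""
    gammas, betas = angles[:p], angles[p:]
    psi = np.full(2 ** n_qubits, 2 ** (-n_qubits / 2), dtype=complex)
    for gamma, beta in zip(gammas, betas):
        psi = psi * np.exp(-1j * gamma * cut)                # phase-separation layer
        for q in range(n_qubits):                            # mixer: exp(-i*beta*X) on each qubit
            psi = psi.reshape(2 ** (n_qubits - q - 1), 2, 2 ** q)
            a, b = psi[:, 0, :].copy(), psi[:, 1, :].copy()
            psi[:, 0, :] = np.cos(beta) * a - 1j * np.sin(beta) * b
            psi[:, 1, :] = np.cos(beta) * b - 1j * np.sin(beta) * a
            psi = psi.reshape(-1)
    return float(np.real(np.sum(np.abs(psi) ** 2 * cut)))    # expected cut value

# (mu + lambda) evolution strategy over the 2p angles, replacing a gradient optimizer
p, mu_size, lam_size, gens, sigma = 2, 8, 24, 60, 0.3
pop = rng.uniform(0, np.pi, (mu_size, 2 * p))
for _ in range(gens):
    children = pop[rng.integers(0, mu_size, lam_size)] + sigma * rng.standard_normal((lam_size, 2 * p))
    allpop = np.vstack([pop, children])
    fitness = np.array([qaoa_expectation(a, p) for a in allpop])
    pop = allpop[np.argsort(fitness)[::-1][:mu_size]]        # keep the best (maximize expected cut)
    sigma *= 0.98                                            # shrink mutation step slowly

print("best expected cut:", qaoa_expectation(pop[0], p), "of", cut.max(), "possible")
```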

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 55
3612 An Optimization Algorithm for Reducing the Liquid Oscillation in the Moving Containers

Authors: Reza Babajanivalashedi, Stefania Lo Feudo, Jean-Luc Dion

Abstract:

Liquid sloshing is a crucial problem for the dynamics of moving containers in the packaging industry. Sloshing has so far mainly been modeled within the framework of fluid dynamics or by using equivalent mechanical models, for different kinds of container movements and shapes. Nevertheless, these approaches do not allow determining the shape of the free surface of the liquid in the case of irregularly shaped moving containers, so experimental measurements may be required. If there is too much slosh in the moving tank, the liquid can be splashed out onto the packages, so the free-surface oscillation must be controlled and reduced to eliminate splashing. The purpose of this research is to propose an optimization algorithm for finding an optimum command law that reduces the surface elevation. In the first step, the free surface of the liquid is simulated based on separation-of-variables and weak-formulation models. Then genetic and gradient algorithms are developed to find the optimum command law. The optimum command law is compared with existing command laws, and the results show a significant difference in surface oscillation between the optimum and existing command laws. The algorithm is applicable to different varieties of bottles when a camera is used to detect the liquid elevation, and it can produce new command laws for different kinds of tanks to reduce the surface oscillation and eliminate the splashing phenomenon.

Keywords: sloshing phenomenon, separation variables, weak formulation, optimization algorithm, command law

Procedia PDF Downloads 143
3611 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, with strong supporting policy instruments utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: there exists an unresolved gap between the data science experts and the manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized. A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory. However, only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 108
3610 Machine Learning and Metaheuristic Algorithms in Short Femoral Stem Custom Design to Reduce Stress Shielding

Authors: Isabel Moscol, Carlos J. Díaz, Ciro Rodríguez

Abstract:

Hip replacement becomes necessary when a person suffers severe pain or considerable functional limitations and the best option to enhance their quality of life is through the replacement of the damaged joint. One of the main components in femoral prostheses is the stem which distributes the loads from the joint to the proximal femur. To preserve more bone stock and avoid weakening of the diaphysis, a short starting stem was selected, generated from the intramedullary morphology of the patient's femur. It ensures the implantability of the design and leads to geometric delimitation for personalized optimization with machine learning (ML) and metaheuristic algorithms. The present study attempts to design a cementless short stem to make the strain deviation before and after implantation close to zero, promoting its fixation and durability. Regression models developed to estimate the percentage change of maximum principal stresses were used as objective optimization functions by the metaheuristic algorithm. The latter evaluated different geometries of the short stem with the modification of certain parameters in oblique sections from the osteotomy plane. The optimized geometry reached a global stress shielding (SS) of 18.37% with a determination factor (R²) of 0.667. The predicted results favour implantability integration in the short stem optimization to effectively reduce SS in the proximal femur.
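The surrogate-plus-metaheuristic loop described here, fitting a regression model to stress-shielding responses and then searching it for a better geometry, can be sketched as follows; the three geometry parameters, the synthetic data, and the random-search stand-in for the metaheuristic are all illustrative assumptions, not the study's models or FE results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder training data: three stem-geometry parameters -> stress-shielding %.
# (Synthetic values; in the study these responses come from finite element simulations.)
X = rng.uniform(0, 1, (40, 3))
y = 40 - 15 * X[:, 0] + 10 * (X[:, 1] - 0.5) ** 2 + 5 * X[:, 2] + rng.normal(0, 1, 40)

# Regression surrogate: linear plus squared terms, fitted by least squares
def features(X):
    return np.column_stack([np.ones(len(X)), X, X ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda X: features(np.atleast_2d(X)) @ coef

# Metaheuristic stand-in: random search over the surrogate to minimize predicted stress shielding
candidates = rng.uniform(0, 1, (5000, 3))
best = candidates[np.argmin(surrogate(candidates))]
print("geometry minimizing predicted stress shielding:", np.round(best, 3))
```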

Keywords: machine learning techniques, metaheuristic algorithms, short-stem design, stress shielding, hip replacement

Procedia PDF Downloads 192
3609 Cognitive Model of Analogy Based on Operation of the Brain Cells: Glial, Axons and Neurons

Authors: Ozgu Hafizoglu

Abstract:

Analogy is an essential tool of human cognition that enables connecting diffuse and diverse systems with the attributional, deep structural, and causal relations that are essential to learning, to innovation in artificial worlds, and to discovery in science. The Cognitive Model of Analogy (CMA) leads and creates information pattern transfer within and between domains and disciplines in science. This paper demonstrates the Cognitive Model of Analogy (CMA) as an evolutionary approach to scientific research. The model puts forward the challenges of deep uncertainty about the future, emphasizing the need for flexibility of the system in order to enable the reasoning methodology to adapt to changing conditions. In this paper, the model of analogical reasoning is created based on brain cells, their fractal forms, and their operational forms within the system itself. Visualization techniques are used to show correspondences. The distinct phases of the problem-solving process are divided as follows: encoding, mapping, inference, and response. The system is shown to be relevant to brain activation in each of these phases, with an emphasis on achieving a better visualization of the brain cells (glial cells, axons, axon terminals, and neurons) relative to the matching conditions of analogical reasoning and relational information. It is found that the encoding, mapping, inference, and response processes in four-term analogical reasoning correspond to the fractal and operational forms of the brain cells: glia, axons, and neurons.

Keywords: analogy, analogical reasoning, cognitive model, brain and glials

Procedia PDF Downloads 180
3608 Optimization Techniques of Doubly-Fed Induction Generator Controller Design for Reliability Enhancement of Wind Energy Conversion Systems

Authors: Om Prakash Bharti, Aanchal Verma, R. K. Saket

Abstract:

The doubly-fed induction generator (DFIG) is suggested for wind energy conversion systems (WECS) to extract wind power. The DFIG is preferably employed due to its robustness to variable wind and rotor speeds. The DFIG is adaptable because the system parameters, including real power, reactive power, DC-link voltage, and the transient and dynamic responses, are smoothly dealt with and need to be analyzed constantly. The analysis becomes more important during any unusual condition in the electrical power system. Hence, the system parameters and transient response performance of the DFIG need to be studied and improved using suitable control techniques. To fulfill this task, the present work implements and compares optimization methods for the design of the DFIG controller for WECS. Bio-inspired optimization techniques are applied to obtain the optimal controller design parameters for the DFIG-based WECS. The optimized DFIG controllers are then used to retrieve the transient response performance of the sixth-order DFIG model with a step input. The results obtained using MATLAB/Simulink show the superiority of the firefly algorithm (FFA) over the other controller design methods.

Keywords: doubly-fed induction generator, wind turbine, wind energy conversion system, induction generator, transfer function, proportional, integral, derivatives

Procedia PDF Downloads 89
3607 Wind Turbines Optimization: Shield Structure for a High Wind Speed Conditions

Authors: Daniyar Seitenov, Nazim Mir-Nasiri

Abstract:

Optimization of a horizontal-axis, semi-exposed wind turbine has been performed using a protective shield that automatically protects the generator shaft from over-speeding and mechanical damage at extreme wind speeds while the turbine continues generating electricity in high-wind conditions. A semi-exposed wind generator has been designed, and its structure is described in this paper. A simplified point-force dynamic load model on the blades has been derived for normal and extreme wind conditions, with and without shield involvement. Numerical simulation has been conducted at different wind speeds to study the efficiency of the shield application. The obtained results show that the maximum power generated by the wind turbine with the shield does not exceed approximately the rated value of the generator, as the shield serves as an automatic brake at extreme wind speeds of 15 m/s and above. Meanwhile, the wind turbine without the shield produced power much larger than the rated value. The optimized horizontal-axis, semi-exposed wind turbine with shield protection is suitable for low and medium power generation when installed on the roofs of high-rise buildings for harvesting wind energy. The wind shield works automatically with no power consumption. The structure of the generator with the protection, and the mathematical simulation of the kinematics and dynamics of power generation, are described in detail in this paper.

Keywords: renewable energy, wind turbine, wind turbine optimization, high wind speed

Procedia PDF Downloads 172