Search results for: geometry optimization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4096

1516 Optimal Cropping Pattern in an Irrigation Project: A Hybrid Model of Artificial Neural Network and Modified Simplex Algorithm

Authors: Safayat Ali Shaikh

Abstract:

Software has been developed for determining the optimal cropping pattern in an irrigation project, considering a land constraint, a water-availability constraint, and a pick-up flow constraint, using a modified Simplex Algorithm. Artificial Neural Network (ANN) models have been developed to predict rainfall. An AR(1) model was used to generate 1000 years of rainfall data to train the ANN. Simulation has been performed with the expected rainfall data. Eight crops and three soil classes have been considered in the optimization model. The area under each crop and each soil class has been quantified using the modified Simplex Algorithm to obtain the optimum net return. The efficacy of the software has been tested using data from a large irrigation project in India.
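As an illustration only (not the authors' software), the core land- and water-constrained net-return maximisation can be sketched as a small linear program; the crop returns and resource figures below are hypothetical placeholders.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: net return (currency/ha) and seasonal water demand (ha-m/ha)
# for four crops; the paper considers eight crops over three soil classes.
net_return   = np.array([35e3, 42e3, 28e3, 50e3])
water_demand = np.array([0.6, 1.1, 0.4, 0.9])

total_land  = 1200.0   # ha available (land constraint)
total_water = 900.0    # ha-m available (water-availability constraint)

# linprog minimises, so negate the objective to maximise total net return.
res = linprog(
    c=-net_return,
    A_ub=np.vstack([np.ones(4), water_demand]),  # land row, water row
    b_ub=[total_land, total_water],
    bounds=[(0, None)] * 4,                      # non-negative cropped areas
    method="highs",
)
print("area per crop (ha):", res.x, "net return:", -res.fun)
```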

Keywords: artificial neural network, large irrigation project, modified simplex algorithm, optimal cropping pattern

Procedia PDF Downloads 185
1515 Using Greywolf Optimized Machine Learning Algorithms to Improve Accuracy for Predicting Hospital Readmission for Diabetes

Authors: Vincent Liu

Abstract:

Machine learning (ML) algorithms can achieve higher accuracy in predicting outcomes than classical models. Metaheuristic, nature-inspired algorithms can enhance traditional ML algorithms, for example by performing feature selection. We compare ten ML algorithms for predicting 30-day hospital readmission rates for diabetes patients in the US, using a dataset from the UCI Machine Learning Repository with feature selection performed by the Grey Wolf nature-inspired algorithm. The baseline accuracy of the initial random forest model was 65%. After feature engineering, SMOTE for class balancing, and Grey Wolf optimization, the machine learning algorithms showed better metrics, including F1 scores, accuracy, and confusion matrices, with improvements ranging from 10% to 30%, and the best model, XGBoost, reached an accuracy of 95%. Applying machine learning in this way can improve patient outcomes, as unnecessary rehospitalizations can be prevented by focusing on patients who are at higher risk of readmission.
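As a rough, hypothetical sketch (not the authors' implementation), Grey Wolf optimization can drive feature selection by treating each wolf as a feature mask and using a classifier's cross-validated accuracy as the fitness; the function name and default parameters below are assumptions.

```python
import numpy as np

def gwo_feature_selection(fitness, dim, n_wolves=8, n_iter=30, seed=0):
    """Binary Grey Wolf Optimizer sketch: each wolf is a 0/1 feature mask.
    `fitness(mask)` is assumed to return a score to MAXIMIZE, e.g. the
    cross-validated accuracy of a classifier trained on the selected columns."""
    rng = np.random.default_rng(seed)
    pos = rng.random((n_wolves, dim))            # continuous positions in [0, 1]
    scores = np.array([fitness(p > 0.5) for p in pos])

    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter               # exploration factor decays 2 -> 0
        alpha, beta, delta = pos[np.argsort(scores)[::-1][:3]]
        for i in range(n_wolves):
            new = np.zeros(dim)
            for leader in (alpha, beta, delta):  # pull toward the three best wolves
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2.0 * a * r1 - a, 2.0 * r2
                new += leader - A * np.abs(C * leader - pos[i])
            pos[i] = np.clip(new / 3.0, 0.0, 1.0)
            scores[i] = fitness(pos[i] > 0.5)    # re-score the moved wolf

    best = np.argmax(scores)
    return pos[best] > 0.5, scores[best]         # selected-feature mask, its score
```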

Keywords: diabetes, machine learning, 30-day readmission, metaheuristic

Procedia PDF Downloads 34
1514 Effect of Process Parameters on Tensile Strength of Aluminum Alloy ADC 10 Produced through Ceramic Shell Investment Casting

Authors: Balwinder Singh

Abstract:

Castings were produced from aluminum alloy ADC 10 through the ceramic shell investment casting process. Experiments were conducted as per the Taguchi L9 orthogonal array. The Taguchi parameter design and optimization approach is used to evaluate the effect of process parameters such as mould preheat temperature, preheat time, firing temperature, and pouring temperature on the tensile strength of ceramic shell investment castings. Plots of the means of significant factors and of S/N ratios have been used to determine the best relationship between the responses and the model parameters. It is found that the pouring temperature is the most significant factor. The best tensile strength of aluminum alloy ADC 10 is obtained with a shell preheat temperature of 150 °C, a preheat time of 45 minutes, a firing temperature of 900 °C, and a pouring temperature of 650 °C.
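For readers unfamiliar with the Taguchi approach, the larger-the-better signal-to-noise ratio used for a strength response is computed as sketched below (a standard formula, not specific to this study; the replicate values are hypothetical).

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y_i^2)).
    A higher S/N value indicates a stronger, more robust response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical tensile-strength replicates (MPa) for one L9 trial
print(round(sn_larger_is_better([182.0, 176.5, 185.3]), 2))
```

Averaging the S/N ratios over the trials at each factor level then shows which parameter dominates the response (here, the pouring temperature).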

Keywords: investment casting, shell preheat temperature, firing temperature, Taguchi method

Procedia PDF Downloads 157
1513 Order Picking Problem: Exact and Heuristic Algorithms for the Generalized Travelling Salesman Problem With Geographical Overlap Between Clusters

Authors: Farzaneh Rajabighamchi, Stan van Hoesel, Christof Defryn

Abstract:

The generalized traveling salesman problem (GTSP) is an extension of the traveling salesman problem (TSP) in which the set of nodes is partitioned into clusters and the salesman must visit exactly one node per cluster. In this research, we apply the definition of the GTSP to an order picker routing problem with multiple locations per product. As such, each product represents a cluster, and its corresponding nodes are the locations at which the product can be retrieved. To pick a certain product item from the warehouse, the picker needs to visit one of these locations during the pick tour. As all products are scattered throughout the warehouse, the product clusters are not separated geographically. We propose an exact LP model as well as heuristic and meta-heuristic solution algorithms for the order picking problem with multiple product locations.
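To make the cluster structure concrete, here is a minimal greedy construction sketch (an illustrative assumption, not one of the paper's algorithms): from the current position, repeatedly walk to the nearest storage location of any product that still needs picking.

```python
import math

def greedy_gtsp_tour(start, clusters):
    """`clusters` maps product -> list of (x, y) storage locations.
    Visiting any one location of a product 'covers' that whole cluster."""
    pos, tour, remaining = start, [start], dict(clusters)
    while remaining:
        product, loc = min(
            ((p, l) for p, locs in remaining.items() for l in locs),
            key=lambda pl: math.dist(pos, pl[1]),
        )
        tour.append(loc)
        pos = loc
        del remaining[product]      # one location per product suffices
    tour.append(start)              # return to the depot
    return tour

# Example: two products, each storable in two aisle locations
print(greedy_gtsp_tour((0, 0), {"A": [(2, 1), (8, 3)], "B": [(5, 5), (1, 6)]}))
```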

Keywords: warehouse optimization, order picking problem, generalised travelling salesman problem, heuristic algorithm

Procedia PDF Downloads 93
1512 How to Improve Teaching and Learning Strategies Through Educational Research: An Experience of Peer Observation in Legal Education

Authors: Luigina Mortari, Alessia Bevilacqua, Roberta Silva

Abstract:

The experience presented in this paper aims to understand how educational research can support the introduction and optimization of teaching innovations in legal education. In this increasingly complex context, a strong need emerges to introduce paths aimed at acquiring not only professional knowledge and skills but also transversal skills, such as reflective, critical, and problem-solving skills. Through peer observation intertwined with an analysis of discursive practices, the researchers and the teacher worked together in a process of participatory and transformative accompaniment whose objective was to promote the active participation and engagement of students in learning processes, an element indispensable for working in the more specific direction of strengthening key competences. This reflective faculty development path led the teacher to activate metacognitive processes, thus becoming aware of the strengths and areas for improvement of his teaching innovation.

Keywords: legal education, teaching innovation, peer observation, discursive analysis, faculty development

Procedia PDF Downloads 145
1511 Integration of Quality Function Deployment and Modular Function Deployment in Product Development

Authors: Naga Velamakuri, Jyothi K. Reddy

Abstract:

"Quality must be designed into a product, not inspected into it" has become the main motto of companies globally. Owing to rapidly advancing technology over the past few decades, the nature of consumer demands has become more sophisticated. To sustain this global revolution of innovation in production systems, companies have to take steps to accommodate this technological growth. In the process of understanding customers' expectations, firms worldwide take steps to deliver a perfect output. Most of these techniques also concentrate on the consistent development and optimization of the product to exceed expectations. Quality Function Deployment (QFD) and Modular Function Deployment (MFD) are such techniques, which rely on the voice of the customer and help deliver these needs. This paper discusses Quality Function Deployment and Modular Function Deployment, techniques that help convert qualitative customer descriptions into quantitative outcomes. The area of interest is to understand the scope of each technique and its application range in product development when the two are applied together to a problem. The research question is mainly aimed at comprehending the limitations of using modularity in product development.

Keywords: quality function deployment, modular function deployment, house of quality, methodology

Procedia PDF Downloads 305
1510 Towards a Robust Patch Based Multi-View Stereo Technique for Textureless and Occluded 3D Reconstruction

Authors: Ben Haines, Li Bai

Abstract:

Patch-based reconstruction methods have been, and remain, among the top-performing approaches to 3D reconstruction to date. Their local approach to refining the position and orientation of a patch, free of global minimisation and independent of surface smoothness, makes patch-based methods extremely powerful in recovering fine-grained detail of an object's surface. However, patch-based approaches still fail to faithfully reconstruct textureless or highly occluded surface regions; thus, although they perform well under lab conditions, they deteriorate in industrial or real-world situations. They are also computationally expensive. Current patch-based methods generate point clouds with holes in textureless or occluded regions, which require expensive energy minimisation techniques to fill and interpolate into a high-fidelity reconstruction. Such shortcomings hinder the adaptation of the methods for industrial applications, where object surfaces are often highly textureless and the speed of reconstruction is an important factor. This paper presents ongoing work towards a multi-resolution approach to address these problems, utilizing particle swarm optimisation to reconstruct high-fidelity geometry and increasing robustness to textureless features through an adapted approach to the normalised cross-correlation. The work also aims to speed up the reconstruction using advances in GPU technologies and to remove the need for costly initialization and expansion. Through the combination of these enhancements, it is the intention of this work to create denser patch clouds, even in textureless regions, within a reasonable time. Initial results show the potential of such an approach to construct denser point clouds with accuracy comparable to that of the current top-performing algorithms.
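For reference, the standard normalised cross-correlation used as a photo-consistency score between two image patches can be computed as below; this is the textbook form, not the paper's adapted variant.

```python
import numpy as np

def ncc(patch_a, patch_b, eps=1e-8):
    """Normalised cross-correlation of two equally-sized patches.
    Returns a value in [-1, 1]; values near 1 indicate photo-consistency."""
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps))
```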

Keywords: 3D reconstruction, multiview stereo, particle swarm optimisation, photo consistency

Procedia PDF Downloads 186
1509 A Retrievable Genetic Algorithm for Efficient Solving of Sudoku Puzzles

Authors: Seyed Mehran Kazemi, Bahare Fatemi

Abstract:

Sudoku is a logic-based combinatorial puzzle game that is popular among people of different ages. Due to this popularity, computer software is being developed to generate and solve Sudoku puzzles with different levels of difficulty. Several methods and algorithms have been proposed and used in different software packages to efficiently solve Sudoku puzzles. Various search methods, such as stochastic local search, have been applied to this problem. The Genetic Algorithm (GA) is one of the algorithms that has been applied to this problem in different forms in several works in the literature. In these works, chromosomes with little or no information were considered, and the results obtained were not promising. In this paper, we propose a new way of applying a GA to this problem which uses more-informed chromosomes than other works in the literature. We optimize the parameters of our GA using puzzles with different levels of difficulty. We then use the optimized parameter values to solve various puzzles and compare our results to another GA-based method for solving Sudoku puzzles.
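By way of illustration (the paper's exact encoding is not reproduced here), a common way to build informed chromosomes is to keep the givens fixed and fill each row with a permutation of its missing digits, so that only column and box conflicts remain to be minimised; the sketch below assumes a 9x9 NumPy array with zeros for blanks.

```python
import numpy as np

def random_chromosome(puzzle, rng):
    """Fill each row of a 9x9 puzzle (0 = blank) with a random permutation of
    its missing digits, keeping the givens fixed; rows are conflict-free by design."""
    grid = puzzle.copy()
    for r in range(9):
        missing = [d for d in range(1, 10) if d not in grid[r]]
        rng.shuffle(missing)
        grid[r, grid[r] == 0] = missing
    return grid

def conflicts(grid):
    """Fitness to minimise: duplicated digits in columns and 3x3 boxes."""
    score = sum(9 - len(set(grid[:, c])) for c in range(9))
    for br in range(0, 9, 3):
        for bc in range(0, 9, 3):
            score += 9 - len(set(grid[br:br + 3, bc:bc + 3].ravel()))
    return score
```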

Keywords: genetic algorithm, optimization, solving Sudoku puzzles, stochastic local search

Procedia PDF Downloads 403
1508 Simulation-Based Validation of Safe Human-Robot-Collaboration

Authors: Titanilla Komenda

Abstract:

Human-machine collaboration is defined as direct interaction between humans and machines to fulfil specific tasks. These so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine collaboration enables flexible adaptation to variable degrees of freedom, industrial applications are rarely found. The reason for this is not a lack of technical progress but rather limitations in the planning processes that must ensure safety for operators. Until now, humans and machines have mainly been considered separately in the planning process, focusing on ergonomics and system performance, respectively. Within human-machine collaboration, these aspects must not be seen in isolation from each other but rather need to be analysed in interaction. Furthermore, a simulation model is needed that can validate the system performance and ensure the safety of the operator at any given time. Following on from this, a holistic simulation model is presented, enabling a simulative representation of collaborative tasks, including both humans and machines. The presented model includes not only a geometry and a motion model of interacting humans and machines but also a numerical behaviour model of humans as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated by validating system behaviour in unplanned situations. As these models can be defined on the basis of Failure Mode and Effects Analysis as well as error probabilities, their implementation in a collaborative model is discussed and evaluated with regard to limitations and simulation times. The functionality of the model is shown on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process, in contrast to meeting system performance only. In this sense, an optimisation function is presented that meets the trade-off between human and machine factors and aids in the successful and safe realisation of collaborative scenarios.

Keywords: human-machine-system, human-robot-collaboration, safety, simulation

Procedia PDF Downloads 344
1507 Developing a Medium-Term Maintenance Plan for Road Networks

Authors: Helen S. Ghali, Haidy S. Ghali, Salma Ibrahim, Ossama Hosny, Hatem S. Elbehairy

Abstract:

Infrastructure systems are essential assets in any community; accordingly, authorities aim to maximize their life span while minimizing the life cycle cost. This requires studying the asset condition throughout its operation and forming a cost-efficient maintenance strategy. The objective of this study is to develop a highway management system that provides medium-term maintenance plans with the minimum life cycle cost subject to budget constraints. The model is applied to data collected for the highway network in India, with the aim of outputting a 5-year maintenance plan strategy from 2019 to 2023. The main element considered is the surface course, either rigid or flexible pavement. The model outputs a 5-year maintenance plan for each segment given the budget constraint while maximizing the new pavement condition rating and minimizing its life cycle cost.

Keywords: infrastructure, asset management, optimization, maintenance plan

Procedia PDF Downloads 196
1506 Numerical Analysis of Solar Cooling System

Authors: Nadia Allouache, Mohamed Belmedani

Abstract:

Solar energy is a sustainable, virtually inexhaustible, and environmentally friendly alternative to the fossil fuels available. It is a renewable and economical energy source that can be harnessed sustainably over the long term and thus stabilizes energy costs. Solar cooling technologies have been developed to curb the growth in electricity consumption for air conditioning and to displace the peak load during hot summer days. A numerical analysis of the thermal and solar performance of an annular finned adsorber, which is the most important component of an adsorption solar refrigerating system, is considered in this work. Different adsorbent/adsorbate pairs, such as activated carbon AC35/methanol, activated carbon AC35/ethanol, and activated carbon BPL/ammonia, are studied. Modeling the adsorption cooling machine requires solving the equations describing energy and mass transfer in the tubular finned adsorber. The Wilson and Dubinin-Astakhov models of the solid-adsorbate equilibrium are used to calculate the adsorbed quantity. The porous medium and the fins are contained in the annular space, and the adsorber is heated by solar energy. The effects of key parameters on the adsorbed quantity and on the thermal and solar performance are analysed and discussed. The AC35/methanol pair is the best pair compared to the BPL/ammonia and AC35/ethanol pairs in terms of system performance. The system performance is sensitive to the fin geometry. For data measured on clear days of July 2023 in Algeria and Morocco, the performance of the cooling system is particularly significant in Algeria.
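For reference, the Dubinin-Astakhov equilibrium model mentioned above is commonly written as follows (standard textbook form; the paper's exact parameter values are not reproduced here):

W = W_0 \exp\left[-\left(\frac{A}{E}\right)^{n}\right], \qquad A = R\,T\,\ln\frac{P_s}{P}

where W is the adsorbed quantity, W_0 the limiting adsorption capacity, E the characteristic energy of the adsorbent/adsorbate pair, n the heterogeneity exponent (n = 2 recovers the Dubinin-Radushkevich form), and A the adsorption potential at temperature T and relative pressure P/P_s.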

Keywords: activated carbon AC35-methanol pair, activated carbon AC35-ethanol pair, activated carbon BPL-ammonia pair, annular finned adsorber, performance coefficients, numerical analysis, solar cooling system

Procedia PDF Downloads 38
1505 Optimizing Design Parameters for Efficient Saturated Steam Production in Fire Tube Boilers: A Cost-Effective Approach

Authors: Yoftahe Nigussie Worku

Abstract:

This research focuses on advancing fire tube boiler technology by systematically optimizing design parameters to achieve efficient saturated steam production. The main objective is to design a high-performance boiler with a production capacity of 2000 kg/h at a 12-bar design pressure while minimizing costs. The methodology employs iterative analysis, utilizing relevant formulas, and considers material selection and production methods. The study results in a boiler operating at 85.25% efficiency, with a fuel consumption rate of 140.37 kg/h and a heat output of 1610 kW. The theoretical importance lies in balancing efficiency, safety considerations, and cost minimization. The research addresses key questions on parameter optimization, material choices, and the safety-efficiency balance, contributing valuable insights to fire tube boiler design.
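As a back-of-envelope cross-check (not part of the paper), the reported efficiency, fuel rate, and heat output can be related through an implied fuel heating value; the short sketch below assumes the three figures refer to the same operating point.

```python
# Relate the reported boiler figures through an implied fuel lower heating value.
heat_output_kw = 1610.0     # useful heat delivered to the steam
efficiency     = 0.8525     # reported boiler efficiency
fuel_rate_kgph = 140.37     # reported fuel consumption, kg/h

fuel_input_kw  = heat_output_kw / efficiency             # energy supplied by the fuel
lhv_mj_per_kg  = fuel_input_kw / (fuel_rate_kgph / 3600.0) / 1000.0
print(f"implied fuel LHV ~ {lhv_mj_per_kg:.1f} MJ/kg")   # ~48 MJ/kg, consistent with a gaseous fuel
```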

Keywords: safety consideration, efficiency, production methods, material selection

Procedia PDF Downloads 43
1504 Flood Mapping and Inundation on Weira River Watershed (in the Case of Hadiya Zone, Shashogo Woreda)

Authors: Alilu Getahun Sulito

Abstract:

Exceptional floods are now prevalent in many places in Ethiopia, resulting in a large number of human deaths and extensive property destruction. The Lake Boyo watershed, in particular, has traditionally been vulnerable to flash floods. The goal of this research is to create flood and inundation maps for the Boyo catchment. Geographic Information System (GIS) technology and the hydraulic model HEC-RAS were integrated to attain this objective. The peak discharge was determined using the Fuller empirical methodology for return periods of 5, 10, 15, and 25 years, and the results were 103.2 m³/s, 158 m³/s, 222 m³/s, and 252 m³/s, respectively. River geometry, boundary conditions, Manning's n values for the different land covers, and the peak discharges at the various return periods were all entered into HEC-RAS, and an unsteady flow analysis was then performed. The results of the unsteady flow analysis demonstrate that the water surface elevation in the longitudinal profile rises as the return period increases. The flood inundation maps clearly show that the areas with the greatest flood coverage on the right and left sides of the river were 15.418 km² and 5.29 km², respectively, flooded by the 10-, 20-, 30-, and 50-year events. High water depths typically occur along the main channel and progressively spread to the floodplains. The study also found that flood-prone areas were disproportionately concentrated on the river's right bank. As a result, combining GIS with hydraulic modelling to create a flood inundation map is a viable solution. The findings of this study can be used to protect the right bank of the Boyo River catchment near the Boyo Lake kebeles. Furthermore, it is critical to promote an early warning system in the kebeles so that people can be evacuated before a flood calamity happens.

Keywords: flood, Weira River, Boyo, GIS, HEC-GeoRAS, HEC-RAS, inundation mapping

Procedia PDF Downloads 34
1503 2D Numerical Analysis for Determination of the Effect of Bored Piles Constructed against the Landslide near Karabuk University Stadium

Authors: Dogan Cetin, Burak Turk, Mahmut Candan

Abstract:

Landslides cause remarkable damage and loss of human life every year around the world. They may be made more likely by factors such as earthquakes, heavy precipitation, and incorrect construction activities near or on slopes. The stadium of Karabük University is located at the bottom of a very high slope. After construction of the stadium, severe deformations were observed in the social activity area surrounding it. Inclinometers were placed behind the stadium to detect possible landslide activity. According to the inclinometer measurements, irregular soil movements were detected at depths between 20 m and 45 m. Significant heaves and settlements were also observed behind the stadium walls located at the toe of the slope. The heaves indicate that the stadium walls were under threat from a significant landslide. After the inclinometer readings and field observations, the potential failure geometry was estimated. The protection system was designed based on numerous numerical analyses performed with the 2-D Plaxis software. After the design was completed, protective geotechnical work was started. Before the geotechnical work began, new inclinometers were installed to monitor earth movement during the work and afterward. The total horizontal length of the possible failure surface is 220 m. The geotechnical work included two-row-pile construction and three-row-pile construction on the slope. The bored piles were 120 cm in diameter for the two-row-pile construction and 150 cm in diameter for the three-row-pile construction. The pile length is 31.30 m for the two-row-pile construction and 31.40 m for the three-row-pile construction. The distance between the two-row-pile and three-row-pile constructions is 60 m. With these bored piles, the landslide was divided into three parts, and the earth pressure was thereby reduced. A number of inclinometer readings showed that deformation continued during the work, but after the work was completed, the movement reversed and the total deformation remained on the order of millimetres. It can be said that the protection work eliminated the possible landslide.

Keywords: landslide, landslide protection, inclinometer measurement, bored piles

Procedia PDF Downloads 135
1502 Intrusion Detection Using Dual Artificial Techniques

Authors: Rana I. Abdulghani, Amera I. Melhum

Abstract:

With the abnormal growth in the use of computers over networks, and given the broad agreement among computer security experts that the goal of building a fully secure system is never achieved effectively, intrusion detection systems (IDS) have been designed. This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of swarm intelligence; here, the algorithm is enhanced to obtain the minimum error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second technique uses a back-propagation neural network (BP) algorithm. Finally, the results of the two methods are compared on the NSL-KDD data sets used for the construction and evaluation of intrusion detection systems. This research is only interested in clustering the given connection records into two categories (normal and abnormal). Practical experiments result in an intrusion detection rate of 99.183818% for EPSO and 69.446416% for the BP neural network.
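To make the clustering step concrete, a generic PSO sketch for placing the two cluster centers is given below; this is a standard formulation with assumed parameter values, not the paper's enhanced variant (which additionally amends the centers whenever a better fitness is found).

```python
import numpy as np

def pso_cluster_centres(X, k=2, n_particles=20, n_iter=50,
                        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Each particle encodes k cluster centres; fitness is the mean distance of
    every record in X to its nearest centre (lower is better)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(X.min(0), X.max(0), size=(n_particles, k, X.shape[1]))
    vel = np.zeros_like(pos)

    def fitness(centres):
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        return d.min(axis=1).mean()

    pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest      # k centres, e.g. 'normal' and 'abnormal' prototypes
```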

Keywords: IDS, SI, BP, NSL_KDD, PSO

Procedia PDF Downloads 365
1501 Agent-Based Simulation for Supply Chain Transport Corridors

Authors: Kamalendu Pal

Abstract:

Supply chains are the spinal cord of trade and commerce. Their logistics use different transport corridors on a regular basis for operational purposes. International supply chain transport corridors include different infrastructure elements (e.g. weighbridges, package handling equipment, border clearance authorities, and so on). This paper presents the use of multi-agent systems (MAS) to model and simulate some aspects of transportation corridors, and in particular the area of weighbridge resource optimization for operational profit generation. An underlying multi-agent model provides a means of modeling the relationships among stakeholders in order to enable coordination in a transport corridor environment. Simulations of the costs of container unloading, reloading, and the waiting time of queued trucks have been carried out using data sets. The results of the simulation provide potential guidance for making decisions about optimal service resource allocation in a trade corridor.

Keywords: multi-agent systems, simulation, supply chain, transport corridor, weighbridge

Procedia PDF Downloads 338
1500 Android Graphics System: Study of Dual-Software VSync Synchronization Architecture and Optimization

Authors: Prafulla Kumar Choubey, Krishna Kishor Jha, S. B. Vaisakh Punnekkattu Chirayil

Abstract:

In the graphics-display subsystem, frame buffers are shared between the producer (content rendering) and the consumer (display). If a common buffer is operated on by both producer and consumer simultaneously, a mismatch in their processing rates can cause a tearing effect in the displayed content. Therefore, the Android OS employs a triple-buffered system, taking into account an additional composition stage. The three stages (rendering, composition, and display refresh) operate synchronously on three different buffers, which is achieved by using vsync pulses. This synchronization, however, introduces an additional latency of up to 26 ms into the pipeline. The present study details the existing synchronization mechanism of the Android graphics-display pipeline and discusses a new adaptive architecture which reduces the wait time to 5-16 ms in all use cases. The proposed method uses two adaptive software vsyncs (PLL) to achieve the same result.

Keywords: Android graphics system, vertical synchronization, atrace, adaptive system

Procedia PDF Downloads 293
1499 Comparison of Heuristic Methods for Solving Traveling Salesman Problem

Authors: Regita P. Permata, Ulfa S. Nuraini

Abstract:

The Traveling Salesman Problem (TSP) is one of the most studied problems in combinatorial optimization. In simple language, the TSP can be described as the problem of finding a minimum-distance tour through a set of cities, starting and ending in the same city and visiting every other city exactly once. In product distribution, companies often face the problem of determining the minimum travel distance, which affects time allocation. In this research, we apply TSP heuristic methods, simulating nodes as city coordinates in product distribution. The heuristics used are sub-tour reversal, nearest neighbor, farthest insertion, cheapest insertion, nearest insertion, and arbitrary insertion. We simulated the nodes using Euclidean distances and compared the methods in terms of the number of cities and processing time, from which the optimum heuristic method is determined. The results show that the optimum heuristic methods are farthest insertion and nearest insertion. These two methods can be recommended for solving product distribution problems in certain companies.
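As one concrete example of the insertion family (an illustrative sketch with hypothetical coordinates, not the authors' code), the nearest-insertion heuristic can be written as follows: start from the closest pair of cities, then repeatedly take the unvisited city nearest to the tour and insert it where it lengthens the tour the least.

```python
import math

def nearest_insertion(cities):
    """Nearest-insertion heuristic on Euclidean distances.
    `cities` is a list of (x, y) tuples; returns a tour as a list of indices."""
    d = lambda a, b: math.dist(cities[a], cities[b])
    n = len(cities)
    i, j = min(((i, j) for i in range(n) for j in range(i + 1, n)),
               key=lambda p: d(*p))
    tour, left = [i, j], set(range(n)) - {i, j}
    while left:
        # unvisited city nearest to any city already in the tour
        c = min(left, key=lambda c: min(d(c, t) for t in tour))
        # cheapest edge of the tour at which to insert that city
        k = min(range(len(tour)),
                key=lambda k: d(tour[k], c) + d(c, tour[(k + 1) % len(tour)])
                              - d(tour[k], tour[(k + 1) % len(tour)]))
        tour.insert(k + 1, c)
        left.remove(c)
    return tour

print(nearest_insertion([(0, 0), (2, 0), (2, 2), (0, 2), (1, 3)]))
```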

Keywords: Euclidean, heuristics, simulation, TSP

Procedia PDF Downloads 110
1498 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models

Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti

Abstract:

This work is the first step in a rather wide research activity, in collaboration with the Euro-Mediterranean Center for Climate Changes, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
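For context, the cost function minimised in 3DVar assimilation has the standard form below (textbook notation, not the DD-DA-specific decomposition):

J(x) = \frac{1}{2}\,(x - x_b)^{T} B^{-1} (x - x_b) + \frac{1}{2}\,\bigl(y - \mathcal{H}(x)\bigr)^{T} R^{-1} \bigl(y - \mathcal{H}(x)\bigr)

where x_b is the background state, y the observation vector, \mathcal{H} the observation operator, and B and R the background- and observation-error covariance matrices; in the domain-decomposition setting, this minimisation is split across subdomains so that the pieces can be mapped onto GPU resources.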

Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm

Procedia PDF Downloads 392
1497 Advanced Mechatronic Design of Robot Manipulator Using Hardware-In-The-Loop Simulation

Authors: Reza Karami, Ali Akbar Ebrahimi

Abstract:

This paper discusses concurrent engineering of robot manipulators, based on the Holistic Concurrent Design (HCD) methodology and by using a hardware-in-the-loop simulation platform. The methodology allows for considering numerous design variables with different natures concurrently. It redefines the ultimate goal of design based on the notion of satisfaction, resulting in the simplification of the multi-objective constrained optimization process. It also formalizes the effect of designer’s subjective attitude in the process. To enhance modeling efficiency for both computation and accuracy, a hardware-in-the-loop simulation platform is used, which involves physical joint modules and the control unit in addition to the software modules. This platform is implemented in the HCD design architecture to reliably evaluate the design attributes and performance super criterion during the design process. The resulting overall architecture is applied to redesigning kinematic, dynamic and control parameters of an industrial robot manipulator.

Keywords: concurrent engineering, hardware-in-the-loop simulation, robot manipulator, multidisciplinary systems, mechatronics

Procedia PDF Downloads 429
1496 A Hybrid System of Hidden Markov Models and Recurrent Neural Networks for Learning Deterministic Finite State Automata

Authors: Pavan K. Rallabandi, Kailash C. Patidar

Abstract:

In this paper, we present a learning algorithm based on a hybrid architecture that combines two of the most popular sequence recognition models: Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). To improve sequence/pattern recognition and classification performance through a hybrid neural-symbolic approach, a gradient descent learning algorithm is developed using the Real-Time Recurrent Learning of Recurrent Neural Networks to process the knowledge represented in trained Hidden Markov Models. The developed hybrid algorithm is applied to automata theory as a sample test bed, and its performance is demonstrated and evaluated on learning deterministic finite state automata.

Keywords: hybrid systems, hidden markov models, recurrent neural networks, deterministic finite state automata

Procedia PDF Downloads 366
1495 High-Resolution Flood Hazard Mapping Using Two-Dimensional Hydrodynamic Model Anuga: Case Study of Jakarta, Indonesia

Authors: Hengki Eko Putra, Dennish Ari Putro, Tri Wahyu Hadi, Edi Riawan, Junnaedhi Dewa Gede, Aditia Rojali, Fariza Dian Prasetyo, Yudhistira Satya Pribadi, Dita Fatria Andarini, Mila Khaerunisa, Raditya Hanung Prakoswa

Abstract:

Catastrophe risk management can only be done if we are able to calculate the exposed risks. Jakarta is an important city economically, socially, and politically, and at the same time it is exposed to severe floods. On the other hand, flood risk calculation is still very limited in the area. This study has calculated the flood risk for Jakarta using the two-dimensional model ANUGA. The two-dimensional model ANUGA and the one-dimensional model HEC-RAS are used to calculate the risk of flooding from 13 major rivers in Jakarta. ANUGA can simulate the physical and dynamical interaction of streamflow with river geometry and land cover to produce a 1-meter resolution inundation map. The streamflow values used as model input are obtained from a hydrological analysis of rainfall data using the hydrologic model HEC-HMS. The probabilistic streamflow is derived from probabilistic rainfall using the Log-Pearson III, Normal, and Gumbel statistical distributions, with goodness of fit checked using the Chi-Square and Kolmogorov-Smirnov tests. The 2007 flood event is used as a benchmark to evaluate the accuracy of the model output. Property damage estimates were calculated based on flood depth for the 1-, 5-, 10-, 25-, 50-, and 100-year return periods against housing value data from BPS-Statistics Indonesia and the Centre for Research and Development of Housing and Settlements, Ministry of Public Works Indonesia. The vulnerability factor was derived from flood insurance claims. Jakarta's flood loss estimates for the return periods of 1, 5, 10, 25, 50, and 100 years are, respectively, Rp 1.30 t; Rp 16.18 t; Rp 16.85 t; Rp 21.21 t; Rp 24.32 t; and Rp 24.67 t, out of a total building value of Rp 434.43 t.

Keywords: 2D hydrodynamic model, ANUGA, flood, flood modeling

Procedia PDF Downloads 255
1494 Optimal Hedging of a Portfolio of European Options in an Extended Binomial Model under Proportional Transaction Costs

Authors: Norm Josephy, Lucy Kimball, Victoria Steblovskaya

Abstract:

Hedging of a portfolio of European options under proportional transaction costs is considered. Our discrete-time financial market model extends the binomial market model with transaction costs to the case where the underlying stock price ratios are distributed over a bounded interval rather than over a two-point set. An optimal hedging strategy is chosen from a set of admissible non-self-financing hedging strategies. Our approach to optimal hedging of a portfolio of options is based on a theoretical foundation that includes the determination of a no-arbitrage option price interval, as well as on properties of the non-self-financing strategies and their residuals. A computational algorithm for optimizing an investor-relevant criterion over the set of admissible non-self-financing hedging strategies is developed. The applicability of our approach is demonstrated using both simulated data and real market data.

Keywords: extended binomial model, non-self-financing hedging, optimization, proportional transaction costs

Procedia PDF Downloads 236
1493 Development of an Erodible Matrix Drug Delivery Platform for Controlled Delivery of Non-Steroidal Anti-Inflammatory Drugs Using a Melt Granulation Process

Authors: A. Hilsana, Vinay U. Rao, M. Sudhakar

Abstract:

Even though a number of non-steroidal anti-inflammatory drugs (NSAIDs) with different chemistries are available, they share a common solubility characteristic: they are relatively more soluble in an alkaline environment and practically insoluble in an acidic environment. This work deals with developing a wax matrix drug delivery platform for controlled delivery of three model NSAIDs, diclofenac sodium (DNa), mefenamic acid (MA), and naproxen (NPX), using the melt granulation technique. The aim of developing the platform was to gain a general understanding of how an erodible matrix system modulates drug delivery rate and extent and how it can be optimized to give a delivery system that releases the drug as per a common target product profile (TPP). Commonly used waxes, such as cetostearyl alcohol and stearic acid, were used singly and in combination to achieve a TPP of 15 to 35% release in 1 hour and not less than 80% Q in 24 hours. A full factorial design of experiments was followed for optimization of the formulation.

Keywords: NSAIDs, controlled delivery, target product profile, melt granulation

Procedia PDF Downloads 317
1492 Bi-Objective Optimization for Sustainable Supply Chain Network Design in Omnichannel

Authors: Veerpaul Maan, Gaurav Mishra

Abstract:

The evolution of omnichannel retailing has revolutionized organizations' supply chains by enhancing the customer shopping experience. To leverage the benefits of omnichannel, these organizations need to develop well-integrated multiple distribution channels. Adopting an omnichannel system in the supply chain has resulted in structuring and reconfiguring the practices of the traditional supply chain distribution network. In this paper, a multiple distribution supply chain network (MDSCN) is proposed which integrates online giants with a local retailer's distribution network in an uncertain environment, with sustainability then incorporated. To incorporate sustainability, an additional objective function is added to reduce carbon emissions by minimizing the travel distance of the product. Through this proposed model, customers are free to access products and services through their channel of choice, which increases their convenience, reach, and satisfaction. Further, a numerical illustration is presented along with an interpretation of the results to validate the proposed model.

Keywords: sustainable supply chain network, omnichannel, multiple distribution supply chain network, integrate multiple distribution channels

Procedia PDF Downloads 203
1491 Modeling and Simulation Analysis and Design of Components of the Microgrid Prototype System

Authors: Draou Azeddine, Mazin Alahmadi, Abdulrahmane Alkassem, Alamri Abdullah

Abstract:

The demand for electric power in Saudi Arabia is steadily increasing with economic growth. More power plants should be installed to increase generation capacity and meet demand. Electricity in Saudi Arabia is mainly dependent on fossil fuels, which are a major problem as they deplete natural resources and increase CO₂ emissions. In this research work, performance and techno-economic analyses are conducted to evaluate a microgrid system based on hybrid PV/wind/diesel power sources as a stand-alone system for rural electrification in Saudi Arabia. The total power flow, maximum power point tracking (MPPT) efficiency, effectiveness of the proposed control strategy, and total harmonic distortion (THD) are analyzed in the MATLAB/Simulink environment. Various simulation studies have been carried out under different irradiation conditions. The sizing, optimization, and economic feasibility analysis were performed using the HOMER energy software.

Keywords: wind, solar, microgrid, energy

Procedia PDF Downloads 86
1490 Model-Based Process Development for the Comparison of a Radial Riveting and Roller Burnishing Process in Mechanical Joining Technology

Authors: Tobias Beyer, Christoph Friedrich

Abstract:

Modern simulation methodology using finite element models is nowadays a recognized tool for product design and optimization. Likewise, manufacturing process design is increasingly becoming a focus of simulation methodology in order to enable sustainable results based on fewer real-life tests here as well. In this article, two process simulations, radial riveting and roller burnishing, used for the mechanical joining of components, are explained. In the first step, the required boundary conditions are developed and implemented in the respective simulation models. This is followed by process space validation. With the help of the validated models, the interdependencies of the input parameters are investigated and evaluated by means of sensitivity analyses. Limit case investigations are carried out and evaluated with the aid of the process simulations. Likewise, a comparison of the two joining methods with each other becomes possible.

Keywords: FEM, model-based process development, process simulation, radial riveting, roller burnishing, sensitivity analysis

Procedia PDF Downloads 91
1489 Smart Production Planning: The Case of Aluminium Foundry

Authors: Samira Alvandi

Abstract:

In the context of the circular economy, production planning aims to eliminate waste and emissions and maximize resource efficiency. Historically, production planning has been challenged by arrays of uncertainty and complexity arising from the interdependence and variability of products, processes, and systems. Manufacturers worldwide are facing new challenges in tackling various environmental issues such as climate change, resource depletion, and land degradation. To manage the inherent complexity and uncertainty while maintaining profitability, the manufacturing sector needs a holistic framework that supports energy efficiency and carbon emission reduction schemes. The proposed framework addresses these challenges and integrates simulation modeling with optimization to find an optimal machine-job allocation that maximizes throughput while minimizing total energy consumption and lead time. An aluminium refinery facility in western Sydney, Australia, is used as an exemplar to validate the proposed framework.

Keywords: smart production planning, simulation-optimisation, energy aware capacity planning, energy intensive industries

Procedia PDF Downloads 46
1488 Investigation of Comfort Properties of Knitted Fabrics

Authors: Mehmet Karahan, Nevin Karahan

Abstract:

The water and air permeability and thermal resistance of fabrics are important attributes which strongly influence the thermo-physiological comfort of sportswear fabrics in different environmental conditions. In this work, terry and fleece fabrics were developed by varying the fiber content and areal density of the fabrics. The thermo-physical properties, including air permeability, water vapor permeability, and thermal resistance, of the developed fabrics were then analyzed before and after washing. The multi-response optimization of the thermo-physiological comfort properties was done using principal component analysis (PCA) and the Taguchi signal-to-noise ratio (PCA-S/N ratio) to identify the optimal properties. It was found that the selected parameters had a significant effect on the thermo-physiological comfort properties of the knitted fabrics. The PCA analysis showed that, before washing, a 100% cotton fabric with an areal weight of 220 g·m⁻² gave optimum values of thermo-physiological comfort.

Keywords: thermo-physiological comfort, fleece knitted fabric, air permeability, water vapor transmission, cotton/polyester

Procedia PDF Downloads 93
1487 Software Verification of Systematic Resampling for Optimization of Particle Filters

Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey

Abstract:

Systematic resampling is the most popular resampling method used in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of possible samples taken of a particle is equal to the floor/ceiling value of the particle weight divided by the sampling interval, respectively. This allows for the creation of a randomness spectrum within which each resampling method falls. Methods at the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Although lower randomness allows for error by creating a larger bias towards the size of the weight, this bias creates vulnerabilities to noise in the environment, e.g., jamming. In conclusion, this is a first step in characterizing each resampling method. It will allow target-tracking engineers to pick the best resampling method for their environment instead of choosing the most popularly used one.
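To make the verified property tangible, here is a minimal systematic-resampling sketch (a standard formulation, not the verified SPARK code): with N particles the sampling interval is u = 1/N, and each particle ends up copied either floor(w_i / u) or ceil(w_i / u) times.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: one random offset, then N evenly spaced pointers
    swept across the cumulative weights. Returns the indices of the survivors."""
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # evenly spaced in [0, 1)
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against round-off
    return np.searchsorted(cumulative, positions)

w = np.array([0.05, 0.35, 0.10, 0.50])
print(systematic_resample(w))   # each index i appears floor or ceil of w[i]*4 times
```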

Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking

Procedia PDF Downloads 61