Search results for: agent based simulation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 31179

30759 Improvements in Double Q-Learning for Anomalous Radiation Source Searching

Authors: Bo-Bin Xiao, Chia-Yi Liu

Abstract:

In the task of searching for anomalous radiation sources, personnel holding radiation detectors may be exposed to unnecessary radiation risk, so automating the search with machines is a necessary step. This research applies several sophisticated deep reinforcement learning algorithms, namely double Q-learning, dueling networks, and NoisyNet, to the radiation source search. The simulation environment, a 10×10 grid containing one shielding wall, supports the development of the AI model through training over 1 million episodes. In each training episode, the radiation source position, radiation source intensity, agent position, shielding wall position, and shielding wall length are all set randomly. The three algorithms are used to train AI models in four environments in which the shielding wall is, respectively, a fully shielding wall, a lead wall, a concrete wall, or a lead or concrete wall appearing at random. The 12 best-performing AI models are selected by observing the reward values during training and are evaluated against a gradient search algorithm. The results show that every AI model, regardless of algorithm, performs far better than the gradient search algorithm. Moreover, as the simulation environment becomes more complex, the AI model combining double DQN with the dueling and NoisyNet algorithms performs better.
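
As a minimal sketch of the tabular double Q-learning update underlying the first of these algorithms (the grid encoding, reward scheme, and environment dynamics below are simplified assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)
N_STATES, N_ACTIONS = 10 * 10, 4          # 10x10 grid; moves: up/down/left/right
QA = np.zeros((N_STATES, N_ACTIONS))      # double Q-learning keeps two tables
QB = np.zeros((N_STATES, N_ACTIONS))
alpha, gamma, eps = 0.1, 0.95, 0.1

def step(s, a):
    """Hypothetical environment: returns (next_state, reward, done).
    A real source-search environment would model detector counts,
    shielding attenuation, and a randomly placed source."""
    s2 = (s + [10, -10, 1, -1][a]) % N_STATES
    r = 1.0 if s2 == 42 else -0.01        # assumed: reward at the source cell
    return s2, r, s2 == 42

s = rng.integers(N_STATES)
for _ in range(10_000):
    q = QA[s] + QB[s]
    a = rng.integers(N_ACTIONS) if rng.random() < eps else int(np.argmax(q))
    s2, r, done = step(s, a)
    if rng.random() < 0.5:                # update one table, evaluate with the other
        a_star = int(np.argmax(QA[s2]))
        QA[s, a] += alpha * (r + gamma * QB[s2, a_star] - QA[s, a])
    else:
        b_star = int(np.argmax(QB[s2]))
        QB[s, a] += alpha * (r + gamma * QA[s2, b_star] - QB[s, a])
    s = rng.integers(N_STATES) if done else s2
```

The key point is that each table is updated with the action it selects but with the value assigned by the other table, which reduces the overestimation bias of standard Q-learning.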

Keywords: double Q learning, dueling network, NoisyNet, source searching

Procedia PDF Downloads 85
30758 Analysis of Information Sharing and Capacity Constraint on Backlog Bullwhip Effect in Two Level Supply Chain

Authors: Matloub Hussain

Abstract:

This paper investigates the impact of information sharing and capacity constraints on the backlog bullwhip effect in an Automatic Pipeline Inventory and Order Based Production Control System (APIOBPCS). System dynamics simulation using the iThink software has been applied. It has been found that smooth ordering by Tier 1 can be achieved when Tier 1 has medium capacity constraints. Simulation experiments also show that information sharing helps to reduce the backlog bullwhip effect by 50% in capacitated supply chains. This knowledge is of value in itself, giving supply chain operations managers and designers a practical way of controlling the backlog bullwhip effect. Future work should investigate the total cost implications of capacity constraints and safety stocks in multi-echelon supply chains.
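
The abstract does not reproduce the APIOBPCS equations; the following toy two-echelon simulation, with an assumed order-up-to heuristic, merely illustrates how a bullwhip ratio (order variance over demand variance) can be measured with and without information sharing:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(share_info: bool, periods: int = 2000):
    """Toy two-echelon chain: the retailer sees customer demand; Tier 1 either
    sees that demand directly (information sharing) or only the retailer's
    orders. Stylized order policy with a moving-average forecast, not APIOBPCS."""
    demand = 100 + rng.normal(0, 10, periods)
    retailer_orders = np.empty(periods)
    tier1_orders = np.empty(periods)
    for t in range(periods):
        d_hat = demand[max(0, t - 4):t + 1].mean()        # retailer forecast
        retailer_orders[t] = max(0.0, 2 * d_hat - demand[t])
        signal = demand if share_info else retailer_orders
        s_hat = signal[max(0, t - 4):t + 1].mean()        # Tier 1 forecast
        tier1_orders[t] = max(0.0, 2 * s_hat - signal[t])
    return tier1_orders.var() / demand.var()              # bullwhip ratio

print("bullwhip without sharing:", round(simulate(False), 2))
print("bullwhip with sharing:   ", round(simulate(True), 2))
```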

Keywords: supply chain dynamics, information sharing, capacity constraints, simulation, APIOBPCS

Procedia PDF Downloads 293
30757 Performance Evaluation of Flexible Manufacturing System: A Simulation Study

Authors: Mohammed Ali

Abstract:

In this paper, a flexible manufacturing system is evaluated under different manufacturing strategies. The objective is to test the impact of pallets and routing flexibility on system performance under different sequencing rules, dispatching rules, and unbalanced load conditions. A computer simulation model is developed to evaluate the effects of the aforementioned manufacturing strategies on the make-span performance of the flexible manufacturing system. The impact of the number of pallets is shown at different levels of routing flexibility. The same manufacturing system is modeled under different combinations of sequencing and dispatching rules, a series of simulation experiments is conducted, and the results are analyzed. The simulation results show that pallets and routing flexibility have a measurable impact on the performance of the system.
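
As a hint of what such experiments involve, here is a minimal list-scheduling comparison of dispatching rules by make-span (machine and job counts and the rules are illustrative assumptions; the paper's model additionally covers pallets and routing flexibility):

```python
import random

random.seed(2)
jobs = [random.uniform(1, 10) for _ in range(40)]   # job processing times
N_MACHINES = 4

def makespan(job_order, n_machines):
    """List-schedule jobs in the given order on the first machine to free up."""
    free_at = [0.0] * n_machines
    for p in job_order:
        m = free_at.index(min(free_at))
        free_at[m] += p
    return max(free_at)

fifo = jobs                              # first-in-first-out
spt = sorted(jobs)                       # shortest processing time first
lpt = sorted(jobs, reverse=True)         # longest processing time first
for name, order in [("FIFO", fifo), ("SPT", spt), ("LPT", lpt)]:
    print(name, round(makespan(order, N_MACHINES), 2))
```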

Keywords: flexibility, flexible manufacturing system, pallets, make-span, simulation

Procedia PDF Downloads 400
30756 Knowledge Creation Environment in the Iranian Universities: A Case Study

Authors: Mahdi Shaghaghi, Amir Ghaebi, Fariba Ahmadi

Abstract:

Purpose: The main purpose of the present research is to analyze the knowledge creation environment at Alzahra University, a typical Iranian university, using a combination of the i-System and Ba models. Such a study is necessary for understanding the determinants of knowledge creation at the university. Methodology: This applied study uses a descriptive survey method. A combination of the i-System and Ba models has been used to analyze the knowledge creation environment at Alzahra University. The i-System consists of five constructs: intervention (input), intelligence (process), involvement (process), imagination (process), and integration (output). The Ba environment has three pillars, namely the infrastructure, the agent, and the information. The integration of these two models resulted in 11 constructs: intervention (input); infrastructure-intelligence, agent-intelligence, information-intelligence (process); infrastructure-involvement, agent-involvement, information-involvement (process); infrastructure-imagination, agent-imagination, information-imagination (process); and integration (output). These 11 constructs were incorporated into a 52-statement questionnaire whose validity and reliability were examined and confirmed. The statistical population comprised the faculty members of Alzahra University (344 people), from whom 181 participants were selected through stratified random sampling. Descriptive statistics, the binomial test, regression analysis, and structural equation modeling (SEM) were used to analyze the data. Findings: Among the 11 constructs, the levels of the intervention, information-intelligence, infrastructure-involvement, and agent-imagination constructs were average and not acceptable. The levels of the infrastructure-intelligence and information-imagination constructs ranged from average to low. The levels of the agent-intelligence and information-involvement constructs were exactly average. The level of the infrastructure-imagination construct was average to high and thus considered acceptable. The levels of the agent-involvement and integration constructs were above average and in a highly acceptable condition. Furthermore, the regression analysis indicated that only two constructs, information-imagination and agent-involvement, correlate positively and significantly with the integration construct. The structural equation modeling also revealed that the intervention, intelligence, and involvement constructs are related to the integration construct with the complete mediation of imagination. Discussion and conclusion: The present research suggests that knowledge creation at Alzahra University complies only partially with the combined i-System and Ba model. Contrary to the model, the intervention, intelligence, and involvement constructs are not directly related to the integration construct, which seems to have three implications: 1) information sources are not frequently used to assess and identify research biases; 2) problem finding is probably of less concern at the end of studies, at the time of assessment and validation; and 3) the involvement of others plays a smaller role in the summarization, assessment, and validation of research.

Keywords: i-System, Ba model, knowledge creation, knowledge management, knowledge creation environment, Iranian universities

Procedia PDF Downloads 86
30755 Reducing Component Stress during Encapsulation of Electronics: A Simulative Examination of Thermoplastic Foam Injection Molding

Authors: Constantin Ott, Dietmar Drummer

Abstract:

The direct encapsulation of electronic components is an effective way of protecting them against external influences. In addition to achieving a sufficient protective effect, there are two other major challenges in satisfying the increasing demand for encapsulated circuit boards: the encapsulation process should be suitable for mass production while imposing only a low load on the components. Injection molding is well suited to large-series production but typically causes high component stress. This article pursues two aims: first, the development of a calculation model that allows the occurring forces to be estimated from process variables and material parameters; second, the evaluation of a new approach to stress reduction by means of thermoplastic foam injection molding. For this purpose, simulation-based process data were generated with the Moldflow simulation tool, and component stresses were calculated from them with the calculation model. The paper thus provides a model for estimating the forces occurring during overmolding and derives a method for reducing these forces. The suitability of this approach was clearly demonstrated: a significant reduction in shear forces during overmolding was achieved, and a process was developed that meets the two main requirements of direct encapsulation in addition to a high protective effect.
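
The paper's calculation model is not reproduced here; the following back-of-envelope sketch only illustrates the general idea of turning process variables and material parameters into a force estimate, under an assumed Newtonian melt:

```python
# Rough shear-force estimate on an overmolded component (all values assumed).
# tau = eta * gamma_dot (Newtonian approximation); F = tau * wetted area.
eta = 300.0          # melt viscosity, Pa*s (assumed)
gamma_dot = 1000.0   # shear rate near the component surface, 1/s (assumed)
area = 4e-4          # wetted component surface, m^2 (20 mm x 20 mm)

tau = eta * gamma_dot          # shear stress, Pa
force = tau * area             # drag force on the component, N
print(f"shear stress {tau/1e3:.0f} kPa -> force {force:.1f} N")
# Foam injection molding lowers cavity pressure and effective viscosity,
# which is the lever used in the paper to reduce these forces.
```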

Keywords: encapsulation, stress reduction, foam-injection-molding, simulation

Procedia PDF Downloads 107
30754 Case-Based Reasoning for Build Order in Real-Time Strategy Games

Authors: Ben G. Weber, Michael Mateas

Abstract:

We present a case-based reasoning technique for selecting build orders in a real-time strategy game. The case retrieval process generalizes features of the game state and selects cases using domain-specific recall methods, which perform exact matching on a subset of the case features. We demonstrate the performance of the technique by implementing it as a component of the integrated agent framework of McCoy and Mateas. Our results demonstrate that the technique outperforms nearest-neighbor retrieval when imperfect information is enforced in a real-time strategy game.
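
A minimal sketch of this style of retrieval, with an assumed feature-generalization scheme and toy cases (not the authors' implementation):

```python
from dataclasses import dataclass

@dataclass
class Case:
    features: dict      # generalized game-state features
    build_order: list   # the build order to reuse

def generalize(state: dict) -> dict:
    """Bucket raw counts so similar states match exactly (assumed scheme)."""
    return {k: v // 5 for k, v in state.items()}

def recall(case_base: list, state: dict, keys: tuple):
    """Domain-specific recall: exact match on a subset of the case features."""
    g = generalize(state)
    for case in case_base:
        if all(case.features.get(k) == g.get(k) for k in keys):
            return case
    return None

case_base = [Case({"workers": 3, "barracks": 0}, ["supply", "barracks"]),
             Case({"workers": 4, "barracks": 1}, ["marine", "expand"])]
match = recall(case_base, {"workers": 17, "barracks": 0}, keys=("workers",))
print(match.build_order if match else "no case recalled")
```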

Keywords: case based reasoning, real time strategy systems, requirements elicitation, requirement analyst, artificial intelligence

Procedia PDF Downloads 413
30753 Simulation Model of Biosensor Based on Gold Nanoparticles

Authors: Kholod Hajo

Abstract:

In this study, COMSOL Multiphysics was used to design lateral flow biosensors (LFBs), which offer advantages in cost, simplicity, rapidity, stability, and portability, making LFBs popular in the biomedical, agricultural, food, and environmental sciences. The study focuses on a simulation model of a biosensor based on gold nanoparticles (GNPs) built with COMSOL Multiphysics. The magnitude of the laminar velocity field in the flow cell, the concentration distribution in the analyte stream, the surface coverage of adsorbed species, and the average fractional surface coverage of adsorbed analyte were obtained from the model, with acceptable results, and several suggestions were made for functionalizing the GNPs and increasing the accuracy of the biosensor design.
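
The COMSOL model itself is not shown; as a rough stand-in, the fractional surface coverage of adsorbed analyte can be illustrated with a Langmuir-type binding ODE integrated by forward Euler (all rate constants and concentrations assumed):

```python
import numpy as np

# Langmuir-type surface adsorption, forward Euler integration.
k_on, k_off = 1e5, 1e-3      # association (1/(M*s)) and dissociation (1/s) rates
c_analyte = 1e-9             # analyte concentration at the sensor surface, M
dt, steps = 0.1, 36000       # 0.1 s step, one hour of binding

theta = 0.0                  # fractional surface coverage of the GNP test line
coverage = np.empty(steps)
for i in range(steps):
    dtheta = k_on * c_analyte * (1.0 - theta) - k_off * theta
    theta += dt * dtheta
    coverage[i] = theta
print(f"coverage after 1 h: {theta:.3f}")   # approaches k_on*c/(k_on*c + k_off)
```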

Keywords: model, gold nanoparticles, biosensor, COMSOL Multiphysics

Procedia PDF Downloads 233
30752 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows

Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono

Abstract:

A mesh refinement technique for the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of the turbulence modeling and the large number of grid points required for a good numerical solution. In particular, when a large domain is simulated with a structured grid, the number of grid points can grow until the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The code is written in modern Fortran (the 2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.
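
As a simplified stand-in for the least-squares reconstruction idea (the actual code is staggered Fortran with quasi-ENO stencil selection; this sketch uses a plain least-squares polynomial fit):

```python
import numpy as np

def reconstruct_at_interface(x_cells, u_cells, x_face, order=2):
    """High-order estimate of u at a face from neighboring cell values via a
    least-squares polynomial fit (a simplified stand-in for the code's
    least-squares quasi-ENO interpolation)."""
    coeffs = np.polyfit(x_cells, u_cells, deg=order)   # LSQ when points > order+1
    return np.polyval(coeffs, x_face)

# Coarse/fine interface: four cell centers straddling a face at x = 0.5
x_cells = np.array([0.05, 0.25, 0.45, 0.55])
u_cells = np.sin(2 * np.pi * x_cells)
u_face = reconstruct_at_interface(x_cells, u_cells, x_face=0.5)
print(u_face, "vs exact", np.sin(np.pi))
```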

Keywords: LES, multi-resolution, ENO, fortran

Procedia PDF Downloads 342
30751 Adsorptive Waste Heat Based Air-Conditioning Control Strategy for Automotives

Authors: Indrasen Raghupatruni, Michael Glora, Ralf Diekmann, Thomas Demmer

Abstract:

As automotive technology moves rapidly towards hybridization and electrification to curb emissions and improve fuel efficiency, air-conditioning systems in passenger cars have not caught up with this trend and remain among the major energy consumers. Adsorption-based air-conditioning systems, e.g. with a silica-gel/water pair, which are already in use in residential and commercial applications, are now being considered as a technology leap once proven feasible for passenger cars. In this paper we discuss a methodology for, and the challenges and feasibility of, implementing an adsorption-based air-conditioning system in a passenger car utilizing exhaust waste heat. We also propose an optimized control strategy, with interfaces to the engine control unit of the vehicle, for operating this system with reasonable efficiency, supported by our simulation and validation results in a prototype vehicle and compared with existing simulation-based and experimental implementations. Finally, we discuss the influence of start-stop and hybrid systems on the operation strategy of the adsorption air-conditioning system.

Keywords: adsorption air-conditioning, feasibility study, optimized control strategy, prototype vehicle

Procedia PDF Downloads 410
30750 Plant Layout Analysis by Computer Simulation for Electronic Manufacturing Service Plant

Authors: D. Visuwan, B. Phruksaphanrat

Abstract:

In this research, computer simulation is used for Electronic Manufacturing Service (EMS) plant layout analysis. The current layout of this manufacturing plant is a process layout, which is not suitable given the high-volume, high-variety environment of an EMS; moreover, quick response and high flexibility are also needed. A cellular manufacturing layout was therefore designed for the selected group of products. Systematic layout planning (SLP) was used to analyse and design the possible cellular layouts for the factory, and the cellular layout was selected based on the main criteria of the plant. Computer simulation was used to analyse and compare the performance of the proposed cellular layout and the current layout. It is found that the proposed cellular layout generates better performance than the current layout.

Keywords: layout, electronic manufacturing service plant, computer simulation, cellular manufacturing system

Procedia PDF Downloads 285
30749 Characteristic Function in Estimation of Probability Distribution Moments

Authors: Vladimir S. Timofeev

Abstract:

This article considers the problem of estimating distributional moments. A new approach to moment estimation based on the characteristic function is proposed. Using a statistical simulation technique, the author shows that the new approach has certain robustness properties. The derivatives of the characteristic function are computed by numerical differentiation. The results obtained confirm that the idea works efficiently and can be recommended for statistical applications.
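
The core identity is E[X^k] = (-i)^k φ^(k)(0) for the characteristic function φ(t) = E[e^{itX}]; a minimal sketch of the approach, estimating φ empirically and differentiating numerically (sample and step size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=2.0, scale=1.5, size=100_000)   # sample with known moments

def ecf(t, data):
    """Empirical characteristic function phi(t) = E[exp(i t X)]."""
    return np.mean(np.exp(1j * t * data))

h = 1e-3                                            # step for central differences
# E[X]   = -i * phi'(0)  ~ -i * (phi(h) - phi(-h)) / (2h)
m1 = (-1j * (ecf(h, x) - ecf(-h, x)) / (2 * h)).real
# E[X^2] = -phi''(0)     ~ -(phi(h) - 2*phi(0) + phi(-h)) / h^2
m2 = (-(ecf(h, x) - 2 * ecf(0.0, x) + ecf(-h, x)) / h**2).real
print(f"mean ~ {m1:.3f} (true 2.0), variance ~ {m2 - m1**2:.3f} (true 2.25)")
```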

Keywords: characteristic function, distributional moments, robustness, outlier, statistical estimation problem, statistical simulation

Procedia PDF Downloads 481
30748 Analysis of Silicon Controlled Rectifier-Based Electrostatic Discharge Protection Circuits with Electrical Characteristics for the 5V Power Clamp

Authors: Jun-Geol Park, Kyoung-Il Do, Min-Ju Kwon, Kyung-Hyun Park, Yong-Seo Koo

Abstract:

This paper analyzes SCR (Silicon Controlled Rectifier)-based ESD (Electrostatic Discharge) protection circuits with respect to their turn-on time characteristics. The structures are the LVTSCR (Low Voltage Triggered SCR), the ZTSCR (Zener Triggered SCR), and the PTSCR (P-Substrate Triggered SCR), all designed for the 5 V power clamp. In general, structures with a low trigger voltage turn on faster than other structures. All of the ESD protection circuits achieve a low trigger voltage: the LVTSCR by using an N+ bridge region, the ZTSCR by using a Zener diode structure, and the PTSCR by increasing the trigger current. The turn-on time comparison was conducted with the Synopsys TCAD simulator. According to the simulation results, the LVTSCR has a turn-on time of 2.8 ns, the ZTSCR of 2.1 ns, and the PTSCR of 2.4 ns. The HBM simulation results, however, show that the PTSCR is the more robust structure, reaching 430 K under the 8 kV HBM standard compared with 450 K for the LVTSCR and 495 K for the ZTSCR. Therefore, the PTSCR is the most effective ESD protection circuit for the 5 V power clamp.

Keywords: ESD, SCR, turn-on time, trigger voltage, power clamp

Procedia PDF Downloads 324
30747 Performance Evaluation and Plugging Characteristics of Controllable Self-Aggregating Colloidal Particle Profile Control Agent

Authors: Zhiguo Yang, Xiangan Yue, Minglu Shao, Yue Yang, Rongjie Yan

Abstract:

Deep profile control is difficult to achieve in low-permeability heterogeneous reservoirs because of the small pore throats and easy water channeling, and traditional polymer microspheres suffer from a contradiction between injectability and plugging ability. To resolve this contradiction, controllable self-aggregating colloidal particles (CSA) bearing amide groups on the microsphere surface were prepared by emulsion polymerization of styrene and acrylamide. A dispersed solution of CSA colloidal particles, whose particle size is much smaller than the diameter of the pore throats, was injected into the reservoir. When the microspheres migrate to the deep part of the reservoir, the CSA colloidal particles automatically self-aggregate into large particle clusters under the action of the shielding agent and the control agent, thereby plugging the water channels. In this paper, the morphology, temperature resistance, and self-aggregation properties of the CSA microspheres were studied by transmission electron microscopy (TEM) and bottle tests. The results showed that the CSA microspheres exhibit a heterogeneous core-shell structure, good dispersion, and outstanding thermal stability: they remain regular, uniform spheres after aging at 100°C for 35 days. As the cation concentration increases, the self-aggregation time of the CSA is gradually shortened, and bivalent cations have a greater influence than monovalent cations. Core flooding experiments showed that the CSA polymer microspheres have good injection properties and that CSA particle clusters can effectively plug the water channels and migrate to the deep part of the reservoir for profile control.

Keywords: heterogeneous reservoir, deep profile control, emulsion polymerization, colloidal particles, plugging characteristic

Procedia PDF Downloads 201
30746 High Performance Computing Enhancement of Agent-Based Economic Models

Authors: Amit Gill, Lalith Wijerathne, Sebastian Poledna

Abstract:

This research presents the details of the implementation of a high performance computing (HPC) extension of agent-based economic models (ABEMs) to simulate hundreds of millions of heterogeneous agents. ABEMs offer an alternative approach that studies the economy as a dynamic system of interacting heterogeneous agents, and they are gaining popularity as an alternative to standard economic models. Over the last decade, ABEMs have been increasingly applied to various problems related to monetary policy, bank regulations, etc. When it comes to predicting the effects of local economic disruptions (major disasters, changes in policies, exogenous shocks, etc.) on the economy of a country or region, it is pertinent to study how the disruptions cascade through every single economic entity, affecting its decisions and interactions, and eventually affect the macroeconomic parameters. However, such simulations with hundreds of millions of agents are hindered by the lack of HPC-enhanced ABEMs. To address this, a scalable distributed memory parallel (DMP) implementation of ABEMs has been developed using the message passing interface (MPI). A balanced distribution of the computational load among the MPI processes (i.e. CPU cores) of computer clusters, while taking all interactions among agents into account, is a major challenge for scalable DMP implementations. Economic agents interact on several random graphs, some of which are centralized (e.g. credit networks) whereas others are dense with random links (e.g. consumption markets). The agents are partitioned into mutually exclusive subsets based on a representative employer-employee interaction graph, while the remaining graphs are made available at a minimum communication cost. To minimize the number of communications among MPI processes, real-life solutions such as the introduction of recruitment agencies, sales outlets, local banks, and local branches of government in each MPI process are adopted. Efficient communication among MPI processes is achieved by combining MPI derived data types with the features of the latest MPI functions, and most communications are overlapped with computations, significantly reducing the communication overhead. The current implementation is capable of simulating a small open economy: as an example, a single time step of a 1:1 scale model of Austria (about 9 million inhabitants and 600,000 businesses) can be simulated in 15 seconds. The implementation is being further enhanced to simulate a 1:1 model of the Euro-zone (322 million agents).
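
A minimal mpi4py sketch of the communication-minimizing pattern described above (the block partitioning, the toy market step, and all names are assumptions, not the authors' code):

```python
# Run with: mpiexec -n 4 python abem_sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_AGENTS = 1_000_000
lo = rank * N_AGENTS // size          # contiguous block partition of agents,
hi = (rank + 1) * N_AGENTS // size    # standing in for the employer-employee
wealth = np.random.default_rng(rank).gamma(2.0, 50.0, hi - lo)  # graph split

# One "market" step: each process trades locally, then only an aggregate
# (not per-agent data) crosses process boundaries -- the communication-
# minimizing idea behind local banks/sales outlets in the paper.
local_demand = wealth.sum() * 0.1
global_demand = comm.allreduce(local_demand, op=MPI.SUM)
wealth *= 1.0 + 0.01 * (global_demand / N_AGENTS - 5.0) / 5.0   # toy feedback

if rank == 0:
    print(f"{size} ranks, {N_AGENTS} agents, global demand {global_demand:,.0f}")
```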

Keywords: agent-based economic model, high performance computing, MPI-communication, MPI-process

Procedia PDF Downloads 108
30745 Power Integrity Analysis of Power Delivery System in High Speed Digital FPGA Board

Authors: Anil Kumar Pandey

Abstract:

Power plane noise is the most significant source of signal integrity (SI) issues in a high-speed digital design. In this paper, a power integrity (PI) analysis of the multiple power planes in the power delivery system of a 12-layer high-speed FPGA board is presented. All 10 power planes of the high-speed digital board are analyzed separately using a 3D electromagnetics-based PI solver; a transient simulation is then performed on the combined PI data of all planes, together with the voltage regulator modules (VRMs) and 70 current-drawing chips, to obtain the board-level power noise coupling onto different high-speed signals. Decoupling capacitors are placed between the power planes and ground to reduce the power noise coupling with the signals.

Keywords: power integrity, power-aware signal integrity analysis, electromagnetic simulation, channel simulation

Procedia PDF Downloads 414
30744 Emergence of Information Centric Networking and Web Content Mining: A Future Efficient Internet Architecture

Authors: Sajjad Akbar, Rabia Bashir

Abstract:

With the growth in the number of users, Internet usage has evolved, and thanks to its key design principle the Internet has expanded incredibly in size. This tremendous growth has brought new applications (mobile video and cloud computing) as well as new user requirements, i.e. a content distribution environment, mobility, ubiquity, security, trust, etc. Users are more interested in contents than in their communicating peer nodes. The current Internet architecture follows a host-centric networking approach, which is not suitable for these types of applications. With the growing use of multiple interactive applications, the host-centric approach is considered less efficient because it depends on physical location; for this reason, Information Centric Networking (ICN) is considered a potential future Internet architecture. ICN introduces uniquely named data as a core Internet principle, uses a receiver-oriented rather than a sender-oriented approach, and introduces a naming-based information system at the network layer. Although ICN is considered a future Internet architecture, it has drawn much criticism, mainly concerning how ICN will manage the most relevant content; Web Content Mining (WCM) approaches can help with appropriate data management for ICN. To address this issue, this paper contributes by (i) discussing multiple ICN approaches, (ii) analyzing different Web Content Mining approaches, and (iii) creating a new Internet architecture by merging ICN and WCM to solve the data management issues of ICN. From ICN, Content-Centric Networking (CCN) is selected for the new architecture, whereas an agent-based approach from Web Content Mining is selected to find the most appropriate data.

Keywords: agent based web content mining, content centric networking, information centric networking

Procedia PDF Downloads 446
30743 A Human Centered Design of an Exoskeleton Using Multibody Simulation

Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann

Abstract:

Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. During preliminary design, however, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multi-body simulation approach has been enhanced to evaluate actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. Different motion data were therefore gathered and processed to perform a musculoskeletal analysis. The motion data comprise ground reaction forces, electromyography (EMG) data, and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are, in particular, the resulting contact forces and human joint forces, to be compared with admissible values with regard to human physiology. Furthermore, the platform provides feedback for the sizing of the exoskeleton structure in terms of resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.

Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation

Procedia PDF Downloads 137
30742 Evaluation of Easy-to-Use Energy Building Design Tools for Solar Access Analysis in Urban Contexts: Comparison of Friendly Simulation Design Tools for Architectural Practice in the Early Design Stage

Authors: M. Iommi, G. Losco

Abstract:

The current building sector is focused on reducing energy requirements, generating renewable energy, and regenerating existing urban areas. These targets call for a systemic approach that considers several aspects simultaneously, such as climate conditions, lighting conditions, solar radiation, PV potential, etc. Solar access analysis is a well-known method for analyzing solar potential, but in recent years simulation tools have made this type of analysis far more practical, particularly in the early design stage. Today, the study of solar access depends on how rapidly and easily simulation tools can be used during the design process. This study presents a comparison of three simulation tools from the user's point of view, with the aim of highlighting differences in their ease of use. Using a real urban context as a case study, three tools, Ecotect, Townscope, and Heliodon, are tested by building models, running simulations, and examining the capabilities and output results of the solar access analysis. The evaluation of the ease of use of these tools is based on detected parameters and features, such as the types of simulation, the input data requirements, the types of results, etc. As a result, a framework is provided that shows the features and capabilities of each tool and the differences among them in functions, features, and capabilities. The aim of this study is to support users and to improve the integration of solar access simulation tools into the design process.

Keywords: energy building design tools, solar access analysis, solar potential, urban planning

Procedia PDF Downloads 322
30741 Performance Evaluation of Using Genetic Programming Based Surrogate Models for Approximating Simulation Complex Geochemical Transport Processes

Authors: Hamed K. Esfahani, Bithin Datta

Abstract:

Transport of reactive chemical contaminant species in groundwater aquifers is a complex and highly non-linear physical and geochemical process, especially in real-life scenarios. Simulating this transport involves solving complex nonlinear equations and generally requires huge computational time for a given aquifer study area. Developing optimal remediation strategies for aquifers may require repeated solution of such complex numerical simulation models. To overcome this computational limitation and make large numbers of repeated simulations feasible, genetic programming based surrogate models are trained to approximately simulate such complex transport processes. The transport of acid mine drainage, a hazardous pollutant, is first simulated using the numerical simulation model HYDROGEOCHEM 5.0 for a contaminated aquifer at a historic mine site. The simulation results for this illustrative site are then approximated by training and testing a Genetic Programming (GP) based surrogate model. Performance evaluation of the ensemble GP models as surrogates for reactive species transport in groundwater demonstrates the feasibility of their use and the associated computational advantages. The results show the efficiency and feasibility of using ensemble GP surrogate models as approximate simulators of complex hydrogeologic and geochemical processes in a contaminated groundwater aquifer, incorporating the uncertainties of the historic mine site.
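
A hedged sketch of the surrogate idea, with an analytic stand-in for the expensive simulator and the gplearn package assumed as one possible GP implementation (not the authors' tooling):

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor   # assumed available

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(300, 3))            # e.g. source terms, conductivity
y = np.exp(-2 * X[:, 0]) * X[:, 1] + 0.5 * X[:, 2] ** 2   # simulator stand-in

gp = SymbolicRegressor(population_size=500, generations=15,
                       function_set=('add', 'sub', 'mul', 'div'),
                       random_state=0)
gp.fit(X[:200], y[:200])                        # train on part of the "runs"
pred = gp.predict(X[200:])                      # cheap approximate simulator
print("test RMSE:", float(np.sqrt(np.mean((pred - y[200:]) ** 2))))
```

Once trained, the evolved expression replaces the expensive simulator inside the repeated evaluations of a remediation optimization loop.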

Keywords: geochemical transport simulation, acid mine drainage, surrogate models, ensemble genetic programming, contaminated aquifers, mine sites

Procedia PDF Downloads 252
30740 Modeling the Transport of Charge Carriers in the Active Devices MESFET Based of GaInP by the Monte Carlo Method

Authors: N. Massoum, A. Guen. Bouazza, B. Bouazza, A. El Ouchdi

Abstract:

Progress in the integrated circuit industry in recent years has been driven by the continuous miniaturization of transistors. With component dimensions shrinking to 0.1 micron and below, new physical effects come into play that standard two-dimensional (2D) simulators do not consider; in fact, the third dimension matters because the transverse and longitudinal dimensions of the components are of the same order of magnitude. To describe the operation of such components with greater fidelity, simulation tools must be refined and adapted to take these phenomena into account. After an analytical study of the static characteristics of the component in its different operating modes, a numerical simulation of a GaInP MESFET, a field-effect transistor with a submicron gate, is performed. The influence of the gate length is studied. The results are used to determine the optimal geometric and physical parameters of the component for specific applications and uses.
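
A strongly simplified one-dimensional ensemble Monte Carlo loop (free flight plus scattering) conveys the core of the method; the effective mass, field, scattering rate, and post-scattering velocity model below are crude assumptions, not a GaInP band-structure model:

```python
import numpy as np

rng = np.random.default_rng(5)
Q, M_EFF = 1.602e-19, 0.08 * 9.109e-31    # charge; assumed effective mass, kg
E_FIELD = 1e6                             # field along the channel, V/m
GAMMA = 1e13                              # total scattering rate, 1/s
N_ELECTRONS, N_SCATTER = 5000, 200

v_drift = np.zeros(N_ELECTRONS)
t_total = np.zeros(N_ELECTRONS)
v_time_integral = np.zeros(N_ELECTRONS)
for _ in range(N_SCATTER):
    tau = -np.log(rng.random(N_ELECTRONS)) / GAMMA      # free-flight durations
    # velocity integrated over the flight (for time-averaged drift velocity)
    v_time_integral += v_drift * tau + 0.5 * (Q * E_FIELD / M_EFF) * tau**2
    t_total += tau
    v_drift = rng.choice([-1, 1], N_ELECTRONS) * 1e5    # crude scattering reset
print(f"mean drift velocity ~ {np.mean(v_time_integral / t_total):.3e} m/s")
```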

Keywords: Monte Carlo simulation, transient electron transport, MESFET device, GaInP

Procedia PDF Downloads 400
30739 Simulating Lean and Green Correlation in Supply Chain Context

Authors: Rachid Benmoussa, Fatima Ezzahra Essaber, Roland De Guio, Fatima Zahra Ben Moussa

Abstract:

Implementing green practices in supply chain management is a complex task, mainly because ecological, economic, and operational goals are usually in conflict. Green practices may thus face companies' reluctance, because managers can see their implementation as an obvious degradation of lean performance. To implement lean and green practices successfully, companies need relevant decision-making tools that highlight the correlation between them. To contribute to this issue, this work tries to answer the following research question: how can simulation be used to assess the correlation (antagonism or convergence) between lean and green goals? To answer this question, we propose a simulation-based process that measures the correlation between two variables in general. To prove its relevance, an academic logistics case study is used to illustrate all of its stages. It shows, for example, that the lean goal 'Stock' and the green goal 'CO₂ emission' are not linearly correlated.
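
A minimal sketch of such a simulation-based correlation measurement, with an entirely assumed toy logistics model and emission factors:

```python
import numpy as np

rng = np.random.default_rng(6)

def run_scenario(safety_stock):
    """Toy run: returns (avg stock, CO2 per period). Assumption: higher safety
    stock means fewer expedited truck runs but more warehousing emissions."""
    demand = rng.poisson(20, 365)
    stock = np.maximum(safety_stock - (demand - 20), 0)
    expedited = np.maximum(demand - 20 - safety_stock, 0)
    co2 = 0.4 * stock.mean() + 25.0 * expedited.mean()   # assumed factors
    return stock.mean(), co2

levels = np.arange(0, 30, 2)                   # candidate lean policies
results = np.array([run_scenario(s) for s in levels])
stock_vals, co2_vals = results[:, 0], results[:, 1]
r = np.corrcoef(stock_vals, co2_vals)[0, 1]
print(f"Pearson r between 'Stock' (lean) and 'CO2' (green): {r:.2f}")
```

A weak Pearson coefficient across scenarios is exactly the kind of evidence behind the paper's conclusion that the two goals are not linearly correlated.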

Keywords: simulation, lean, green, supply chain

Procedia PDF Downloads 463
30738 WEMax: Virtual Manned Assembly Line Generation

Authors: Won Kyung Ham, Kang Hoon Cho, Sang C. Park

Abstract:

Presented in this paper is the framework of a software tool, 'WEMax', created for the analysis and simulation of manned assembly lines in order to sustain and improve the performance of manufacturing systems. In a manufacturing system, performance measures such as productivity are key to the competitiveness of the output products. The performance of manned assembly lines, however, is difficult to forecast, because human labor is not a factor that computer simulation models or mathematical models can predict. Existing approaches to performance forecasting for manned assembly lines are limited to matters of the human itself, such as ergonomics and workload design, or to simulations that ignore human factors. Consequently, an approach for forecasting and improving manned assembly line performance needs to be researched. As a solution to this problem, this study proposes a framework for the generation and simulation of virtual manned assembly lines, and the framework has been implemented as software.

Keywords: performance forecasting, simulation, virtual manned assembly line, WEMax

Procedia PDF Downloads 310
30737 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process, and it finds application whenever one needs to make an estimate, forecast, or decision under significant uncertainty. The project first performs Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. A simulation algorithm was then developed in MATLAB; the program prompts the user for the input distributions, the parameters associated with each distribution (i.e. mean, standard deviation, minimum, maximum, most likely, etc.), and the desired probability level for which reserves are to be calculated. The algorithm so developed and tested in MATLAB was further implemented in Python, where existing statistics and graph-plotting libraries were imported to improve the results. With PyQt Designer, code for a simple graphical user interface was also written. The resulting plots are then validated against results from the U.S. DOE MonteCarlo software.
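
A minimal volumetric Monte Carlo sketch of the kind of calculation described (the distributions and values are illustrative, not the paper's data set):

```python
import numpy as np

# Volumetric STOIIP Monte Carlo with the standard 7758 bbl/(acre*ft) constant.
rng = np.random.default_rng(7)
n = 100_000
area = rng.triangular(400, 600, 900, n)        # acres
thickness = rng.triangular(20, 35, 60, n)      # ft
porosity = rng.normal(0.18, 0.02, n)
s_oil = rng.uniform(0.60, 0.80, n)             # oil saturation
b_oil = rng.normal(1.2, 0.05, n)               # formation volume factor, RB/STB

stoiip = 7758 * area * thickness * porosity * s_oil / b_oil   # STB
recoverable = stoiip * rng.uniform(0.15, 0.35, n)             # recovery factor

p90, p50, p10 = np.percentile(recoverable, [10, 50, 90])
print(f"P90 {p90/1e6:.1f}  P50 {p50/1e6:.1f}  P10 {p10/1e6:.1f}  MMSTB")
```

Here P90 denotes the value exceeded with 90% probability, the usual reserves convention.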

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 354
30736 Crude Distillation Process Simulation Using Unisim Design Simulator

Authors: C. Patrascioiu, M. Jamali

Abstract:

The paper deals with the simulation of the crude distillation process using the Unisim Design simulator. The necessity of simulating this process is argued both by considerations related to the design of the crude distillation column and by considerations related to the design of advanced control systems. In order to use the Unisim Design simulator for the crude distillation process, the simulators used in Romania were identified and the PRO/II, HYSYS, and Aspen HYSYS simulators were analyzed. This analysis allowed the authors to draw conclusions about successful crude modelling. A first contribution is the implementation of problems specific to petroleum liquid-vapor equilibrium in the Unisim Design simulator. The second major element of the article is the development of the methodology and the simulation program for the crude distillation process using Unisim Design resources. The results obtained validate the proposed methodology and will allow dynamic simulation of the process.

Keywords: crude oil, distillation, simulation, Unisim Design, simulators

Procedia PDF Downloads 219
30735 Practical Simulation Model of Floating-Gate MOS Transistor in Sub 100 nm Technologies

Authors: Zina Saheb, Ezz El-Masry

Abstract:

As CMOS technology scales down, the silicon oxide (SiO2) thickness becomes very thin (a few nanometers). When the SiO2 layer is thinner than 3 nm, gate direct tunneling (DT) leakage current becomes a dominant problem that impacts transistor performance. Floating-gate MOSFETs (FGMOSFETs) are used in many low-voltage and low-power applications, yet most of the available FGMOSFET simulation models for analog circuit design do not account for the gate DT current, and there is no accurate analysis of it. It is crucial to use an accurate model in order to obtain realistic simulation results that effectively account for the impact of DT on FGMOSFET performance.

Keywords: CMOS transistor, direct-tunneling current, floating-gate, gate-leakage current, simulation model

Procedia PDF Downloads 507
30734 Vine Copula Structure among Yield, Price and Weather Variables for Rating Crop Insurance Premium

Authors: Jiemiao Chen, Shuoxun Xu

Abstract:

The main goal of our research is to apply a vine copula, measuring the dependency between price, temperature, and precipitation indices, to calculate a fair crop insurance premium. The research focuses on Worth, Iowa, United States, over the period from 2000 to 2020, where farmers depend on precipitation and average temperature during the corn growth period. Our proposed insurance considers both the natural risk and the price risk in agricultural production. We first estimate the crop distributions using parametric methods based on goodness-of-fit tests, and then a vine copula is applied to model the dependence between yield price, crop yield, and weather indices. Once the vine structure and its parameters are determined based on the AIC/BIC criteria and the forecast price and yield are obtained from an ARIMA model, the crop insurance premium is calculated by the Monte Carlo simulation method using data generated from the vine copula. It is shown that, compared with traditional crop insurance, our proposed insurance is fairer and thus less costly for the farmers and the government.
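
A minimal sketch of the premium calculation step; here the simulated joint yield-price draws are faked with a correlated normal stand-in, whereas in the paper they come from the fitted vine copula:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50_000
z = rng.multivariate_normal([0, 0], [[1, -0.4], [-0.4, 1]], n)
yield_bu = 180 + 25 * z[:, 0]                  # corn yield, bu/acre (illustrative)
price = np.maximum(4.0 + 0.6 * z[:, 1], 0.5)   # $/bu, negatively correlated

revenue = yield_bu * price
guarantee = 0.85 * np.median(revenue)          # assumed 85% revenue coverage
indemnity = np.maximum(guarantee - revenue, 0.0)

premium = indemnity.mean()                     # actuarially fair premium, $/acre
print(f"fair premium: ${premium:.2f}/acre at guarantee ${guarantee:.0f}/acre")
```

Because yield and price move together through the copula (and with the weather indices), the expected indemnity, and hence the fair premium, differs from what independent marginals would give.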

Keywords: vine copula, weather index, crop insurance premium, insurance risk management, Monte Carlo simulation

Procedia PDF Downloads 179
30733 Gas Sweetening Process Simulation: Investigation on Recovering Waste Hydraulic Energy

Authors: Meisam Moghadasi, Hassan Ali Ozgoli, Foad Farhani

Abstract:

In this research, a commercial gas sweetening unit using a methyl-di-ethanol-amine (MDEA) solution is first simulated and incorporated into an integrated model in Aspen HYSYS. For evaluation purposes, the simulation results are then compared with operating data gathered from the South Pars Gas Complex (SPGC). According to the simulation results, a considerable energy potential is attributable to the pressure difference between the absorber and regenerator columns, and this driving force can be exploited in a power recovery turbine (PRT). In the last step, the amount of waste hydraulic energy is calculated and methods for recovering it are investigated.
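
The recoverable hydraulic power follows directly from the flow rate and the pressure drop; a back-of-envelope sketch with assumed (not SPGC) values:

```python
# Hydraulic power available to a power recovery turbine between the
# high-pressure absorber and the low-pressure regeneration side.
q = 0.25            # rich-amine volumetric flow rate, m^3/s (assumed)
dp = 55e5           # absorber-to-flash pressure drop, Pa (~55 bar, assumed)
eta = 0.75          # turbine efficiency (assumed)

power_kw = eta * q * dp / 1e3     # P = eta * Q * dP
print(f"recoverable power ~ {power_kw:,.0f} kW")   # ~1 MW for these values
```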

Keywords: gas sweetening unit, simulation, MDEA, power recovery turbine, waste-to-energy

Procedia PDF Downloads 154
30732 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computational software for solving the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving advanced mathematical models that predict oil well production in an arbitrarily shaped reservoir with multiple leases. The research is motivated by the limitations of data validation in ensuring that a program meets the accuracy of the mathematical model. Accordingly, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified and the BEM discretization performed. In the second step, the 2D BEM discretization is simulated in the COMSOL Multiphysics and MATLAB programming environments. In the last step, the numerical performance indicators of both environments are analyzed against a validating Fortran implementation. The performance comparison is investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on oil production in the multiple-lease reservoir. According to the performance comparison, the structured programming in Fortran is the alternative software for implementing an accurate numerical simulation of the BEM. In conclusion, as a high-level language for numerical computation, Fortran satisfies the numerical performance evaluation and is well suited for visualizing the production of oil wells in arbitrarily shaped reservoirs.

Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 470
30731 Pattern Recognition Based on Simulation of Chemical Senses (SCS)

Authors: Nermeen El Kashef, Yasser Fouad, Khaled Mahar

Abstract:

No AI-complete system can model the human brain or behavior without looking at the totality of the situation and incorporating a combination of senses. This paper proposes a pattern recognition model based on Simulation of Chemical Senses (SCS) for the separation and classification of sign language. The model is based on the human taste-control strategy and is motivated by the fact that the tongue first clusters an input substance into its basic tastes, after which the brain recognizes its flavor. To implement this strategy, a two-level architecture inspired by the taste system is proposed: the separation level of the architecture clusters hand postures, while the classification level recognizes the sign language. The efficiency of the proposed model is demonstrated experimentally by recognizing an American Sign Language (ASL) data set. The recognition accuracy obtained for ASL numbers is 92.9 percent.
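
A minimal sketch of the two-level idea with scikit-learn stand-ins and synthetic data (the abstract does not specify the paper's actual separation and classification methods):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
X = rng.normal(size=(600, 20))                  # hand-posture feature vectors
y = rng.integers(0, 10, 600)                    # sign labels 0-9 (synthetic)

# Separation level: cluster postures into coarse groups ("basic tastes").
separator = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
classifiers = {}
for c in range(4):                              # classification level per group
    mask = separator.labels_ == c
    classifiers[c] = LogisticRegression(max_iter=500).fit(X[mask], y[mask])

x_new = rng.normal(size=(1, 20))
cluster = int(separator.predict(x_new)[0])      # first the "basic taste" ...
sign = int(classifiers[cluster].predict(x_new)[0])   # ... then the "flavor"
print(f"cluster {cluster} -> predicted sign {sign}")
```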

Keywords: artificial intelligence, biocybernetics, gustatory system, sign language recognition, taste sense

Procedia PDF Downloads 269
30730 Free Will and Compatibilism in Decision Theory: A Solution to Newcomb’s Paradox

Authors: Sally Heyeon Hwang

Abstract:

Within decision theory, there are normative principles that dictate how one should act in addition to empirical theories of actual behavior. As a normative guide to one’s actual behavior, evidential or causal decision-theoretic equations allow one to identify outcomes with maximal utility values. The choice that each person makes, however, will, of course, differ according to varying assignments of weight and probability values. Regarding these different choices, it remains a subject of considerable philosophical controversy whether individual subjects have the capacity to exercise free will with respect to the assignment of probabilities, or whether instead the assignment is in some way constrained. A version of this question is given a precise form in Richard Jeffrey’s assumption that free will is necessary for Newcomb’s paradox to count as a decision problem. This paper will argue, against Jeffrey, that decision theory does not require the assumption of libertarian freedom. One of the hallmarks of decision-making is its application across a wide variety of contexts; the implications of a background assumption of free will is similarly varied. One constant across the contexts of decision is that there are always at least two levels of choice for a given agent, depending on the degree of prior constraint. Within the context of Newcomb’s problem, when the predictor is attempting to guess the choice the agent will make, he or she is analyzing the determined aspects of the agent such as past characteristics, experiences, and knowledge. On the other hand, as David Lewis’ backtracking argument concerning the relationship between past and present events brings to light, there are similarly varied ways in which the past can actually be dependent on the present. One implication of this argument is that even in deterministic settings, an agent can have more free will than it may seem. This paper will thus argue against the view that a stable background assumption of free will or determinism in decision theory is necessary, arguing instead for a compatibilist decision theory yielding a novel treatment of Newcomb’s problem.
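
For readers unfamiliar with the payoffs at stake, a small evidential expected-utility calculation for Newcomb's problem (standard $1M/$1k payoffs, predictor accuracy p) makes Jeffrey's setting concrete:

```python
# Evidential expected utility for Newcomb's problem with a predictor of
# accuracy p: $1,000,000 in the opaque box, $1,000 in the transparent one.
def eu_one_box(p):   # predictor likely foresaw one-boxing -> opaque box is full
    return p * 1_000_000 + (1 - p) * 0

def eu_two_box(p):   # predictor likely foresaw two-boxing -> opaque box is empty
    return p * 1_000 + (1 - p) * 1_001_000

for p in (0.5, 0.9, 0.99):
    print(f"p={p}: one-box {eu_one_box(p):>9,.0f}  two-box {eu_two_box(p):>9,.0f}")
```

Once the predictor is even moderately reliable, one-boxing dominates on evidential expected utility, while causal reasoning still recommends two-boxing; how much freedom the agent has in fixing these probabilities is exactly the issue the abstract raises.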

Keywords: decision theory, compatibilism, free will, Newcomb’s problem

Procedia PDF Downloads 300