Search results for: monte carlo simulation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5068

4768 Explicit Numerical Approximations for a Pricing Weather Derivatives Model

Authors: Clarinda V. Nhangumbe, Ercília Sousa

Abstract:

Weather derivatives are financial instruments used to cover non-catastrophic weather events and can be expressed in the form of standard (plain vanilla) or structured (exotic) products. The underlying asset in this case is a weather index, such as temperature, rainfall, humidity, wind, or snowfall. The complexity of weather derivative structures exposes the weaknesses of the Black-Scholes framework. Under the risk-neutral probability measure, the price of a weather contract can therefore be given as the unique solution of a two-dimensional partial differential equation (parabolic in one direction and hyperbolic in the other), with an initial condition and subject to adequate boundary conditions. To calculate the option price, one can use numerical methods such as Monte Carlo simulation or implicit finite difference schemes combined with semi-Lagrangian methods. This paper proposes two explicit methods: first-order upwind in the hyperbolic direction combined with Lax-Wendroff in the parabolic direction, and first-order upwind in the hyperbolic direction combined with second-order upwind in the parabolic direction. One advantage of these methods is that they take into account the boundary conditions obtained from the financial interpretation and deal efficiently with different choices of the convection coefficients.
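
To illustrate the flavor of such explicit schemes, the following is a minimal one-dimensional sketch (not the authors' two-dimensional implementation): first-order upwind for a convection term and central differencing for a diffusion term, with the usual explicit stability bound. The coefficients, grid, and initial and boundary conditions are placeholders.

```python
import numpy as np

# Minimal explicit scheme for u_t + a u_x = D u_xx on [0, 1]:
# first-order upwind for convection, central differences for diffusion.
# Coefficients, grid and conditions are illustrative placeholders.
a, D = 1.0, 0.01                                # convection / diffusion (assumed)
nx, T = 200, 0.2
dx = 1.0 / (nx - 1)
dt = 0.4 * min(dx / abs(a), dx**2 / (2 * D))    # explicit stability bound
x = np.linspace(0.0, 1.0, nx)
u = np.exp(-200 * (x - 0.3) ** 2)               # placeholder initial condition

t = 0.0
while t < T:
    un = u.copy()
    # upwind difference chosen according to the sign of the convection speed
    if a >= 0:
        conv = a * (un[1:-1] - un[:-2]) / dx
    else:
        conv = a * (un[2:] - un[1:-1]) / dx
    diff = D * (un[2:] - 2 * un[1:-1] + un[:-2]) / dx**2
    u[1:-1] = un[1:-1] + dt * (diff - conv)
    u[0], u[-1] = 0.0, 0.0                      # placeholder boundary conditions
    t += dt
print(u.max())
```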

Keywords: incomplete markets, numerical methods, partial differential equations, stochastic process, weather derivatives

Procedia PDF Downloads 70
4767 A Case Study on the Numerical-Probability Approach for Deep Excavation Analysis

Authors: Komeil Valipourian

Abstract:

Urban development and the growing need for infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of the Bangkok Metro as a case study. For this, a numerical-probability model has been developed based on the finite difference method and a Monte Carlo sampling approach. The results indicate that disregarding probability in this project would result in an inappropriate design of the retaining structure. Therefore, a probabilistic redesign of the support is proposed and carried out as one application of probability analysis. A 50% reduction in the flexural strength of the structure increases the failure probability by just 8%, within the allowable range, and helps improve economic conditions while maintaining mechanical efficiency. Given the lack of efficient design in most deep excavations, an attempt was made to develop an optimum practical design standard for deep excavations based on failure probability, considering geometrical and geotechnical variability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without requiring a full probability analysis.
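
The core of such a numerical-probability analysis is a Monte Carlo loop over uncertain soil and structural inputs. The sketch below uses a hypothetical, highly simplified limit state (Rankine earth-pressure moment versus flexural capacity); all distributions and constants are invented for illustration and are not the case-study values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical limit state: wall fails when bending demand exceeds capacity.
# Distributions and constants are illustrative, not the Bangkok Metro values.
phi = rng.normal(30.0, 3.0, n)            # friction angle (deg)
gamma = rng.normal(18.0, 1.0, n)          # unit weight (kN/m^3)
capacity = rng.normal(2000.0, 300.0, n)   # flexural capacity (kN*m per m)

ka = np.tan(np.radians(45.0 - phi / 2.0)) ** 2   # Rankine active coefficient
H = 12.0                                         # excavation depth (m)
demand = ka * gamma * H**3 / 6.0                 # earth-pressure moment at base

pf = np.mean(demand > capacity)           # Monte Carlo failure probability
print(f"estimated P(failure) = {pf:.4f}")
# A reduced capacity (e.g. 0.5 * capacity) can be re-tested the same way.
```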

Keywords: numerical probability modeling, deep excavation, allowable maximum displacement, finite difference method (FDM)

Procedia PDF Downloads 104
4766 A Flexible Bayesian State-Space Modelling for Population Dynamics of Wildlife and Livestock Populations

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Hans-Peter Piepho

Abstract:

We aim to model the dynamics of wildlife and pastoral livestock populations in order to understand population change, and hence to support wildlife conservation and human welfare. The study is motivated by age- and sex-structured population counts collected in different regions of the Serengeti-Mara during the period 1989-2003. Developing reliable and realistic models for the population dynamics of large herbivores can be a very complex and challenging exercise. However, the Bayesian statistical domain offers flexible computational methods that enable the development and efficient implementation of complex population dynamics models. In this work, we use a novel Bayesian state-space model to analyse the dynamics of topi and hartebeest populations in the Serengeti-Mara ecosystem of East Africa. The state-space model involves survival probabilities of the animals, which in turn depend on factors such as monthly rainfall and habitat size; these factors drive the recent declines in herbivore numbers and potentially threaten the future viability of these populations in the ecosystem. Our study shows that seasonal rainfall is the most important factor shaping the population size of the animals and identifies the age class most severely affected by changes in weather conditions.
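
The Bayesian machinery behind such a model can be sketched with a toy example: a random-walk Metropolis sampler for a single survival probability in a binomial transition model. The counts and the flat prior below are invented for illustration and are not the Serengeti-Mara data or the authors' full state-space model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N_{t+1} ~ Binomial(N_t, s), with the latent states treated as
# observed to keep the sketch short. Counts are invented for illustration.
N = np.array([1000, 860, 750, 640, 560, 480])
survivors, trials = N[1:], N[:-1]

def log_post(s):
    if not 0.0 < s < 1.0:
        return -np.inf                   # flat prior on (0, 1)
    return np.sum(survivors * np.log(s) + (trials - survivors) * np.log(1 - s))

# Random-walk Metropolis over the survival probability s
s, chain = 0.5, []
for _ in range(20_000):
    prop = s + rng.normal(0.0, 0.02)
    if np.log(rng.uniform()) < log_post(prop) - log_post(s):
        s = prop
    chain.append(s)
post = np.array(chain[5_000:])           # discard burn-in
print(f"posterior mean survival = {post.mean():.3f}")
```

In the full model, covariates such as monthly rainfall would enter through a regression on the survival probabilities, and the latent population states would be sampled as well.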

Keywords: bayesian state-space model, Markov Chain Monte Carlo, population dynamics, conservation

Procedia PDF Downloads 183
4765 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning

Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza

Abstract:

The following work presents an approach to the autonomous navigation of mobile robots, implemented on an omnidirectional Kuka YouBot. We integrated the Robot Operating System (ROS) with machine learning algorithms, using two ROS distributions: ROS Hydro and ROS Kinetic. ROS Hydro manages the odometry, kinematics, and path-planning nodes with statistical and probabilistic global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra's algorithm. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects that may lie on the planned trajectory and obstruct the path of the Kuka YouBot. Detection is handled by an artificial vision module with a neural network trained on the Single Shot MultiBox Detector (SSD) architecture, where the main dynamic objects to be detected are human beings and domestic animals, among others. When objects are detected, the system modifies the trajectory or waits for the dynamic obstacle to act. Finally, the obstacles are removed from the planned trajectory, and the Kuka YouBot reaches its goal thanks to the machine learning algorithms.
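
AMCL is a particle-filter localizer. The sketch below shows the predict-weight-resample cycle on a one-dimensional corridor with a range sensor; the noise values and corridor geometry are invented for illustration and are not the Kuka YouBot's calibrated parameters or the ROS AMCL node itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo Localization sketch on a 1-D corridor with a wall at x = 10.
# The robot measures range to the wall; all noise values are illustrative.
WALL = 10.0
true_x, n_particles = 2.0, 500
particles = rng.uniform(0.0, WALL, n_particles)

for step in range(20):
    u = 0.3                                               # commanded motion
    true_x += u
    z = (WALL - true_x) + rng.normal(0.0, 0.1)            # noisy range reading

    particles += u + rng.normal(0.0, 0.05, n_particles)   # predict
    expected = WALL - particles
    w = np.exp(-0.5 * ((z - expected) / 0.1) ** 2)        # weight by likelihood
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)       # resample
    particles = particles[idx]

print(f"true x = {true_x:.2f}, estimate = {particles.mean():.2f}")
```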

Keywords: autonomous navigation, machine learning, path planning, robotic operative system, open source computer vision library

Procedia PDF Downloads 154
4764 The Diffusion of Membrane Nanodomains with Specific Ganglioside Composition

Authors: Barbora Chmelova, Radek Sachl

Abstract:

Gangliosides are amphipathic membrane lipids. Owing to their bulky oligosaccharide chains, which contain one or more sialic acids linked to a hydrophobic ceramide base, gangliosides are classified among the glycosphingolipids. This unique structure gives gangliosides a strong tendency to self-aggregate and, therefore, to form nanoscopic clusters called nanodomains. Gangliosides are preferentially present in the extracellular membrane leaflet of all human tissues and thus have an impact on a huge number of biological processes, such as intercellular communication, cell signalling, membrane trafficking, and regulation of receptor activity. Defects in their metabolism, impairment of proper ganglioside function, or changes in their organization lead to serious health conditions such as Alzheimer's and Parkinson's diseases, autoimmune diseases, tumour growth, etc. This work focuses mainly on the organization of gangliosides into nanodomains and their dynamics within the plasma membrane. Current research addresses the static characterization of ganglioside nanodomains; information about their diffusion is missing. In our study, fluorescence correlation spectroscopy is implemented together with stimulated emission depletion (STED-FCS), which combines diffraction-unlimited spatial resolution with high temporal resolution. By comparing experiments performed on model vesicles containing 4% of either GM1, GM2, or GM3 with Monte Carlo simulations of diffusion on the plasma membrane, we describe ganglioside clustering, the diffusion of nanodomains, and even the diffusion of individual ganglioside molecules inside the investigated nanodomains.
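
A minimal version of such a membrane diffusion simulation is a two-dimensional Brownian random walk with a reduced diffusion coefficient inside a nanodomain, from which an apparent diffusion coefficient is recovered via the mean squared displacement. All values (diffusion coefficients, domain radius, time step) below are illustrative, not the paper's calibrated parameters.

```python
import numpy as np

rng = np.random.default_rng(7)

# 2-D random walk of lipids with slower diffusion inside a circular
# nanodomain centred at the origin. All parameter values are illustrative.
D_OUT, D_IN = 1.0, 0.1        # um^2/s outside / inside the nanodomain
R_DOMAIN, DT = 0.2, 1e-5      # domain radius (um), time step (s)
n_mol, n_steps = 2000, 2000

pos = rng.uniform(-0.5, 0.5, (n_mol, 2))
start = pos.copy()
for _ in range(n_steps):
    inside = np.linalg.norm(pos, axis=1) < R_DOMAIN
    D = np.where(inside, D_IN, D_OUT)
    # Brownian step: each coordinate has variance 2*D*dt
    pos += rng.normal(0.0, 1.0, pos.shape) * np.sqrt(2 * D * DT)[:, None]

msd = np.mean(np.sum((pos - start) ** 2, axis=1))
print(f"apparent D = {msd / (4 * n_steps * DT):.3f} um^2/s")  # MSD = 4*D*t in 2D
```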

Keywords: gangliosides, nanodomains, STED-FCS, fluorescence microscopy, membrane diffusion

Procedia PDF Downloads 58
4763 Flash Flood in Gabes City (Tunisia): Hazard Mapping and Vulnerability Assessment

Authors: Habib Abida, Noura Dahri

Abstract:

Flash floods are among the most serious natural hazards and have disastrous environmental and human impacts. They are associated with exceptional rain events characterized by short duration, very high intensity, rapid flow, and small spatial extent. Flash floods happen very suddenly and are difficult to forecast. They generally damage agricultural crops, property, and infrastructure, and may even result in the loss of human lives. The city of Gabes (south-eastern Tunisia) has been exposed to numerous damaging floods because of its mild topography, clay soil, high urbanization rate, and erratic rainfall distribution. The risks associated with this situation are expected to increase further in the future because of climate change, which is deemed responsible for the increasing frequency and severity of this natural hazard. Recently, exceptional events have hit Gabes City, causing deaths and major property losses; in particular, a major flood hit the region on June 2nd, 2014, resulting in the stagnation of storm water in the numerous low-lying zones of the study area, thereby endangering human health and causing disastrous environmental impacts. The characterization of flood risk in the Gabes watershed is considered an important step for flood management. The Analytical Hierarchy Process (AHP) method, coupled with Monte Carlo simulation and a geographic information system, was applied to delineate and characterize flood-prone areas. A spatial database was developed from the geological map, a digital elevation model, land use, and rainfall data in order to evaluate the different factors likely to affect the flood analysis. The results obtained were validated against remote sensing data for the zones that showed very high flood hazard during the extreme rainfall event of June 2014. Moreover, a survey was conducted in different areas of the city in order to understand and explore the causes of this disaster, its extent, and its consequences.
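
The AHP step derives factor weights from a pairwise comparison matrix (principal eigenvector plus a consistency check), and the Monte Carlo coupling propagates uncertainty in the expert judgments. The sketch below uses three invented flood factors and invented judgments, not the study's actual comparison matrix.

```python
import numpy as np

rng = np.random.default_rng(3)

# AHP sketch for three illustrative flood factors (slope, soil, land use).
# The pairwise judgments are invented, not those of the study.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

def ahp_weights(M):
    vals, vecs = np.linalg.eig(M)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals[k].real

w, lam = ahp_weights(A)
ci = (lam - 3) / (3 - 1)                      # consistency index for n = 3
print("weights:", np.round(w, 3), "CR:", round(ci / 0.58, 3))  # RI(3) = 0.58

# Monte Carlo: perturb the upper-triangle judgments (keeping reciprocity)
# to propagate expert uncertainty into the factor weights.
samples = []
for _ in range(5000):
    P = A.copy()
    for i in range(3):
        for j in range(i + 1, 3):
            P[i, j] = A[i, j] * np.exp(rng.normal(0.0, 0.1))
            P[j, i] = 1.0 / P[i, j]
    samples.append(ahp_weights(P)[0])
print("95% weight intervals:",
      np.round(np.percentile(samples, [2.5, 97.5], axis=0), 3))
```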

Keywords: analytical hierarchy process, flash floods, Gabes, remote sensing, Tunisia

Procedia PDF Downloads 86
4762 The Use of Simulation Programs of Leakage of Harmful Substances for Crisis Management

Authors: Jiří Barta

Abstract:

The paper deals with simulation programs for the spread of harmful substances. Air pollution has a direct impact on the quality of human life, and environmental protection is currently a very topical issue; the paper therefore focuses on the simulation of releases of harmful substances. The first part of the article deals with the perspectives and possibilities of integrating the outputs of simulation programs into the education and practical training of management staff during emergency events affecting critical infrastructure. The last part presents the practical testing and evaluation of the simulation programs. Among the tested simulation software, Symos97 was selected. The tool offers advanced features for setting up a release: it allows the user, step by step, to model the terrain, the location, and the mode of escape of the harmful substances.

Keywords: computer simulation, Symos97, spread, simulation software, harmful substances

Procedia PDF Downloads 266
4761 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable

Authors: Xinyuan Y. Song, Kai Kang

Abstract:

Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure-time process and time-variant predictors. A common assumption of conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not hold, because unobservable traits, namely latent variables, which can only be observed indirectly and must be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach that deals with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of the latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is used to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented.
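
The third model component can be illustrated with a small simulation: a hazard that depends on a latent linear trajectory, with event times generated by thinning. All parameter values below are invented placeholders, not the fitted ADNI estimates.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hazard depending on a latent linear trajectory eta(t) = b0 + b1 * t;
# event times are simulated by thinning. Parameters are illustrative.
lam0, gamma = 0.05, 0.8          # baseline hazard, link to the latent process
T_MAX = 20.0                     # study horizon (years)

def simulate_event(b0, b1):
    # hazard is monotone in t, so its maximum is at t = 0 or t = T_MAX
    lam_max = lam0 * np.exp(gamma * max(b0, b0 + b1 * T_MAX))
    t = 0.0
    while t < T_MAX:
        t += rng.exponential(1.0 / lam_max)
        if t >= T_MAX:
            break
        lam_t = lam0 * np.exp(gamma * (b0 + b1 * t))
        if rng.uniform() < lam_t / lam_max:   # accept with prob lam(t)/lam_max
            return t, True                    # event observed at time t
    return T_MAX, False                       # censored at the study horizon

sims = [simulate_event(rng.normal(0, 0.5), rng.normal(0.1, 0.05))
        for _ in range(1000)]
obs = [t for t, seen in sims if seen]
print(f"{len(obs)} events, median observed time = {np.median(obs):.2f}")
```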

Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data

Procedia PDF Downloads 121
4760 Bayesian Semiparametric Geoadditive Modelling of Underweight Malnutrition of Children under 5 Years in Ethiopia

Authors: Endeshaw Assefa Derso, Maria Gabriella Campolo, Angela Alibrandi

Abstract:

Objectives: Early childhood malnutrition can have long-term and irreversible effects on a child's health and development. This study uses a Bayesian method with spatial variation to investigate the flexible trends of metrical covariates and to identify communities at high risk of childhood underweight. Methods: Cross-sectional data on underweight were collected from the 2016 Ethiopian Demographic and Health Survey (EDHS). A Bayesian geo-additive model was fitted. Appropriate prior distributions were specified for the scale parameters in the models, and the inference is entirely Bayesian, using Markov chain Monte Carlo (MCMC) simulation. Results: The results show that metrical covariates such as child age, maternal body mass index (BMI), and maternal age affect a child's underweight status non-linearly. Both low and high maternal BMI appear to have a significant impact on child underweight. There was also significant spatial heterogeneity, and based on IDW interpolation of the predicted values, the western, central, and eastern parts of the country are hotspot areas. Conclusion: Socio-demographic and community-based programs should be considered comprehensively in Ethiopian policy to combat childhood underweight malnutrition.

Keywords: BayesX, Ethiopia, malnutrition, MCMC, semi-parametric Bayesian analysis, spatial distribution, P-splines

Procedia PDF Downloads 57
4759 Parameter Estimation for the Mixture of Generalized Gamma Model

Authors: Wikanda Phaphan

Abstract:

The mixture generalized gamma distribution is a combination of two distributions: the generalized gamma distribution and the length-biased generalized gamma distribution. These two distributions were presented by Suksaengrakcharoen and Bodhisuwan in 2014. The findings showed that the probability density function (pdf) is fairly complex, which creates problems in estimating the parameters: the estimators cannot be calculated in closed form, so numerical estimation must be used. In this study, we present a new method of parameter estimation using the expectation-maximization (EM) algorithm, the conjugate gradient method, and the quasi-Newton method. The data used for estimating α, β, λ, and p were generated by the acceptance-rejection method, where λ is the scale parameter, p is the weight parameter, and α and β are the shape parameters. We use the Monte Carlo technique to assess the estimators' performance. Sample sizes of 10, 30, and 100 were considered, and the simulations were repeated 20 times in each case. We evaluated the effectiveness of the estimators by considering the mean squared errors and the bias. The findings revealed that the estimates from the EM algorithm were closest to the actual values, and the maximum likelihood estimators obtained via the conjugate gradient and quasi-Newton methods were less precise than those obtained via the EM algorithm.
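
Acceptance-rejection sampling, as used here for data generation, works for any density that can be bounded by a scaled proposal. The sketch below draws from one common parameterization of the generalized gamma density using a uniform envelope on a truncated support; the parameter values and truncation point are chosen purely for illustration.

```python
import numpy as np
from math import gamma as G

rng = np.random.default_rng(5)

# Acceptance-rejection sketch. Target: a generalized gamma density
# f(x) = (b / Gamma(a)) * l^(a*b) * x^(a*b - 1) * exp(-(l*x)^b)
# (one common parameterization, used only for illustration).
a, b, l = 2.0, 1.5, 1.0

def f(x):
    return (b * l ** (a * b) / G(a)) * x ** (a * b - 1) * np.exp(-(l * x) ** b)

B = 6.0                                   # truncation point (negligible tail)
grid = np.linspace(1e-6, B, 4000)
M = f(grid).max() * B                     # envelope constant for Uniform(0, B)

def draw(n):
    out = []
    while len(out) < n:
        x = rng.uniform(0.0, B)
        if rng.uniform() < f(x) * B / M:  # accept with prob f(x) / (M / B)
            out.append(x)
    return np.array(out)

sample = draw(10_000)
print(f"sample mean = {sample.mean():.3f}")
```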

Keywords: conjugate gradient method, quasi-Newton method, EM-algorithm, generalized gamma distribution, length biased generalized gamma distribution, maximum likelihood method

Procedia PDF Downloads 201
4758 Development and Verification of the Idom Shielding Optimization Tool

Authors: Omar Bouhassoun, Cristian Garrido, César Hueso

Abstract:

Radiation shielding design is an optimization problem with multiple constrained objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by the designer's previous experience. The result is therefore an empirical solution, but not an optimal one, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capability to read/write input files, run calculations, and parse output files for different radiation transport codes. In the first stage, the software was set up to adjust the input files of two well-known Monte Carlo codes (MCNP and Serpent) and to optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Nevertheless, its modular implementation easily allows the inclusion of additional radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems.
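
The structure of such an optimization can be sketched with a toy genetic algorithm over layer thicknesses, where an analytic exponential-attenuation formula stands in for the transport code. The attenuation coefficients, densities, prices, and objective weights below are invented; in ISOT the evaluation of each candidate would be an MCNP/Serpent run.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy stand-in for the transport code: dose after three slab layers with
# exponential attenuation. All material data and weights are invented.
MU = np.array([0.50, 0.20, 0.07])        # attenuation coefficients (1/cm)
RHO = np.array([11.3, 7.9, 2.3])         # densities (g/cm^3)
PRICE = np.array([5.0, 1.0, 0.1])        # cost per cm of thickness (arbitrary)
D0 = 1000.0                              # unshielded dose rate

def objective(t):                        # weighted sum of dose, weight, price
    dose = D0 * np.exp(-np.sum(MU * t))
    return dose + 0.5 * np.sum(RHO * t) + 2.0 * np.sum(PRICE * t)

pop = rng.uniform(0.0, 30.0, (60, 3))    # population of thickness vectors (cm)
for gen in range(200):
    fit = np.array([objective(ind) for ind in pop])
    parents = pop[np.argsort(fit)[:20]]  # selection: keep the best third
    children = []
    while len(children) < 40:
        p1, p2 = parents[rng.integers(0, 20, 2)]
        c = np.where(rng.uniform(size=3) < 0.5, p1, p2)  # uniform crossover
        c += rng.normal(0.0, 0.5, 3)                     # mutation
        children.append(np.clip(c, 0.0, 30.0))
    pop = np.vstack([parents, children])

best = pop[np.argmin([objective(i) for i in pop])]
print("best thicknesses (cm):", np.round(best, 2))
```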

Keywords: optimization, shielding, nuclear, genetic algorithm

Procedia PDF Downloads 91
4757 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines

Authors: Kamyar Tolouei, Ehsan Moosavi

Abstract:

In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves many constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally constrains the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and optimization algorithms as researchers have become highly cognizant of the issue; even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to synthesize grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to maximize the net present value and simultaneously minimize the risk of deviation from the production targets under grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to synthesize grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule. A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model combining the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, the Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is used to update the Lagrange coefficients. In addition, a machine learning method called random forest is applied to estimate the gold grade in the mineral deposit, and the Monte Carlo method is used as the simulation method, with 20 realizations. The results show that the proposed versions improve considerably on the traditional methods. The outcomes were also compared with those of the ALR-genetic algorithm and ALR-subgradient methods. To demonstrate the applicability of the model, a case study of an open-pit gold mining operation is implemented. The framework demonstrates the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP, and it controls geological risk more effectively than the traditional procedure by considering grade uncertainty within the hybrid framework.
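
The risk-based part of the framework (evaluating a schedule against equally probable realizations) can be sketched as follows. Tonnages, grades, prices, and the NPV target are invented, and the lognormal draws are a shortcut standing in for proper geostatistical realizations.

```python
import numpy as np

rng = np.random.default_rng(13)

# Evaluate a fixed extraction schedule against 20 equally probable grade
# realizations. All economic and geological numbers are invented.
n_real, n_periods = 20, 10
tonnes = np.full(n_periods, 1.0e6)                 # ore tonnes per period
grade = rng.lognormal(np.log(1.2), 0.25, (n_real, n_periods))  # g/t gold
PRICE, COST, RECOVERY, RATE = 55.0, 40.0, 0.90, 0.10  # $/g, $/t, -, discount

cash = tonnes * (grade * RECOVERY * PRICE - COST)  # cash flow per period
disc = (1 + RATE) ** -np.arange(1, n_periods + 1)
npv = (cash * disc).sum(axis=1)                    # NPV per realization

target = 110.0e6                                   # illustrative NPV target, $
print(f"expected NPV = {npv.mean() / 1e6:.1f} M$")
print(f"risk of falling below target = {np.mean(npv < target):.2f}")
```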

Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization

Procedia PDF Downloads 82
4756 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources

Authors: Mustafa Alhamdi

Abstract:

An industrial application for classifying gamma-ray and neutron events using deep machine learning is investigated in this study. Identification using convolutional and recurrent neural networks has shown a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on the feature extraction method, followed by classification. The features extracted from the spectral profiles capture patterns and relationships that represent the actual spectrum energy in a low-dimensional space, and increasing the level of separation between classes in the feature space improves the achievable classification accuracy. Feature extraction by neural networks involves a variety of nonlinear transformations and mathematical optimizations, while principal component analysis relies on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectral information is preprocessed by finding the frequency components as a function of time and using them as the training dataset. The Fourier transform implementation used to extract the frequency components is optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal were simulated using Geant4, and the readout electronic noise was simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, which combines the votes of many models, further improved the classification accuracy of the neural networks, and discriminating gamma and neutron events in a single prediction approach achieved high accuracy. The findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep learning models by hyperparameter optimization enhanced the separation in the latent space and made it possible to extend the number of detectable isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
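
The spectrogram preprocessing step amounts to a windowed short-time Fourier transform of the detector trace. The sketch below applies a Hann-windowed STFT to a synthetic trace of decaying pulses plus Gaussian readout noise; the pulse shape, sampling rate, and noise level are illustrative stand-ins for the Geant4-simulated CdTe signals.

```python
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(2)

# Synthetic detector trace: exponential pulses plus readout noise,
# turned into time-frequency features with a Hann-windowed STFT.
fs, n = 10_000.0, 8192
t = np.arange(n) / fs
trace = rng.normal(0.0, 0.02, n)              # readout electronic noise
for t0 in rng.uniform(0.0, t[-1], 15):        # 15 random pulse arrivals
    m = t >= t0
    trace[m] += np.exp(-(t[m] - t0) / 1e-3)   # exponential decay, tau = 1 ms

f, seg_t, Z = stft(trace, fs=fs, window='hann', nperseg=256, noverlap=192)
features = np.abs(Z)                          # magnitude spectrogram
print("feature matrix (freq bins x time frames):", features.shape)
# `features` (optionally log-scaled) is what a CNN/RNN classifier consumes.
```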

Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification

Procedia PDF Downloads 127
4755 Resource Allocation Modeling and Simulation in Border Security Application

Authors: Kai Jin, Hua Li, Qing Song

Abstract:

Homeland security and border safety are issues for any country. This paper takes the border security of the US as an example to discuss the use and efficiency of simulation tools in homeland security applications. In this study, the available resources and various illegal infiltration parameters are defined, including their individual behaviors and objectives, in order to develop a model that describes the border patrol system. A simulation model is created in Arena and used to study the dynamic activities in border security. Possible factors that may affect the effectiveness of the border patrol system are proposed, individual and factorial analyses of these factors are conducted, and some suggestions are made.

Keywords: resource optimization, simulation, modeling, border security

Procedia PDF Downloads 493
4754 Tracing Sources of Sediment in an Arid River, Southern Iran

Authors: Hesam Gholami

Abstract:

Elevated suspended sediment loads in riverine systems, resulting from erosion accelerated by human activities, are a serious threat to the sustainable management of watersheds and the ecosystem services therein worldwide. Mitigating the deleterious effects of sediment as a distributed or non-point pollution source in catchments therefore requires reliable provenance information. Sediment tracing, or sediment fingerprinting, a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources with different statistical techniques, and modifying the mixing and unmixing models, have been introduced and refined by many researchers worldwide, and the technique has been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts to explore the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups: Monte Carlo simulation, Bayesian approaches, and generalized likelihood uncertainty estimation (GLUE). Given this background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework to estimate sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. The accuracy of the GLUE predictions, generated using four different sets of statistical tests for discriminating three sub-basin spatial sources, was evaluated using 10 virtual sediment (VS) samples with known source contributions, based on the root mean square error (RMSE) and mean absolute error (MAE). The contributions modeled by GLUE for the western, central, and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%), and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness of fit (GOF; ≥ 99% for all samples), the suggested modeling approach is an accurate technique for quantifying sediment sources in catchments. Overall, the estimated source proportions can help watershed engineers plan the targeting of conservation programs for soil and water resources.
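
A GLUE-style unmixing run can be sketched compactly: sample random source proportions, keep the "behavioral" sets whose predicted tracer mixture fits the target within a threshold, and summarize their weighted spread. The source signatures, target sample, and acceptance threshold below are invented, not the Mehran River measurements.

```python
import numpy as np

rng = np.random.default_rng(4)

# GLUE-style unmixing sketch for three spatial sources and three tracers.
# Source signatures and the target sediment sample are invented numbers.
sources = np.array([[12.0, 40.0, 3.1],     # western sub-basin tracer means
                    [ 9.0, 55.0, 2.2],     # central
                    [15.0, 30.0, 4.0]])    # eastern
mixture = np.array([12.9, 38.0, 3.4])      # target sediment sample

n = 200_000
props = rng.dirichlet(np.ones(3), n)       # random source contributions
pred = props @ sources
err = np.sqrt(np.mean((pred - mixture) ** 2 / mixture ** 2, axis=1))

behavioral = props[err < 0.05]             # keep acceptably fitting sets
w = 1.0 / err[err < 0.05]                  # likelihood weight (illustrative)
est = (behavioral * w[:, None]).sum(axis=0) / w.sum()
lo, hi = np.percentile(behavioral, [2.5, 97.5], axis=0)
print("mean contributions:", np.round(est, 2))
print("95% ranges:", np.round(lo, 2), "-", np.round(hi, 2))
```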

Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran

Procedia PDF Downloads 54
4753 Simulation for Squat Exercise of an Active Controlled Vibration Isolation and Stabilization System for Astronaut’s Exercise Platform

Authors: Ziraguen O. Williams, Shield B. Lin, Fouad N. Matari, Leslie J. Quiocho

Abstract:

In a task to assist NASA in analyzing the dynamic forces caused by operational countermeasures of an astronaut's exercise platform impacting the spacecraft, feedback delay and signal noise were added to a simulation model of an actively controlled vibration isolation system that regulates the movement of the exercise platform. Previous simulation work was conducted primarily in MATLAB/Simulink. Two additional simulation tools used in this study were Trick and MBDyn, software simulation environments co-developed by NASA. Simulation results obtained from these three tools were very similar, and all support the hypothesis that an actively controlled vibration isolation system outperforms a passively controlled system even when feedback delay and signal noise are added to the active system. In this paper, squat exercise was used to generate the excitation force for the simulation model; the excitation force of a squat was calculated from motion capture of an exerciser. The simulation results demonstrate a much greater reduction in transmitted force for the actively controlled system than for the passively controlled system.
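
A minimal single-degree-of-freedom version of this comparison is shown below: a platform on a spring-damper isolator, excited at a squat-like frequency, with an active sky-hook force applied through a delayed, noisy velocity measurement. All parameters (mass, stiffness, gain, delay, noise) are invented, not the NASA platform values.

```python
import numpy as np

rng = np.random.default_rng(6)

# 1-DOF sketch: platform of mass m on an isolator (k, c); the active
# controller applies a delayed, noisy sky-hook force -g * v(t - tau).
m, k, c = 200.0, 2.0e4, 400.0
g_sky, tau, noise = 1500.0, 0.010, 0.02   # gain, feedback delay (s), noise
dt, T = 1e-4, 10.0
n, delay = int(T / dt), int(0.010 / 1e-4)

def simulate(active):
    x = v = 0.0
    vel_hist = np.zeros(n)                # buffer for the delayed feedback
    transmitted = np.zeros(n)
    for i in range(n):
        t = i * dt
        f_exc = 600.0 * np.sin(2 * np.pi * 2.0 * t)   # squat-like 2 Hz forcing
        u = 0.0
        if active and i > delay:
            v_meas = vel_hist[i - delay] + rng.normal(0.0, noise)
            u = -g_sky * v_meas
        a = (f_exc + u - k * x - c * v) / m
        v += a * dt
        x += v * dt
        vel_hist[i] = v
        transmitted[i] = k * x + c * v    # force passed to the spacecraft
    return np.sqrt(np.mean(transmitted[n // 2:] ** 2))

print(f"passive RMS force = {simulate(False):.1f} N")
print(f"active  RMS force = {simulate(True):.1f} N")
```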

Keywords: control, counterweight, isolation, vibration

Procedia PDF Downloads 87
4752 The “Bright Side” of COVID-19: Effects of Livestream Affordances on Consumer Purchase Willingness: Explicit IT Affordances Perspective

Authors: Isaac Owusu Asante, Yushi Jiang, Hailin Tao

Abstract:

Live streaming marketing, a new element of electronic commerce, became an optional marketing channel following the COVID-19 pandemic, and many sellers have leveraged the features of live streaming to increase sales. Studies on live streaming have focused on gaming and on consumers' loyalty to brands, typically using interview questionnaires. This study, by contrast, was conducted to measure real-time, observable interactions between consumers and sellers. Based on affordance theory, this study conceptualized constructs representing the interactive features and examined how they drive consumers' purchase willingness during live streaming sessions, using 1,238 observations from Amazon Live obtained by manual observation of transaction records. Within a structural equation modeling framework, ordinary least squares regression suggests that live viewers, new followers, live chats, and likes positively affect purchase willingness. The Sobel and Monte Carlo tests show that new followers, live chats, and likes significantly mediate the relationship between live viewers and purchase willingness. The study introduces a new way of measuring interactions in live streaming commerce and proposes a way to gather data on consumer behavior manually from live streaming platforms when the application programming interface (API) of such platforms does not support data mining algorithms.
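
The Monte Carlo test of a mediated (indirect) effect draws the two path coefficients from their sampling distributions and inspects the distribution of their product. The sketch below uses invented coefficient estimates and standard errors, not the Amazon Live results.

```python
import numpy as np

rng = np.random.default_rng(8)

# Monte Carlo confidence interval for an indirect effect a*b, as used
# alongside the Sobel test. Estimates and SEs below are invented.
a, se_a = 0.42, 0.06     # live viewers -> live chats (path a)
b, se_b = 0.31, 0.05     # live chats -> purchase willingness (path b)

draws = rng.normal(a, se_a, 100_000) * rng.normal(b, se_b, 100_000)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"indirect effect = {a * b:.3f}, 95% MC CI = [{lo:.3f}, {hi:.3f}]")
print("significant mediation" if lo > 0 or hi < 0 else "not significant")
```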

Keywords: livestreaming marketing, live chats, live viewers, likes, new followers, purchase willingness

Procedia PDF Downloads 50
4751 Introducing and Effectiveness Evaluation of Innovative Logistics System Simulation Teaching: Theoretical Integration and Verification

Authors: Tsai-Pei Liu, Zhi-Rou Zheng, Tzu-Tzu Wen

Abstract:

Innovative logistics system simulation teaching extracts the characteristics of a system through simulation methodology. Logistics systems exhibit randomness and interaction effects during execution, so simulation models can deal with relatively complex logistics process problems and give students different learning modes, with more autonomy over learning time and progress. System simulation has thus become a new educational tool, but it must still pass many tests before it is widely used in teaching. Although many business management departments in Taiwan have started to promote it, this kind of simulation-based teaching is still not popular, and the prerequisite for popularization is support from students. This research uses an extension of the Unified Theory of Acceptance and Use of Technology (UTAUT2) to explore the acceptance by students at universities of science and technology of system simulation as a learning tool. At the same time, it explores the effectiveness of logistics system simulation after its introduction into teaching. The results indicate a significant influence of performance expectancy, social influence, and learning value on students' behavioral intention, and confirm the influence of facilitating conditions on behavioral intention. The extended UTAUT2 framework helps in understanding students' perceived value in the innovative logistics system teaching context.

Keywords: UTAUT2, logistics system simulation, learning value, Taiwan

Procedia PDF Downloads 87
4750 A Hierarchical Method for Multi-Class Probabilistic Classification Vector Machines

Authors: P. Byrnes, F. A. DiazDelaO

Abstract:

The Support Vector Machine (SVM) has become widely recognised as one of the leading algorithms in machine learning for both regression and binary classification. It expresses predictions as a linear combination of kernel functions centred on the support vectors. Despite its popularity among practitioners, SVM has some limitations, the most significant being that it generates point predictions rather than predictive distributions. Stemming from this issue, a probabilistic model, Probabilistic Classification Vector Machines (PCVM), has been proposed, which respects the original functional form of SVM whilst also providing a predictive distribution. As physical system designs become more complex, an increasing number of classification tasks in industrial applications involve more than two classes. Consequently, this research proposes a framework that extends PCVM to the multi-class setting. Additionally, the original PCVM framework relies on type II maximum likelihood to estimate both the kernel hyperparameters and the model evidence. In a high-dimensional multi-class setting, however, this approach has been shown to be ineffective because it scales badly as the number of classes increases. Accordingly, we propose the application of Markov chain Monte Carlo (MCMC) methods to provide a posterior distribution over both parameters and hyperparameters. The proposed framework will be validated against current multi-class classifiers through synthetic and real-life implementations.

Keywords: probabilistic classification vector machines, multi class classification, MCMC, support vector machines

Procedia PDF Downloads 206
4749 Feedback Matrix Approach for Relativistic Runaway Electron Avalanches Dynamics in Complex Electric Field Structures

Authors: Egor Stadnichuk

Abstract:

Relativistic runaway electron avalanches (RREA) are a widely accepted source of thunderstorm gamma radiation. In regions of very high electric field strength, RREA can multiply via relativistic feedback, which is caused both by positron production and by the reversal of runaway electron bremsstrahlung gamma rays. In complex multilayer thunderstorm electric field structures, an additional reactor feedback mechanism appears due to gamma-ray exchange between separate strong-field regions with different electric field directions. Studying this reactor mechanism in conjunction with relativistic feedback using Monte Carlo simulations, or by directly solving the kinetic Boltzmann equation, requires a significant amount of computational time. In this work, a theoretical approach to studying feedback mechanisms in RREA physics is developed. It is based on the construction of a matrix of feedback operators. With the feedback matrix, the problem of avalanche dynamics in complex electric field structures is reduced to the problem of finding eigenvectors and eigenvalues. A method for calculating the matrix elements is proposed. The proposed concept was used to study the dynamics of RREAs in multilayer thunderclouds.
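
Once the feedback matrix is assembled, the dynamics reduce to an eigenvalue problem: the dominant eigenvalue tells whether the avalanche system is self-sustaining. The toy two-region matrix below uses invented entries purely to show the computation.

```python
import numpy as np

# Toy feedback matrix for two strong-field regions: entry F[i, j] is the
# expected number of secondary avalanches seeded in region i per avalanche
# in region j (via positrons or reversed gamma rays). Values are invented.
F = np.array([[0.30, 0.45],
              [0.55, 0.20]])

vals, vecs = np.linalg.eig(F)
k = np.argmax(np.abs(vals))
print(f"dominant eigenvalue = {vals[k].real:.3f}")
# > 1 means self-sustaining multiplication; < 1 means the avalanche system
# decays, with the eigenvector giving the asymptotic region weights.
print("asymptotic region weights:", np.round(np.abs(vecs[:, k].real), 3))
```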

Keywords: terrestrial Gamma-ray flashes, thunderstorm ground enhancement, relativistic runaway electron avalanches, gamma-rays, high-energy atmospheric physics, TGF, TGE, thunderstorm, relativistic feedback, reactor feedback, reactor model

Procedia PDF Downloads 146
4748 The Use of Computer Simulation as Technological Education for Crisis Management Staff

Authors: Jiří Barta, Josef Krahulec, Jiří F. Urbánek

Abstract:

The education and practical training of crisis management members is a topical issue nowadays. The paper deals with the perspectives and possibilities of 'smart solutions' for educating crisis management staff. A large number of simulation tools are currently available, and they are suitable for the practical training of crisis management staff. The first part of the paper introduces the technology of these simulation tools, whose aim is to create a realistic environment for the practical training of crisis staff units. The second part of the paper concerns the possibilities of using simulation technology in the education process; its aim is to present the practical capabilities and potential of simulation programs for the practical training of crisis management staff.

Keywords: crisis management staff, computer simulation, software, technological education

Procedia PDF Downloads 328
4747 Preliminary Study on the Factors Affecting Safety Parameters of (Th, U)O₂ Fuel Cycle: The Basis for Choosing Three Fissile Enrichment Zones

Authors: E. H. Uguru, S. F. A. Sani, M. U. Khandaker, M. H. Rabir

Abstract:

Beginning-of-cycle transient safety parameters are paramount for smooth reactor operation. The enhanced operational safety of the UO₂-fuelled AP1000 reactor, the first to use three fissile enrichment zones, motivated this research for (Th, U)O₂ fuel. This study evaluated the impact of fissile enrichment, soluble boron, and gadolinia on the transient safety parameters in order to determine the basis for choosing the three fissile enrichment zones. A fuel assembly and a core model of the Westinghouse small modular reactor were investigated using different fuel and reactivity control arrangements. The Monte Carlo N-Particle eXtended (MCNPX) code, integrated with the CINDER90 burn-up code, was used for the calculations. The results show that the moderator temperature coefficient of reactivity (MTC) and the fuel temperature coefficient of reactivity (FTC) were negative and decreased with increasing fissile enrichment. Soluble boron significantly decreased the MTC but slightly increased the FTC, while gadolinia followed the same trend with a minor impact. The MTC and FTC both decreased significantly with an increasing temperature change. These results provide foundational guidance on the factors that must be considered in choosing the three fissile enrichment zones for (Th, U)O₂ fuel, in anticipation of their impact on the safety parameters.
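
Temperature coefficients of this kind are typically obtained from pairs of branch calculations: two k-eff values at two temperatures converted into a reactivity change per kelvin. The k-eff values below are invented, not the study's MCNPX results.

```python
# Converting two branch k-eff results (e.g. from MCNPX runs at two moderator
# temperatures) into a temperature coefficient of reactivity in pcm/K,
# using rho = (k - 1)/k. The k-eff values are invented placeholders.
k1, T1 = 1.00520, 560.0    # reference moderator temperature (K)
k2, T2 = 1.00310, 580.0    # perturbed temperature

alpha = (k2 - k1) / (k1 * k2 * (T2 - T1)) * 1e5   # pcm/K
print(f"MTC = {alpha:.2f} pcm/K")                 # negative = self-stabilizing
```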

Keywords: reactivity, safety parameters, small modular reactor, soluble boron, thorium fuel cycle

Procedia PDF Downloads 111
4746 Analyzing the Impact of Migration on HIV and AIDS Incidence Cases in Malaysia

Authors: Ofosuhene O. Apenteng, Noor Azina Ismail

Abstract:

The human immunodeficiency virus (HIV), which causes acquired immune deficiency syndrome (AIDS), remains a global cause of morbidity and mortality and has caused panic since its emergence. The relationship between migration and HIV/AIDS has become complex. In the absence of prospectively designed studies, dynamic mathematical models that take migration movement into account can provide very useful information. We have explored the utility of mathematical models in understanding the transmission dynamics of HIV and AIDS and in assessing the magnitude of migration's impact on the disease. The model was calibrated to HIV and AIDS incidence data from the Malaysian Ministry of Health for the period 1986 to 2011 using Bayesian analysis combined with a Markov chain Monte Carlo (MCMC) approach to estimate the model parameters. From the estimated parameters, the basic reproduction number was estimated to be 22.5812. The rate at which susceptible individuals move into the HIV compartment has the highest sensitivity value, more influential than the remaining parameters; thus, the disease remains unstable. This is a big concern and a poor indicator from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium. These results suggest that the government, as a policy maker, should make further efforts to curb illegal activities performed by migrants. Our model is shown to reflect the dynamic behavior of the HIV/AIDS epidemic in Malaysia reasonably well and could eventually be used strategically in other countries.
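
A minimal compartment model with migration, of the kind calibrated here, can be written as a small ODE system. The structure and all parameter values below are invented placeholders (not the calibrated Malaysian estimates), and the R0 formula applies to this toy model in its migration-free limit.

```python
import numpy as np
from scipy.integrate import odeint

# Toy susceptible-HIV-AIDS model with constant migrant inflows into the
# susceptible and HIV classes. Parameters are invented placeholders.
beta, sigma, mu, d = 0.35, 0.08, 0.014, 0.10   # transmission, progression,
m_s, m_i = 2_000.0, 300.0                      # natural death, AIDS death;
                                               # migrant inflows per year
def rhs(y, t):
    S, I, A = y
    N = S + I + A
    dS = m_s - beta * S * I / N - mu * S
    dI = m_i + beta * S * I / N - (sigma + mu) * I
    dA = sigma * I - (mu + d) * A
    return [dS, dI, dA]

t = np.linspace(0.0, 25.0, 251)                # years, e.g. 1986-2011
sol = odeint(rhs, [1.0e6, 500.0, 50.0], t)
R0 = beta / (sigma + mu)                       # migration-free limit
print(f"R0 = {R0:.2f}, HIV cases after 25 years = {sol[-1, 1]:.0f}")
```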

Keywords: epidemic model, reproduction number, HIV, MCMC, parameter estimation

Procedia PDF Downloads 347
4745 Planning of Construction Material Flow Using Hybrid Simulation Modeling

Authors: A. M. Naraghi, V. Gonzalez, M. O'Sullivan, C. G. Walker, M. Poshdar, F. Ying, M. Abdelmegid

Abstract:

Discrete Event Simulation (DES) and Agent-Based Simulation (ABS) are two simulation approaches that have been proposed to support decision-making in the construction industry. Despite the wide use of these simulation approaches in the construction field, their application to production and material planning is still limited. This is largely due to the dynamic and complex nature of construction material supply chain systems. Moreover, managing the flow of construction material is not well integrated with site logistics in traditional construction planning methods. This paper presents a hybrid of DES and ABS to simulate on-site and off-site material supply processes. DES is applied to determine the best production scenarios using information about on-site production systems, while ABS is used to optimize the supply chain network. A case study of a construction piling project in New Zealand is presented, illustrating the potential benefits of using the proposed hybrid simulation model in construction material flow planning. The hybrid model can be used to evaluate the impact of different decisions on construction supply chain management.

Keywords: construction supply-chain management, simulation modeling, decision-support tools, hybrid simulation

Procedia PDF Downloads 182
4744 Study on Beta-Ray Detection System in Water Using a MCNP Simulation

Authors: Ki Hyun Park, Hye Min Park, Jeong Ho Kim, Chan Jong Park, Koan Sik Joo

Abstract:

In the modern day, the use of radioactive substances is on the rise in areas like chemical weaponry, industrial applications, and power plants. Although various technologies are available to detect and monitor radioactive substances in the air, technologies to detect radioactive substances underwater are scarce. In this study, a computer simulation of an underwater detection system measuring beta rays has been performed with MCNP. CaF₂, YAP(Ce), and YAG(Ce) were used as scintillators in the simulation to detect the beta rays, and the sources used were Sr-90 and Y-90, both of which emit pure beta rays. The distance between the source and the detector was varied from 1 mm to 10 mm in 1 mm steps. The results indicated that Sr-90 could not be measured beyond 1 mm, since its emission energy is low, while Y-90 could be measured up to 10 mm underwater. In addition, the detector designed with CaF₂ had the highest efficiency among the three scintillators used in the simulation. Since the detectable range and the detection efficiency of a given design could be verified through MCNP simulation, these results are expected to reduce the time and cost of building an actual beta-ray detector and evaluating its performance, thereby contributing to research and development.

Keywords: beta-ray, CaF₂, detector, MCNP simulation, scintillator

Procedia PDF Downloads 483
4743 Recovery of an Area Degraded by Gullies in the Municipality of Monte Alto (SP), Brazil

Authors: Layane Sara Vieira, Paulo Egidio Bernardo, Roberto Saverio Souza Costa

Abstract:

Anthropogenic occupation and agricultural exploitation without concern for preservation and sustainability result in soil degradation that can make rural activity unviable. The objective of this work was to characterize, and to evaluate the recovery costs of, an area degraded by a major erosion gully in the municipality of Monte Alto (SP), Brazil. Topographic characterization was carried out by means of a planialtimetric survey with a total station. The contours of the gully, its internal area, slope height, contributing area, volume, and the costs of the operations required to recover it were determined. The results showed that the gully has a length of 145.56 m, a maximum width of 36.61 m, and a height difference of 19.48 m. The external area of the gully is 1,039.8741 m², and the internal area is 119.3470 m². The calculated volume was 3,282.63 m³. The intervention area for regrading the slopes was measured at 8,471.29 m², requiring the construction of 19 terraces in this area, vertically spaced at 2.8 m. The estimated cost of mechanical recovery of the gully was R$ 19,167.84 (US$ 3,657.98).

Keywords: erosion, volumetric assessment, soil degradation, terraces

Procedia PDF Downloads 80
4742 Building a Stochastic Simulation Model for Blue Crab Population Evolution in Antinioti Lagoon

Authors: Nikolaos Simantiris, Markos Avlonitis

Abstract:

This work builds a simulation platform that models the spatial diffusion of the invasive species Callinectes sapidus (blue crab) as a random walk, incorporating generation, fatality, and fishing rates to model the time evolution of its population. The Antinioti lagoon in western Greece was used as a testbed for the simulation model. Field measurements from June 2020 to June 2021 of the lagoon's setting, bathymetry, and blue crab juveniles provided the initial blue crab population for the simulation, and biological parameters from the current literature were used to calibrate the simulation parameters. The aim of this study is to enable prediction of the evolution of the blue crab population in confined environments of the Ionian Islands region in western Greece. The first results of the simulation experiments show that a robust prediction of blue crab population evolution in the Antinioti lagoon is possible.
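
The combination of random-walk dispersal with demographic rates can be sketched as follows. The grid size, initial population, and per-step birth, death, and fishing rates are illustrative, not the Antinioti calibration.

```python
import numpy as np

rng = np.random.default_rng(10)

# Blue crab sketch: individuals random-walk on a lagoon grid and are subject
# to per-step reproduction, natural mortality and fishing. Values invented.
NX, NY = 50, 30                              # lagoon grid (cells)
BIRTH, DEATH, FISH = 0.012, 0.008, 0.003     # per individual per step
pos = rng.integers(0, [NX, NY], (500, 2))    # initial juveniles

history = []
for step in range(365):
    # random walk: each crab moves at most one cell in each direction
    pos += rng.integers(-1, 2, pos.shape)
    pos[:, 0] = np.clip(pos[:, 0], 0, NX - 1)    # reflective lagoon boundary
    pos[:, 1] = np.clip(pos[:, 1], 0, NY - 1)

    survive = rng.uniform(size=len(pos)) > (DEATH + FISH)   # fatality + fishing
    pos = pos[survive]
    if len(pos) == 0:
        break
    n_born = rng.binomial(len(pos), BIRTH)       # offspring appear near parents
    if n_born:
        parents = pos[rng.integers(0, len(pos), n_born)]
        pos = np.vstack([pos, parents])
    history.append(len(pos))

print(f"population after one year: {history[-1]}")
```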

Keywords: antinioti lagoon, blue crab, stochastic simulation, random walk

Procedia PDF Downloads 196
4741 Evaluation of Progressive Collapse of Transmission Tower

Authors: Jeong-Hwan Choi, Hyo-Sang Park, Tae-Hyung Lee

Abstract:

The transmission tower is one of the crucial lifeline structures in modern society, and it needs to be protected against extreme loading conditions. However, the transmission tower is a very complex structure, and it is therefore very difficult to simulate its actual damage and collapse behavior. In this study, the collapse behavior of a transmission tower under lateral loading conditions, such as wind load, is evaluated through computational simulation with a progressive collapse procedure: after running the simulation, if a member of the tower structure fails, the failed member is removed and the simulation is run again. A 154 kV transmission tower is selected for this study. The simulation is performed by a nonlinear static analysis procedure, namely pushover analysis, using OpenSees, an earthquake simulation platform, and three-dimensional finite element models of the towers are developed.
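
The iterative member-removal logic can be illustrated with a toy model in which members share a lateral load equally and any overloaded member is removed before the "analysis" is repeated. This equal-sharing rule is a deliberate simplification standing in for the OpenSees pushover run; the member count and capacity distribution are invented.

```python
import numpy as np

rng = np.random.default_rng(12)

# Toy progressive collapse loop: n members share a lateral load; any member
# whose demand exceeds its capacity is removed and the analysis repeats on
# the damaged structure. The inner evaluation here (equal load sharing)
# stands in for a full OpenSees pushover analysis.
capacity = rng.normal(120.0, 15.0, 40)            # member capacities (kN)

def collapse_load():
    alive = np.ones(capacity.size, dtype=bool)
    for load in np.arange(500.0, 5001.0, 100.0):  # increasing lateral load
        while True:                               # progressive collapse loop
            demand = load / alive.sum()           # re-run the "analysis"
            failed = alive & (capacity < demand)  # capacity check
            if not failed.any():
                break                             # stable at this load step
            alive &= ~failed                      # remove failed members
            if not alive.any():
                return load                       # total collapse
    return np.inf

print(f"simulated collapse load = {collapse_load():.0f} kN")
```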

Keywords: transmission tower, OpenSEES, pushover, progressive collapse

Procedia PDF Downloads 331
4740 Predicting Polyethylene Processing Properties Based on Reaction Conditions via a Coupled Kinetic, Stochastic and Rheological Modelling Approach

Authors: Kristina Pflug, Markus Busch

Abstract:

Being able to predict polymer properties and processing behavior from the applied reaction conditions is one of the key challenges in modern polymer reaction engineering. Especially for cost-intensive processes with high safety requirements, such as the high-pressure polymerization of low-density polyethylene (LDPE), the need for simulation-based process optimization and product design is high. A multi-scale modelling approach was set up and validated against a series of high-pressure mini-plant autoclave reactor experiments. The approach starts with numerical modelling of the complex reaction network of LDPE polymerization under the actual reaction conditions. While this yields average product properties, the complex polymeric microstructure, including random short- and long-chain branching, is calculated via a hybrid Monte Carlo approach. Finally, the processing behavior of LDPE, i.e., its melt flow behavior, is determined as a function of the previously determined polymeric microstructure using the branch-on-branch algorithm for randomly branched polymer systems. All three steps of the multi-scale modelling approach can be independently validated against analytical data. A triple-detector GPC containing an IR, a viscosimetry, and a multi-angle light scattering detector is applied to determine molecular weight distributions as well as chain-length-dependent short- and long-chain branching frequencies. 13C-NMR measurements give average branching frequencies, and rheological measurements in shear and extension serve to characterize the polymeric flow behavior. The agreement between experimental and modelled results was found to be extraordinary, especially considering that the applied multi-scale modelling approach involves no parameter fitting to the data. This validates the suggested approach and proves its universality at the same time. In a next step, the modelling approach can be applied to other reactor types, such as tubular reactors, or to industrial scale. Moreover, sensitivity analyses for systematically varied process conditions are easily feasible. The developed multi-scale modelling approach ultimately offers the opportunity to predict and design LDPE processing behavior simply from process conditions such as feed streams and inlet temperatures and pressures.

Keywords: low-density polyethylene, multi-scale modelling, polymer properties, reaction engineering, rheology

Procedia PDF Downloads 108
4739 Probabilistic Damage Tolerance Methodology for Solid Fan Blades and Discs

Authors: Andrej Golowin, Viktor Denk, Axel Riepe

Abstract:

Solid fan blades and discs in aero engines are subjected to high combined low- and high-cycle fatigue loads, especially around the contact areas between blade and disc. Therefore, special coatings (e.g., dry film lubricant) and surface treatments (e.g., shot peening or laser shock peening) are applied to increase the strength with respect to combined cyclic fatigue and fretting fatigue, but also to improve the damage tolerance capability. The traditional deterministic damage tolerance assessment based on fracture mechanics analysis, which treats service damage as an initial crack, often gives overly conservative results, especially in the presence of vibratory stresses. A probabilistic damage tolerance methodology using crack initiation data has been developed for fan discs exposed to relatively high vibratory stresses in cross- and tail-wind conditions at certain resonance speeds for limited time periods. This Monte Carlo-based method uses a damage databank from similar designs, vibration levels measured during typical aircraft operations and wind conditions, and experimental crack initiation data derived from testing artificially damaged specimens with representative surface treatments under combined fatigue conditions. The proposed methodology leads to a more realistic prediction of the minimum damage tolerance life for the most critical locations, applicable to modern fan disc designs.
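
The Monte Carlo core of such a methodology can be sketched as follows: sample an initial damage size and a vibratory stress range, grow the crack with a Paris law to a critical length, and read a conservative life off the low quantile of the resulting distribution. All constants (Paris parameters, geometry factor, critical length, distributions) are illustrative, not the engine-programme values.

```python
import numpy as np

rng = np.random.default_rng(14)

# Probabilistic damage tolerance sketch: sampled initial damage + Paris-law
# crack growth to a critical length. All constants are illustrative.
C, m_exp, Y = 1.0e-12, 3.0, 1.1       # Paris constants, dK in MPa*sqrt(m)
A_CRIT = 5.0e-3                       # critical crack length (m)
BLOCK = 1.0e4                         # growth integrated in 10^4-cycle blocks
n_mc = 5000

a0 = rng.lognormal(np.log(5.0e-5), 0.4, n_mc)   # initial damage size (m)
dsig = rng.normal(250.0, 30.0, n_mc)            # vibratory stress range (MPa)

lives = np.empty(n_mc)
for i in range(n_mc):
    a, n = a0[i], 0.0
    while a < A_CRIT:
        dK = Y * dsig[i] * np.sqrt(np.pi * a)   # stress intensity range
        a += C * dK ** m_exp * BLOCK            # Paris law: da/dN = C*dK^m
        n += BLOCK
    lives[i] = n

print(f"1% quantile of damage tolerance life = "
      f"{np.quantile(lives, 0.01):.0f} cycles")
```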

Keywords: combined fatigue, damage tolerance, engine, surface treatment

Procedia PDF Downloads 459