Search results for: Carlo Gadingan
258 Comparison of Water Equivalent Ratio of Several Dosimetric Materials in Proton Therapy Using Monte Carlo Simulations and Experimental Data
Authors: M. R. Akbari, H. Yousefnia, E. Mirrezaei
Abstract:
Range uncertainties of protons are currently a topic of interest in proton therapy. Two of the parameters often used to specify proton range are water equivalent thickness (WET) and water equivalent ratio (WER). Since WER values for a specific material are nearly constant across proton energies, WER is the more useful parameter for comparison. In this study, WER values were calculated for different proton energies in polymethyl methacrylate (PMMA), polystyrene (PS) and aluminum (Al) using the FLUKA and TRIM codes. The results were compared with analytical, experimental and simulated SEICS code data obtained from the literature. In the FLUKA simulation, a cylindrical phantom, 1000 mm in height and 300 mm in diameter, filled with the studied materials was simulated. A typical mono-energetic proton pencil beam, over the wide range of incident energies usually applied in proton therapy (50 MeV to 225 MeV), impinges normally on the phantom. In order to obtain the WER values for the considered materials, cylindrical detectors, 1 mm in height and 20 mm in diameter, were also simulated along the beam trajectory in the phantom. In the TRIM calculations, the projectile type, energy and angle of incidence, target material and target thickness must be defined. The mode of 'detailed calculation with full damage cascades' was selected for proton transport in the target material. The biggest difference in WER values between the codes was 3.19%, 1.9% and 0.67% for Al, PMMA and PS, respectively. In Al and PMMA, the biggest differences between each code and experimental data were 1.08%, 1.26%, 2.55%, 0.94%, 0.77% and 0.95% for SEICS, FLUKA and SRIM, respectively. FLUKA and SEICS showed the greatest agreement (≤0.77% difference in PMMA and ≤1.08% difference in Al, respectively) with the available experimental data in this study. It is concluded that the FLUKA and TRIM codes are capable of simulating Bragg curves and calculating WER values in the studied materials.
They can also predict the Bragg peak location and range of proton beams with acceptable accuracy.
Keywords: water equivalent ratio, dosimetric materials, proton therapy, Monte Carlo simulations
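The WER and WET relationship described above can be sketched in a few lines. The range values below are hypothetical placeholders for illustration, not the paper's simulated or measured data:

```python
def water_equivalent_ratio(range_water_mm, range_material_mm):
    """WER = CSDA range of the proton beam in water divided by its
    range in the material, taken at the same incident energy."""
    return range_water_mm / range_material_mm

def water_equivalent_thickness(thickness_mm, wer):
    """WET converts a physical slab thickness to its water equivalent."""
    return thickness_mm * wer

# Hypothetical ranges (mm) for a single proton energy:
wer_pmma = water_equivalent_ratio(157.7, 136.2)
wet_slab = water_equivalent_thickness(10.0, wer_pmma)
```

Because WER is nearly energy-independent for a given material, a single value such as `wer_pmma` can convert any slab thickness to water-equivalent depth.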
Procedia PDF Downloads 326
257 Measurement and Simulation of Axial Neutron Flux Distribution in Dry Tube of KAMINI Reactor
Authors: Manish Chand, Subhrojit Bagchi, R. Kumar
Abstract:
A new dry tube (DT) has been installed in the tank of the KAMINI research reactor, Kalpakkam, India. This tube will be used for neutron activation analysis of small to large samples and for testing neutron detectors. The DT is 375 cm in height and 7.5 cm in diameter, located 35 cm away from the core centre. The experimental thermal flux at various axial positions inside the tube has been measured by irradiating the flux monitor (¹⁹⁷Au) at 20 kW reactor power. The measured activity of ¹⁹⁸Au and the thermal cross section of the ¹⁹⁷Au(n,γ)¹⁹⁸Au reaction were used for the experimental thermal flux measurement. The flux inside the tube varies from 10⁹ to 10¹⁰ n cm⁻²s⁻¹, and the maximum flux was (1.02 ± 0.023) x10¹⁰ n cm⁻²s⁻¹ at 36 cm from the bottom of the tube. Au and Zr foils, without and with a cadmium cover of 1 mm thickness, were irradiated at the maximum flux position in the DT to determine irradiation-specific input parameters such as the sub-cadmium to epithermal neutron flux ratio (f) and the epithermal neutron flux shape factor (α). The f value was 143 ± 5, indicating about a 99.3% thermal neutron component, and the α value was -0.2886 ± 0.0125, indicating a hard epithermal neutron spectrum due to insufficient moderation. The measured flux profile has been validated against a theoretical model of the KAMINI reactor using the Monte Carlo N-Particle code (MCNP). In MCNP, the complex geometry of the entire reactor is modelled in 3D, ensuring minimum approximations for all the components. Continuous-energy cross-section data from ENDF/B-VII.1 as well as S(α, β) thermal neutron scattering functions are considered. The neutron flux has been estimated at the corresponding axial locations of the DT using a mesh tally. The thermal flux obtained from the experiment shows good agreement with the values predicted by MCNP, within ± 10%. It can be concluded that this MCNP model can be utilized for calculating other important parameters like neutron spectra, dose rate, etc.,
and multi-elemental analysis can be carried out by irradiating samples at the maximum flux position using the measured f and α parameters with k₀-NAA standardization.
Keywords: neutron flux, neutron activation analysis, neutron flux shape factor, MCNP, Monte Carlo N-Particle Code
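The foil-activation relation behind such a flux measurement can be sketched as below; only the saturation correction is shown, and every numerical input is an illustrative assumption rather than a value from the experiment:

```python
import math

def thermal_flux(a0_bq, n_atoms, sigma_cm2, t_irr_s, half_life_s):
    """Thermal flux from foil activation: phi = A0 / (N * sigma * S),
    where A0 is the 198Au activity at the end of irradiation and
    S = 1 - exp(-lambda * t_irr) is the saturation factor."""
    lam = math.log(2) / half_life_s
    saturation = 1 - math.exp(-lam * t_irr_s)
    return a0_bq / (n_atoms * sigma_cm2 * saturation)

# Illustrative inputs only (198Au half-life ~2.7 d; 197Au thermal
# cross section ~98.65 b = 98.65e-24 cm^2):
phi = thermal_flux(a0_bq=5e4, n_atoms=3e18, sigma_cm2=98.65e-24,
                   t_irr_s=3600, half_life_s=2.7 * 86400)
```

In practice, further decay and counting corrections, detector efficiency, and gamma emission probability would also enter the denominator.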
Procedia PDF Downloads 164
256 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches
Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez
Abstract:
Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. Simulation methods are computationally intensive and time consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics could be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, fitting of the resulting moments of the LSF to a probability density function (PDF) is needed. In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of approximating the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first-order reliability method, FORM).
The results in the present study are in good agreement with those computed with the MCS. Therefore, mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or when numerical schemes, the FEM or computational mechanics are employed.
Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, monte carlo simulation
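The PEM idea described above can be sketched with Rosenblueth's classic two-point scheme; the linear limit state and its moments below are hypothetical, and the normal fit stands in for the "well-known distribution" the abstract mentions:

```python
import itertools
import math
from statistics import NormalDist

def pem_reliability(g, means, stds):
    """Rosenblueth's 2**n point estimate method: evaluate the limit
    state function g at every (mu - sigma, mu + sigma) corner with
    equal weights, estimate the first two moments of g, and map them
    to a reliability index beta and a failure probability by fitting
    a normal PDF."""
    corners = itertools.product((-1.0, 1.0), repeat=len(means))
    vals = [g(*[m + sgn * s for m, s, sgn in zip(means, stds, signs)])
            for signs in corners]
    mean_g = sum(vals) / len(vals)
    var_g = sum((v - mean_g) ** 2 for v in vals) / len(vals)
    beta = mean_g / math.sqrt(var_g)
    return beta, NormalDist().cdf(-beta)

# Hypothetical linear limit state: resistance R minus load S
beta, pf = pem_reliability(lambda r, s: r - s,
                           means=[10.0, 6.0], stds=[1.0, 1.0])
```

For this linear case the result coincides with FORM, which is what makes mixing the two methods attractive: `g` could equally be a call into a finite element model with no closed form.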
Procedia PDF Downloads 346
255 Consideration of Uncertainty in Engineering
Authors: A. Mohammadi, M. Moghimi, S. Mohammadi
Abstract:
Engineers need computational methods that provide solutions less sensitive to environmental effects, so techniques should be used that take uncertainty into account to control and minimize the risk associated with design and operation. In order to consider uncertainty in an engineering problem, the optimization problem should be solved over a suitable range of each uncertain input variable instead of just one estimated point. With a deterministic optimization problem, a large computational burden is required to consider every possible and probable combination of uncertain input variables. Several methods have been reported in the literature to deal with problems under uncertainty. In this paper, different methods are presented and analyzed.
Keywords: uncertainty, Monte Carlo simulation, stochastic programming, scenario method
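The contrast between a single point estimate and a sampled range of inputs can be sketched as follows; the objective function and input distributions are hypothetical illustrations:

```python
import random
import statistics

def robust_objective(f, samplers, n=10000, seed=1):
    """Evaluate an objective over sampled uncertain inputs instead of
    a single point estimate; returns mean and std of the outcome."""
    rng = random.Random(seed)
    outcomes = [f(*[draw(rng) for draw in samplers]) for _ in range(n)]
    return statistics.mean(outcomes), statistics.stdev(outcomes)

# Hypothetical objective with two uncertain inputs: one Gaussian,
# one uniform.  The std quantifies the solution's sensitivity.
mean_cost, std_cost = robust_objective(
    lambda a, b: a ** 2 + b,
    [lambda r: r.gauss(2.0, 0.1), lambda r: r.uniform(0.0, 1.0)])
```

A deterministic design would only see `f(2.0, 0.5) = 4.5`; the sampled version additionally reports the spread a decision-maker should plan for.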
Procedia PDF Downloads 417
254 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet
Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel
Abstract:
Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of the imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution to sustainable sanitation with the development of an innovative toilet system, called Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. The particular technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in SimaPro software employing the Ecoinvent v3.3 database. The particular study has determined the most contributory factors to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain critical operating parameters for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash and transportation of fertilizer. The given analysis has provided the distribution and the confidence intervals of the selected impact categories and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of NMT system. 
Last but not least, the specific study will also yield essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
Keywords: sanitation systems, nano-membrane toilet, LCA, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network
Procedia PDF Downloads 226
253 Risk Measure from Investment in Finance by Value at Risk
Authors: Mohammed El-Arbi Khalfallah, Mohamed Lakhdar Hadji
Abstract:
Managing and controlling risk is a key research topic in the world of finance. When facing a risky situation, stakeholders need to compare positions and actions, and financial institutions must take measures of particular market and credit risks. In this work, we study a model of risk measure in finance: Value at Risk (VaR), a tool for measuring an entity's risk exposure. We explain the concept of value at risk and its average and tail variants, and describe the three methods for computing it: the parametric method, the historical method, and the Monte Carlo numerical method. Finally, we briefly describe the advantages and disadvantages of these three methods for computing value at risk.
Keywords: average value at risk, conditional value at risk, tail value at risk, value at risk
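The three computation methods can be sketched as below; the confidence level, the normal return model, and all numbers are illustrative assumptions, not prescriptions from the paper:

```python
import random
import statistics
from statistics import NormalDist

def var_historical(returns, alpha=0.95):
    """Historical VaR: the empirical alpha-quantile of past losses."""
    losses = sorted(-r for r in returns)
    return losses[int(alpha * len(losses))]

def var_parametric(returns, alpha=0.95):
    """Parametric (variance-covariance) VaR under a normality assumption."""
    mu, sigma = statistics.mean(returns), statistics.stdev(returns)
    return -(mu + sigma * NormalDist().inv_cdf(1 - alpha))

def var_monte_carlo(mu, sigma, alpha=0.95, n=100000, seed=7):
    """Monte Carlo VaR: simulate losses from an assumed return model."""
    rng = random.Random(seed)
    losses = sorted(-rng.gauss(mu, sigma) for _ in range(n))
    return losses[int(alpha * n)]
```

With the same normal model, the parametric and Monte Carlo answers converge; the historical method instead needs no distributional assumption but is limited to the observed sample.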
Procedia PDF Downloads 443
252 On Coverage Probability of Confidence Intervals for the Normal Mean with Known Coefficient of Variation
Authors: Suparat Niwitpong, Sa-aat Niwitpong
Abstract:
Statistical inference for a normal mean with known coefficient of variation has been investigated recently. This situation occurs commonly in environmental and agricultural experiments, where the scientist knows the coefficient of variation of their experiments. In this paper, we constructed new confidence intervals for the normal population mean with known coefficient of variation. We also derived analytic expressions for the coverage probability of each confidence interval. To confirm our theoretical results, Monte Carlo simulation was used to assess the performance of these intervals based on their coverage probabilities.
Keywords: confidence interval, coverage probability, expected length, known coefficient of variation
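A Monte Carlo check of coverage probability, of the kind described above, can be sketched for the textbook z-interval (known sigma rather than known coefficient of variation, as a simplifying assumption):

```python
import math
import random

def coverage_probability(mu, sigma, n, z=1.96, trials=20000, seed=3):
    """Monte Carlo estimate of the coverage probability of the usual
    z-interval for a normal mean when sigma is known; z=1.96 gives a
    nominal 95% interval."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xbar = sum(rng.gauss(mu, sigma) for _ in range(n)) / n
        half = z * sigma / math.sqrt(n)
        hits += (xbar - half) <= mu <= (xbar + half)
    return hits / trials
```

The same loop applies to the paper's intervals: replace the interval construction with one that uses the known coefficient of variation and compare the empirical coverage to the nominal level.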
Procedia PDF Downloads 396
251 A Risk-Based Approach to Construction Management
Authors: Chloe E. Edwards, Yasaman Shahtaheri
Abstract:
Risk management plays a fundamental role in project planning and delivery. The purpose of incorporating risk management into project management practices is to identify and address uncertainties related to key project-related activities. The uncertainties, known as risk events, can relate to project deliverables that are quantifiable and are often measured by impact to project schedule, cost, or environmental impact. Risk management should be incorporated as an iterative practice throughout the planning, execution, and commissioning phases of a project. This paper specifically examines how risk management contributes to effective project planning and delivery through a case study of a transportation project. This case study focused solely on impacts to project schedule regarding three milestones: readiness for delivery, readiness for testing and commissioning, and completion of the facility. The case study followed the ISO 31000: Risk Management – Guidelines. The key factors that are outlined by these guidelines include understanding the scope and context of the project, conducting a risk assessment including identification, analysis, and evaluation, and lastly, risk treatment through mitigation measures. This process requires continuous consultation with subject matter experts and monitoring to iteratively update the risks accordingly. The risk identification process led to a total of fourteen risks related to design, permitting, construction, and commissioning. The analysis involved running 1,000 Monte Carlo simulations through @RISK 8.0 Industrial software to determine potential milestone completion dates based on the project baseline schedule. These dates include the best case, most likely case, and worst case to provide an estimated delay for each milestone. Evaluation of these results provided insight into which risks were the highest contributors to the projected milestone completion dates. 
Based on the analysis results, the risk management team was able to provide recommendations for mitigation measures to reduce the likelihood of risks occurring. The team also recommended approaches for managing the identified risks and project activities moving forward to meet the most likely or best-case milestone completion dates.
Keywords: construction management, monte carlo simulation, project delivery, risk assessment, transportation engineering
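A schedule-risk simulation of the kind run in @RISK can be sketched in plain Python; the risk register, triangular delay parameters, and baseline below are hypothetical, not the case study's data:

```python
import random

def milestone_dates(risks, baseline_days, n=10000, seed=11):
    """Monte Carlo schedule risk: each risk is a tuple
    (occurrence probability, min, mode, max) of delay days.
    Returns best-case (5th pct), most-likely (median) and
    worst-case (95th pct) completion totals in days."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        delay = sum(rng.triangular(lo, hi, mode)
                    for p, lo, mode, hi in risks if rng.random() < p)
        totals.append(baseline_days + delay)
    totals.sort()
    return totals[int(0.05 * n)], totals[n // 2], totals[int(0.95 * n)]

# Hypothetical risk register: (probability, min, mode, max delay in days)
best, likely, worst = milestone_dates(
    [(0.3, 5, 10, 20), (0.5, 2, 4, 8), (0.1, 30, 45, 60)],
    baseline_days=365)
```

Sorting the simulated totals also makes it easy to read off which risks drive the tail, by re-running with individual risks disabled.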
Procedia PDF Downloads 108
250 A One Dimensional Particle in Cell Model for Excimer Lamps
Authors: W. Benstaali, A. Belasri
Abstract:
In this work, we study a planar lamp filled with a neon-xenon gas mixture. We use a one-dimensional particle-in-cell model with Monte Carlo collisions (PIC-MCC) to investigate the effect of xenon concentration on the energy deposited in excitation, ionization and ions. A Xe-Ne discharge is studied at a gas pressure of 400 torr. The results show an efficient Xe20-Ne mixture with an applied voltage of 1.2 kV; the xenon excitation energy represents 65% of the total energy dissipated in the discharge. We have also studied the electrical properties and the energy balance of a Xe50-Ne discharge, which requires a voltage of 2 kV; the xenon energy is then higher.
Keywords: dielectric barrier discharge, efficiency, excitation, lamps
Procedia PDF Downloads 168
249 Finite Sample Inferences for Weak Instrument Models
Authors: Gubhinder Kundhi, Paul Rilstone
Abstract:
It is well established that Instrumental Variable (IV) estimators in the presence of weak instruments can be poorly behaved; in particular, they can be quite biased in finite samples. Finite sample approximations to the distributions of these estimators are obtained using Edgeworth and saddlepoint expansions. Departures from normality of the distributions of these estimators are analyzed using higher-order analytical corrections in these expansions. In a Monte Carlo experiment, the performance of these expansions is compared to the first-order approximation and other methods commonly used in finite samples, such as the bootstrap.
Keywords: bootstrap, Instrumental Variable, Edgeworth expansions, Saddlepoint expansions
Procedia PDF Downloads 311
248 Application Reliability Method for the Analysis of the Stability Limit States of Large Concrete Dams
Authors: Mustapha Kamel Mihoubi, Essadik Kerkar, Abdelhamid Hebbouche
Abstract:
Given the randomness of most of the factors affecting the stability of a gravity dam, probability theory is generally used to assess the risk of failure; since there is no sharp logical transition from the stable state to the failed state, the stability failure process is considered a probabilistic event. Controlling the risk of failure is of capital importance and rests on a cross analysis of the severity of the consequences and the probability of occurrence of identified major accidents, which can pose a significant risk to concrete dam structures. Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of such works, including when calculating the stability of large structures exposed to a major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods, including methods used in engineering. In our case, level II methods are applied via a limit-state study. Hence, the probability of failure is estimated by analytical methods of the FORM (First Order Reliability Method) and SORM (Second Order Reliability Method) type. By way of comparison, a level III method was also used, which generates a full analysis of the problem by integrating the probability density function of the random variables over the safety domain using Monte Carlo simulations.
Taking into account the change in stress under the normal, exceptional and extreme load combinations acting on the dam, the calculation results provided acceptable failure probability values that largely corroborate the theory; in fact, the probability of failure tends to increase with increasing load intensity, causing a significant decrease in strength, especially in the presence of unique and extreme load combinations. Shear forces then induce sliding that threatens the reliability of the structure through intolerable failure probability values, especially in the case of increased uplift under a hypothetical failure of the drainage system.
Keywords: dam, failure, limit state, Monte Carlo, reliability, probability, sliding, Taylor
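The level II versus level III comparison can be sketched for the simplest possible case, a linear safety margin with independent normal variables; the resistance and load moments below are hypothetical, not the dam's actual load cases:

```python
import math
import random
from statistics import NormalDist

def form_linear(mu_r, sd_r, mu_s, sd_s):
    """FORM for the linear limit state g = R - S with independent
    normal R (resistance) and S (load); exact in this special case."""
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
    return beta, NormalDist().cdf(-beta)

def mc_failure_prob(mu_r, sd_r, mu_s, sd_s, n=200000, seed=5):
    """Level III check: direct Monte Carlo integration of P(R - S < 0)."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0
                for _ in range(n))
    return fails / n

# Hypothetical sliding safety margin for one load combination
beta, pf_form = form_linear(mu_r=12.0, sd_r=1.5, mu_s=8.0, sd_s=1.0)
pf_mc = mc_failure_prob(12.0, 1.5, 8.0, 1.0)
```

For nonlinear limit states or non-normal variables the two answers diverge, which is exactly why the abstract cross-checks FORM/SORM against Monte Carlo.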
Procedia PDF Downloads 319
247 A Saturation Attack Simulation on a Navy Warship Based on Discrete-Event Simulation Models
Authors: Yawei Liang
Abstract:
The threat from cruise missiles is among the most dangerous a warship faces in the modern era: anti-ship cruise missiles are fast, accurate, and extremely destructive. In this paper, the goal was to use an object-oriented environment to program a simulation modelling a scenario in which a lone frigate is attacked by a wave of missiles fired at given intervals. The parameters of the simulation are modified to examine the relationships between different variables in the situation, and an analysis is performed on various aspects of the defending ship’s equipment. Finally, the results are presented, along with a brief discussion.
Keywords: discrete event simulation, Monte Carlo simulation, naval resource management, weapon-target allocation/assignment
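The saturation mechanism can be sketched with a deliberately minimal deterministic event model; the single defense channel, the decision to skip missiles it cannot intercept in time, and all timings are simplifying assumptions, not the paper's model:

```python
def saturation_attack(n_missiles, interval, flight_time, engage_time):
    """Minimal discrete-event sketch: one defense channel engages
    missiles in launch order; a missile is intercepted only if its
    engagement finishes before it arrives (flight_time after launch).
    Missiles that cannot be engaged in time leak through."""
    channel_free = 0.0
    leakers = 0
    for i in range(n_missiles):
        launch = i * interval
        arrival = launch + flight_time
        start = max(channel_free, launch)  # wait for detection and a free channel
        finish = start + engage_time
        if finish <= arrival:
            channel_free = finish          # successful intercept
        else:
            leakers += 1                   # channel saturated: missile hits
    return leakers
```

Shrinking the launch interval or lengthening the engagement time quickly produces leakers, which is the saturation effect the study varies its parameters to explore; a Monte Carlo layer would replace the fixed times with random draws.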
Procedia PDF Downloads 95
246 The Generalized Pareto Distribution as a Model for Sequential Order Statistics
Authors: Mahdy Esmailian, Mahdi Doostparast, Ahmad Parsian
Abstract:
In this article, sequential order statistics (SOS) coming from type-II censored samples of the generalized Pareto distribution are considered. Maximum likelihood (ML) estimators of the unknown parameters are derived on the basis of the available multiple SOS data. Necessary conditions for the existence and uniqueness of the derived ML estimates are given. Due to the complexity of the proposed likelihood function, a useful re-parametrization is suggested. For illustrative purposes, a Monte Carlo simulation study is conducted and an illustrative example is analysed.
Keywords: bayesian estimation, generalized pareto distribution, maximum likelihood estimation, sequential order statistics
Procedia PDF Downloads 512
245 Design, Construction and Performance Evaluation of a HPGe Detector Shield
Authors: M. Sharifi, M. Mirzaii, F. Bolourinovin, H. Yousefnia, M. Akbari, K. Yousefi-Mojir
Abstract:
A multilayer passive shield composed of low-activity lead (Pb), copper (Cu), tin (Sn) and iron (Fe) was designed and manufactured for a coaxial HPGe detector placed in a surface laboratory, to reduce background radiation and the radiation dose to personnel. The performance of the shield was evaluated, and efficiency curves of the detector were plotted using various standard sources at different distances. Monte Carlo simulations and a set of TLD chips were used for dose estimation at two distances of 20 and 40 cm. The results show that the shield reduced the background spectrum and the personnel dose by more than 95%.
Keywords: HPGe shield, background count, personnel dose, efficiency curve
Procedia PDF Downloads 457
244 Towards the Design of Gripper Independent of Substrate Surface Structures
Authors: Annika Schmidt, Ausama Hadi Ahmed, Carlo Menon
Abstract:
End effectors for robotic systems are becoming more and more advanced, resulting in a growing variety of gripping tasks. However, most grippers are application specific. This paper presents a gripper that interacts with an object’s surface rather than being dependent on a defined shape or size. For this purpose, ingressive and astrictive features are combined to achieve the desired gripping capabilities. The developed prototype is tested on a variety of surfaces with different hardness and roughness properties. The results show that the gripping mechanism works on all of the tested surfaces. The influence of the material properties on the amount of the supported load is also studied and the efficiency is discussed.
Keywords: claw, dry adhesion, insects, material properties
Procedia PDF Downloads 359
243 Investigating the Effects of Data Transformations on a Bi-Dimensional Chi-Square Test
Authors: Alexandru George Vaduva, Adriana Vlad, Bogdan Badea
Abstract:
In this research, we conduct a Monte Carlo analysis on a two-dimensional χ² test, which is used to determine the minimum distance required for independent sampling in the context of chaotic signals. We investigate the impact of transforming initial data sets from any probability distribution to new signals with a uniform distribution using the Spearman rank correlation on the χ² test. This transformation removes the randomness of the data pairs, and as a result, the observed distribution of χ² test values differs from the expected distribution. We propose a solution to this problem and evaluate it using another chaotic signal.
Keywords: chaotic signals, logistic map, Pearson’s test, Chi Square test, bivariate distribution, statistical independence
Procedia PDF Downloads 99
242 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit
Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic
Abstract:
Over the years, High Purity Germanium (HPGe) detectors proved to be an excellent practical tool and, as such, have established their today's wide use in low background γ-spectrometry. One of the advantages of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required. Thus, with a single measurement, one can simultaneously perform both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature virtually allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra where otherwise they would superimpose within a single-energy peak and, as such, could potentially hamper the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when the identification of radionuclides, as well as their activity concentrations, is being practiced where high precision comes as a necessity. In measurements of this nature, in order to be able to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the used equipment. However, experimental determination of the response, i.e., efficiency curves for a given detector-sample configuration and its geometry, is not always easy and requires a certain set of reference calibration sources in order to account for and cover broader energy ranges of interest. With the goal of overcoming these difficulties, many researchers have turned towards the application of different software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), as it has proven time and time again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector.
Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially with older equipment. Deterioration of these parameters consequently decreases the active volume of the crystal and can thus affect the efficiencies by a large margin if it is not properly taken into account. In this study, the optimisation of two HPGe detectors through the implementation of the Geant4 toolkit developed by CERN is described, with the goal of further improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal’s void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The acquired results for both detectors displayed good agreement with experimental data, falling under an average statistical uncertainty of ∼ 4.6% for the XtRa and ∼ 1.8% for the BEGe detector within the energy ranges of 59.4−1836.1 keV and 59.4−1212.9 keV, respectively.
Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method
Procedia PDF Downloads 121
241 Influence of Wind Induced Fatigue Damage in the Reliability of Wind Turbines
Authors: Emilio A. Berny-Brandt, Sonia E. Ruiz
Abstract:
Steel tubular towers serving as support structures for large wind turbines are subject to several hundred million stress cycles arising from the turbulent nature of the wind. This causes high-cycle fatigue, which can govern tower design. The practice of maintaining the support structure after wind turbines reach their typical 20-year design life has become common, but without quantifying the changes in the reliability of the tower. There are several studies on this topic, but most of them are based on the S-N curve approach using the Miner’s rule damage summation method, the de-facto standard in the wind industry. However, the qualitative nature of Miner’s method makes desirable the use of fracture mechanics to measure the effects of fatigue on the capacity curve of the structure, which is important in order to evaluate the integrity and reliability of these towers. Temporally and spatially varying wind speed time histories are simulated based on power spectral density and coherence functions. The simulations are then applied to a SAP2000 finite element model, and step-by-step analysis is used to obtain the stress time histories for a range of representative wind speeds expected during service conditions of the wind turbine. The rainflow method is then used to obtain cycle and stress range information from each of these time histories, and a statistical analysis is performed to obtain the distribution parameters of each variable. Monte Carlo simulation is used here to evaluate crack growth over time at the tower base using the Paris-Erdogan equation. A nonlinear static pushover analysis is then performed to assess the capacity curve of the structure after a number of years. The capacity curves are then used to evaluate the changes in reliability of a steel tower located in Oaxaca, Mexico, where wind energy facilities are expected to grow in the near future.
Results show that fatigue at the tower base can have significant effects on the structural capacity of the wind turbine, especially after the 20-year design life, when the crack growth curve starts behaving exponentially.
Keywords: crack growth, fatigue, Monte Carlo simulation, structural reliability, wind turbines
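The crack-growth Monte Carlo described above can be sketched with a coarse, year-by-year integration of the Paris-Erdogan law; the material constants, stress-range distribution, and cycle count are all hypothetical placeholders, not the study's fitted values:

```python
import math
import random

def crack_growth_years(a0, a_crit, C, m, delta_sigma, Y=1.0,
                       cycles_per_year=1e7):
    """Coarse forward integration of the Paris-Erdogan law
    da/dN = C * (dK)**m with dK = Y * dsigma * sqrt(pi * a),
    one year of cycles per step; returns the number of years until
    the crack depth reaches a_crit (capped at 100 years)."""
    a, years = a0, 0
    while a < a_crit and years < 100:
        dK = Y * delta_sigma * math.sqrt(math.pi * a)  # stress intensity range
        a += C * dK ** m * cycles_per_year             # growth over one year
        years += 1
    return years

# Monte Carlo over an uncertain stress range (hypothetical numbers):
rng = random.Random(2)
samples = [crack_growth_years(1e-3, 0.05, C=3e-12, m=3.0,
                              delta_sigma=max(rng.gauss(40.0, 8.0), 1.0))
           for _ in range(1000)]
```

Because growth scales with a^(m/2), the time-to-critical distribution is strongly skewed: larger sampled stress ranges reach the critical crack size years earlier, matching the accelerating behaviour the abstract reports near end of life.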
Procedia PDF Downloads 517
240 The Democratization of 3D Capturing: An Application Investigating Google Tango Potentials
Authors: Carlo Bianchini, Lorenzo Catena
Abstract:
The appearance of 3D scanners and then, more recently, of image-based systems that generate point clouds directly from common digital images have deeply affected the survey process in terms of both capturing and 2D/3D modelling. In this context, low cost and mobile systems are increasingly playing a key role and actually paving the way to the democratization of what in the past was the realm of few specialized technicians and expensive equipment. The application of Google Tango on the ancient church of Santa Maria delle Vigne in Pratica di Mare – Rome presented in this paper is one of these examples.
Keywords: the architectural survey, augmented/mixed/virtual reality, Google Tango project, image-based 3D capturing
Procedia PDF Downloads 152
239 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos
Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog
Abstract:
Social distancing errors are commonly prevalent among both vaccinated and unvaccinated individuals in the Filipino community. This study aims to identify these factors and to relate how they affect daily life. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the greatest impact on social distancing errors.
Keywords: vaccinated, unvaccinated, social distancing, Filipinos
Procedia PDF Downloads 203
238 Effects of Level Densities and Those of a-Parameter in the Framework of Preequilibrium Model for 63,65Cu(n,xp) Reactions in Neutrons at 9 to 15 MeV
Authors: L. Yettou
Abstract:
In this study, calculations of the proton emission spectra produced by the 63Cu(n,xp) and 65Cu(n,xp) reactions are performed in the framework of preequilibrium models using the EMPIRE and TALYS codes. Exciton model predictions combined with the Kalbach angular distribution systematics and the Hybrid Monte Carlo Simulation (HMS) were used. The effects of level densities and of the level density a-parameter on the calculations have been investigated. Comparison with experimental data shows clear improvement in the Exciton Model and HMS calculations. Keywords: preequilibrium models, level density, level density a-parameter, EMPIRE code, TALYS code
Procedia PDF Downloads 134
237 Probabilistic Health Risk Assessment of Polycyclic Aromatic Hydrocarbons in Repeatedly Used Edible Oils and Finger Foods
Authors: Suraj Sam Issaka, Anita Asamoah, Abass Gibrilla, Joseph Richmond Fianko
Abstract:
Polycyclic aromatic hydrocarbons (PAHs) are a group of organic compounds that can form in edible oils during repeated frying and accumulate in fried foods. This study assesses the probability of carcinogenic and non-carcinogenic health risks posed by PAH levels in popular finger foods (bean cakes, plantain chips, doughnuts) fried in edible oils (mixed vegetable, sunflower, soybean) from the Ghanaian market. Probabilistic health risk assessment, which accounts for variability and uncertainty in exposure and risk estimates, provides a more realistic representation of potential health risks than point estimates. Monte Carlo simulations with 10,000 iterations were used to estimate carcinogenic, mutagenic, and non-carcinogenic risks for different age groups (A: 6-10 years, B: 11-20 years, C: 20-70 years), food types (bean cake, plantain chips, doughnut), oil types (soybean, mixed vegetable, sunflower), and frying-oil re-use frequencies (once, twice, three times). Our results suggest that for age Group A, doughnuts posed the highest probability of carcinogenic risk exceeding the acceptable threshold (91.55%), followed by bean cakes (43.87%) and plantain chips (7.72%), as well as the highest probability of unacceptable mutagenic risk (89.2%), followed by bean cakes (40.32%). Among age Group B, doughnuts again had the highest probability of exceeding carcinogenic risk limits (51.16%) and mutagenic risk limits (44.27%), while plantain chips exhibited the highest maximum carcinogenic risk. For the adult age Group C, bean cakes had the highest probability of unacceptable carcinogenic (50.88%) and mutagenic (46.44%) risks, though plantain chips showed the highest maximum values for both risks in this age group.
Regarding non-carcinogenic risks across the age groups, it was found that members of age Group A who consumed doughnuts had a 68.16% probability of a hazard quotient (HQ) greater than 1, suggesting potential cognitive impairment and lower IQ scores due to early PAH exposure. This group also faced risks from consuming plantain chips and bean cakes. For age Group B, consumption of plantain chips was associated with a 36.98% probability of HQ greater than 1, indicating a potential risk of reduced lung function. In age Group C, consumption of plantain chips was linked to a 35.70% probability of HQ greater than 1, suggesting a potential risk of cardiovascular disease. Keywords: PAHs, fried foods, carcinogenic risk, non-carcinogenic risk, Monte Carlo simulations
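The probabilistic assessment described above can be sketched with a small Monte Carlo hazard-quotient simulation; the exposure distributions and reference dose below are illustrative placeholders, not the paper's fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000   # Monte Carlo iterations, matching the study's iteration count

# Hypothetical exposure-factor distributions for a child age group; every
# numeric value below is a placeholder, not the paper's fitted parameters.
conc = rng.lognormal(np.log(20.0), 0.5, size=n)       # PAH conc. in food, ug/kg
intake = rng.triangular(20.0, 50.0, 100.0, size=n)    # food intake, g/day
body_weight = rng.normal(25.0, 4.0, size=n)           # body weight, kg
rfd = 0.04                                            # reference dose, ug/kg/day (placeholder)

# Average daily dose and hazard quotient, one value per iteration
add = conc * (intake / 1000.0) / body_weight          # ug/kg body weight/day
hq = add / rfd
prob_hq_gt_1 = float(np.mean(hq > 1.0))               # P(HQ > 1)
```

Propagating full distributions rather than single point values is what yields statements such as "a 68.16% probability of HQ greater than 1."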
Procedia PDF Downloads 18
236 Model of Obstacle Avoidance on Hard Disk Drive Manufacturing with Distance Constraint
Authors: Rawinun Praserttaweelap, Somyot Kiatwanidvilai
Abstract:
Obstacle avoidance is a key capability for robot systems operating in unknown environments. Robots should be able to know their position and safety region. This research starts from path planning using SLAM and AMCL in the ROS system. In addition, the best parameters of the obstacle avoidance function are required. In hard disk drive manufacturing, the distance between robots and obstacles is critical due to manufacturing constraints. The simulations are accomplished with SLAM and AMCL using adaptive velocity and safety region calculation. Keywords: obstacle avoidance, OA, simultaneous localization and mapping, SLAM, adaptive Monte Carlo localization, AMCL, KLD sampling, KLD
Procedia PDF Downloads 200
235 Pion/Muon Identification in a Nuclear Emulsion Cloud Chamber Using Neural Networks
Authors: Kais Manai
Abstract:
The main part of this work focuses on the study of pion/muon separation at low energy using a nuclear Emulsion Cloud Chamber (ECC) made of lead and nuclear emulsion films. The work consists of two parts: a particle reconstruction algorithm and a neural network that assigns to each reconstructed particle the probability of being a muon or a pion. The pion/muon separation algorithm has been optimized using a detailed Monte Carlo simulation of the ECC and tested on real data. The algorithm achieves a 60% muon identification efficiency with a pion misidentification rate smaller than 3%. Keywords: nuclear emulsion, particle identification, tracking, neural network
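The working point quoted above (an efficiency at a fixed misidentification budget) can be illustrated with synthetic classifier scores; the beta-distributed scores below are invented stand-ins for the network's muon-probability output, which in the paper is trained on real ECC track features:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy score distributions standing in for the network's muon probability
# output (synthetic; not the paper's trained network).
muon_scores = rng.beta(5, 2, size=10_000)   # true muons tend to score high
pion_scores = rng.beta(2, 5, size=10_000)   # true pions tend to score low

# Scan the decision threshold for a working point like the one quoted:
# maximum muon efficiency subject to pion misidentification below 3%.
for t in np.linspace(0.0, 1.0, 101):
    eff = float(np.mean(muon_scores > t))    # muon identification efficiency
    misid = float(np.mean(pion_scores > t))  # pion misidentification rate
    if misid < 0.03:
        break
```

Raising the threshold trades efficiency for purity, which is exactly the trade-off behind "60% efficiency with <3% misidentification."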
Procedia PDF Downloads 508
234 Interval Estimation for Rainfall Mean in Northeastern Thailand
Authors: Nitaya Buntao
Abstract:
This paper considers the problem of interval estimation for the rainfall mean of the lognormal distribution and the delta-lognormal distribution in Northeastern Thailand. We present the modified generalized pivotal approach (MGPA) and compare it to the modified method of variance estimates recovery (MMOVER). The performance of each method is examined in terms of coverage probability and average length by Monte Carlo simulation. An extensive simulation study indicates that the MMOVER performs better than the MGPA in terms of coverage probability, yielding highly accurate coverage probabilities. Keywords: rainfall mean, interval estimation, lognormal distribution, delta-lognormal distribution
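The Monte Carlo evaluation described (coverage probability and average length) can be sketched for a simple baseline interval; the MGPA and MMOVER constructions themselves are more involved, so a naive t-interval on lognormal data stands in here purely to show how the two criteria are estimated:

```python
import numpy as np

rng = np.random.default_rng(7)
mu, sigma, n, reps = 1.0, 0.5, 30, 2000   # illustrative parameter values
t_crit = 2.045                             # t(0.975, df = 29)
true_mean = np.exp(mu + sigma**2 / 2)      # mean of the lognormal distribution

covered, lengths = 0, []
for _ in range(reps):
    x = rng.lognormal(mu, sigma, size=n)
    # Baseline t-interval on the raw data; the paper's MGPA and MMOVER
    # intervals are more elaborate constructions evaluated the same way.
    half = t_crit * x.std(ddof=1) / np.sqrt(n)
    covered += x.mean() - half <= true_mean <= x.mean() + half
    lengths.append(2 * half)

coverage = covered / reps                  # estimated coverage probability
avg_length = float(np.mean(lengths))       # estimated average length
```

A method with coverage closest to the nominal 0.95, at the shortest average length, is preferred; the naive interval typically under-covers on skewed data, which is what motivates methods like MMOVER.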
Procedia PDF Downloads 456
233 Tuning of Indirect Exchange Coupling in FePt/Al₂O₃/Fe₃Pt System
Authors: Rajan Goyal, S. Lamba, S. Annapoorni
Abstract:
The indirect exchange coupled system consists of two ferromagnetic layers separated by a non-magnetic spacer layer. The type of exchange coupling may be either ferromagnetic or antiferromagnetic depending on the thickness of the spacer layer. In the present work, the strength of exchange coupling in FePt/Al₂O₃/Fe₃Pt has been investigated by varying the thickness of the spacer layer Al₂O₃. The FePt/Al₂O₃/Fe₃Pt trilayer structure was fabricated on a Si <100> single crystal substrate using the sputtering technique. The thicknesses of FePt and Fe₃Pt were fixed at 60 nm and 2 nm, respectively, while the thickness of the Al₂O₃ spacer layer was varied from 0 to 16 nm. The normalized hysteresis loops recorded at room temperature in both the in-plane and out-of-plane configurations reveal that the easy axis lies along the plane of the film. It is observed that the hysteresis loop for ts = 0 nm does not exhibit any knee around H = 0, indicating that the hard FePt layer and the soft Fe₃Pt layer are strongly exchange coupled. However, the insertion of an Al₂O₃ spacer layer of thickness ts = 0.7 nm results in the appearance of a minor knee around H = 0, suggesting a weakening of the exchange coupling between FePt and Fe₃Pt. The disappearance of the knee with further increase in spacer thickness up to 8 nm predicts the co-existence of ferromagnetic (FM) and antiferromagnetic (AFM) exchange interactions between FePt and Fe₃Pt. In addition, the out-of-plane hysteresis loop shows an asymmetry around H = 0. The exchange field Hex = (Hc↑ - Hc↓)/2, where Hc↑ and Hc↓ are the coercivities estimated from the lower and upper branches of the hysteresis loop, increases from ~150 Oe to ~700 Oe. This behavior may be attributed to uncompensated moments in the hard FePt and soft Fe₃Pt layers at the interface. A better insight into the variation in indirect exchange coupling has been obtained from recoil curves.
It is observed that almost closed recoil curves are obtained for ts = 0 nm up to a reverse field of ~5 kOe. On the other hand, the appearance of appreciably open recoil curves at a lower reverse field of ~4 kOe for ts = 0.7 nm indicates that the uncoupled soft phase undergoes irreversible magnetization reversal at lower reverse fields, suggesting a weakening of the exchange coupling. The openness of the recoil curves decreases with increasing spacer layer thickness up to 8 nm. This behavior may be attributed to the competition between the FM and AFM exchange interactions: the FM exchange coupling between FePt and Fe₃Pt due to the porous nature of Al₂O₃ decreases much more slowly than the weak AFM coupling arising from the interaction between Fe ions of FePt and Fe₃Pt via O ions of Al₂O₃. The hysteresis loop has been simulated using a Monte Carlo method based on the Metropolis algorithm to investigate the variation in the strength of exchange coupling in the FePt/Al₂O₃/Fe₃Pt trilayer system. Keywords: indirect exchange coupling, MH loop, Monte Carlo simulation, recoil curve
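The Metropolis-based hysteresis simulation mentioned at the end can be illustrated with a toy 1D Ising chain swept through a field cycle; the model, coupling J, temperature T and field range are all illustrative stand-ins, not the authors' trilayer simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal Metropolis sketch: 1D Ising chain in an external field H, a toy
# stand-in for the trilayer model (J, T, field values are illustrative).
N, J, T = 200, 1.0, 0.5
spins = np.ones(N)

def sweep(spins, H):
    # One Metropolis sweep: N randomly chosen single-spin flip attempts
    for i in rng.integers(0, N, size=N):
        left, right = spins[(i - 1) % N], spins[(i + 1) % N]
        dE = 2.0 * spins[i] * (J * (left + right) + H)   # energy cost of a flip
        if dE <= 0 or rng.random() < np.exp(-dE / T):    # Metropolis rule
            spins[i] *= -1

# Sweep the field down and back up to trace an M(H) hysteresis loop
fields = np.concatenate([np.linspace(2, -2, 41), np.linspace(-2, 2, 41)])
magnetization = []
for H in fields:
    for _ in range(20):          # equilibration sweeps at each field step
        sweep(spins, H)
    magnetization.append(float(spins.mean()))
```

Recording the mean spin at each field step yields the descending and ascending loop branches, the same construction used to extract coercivities and exchange fields from the simulated loop.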
Procedia PDF Downloads 190
232 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses
Authors: Neil Bar, Andrew Heweston
Abstract:
Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance on how PF should be calculated for homogeneous and heterogeneous rock masses, or on what qualifies as a 'reasonable' PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated 'approximately' or with allowances for some variability rather than 'exactly'. When it comes to numerical modelling, some of these factors are treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user's discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit.
A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF calculated using different methods can yield markedly different results. Ultimately, sound engineering judgment and logic are often required to decipher the true meaning and significance (if any) of some PF results. Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability
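A minimal version of such a PF calculation, here Monte Carlo sampling of an infinite-slope factor of safety with hypothetical parameter distributions (not the case-study mine's data), can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000   # Monte Carlo samples

# Illustrative-only distributions for a dry infinite-slope limit equilibrium
# check; all values are hypothetical, not the case-study mine's parameters.
cohesion = rng.normal(25.0, 5.0, size=n)          # kPa, probabilistic input
phi = np.radians(rng.normal(35.0, 3.0, size=n))   # friction angle, probabilistic
gamma, depth, slope = 20.0, 10.0, np.radians(45)  # treated deterministically

# Factor of safety from the standard infinite-slope formula (no groundwater)
fs = (cohesion + gamma * depth * np.cos(slope)**2 * np.tan(phi)) / (
    gamma * depth * np.sin(slope) * np.cos(slope))

pf = float(np.mean(fs < 1.0))   # probability of failure = P(FS < 1)
```

Which inputs are treated as distributions versus fixed values, and what distributions are assumed, is exactly the discretionary modelling choice the abstract cautions can produce markedly different PF results.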
Procedia PDF Downloads 208
231 Reliability Based Topology Optimization: An Efficient Method for Material Uncertainty
Authors: Mehdi Jalalpour, Mazdak Tootkaboni
Abstract:
We present a computationally efficient method for reliability-based topology optimization under material property uncertainty, which is assumed to be lognormally distributed and correlated within the domain. Computational efficiency is achieved by estimating the response statistics with second-order stochastic perturbation, using these statistics to fit a distribution that follows the empirical distribution of the response, and employing an efficient gradient-based optimizer. The proposed algorithm is used for the design of new structures, and the changes in the optimized topology are discussed for various levels of target reliability and correlation strength. Predictions were verified through comparison with results obtained using Monte Carlo simulation. Keywords: material uncertainty, stochastic perturbation, structural reliability, topology optimization
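The second-order stochastic perturbation idea, with Monte Carlo verification, can be sketched on a scalar toy response; g(E) = 1/E with lognormal E stands in for the finite element response, and all values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy compliance-like response g(E) = 1/E with lognormal "stiffness" E;
# a scalar stand-in for the finite element response, values illustrative.
mu_E, cov_E = 200.0, 0.1             # mean and coefficient of variation of E
var_E = (cov_E * mu_E) ** 2

def g(E):
    return 1.0 / E

def dg(E):                           # first derivative of g
    return -1.0 / E**2

def d2g(E):                          # second derivative of g
    return 2.0 / E**3

# Second-order perturbation: Taylor expansion of g about the mean of E
mean_pert = g(mu_E) + 0.5 * d2g(mu_E) * var_E
var_pert = dg(mu_E) ** 2 * var_E

# Monte Carlo verification with matched lognormal parameters
sigma_ln = np.sqrt(np.log(1.0 + cov_E**2))
mu_ln = np.log(mu_E) - 0.5 * sigma_ln**2
samples = g(rng.lognormal(mu_ln, sigma_ln, size=200_000))
```

The perturbation estimates cost only a few derivative evaluations, versus hundreds of thousands of response evaluations for the Monte Carlo check, which is the source of the claimed computational efficiency.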
Procedia PDF Downloads 606
230 Ensemble Sampler For Infinite-Dimensional Inverse Problems
Authors: Jeremie Coullon, Robert J. Webber
Abstract:
We introduce a Markov chain Monte Carlo (MCMC) sampler for infinite-dimensional inverse problems. Our sampler is based on the affine invariant ensemble sampler, which uses interacting walkers to adapt to the covariance structure of the target distribution. We extend this ensemble sampler for the first time to infinite-dimensional function spaces, yielding a highly efficient gradient-free MCMC algorithm. Because our ensemble sampler does not require gradients or posterior covariance estimates, it is simple to implement and broadly applicable. In many Bayesian inverse problems, Markov chain Monte Carlo (MCMC) methods are needed to approximate distributions on infinite-dimensional function spaces, for example, in groundwater flow, medical imaging, and traffic flow. Yet designing efficient MCMC methods for function spaces has proved challenging. Recent gradient-based MCMC methods, preconditioned MCMC methods, and SMC methods have improved the computational efficiency of the functional random walk. However, these samplers require gradients or posterior covariance estimates that may be challenging to obtain. Calculating gradients is difficult or impossible in many high-dimensional inverse problems involving a numerical integrator with a black-box code base. Additionally, accurately estimating posterior covariances can require a lengthy pilot run or adaptation period. These concerns raise the question: is there a functional sampler that outperforms the functional random walk without requiring gradients or posterior covariance estimates? To address this question, we consider a gradient-free sampler that avoids explicit covariance estimation yet adapts naturally to the covariance structure of the sampled distribution. This sampler works by considering an ensemble of walkers and interpolating and extrapolating between walkers to make a proposal.
This is called the affine invariant ensemble sampler (AIES), which is easy to tune, easy to parallelize, and efficient at sampling spaces of moderate dimensionality (less than 20). The main contribution of this work is to propose a functional ensemble sampler (FES) that combines the functional random walk and AIES. To apply this sampler, we first calculate the Karhunen–Loeve (KL) expansion for the Bayesian prior distribution, assumed to be Gaussian and trace-class. Then, we use AIES to sample the posterior distribution on the low-wavenumber KL components and use the functional random walk to sample the posterior distribution on the high-wavenumber KL components. Alternating between AIES and functional random walk updates, we obtain a functional ensemble sampler that is efficient and easy to use without requiring detailed knowledge of the target distribution. In past work, several authors have proposed splitting the Bayesian posterior into low-wavenumber and high-wavenumber components and then applying enhanced sampling to the low-wavenumber components. Yet compared to these other samplers, FES is unique in its simplicity and broad applicability. FES does not require any derivatives, and the need for derivative-free samplers has previously been emphasized. FES also eliminates the requirement for posterior covariance estimates. Lastly, FES is more efficient than other gradient-free samplers in our tests. In two numerical examples, we apply FES to challenging inverse problems that involve estimating a functional parameter and one or more scalar parameters. We compare the performance of the functional random walk, FES, and an alternative derivative-free sampler that explicitly estimates the posterior covariance matrix. We conclude that FES is the fastest available gradient-free sampler for these challenging and multimodal test problems. Keywords: Bayesian inverse problems, Markov chain Monte Carlo, infinite-dimensional inverse problems, dimensionality reduction
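The stretch move at the heart of AIES can be sketched on a toy low-dimensional target, standing in for the low-wavenumber KL components; this is an illustrative implementation of the standard stretch move, not the authors' FES code:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):
    # Toy 2-D Gaussian "posterior" with unequal scales, standing in for the
    # low-wavenumber KL components (illustrative only).
    return -0.5 * np.sum(x**2 / np.array([1.0, 0.25]))

n_walkers, dim, a = 20, 2, 2.0
walkers = rng.normal(size=(n_walkers, dim))
lp = np.array([log_post(w) for w in walkers])

samples = []
for step in range(2000):
    for k in range(n_walkers):
        j = rng.integers(n_walkers - 1)
        j = j if j < k else j + 1                      # complementary walker, j != k
        z = (1.0 + (a - 1.0) * rng.random())**2 / a    # stretch factor, g(z) ∝ 1/sqrt(z)
        prop = walkers[j] + z * (walkers[k] - walkers[j])
        new_lp = log_post(prop)
        # Affine-invariant acceptance rule: z^(dim-1) * pi(prop) / pi(current)
        if np.log(rng.random()) < (dim - 1) * np.log(z) + new_lp - lp[k]:
            walkers[k], lp[k] = prop, new_lp
    if step >= 500:                                    # discard burn-in
        samples.append(walkers.copy())

samples = np.concatenate(samples)
```

No gradients or covariance estimates appear anywhere: the walkers' spread itself supplies the proposal scale, which is the adaptation property the abstract emphasizes.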
Procedia PDF Downloads 154
229 Approximate Confidence Interval for Effect Size Based on Bootstrap Resampling Method
Authors: S. Phanyaem
Abstract:
This paper presents confidence intervals for the effect size based on the bootstrap resampling method. A meta-analytic confidence interval for effect size that is easy to compute is proposed. A Monte Carlo simulation study was conducted to compare the performance of the proposed confidence intervals with existing confidence intervals. The best confidence interval method will have a coverage probability close to 0.95. Simulation results show that our proposed confidence intervals perform well in terms of coverage probability and expected length. Keywords: effect size, confidence interval, bootstrap method, resampling
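A percentile-bootstrap confidence interval for a standardized effect size (Cohen's d is used here as an assumed example measure) can be sketched on synthetic two-group data; nothing below is the paper's meta-analytic data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic two-group data (illustrative; not the paper's meta-analytic data)
group1 = rng.normal(0.0, 1.0, size=50)
group2 = rng.normal(0.8, 1.0, size=50)

def cohens_d(x, y):
    """Standardized mean difference with pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1))
                     / (nx + ny - 2))
    return (y.mean() - x.mean()) / pooled

# Resample each group with replacement and recompute the effect size
boot = np.array([
    cohens_d(rng.choice(group1, size=len(group1), replace=True),
             rng.choice(group2, size=len(group2), replace=True))
    for _ in range(5000)
])

# Percentile-method 95% bootstrap confidence interval for the effect size
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
```

Repeating this over many simulated data sets and counting how often the interval contains the true effect size is exactly the coverage-probability evaluation the abstract describes.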
Procedia PDF Downloads 596