Search results for: Carlo Ruiz
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 525


405 Instructional Design Strategy Based on Stories with Interactive Resources for Learning English in Preschool

Authors: Vicario Marina, Ruiz Elena, Peredo Ruben, Bustos Eduardo

Abstract:

The development group of Educational Computing of the National Polytechnic Institute (IPN) in Mexico has been developing interactive resources at the preschool level in an effort to improve learning in the Child Development Centers (CENDI). This work describes both a didactic architecture and a strategy for teaching English with digital stories, using interactive resources available through a Web repository designed for use on mobile platforms. It will initially be accessible to 500 children, and worldwide by the end of 2015.

Keywords: instructional design, interactive resources, digital educational resources, story based English teaching, preschool education

Procedia PDF Downloads 477
404 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in a single, semi-analytic step thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills a niche in the literature, which to the best of our knowledge lacks accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are tested to be significantly superior to Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, primarily driven by the number of factors instead of the number of obligors, as is the case in Monte Carlo simulation. The limitation of this method lies in the "curse of dimension" intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension-reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential application of this method has a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even to risk types other than credit risk.
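
To make the inversion step concrete, here is a minimal COS-method sketch in Python. It recovers a CDF and a Value-at-Risk from a characteristic function, using the standard normal as a stand-in for the factor-copula portfolio-loss transform; the truncation range, series length, and all names are illustrative, not the authors' code.

```python
import numpy as np

def cos_density(phi, x, a, b, N=256):
    """COS expansion: f(x) ~ sum_k' A_k cos(k*pi*(x-a)/(b-a)) on [a, b]."""
    k = np.arange(N)
    u = k * np.pi / (b - a)
    A = 2.0 / (b - a) * np.real(phi(u) * np.exp(-1j * u * a))
    A[0] *= 0.5                                # the k = 0 term carries weight 1/2
    return A @ np.cos(np.outer(u, x - a))

phi = lambda u: np.exp(-0.5 * u**2)            # stand-in: standard normal ch.f.
xs = np.linspace(-8, 8, 2001)
pdf = cos_density(phi, xs, a=-8.0, b=8.0)
cdf = np.cumsum(pdf) * (xs[1] - xs[0])         # CDF by quadrature
print(xs[np.searchsorted(cdf, 0.99)])          # 99% VaR, ~2.33 here
```

In the paper's setting, phi would be the semi-analytic characteristic function of the portfolio loss implied by the factor-copula model.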

Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method

Procedia PDF Downloads 171
403 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment

Authors: Isabela Moreira Queiroz

Abstract:

Slope stability analyses are largely carried out by deterministic methods and evaluated through a single safety factor. Although it is known that geotechnical parameters can show great dispersion, such analyses treat them as fixed and known. Probabilistic methods, in turn, incorporate the variability of key input parameters (random variables), resulting in a range of safety factor values and thus enabling the determination of the probability of failure, an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in geotechnical practice: FOSM (First-Order, Second-Moment), Rosenblueth (Point Estimates) and Monte Carlo. This paper presents a comparison between the results from deterministic and probabilistic analyses (FOSM, Rosenblueth and Monte Carlo) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and to perform the consequent risk analysis, which is used to calculate the risk and to examine mitigation and control solutions. It can be observed that the results obtained by the three probabilistic methods were quite close. It should be noted that the calculation of risk makes it possible to rank the priority of mitigation measures. Therefore, it is recommended to carry out a good assessment of the geological-geotechnical model, incorporating the uncertainty in feasibility, design, construction, operation and closure by means of risk management.
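
As a hedged illustration of the Monte Carlo branch, the sketch below samples an infinite-slope factor of safety and reads off the probability of failure as the fraction of samples below one; the geometry and parameter distributions are invented for the example, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Random geotechnical inputs (illustrative values only)
c   = rng.normal(10.0, 2.0, n)                # cohesion, kPa
phi = np.radians(rng.normal(30.0, 3.0, n))    # friction angle
gamma, h, beta = 18.0, 5.0, np.radians(35.0)  # unit weight, depth, slope angle

# Infinite-slope factor of safety for a dry slope
fs = (c + gamma * h * np.cos(beta)**2 * np.tan(phi)) \
     / (gamma * h * np.sin(beta) * np.cos(beta))

pf = np.mean(fs < 1.0)                        # probability of failure
print(f"P(failure) = {pf:.3f}, FS mean = {fs.mean():.2f}")
```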

Keywords: probabilistic methods, risk assessment, risk management, slope stability

Procedia PDF Downloads 395
402 Evaluation of Corrosion by Impedance Spectroscopy of Embedded Steel in an Alternative Concrete Exposed to Chloride Ions

Authors: E. Ruíz, W. Aperador

Abstract:

This article evaluates the protective effect of an alternative concrete obtained from fly ash and iron and steel slag mixed in binary form, placed on ASTM A706 structural steel. The study was conducted comparatively with specimens exposed to natural conditions free of chloride ions. The effect of chloride ions on the specimens was generated in accelerated form under controlled conditions (3.5% NaCl and 25 °C). Impedance data were acquired over a range of 1 mHz to 100 kHz. At high frequencies, the response of the exposure medium-concrete interface is found, and at low frequencies, the response of the concrete-steel interface.

Keywords: alternative concrete, corrosion, alkaline activation, impedance spectroscopy

Procedia PDF Downloads 362
401 Fuel Inventory/Depletion Analysis for a Thorium-Uranium Dioxide (Th-U)O2 Pin Cell Benchmark Using Monte Carlo and Deterministic Codes with New Version VIII.0 of the Evaluated Nuclear Data File (ENDF/B) Nuclear Data Library

Authors: Jamal Al-Zain, O. El Hajjaji, T. El Bardouni

Abstract:

A (Th-U)O2 fuel pin benchmark made up of 25 w/o U and 75 w/o Th was used in order to analyze the depletion and inventory of the fuel for the pressurized water reactor pin-cell model. The new version VIII.0 of the ENDF/B nuclear data library was used to create a data set in ACE format at various temperatures, processed with the MAKXSF6.2 and NJOY2016 programs, in order to conduct this study and analyze the cross-section data. The infinite multiplication factor, the concentrations and activities of the main fission products, the actinide radionuclides accumulated in the pin cell, and the total radioactivity were all estimated and compared in this study using the Monte Carlo N-Particle 6 (MCNP6.2) and DRAGON5 programs. Additionally, the burn-up (BU) dependent behavior of the pressurized water reactor (PWR) thorium pin cell was validated and compared with reference data obtained using the Massachusetts Institute of Technology (MIT-MOCUP), Idaho National Engineering and Environmental Laboratory (INEEL-MOCUP), and CASMO-4 codes. The results of this study indicate that all of the examined codes are in good agreement.

Keywords: PWR thorium pin cell, ENDF/B-VIII.0, MAKXSF6.2, NJOY2016, MCNP6.2, DRAGON5, fuel burn-up

Procedia PDF Downloads 111
400 A Monte Carlo Fuzzy Logistic Regression Framework against Imbalance and Separation

Authors: Georgios Charizanos, Haydar Demirhan, Duygu Icen

Abstract:

Two of the most impactful issues in classical logistic regression are class imbalance and complete separation. These can result in model predictions heavily leaning towards the imbalanced class on the binary response variable or over-fitting issues. Fuzzy methodology offers key solutions for handling these problems. However, most studies propose the transformation of the binary responses into a continuous format limited within [0,1]. This is called the possibilistic approach within fuzzy logistic regression. Following this approach is more aligned with straightforward regression since a logit-link function is not utilized, and fuzzy probabilities are not generated. In contrast, we propose a method of fuzzifying binary response variables that allows for the use of the logit-link function; hence, a probabilistic fuzzy logistic regression model with the Monte Carlo method. The fuzzy probabilities are then classified by selecting a fuzzy threshold. Different combinations of fuzzy and crisp input, output, and coefficients are explored, aiming to understand which of these perform better under different conditions of imbalance and separation. We conduct numerical experiments using both synthetic and real datasets to demonstrate the performance of the fuzzy logistic regression framework against seven crisp machine learning methods. The proposed framework shows better performance irrespective of the degree of imbalance and presence of separation in the data, while the considered machine learning methods are significantly impacted.
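
One loose reading of the Monte Carlo step, sketched below purely for illustration: crisp 0/1 labels are repeatedly sampled from fuzzified responses, a logit model is refit per draw, and the pooled fuzzy probabilities are classified with a threshold. The membership construction and the threshold value are invented here, not the authors' specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X @ np.array([2.0, -1.0]) + rng.logistic(size=200) > 0).astype(int)

# Fuzzify: nudge hard labels toward 0.5 to get membership degrees in (0, 1)
mu = np.clip(y + rng.normal(0, 0.1, size=y.size), 0.05, 0.95)

draws = []
for _ in range(200):                  # Monte Carlo over crisp realizations
    y_mc = rng.binomial(1, mu)
    draws.append(LogisticRegression().fit(X, y_mc).predict_proba(X)[:, 1])

p_hat = np.mean(draws, axis=0)        # pooled fuzzy probabilities
y_pred = (p_hat > 0.5).astype(int)    # crisp class via a fuzzy threshold
print("training accuracy:", (y_pred == y).mean())
```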

Keywords: fuzzy logistic regression, fuzzy, logistic, machine learning

Procedia PDF Downloads 79
399 Julia-Based Computational Tool for Composite System Reliability Assessment

Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris

Abstract:

The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods like sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source Composite System Reliability Evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. The paper presents its design, validation, and effectiveness, including the analysis of two different formulations of the Optimal Power Flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool can provide valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
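
A toy adequacy loop in the spirit of sequential Monte Carlo simulation, written in Python as a stand-in rather than drawn from the authors' Julia package; unit capacities, outage rates, and the load shape are invented, and unit states are drawn i.i.d. per hour for brevity where a true sequential model would sample failure and repair durations.

```python
import numpy as np

rng = np.random.default_rng(0)
cap = np.array([200.0, 200.0, 150.0, 100.0])   # unit capacities, MW
forc = np.array([0.05, 0.05, 0.08, 0.10])      # forced outage rates
hours = 8760
load = 450 + 100 * np.sin(2 * np.pi * np.arange(hours) / 24)  # crude profile

lole = []
for _ in range(200):                           # sampled years
    up = rng.random((hours, cap.size)) > forc  # hourly unit availability
    shortfall = (up * cap).sum(axis=1) < load
    lole.append(shortfall.sum())
print("LOLE ≈", np.mean(lole), "h/yr")
```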

Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow

Procedia PDF Downloads 81
398 Water Diffusivity in Amorphous Epoxy Resins: An Autonomous Basin Climbing-Based Simulation Method

Authors: Betim Bahtiri, B. Arash, R. Rolfes

Abstract:

Epoxy-based materials are frequently exposed to high-humidity environments in many engineering applications. As a result, their material properties would be degraded by water absorption. A full characterization of the material properties under hygrothermal conditions requires time- and cost-consuming experimental tests. To gain insights into the physics of diffusion mechanisms, atomistic simulations have been shown to be effective tools. Concerning the diffusion of water in polymers, spatial trajectories of water molecules are obtained from molecular dynamics (MD) simulations allowing the interpretation of diffusion pathways at the nanoscale in a polymer network. Conventional MD simulations of water diffusion in amorphous polymers lead to discrepancies at low temperatures due to the short timescales of the simulations. In the proposed model, this issue is solved by using a combined scheme of autonomous basin climbing (ABC) with kinetic Monte Carlo and reactive MD simulations to investigate the diffusivity of water molecules in epoxy resins across a wide range of temperatures. It is shown that the proposed simulation framework estimates kinetic properties of water diffusion in epoxy resins that are consistent with experimental observations and provide a predictive tool for investigating the diffusion of small molecules in other amorphous polymers.

Keywords: epoxy resins, water diffusion, autonomous basin climbing, kinetic Monte Carlo, reactive molecular dynamics

Procedia PDF Downloads 71
397 Effect of Inclusions in the Ultrasonic Fatigue Endurance of Maraging 300 Steel

Authors: G. M. Dominguez Almaraz, J. A. Ruiz Vilchez, M. A. Sanchez Miranda

Abstract:

Ultrasonic fatigue tests have been carried out in the maraging 300 steel. Experimental results show that fatigue endurance under this modality of testing is closely related to the nature and geometrical properties of inclusions present in this alloy. A model was proposed to correlate the ultrasonic fatigue endurance with the nature and geometrical properties of the crack initiation inclusion. Scanning Electron Microscopy analyses were obtained on the fracture surfaces, in order to assess the crack initiation inclusion and to introduce these parameters in the proposed model, with good agreement for the fatigue life prediction.

Keywords: inclusions, ultrasonic fatigue, maraging 300 steel, crack initiation

Procedia PDF Downloads 220
396 Unsteady Three-Dimensional Adaptive Spatial-Temporal Multi-Scale Direct Simulation Monte Carlo Solver to Simulate Rarefied Gas Flows in Micro/Nano Devices

Authors: Mirvat Shamseddine, Issam Lakkis

Abstract:

We present an efficient, three-dimensional parallel multi-scale Direct Simulation Monte Carlo (DSMC) algorithm for the simulation of unsteady rarefied gas flows in micro/nanosystems. The algorithm employs a novel spatiotemporal adaptivity scheme. The scheme performs a fully dynamic multi-level grid adaption based on the gradients of flow macro-parameters and an automatic temporal adaptation. The computational domain consists of a hierarchical octree-based Cartesian grid representation of the flow domain and a triangular mesh for the solid object surfaces. The hybrid mesh, combined with the spatiotemporal adaptivity scheme, allows for increased flexibility and efficient data management, rendering the framework suitable for efficient particle-tracing and dynamic grid refinement and coarsening. The parallel algorithm is optimized to run DSMC simulations of strongly unsteady, non-equilibrium flows over multiple cores. The presented method is validated by comparing with benchmark studies and then employed to improve the design of micro-scale hotwire thermal sensors in rarefied gas flows.

Keywords: DSMC, oct-tree hierarchical grid, ray tracing, spatial-temporal adaptivity scheme, unsteady rarefied gas flows

Procedia PDF Downloads 305
395 Analyzing Speed Disparity in Mixed Vehicle Technologies on Horizontal Curves

Authors: Tahmina Sultana, Yasser Hassan

Abstract:

Vehicle technologies are rapidly evolving due to their multifaceted advantages. Deploying different vehicle technologies, such as connectivity and automation, on the same roads as conventional vehicles controlled by human drivers may increase speed disparity in mixed vehicle technologies. Identifying relationships between speed distribution measures of different vehicles and road geometry can be an indicator of speed disparity in mixed technologies. Previous studies proved that speed disparity measures and traffic accidents are inextricably related. Horizontal curves from three geographic areas were selected based on relevant criteria, and speed data were collected at the midpoint of the preceding tangent and at the starting, middle, and ending points of the curve. Multiple linear mixed-effect models (LME) were developed using the instantaneous speed measures representing the speed of vehicles at different points of horizontal curves to recognize relationships between speed variance (standard deviation) and road geometry. A simulation-based (Monte Carlo) framework was introduced to check the speed disparity on horizontal curves in mixed vehicle technologies when consideration is given to the interactions among connected vehicles (CVs), autonomous vehicles (AVs), and non-connected vehicles (NCVs) on horizontal curves. The Monte Carlo method was used in the simulation to randomly sample values for the various parameters from their respective distributions. The results show that NCVs had higher speed variation than CVs and AVs. In addition, AVs and CVs contributed to reducing speed disparity in mixed vehicle technologies at any penetration rate.
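
The Monte Carlo step can be mimicked with a toy mixed-traffic draw; the mean speed, per-technology spreads, and penetration rates below are invented, not the study's fitted values, but they reproduce the qualitative finding that adding AVs and CVs shrinks the speed spread.

```python
import numpy as np

rng = np.random.default_rng(3)

def mixed_speed_sd(p_av, p_cv, n=100_000):
    tech = rng.choice(3, size=n, p=[p_av, p_cv, 1 - p_av - p_cv])
    sds = np.array([1.5, 2.5, 6.0])    # AV, CV, NCV spreads (illustrative)
    return rng.normal(70.0, sds[tech]).std()

for pen in (0.0, 0.3, 0.6):            # AV penetration; CVs at half that
    print(pen, round(mixed_speed_sd(pen, pen / 2), 2))
```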

Keywords: autonomous vehicles, connected vehicles, non-connected vehicles, speed variance

Procedia PDF Downloads 148
394 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life

Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi

Abstract:

Pipelines are extensively used engineering structures which convey fluid from one place to another. Most of the time, pipelines are placed underground and are encumbered by soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion, as a time-dependent deterioration process, in the structural and failure analysis of this type of pipe. Then three probabilistic time-dependent reliability analysis methods, namely the first passage probability theory, the gamma distributed degradation model and the Monte Carlo simulation technique, are discussed and developed. Sensitivity indexes that can be used to identify the most important parameters affecting pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines. The results can be used to obtain a cost-effective strategy for the management of the sewer system.
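
A hedged sketch of the gamma-distributed degradation branch (all parameter values invented): corrosion depth grows by stationary gamma increments, and Monte Carlo yields the first-passage time to a critical depth, i.e., the failure time distribution.

```python
import numpy as np

rng = np.random.default_rng(2)
years, n = 100, 50_000
shape_per_yr, scale = 0.8, 0.25      # gamma-process parameters (illustrative)
d_crit = 12.0                        # critical corrosion depth, mm

# Stationary gamma increments: depth paths are non-decreasing by construction
depth = np.cumsum(rng.gamma(shape_per_yr, scale, (n, years)), axis=1)
fp = np.argmax(depth > d_crit, axis=1).astype(float)  # first crossing year
fp[depth[:, -1] <= d_crit] = np.inf                   # never failed in horizon

print("P(failure within 50 yr) ≈", np.mean(fp <= 50))
```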

Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model

Procedia PDF Downloads 460
393 Evaluation of the Performance of Solar Stills as an Alternative for Brine Treatment Applying the Monte Carlo Ray Tracing Method

Authors: B. E. Tarazona-Romero, J. G. Ascanio-Villabona, O. Lengerke-Perez, A. D. Rincon-Quintero, C. L. Sandoval-Rodriguez

Abstract:

Desalination offers solutions for the shortage of water in the world; however, the process of eliminating salts generates a by-product known as brine, generally released into the environment through techniques that mitigate its impact. Brine treatment techniques are vital to developing an environmentally sustainable desalination process. Consequently, this paper evaluates three different geometric configurations of solar stills as an alternative for brine treatment to be integrated into a low-scale desalination process. The geometric scenarios studied were selected because they have characteristics that fit the concept of appropriate technology: low cost, labor- and material-intensive local manufacturing, modularity, and simplicity of construction. Additionally, the conceptual design of the collectors was carried out, and the ray tracing methodology was applied through the open-access software SolTrace and Tonatiuh. The simulation process used 600,000 rays and modified two input parameters: direct normal irradiance (DNI) and reflectance. In summary, for the scenarios evaluated, the ladder-type still presented higher efficiency values compared to the pyramid-type and single-slope collectors. Finally, the efficiency of the collectors studied was directly related to their geometry; that is, large geometries allow them to receive a greater number of solar rays along various paths, affecting the efficiency of the device.
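
A toy Monte Carlo ray trace conveys the idea behind the SolTrace/Tonatiuh runs (this is not their model): rays leave the cover with a random angular error and either hit or miss a flat absorber, and the hit fraction is the intercept factor. The error width, distance, and absorber size are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 600_000                          # same order as the study's ray count

sigma = np.radians(0.5)              # angular error of each ray
d, w = 1.0, 0.02                     # cover-to-absorber distance, half-width (m)
hit = np.abs(d * np.tan(rng.normal(0.0, sigma, n))) < w
print("intercept factor ≈", hit.mean())
```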

Keywords: appropriate technology, brine treatment techniques, desalination, monte carlo ray tracing

Procedia PDF Downloads 76
392 Association between G2677T/A MDR1 Polymorphism with the Clinical Response to Disease Modifying Anti-Rheumatic Drugs in Rheumatoid Arthritis

Authors: Alan Ruiz-Padilla, Brando Villalobos-Villalobos, Yeniley Ruiz-Noa, Claudia Mendoza-Macías, Claudia Palafox-Sánchez, Miguel Marín-Rosales, Álvaro Cruz, Rubén Rangel-Salazar

Abstract:

Introduction: In patients with rheumatoid arthritis (RA), resistance or poor response to disease-modifying antirheumatic drugs (DMARDs) may be a reflection of an increase in P-glycoprotein (P-gp). The expression of P-gp may be important in mediating the efflux of DMARDs from the cell. In addition, P-glycoprotein is involved in the transport of the cytokines IL-1, IL-2 and IL-4 from activated normal lymphocytes to the surrounding extracellular matrix, thus influencing the activity of RA. The involvement of P-glycoprotein in the transmembrane transport of cytokines can serve as a modulator of the efficacy of DMARDs. It was shown that the number of lymphocytes with P-glycoprotein activity is increased in patients with RA; therefore, P-glycoprotein expression could be related to the activity of RA and could be a predictor of poor response to therapy. Objective: To evaluate, in RA patients, whether the G2677T/A MDR1 polymorphism is associated with differences in the rate of therapeutic response to disease-modifying antirheumatic agents. Material and Methods: A prospective cohort study was conducted. Fifty-seven patients with RA were included. They had active disease according to DAS-28 (score >3.2). We excluded patients receiving biological agents. All patients were followed for 6 months in order to identify the rate of therapeutic response according to the American College of Rheumatology (ACR) criteria. At baseline, peripheral blood samples were taken in order to identify the G2677T/A MDR1 polymorphism using allele-specific PCR. The fragment was identified by electrophoresis in polyacrylamide gels stained with ethidium bromide. For statistical analysis, the genotypic and allelic frequencies of the MDR1 gene polymorphism between responders and non-responders were determined. Chi-square tests, as well as relative risks with 95% confidence intervals (95%CI), were computed to identify differences in the risk of achieving therapeutic response. Results: RA patients had a mean age of 47.33 ± 12.52 years, 87.7% were women, and the mean DAS-28 score was 6.45 ± 1.12. At 6 months, the rate of therapeutic response was 68.7%. The observed genotype frequencies were: G/G 40%, T/T 32%, A/A 19%, G/T 7% and A/A 2%. Patients with the G allele developed, at 6 months of treatment, a higher rate of therapeutic response assessed by ACR20 compared to patients with other alleles (p=0.039). Conclusions: Patients with the G allele of the G2677T/A MDR1 polymorphism had a higher rate of therapeutic response at 6 months with DMARDs. These preliminary data support the need for a deeper evaluation of these and other genotypes as factors that may influence therapeutic response in RA.

Keywords: pharmacogenetics, MDR1, P-glycoprotein, therapeutic response, rheumatoid arthritis

Procedia PDF Downloads 211
391 A Critical Review of Assessments of Geological CO2 Storage Resources in Pennsylvania and the Surrounding Region

Authors: Levent Taylan Ozgur Yildirim, Qihao Qian, John Yilin Wang

Abstract:

A critical review of assessments of geological carbon dioxide (CO2) storage resources in Pennsylvania and the surrounding region was completed, with a focus on the studies of the Midwest Regional Carbon Sequestration Partnership (MRCSP), the United States Department of Energy (US-DOE), and the United States Geological Survey (USGS). The Pennsylvania Geological Survey participated in the MRCSP Phase I research to characterize potential storage formations in Pennsylvania. The MRCSP's volumetric method estimated ~89 gigatonnes (Gt) of total CO2 storage resources in deep saline formations, depleted oil and gas reservoirs, coals, and shales in Pennsylvania. Meanwhile, the US-DOE calculated storage efficiency factors using a log-odds normal distribution and Monte Carlo sampling, revealing contingent storage resources of ~18 Gt to ~20 Gt in deep saline formations, depleted oil and gas reservoirs, and coals in Pennsylvania. Additionally, the USGS employed a Beta-PERT distribution and Monte Carlo sampling to determine buoyant and residual storage efficiency factors, resulting in 20 Gt of contingent storage resources across four storage assessment units in the Appalachian Basin. However, few studies have explored CO2 storage resources in shales in the region, yielding inconclusive findings. This article provides a critical and up-to-date review and analysis of geological CO2 storage resources in Pennsylvania and the region.
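
The USGS-style sampling step can be sketched as follows; the three-point efficiency estimates and the 100%-efficiency mass are invented placeholders, not assessment values.

```python
import numpy as np

rng = np.random.default_rng(4)

def beta_pert(lo, mode, hi, size, lam=4.0):
    """Beta-PERT: shape parameters from min / most-likely / max estimates."""
    a = 1 + lam * (mode - lo) / (hi - lo)
    b = 1 + lam * (hi - mode) / (hi - lo)
    return lo + (hi - lo) * rng.beta(a, b, size)

mass_at_full_eff = 400.0                        # Gt CO2 if efficiency were 100%
eff = beta_pert(0.01, 0.05, 0.15, 1_000_000)    # storage efficiency factor
resource = mass_at_full_eff * eff
print(np.percentile(resource, [5, 50, 95]))     # P5 / P50 / P95, Gt
```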

Keywords: carbon capture and storage, geological CO2 storage, pennsylvania, appalachian basin

Procedia PDF Downloads 58
390 Electricity Load Modeling: An Application to Italian Market

Authors: Giovanni Masala, Stefania Marica

Abstract:

Forecasting electricity load plays a crucial role in decision making and planning for economic purposes. Moreover, in light of the recent privatization and deregulation of the power industry, forecasting future electricity load has turned out to be a very challenging problem. Empirical data about electricity load highlight a clear seasonal behavior (higher load during the winter season), which is partly due to climatic effects. We also emphasize the presence of load periodicity on a weekly basis (electricity load is usually lower on weekends or holidays) and on a daily basis (electricity load is clearly influenced by the hour). Finally, a long-term trend may depend on the general economic situation (for example, industrial production affects electricity load). All these features must be captured by the model. The purpose of this paper is therefore to build an hourly electricity load model. The deterministic component of the model requires non-linear regression and Fourier series, while we investigate the stochastic component through econometric tools. The calibration of the model parameters is performed using data from the Italian market over a 6-year period (2007-2012). Then, we perform a Monte Carlo simulation in order to compare the simulated data with the real data (both in-sample and out-of-sample inspection). The reliability of the model is confirmed by standard tests, which highlight a good fit of the simulated values.
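
The deterministic component can be sketched with a small Fourier-series regression on synthetic hourly data; the harmonics (one pair each for yearly, weekly, and daily periods) mirror the features listed above, while the ARMA-GARCH layer for the residuals is omitted. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(2 * 8760.0)                       # two years, hourly

# Synthetic load: trend + yearly + weekly + daily cycles + noise
y = (500 + 0.002 * t + 80 * np.cos(2 * np.pi * t / 8760)
     + 30 * np.cos(2 * np.pi * t / 168)
     + 50 * np.cos(2 * np.pi * t / 24 - 1.0) + rng.normal(0, 15, t.size))

cols = [np.ones_like(t), t]                     # intercept and linear trend
for period in (8760, 168, 24):                  # one sine/cosine pair each
    cols += [np.cos(2 * np.pi * t / period), np.sin(2 * np.pi * t / period)]
X = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)    # least-squares Fourier fit
resid = y - X @ coef                            # would feed the ARMA-GARCH part
print("residual sd:", resid.std())              # ≈ the injected noise level
```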

Keywords: ARMA-GARCH process, electricity load, fitting tests, Fourier series, Monte Carlo simulation, non-linear regression

Procedia PDF Downloads 402
389 Uncertainty Assessment in Building Energy Performance

Authors: Fally Titikpina, Abderafi Charki, Antoine Caucheteux, David Bigaud

Abstract:

The building sector is one of the largest energy consumers, with about 40% of the final energy consumption in the European Union. Ensuring building energy performance is a scientific, technological and sociological matter. To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared with the measured consumption when the building is operational. When evaluating this performance, many buildings show significant differences between the calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved, not only in measurement but also those induced by the propagation of dynamic and static input data in the model being used. The evaluation of measurement uncertainty is based on both the knowledge about the measurement process and the input quantities which influence the result of the measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics, as presented in the Guide to the Expression of Uncertainty in Measurement (GUM), as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for the estimation of the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. An office building has been monitored, and multiple sensors have been mounted at candidate locations to get the required data. The monitored zone is composed of six offices and has an overall surface of 102 m². Temperature data, electrical and heating consumption, window opening and occupancy rate are the features of our research work.
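
The GUM and MCS routes can be contrasted on a deliberately simple consumption model; the model Q = U·A·ΔT and every number below are invented for the sketch, not the monitored building's data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy heat-loss model Q = U * A * dT with standard uncertainties on inputs
U, uU = 0.35, 0.03        # W/(m2 K)
A, uA = 102.0, 1.0        # m2
dT, udT = 12.0, 0.8       # K
Q = U * A * dT

# GUM route: first-order propagation; for a product, relative variances add
uQ_gum = Q * np.sqrt((uU / U)**2 + (uA / A)**2 + (udT / dT)**2)

# MCS route: sample the inputs and measure the output spread directly
n = 1_000_000
Qs = rng.normal(U, uU, n) * rng.normal(A, uA, n) * rng.normal(dT, udT, n)
print(uQ_gum, Qs.std())   # close here; they diverge for strongly non-linear models
```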

Keywords: building energy performance, uncertainty evaluation, GUM, bayesian approach, monte carlo method

Procedia PDF Downloads 464
388 Optimization of Maritime Platform Transport Problem of Solid, Special and Dangerous Waste

Authors: Ocotlán Díaz-Parra, Jorge A. Ruiz-Vanoye, Alejandro Fuentes-Penna, Beatriz Bernabe-Loranca, Patricia Ambrocio-Cruz, José J. Hernández-Flores

Abstract:

The Maritime Platform Transport Problem of Solid, Special and Dangerous Waste consists of minimizing the monetary cost of carrying different types of waste from one location to another using ships. We offer a novel mathematical model, a characterization of the problem, and the use of CPLEX to find optimal values when solving the Solid, Special and Hazardous Waste Transportation Problem for offshore platform instances of the Mexican state-owned petroleum company (PEMEX). The instances used are real WTPLib instances, and the CPLEX solver is the tool used to solve the MPTPSSDW problem.
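
The flavor of the optimization can be shown with a tiny stand-in transport LP (two platforms, two onshore sites, one waste type, invented costs), solved here with scipy rather than CPLEX; the real MPTPSSDW model is far richer.

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0],
                 [5.0, 3.0]])        # cost[i, j]: platform i -> site j
supply = np.array([30.0, 20.0])      # tonnes generated per platform
capacity = np.array([25.0, 25.0])    # handling capacity per site

# Variables x = [x00, x01, x10, x11]; ship all waste, respect capacities
A_eq = np.array([[1, 1, 0, 0], [0, 0, 1, 1]], dtype=float)
A_ub = np.array([[1, 0, 1, 0], [0, 1, 0, 1]], dtype=float)
res = linprog(cost.ravel(), A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=supply)
print(res.fun, res.x.reshape(2, 2))  # minimal cost and shipping plan
```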

Keywords: oil platform, transport problem, waste, solid waste

Procedia PDF Downloads 475
387 Application of Reliability Methods for Concrete Dams

Authors: Mustapha Kamel Mihoubi, Mohamed Essadik Kerkar

Abstract:

Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of works, including when calculating the stability of large structures against a major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods used in engineering, in our case level 2 methods via the limit-state study. Hence, the probability of failure is estimated by analytical methods of the first-order reliability method (FORM) and second-order reliability method (SORM) type. By way of comparison, a level 3 method was used, which generates a full analysis of the problem and involves integrating the probability density function of the random variables extended to the safety domain using the Monte Carlo simulation method. Taking into account the change in stress under the load combinations acting on the dam (normal, exceptional and extreme), the calculations provided acceptable failure probability values that largely corroborate the theory: the probability of failure tends to increase with increasing load intensity, causing a significant decrease in strength; shear forces then induce sliding that threatens the reliability of the structure through intolerable failure probability values, especially when uplift increases under a hypothetical failure of the drainage system.
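
For a linear limit state with independent normal variables, the level 2 and level 3 routes can be compared in a few lines; resistance and load statistics are invented for the sketch, not dam-specific values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)

# Limit state g = R - S (resistance minus load effect), independent normals
muR, sdR, muS, sdS = 12.0, 1.5, 8.0, 2.0

beta = (muR - muS) / np.hypot(sdR, sdS)   # FORM reliability index (exact here)
pf_form = norm.cdf(-beta)

g = rng.normal(muR, sdR, 1_000_000) - rng.normal(muS, sdS, 1_000_000)
pf_mc = (g < 0).mean()                    # level 3 Monte Carlo check
print(pf_form, pf_mc)
```

For this linear-normal case FORM is exact and the two estimates agree; the methods separate on non-linear limit states such as sliding with uplift.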

Keywords: dam, failure, limit-state, monte-carlo, reliability, probability, simulation, sliding, taylor

Procedia PDF Downloads 328
386 Considering Uncertainties of Input Parameters on Energy, Environmental Impacts and Life Cycle Costing by Monte Carlo Simulation in the Decision Making Process

Authors: Johannes Gantner, Michael Held, Matthias Fischer

Abstract:

The refurbishment of the building stock in terms of energy supply and efficiency is one of the major challenges of the German turnaround in energy policy. As the building sector accounts for 40% of Germany's total energy demand, additional insulation is key for energy-efficient refurbished buildings. Nevertheless, despite the energetic benefits, the environmental and economic performance of insulation materials is often questioned. The methods of Life Cycle Assessment (LCA) and Life Cycle Costing (LCC) can form the standardized basis for answering these doubts and are becoming more and more important for material producers due to efforts such as the Product Environmental Footprint (PEF) or Environmental Product Declarations (EPD). With the increasing use of LCA and LCC information for decision support, the robustness and resilience of the results become crucial, especially for decision and policy makers. LCA and LCC results are based on respective models which depend on technical parameters like efficiencies, material and energy demand, product output, etc. Nevertheless, the influence of parameter uncertainties on lifecycle results is usually not considered, or studied only superficially. The effect of parameter uncertainties cannot be neglected, however: based on the example of an exterior wall, the overall lifecycle results vary by a factor of more than three. As a result, the simple best-case/worst-case analyses used in practice are not sufficient. These analyses allow for a first rough view of the results but do not take effects such as error propagation into account. Thereby, LCA practitioners cannot provide further guidance for decision makers. Probabilistic analyses enable LCA practitioners to gain a deeper understanding of LCA and LCC results and provide better decision support. Within this study, the environmental and economic impacts of an exterior wall system over its whole lifecycle are illustrated, and the effect of different uncertainty analyses on the interpretation in terms of resilience and robustness is shown. The approaches of error propagation and Monte Carlo simulation are applied and combined with statistical methods in order to allow for a deeper understanding and interpretation. All in all, this study emphasizes the need for a deeper and more detailed probabilistic evaluation based on statistical methods. Only in this way can misleading interpretations be avoided and the results be used for resilient and robust decisions.
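
The shortcoming of best/worst-case analysis is easy to reproduce (three multiplicative lifecycle factors with ±30% ranges, all invented): the interval product is far wider than the Monte Carlo 95% band, because it prices in simultaneous extremes that are very unlikely.

```python
import numpy as np

rng = np.random.default_rng(9)
nominal = np.array([2.0, 1.5, 0.8])       # three lifecycle factors (illustrative)
lo, hi = 0.7 * nominal, 1.3 * nominal

best, worst = np.prod(lo), np.prod(hi)    # naive interval bounds

samples = rng.uniform(lo, hi, (1_000_000, 3)).prod(axis=1)
p2_5, p97_5 = np.percentile(samples, [2.5, 97.5])
print(f"interval [{best:.2f}, {worst:.2f}] vs MC 95% [{p2_5:.2f}, {p97_5:.2f}]")
```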

Keywords: uncertainty, life cycle assessment, life cycle costing, Monte Carlo simulation

Procedia PDF Downloads 289
385 Investigating the Minimum RVE Size to Simulate Poly(propylene carbonate) Composites Reinforced with Cellulose Nanocrystals as a Bio-Nanocomposite

Authors: Hamed Nazeri, Pierre Mertiny, Yongsheng Ma, Kajsa Duke

Abstract:

The background of the present study is the use of environmentally friendly biopolymer and biocomposite materials. Among the recently introduced biopolymers, poly(propylene carbonate) (PPC) has been gaining attention. This study focuses on the size of representative volume elements (RVE) needed to simulate PPC composites reinforced by cellulose nanocrystals (CNCs) as a bio-nanocomposite. Before manufacturing nanocomposites, numerical modeling should be implemented to explore and predict mechanical properties, which may be accomplished by creating and studying a suitable RVE. In other studies, modeling of composites with rod-shaped fillers has been reported assuming that the fillers are unidirectionally aligned; modeling of non-aligned filler dispersions, however, is considerably more difficult. This study investigates the minimum RVE size to enable subsequent FEA modeling. The matrix and nano-fillers were modeled using the finite element software ABAQUS, assuming randomly dispersed fillers with a filler mass fraction of 1.5%. To simulate the filler dispersion, a Monte Carlo technique was employed. The numerical simulation was implemented to find the composite elastic moduli. After commencing the simulation with a single filler particle, the number of particles was increased to assess the minimum number of filler particles that satisfies the requirements for an RVE, providing the composite elastic modulus in a reliable fashion.
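
The Monte Carlo dispersion step can be sketched as random sequential placement of non-overlapping fillers in a cubic cell; spheres stand in for the rod-like CNCs, and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(10)

def place_fillers(n_target, radius, box=1.0, max_tries=100_000):
    """Random sequential adsorption of non-overlapping spheres in a cube."""
    centers = []
    for _ in range(max_tries):
        p = rng.uniform(radius, box - radius, 3)          # stay inside the box
        if all(np.linalg.norm(p - q) >= 2 * radius for q in centers):
            centers.append(p)
            if len(centers) == n_target:
                break
    return np.array(centers)

# Grow the filler count, as in the RVE convergence study
for n in (5, 20, 80):
    c = place_fillers(n, radius=0.05)
    vf = len(c) * (4 / 3) * np.pi * 0.05**3               # volume fraction
    print(n, len(c), round(vf, 4))
```

Each realization would then be meshed and solved in ABAQUS; the RVE is large enough once the computed modulus stops changing between realizations.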

Keywords: biocomposite, Monte Carlo method, nanocomposite, representative volume element

Procedia PDF Downloads 447
384 Monte Carlo Simulation Study on Improving the Flattening Filter-Free Radiotherapy Beam Quality Using Filters from Low-Z Material

Authors: H. M. Alfrihidi, H.A. Albarakaty

Abstract:

Flattening filter-free (FFF) photon beam radiotherapy has increased in the last decade, enabled by advancements in treatment planning systems and radiation delivery techniques like multi-leaf collimators. FFF beams have higher dose rates, which reduces treatment time. On the other hand, FFF beams have a higher surface dose, due to the loss of the beam-hardening effect caused by the presence of the flattening filter (FF). The possibility of improving FFF beam quality using filters from low-Z materials such as steel and aluminium (Al) was investigated using Monte Carlo (MC) simulations. The attenuation coefficient of low-Z materials for low-energy photons is higher than that for high-energy photons, which leads to hardening of the FFF beam and, consequently, a reduction in the surface dose. The BEAMnrc user code, based on the Electron Gamma Shower (EGSnrc) MC code, was used to simulate the beam of a 6 MV TrueBeam linac. A phase-space file provided by Varian Medical Systems was used as the radiation source in the simulation. This phase-space file was scored just above the jaws, at 27.88 cm from the target. The linac from the jaws downward was constructed, and the radiation passing through it was simulated and scored at 100 cm from the target. To study the effect of low-Z filters, steel and Al filters with a thickness of 1 cm were added below the jaws, and the phase-space file was scored at 100 cm from the target. For comparison, the FF beam was simulated using a similar setup. BEAM Data Processor (BEAMdp) was used to analyse the energy spectra in the phase-space files. Then, the dose distribution resulting from these beams was simulated in a homogeneous water phantom using DOSXYZnrc. The dose profile was evaluated according to the surface dose, the lateral dose distribution, and the percentage depth dose (PDD). The energy spectra of the beams show that the FFF beam is softer than the FF beam. The energy peaks for the FFF and FF beams are 0.525 MeV and 1.52 MeV, respectively. The FFF beam's energy peak becomes 1.1 MeV using a steel filter, while the Al filter does not affect the peak position. The steel and Al filters reduced the surface dose by 5% and 1.7%, respectively. The dose at a depth of 10 cm (D10) rises by around 2% and 0.5% when using the steel and Al filters, respectively. On the other hand, the steel and Al filters reduce the dose rate of the FFF beam by 34% and 14%, respectively. However, their effect on the dose rate is smaller than that of the tungsten FF, which reduces the dose rate by about 60%. In conclusion, filters from low-Z material decrease the surface dose and increase the D10 dose, allowing for high-dose delivery to deep tumors with a low skin dose. Although using these filters affects the dose rate, this effect is much lower than that of the FF.
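
The beam-hardening mechanism invoked above can be demonstrated with a toy Monte Carlo attenuation run (not an EGSnrc calculation): a two-line stand-in spectrum at the quoted FFF/FF peak energies is filtered through 1 cm of a steel-like material, with invented attenuation coefficients chosen only so that mu falls with energy.

```python
import numpy as np

rng = np.random.default_rng(12)
n = 1_000_000

E = rng.choice([0.525, 1.52], n, p=[0.7, 0.3])   # MeV, illustrative weights
mu = np.where(E < 1.0, 0.60, 0.35)               # 1/cm, made-up values
survive = rng.random(n) < np.exp(-mu * 1.0)      # exponential attenuation, 1 cm
print("mean E:", E.mean(), "->", E[survive].mean())  # spectrum hardens
```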

Keywords: flattening filter free, monte carlo, radiotherapy, surface dose

Procedia PDF Downloads 76
383 The BNCT Project Using the Cf-252 Source: Monte Carlo Simulations

Authors: Marta Błażkiewicz-Mazurek, Adam Konefał

Abstract:

The project can be divided into three main parts: i. modeling the Cf-252 neutron source and conducting an experiment to verify the correctness of the obtained results, ii. design of the BNCT system infrastructure, iii. analysis of the results from the logical detector. Modeling of the Cf-252 source included designing the shape and size of the source as well as the energy and spatial distribution of emitted neutrons. Two options were considered: a point source and a cylindrical spatial source. The energy distribution corresponded to various spectra taken from specialized literature. Directionally isotropic neutron emission was simulated. The simulation results were compared with experimental values determined using the activation detector method using indium foils and cadmium shields. The relative fluence rate of thermal and resonance neutrons was compared in the chosen places in the vicinity of the source. The second part of the project related to the modeling of the BNCT infrastructure consisted of developing a simulation program taking into account all the essential components of this system. Materials with moderating, absorbing, and backscattering properties of neutrons were adopted into the project. Additionally, a gamma radiation filter was introduced into the beam output system. The analysis of the simulation results obtained using a logical detector located at the beam exit from the BNCT infrastructure included neutron energy and their spatial distribution. Optimization of the system involved changing the size and materials of the system to obtain a suitable collimated beam of thermal neutrons.

Keywords: BNCT, Monte Carlo, neutrons, simulation, modeling

Procedia PDF Downloads 39
382 Lean Healthcare: Barriers and Enablers in the Colombian Context

Authors: Erika Ruiz, Nestor Ortiz

Abstract:

Lean philosophy has evolved over time and has been implemented both in manufacturing and services; more recently, lean has been integrated into companies in the health sector. It is currently important to understand the successful way to implement this philosophy and to identify barriers and enablers of the sustainability of lean healthcare. The main purpose of this research is to identify the barriers and enablers in the implementation of lean healthcare, based on case studies of Colombian healthcare centers. In order to do so, we conducted semi-structured interviews based on a maturity model. The main results indicate that the success of lean implementation depends on its adaptation to contextual factors. In addition, new factors were identified in the Colombian context, such as organizational culture, management models, integration of the care and administrative departments, and the triple helix relationship.

Keywords: barriers, enablers, implementation, lean healthcare, sustainability

Procedia PDF Downloads 367
381 Estimation of the Mean of the Selected Population

Authors: Kalu Ram Meena, Aditi Kar Gangopadhyay, Satrajit Mandal

Abstract:

Two normal populations with different means and the same known variance are considered. The population with the smaller sample mean is selected. Various estimators are constructed for the mean of the selected normal population. Finally, they are compared with respect to bias and MSE risks by the method of Monte Carlo simulation, and their performances are analysed with the help of graphs.
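
The comparison scheme is easy to sketch (means, variance, sample size, and replication count invented): simulate both populations, select the one with the smaller sample mean, and tally the bias and MSE of the naive estimator of the selected mean.

```python
import numpy as np

rng = np.random.default_rng(14)
mu1, mu2, sigma, n, reps = 0.0, 0.5, 1.0, 20, 200_000

x1 = rng.normal(mu1, sigma, (reps, n)).mean(axis=1)
x2 = rng.normal(mu2, sigma, (reps, n)).mean(axis=1)

pick1 = x1 < x2                          # select the smaller sample mean
est = np.where(pick1, x1, x2)            # naive estimator of the selected mean
truth = np.where(pick1, mu1, mu2)

print("bias:", (est - truth).mean())     # negative: the selection effect
print("MSE :", ((est - truth)**2).mean())
```

The negative bias is exactly what improved estimators for the selected population are designed to correct.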

Keywords: estimation after selection, Brewster-Zidek technique, estimators, selected populations

Procedia PDF Downloads 514
380 Molecular Simulation of NO, NH3 Adsorption in MFI and H-ZSM5

Authors: Z. Jamalzadeh, A. Niaei, H. Erfannia, S. G. Hosseini, A. S. Razmgir

Abstract:

With the development of industry, emissions of pollutants such as NOx, SOx, and CO2 have rapidly increased. Generally, NOx refers to the mono-nitrogen oxides NO and NO2, which are among the most important atmospheric contaminants. Hence, controlling the emission of nitrogen oxides is environmentally urgent. Selective Catalytic Reduction (SCR) of NOx is one of the most common techniques for NOx removal, in which zeolites have wide application due to their high performance. In zeolitic processes, the catalytic reaction occurs mostly in the pores. Therefore, investigating the adsorption phenomena of the molecules, in order to gain insight into and understand the catalytic cycle, is important. Hence, in the current study, molecular simulation is applied to study the adsorption phenomena in nanocatalysts used for the SCR of NOx process. The effect of cation addition to the support on the catalysts' behavior during the adsorption step was explored by Monte Carlo (MC) simulation. A simulation time of 1 ns with a 1 fs time step, the COMPASS27 force field, and a cutoff radius of 12.5 Å were applied for the runs. It was observed that the adsorption capacity increases in the presence of cations. The sorption isotherms demonstrated type I isotherm behavior, and the sorption capacity diminished with increasing temperature, whereas an increase was observed at high pressures. Besides, NO sorption showed a higher sorption capacity than NH3 in H-ZSM5. In this respect, the energy distributions signified that the molecules adsorb at just one sorption site on the catalyst, and the sorption energy of NO was stronger than that of NH3 in H-ZSM5. Furthermore, the isosteric heats of sorption showed nearly the same values for the two molecules; however, they indicated stronger interactions of NO molecules with the H-ZSM5 zeolite compared to the isosteric heat of NH3, which was low in value.

Keywords: Monte Carlo simulation, adsorption, NOx, ZSM5

Procedia PDF Downloads 382
379 The Bayesian Premium Under Entropy Loss

Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita

Abstract:

Credibility theory is an experience rating technique in actuarial science which can be seen as one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is usually used in automobile insurance, workers' compensation premiums, and IBNR (incurred but not reported claims to the insurer), where credibility theory can be used to estimate the claim size amount. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under entropy loss, which is asymmetric, and under squared error loss, which is symmetric, with informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer's prior belief about the insured's risk level after collection of the insured's data at the end of the period. However, the explicit form of the Bayesian premium, in the case when the prior is not a member of the exponential family, can be quite difficult to obtain, as it involves a number of integrations which are not analytically solvable. The paper solves this problem by deriving the estimator using a numerical approximation (the Lindley approximation), one of the suitable approximation methods for such problems: it approaches the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate this estimator, and the mean squared error technique is used to compare the Bayesian premium estimator under the above loss functions.
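
A hedged numerical sketch of the two premiums (a grid posterior in place of the Lindley approximation; the gamma prior and its hyper-parameters are invented): under squared error loss the Bayes premium is the posterior mean of theta, while under the entropy loss delta/theta - log(delta/theta) - 1 it is the reciprocal of the posterior mean of 1/theta.

```python
import numpy as np

rng = np.random.default_rng(1)

def rlindley(theta, n):
    """Lindley(theta) sampler: mixture of Exp(theta) and Gamma(2, theta)."""
    is_exp = rng.random(n) < theta / (1 + theta)
    return np.where(is_exp, rng.exponential(1 / theta, n),
                    rng.gamma(2, 1 / theta, n))

x = rlindley(1.5, 50)                    # simulated claims

# Grid posterior with a Gamma(a, b) prior (illustrative hyper-parameters)
a, b = 2.0, 1.0
grid = np.linspace(0.01, 6.0, 4000)
loglik = (2 * np.log(grid)[:, None] - np.log(1 + grid)[:, None]
          + np.log1p(x)[None, :] - grid[:, None] * x[None, :]).sum(axis=1)
logpost = (a - 1) * np.log(grid) - b * grid + loglik
w = np.exp(logpost - logpost.max()); w /= w.sum()

print("squared-error premium:", (w * grid).sum())        # E[theta | data]
print("entropy-loss premium :", 1.0 / (w / grid).sum())  # 1 / E[1/theta | data]
```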

Keywords: bayesian estimator, credibility theory, entropy loss, monte carlo simulation

Procedia PDF Downloads 337
378 Stability of a Biofilm Reactor Able to Degrade a Mixture of the Organochlorine Herbicides Atrazine, Simazine, Diuron and 2,4-Dichlorophenoxyacetic Acid to Changes in the Composition of the Supply Medium

Authors: I. Nava-Arenas, N. Ruiz-Ordaz, C. J. Galindez-Mayer, M. L. Luna-Guido, S. L. Ruiz-López, A. Cabrera-Orozco, D. Nava-Arenas

Abstract:

Among the most important herbicides, the organochlorine compounds are of considerable interest due to their recalcitrance to chemical, biological, and photolytic degradation, their persistence in the environment, their mobility, and their bioaccumulation. The most widely used herbicides in North America are primarily 2,4-dichlorophenoxyacetic acid (2,4-D), the triazines (atrazine and simazine), and, to a lesser extent, diuron. Contamination of soils and water bodies frequently occurs through mixtures of these xenobiotics. For this reason, this work studied the operational stability, under changes in the composition of the supplied medium, of an aerobic biofilm reactor. The reactor was packed with fragments of volcanic rock that retained a complex microbial film able to degrade a mixture of the organochlorine herbicides atrazine, simazine, diuron and 2,4-D, and whose members carry the microbial genes encoding the main catabolic enzymes atzABCD, tfdACD and puhB. To acclimate the attached microbial community, the biofilm reactor was fed continuously with a mineral minimal medium containing the herbicides (in mg·L⁻¹): diuron, 20.4; atrazine, 14.2; simazine, 11.4; and 2,4-D, 59.7, as carbon and nitrogen sources. Throughout the bioprocess, removal efficiencies of 92-100% for the herbicides, 78-90% for COD, 92-96% for TOC and 61-83% for dehalogenation were reached. In the microbial community, the genes encoding the catabolic enzymes of the different herbicides, tfdACD and puhB and, occasionally, the genes atzA and atzC, were detected. After the acclimatization, the triazine herbicides were eliminated from the mixture formulation. Volumetric loading rates of the 2,4-D and diuron mixture were continuously supplied to the reactor (1.9-21.5 mg herbicides·L⁻¹·h⁻¹). Along the bioprocess, the removal efficiencies obtained were 86-100% for the herbicide mixture, 63-94% for COD, 90-100% for TOC, and dehalogenation values of 63-100%. It was also observed that the genes encoding the enzymes involved in the catabolism of both herbicides, tfdACD and puhB, were consistently detected, and, occasionally, the atzA and atzC genes. Subsequently, the triazine herbicides atrazine and simazine were restored to the supplied medium. Different volumetric loading rates of this mixture were continuously fed to the reactor (2.9 to 12.6 mg herbicides·L⁻¹·h⁻¹). During this new treatment process, removal efficiencies of 65-95% for the herbicide mixture, 63-92% for COD, 66-89% for TOC and 73-94% for dehalogenation were observed. In this last case, the tfdACD, puhB and atzABC genes encoding the enzymes involved in the catabolism of the distinct herbicides were consistently detected. The atzD gene, encoding the cyanuric hydrolase enzyme, could not be detected, though partial degradation of cyanuric acid was determined. In general, the community in the biofilm reactor showed some catabolic stability, adapting to changes in loading rates and in the composition of the herbicide mixture, and preserving its ability to degrade the four herbicides tested, although there was a significant delay in the response time needed to recover degradation of the herbicides.

Keywords: biodegradation, biofilm reactor, microbial community, organochlorine herbicides

Procedia PDF Downloads 437
377 A Location Routing Model for the Logistic System in the Mining Collection Centers of the Northern Region of Boyacá-Colombia

Authors: Erika Ruíz, Luis Amaya, Diego Carreño

Abstract:

The main objective of this study is to design a mathematical model for the logistics of the mining collection centers in the northern region of the department of Boyacá (Colombia), determining the structure that facilitates the flow of products along the supply chain. In order to achieve this, it is necessary to define a suitable design of the distribution network, taking into account the products, the customers' characteristics, and the availability of information. Likewise, some other aspects must be defined, such as the number and capacity of collection centers to establish and the routes to be taken to deliver products to the customers, among others. This research uses one of the operations research problems applied in the design of distribution networks, known as the Location Routing Problem (LRP).

Keywords: location routing problem, logistic, mining collection, model

Procedia PDF Downloads 221
376 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data

Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa

Abstract:

A generalized log-logistic distribution with variable hazard rate shapes was introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution, leading to greater flexibility in analysing and modeling various data types. The proposed distribution has a large number of well-known lifetime special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties were derived. The method of maximum likelihood was adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study was carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal, bathtub-shaped, or reversed-bathtub-shaped) hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set and compared to its sub-models, the Weibull, log-logistic, and Burr XII distributions, as well as other three-parameter parametric survival distributions, such as the exponentiated Weibull distribution, the 3-parameter lognormal distribution, the 3-parameter gamma distribution, the 3-parameter Weibull distribution, and the 3-parameter log-logistic (also known as shifted log-logistic) distribution. The proposed distribution provided a better fit than all of the competitive distributions based on the goodness-of-fit tests, log-likelihood, and information criterion values. Finally, Bayesian analysis and the performance of Gibbs sampling for the data set are also carried out.
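
A hedged sketch of the Monte Carlo part for the classical two-parameter log-logistic core, using scipy's Fisk parameterization as a stand-in for the authors' generalized model; the true parameter values and replication counts are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(15)
shape_true, scale_true = 2.5, 3.0
reps, n = 500, 200

shapes = []
for _ in range(reps):
    x = stats.fisk.rvs(shape_true, scale=scale_true, size=n, random_state=rng)
    c, loc, scale = stats.fisk.fit(x, floc=0)   # ML fit, location pinned at 0
    shapes.append(c)

shapes = np.array(shapes)
print("mean:", shapes.mean(), "bias:", shapes.mean() - shape_true,
      "sd:", shapes.std())
```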

Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation

Procedia PDF Downloads 205