Search results for: monte carlo ray tracing
299 Simulation of the Collimator Plug Design for Prompt-Gamma Activation Analysis in the IEA-R1 Nuclear Reactor
Authors: Carlos G. Santos, Frederico A. Genezini, A. P. Dos Santos, H. Yorivaz, P. T. D. Siqueira
Abstract:
Prompt-Gamma Activation Analysis (PGAA) is a valuable technique for investigating the elemental composition of various samples. However, the installation of a PGAA system entails specific conditions, such as filtering the neutron beam according to the target and providing adequate shielding for both users and detectors. These requirements incur substantial costs, exceeding $100,000 including manpower. Nevertheless, a cost-effective approach is to leverage an existing neutron beam facility to create a hybrid system integrating PGAA and Neutron Tomography (NT). The IEA-R1 nuclear reactor at IPEN/USP possesses an NT facility with suitable conditions for adapting and implementing a PGAA device. The NT facility offers a slightly colder thermal neutron flux and provides shielding for user protection. The key additional requirement is designing detector shielding to mitigate the high gamma-ray background and safeguard the HPGe detector from neutron-induced damage. This study employs Monte Carlo simulations with the MCNP6 code to optimize the collimator plug for PGAA within the IEA-R1 NT facility. Three collimator models are proposed and simulated to assess their effectiveness in shielding the gamma and neutron radiation arising from nuclear fission. The aim is to achieve a focused prompt-gamma signal while shielding ambient gamma radiation. The simulation results indicate that one of the proposed designs is particularly suitable for the PGAA-NT hybrid system.
Keywords: MCNP6.1, neutron, prompt-gamma ray, prompt-gamma activation analysis
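As a minimal illustration of the Monte Carlo principle behind such shielding studies (a toy Python sketch, not MCNP6: a single-material plug, straight-line transport, and a hypothetical macroscopic cross-section):

import numpy as np

rng = np.random.default_rng(0)

SIGMA = 0.8       # hypothetical macroscopic cross-section of the plug, 1/cm
THICKNESS = 10.0  # collimator plug thickness, cm (illustrative)
N = 100_000       # number of source neutrons

# Sample free-flight path lengths from the exponential attenuation law
paths = rng.exponential(scale=1.0 / SIGMA, size=N)

# A neutron is transmitted if its first collision lies beyond the plug
transmitted = np.mean(paths > THICKNESS)
analytic = np.exp(-SIGMA * THICKNESS)
print(f"MC transmission: {transmitted:.2e}  analytic e^(-Sigma*t): {analytic:.2e}")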
Procedia PDF Downloads 75
298 Multi-Point Dieless Forming Product Defect Reduction Using Reliability-Based Robust Process Optimization
Authors: Misganaw Abebe Baye, Ji-Woo Park, Beom-Soo Kang
Abstract:
The product quality of multi-point dieless forming (MDF) is known to depend on the process parameters. Moreover, variation in friction and material properties may have a substantially adverse influence on the final product quality. This study proposes a way to compensate for MDF product defects by minimizing the sensitivity to noise parameter variations. This can be attained with a reliability-based robust optimization (RRO) technique to obtain the optimal process setting of the controllable parameters. Initially, two MDF finite element (FE) simulations of an AA3003-H14 saddle shape showed a substantial amount of dimpling, wrinkling, and shape error. FE analyses were subsequently performed in the ABAQUS commercial software to obtain the correlation between the control process settings and the noise variation with regard to the product defects. The best prediction models are chosen from a family of metamodels to replace the computationally expensive FE simulation. A genetic algorithm (GA) is applied to determine the optimal process settings of the control parameters. Monte Carlo Analysis (MCA) is executed to determine how the noise parameter variation affects the final product quality. Finally, the RRO FE simulation and the experimental results show that amending the control parameters in the final forming process leads to a considerably better-quality product.
Keywords: dimpling, multi-point dieless forming, reliability-based robust optimization, shape error, variation, wrinkling
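A hedged sketch of the MCA step: a hypothetical quadratic metamodel stands in for the FE model, and random friction and material-strength variation is propagated through it to summarize the spread of a defect measure at one candidate process setting (all numbers are illustrative):

import numpy as np

rng = np.random.default_rng(1)

def shape_error(blank_thickness, punch_stroke, friction, yield_stress):
    # Hypothetical quadratic metamodel standing in for the expensive FE run
    return (0.5 * (blank_thickness - 1.2) ** 2
            + 0.3 * (punch_stroke - 25.0) ** 2 / 100.0
            + 2.0 * friction + 0.001 * yield_stress)

# Controllable setting under test; noise parameters vary randomly
friction = rng.normal(0.10, 0.02, size=10_000)       # friction coefficient
yield_stress = rng.normal(145.0, 5.0, size=10_000)   # MPa, AA3003-H14-like

errors = shape_error(1.25, 24.0, friction, yield_stress)
print(f"mean defect {errors.mean():.3f}, std {errors.std():.3f}, "
      f"P(defect > 0.4) = {(errors > 0.4).mean():.3f}")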
Procedia PDF Downloads 254
297 Age Estimation Using Atlas Method with Orthopantomogram and Digital Tracing on Lateral Cephalogram
Authors: Astika Swastirani
Abstract:
Chronological age can be estimated from the stage of growth and development of teeth on an orthopantomogram and from mandibular remodeling on a lateral cephalogram. Mandibular morphological change associated with size and remodeling during growth is a strong indicator for age estimation, and these changes can be observed on a lateral cephalogram. Objective: To test for a difference between chronological age and age estimated using the orthopantomogram (dental age) and the lateral cephalogram (skeletal age). Methods: The sample consisted of 100 medical records, 100 digital orthopantomograms, and 100 digital lateral cephalograms belonging to 50 males and 50 females from the Airlangga University hospital of dentistry. Orthopantomograms were matched with the London atlas, and lateral cephalograms were observed by digital tracing. The difference between dental age and skeletal age was analyzed by paired t-test. Result: The paired t-test between chronological age and dental age gave p = 0.002 (p < 0.05) in males and p = 0.605 (p > 0.05) in females. The paired t-test between chronological age and skeletal age (lengths Condylion-Gonion, Gonion-Gnathion, and Condylion-Gnathion) gave p = 0.000 (p < 0.05) in males; in females, the results were p = 0.000 for Condylion-Gonion length, p = 0.040 for Condylion-Gnathion length, and p = 0.493 for Gonion-Gnathion length. Conclusion: The orthopantomogram with the London atlas and the lateral cephalogram with the Gonion-Gnathion variable can be used for age estimation in females. The orthopantomogram with the London atlas and the lateral cephalogram with the Condylion-Gonion, Gonion-Gnathion, and Condylion-Gnathion variables cannot be used for age estimation in males.
Keywords: age estimation, chronological age, dental age, skeletal age
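The core statistical test here is a paired t-test; a minimal sketch with hypothetical paired ages:

import numpy as np
from scipy import stats

# Hypothetical paired ages (years) for a few subjects
chronological = np.array([12.1, 14.3, 9.8, 16.0, 11.5])
dental_age    = np.array([12.4, 14.0, 10.1, 15.6, 11.9])  # atlas-based estimate

t_stat, p_value = stats.ttest_rel(chronological, dental_age)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
# p > 0.05 -> no significant difference, so the method can estimate age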
Procedia PDF Downloads 169
296 Floristic Diversity, Composition and Environmental Correlates on the Arid, Coralline Islands of the Farasan Archipelago, Red Sea, Saudi Arabia
Authors: Khalid Al Mutairi, Mashhor Mansor, Magdy El-Bana, Asyraf Mansor, Saud AL-Rowaily
Abstract:
Urban expansion and the associated increase in anthropogenic pressures have led to a great loss of the Red Sea's biodiversity. Floristic composition, diversity, and environmental controls were investigated for 210 relevés on twenty coral islands of Farasan in the Red Sea, Saudi Arabia. Multivariate statistical analyses for classification (cluster analysis) and ordination (Detrended Correspondence Analysis (DCA) and Redundancy Analysis (RDA)) were employed to identify vegetation types and their relationship to the underlying environmental gradients. A total of 191 flowering plants belonging to 53 families and 129 genera were recorded. Geophytes and chamaephytes were the main life forms in the saline habitats, whereas therophytes and hemicryptophytes dominated the sandy formations and coral rocks. The cluster analysis and DCA ordination identified twelve vegetation groups linked to five main habitats with distinct floristic composition and environmental characteristics. The constrained RDA with Monte Carlo permutation tests revealed that elevation and soil salinity were the main environmental factors explaining the vegetation distributions. These results indicate that the flora of the study archipelago represents a phytogeographical link between African and Saharo-Arabian landscape functional elements. These findings should guide conservation and management efforts to maintain species diversity, which is threatened by anthropogenic activities and by invasion of the exotic invasive tree Prosopis juliflora (Sw.) DC.
Keywords: biodiversity, classification, conservation, ordination, Red Sea
Procedia PDF Downloads 343
295 Residual Lifetime Estimation for Weibull Distribution by Fusing Expert Judgements and Censored Data
Authors: Xiang Jia, Zhijun Cheng
Abstract:
The residual lifetime of a product is the operating time between the current time and the time point at which failure happens, and its estimation is very important in reliability analysis. To predict the residual lifetime, it is necessary to assume or verify a particular distribution for the lifetime of the product, and the two-parameter Weibull distribution is frequently adopted to describe lifetimes in reliability engineering. Due to time constraints and cost reduction, a life testing experiment is usually terminated before all the units have failed, so censored data are usually collected. In addition, other information can be obtained for reliability analysis: expert judgements are considered here, as it is common for experts to provide useful information concerning the reliability. Therefore, in this paper the residual lifetime is estimated for the Weibull distribution by fusing the censored data and expert judgements. First, closed forms for the point estimate and confidence interval of the residual lifetime under the Weibull distribution are presented. Next, the expert judgements are regarded as prior information, and a way to determine the prior distribution of the Weibull parameters is developed; for completeness, both the case of a single expert judgement and that of two or more expert judgements are covered. Further, the posterior distribution of the Weibull parameters is derived. Since it is difficult to derive the posterior distribution of the residual lifetime itself, a sample-based method is proposed to generate posterior samples of the Weibull parameters by the Markov Chain Monte Carlo (MCMC) method, and these samples are used to obtain the Bayes estimate and credible interval for the residual lifetime. Finally, an illustrative example is discussed to show the application. It demonstrates that the proposed method is simple, satisfactory, and robust.
Keywords: expert judgements, information fusion, residual lifetime, Weibull distribution
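A compact sketch of the sample-based step under stated assumptions: random-walk Metropolis as the MCMC sampler, flat priors in place of the paper's expert-informed priors, and made-up censored data.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical censored life test: (time, 1 = failure / 0 = censored)
t = np.array([55., 71., 83., 94., 110., 120., 120., 120.])
d = np.array([1,   1,   1,   1,   1,    0,    0,    0])

def log_post(log_k, log_lam):
    k, lam = np.exp(log_k), np.exp(log_lam)
    z = (t / lam) ** k
    # Weibull log-likelihood: failures contribute the pdf, censored the survival
    return np.sum(d * (np.log(k / lam) + (k - 1) * np.log(t / lam))) - z.sum()

# Random-walk Metropolis over (log k, log lambda), flat (vague) priors
x = np.array([0.5, 4.8]); lp = log_post(*x); samples = []
for i in range(20_000):
    prop = x + rng.normal(0, 0.1, 2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        x, lp = prop, lp_prop
    if i > 5_000:
        samples.append(x)
k_s, lam_s = np.exp(np.array(samples)).T

# Posterior residual lifetime at current age t0, by conditional inversion
t0 = 100.0
u = rng.uniform(size=k_s.size)
t_res = lam_s * ((t0 / lam_s) ** k_s - np.log(u)) ** (1 / k_s) - t0
print(f"Bayes residual-life estimate: {t_res.mean():.1f}, "
      f"95% credible interval: {np.percentile(t_res, [2.5, 97.5])}")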
Procedia PDF Downloads 142
294 Nonlinear Vibration of FGM Plates Subjected to Acoustic Load in Thermal Environment Using Finite Element Modal Reduction Method
Authors: Hassan Parandvar, Mehrdad Farid
Abstract:
In this paper, a finite element model is presented for large-amplitude vibration of functionally graded material (FGM) plates subjected to combined random pressure and thermal load. The material properties of the plates are assumed to vary continuously in the thickness direction by a simple power-law distribution in terms of the volume fractions of the constituents. The material properties depend on the temperature, whose distribution along the thickness can be expressed explicitly. The von Kármán large-deflection strain-displacement relations and the extended Hamilton's principle are used to obtain the governing system of equations of motion in structural nodal degrees of freedom (DOF) using the finite element method. A three-node triangular Mindlin plate element with shear correction factor is used. The nonlinear equations of motion in structural degrees of freedom are reduced by the modal reduction method, and the reduced equations are solved numerically by a 4th-order Runge-Kutta scheme. In this study, the random pressure is generated using a Monte Carlo method. The modeling is verified, and the nonlinear dynamic response of FGM plates is studied for various volume fractions and sound pressure levels under different thermal loads. Snap-through behavior of FGM plates is also studied.
Keywords: nonlinear vibration, finite element method, functionally graded material (FGM) plates, snap-through, random vibration, thermal effect
Procedia PDF Downloads 262
293 Bayesian Local Approach for Spatial Modeling of Visceral Leishmaniasis Infection in Northern and Central Tunisia
Authors: Kais Ben-Ahmed, Mhamed Ali-El-Aroui
Abstract:
This paper develops a Local Generalized Linear Spatial Model (LGLSM) to describe the spatial variation of visceral leishmaniasis (VL) infection risk in northern and central Tunisia. The response from each region is the number of affected children under five years of age recorded from 1996 through 2006 in Tunisian pediatric departments, treated as Poisson county-level data. The model includes climatic factors, namely averages of annual rainfall, extreme values of low temperatures in winter and high temperatures in summer to characterize the climate of each region according to a continentality index, the pluviometric quotient of Emberger (Q2) to characterize bioclimatic regions, and a component for residual extra-Poisson variation. The statistical results show a progressive increase in the number of affected children in regions with a high continentality index and low mean yearly rainfall. On the other hand, an increase in the pluviometric quotient of Emberger contributed to a significant increase in the VL incidence rate. Compared with the original GLSM, the Bayesian local model is an improvement and gives a better approximation of the Tunisian VL risk estimate. For Bayesian inference, vague priors are used for all model parameters, and the Markov Chain Monte Carlo method is applied.
Keywords: generalized linear spatial model, local model, extra-Poisson variation, continentality index, visceral leishmaniasis, Tunisia
Procedia PDF Downloads 397
292 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. Accordingly, a probability-based damage detection (PBDD) procedure based on model updating is presented in this paper, in which a one-stage, model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that accounts for the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multiagent algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in all directions and cover the entire available space, owing to their high speed and to collisions between them and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, an initial population of gas molecules is randomly generated, and the governing equations for molecular velocities and the collisions between molecules are utilized. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in PBDD of structures.
Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification
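A loose sketch of the stated enhancement only - freezing variables that stop changing after a set number of iterations - grafted onto a generic population search; the objective and update rule below are hypothetical stand-ins, not the published IGMM equations:

import numpy as np

rng = np.random.default_rng(3)

def objective(x):  # stand-in for the model-updating misfit of damage parameters
    return np.sum((x - 0.3) ** 2)

n_var, n_agents = 8, 20
pop = rng.uniform(0, 1, (n_agents, n_var))
active = np.ones(n_var, dtype=bool)        # variables still being searched
best = min(pop, key=objective).copy()
last_best = best.copy(); stall = np.zeros(n_var)

for it in range(200):
    # Perturb only active variables (molecule-like random moves toward best)
    step = rng.normal(0, 0.05, pop.shape) * active
    pop = np.clip(pop + step + 0.1 * (best - pop) * active, 0, 1)
    cand = min(pop, key=objective)
    if objective(cand) < objective(best):
        best = cand.copy()
    # Enhancement: freeze variables unchanged for 20 consecutive iterations
    stall = np.where(np.abs(best - last_best) < 1e-4, stall + 1, 0)
    active &= stall < 20
    last_best = best.copy()

print("estimated damage parameters:", np.round(best, 3))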
Procedia PDF Downloads 277
291 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool to detect deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigation. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
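One of the estimation tools named above, in its simplest form: a scalar Kalman filter tracking a latent AR(1) state from noisy observations (the parameters are made up; this is the classical filter, not the paper's quantum formulation):

import numpy as np

rng = np.random.default_rng(4)

# Simulate an AR(1) series observed with noise: x_t = a x_{t-1} + w, y_t = x_t + v
a, q, r = 0.9, 0.1, 0.5
x, ys = 0.0, []
for _ in range(500):
    x = a * x + rng.normal(0, np.sqrt(q))
    ys.append(x + rng.normal(0, np.sqrt(r)))

# Scalar Kalman filter recovering the latent state
m, p, est = 0.0, 1.0, []
for y in ys:
    m, p = a * m, a * a * p + q            # predict
    k = p / (p + r)                        # Kalman gain
    m, p = m + k * (y - m), (1 - k) * p    # update
    est.append(m)

print("last filtered state:", round(est[-1], 3))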
Procedia PDF Downloads 469
290 Solventless C−C Coupling of Low Carbon Furanics to High Carbon Fuel Precursors Using an Improved Graphene Oxide Carbocatalyst
Authors: Ashish Bohre, Blaž Likozar, Saikat Dutta, Dionisios G. Vlachos, Basudeb Saha
Abstract:
Graphene oxide, decorated with surface oxygen functionalities, has emerged as a sustainable alternative to precious-metal catalysts for many reactions. Herein, we report for the first time that graphene oxide becomes highly active for C-C coupling upon incorporation of multilayer crystalline features, a highly oxidized surface, Brønsted acidic functionalities, and defect sites on the surface and edges via modified oxidation. The resulting improved graphene oxide (IGO) demonstrates activity superior to commonly used framework zeolites for upgrading low-carbon biomass furanics to long-carbon-chain aviation fuel precursors. A maximum 95% yield of C15 fuel precursor with high selectivity is obtained at low temperature (60 °C) under neat conditions via hydroxyalkylation/alkylation (HAA) of 2-methylfuran (2-MF) and furfural. The coupling of 2-MF with carbonyl molecules ranging from C3 to C6 produced precursors of carbon numbers 12 to 21. The catalyst becomes inactive in the 4th cycle due to the loss of oxygen functionalities, defect sites, and multilayer features; however, it regains comparable activity upon regeneration. Extensive microscopic and spectroscopic characterization of the fresh and reused IGO is presented to elucidate the high activity of IGO and to establish a correlation between activity and surface and structural properties. Kinetic Monte Carlo (KMC) and density functional theory (DFT) calculations are presented to further illustrate the surface features and the reaction mechanism.
Keywords: methacrylic acid, itaconic acid, biomass, monomer, solid base catalyst
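A toy kinetic Monte Carlo (Gillespie-type) sketch of competing coupling and deactivation events; the stoichiometry follows the HAA route (two 2-MF per furfural for the C15 adduct), but the rate constants and counts are hypothetical placeholders:

import numpy as np

rng = np.random.default_rng(5)

k_couple, k_deact = 1.0e-3, 2.0e-5   # hypothetical rate constants
n_mf, n_fur, n_c15, n_sites = 500, 200, 0, 100
t = 0.0
while n_fur > 0 and n_sites > 0 and n_mf >= 2:
    rates = np.array([k_couple * n_fur * n_sites,  # HAA coupling on an acid site
                      k_deact * n_sites])          # site deactivation
    total = rates.sum()
    t += rng.exponential(1.0 / total)              # waiting time to next event
    if rng.uniform() < rates[0] / total:
        n_mf -= 2; n_fur -= 1; n_c15 += 1          # 2 x 2-MF + furfural -> C15
    else:
        n_sites -= 1
print(f"t = {t:.0f} s: C15 = {n_c15}, furfural left = {n_fur}, sites = {n_sites}")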
Procedia PDF Downloads 173
289 Biophysical Consideration in the Interaction of Biological Cell Membranes with Virus Nanofilaments
Authors: Samaneh Farokhirad, Fatemeh Ahmadpoor
Abstract:
Biological membranes are constantly in contact with various filamentous soft nanostructures that either reside on their surface or are transported between the cell and its environment. In particular, viral infections are determined by the interaction of viruses (such as filoviruses) with cell membranes; membrane protein organization (such as cytoskeletal proteins and actin filament bundles) has been proposed to influence the mechanical properties of lipid membranes; and the adhesion of filamentous nanoparticles influences their delivery yield into target cells or tissues. The goal of this research is to integrate the rapidly increasing but still fragmented experimental observations on the adhesion and self-assembly of nanofilaments (including filoviruses, actin filaments, and natural and synthetic nanofilaments) on cell membranes into a general, rigorous, and unified knowledge framework. The global outbreak of coronavirus disease in 2020, which has persisted for over three years, highlights the crucial role that nanofilament-based delivery systems play in human health. This work will unravel the role of a unique property of all cell membranes, namely flexoelectricity, and the significance of nanofilament flexibility in the adhesion and self-assembly of nanofilaments on cell membranes. This will be achieved using a combination of continuum mechanics, statistical mechanics, and molecular dynamics and Monte Carlo simulations. The findings will help address the societal need to understand the biophysical principles that govern the attachment of filoviruses and flexible nanofilaments onto living cells and will provide guidance for the development of nanofilament-based vaccines for a range of diseases, including infectious diseases and cancer.
Keywords: virus nanofilaments, cell mechanics, computational biophysics, statistical mechanics
Procedia PDF Downloads 94
288 Radiation Protection Assessment of the Emission of a d-t Neutron Generator: Simulations with MCNP Code and Experimental Measurements in Different Operating Conditions
Authors: G. M. Contessa, L. Lepore, G. Gandolfo, C. Poggi, N. Cherubini, R. Remetti, S. Sandri
Abstract:
Practical guidelines are provided in this work for the safe use of a portable d-t Thermo Scientific MP-320 neutron generator producing pulsed 14.1 MeV neutron beams. The neutron generator's emission was tested experimentally and reproduced with the MCNPX Monte Carlo code. The simulations were particularly accurate: even the generator's internal components were reproduced on the basis of ad hoc X-ray radiographic images. Measurement campaigns were conducted under different standard experimental conditions using an LB 6411 neutron detector properly calibrated at three different energies, comparing simulated and experimental data. In order to estimate the dose to the operator as a function of the operating conditions and the energy spectrum, the most appropriate value of the conversion factor between neutron fluence and ambient dose equivalent was identified, taking into account both direct and scattered components. The results of the simulations show that, in real situations, when there is no information about the neutron spectrum at the point where the dose has to be evaluated, it is possible - and in any case conservative - to convert the measured count rate using the conversion factor corresponding to 14 MeV energy. This outcome holds generally when using this type of generator, enabling a more accurate design of experimental activities in different setups. The increasingly widespread use of this type of device for industrial and medical applications makes the results of this work of interest in different situations, especially as support for the definition of appropriate radiation protection procedures and, in general, for risk analysis.
Keywords: instrumentation and monitoring, management of radiological safety, measurement of individual dose, radiation protection of workers
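The conservative conversion described above amounts to a single multiplication; a back-of-envelope sketch (the coefficient below is indicative of the 14 MeV fluence-to-H*(10) value of the kind tabulated in ICRP 74, and the fluence rate is hypothetical):

# Conversion from neutron fluence rate to ambient dose equivalent rate,
# using a conservative ~14 MeV conversion coefficient (indicative value).
H10_PER_FLUENCE = 5.2e-10   # Sv*cm^2 per neutron, ~14 MeV (illustrative)

fluence_rate = 2.0e4        # n/cm^2/s at the operator position (hypothetical)
dose_rate = fluence_rate * H10_PER_FLUENCE * 3600.0  # Sv/h
print(f"ambient dose equivalent rate: {dose_rate * 1e6:.1f} uSv/h")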
Procedia PDF Downloads 132
287 Competing Risks Modeling Using within Node Homogeneity Classification Tree
Authors: Kazeem Adesina Dauda, Waheed Babatunde Yahya
Abstract:
To design a tree that maximizes within-node homogeneity, a homogeneity measure appropriate for event history data with multiple risks is needed. We consider the use of deviance and modified Cox-Snell residuals as impurity measures in Classification and Regression Trees (CART) and compare our results with those of Fiona (2008), in which homogeneity measures were based on the martingale residual. A data structure approach was used to validate the performance of our proposed techniques via simulation and real-life data. The results for univariate competing risks revealed that using deviance and Cox-Snell residuals as the response in a within-node homogeneity classification tree performs better than using other residuals, irrespective of the performance technique. Bone marrow transplant data and a double-blinded randomized clinical trial conducted to compare two treatments for patients with prostate cancer were used to demonstrate the efficiency of our proposed method vis-à-vis the existing ones. Results from the empirical study of the bone marrow transplant data showed that the proposed model with the Cox-Snell residual (deviance = 16.6498) performs better than both the martingale residual (deviance = 160.3592) and the deviance residual (deviance = 556.8822) for both the event of interest and the competing risks. Additionally, the prostate cancer results also reveal the better performance of the proposed model over the existing one for both causes; interestingly, the Cox-Snell residual (MSE = 0.01783563) outperforms both the martingale residual (MSE = 0.1853148) and the deviance residual (MSE = 0.8043366). Moreover, these results validate those obtained from the Monte Carlo studies.
Keywords: within-node homogeneity, martingale residual, modified Cox-Snell residual, classification and regression tree
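A minimal sketch of the residual idea under a deliberately simple assumption (a constant-hazard exponential null model rather than the paper's competing-risks setup), with the residuals then fed to an off-the-shelf regression tree so that splits maximize within-node homogeneity:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(6)

# Hypothetical survival data: time, event indicator, one covariate
n = 300
x = rng.uniform(0, 1, n)
t = rng.exponential(1.0 / (0.5 + 2.0 * x))          # true hazard depends on x
c = rng.exponential(2.0, n)                          # censoring times
time, d = np.minimum(t, c), (t <= c).astype(float)

# Cox-Snell residuals under an exponential null model
lam = d.sum() / time.sum()                           # MLE of constant hazard
cox_snell = lam * time                               # estimated cumulative hazard
martingale = d - cox_snell
deviance = np.sign(martingale) * np.sqrt(
    -2.0 * (martingale + d * np.log(np.where(d > 0, d - martingale, 1.0))))

# Use the residuals as the response of a regression tree, so splits
# maximize within-node homogeneity of the residuals
tree = DecisionTreeRegressor(max_depth=2).fit(x.reshape(-1, 1), deviance)
print("first split at x =", round(tree.tree_.threshold[0], 3))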
Procedia PDF Downloads 272
286 The Analysis of Personalized Low-Dose Computed Tomography Protocol Based on Cumulative Effective Radiation Dose and Cumulative Organ Dose for Patients with Breast Cancer with Regular Chest Computed Tomography Follow up
Authors: Okhee Woo
Abstract:
Purpose: The aim of this study is to evaluate the 2-year cumulative effective radiation dose and cumulative organ dose of regular follow-up computed tomography (CT) scans in patients with breast cancer, and to establish a personalized low-dose CT protocol. Methods and Materials: A retrospective study was performed on patients with breast cancer who were diagnosed and managed consistently on the basis of the routine breast cancer follow-up protocol between 2012-01 and 2016-06. Based on ICRP (International Commission on Radiological Protection) Publication 103, the cumulative effective radiation doses of each patient over the 2-year follow-up were analyzed using commercial radiation management software (Radimetrics, Bayer Healthcare). The personalized doses to each organ were analyzed in detail by the software's Monte Carlo simulation. Results: A total of 3822 CT scans on 490 patients was evaluated (age: 52.32±10.69). The mean number of scans per patient was 7.8±4.54, and each patient was exposed to 95.54±63.24 mSv of radiation over 2 years. The cumulative CT radiation dose was significantly higher in patients with lymph node metastasis (p = 0.00). HER-2-positive patients were exposed to more radiation than estrogen- or progesterone-receptor-positive patients (p = 0.00). There was no difference in the cumulative effective radiation dose between age groups. Conclusion: Knowing how much radiation a patient has been exposed to is the starting point for managing radiation exposure in patients with long-term CT follow-up. A precise and personalized protocol, as well as iterative reconstruction, may reduce the hazard from unnecessary radiation exposure.
Keywords: computed tomography, breast cancer, effective radiation dose, cumulative organ dose
Procedia PDF Downloads 197
285 Influence of Travel Time Reliability on Elderly Drivers Crash Severity
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
Although older drivers (defined as those aged 65 and above) are less involved in speeding, alcohol use, and night driving, they are more vulnerable to severe crashes, with frailty and medical complications as major contributing factors. Several studies have evaluated the factors contributing to crash severity; however, few have established the impact of travel time reliability (TTR) on road safety. In particular, the impact of TTR on senior adults, who face several challenges, including hearing difficulties, declining processing skills, and cognitive problems while driving, is not well established. Therefore, this study focuses on determining the possible impacts of TTR on traffic safety with a focus on elderly drivers. Historical travel speed data from freeway links in the study area were used to calculate travel time and the associated TTR metrics, that is, the planning time index, the buffer index, the standard deviation of the travel time, and the probability of congestion. Four years of data on crashes occurring on these freeway links were acquired. A binary logit model, estimated using the Markov Chain Monte Carlo (MCMC) sampling technique, was used to evaluate the variables that could influence elderly crash severity. Preliminary results suggest that TTR is statistically significant in affecting the severity of a crash involving an elderly driver: a one-unit increase in the probability of congestion reduces the likelihood of a severe elderly crash by nearly 22%. These findings will enhance the understanding of TTR and its impact on elderly crash severity.
Keywords: highway safety, travel time reliability, elderly drivers, traffic modeling
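A sketch of the estimation machinery on synthetic data: a two-coefficient binary logit sampled by random-walk Metropolis, a basic MCMC scheme (the data-generating coefficients are made up, with the congestion effect set negative as in the reported finding):

import numpy as np

rng = np.random.default_rng(7)

# Hypothetical data: severe-crash indicator vs. probability of congestion
n = 400
x = rng.uniform(0, 1, n)                      # probability of congestion
true_b = np.array([-0.5, -1.2])               # negative TTR effect, as reported
p = 1 / (1 + np.exp(-(true_b[0] + true_b[1] * x)))
y = rng.binomial(1, p)

def loglik(b):
    eta = b[0] + b[1] * x
    return np.sum(y * eta - np.log1p(np.exp(eta)))  # Bernoulli log-likelihood

# Random-walk Metropolis sampling of the logit coefficients
b = np.zeros(2); ll = loglik(b); draws = []
for i in range(15_000):
    prop = b + rng.normal(0, 0.15, 2)
    ll_prop = loglik(prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        b, ll = prop, ll_prop
    if i > 5_000:
        draws.append(b.copy())
print("posterior mean coefficients:", np.array(draws).mean(axis=0).round(2))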
Procedia PDF Downloads 493
284 Nonlinear Finite Element Modeling of Deep Beam Resting on Linear and Nonlinear Random Soil
Authors: M. Seguini, D. Nedjar
Abstract:
An accurate nonlinear analysis of a deep beam resting on an elastic perfectly plastic soil is carried out in this study. Specifically, nonlinear finite element modeling of the large deflection and moderate rotation of an Euler-Bernoulli beam resting on linear and nonlinear random soil is investigated. The geometric nonlinear analysis of the beam is based on the theory of von Kármán, and the Newton-Raphson incremental-iterative method is implemented in a Matlab code to solve the nonlinear equations of the soil-beam interaction system. Two analyses (deterministic and probabilistic) are carried out to verify the accuracy and efficiency of the proposed model, where the local average theory based on the Monte Carlo approach is used to analyze the effect of the spatial variability of the soil properties on the nonlinear beam response. The effects of six main parameters are investigated: the external load, the length of the beam, the coefficient of subgrade reaction of the soil, the Young's modulus of the beam, and the coefficient of variation and correlation length of the soil's coefficient of subgrade reaction. A comparison between the beam resting on the linear and on the nonlinear soil model is presented for different beam lengths and external loads. Numerical results have been obtained for the combination of the geometric nonlinearity of the beam and the material nonlinearity of the random soil. This comparison highlights the need to include the material nonlinearity and spatial variability of the soil in the geometric nonlinear analysis when the beam undergoes large deflections.
Keywords: finite element method, geometric nonlinearity, material nonlinearity, soil-structure interaction, spatial variability
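The Newton-Raphson iteration at the heart of such an analysis, reduced to a one-degree-of-freedom analogue (linear subgrade stiffness plus a cubic large-deflection term; stiffness and load values are illustrative):

# Newton-Raphson for a one-DOF analogue of the soil-beam system:
# linear beam/soil stiffness plus a cubic (large-deflection) term.
K_LIN, K_NL, F = 1.0e4, 5.0e6, 800.0   # N/m, N/m^3, N (illustrative)

u = 0.0
for it in range(50):
    residual = K_LIN * u + K_NL * u ** 3 - F
    tangent = K_LIN + 3.0 * K_NL * u ** 2   # consistent tangent stiffness
    du = -residual / tangent
    u += du
    if abs(du) < 1e-12:
        break
print(f"converged in {it + 1} iterations, deflection u = {u * 1000:.2f} mm")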
Procedia PDF Downloads 414
283 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods
Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim
Abstract:
Retaining structures are increasingly considered in geotechnical engineering projects due to the extensive growth of urban areas. These engineering constructions may develop instabilities over time and may require reinforcement or even rebuilding. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study examines the failure probability of constructing a retaining wall over the debris of an old, collapsed one. The new solution will extend approximately 350 m and will be located on the margins of Lake Paranoá in Brasília, the capital of Brazil. The building process must also account for the use of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters, and a Standard Penetration Test (SPT) defined the in situ soil stratigraphy. The parameters obtained were also checked against soil data from a collection of masters and doctoral works from the University of Brasília on similar local soils. Initial studies show that the concrete wall is the proper solution for this case, taking into account the technical, economic, and deterministic analyses. On the other hand, in order to better assess the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.
Keywords: economical analysis, probability of failure, retaining walls, statistical analysis
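A minimal sketch of the Monte Carlo step: sample soil strength parameters, evaluate a simplified sliding factor of safety, and count failures (the resistance model and all distributions are hypothetical placeholders, not the project's design equations):

import numpy as np

rng = np.random.default_rng(8)
N = 200_000

# Hypothetical soil strength parameters and a simplified sliding model:
# FS = resisting / driving
cohesion = rng.lognormal(mean=np.log(25.0), sigma=0.25, size=N)   # kPa
friction_deg = rng.normal(28.0, 3.0, size=N)                      # degrees
driving = 180.0                                                   # kN/m, fixed

resisting = 4.0 * cohesion + 300.0 * np.tan(np.radians(friction_deg))
fs = resisting / driving
pf = np.mean(fs < 1.0)
print(f"mean FS = {fs.mean():.2f}, probability of failure = {pf:.4f}")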
Procedia PDF Downloads 406
282 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface
Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari
Abstract:
With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test may be unavailable for all study subjects, due to the expensiveness or invasiveness of the GS test; thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased, a bias known as verification bias. Here, we consider the problem of correcting for verification bias when continuous diagnostic tests for three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given the test results and other observed covariates, i.e., that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification-bias-corrected estimators of the ROC surface and of VUS are proposed, namely full imputation, mean score imputation, inverse probability weighting, and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite-sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis
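For the fully verified case mentioned above, the unbiased nonparametric VUS estimator is simply the fraction of correctly ordered triples; a short sketch on synthetic scores:

import numpy as np

rng = np.random.default_rng(9)

# Hypothetical test results for three verified disease classes
x = rng.normal(0.0, 1.0, 60)   # class 1 (healthy)
y = rng.normal(1.0, 1.0, 50)   # class 2 (intermediate)
z = rng.normal(2.0, 1.0, 40)   # class 3 (diseased)

# Nonparametric VUS: fraction of triples correctly ordered x < y < z
vus = np.mean((x[:, None, None] < y[None, :, None]) &
              (y[None, :, None] < z[None, None, :]))
print(f"estimated VUS = {vus:.3f}  (1/6 = 0.167 is chance level)")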
Procedia PDF Downloads 416
281 Computer Simulation of Hydrogen Superfluidity through Binary Mixing
Authors: Sea Hoon Lim
Abstract:
A superfluid is a fluid of bosons that flows without resistance. In order to become a superfluid, a substance's particles must behave like bosons yet remain mobile enough to be considered a fluid. Bosons are particles that, at low temperature, can occupy the same energy state simultaneously; if bosons are cooled down, the particles will all tend to occupy the lowest energy state, which is called Bose-Einstein condensation. Boson statistics start to matter once the substance reaches its critical temperature. For example, when helium reaches its critical temperature of 2.17 K, the liquid density drops and it becomes a superfluid with zero viscosity. However, most materials solidify - and thus do not remain fluids - at temperatures well above the temperature at which they would otherwise become superfluid. Only a few substances currently known are capable of at once remaining a fluid and manifesting boson statistics; the most well known of these is helium and its isotopes. Because hydrogen is lighter than helium, and thus expected to manifest Bose statistics at higher temperatures than helium, one might expect hydrogen to also be a superfluid. As of today, however, no one has been able to produce a bulk hydrogen superfluid. The reason hydrogen has not formed a superfluid so far is its intermolecular interactions, as a result of which hydrogen molecules are much more likely to crystallize than their helium counterparts. The key to creating a hydrogen superfluid is therefore finding a way to reduce the effect of the interactions among hydrogen molecules, postponing solidification to lower temperature. In this work, we attempt via computer simulation to produce bulk superfluid hydrogen through binary mixing, a technique of mixing two pure substances in order to avoid crystallization and enhance superfluidity. Our mixture here is KALJ H2. We then sample the partition function using Path Integral Monte Carlo (PIMC), which is well suited for the equilibrium properties of low-temperature bosons and captures not only the statistics but also the dynamics of hydrogen. Via this sampling, we then produce a time evolution of the substance and see if it exhibits superfluid properties.
Keywords: superfluidity, hydrogen, binary mixture, physics
Procedia PDF Downloads 316
280 Analysis of Noise Environment and Acoustics Material in Residential Building
Authors: Heruanda Alviana Giska Barabah, Hilda Rasnia Hapsari
Abstract:
Acoustic phenomena create conditions of acoustic interpretation that describe the characteristics of an environment. In urban areas, heterogeneous and simultaneous human activity tends to form a soundscape that differs from other regions; one characteristic of urban areas that shapes this soundscape is the presence of vertical housing, i.e., residential buildings. Activities both within the building and in the surrounding environment create a soundscape with certain characteristics. The acoustic comfort of residential buildings is therefore an important aspect, and this demand leads building features to become more diverse. Mapping acoustic conditions in a soundscape is an important initial step, as it is the method for identifying uncomfortable conditions. Noise generated by road traffic, railways, and aircraft is an important consideration, especially for urban people; the proper design of the building therefore becomes very important as an effort to provide appropriate acoustic comfort. In this paper, the authors developed a noise map of the residential building site, with measurements taken at points chosen with reference to the noise sources. The mapping results became the basis for modeling how acoustic waves interact with the building model. Material selection was done based on a literature study and modeling simulation using Insul, considering the absorption coefficient and Sound Transmission Class. The acoustic rays were analyzed by the ray tracing method using the Comsol simulation software, which can show the movement of acoustic rays and their interaction with a boundary. The results of this study can be used to select boundary materials in residential buildings, as well as to improve the acoustic quality in the acoustic zones that are formed.
Keywords: residential building, noise, absorption coefficient, sound transmission class, ray tracing
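A toy two-dimensional version of the ray tracing idea: specular reflections in a rectangular room, each wall hit multiplying ray energy by one minus a single absorption coefficient (room size and alpha are illustrative, unlike the frequency-dependent Comsol model):

import numpy as np

rng = np.random.default_rng(10)

LX, LY, ALPHA, C = 5.0, 4.0, 0.3, 343.0   # room (m), absorption, speed (m/s)
n_rays, t_end = 2000, 0.5
energies = []

for _ in range(n_rays):
    pos = np.array([2.0, 1.5])            # source position
    ang = rng.uniform(0, 2 * np.pi)
    vel = C * np.array([np.cos(ang), np.sin(ang)])
    e, t = 1.0, 0.0
    while t < t_end:
        # time to the next wall along x and y
        tx = ((LX if vel[0] > 0 else 0.0) - pos[0]) / vel[0]
        ty = ((LY if vel[1] > 0 else 0.0) - pos[1]) / vel[1]
        dt = min(tx, ty)
        pos, t = pos + vel * dt, t + dt
        vel[0 if tx < ty else 1] *= -1.0   # specular reflection
        e *= 1.0 - ALPHA                   # wall absorption
    energies.append(e)

print(f"mean ray energy after {t_end}s: {np.mean(energies):.4f}")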
Procedia PDF Downloads 247
279 Association of Dietary Intake with the Nutrition Knowledge, Food Label Use, and Food Preferences of Adults in San Jose del Monte City, Bulacan, Philippines
Authors: Barby Jennette A. Florano
Abstract:
Dietary intake has been associated with the health and wellbeing of adults and with lifestyle-related diseases. The aim of this study was to investigate whether nutrition knowledge, food label use, and food preference are associated with dietary intake in a sample of San Jose Del Monte City, Bulacan (SJDM) adults. A sample of 148 adults, with a mean age of 20 years, completed a validated questionnaire on their demographics, dietary intake, nutrition knowledge, food label use, and food preference. Data were analyzed using Pearson correlation. There was no association between dietary intake and nutrition knowledge; however, there were positive relationships between dietary intake and food label use (r = 0.1276, p < 0.10) and between dietary intake and food preference (r = 0.1070, p < 0.10). SJDM adults who use food labels and have extensive food preferences had better diet quality. This finding highlights the role of nutrition education as a potential tool in health campaigns to promote healthy eating patterns and the reading of food labels among students and adults. The results of this study can inform the design of future nutrition education intervention studies assessing the efficacy of nutrition knowledge and food label use in a similar sample population.
Keywords: dietary intake, nutrition knowledge, food preference, food label use
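The analysis reduces to Pearson correlations with significance tests; a minimal sketch on synthetic scores scaled to the study's sample size of 148:

import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Hypothetical questionnaire scores for n = 148 adults
label_use = rng.normal(50, 10, 148)
diet = 0.13 * label_use + rng.normal(50, 10, 148)   # weak positive link

r, p = stats.pearsonr(label_use, diet)
print(f"r = {r:.3f}, p = {p:.3f}  (judged at the study's p < 0.10 level)")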
Procedia PDF Downloads 91
278 Development of an Autonomous Automated Guided Vehicle with Robot Manipulator under Robot Operation System Architecture
Authors: Jinsiang Shaw, Sheng-Xiang Xu
Abstract:
This paper presents the development of an autonomous automated guided vehicle (AGV) with a robot arm attached on top of it within the framework of the Robot Operating System (ROS). ROS provides libraries and tools including hardware abstraction, device drivers, visualizers, message-passing, package management, etc. For this reason, this AGV can provide automatic navigation, parts transportation, and pick-and-place tasks using the robot arm for typical industrial production line use. More specifically, the AGV is controlled by an on-board host computer running ROS software. Command signals for vehicle and robot arm control and measurement signals from various sensors are transferred to the respective microcontrollers. Users can operate the AGV remotely through the TCP/IP protocol and perform SLAM (Simultaneous Localization and Mapping). An RGB-D camera and LIDAR sensors installed on the AGV are used to perceive the environment. For SLAM, Gmapping is used to construct the environment map with a Rao-Blackwellized particle filter, and the AMCL method (Adaptive Monte Carlo Localization) is employed for mobile robot localization. In addition, the current AGV position and orientation can be visualized with the ROS toolkit. As for robot navigation and obstacle avoidance, A* is implemented for global path planning and the dynamic window approach for local planning. The developed ROS AGV with a robot arm on it has been tested in the university factory. 2-D and 3-D maps of the factory were successfully constructed by the SLAM method. Based on these maps, robot navigation through the factory with and without dynamic obstacles is shown to perform well. Finally, pick-and-place of parts using the robot arm and subsequent delivery in the factory by the mobile robot are also accomplished.
Keywords: automated guided vehicle, navigation, robot operation system, Simultaneous Localization and Mapping
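A self-contained sketch of the global planner named above: A* on a small occupancy grid with a Manhattan heuristic (the grid, start, and goal cells are made up):

import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (1 = obstacle), Manhattan heuristic."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]
    came, seen = {}, set()
    while open_set:
        f, g, cur, parent = heapq.heappop(open_set)
        if cur in seen:
            continue
        seen.add(cur); came[cur] = parent
        if cur == goal:                      # rebuild the path backwards
            path = []
            while cur:
                path.append(cur); cur = came[cur]
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dx, cur[1] + dy)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in seen):
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None

factory = [[0, 0, 0, 1, 0],
           [1, 1, 0, 1, 0],
           [0, 0, 0, 0, 0]]
print(astar(factory, (0, 0), (2, 4)))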
Procedia PDF Downloads 149
277 Low-Voltage and Low-Power Bulk-Driven Continuous-Time Current-Mode Differentiator Filters
Authors: Ravi Kiran Jaladi, Ezz I. El-Masry
Abstract:
Emerging technologies, such as ultra-wideband wireless access technology operating at ultra-low power, present several challenges whose inherent design limits the use of voltage-mode filters. Continuous-time current-mode (CTCM) filters have therefore become very popular in recent times, as they have a wider dynamic range, improved linearity, and extended bandwidth compared to their voltage-mode counterparts. The goal of this research is to develop analog filters suitable for current CMOS technology scaling. The bulk-driven MOSFET is one of the most popular low-power design techniques for the existing challenges, while other techniques have obvious shortcomings. In this work, a CTCM gate-driven (GD) differentiator with a frequency range from dc to 100 MHz is presented, operating at a very low supply voltage of 0.7 V. A novel CTCM bulk-driven (BD) differentiator has been designed for the first time, which reduces the power consumption severalfold compared to the GD differentiator. The GD and BD differentiators have been simulated in CADENCE with TSMC 65 nm technology for all the bilinear and biquadratic band-pass frequency responses. These basic building blocks can be used to implement higher-order filters: a 6th-order cascade CTCM Chebyshev band-pass filter has been designed using the GD and BD techniques. In conclusion, low-power GD and BD 6th-order Chebyshev stagger-tuned band-pass filters were simulated, all the parameters obtained from the resulting realizations are analyzed and compared, and Monte Carlo analysis is performed for both 6th-order filters, with the results of the sensitivity analysis presented.
Keywords: bulk-driven (BD), continuous-time current-mode filters (CTCM), gate-driven (GD)
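A software analogue of the Monte Carlo sensitivity analysis (in Python rather than CADENCE): perturb the band edges of a 6th-order Chebyshev band-pass (an order-3 cheby1 prototype, hence 6 poles after the band-pass transform) and observe the spread of the realized center frequency; all frequencies and tolerances are illustrative:

import numpy as np
from scipy import signal

rng = np.random.default_rng(14)

f_lo, f_hi, ripple_db = 40e6, 60e6, 1.0
centers = []
for _ in range(500):
    lo = f_lo * (1 + rng.normal(0, 0.02))   # 2% parameter variation
    hi = f_hi * (1 + rng.normal(0, 0.02))
    b, a = signal.cheby1(3, ripple_db, [lo, hi], btype="bandpass", fs=400e6)
    w, h = signal.freqz(b, a, worN=2048, fs=400e6)
    centers.append(w[np.argmax(np.abs(h))])
print(f"center frequency: mean {np.mean(centers)/1e6:.1f} MHz, "
      f"std {np.std(centers)/1e6:.2f} MHz")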
Procedia PDF Downloads 260
276 Effect of Correlation of Random Variables on Structural Reliability Index
Authors: Agnieszka Dudzik
Abstract:
The problem of correlation between random variables in structural reliability analysis has been extensively discussed in the literature. The cases usually considered involve correlation between random variables on one side of the ultimate limit state: correlation between particular loads applied to the structure, or correlation between the resistances of particular members of a structure treated as a system. It has been proved that positive correlation between these random variables reduces the reliability of the structure and increases the probability of failure. In this paper, the problem of correlation between random variables from both sides of the limit state equation is considered, in the simplest case where these random variables are normally distributed and the degree of correlation is described by the covariance or the coefficient of correlation. Special attention is paid to the questions of how much that correlation changes the reliability level and whether it can be ignored. The reliability analysis uses well-known methods for assessing the failure probability: the Hasofer-Lind reliability index and the Monte Carlo method adapted to the correlation problem. The main purpose of this work is to present how the correlation of random variables influences the reliability index of steel bar structures. Structural design parameters are defined as deterministic values and as random variables, the latter being correlated. The criterion of structural failure is expressed by limit functions related to the ultimate and serviceability limit states, and only the normal distribution is used to describe the random variables. The sensitivity of the reliability index to the random variables is determined: if the sensitivity of the reliability index to a random variable X is low compared with the other variables, the impact of this variable on the failure probability is small, and in successive computations it can therefore be treated as a deterministic parameter. Sensitivity analysis leads to a simplified description of the mathematical model and to new limit functions and values of the Hasofer-Lind reliability index. In the examples, the NUMPRESS software is used for the reliability analysis.
Keywords: correlation of random variables, reliability index, sensitivity of reliability index, steel structure
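A minimal sketch of the effect in question: resistance R and load effect E on opposite sides of the limit state g = R - E, with the reliability index and Monte Carlo failure probability recomputed for several correlation coefficients (all moments are illustrative):

import numpy as np

rng = np.random.default_rng(12)
N = 500_000

mu = np.array([420.0, 300.0])      # means of R and E
sd = np.array([35.0, 30.0])

for rho in (0.0, 0.5, 0.8):        # correlation across the limit state
    cov = np.array([[sd[0]**2, rho*sd[0]*sd[1]],
                    [rho*sd[0]*sd[1], sd[1]**2]])
    R, E = rng.multivariate_normal(mu, cov, size=N).T
    pf = np.mean(R - E < 0)
    # Cornell index (equals the Hasofer-Lind index for this linear g)
    beta = (mu[0]-mu[1]) / np.sqrt(sd[0]**2 + sd[1]**2 - 2*rho*sd[0]*sd[1])
    print(f"rho = {rho}: beta = {beta:.2f}, MC failure probability = {pf:.2e}")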
Procedia PDF Downloads 237
275 Modified Weibull Approach for Bridge Deterioration Modelling
Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight
Abstract:
State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions of age for each condition state cannot be derived directly from a Markov model for a given bridge element group, although this is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM model output, namely the Modified Weibull approach, which consists of a set of appropriate functions describing the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and Metropolis-Hastings algorithm (MHA) based Markov Chain Monte Carlo (MCMC) simulation to quantify the uncertainty in the model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified, and inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach, and the condition state predictions obtained were validated using statistical hypothesis tests on a test data set. The results show that the proposed model is able not only to predict conditions at the network level accurately but also to capture the model uncertainties with a given confidence interval.
Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models
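A sketch of the central idea - Weibull-type survival functions giving, per age, the percentage of elements in each condition state - with purely illustrative shape and scale parameters (not the paper's fitted values):

import numpy as np

def share_at_least(age, eta, beta):
    # Weibull survival: share of elements still above a state boundary at age
    return np.exp(-(age / eta) ** beta)

ages = np.array([0, 10, 20, 30, 40])
eta = {1: 60.0, 2: 35.0, 3: 18.0}     # scale (years) per state boundary
beta = {1: 1.8, 2: 1.6, 3: 1.4}       # shape per state boundary

# Percentages in four condition states from nested survival curves
s1 = share_at_least(ages, eta[3], beta[3])            # still as-new
s2 = share_at_least(ages, eta[2], beta[2]) - s1
s3 = share_at_least(ages, eta[1], beta[1]) - s1 - s2
s4 = 1.0 - s1 - s2 - s3
for a, p in zip(ages, np.c_[s1, s2, s3, s4]):
    print(f"age {a:2d}: " + " ".join(f"{v:5.2f}" for v in p))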
Procedia PDF Downloads 727
274 Calculation of Secondary Neutron Dose Equivalent in Proton Therapy of Thyroid Gland Using FLUKA Code
Authors: M. R. Akbari, M. Sadeghi, R. Faghihi, M. A. Mosleh-Shirazi, A. R. Khorrami-Moghadam
Abstract:
Proton radiotherapy (PRT) is becoming an established treatment modality for cancer. Localized tumors such as undifferentiated thyroid tumors are insufficiently handled by conventional radiotherapy, whereas protons offer the prospect of increasing the tumor dose without exceeding the tolerance of the surrounding healthy tissues. Despite the considerable advantage of delivering a localized radiation dose to the tumor region, secondary neutron production in proton therapy can contribute significantly to the integral dose and lessen the advantages of this modality compared to conventional radiotherapy techniques. Furthermore, neutrons have a high quality factor, so even a small physical dose can cause considerable biological effects, and measuring this neutron dose is a critical step in predicting secondary cancer incidence. FLUKA Monte Carlo simulations have been used to evaluate the dose due to secondaries in proton therapy. In this study, after validating the simulated proton beam range in a water phantom against the CSDA range from NIST for the studied proton energy range (34-54 MeV), proton therapy of thyroid gland cancer was simulated using the FLUKA code. The secondary neutron dose equivalents of several organs and tissues beyond the target volume caused by 34 and 54 MeV proton interactions were calculated in order to evaluate secondary cancer incidence. A multilayer cylindrical neck phantom comprising all the layers of neck tissue, with a proton beam impinging normally on the phantom, was also simulated. The trachea (together with the larynx) had the greatest dose equivalent (1.24×10⁻¹ and 1.45 pSv per primary proton at 34 and 54 MeV, respectively) among the simulated tissues beyond the target volume in the neck region.
Keywords: FLUKA code, neutron dose equivalent, proton therapy, thyroid gland
Procedia PDF Downloads 425
273 A Data-Driven Agent Based Model for the Italian Economy
Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio
Abstract:
We develop a data-driven agent-based model (ABM) for the Italian economy and calibrate its initial conditions and parameters. As a preliminary step, we replicate the Monte Carlo simulation for the Austrian economy. We then evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency in terms of the disequilibrium patterns arising in the search-and-matching processes for final goods, capital, intermediate goods, and credit markets. In this perspective, we use a randomized initial condition approach, and we perform a robustness analysis by perturbing the system under different parameter setups. We explore the empirical properties of the model using a rolling-window forecast exercise from 2010 to 2022 to observe the model's forecasting ability in the wake of the COVID-19 pandemic, and we analyze the properties of the model with different numbers of agents, that is, with different scales of the model relative to the real economy. The model generally displays transient dynamics that fit macroeconomic data well in terms of forecasting ability. We stress the model with a large set of shocks, namely interest rate policy, fiscal policy, and exogenous factors such as external foreign demand for exports; in this way, we can identify the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors and, consequently, the underlying input-output sectoral interdependence, to stress the economy and observe the long-run projections. In this way, the model can generate endogenous crises due to the implied structural change, technological unemployment, and a potential lack of aggregate demand, creating the conditions for the cyclical endogenous crises reproduced in this artificial economy.
Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data
Procedia PDF Downloads 69
272 Ground Surface Temperature History Prediction Using Long-Short Term Memory Neural Network Architecture
Authors: Venkat S. Somayajula
Abstract:
Ground surface temperature history prediction models play a vital role in setting standards for international nuclear waste management: international standards for borehole-based nuclear waste disposal require paleoclimate cycle predictions on the scale of a million years forward for the disposal site. This research focuses on developing a paleoclimate cycle prediction model using a Bayesian long short-term memory (LSTM) neural architecture operating on accumulated borehole temperature history data. Bayesian models based on Monte Carlo weight methods have previously been used for paleoclimate cycle prediction, but due to limitations in coupling them with other prediction networks, such models could not in the past accommodate prediction cycles over 1,000 years. LSTM provides a frontier for coupling the developed models with other prediction networks with ease. The paleoclimate cycle model developed by this process will be trained on existing borehole data and then coupled to surface temperature history prediction networks, which provide the endpoints for backpropagation of the LSTM network and optimize the prediction cycle for larger prediction time scales. The trained LSTM will be tested on past data for validation and then propagated forward to predict temperatures at borehole locations. This research will benefit studies of nuclear waste management, anthropological cycle prediction, and geophysical features.
Keywords: Bayesian long-short term memory neural network, borehole temperature, ground surface temperature history, paleoclimate cycle
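A minimal, non-Bayesian sketch of the LSTM component (PyTorch, one-step-ahead prediction on a synthetic slow-cycle signal standing in for a borehole temperature history; the Bayesian weighting and network coupling described above are omitted):

import torch
import torch.nn as nn

torch.manual_seed(0)

class TempLSTM(nn.Module):
    """Predict the next temperature value from a window of past values."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])     # one-step-ahead prediction

# Synthetic slow cycle plus noise, standing in for a temperature history
t = torch.arange(200, dtype=torch.float32)
series = torch.sin(2 * torch.pi * t / 50) + 0.1 * torch.randn(200)

window = 20
X = torch.stack([series[i:i + window] for i in range(200 - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)        # target: the value after each window

model = TempLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for epoch in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(X), y)
    loss.backward()
    opt.step()
print("final training MSE:", float(loss))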
Procedia PDF Downloads 128
271 Modeling of Bipolar Charge Transport through Nanocomposite Films for Energy Storage
Authors: Meng H. Lean, Wei-Ping L. Chu
Abstract:
The effects of ferroelectric nanofiller size, shape, loading, and polarization on bipolar charge injection, transport, and recombination through amorphous and semicrystalline polymers are studied. A 3D particle-in-cell model extends the classical electrical double layer representation to treat ferroelectric nanoparticles. Metal-polymer charge injection assumes Schottky emission and Fowler-Nordheim tunneling, migration through a field-dependent Poole-Frenkel mobility, and recombination with Monte Carlo selection based on collision probability. A boundary integral equation method is used to solve the Poisson equation, coupled with a second-order predictor-corrector scheme for robust time integration of the equations of motion; the stability criterion of the explicit algorithm conforms to the Courant-Friedrichs-Lewy limit. Trajectories of charges that make it through the film are curvilinear paths that meander through the interspaces. Results indicate that charge transport behavior depends on nanoparticle polarization, with anti-parallel orientation showing the highest leakage conduction and the lowest level of charge trapping in the interaction zone. The simulation prediction of a size range of 80 to 100 nm to minimize attachment and maximize conduction is validated by theory. Attached charge fractions go from 2.2% to 97% as nanofiller size is decreased from 150 nm to 60 nm. The computed conductivity of 0.4 × 10⁻¹⁴ S/cm is in agreement with published data for plastics. Charge attachment increases with spheroids due to the increase in surface area, especially for oblate spheroids, showing the influence of larger cross-sections. Charge attachment to nanofillers and nanocrystallites increases with vol.% loading or degree of crystallinity, and saturates at about 40 vol.%.
Keywords: nanocomposites, nanofillers, electrical double layer, bipolar charge transport
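The two field-dependent ingredients named above in formula form - Schottky barrier lowering and Poole-Frenkel mobility enhancement - evaluated at one illustrative field (the physical constants are standard; the permittivity and field are assumed values):

import numpy as np

Q = 1.602e-19; KB = 1.381e-23; EPS0 = 8.854e-12
T, EPS_R = 300.0, 2.6        # assumed temperature and relative permittivity
E = 1.0e8                    # applied field, V/m (illustrative)

# Schottky barrier lowering and the resulting emission enhancement factor
d_phi = np.sqrt(Q ** 3 * E / (4 * np.pi * EPS0 * EPS_R))
schottky_boost = np.exp(d_phi / (KB * T))

# Poole-Frenkel field enhancement of mobility relative to zero field
beta_pf = np.sqrt(Q ** 3 / (np.pi * EPS0 * EPS_R))
mu_ratio = np.exp(beta_pf * np.sqrt(E) / (KB * T))

print(f"Schottky boost: {schottky_boost:.2e}, PF mobility boost: {mu_ratio:.2e}")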
Procedia PDF Downloads 354
270 Spatio-Temporal Analysis of Rabies Incidence in Herbivores of Economic Interest in Brazil
Authors: Francisco Miroslav Ulloa-Stanojlovic, Gina Polo, Ricardo Augusto Dias
Abstract:
In Brazil, given the high incidence of rabies in herbivores of economic interest (HEI) transmitted by the common vampire bat Desmodus rotundus, the occurrence of human rabies cases, and the huge economic losses in the world's largest cattle industry, it is important to assist the National Program for Control of Rabies in Herbivores in Brazil, which aims to reduce the incidence of rabies in HEI populations, mainly through epidemiological surveillance, vaccination of herbivores, and control of vampire-bat roosts. Material and Methods: A retrospective spatio-temporal Kulldorff's spatial scan statistic, based on a Poisson model and Monte Carlo simulation, and Anselin's Local Moran's I statistic were used to uncover spatial clustering of HEI rabies from 2000 to 2014. Results: Three important clusters with significant year-to-year variation were identified (Figure 1). In 2000, a cluster was identified in the North region, specifically in the State of Tocantins. Between 2000 and 2004, a cluster centered in the Midwest and Southeast regions, including the States of Goiás, Minas Gerais, Rio de Janeiro, Espírito Santo, and São Paulo, was prominent. Finally, between 2000 and 2005, an important cluster was found spanning the North, Midwest, and South regions. Conclusions: HEI rabies is endemic in the country. In addition, there appear to be significant differences among the States' surveillance services, which may be hampering control of the disease; other factors, such as the lack of information on vampire-bat roost identification and limited human resources for field monitoring, could also be contributing to the persistence of the problem. A review of the control program by the authorities is necessary.
Keywords: Brazil, Desmodus rotundus, herbivores, rabies
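A sketch of Anselin's Local Moran's I on a toy lattice of municipal case rates, with row-standardized rook-contiguity weights and an implanted hot spot (the rates are simulated, not the Brazilian surveillance data):

import numpy as np

rng = np.random.default_rng(13)

rates = rng.poisson(3.0, size=(5, 5)).astype(float)
rates[1:3, 1:3] += 8.0                      # implant a hot spot
z = (rates - rates.mean()) / rates.std()

local_i = np.zeros_like(z)
nr, nc = z.shape
for i in range(nr):
    for j in range(nc):
        # rook-contiguity neighbours, averaged (row-standardized weights)
        nbrs = [z[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                if 0 <= a < nr and 0 <= b < nc]
        local_i[i, j] = z[i, j] * np.mean(nbrs)

print("high-high cluster cells (I > 1):")
print(np.argwhere(local_i > 1.0))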
Procedia PDF Downloads 417