Search results for: Niño Carlo I. Casim
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 496

196 Nonlinear Analysis of Shear Deformable Deep Beam Resting on Nonlinear Two-Parameter Random Soil

Authors: M. Seguini, D. Nedjar

Abstract:

In this paper, the nonlinear analysis of a Timoshenko beam undergoing moderately large deflections and resting on a nonlinear two-parameter random foundation is presented, taking into account the effects of shear deformation, variation of the beam's properties, and the spatial variability of soil characteristics. The probabilistic finite element analysis has been performed using Timoshenko beam theory with the von Kármán nonlinear strain-displacement relationships, combined with Vanmarcke's theory and Monte Carlo simulations, and implemented in a MATLAB program. Numerical examples of the newly developed model are conducted to confirm its efficiency and accuracy and the importance of accounting for the second foundation parameter (Winkler-Pasternak). The results obtained from the developed model are presented and compared with those available in the literature to examine how the consideration of shear and of the spatial variability of soil characteristics affects the response of the system.
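
A minimal sketch of the probabilistic loop described above, under stated assumptions: the subgrade modulus is modelled as a lognormal random field with an exponential (Vanmarcke-style) autocorrelation, sampled by Cholesky factorisation, and a placeholder function stands in for the nonlinear Timoshenko finite element solve. All numerical values are illustrative, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_sims = 50, 1000
x = np.linspace(0.0, 10.0, n_nodes)          # beam axis (m)
mean_k, cov_k, corr_len = 20e6, 0.3, 2.0     # assumed subgrade modulus stats

# Correlation matrix of the underlying Gaussian field (exponential model)
rho = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(rho + 1e-10 * np.eye(n_nodes))

# Lognormal transformation parameters
sig_ln = np.sqrt(np.log(1.0 + cov_k**2))
mu_ln = np.log(mean_k) - 0.5 * sig_ln**2

def solve_beam(k_field):
    """Placeholder for the nonlinear Timoshenko FE solve (hypothetical)."""
    return 1.0 / k_field.mean()              # stand-in midspan response

deflections = []
for _ in range(n_sims):
    g = L @ rng.standard_normal(n_nodes)     # correlated Gaussian sample
    k_field = np.exp(mu_ln + sig_ln * g)     # lognormal subgrade modulus field
    deflections.append(solve_beam(k_field))

print(np.mean(deflections), np.std(deflections))
```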

Keywords: nonlinear analysis, soil-structure interaction, large deflection, Timoshenko beam, Euler-Bernoulli beam, Winkler foundation, Pasternak foundation, spatial variability

Procedia PDF Downloads 323
195 Differential Item Functioning in the Vocabulary Test of Grade 7 Students in Public and Private Schools

Authors: Dave Kenneth Tayao Cayado, Carlo P. Magno

Abstract:

The most common sources of bias detected are gender and socioeconomic status. The present study investigated Differential Item Functioning (DIF), or item bias, between public and private school students in a vocabulary test, expanding the DIF literature by using school type as a source of bias. There were 200 participants in this study: 100 came from a public secondary school and 100 from a private secondary school. The vocabulary skills of the students were measured using a standardized vocabulary test for grade 7 students. Using DIF, specifically the Rasch-Welch approach, it was found that 12 of the 24 items were biased toward a specific group. Items on the use of slang, idiomatic expressions, personification, collocations, and partitive relations were biased toward private school students, while items on the use of slang and homonymous words were biased toward public school students. The analysis debunked the trend that private school students outperform public school students in academic achievement, revealing that there are some competencies with which private school students have difficulty, and vice versa.
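
The Rasch-Welch approach amounts to calibrating each item's difficulty separately in the two groups and comparing the estimates with a Welch t-test. A minimal sketch, assuming the item difficulties and standard errors have already been estimated; the values below are illustrative, not the study's data.

```python
from math import sqrt
from scipy import stats

def rasch_welch_dif(b1, se1, n1, b2, se2, n2):
    """Welch t-test on separately calibrated Rasch item difficulties."""
    t = (b1 - b2) / sqrt(se1**2 + se2**2)
    # Welch-Satterthwaite degrees of freedom
    df = (se1**2 + se2**2) ** 2 / (se1**4 / (n1 - 1) + se2**4 / (n2 - 1))
    p = 2.0 * stats.t.sf(abs(t), df)
    return t, df, p

# Illustrative difficulties (logits) for one item in the two school groups
t, df, p = rasch_welch_dif(b1=0.85, se1=0.21, n1=100, b2=0.12, se2=0.20, n2=100)
print(f"DIF contrast t = {t:.2f}, df = {df:.0f}, p = {p:.4f}")
```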

Keywords: differential item functioning, item bias, public school students, private school students, vocabulary

Procedia PDF Downloads 191
194 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference

Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira

Abstract:

Operational risk losses are heavy-tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixture distribution (the lognormal for the body of losses and the generalized Pareto distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2 using the procedures PROC SEVERITY and PROC NLMIXED, and this paper focuses on describing this implementation.
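
A minimal sketch of the body-tail severity split described in (i), using scipy rather than SAS and synthetic losses; the 95% threshold is an illustrative choice, not the paper's.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=10.0, sigma=1.2, size=5000)   # synthetic losses

# Peaks-over-threshold split: lognormal body, generalized Pareto tail
u = np.quantile(losses, 0.95)                             # illustrative threshold
body, tail = losses[losses <= u], losses[losses > u]

shape_ln, loc_ln, scale_ln = stats.lognorm.fit(body, floc=0)
xi, loc_gpd, beta = stats.genpareto.fit(tail - u, floc=0) # fit the excesses

def severity_cdf(x):
    """Spliced CDF: rescaled lognormal below u, GPD-based tail above u."""
    p_u = 0.95
    below = p_u * stats.lognorm.cdf(x, shape_ln, loc_ln, scale_ln) / \
            stats.lognorm.cdf(u, shape_ln, loc_ln, scale_ln)
    above = p_u + (1 - p_u) * stats.genpareto.cdf(x - u, xi, loc_gpd, beta)
    return np.where(x <= u, below, above)

print(severity_cdf(np.array([u / 2, u, 2 * u])))
```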

Keywords: operational risk, loss distribution approach, extreme value theory, copulas

Procedia PDF Downloads 603
193 Self-Regulation in Composition Writing: The Case of Variation of Self-Regulation Dispositions in Opinion Essay and Technical Writing

Authors: Dave Kenneth Tayao Cayado, Carlo P. Magno, Venice Cristine Dangaran

Abstract:

The present study determines whether there are differences in the self-regulation dispositions that learners utilize when writing different types of composition. Seven self-regulation factors were used to develop a scale for this study: memory strategy, goal setting, self-evaluation, seeking assistance, learning responsibility, environmental structuring, and organizing. The scale was made specific to writing a composition. The researcher-made scale was administered to 150 participants, all from a university in the Philippines, who were asked to write two compositions: an opinion essay and a research introduction/review of related literature. The zero-order correlations revealed that all the self-regulation factors are correlated with one another; however, only seeking assistance and self-evaluation are correlated with opinion essay scores, and technical writing scores are not correlated with any of the self-regulation factors. When path analysis was used, however, it was shown that seeking assistance can predict opinion essay scores, whereas memory strategy, self-evaluation, and organizing can predict technical writing scores.

Keywords: opinion essay, self-regulation, technical writing, writing skills

Procedia PDF Downloads 183
192 Statically Fused Unbiased Converted Measurements Kalman Filter

Authors: Zhengkun Guo, Yanbin Li, Wenqing Wang, Bo Zou

Abstract:

The statically fused converted position and Doppler measurements Kalman filter (SF-CMKF) with additive debiased measurement conversion has previously been presented to combine the resulting states of the converted position measurements Kalman filter (CPMKF) and the converted Doppler measurement Kalman filter (CDMKF) to yield the final state estimates under the minimum mean squared error (MMSE) criterion. However, the exact compensation for the bias in the polar-to-Cartesian and spherical-to-Cartesian conversions is multiplicative and depends on the statistics of the cosine of the angle measurement errors. As a result, the consistency and performance of the SF-CMKF may be suboptimal in large-angle-error situations. In this paper, the multiplicative unbiased position and Doppler measurement conversions for 2D (polar-to-Cartesian) tracking are derived, and the SF-CMKF is improved to use those conversions. Monte Carlo simulations are presented to demonstrate the statistical consistency of the multiplicative unbiased conversion and the superior performance of the modified SF-CMKF (SF-UCMKF).
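
For the position part, the multiplicative bias comes from E[cos v_theta] = exp(-sigma_theta^2/2) for zero-mean Gaussian angle noise, so dividing the naive converted position by this factor removes the bias. A Monte Carlo check in the spirit of the consistency simulations mentioned above (my own illustration, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(2)
r_true, th_true = 1000.0, np.deg2rad(30.0)   # true range (m) and bearing
sig_r, sig_th = 10.0, np.deg2rad(3.0)        # measurement noise std devs
n = 200_000

rm = r_true + sig_r * rng.standard_normal(n)
thm = th_true + sig_th * rng.standard_normal(n)

# Naive polar-to-Cartesian conversion is biased by lam = E[cos(angle noise)]
lam = np.exp(-sig_th**2 / 2.0)
x_naive, y_naive = rm * np.cos(thm), rm * np.sin(thm)
x_unb, y_unb = x_naive / lam, y_naive / lam  # multiplicative unbiased conversion

x_true, y_true = r_true * np.cos(th_true), r_true * np.sin(th_true)
print("naive bias:   ", x_naive.mean() - x_true, y_naive.mean() - y_true)
print("unbiased bias:", x_unb.mean() - x_true, y_unb.mean() - y_true)
```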

Keywords: measurement conversion, Doppler, Kalman filter, estimation, tracking

Procedia PDF Downloads 208
191 Fermentation of Pretreated Herbaceous Cellulosic Wastes to Ethanol by Anaerobic Cellulolytic and Saccharolytic Thermophilic Clostridia

Authors: Lali Kutateladze, Tamar Urushadze, Tamar Dudauri, Besarion Metreveli, Nino Zakariashvili, Izolda Khokhashvili, Maya Jobava

Abstract:

Lignocellulosic waste streams from agriculture and the paper and wood industries are renewable, plentiful and low-cost raw materials that can be used for large-scale production of liquid and gaseous biofuels. As opposed to the prevailing multi-stage biotechnological processes developed for bioconversion of cellulosic substrates to ethanol, in which high-cost cellulase preparations are used, Consolidated Bioprocessing (CBP) offers to accomplish cellulose and xylan hydrolysis followed by fermentation of both C6 and C5 sugars to ethanol in a single-stage process. A syntrophic microbial consortium comprising anaerobic, thermophilic, cellulolytic, and saccharolytic bacteria in the genus Clostridia, with improved ethanol productivity and high tolerance to fermentation end-products, has been proposed for achieving CBP. 65 new strains of anaerobic thermophilic cellulolytic and saccharolytic Clostridia were isolated from different wetlands and hot springs in Georgia. Using the new isolates, fermentation of mechanically pretreated wheat straw and corn stalks was performed under an oxygen-free nitrogen environment in thermophilic conditions (T = 55 °C) at pH 7.1. Process duration was 120 hours. Liquid and gaseous products of fermentation were analyzed on a daily basis using Perkin-Elmer gas chromatographs with flame ionization and thermal detectors. Residual cellulose, xylan, xylose, and glucose were determined using standard methods. The cellulolytic and saccharolytic bacterial strains degraded the mechanically pretreated herbaceous cellulosic wastes and fermented glucose and xylose to ethanol, acetic acid and gaseous products such as hydrogen and CO2. Specifically, the maximum yield of ethanol was reached at 96 h of fermentation and varied between 2.9-3.2 g/10 g of substrate. The content of acetic acid did not exceed 0.35 g/l. Other volatile fatty acids were detected in trace quantities.

Keywords: anaerobic bacteria, cellulosic wastes, Clostridia sp, ethanol

Procedia PDF Downloads 296
190 Polynomial Chaos Expansion Combined with Exponential Spline for Singularly Perturbed Boundary Value Problems with Random Parameter

Authors: W. K. Zahra, M. A. El-Beltagy, R. R. Elkhadrawy

Abstract:

Many practical problems in science and technology have developed over the past decades, for instance, the mathematical boundary layer theory or the approximation of solutions for different problems described by differential equations. When such problems involve large or small parameters, they become increasingly complex and therefore require the use of asymptotic methods. In this work, we consider singularly perturbed boundary value problems that contain very small parameters, and we treat these perturbation parameters as random variables. We propose a numerical method to solve this kind of problem based on an exponential spline, Shishkin mesh discretization, and polynomial chaos expansion. The polynomial chaos expansion is used to handle the randomness in the perturbation parameter, and Monte Carlo simulations (MCS) are used to validate the solution and the accuracy of the proposed method. Numerical results are provided to show the applicability and efficiency of the proposed method, which maintains a remarkably high accuracy and exhibits ε-uniform convergence of almost second order.
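
A piecewise-uniform Shishkin mesh concentrates half the mesh points in the boundary layer. A minimal sketch for a convection-diffusion problem with a single layer at x = 1, using the standard transition point tau = min(1/2, (sigma*eps/beta) ln N); the constants sigma and beta are assumed here for illustration.

```python
import numpy as np

def shishkin_mesh(n, eps, sigma=2.0, beta=1.0):
    """Piecewise-uniform Shishkin mesh on [0, 1] for a boundary layer at x = 1.

    n must be even; half the intervals resolve the layer region of width tau.
    sigma (mesh parameter) and beta (lower bound on the convection
    coefficient) are assumed constants for this illustration.
    """
    tau = min(0.5, sigma * eps / beta * np.log(n))
    coarse = np.linspace(0.0, 1.0 - tau, n // 2 + 1)
    fine = np.linspace(1.0 - tau, 1.0, n // 2 + 1)
    return np.concatenate([coarse, fine[1:]])

mesh = shishkin_mesh(n=16, eps=1e-3)
print(np.round(mesh, 5))   # note the clustering of points near x = 1
```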

Keywords: singular perturbation problem, polynomial chaos expansion, Shishkin mesh, two small parameters, exponential spline

Procedia PDF Downloads 160
189 Capability Prediction of Machining Processes Based on Uncertainty Analysis

Authors: Hamed Afrasiab, Saeed Khodaygan

Abstract:

Prediction of machining process capability at the design stage plays a key role in achieving precision design and manufacturing of mechanical products. Inaccuracies in the machining process lead to errors in the position and orientation of machined features on the part and strongly affect the process capability and the final quality of the product. In this paper, an efficient systematic approach is given to investigate machining errors, predict the manufacturing errors of parts, and predict the capability of the corresponding machining processes. A mathematical formulation of fixture locator modeling is presented to establish the relationship between the part errors and their sources. Based on this method, the final machining errors of the part can be accurately estimated by relating them to the combined dimensional and geometric tolerances of the workpiece-fixture system. The method is developed for uncertainty analysis based on both worst-case and statistical approaches. Its application is illustrated through an example, and the computational results are compared with Monte Carlo simulation results.
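
The contrast between the worst-case and statistical approaches is easy to see on a toy example (not the paper's fixture model): a linear stack-up of three dimensions, comparing the worst-case bound with a Monte Carlo estimate in which each dimension is normal with tolerance = ±3 sigma.

```python
import numpy as np

nominals = np.array([40.0, 25.0, 10.0])     # mm, illustrative dimensions
tols = np.array([0.05, 0.03, 0.02])         # mm, symmetric tolerances
sensitivities = np.array([1.0, -1.0, 1.0])  # stack-up direction coefficients

# Worst case: tolerances add with absolute sensitivities
wc = np.sum(np.abs(sensitivities) * tols)

# Statistical: Monte Carlo with sigma = tol / 3 for each dimension
rng = np.random.default_rng(3)
samples = nominals + (tols / 3.0) * rng.standard_normal((100_000, 3))
closure = samples @ sensitivities
mc = 3.0 * closure.std()                    # +/- 3 sigma spread of the chain

print(f"worst-case +/- {wc:.3f} mm, Monte Carlo 3-sigma +/- {mc:.3f} mm")
```

As expected, the statistical bound is tighter because independent errors rarely all reach their extremes simultaneously.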

Keywords: process capability, machining error, dimensional and geometrical tolerances, uncertainty analysis

Procedia PDF Downloads 307
188 Theoretical Modelling of Molecular Mechanisms in Stimuli-Responsive Polymers

Authors: Catherine Vasnetsov, Victor Vasnetsov

Abstract:

Context: Thermo-responsive polymers are materials that undergo significant changes in their physical properties in response to temperature changes. These polymers have gained significant attention in research due to their potential applications in various industries and medicine. However, the molecular mechanisms underlying their behavior are not well understood, particularly in relation to cosolvency, which is crucial for practical applications. Research Aim: This study aimed to theoretically investigate the phenomenon of cosolvency in long-chain polymers using the Flory-Huggins statistical-mechanical framework. The main objective was to understand the interactions between the polymer, solvent, and cosolvent under different conditions. Methodology: The research employed a combination of Monte Carlo computer simulations and advanced machine-learning methods. The Flory-Huggins mean field theory was used as the basis for the simulations. Spinodal graphs and ternary plots were utilized to develop an initial computer model for predicting polymer behavior. Molecular dynamic simulations were conducted to mimic real-life polymer systems. Machine learning techniques were incorporated to enhance the accuracy and reliability of the simulations. Findings: The simulations revealed that the addition of very low or very high volumes of cosolvent molecules resulted in smaller radii of gyration for the polymer, indicating poor miscibility. However, intermediate volume fractions of cosolvent led to higher radii of gyration, suggesting improved miscibility. These findings provide a possible microscopic explanation for the cosolvency phenomenon in polymer systems. Theoretical Importance: This research contributes to a better understanding of the behavior of thermo-responsive polymers and the role of cosolvency. The findings provide insights into the molecular mechanisms underlying cosolvency and offer specific predictions for future experimental investigations. The study also presents a more rigorous analysis of the Flory-Huggins free energy theory in the context of polymer systems. Data Collection and Analysis Procedures: The data for this study was collected through Monte Carlo computer simulations and molecular dynamic simulations. The interactions between the polymer, solvent, and cosolvent were analyzed using the Flory-Huggins mean field theory. Machine learning techniques were employed to enhance the accuracy of the simulations. The collected data was then analyzed to determine the impact of cosolvent volume fractions on the radii of gyration of the polymer. Question Addressed: The research addressed the question of how cosolvency affects the behavior of long-chain polymers. Specifically, the study aimed to investigate the interactions between the polymer, solvent, and cosolvent under different volume fractions and understand the resulting changes in the radii of gyration. Conclusion: In conclusion, this study utilized theoretical modeling and computer simulations to investigate the phenomenon of cosolvency in long-chain polymers. The findings suggest that moderate cosolvent volume fractions can lead to improved miscibility, as indicated by higher radii of gyration. These insights contribute to a better understanding of the molecular mechanisms underlying cosolvency in polymer systems and provide predictions for future experimental studies. The research also enhances the theoretical analysis of the Flory-Huggins free energy theory.
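
The Flory-Huggins mean-field free energy underlying the simulations can be written down directly. A sketch evaluating the ternary free energy of mixing per lattice site for polymer (degree of polymerization N), solvent, and cosolvent; the chi interaction parameters and compositions are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

def fh_free_energy(phi_p, phi_c, N=100,
                   chi_ps=0.6, chi_pc=0.6, chi_sc=0.3):
    """Flory-Huggins free energy of mixing per lattice site (units of kT)
    for a polymer/solvent/cosolvent ternary system; chi values are
    illustrative assumptions, not fitted parameters."""
    phi_s = 1.0 - phi_p - phi_c
    entropy = (phi_p / N) * np.log(phi_p) + phi_s * np.log(phi_s) \
              + phi_c * np.log(phi_c)
    enthalpy = chi_ps * phi_p * phi_s + chi_pc * phi_p * phi_c \
               + chi_sc * phi_s * phi_c
    return entropy + enthalpy

# Scan cosolvent fraction at fixed polymer fraction: a lower free energy of
# mixing at intermediate phi_c indicates better miscibility there, consistent
# with the cosolvency trend reported above.
phi_c = np.linspace(0.01, 0.8, 80)
f = fh_free_energy(phi_p=0.19, phi_c=phi_c)
print("free-energy minimum near phi_c =", round(phi_c[np.argmin(f)], 3))
```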

Keywords: molecular modelling, Flory-Huggins, cosolvency, stimuli-responsive polymers

Procedia PDF Downloads 70
187 Simulation Study of Multiple-Thick Gas Electron Multiplier-Based Microdosimeters for Fast Neutron Measurements

Authors: Amir Moslehi, Gholamreza Raisali

Abstract:

Microdosimetric detectors based on multiple-thick gas electron multiplier (multiple-THGEM) configurations are used in various fields of radiation protection and dosimetry. In the present work, the microdosimetric response of these detectors to fast neutrons has been investigated by the Monte Carlo method. Three similar microdosimeters with A-150 and Rexolite as the wall materials were designed: the first based on a single THGEM, the second on a double THGEM, and the third on a triple THGEM. The sensitive volume of each of the three microdosimeters is a right cylinder of 5 mm height and diameter, filled with propane-based tissue-equivalent (TE) gas. The TE gas at 0.11 atm pressure and room temperature simulates 1 µm of tissue. Lineal energy distributions for several neutron energies from 10 keV to 14 MeV, including 241Am-Be neutrons, were calculated with the Geant4 simulation toolkit. The mean quality factor and dose-equivalent value for each neutron energy were also determined from these distributions. The data obtained from the three microdosimeters are in agreement; therefore, we conclude that the multiple-THGEM structures present similar microdosimetric responses to fast neutrons.

Keywords: fast neutrons, Geant4, multiple-thick gas electron multiplier, microdosimeter

Procedia PDF Downloads 350
186 Dynamic Correlations and Portfolio Optimization between Islamic and Conventional Equity Indexes: A Vine Copula-Based Approach

Authors: Imen Dhaou

Abstract:

This study examines the conditional Value at Risk of eight Dow Jones Islamic-conventional index pairs by applying a GJR-EVT-copula model and finds the optimal portfolio for each pair. Our methodology consists of modeling the data with a bivariate GJR-GARCH model, from which we extract the filtered residuals, and then applying the peaks-over-threshold (POT) model to fit the residual tails in order to model the marginal distributions. After that, we use pair-copulas to find the optimal portfolio risk dependence structure. Finally, with Monte Carlo simulations, we estimate the Value at Risk (VaR) and the conditional Value at Risk (CVaR). The empirical results show the VaR and CVaR values for an equally weighted portfolio of Dow Jones Islamic-conventional pairs. In sum, we found that optimal investment focuses on the Islamic-conventional US market index pair because of its high investment proportion, whereas all other index pairs have low investment proportions. These results have real implications for portfolio managers and policymakers concerning optimal asset allocation, portfolio risk management, and the diversification advantages of these markets.
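
Once portfolio returns have been simulated from the fitted copula model, VaR and CVaR at level alpha reduce to a quantile and a tail mean. A minimal sketch with placeholder simulated returns standing in for the GJR-GARCH-EVT-copula output:

```python
import numpy as np

def var_cvar(returns, alpha=0.99):
    """Simulated VaR and CVaR (expected shortfall) as positive loss figures
    at confidence level alpha."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()   # mean loss beyond the VaR
    return var, cvar

# Placeholder for returns simulated from the fitted copula model
rng = np.random.default_rng(4)
simulated_returns = 0.0002 + 0.01 * rng.standard_t(df=4, size=100_000)

var99, cvar99 = var_cvar(simulated_returns, alpha=0.99)
print(f"VaR 99%: {var99:.4f}  CVaR 99%: {cvar99:.4f}")
```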

Keywords: CVaR, Dow Jones Islamic index, GJR-GARCH-EVT-pair copula, portfolio optimization

Procedia PDF Downloads 256
185 The Role of Uncertainty in the Integration of Environmental Parameters in Energy System Modeling

Authors: Alexander de Tomás, Miquel Sierra, Stefan Pfenninger, Francesco Lombardi, Ines Campos, Cristina Madrid

Abstract:

Environmental parameters are key to the definition of sustainable energy systems, yet they are excluded from most energy system optimization models, even though decision-making may be misleading without considering them. Environmental analyses of the energy transition are a key part of industrial ecology but are often performed without any input from the users of the information. This work assesses the systemic impacts of energy transition pathways in Portugal. Using the Calliope energy modeling framework, more than 250 optimized energy system pathways are generated. A Delphi study helps to identify the criteria relevant to the stakeholders for the environmental assessment, which is performed with ENBIOS, a Python package that integrates life cycle assessment (LCA) with a metabolic analysis based on complex relations. Furthermore, this study focuses on how uncertainty propagates through the model chain. To this end, a soft link between the Calliope/ENBIOS cascade and Brightway's data capabilities is built to perform Monte Carlo simulations. These findings highlight the relevance of reporting energy transition results as a range of values under uncertainty analysis rather than as a single value.

Keywords: energy transition, energy modeling, uncertainty, sustainability

Procedia PDF Downloads 83
184 Bayes Estimation of Parameters of Binomial Type Rayleigh Class Software Reliability Growth Model using Non-informative Priors

Authors: Rajesh Singh, Kailash Kale

Abstract:

In this paper, a binomial-process-type occurrence of software failures is considered, and the failure intensity is characterized by a one-parameter Rayleigh-class software reliability growth model (SRGM). The proposed SRGM is a mathematical function of two parameters: the total number of failures, η0, and the scale parameter, η1. It is assumed that very little or no information is available about either parameter; considering non-informative priors for both, the Bayes estimators of η0 and η1 are obtained under a squared error loss function. The proposed Bayes estimators are compared with the corresponding maximum likelihood estimators on the basis of risk efficiencies obtained by the Monte Carlo simulation technique. It is concluded that both proposed Bayes estimators, of the total number of failures and of the scale parameter, perform well for a proper choice of execution time.
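
The Monte Carlo risk-efficiency comparison itself is straightforward to illustrate. The sketch below deliberately swaps in a simple binomial proportion (a stand-in, not the paper's Rayleigh-class SRGM) and compares the MLE with the uniform-prior Bayes estimator under squared error loss:

```python
import numpy as np

# Monte Carlo risk comparison under squared error loss for a binomial
# proportion p: the Bayes estimator under a uniform prior is (x + 1)/(n + 2),
# the MLE is x/n. Risk efficiency = risk(MLE) / risk(Bayes).
rng = np.random.default_rng(5)
p_true, n, reps = 0.1, 30, 100_000

x = rng.binomial(n, p_true, size=reps)
mle = x / n
bayes = (x + 1) / (n + 2)

risk_mle = np.mean((mle - p_true) ** 2)
risk_bayes = np.mean((bayes - p_true) ** 2)
print(f"risk efficiency (MLE risk / Bayes risk): {risk_mle / risk_bayes:.3f}")
```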

Keywords: binomial process, non-informative prior, maximum likelihood estimator (MLE), Rayleigh class, software reliability growth model (SRGM)

Procedia PDF Downloads 389
183 Non-Invasive Imaging of Tissue Using Near Infrared Radiations

Authors: Ashwani Kumar Aggarwal

Abstract:

NIR light is non-ionizing and can pass easily through living tissues such as the breast without any harmful effects. Therefore, the use of NIR light for imaging biological tissue and quantifying its optical properties is a good choice over other, invasive methods. Optical tomography involves two steps. The first is the forward problem: finding the measurements of light transmitted through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second is the reconstruction problem. In X-ray tomography, there are standard reconstruction methods, such as filtered back projection and algebraic reconstruction, but these cannot be applied as such in optical tomography due to the highly scattering nature of biological tissue. A hybrid reconstruction algorithm has been implemented in this work, which takes into account the highly scattered paths taken by photons while back-projecting the forward data obtained during Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function; this blurred image has been enhanced using a digital filter that is optimal in the mean-square sense.
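
The classic filter that is optimal in the mean-square sense is the Wiener filter. A frequency-domain sketch, assuming the point spread function is known and using an assumed constant noise-to-signal ratio (the abstract does not specify the filter's exact form, so this is one standard reading):

```python
import numpy as np

def wiener_deblur(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener deconvolution (minimum mean-square error).

    nsr is an assumed constant noise-to-signal power ratio; in practice it
    would be estimated from the data.
    """
    H = np.fft.fft2(psf, s=blurred.shape)          # PSF transfer function
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)        # Wiener filter
    return np.real(np.fft.ifft2(W * G))

# Toy usage: blur a synthetic image with a small box PSF, then restore it
rng = np.random.default_rng(6)
image = rng.random((64, 64))
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(psf, s=image.shape) *
                               np.fft.fft2(image)))
restored = wiener_deblur(blurred, psf, nsr=1e-4)
print("mean abs restoration error:", np.abs(restored - image).mean())
```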

Keywords: least-squares optimization, filtering, tomography, laser interaction, light scattering

Procedia PDF Downloads 316
182 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia

Authors: Nguyen-Thanh Son

Abstract:

Floods are among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of the 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels, because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs that reduce possible impacts on the country's economy and people's livelihoods. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time-series vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The flood mapping results were verified against ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by close agreement between the mapped flood area and the ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributable to the El Niño weather phenomenon exacerbated by the impacts of climate change. Although several sources potentially lowered the mapping accuracy of flood-prone areas, including image cloud contamination, mixed-pixel issues, and the resolution bias between the mapping results and the ground reference data, our methods yielded satisfactory results for delineating the spatiotemporal evolution of floods. The results, in the form of quantitative information on spatiotemporal flood distributions, could be beneficial to policymakers in evaluating their management strategies for mitigating the negative effects of floods on agriculture and people's livelihoods in the country.
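
The accuracy-assessment step reduces to a confusion matrix between mapped and reference classes. A sketch computing overall accuracy and Cohen's kappa; the matrix values below are made up for illustration, not the study's counts.

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix whose rows
    are mapped classes and columns are reference classes."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    po = np.trace(cm) / total                       # observed agreement
    pe = (cm.sum(axis=0) @ cm.sum(axis=1)) / total**2  # chance agreement
    return po, (po - pe) / (1.0 - pe)

# Illustrative 2x2 matrix: flood / non-flood vs. ground reference
cm = [[420, 35],
      [48, 230]]
oa, kappa = accuracy_and_kappa(cm)
print(f"overall accuracy = {oa:.3f}, kappa = {kappa:.3f}")
```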

Keywords: MODIS, flood, mapping, Cambodia

Procedia PDF Downloads 127
181 On Estimating the Low Income Proportion with Several Auxiliary Variables

Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández

Abstract:

Poverty measurement is a very important topic in many studies in the social sciences. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of a population classified as poor. This indicator is generally unknown, and for this reason it is estimated using survey data obtained from official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain additional variables, known as auxiliary variables, related to the variable of interest; when this is the case, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables, considering real data sets obtained from the 2011 European Union Survey on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
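
A minimal sketch of the kind of comparison described, using one auxiliary variable on synthetic data: the naive estimator is the sample proportion below the poverty line, and a regression-type estimator corrects it using the known population mean of the auxiliary variable. This is my own illustration of the idea, not the paper's estimators or data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic finite population: income y and a correlated auxiliary variable x
N_pop = 50_000
x = rng.lognormal(10.0, 0.5, N_pop)
y = 0.8 * x * rng.lognormal(0.0, 0.2, N_pop)
z = np.quantile(y, 0.2)        # poverty line: 20% of the population is poor
p_true = np.mean(y < z)
X_bar = x.mean()               # population mean of x, assumed known

n, reps = 400, 5000
naive, reg = np.empty(reps), np.empty(reps)
for r in range(reps):
    s = rng.choice(N_pop, size=n, replace=False)   # simple random sample
    ind = (y[s] < z).astype(float)                 # poverty indicator
    naive[r] = ind.mean()
    c = np.cov(ind, x[s])                          # regression adjustment
    reg[r] = ind.mean() + (c[0, 1] / c[1, 1]) * (X_bar - x[s].mean())

for name, est in [("naive", naive), ("regression", reg)]:
    rmse = np.sqrt(np.mean((est - p_true) ** 2))
    print(f"{name}: bias = {est.mean() - p_true:+.5f}, rmse = {rmse:.5f}")
```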

Keywords: inclusion probability, poverty, poverty line, survey sampling

Procedia PDF Downloads 458
180 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model

Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis

Abstract:

In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test where the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value, and no assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is only known to belong to a certain subset of all possible failures; this case is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on the estimation of parameters is studied through a Monte Carlo simulation. Finally, a real example is analyzed to illustrate the application of the proposed methods.

Keywords: cause of failure, linear degradation path, reliability function, expectation-maximization algorithm, intensity, masked data

Procedia PDF Downloads 334
179 Multi-Objective Electric Vehicle Charge Coordination for Economic Network Management under Uncertainty

Authors: Ridoy Das, Myriam Neaimeh, Yue Wang, Ghanim Putrus

Abstract:

Electric vehicles are a popular transportation medium renowned for their potential environmental benefits. However, large and uncontrolled charging volumes can impact distribution networks negatively. Smart charging is widely recognized as an efficient solution to achieve both improved renewable energy integration and grid relief. Nevertheless, different decision-makers may pursue diverse and conflicting objectives. In this context, this paper proposes a multi-objective optimization framework to control electric vehicle charging to achieve both energy cost reduction and peak shaving. A weighted-sum method is adopted for its intuitiveness and efficiency. Monte Carlo simulations are implemented to investigate the impact of uncertain electric vehicle driving patterns and to provide decision-makers with a robust outcome in terms of prospective cost and network loading. The results demonstrate that there is a conflict between energy cost efficiency and peak shaving, and that the decision-makers need to reach a collaborative decision.
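
A minimal sketch of the weighted-sum trade-off for a single charging profile, as a continuous linear program (the paper's mixed-integer formulation would add binary charging decisions, and in practice the two objective terms would be normalized before weighting). Prices, base load, and limits are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

T = 24
price = 0.10 + 0.08 * np.sin(np.linspace(0, 2 * np.pi, T))       # $/kWh
base = 50.0 + 20.0 * np.sin(np.linspace(0, 2 * np.pi, T) - 1.0)  # kW base load
e_req, x_max, w = 120.0, 22.0, 0.5   # kWh to deliver, kW cap, cost weight

# Decision vector: [x_0 ... x_{T-1}, P], where P is the network peak (kW)
c = np.concatenate([w * price, [1.0 - w]])            # 1 h slots: kWh == kW
A_ub = np.hstack([np.eye(T), -np.ones((T, 1))])       # base + x - P <= 0
b_ub = -base
A_eq = np.concatenate([np.ones(T), [0.0]])[None, :]   # total energy delivered
b_eq = [e_req]
bounds = [(0.0, x_max)] * T + [(0.0, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, peak = res.x[:T], res.x[T]
print(f"charging cost = {price @ x:.2f} $, network peak = {peak:.1f} kW")
```

Sweeping the weight w from 0 to 1 traces out the cost-versus-peak Pareto front that the decision-makers must negotiate over.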

Keywords: electric vehicles, multi-objective optimization, uncertainty, mixed integer linear programming

Procedia PDF Downloads 179
178 Using Simulation Modeling Approach to Predict USMLE Steps 1 and 2 Performances

Authors: Chau-Kuang Chen, John Hughes, Jr., A. Dexter Samuels

Abstract:

Prediction models for United States Medical Licensing Examination (USMLE) Steps 1 and 2 performance were constructed by a Monte Carlo simulation modeling approach via linear regression. The purpose of this study was to build robust simulation models to accurately identify the most important predictors and yield valid range estimates of the Step 1 and Step 2 scores. The application of the simulation modeling approach was deemed an effective way of predicting student performance on licensure examinations. Sensitivity analysis (also known as what-if analysis) in the simulation models was used to predict the magnitudes of the Step 1 and Step 2 scores affected by changes in the National Board of Medical Examiners (NBME) Basic Science Subject Board scores. In addition, the study results indicated that the Medical College Admission Test (MCAT) Verbal Reasoning score and the Step 1 score were significant predictors of Step 2 performance. Hence, institutions could screen qualified student applicants for interviews and document the effectiveness of their basic science education programs based on the simulation results.

Keywords: prediction model, sensitivity analysis, simulation method, USMLE

Procedia PDF Downloads 340
177 Citrullinated Myelin Basic Protein Mediated Inflammation in Astrocytes

Authors: Lali Shanshiashvili, Marika Chikviladze, Nino Mamulashvili, Maia Sepashvili, Nana Narmania, David Mikeladze

Abstract:

Purpose: During demyelinating inflammatory diseases and after damage to the myelin sheath, myelin-derived proteins, including myelin basic protein (MBP), are secreted into the extracellular space. MBP shows extensive post-translational modifications, including the deimination of arginine residues. Deiminated MBP is structurally less ordered, more susceptible to proteolytic attack, and more immunogenic than the unmodified protein. It is hypothesized that MBP could change the inflammatory response in astrocytes. Methods: MBP was isolated and purified from bovine brain white matter. Primary astrocyte cultures were prepared from whole brains of 2-day-old Wistar rats. A Glutamate Assay Kit was used to evaluate glutamate uptake/release in astrocytes following treatment of the cells with MBP charge isomers. The expression of EAAT-2 (excitatory amino acid transporter 2), peroxisome proliferator-activated receptor gamma (PPAR-γ), inhibitor of nuclear factor kappa B (IκB), and high mobility group protein B1 (HMGB1) in astrocytes was assayed by Western blot analysis. Results: This study investigated the action of the deiminated isomer (C8) on cultured primary astrocytes and compared its effects with those of the unmodified C1 isomer. The study found that the C8 and C1 isomers act differently on the uptake and release of glutamate in astrocytes: unmodified C1 increases the uptake of glutamate and does not change the release, whereas C8 decreases the release of glutamate but does not alter the uptake. Nevertheless, both isomers increased the expression of PPAR-γ and EAAT-2 with the same intensity. However, immunostaining and Western blots of cell lysates showed decreased IκB and increased HMGB1 expression after treatment of astrocytes with C8. Moreover, in the presence of C8, astrocytes release more nitric oxide than with the unmodified C1 isomer. Conclusion: These data suggest that the deiminated isomer of MBP evokes an inflammatory response and enhances the ability of astrocytes to release proinflammatory mediators through activation of NF-κB after the breakdown of the myelin sheath. Acknowledgment: This research was supported by SRNSF Georgia grant RF17_534.

Keywords: myelin basic protein, glutamate, deimination, astrocytes, inflammation

Procedia PDF Downloads 205
176 Coupled Mechanical-Reliability and Vulnerability Analysis of a Reinforced Concrete Structure: Case Study

Authors: Kernou Nassim

Abstract:

The current study presents a coupled mechanical-reliability and vulnerability approach that focuses on evaluating the seismic performance of reinforced concrete structures in order to determine the probability of failure. Here, the performance function reflecting the nonlinear behavior of the structure is modeled by a response surface that establishes an analytical relationship between the random variables (strength of concrete and yield strength of steel) and the mechanical responses of the structure (inter-floor displacement) obtained from the pushover results of finite element simulations. The pushover analysis is executed with the SAP2000 software. The results acquired show that properly designed frames perform well under seismic loads. This is a comparative study of the behavior of an existing structure before and after reinforcement using the pushover method; coupling the mechanical model to the reliability analysis indirectly through a response surface avoids prohibitive calculation times. Finally, the results of the proposed approach are compared with Monte Carlo simulation, and the comparison shows that the structure is more reliable after the introduction of new shear walls.
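
A minimal sketch of the response-surface coupling: a quadratic surface is fitted to a handful of expensive pushover runs, then Monte Carlo sampling is done on the cheap surrogate instead of the finite element model. The design points and safety margins below are synthetic placeholders, not the case study's results.

```python
import numpy as np

rng = np.random.default_rng(8)

# Stand-in for a handful of pushover FE runs: design points (fc, fy) and the
# resulting safety margin g (g < 0 means the drift limit is exceeded).
fc = np.array([25.0, 30.0, 35.0, 25.0, 35.0, 30.0])      # MPa
fy = np.array([400., 400., 400., 500., 500., 450.])      # MPa
g_fe = np.array([-0.12, 0.05, 0.18, 0.10, 0.35, 0.15])   # synthetic margins

# Quadratic response surface g(fc, fy) fitted by least squares
X = np.column_stack([np.ones_like(fc), fc, fy, fc**2, fy**2, fc * fy])
coef, *_ = np.linalg.lstsq(X, g_fe, rcond=None)

def g_rs(fc_s, fy_s):
    Xs = np.column_stack([np.ones_like(fc_s), fc_s, fy_s,
                          fc_s**2, fy_s**2, fc_s * fy_s])
    return Xs @ coef

# Monte Carlo on the cheap surrogate instead of the FE model
n = 1_000_000
fc_s = rng.normal(30.0, 3.0, n)     # concrete strength variability
fy_s = rng.normal(450.0, 25.0, n)   # steel yield strength variability
pf = np.mean(g_rs(fc_s, fy_s) < 0.0)
print(f"estimated probability of failure: {pf:.4f}")
```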

Keywords: finite element method, surface response, reliability, reliability mechanical coupling, vulnerability

Procedia PDF Downloads 118
175 On the Cluster of the Families of Hybrid Polynomial Kernels in Kernel Density Estimation

Authors: Benson Ade Eniola Afere

Abstract:

Over the years, kernel density estimation has been extensively studied within the context of nonparametric density estimation. The fundamental components of kernel density estimation are the kernel function and the bandwidth. While the mathematical exploration of the kernel component has been relatively limited, its selection and development remain crucial. The Mean Integrated Squared Error (MISE), serving as a measure of discrepancy, provides a robust framework for assessing the effectiveness of any kernel function, and a kernel function with a lower MISE is generally considered to perform better than one with a higher MISE. Hence, the primary aim of this article is to create kernels that exhibit significantly reduced MISE when compared to existing classical kernels. Consequently, this article introduces a cluster of hybrid polynomial kernel families. The construction of these proposed kernel functions is carried out heuristically by combining two kernels from the classical polynomial kernel family using probability axioms. We delve into the analysis of error propagation within these kernels. To assess their performance, simulation experiments and real-life datasets are employed. The obtained results demonstrate that the proposed hybrid kernels surpass their classical kernel counterparts in terms of performance.
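
One construction consistent with the probability axioms is a convex combination of two classical polynomial kernels, which stays nonnegative and integrates to one. A sketch combining the Epanechnikov and biweight kernels inside a plain KDE; the mixing weight is an assumption for illustration, not the paper's optimized hybrid.

```python
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def biweight(u):
    return np.where(np.abs(u) <= 1, (15 / 16) * (1 - u**2) ** 2, 0.0)

def hybrid(u, lam=0.5):
    """Convex combination of two classical polynomial kernels; since both
    integrate to one and are nonnegative, so does the hybrid."""
    return lam * epanechnikov(u) + (1 - lam) * biweight(u)

def kde(x_grid, data, h, kernel=hybrid):
    u = (x_grid[:, None] - data[None, :]) / h
    return kernel(u).mean(axis=1) / h

rng = np.random.default_rng(9)
data = rng.normal(0.0, 1.0, 500)
grid = np.linspace(-4, 4, 9)
print(np.round(kde(grid, data, h=0.4), 4))
```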

Keywords: classical polynomial kernels, cluster of families, global error, hybrid kernels, kernel density estimation, Monte Carlo simulation

Procedia PDF Downloads 94
174 Reliability Based Performance Evaluation of Stone Column Improved Soft Ground

Authors: A. GuhaRay, C. V. S. P. Kiranmayi, S. Rudraraju

Abstract:

The present study considers the effect of variation of different geotechnical random variables in the design of stone column-foundation systems for assessing the bearing capacity and consolidation settlement of highly compressible soil. The soil and stone column properties, and the spacing, diameter and arrangement of the stone columns, are considered as the random variables. The probability of failure (Pf) is computed for a target degree of consolidation and a target safe load by Monte Carlo simulation (MCS). The study shows that variations in the coefficient of radial consolidation (cr) and the cohesion of the soil (cs) are the two most important factors influencing Pf. If the coefficient of variation (COV) of cr exceeds 20%, Pf exceeds 0.001, which is unsafe according to the guidelines of the US Army Corps of Engineers. The bearing capacity also exceeds its safe value for COV of cs > 30%. It is also observed that as the spacing between the stone columns increases, the probability of reaching a target degree of consolidation decreases. Accordingly, design guidelines considering both the consolidation and the bearing capacity of the improved ground are proposed for different spacings and diameters of stone columns and different geotechnical random variables.

Keywords: bearing capacity, consolidation, geotechnical random variables, probability of failure, stone columns

Procedia PDF Downloads 359
173 A Morphological Analysis of Swardspeak in the Philippines

Authors: Carlo Gadingan

Abstract:

Swardspeak, as a language, highlights the exclusive identity of Filipino gay men and the oppression they confront in society. This paper presents a morphological analysis of swardspeak in the Philippines. Specifically, it aims to find out the common morphological processes involved in the construction of codes, which may unmask the nature of swardspeak as a language. 30 purposively selected expert users of swardspeak from Luzon, Visayas, and Mindanao were asked to codify 30 natural words through the Facebook Messenger application. The results of the structural analysis affirm that swardspeak follows no specific rules, revealing complicated combinations of clipping/stylized clipping, borrowing, connotation through images, connotation through actions, connotation through sounds, affixation, repetition, substitution, and simple reversal. Moreover, it was also found that most of these word-formation processes occur in all word classes, which indicates that swardspeak is very unpredictable. Although different codes are used for the same words, there are still codes that are common to all homosexuals, namely Chaka (ugly), Crayola (cry), and Aida (referring to a person with AIDS). Hence, the prevailing word-formation processes explored may be termed observed time-specific patterns, because the codes documented in this study may become obsolete and be replaced with novel ones in a matter of weeks to months, given the creativity of homosexuals and the multiplicity of societal resources that can be used to make the codes more opaque and more confusing for non-homosexuals.

Keywords: codes, homosexuals, morphological processes, swardspeak

Procedia PDF Downloads 180
172 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement

Authors: Hadi Ardiny, Amir Mohammad Beigzadeh

Abstract:

Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of them equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY-plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding count of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove to be valuable in designing different scenarios and intelligent systems before initiating any experiments.
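
A plausible sketch of the correlation step (my own reading of the described algorithm, not the authors' code): for each tracked robot, correlate the measured count rate with the inverse-square of its distance to the detector and flag the robot with the strongest correlation. All trajectories and counts below are synthetic.

```python
import numpy as np

def find_source_robot(tracks, counts, detector_xy):
    """tracks: dict of robot name -> (T, 2) array of XY positions over time;
    counts: (T,) gamma-ray count rate from the NaI detector.
    Returns the robot whose inverse-square distance to the detector best
    correlates with the measured count rate."""
    scores = {}
    for name, xy in tracks.items():
        d2 = np.sum((xy - detector_xy) ** 2, axis=1) + 1e-9
        scores[name] = np.corrcoef(1.0 / d2, counts)[0, 1]
    return max(scores, key=scores.get), scores

# Toy demo: robot B carries the source
rng = np.random.default_rng(10)
T = 300
tracks = {k: np.cumsum(rng.normal(0, 0.1, (T, 2)), axis=0) + o
          for k, o in [("A", [3, 0]), ("B", [0, 3]), ("C", [-3, -3])]}
det = np.array([0.0, 0.0])
d2_B = np.sum((tracks["B"] - det) ** 2, axis=1)
counts = rng.poisson(5.0 + 200.0 / d2_B)   # background + inverse-square term

best, scores = find_source_robot(tracks, counts, det)
print(best, {k: round(v, 2) for k, v in scores.items()})
```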

Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems

Procedia PDF Downloads 125
171 A Bayesian Multivariate Microeconometric Model for Estimation of Price Elasticity of Demand

Authors: Jefferson Hernandez, Juan Padilla

Abstract:

Estimation of the price elasticity of demand is a valuable tool for the task of price setting. Given its relevance, it is an active field of microeconomic and statistical research. Price elasticity in the oil and gas industry, in particular for fuels sold in gas stations, has proven to be a challenging topic given market and state restrictions and the underlying correlation structures between the types of fuels sold by the same gas station. This paper explores the Lotka-Volterra model for the problem of price elasticity estimation in the context of fuels; in addition, multivariate random effects are introduced with the purpose of dealing with errors, e.g., measurement or missing-data errors. In order to model the underlying correlation structures, the Inverse-Wishart, Hierarchical Half-t and LKJ distributions are studied. The Bayesian paradigm, with Markov chain Monte Carlo (MCMC) algorithms for model estimation, is adopted. Simulation studies covering a wide range of situations were performed in order to evaluate parameter recovery for the proposed models and algorithms; the results revealed that the proposed algorithms recovered all model parameters quite well. A real data set analysis was also performed to illustrate the proposed approach.
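
One of the correlation priors named above is easy to probe directly. A sketch drawing covariance matrices from an Inverse-Wishart prior and rescaling them to correlation matrices to see the implied prior on a pairwise correlation; the dimension and hyperparameters are assumptions for illustration.

```python
import numpy as np
from scipy import stats

dim, draws = 3, 2000                      # e.g., three fuel types per station
prior = stats.invwishart(df=dim + 2, scale=np.eye(dim))

corrs = np.empty((draws, dim, dim))
for i, cov in enumerate(prior.rvs(size=draws)):
    d = np.sqrt(np.diag(cov))
    corrs[i] = cov / np.outer(d, d)       # covariance -> correlation matrix

# Implied prior on the pairwise correlation between fuels 1 and 2
print("mean:", corrs[:, 0, 1].mean().round(3),
      " 95% interval:", np.quantile(corrs[:, 0, 1], [0.025, 0.975]).round(3))
```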

Keywords: price elasticity, volume, correlation structures, Bayesian models

Procedia PDF Downloads 165
170 An International Curriculum Development for Languages and Technology

Authors: Miguel Nino

Abstract:

When considering the challenges of a changing and demanding globalizing world, it is important to reflect on how university students will be prepared for the realities of internationalization, marketization and intercultural conversation. The present study proposes an interdisciplinary program designed to respond to the needs of the global community. The proposal bridges the humanities and the sciences through three different fields: languages, graphic design and computer science, specifically fundamentals of programming such as Python, JavaScript and software animation. The goal of the four-year program is therefore twofold: first, to prepare students for intercultural communication between English and other languages such as Spanish, Mandarin, French or German; second, to give students knowledge of practical software and relevant employable skills so they can collaborate on computer-assisted projects that will most probably require an essential programming background in interpreted or compiled languages. In order to be inclusive and constructivist, the cognitive linguistics approach is suggested for the three different fields, particularly for languages that rely on the traditional method of repetition. This methodology will help students develop their creativity and encourage them to become independent problem-solving individuals, as languages enhance their common ground of interaction for culture and technology. Participants in this course of study will be evaluated on their second language acquisition at the Intermediate-High level. For graphic design and computer science, students will apply their creative digital skills, as well as the critical thinking skills learned from the cognitive linguistics approach, to collaborate on a group project designed to find solutions to media web design problems or marketing experimentation for a company or the community. It is understood that it will be necessary to apply programming knowledge and skills to deliver the final product. In conclusion, the program equips students with linguistic knowledge and skills to be competent in intercultural communication, where English, the lingua franca, remains the medium for marketing and product delivery. In addition to their employability, students can expand their knowledge and skills in the digital humanities and computational linguistics, or increase their portfolios in advertising and marketing. These students will be the global human capital for the competitive globalizing community.

Keywords: curriculum, international, languages, technology

Procedia PDF Downloads 443
169 First Surveillance Results Bring No Evidence of SARS-CoV-2 Spillback in Bats of Central-Southern Italy

Authors: Hiba Dakroub, Danilo Russo, Luca Cistrone, Francesco Serra, Giovanna Fusco, Esterina De Carlo, Maria Grazia Amoroso

Abstract:

The question of the origin of SARS-CoV-2 and the cycle of transmission between humans and animals is still unanswered. One serious concern associated with the SARS-CoV-2 pandemic is that the virus might spill back from humans to wildlife, which would render some animal species reservoirs of the human virus. The aim of the present study is to monitor the potential risk of SARS-CoV-2 reverse infection from humans to bats by performing bat surveillance at different sites in Central-Southern Italy. We collected 240 dropping or saliva samples from 129 bats and tested them using primers specific to SARS-CoV-2 and general coronavirus primers. All samples, comprising 127 nasal swabs and 113 fecal droppings, tested negative for SARS-CoV-2, and these results were confirmed by testing the samples with droplet digital PCR. An end-point RT-PCR was also performed, and no sample showed specific bands. The absence of SARS-CoV-2 in the bats we surveyed is a first step towards a better understanding of the reverse transmission of this virus to bats. We hope our first contribution will encourage the establishment of systematic surveillance of wildlife, and specifically of bats, to help prevent reverse zoonotic episodes that would jeopardize human health as well as biodiversity conservation and management.

Keywords: coronaviruses, bats, zoonotic viruses, spillback, SARS-CoV-2

Procedia PDF Downloads 119
168 Investigation of Efficient Production of ¹³⁵La for the Auger Therapy Using Medical Cyclotron in Poland

Authors: N. Zandi, M. Sitarz, J. Jastrzebski, M. Vagheian, J. Choinski, A. Stolarz, A. Trzcinska

Abstract:

¹³⁵La, with a half-life of 19.5 h, can be considered a good candidate for Auger therapy; it decays almost 100% by electron capture to stable ¹³⁵Ba. In this study, all important possible reactions leading to ¹³⁵La production are investigated in detail, and the corresponding theoretical yield for each reaction, computed using the Monte Carlo method (MCNPX code), is presented. Among them, the best reaction in terms of cost-effectiveness and production yield, given the facilities available in Poland equipped with a medical cyclotron, has been selected: ¹³⁵La is produced using the 16.5 MeV proton beam of a General Electric PETtrace cyclotron through the ¹³⁵Ba(p,n)¹³⁵La reaction. Moreover, to facilitate a consistent comparison between the theoretical calculations and the experimental measurements, the beam current and the proton beam energy are measured experimentally, and the obtained proton energy is used as the entrance energy for the theoretical calculations. The production yield is finally measured and compared with the results obtained using the MCNPX code; the results show that the experimental measurements and the theoretical calculations are in good agreement.

Keywords: efficient ¹³⁵La production, proton cyclotron energy measurement, MCNPX code, theoretical and experimental production yield

Procedia PDF Downloads 142
167 Measurement and Analysis of Radiation Doses to Radiosensitive Organs from CT Examination of the Cervical Spine Using Radiochromic Films and Monte Carlo Simulation Based Software

Authors: Khaled Soliman, Abdullah Alrushoud, Abdulrahman Alkhalifah, Raed Albathi, Salman Altymiat

Abstract:

The radiation dose received by patients undergoing computed tomography (CT) examination of the cervical spine was evaluated using Gafchromic XR-QA2 films and the CT-Expo software (ver. 2.3), in order to document our clinical dose values and to compare our results with benchmarks reported in the current literature. Radiochromic films have recently been used as a practical dosimetry tool that provides dose profile information not available with the standard ionisation chamber routinely used in CT dosimetry. We developed an in-house program to use the films to calculate the entrance dose-length product (EDLP, in mGy.cm) and to relate the EDLP to various organ doses calculated using the CT-Expo software. We also calculated a conversion factor (in mSv/mGy.cm) relating the EDLP to the effective dose (ED) of the examination using CT-Expo. Variability among different types of CT scanners and dose modulation methods is reported for at least three major CT brands available at our medical institution. Our work describes the dosimetry method and reports the results; the method can be used as an in-vivo dosimetry method, but this work only reports results obtained from adult female anthropomorphic phantom studies.

Keywords: CT dosimetry, Gafchromic films, XR-QA2, CT-Expo software

Procedia PDF Downloads 471