Search results for: agent based simulation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 31412

30362 Optimized Real Ground Motion Scaling for Vulnerability Assessment of Building Considering the Spectral Uncertainty and Shape

Authors: Chen Bo, Wen Zengping

Abstract:

Building on the results of previous studies, we focus on real ground motion selection and scaling for structural performance-based seismic evaluation using nonlinear dynamic analysis. The input earthquake ground motions should be determined appropriately to make them compatible with the site-specific hazard level considered. Thus, an optimized selection and scaling method is established that uses not only the Monte Carlo simulation method to create a stochastic simulation spectrum accounting for the multivariate lognormal distribution of the target spectrum, but also a spectral shape parameter. Its application to structural fragility analysis is demonstrated through case studies. Compared to previous schemes that ignore the uncertainty of the target spectrum, the method shown here ensures that the selected records agree well with the median value, standard deviation, and spectral correlation of the target spectrum, and clearly reflects the uncertainty of the site-specific hazard level. Meanwhile, it helps improve computational efficiency and matching accuracy. Given the important influence of the target spectrum's uncertainty on structural seismic fragility analysis, this work can provide a reasonable and reliable basis for structural seismic evaluation under scenario earthquake environments.
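The Monte Carlo step described here can be sketched in a few lines: draw stochastic response spectra from a multivariate lognormal model defined by a median spectrum, per-period dispersions, and a period-to-period correlation. All numbers below (periods, median ordinates, dispersions, the correlation kernel) are invented for illustration, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

periods = np.array([0.2, 0.5, 1.0, 2.0])       # s, assumed spectral periods
median_sa = np.array([0.8, 0.6, 0.35, 0.15])   # g, assumed median target spectrum
sigma_ln = np.array([0.5, 0.55, 0.6, 0.65])    # assumed log-standard deviations

# Assumed correlation decaying with distance in log-period
lnT = np.log(periods)
corr = np.exp(-np.abs(lnT[:, None] - lnT[None, :]))
cov = corr * np.outer(sigma_ln, sigma_ln)

# Each row is one simulated spectrum, lognormal about the median
draws = rng.multivariate_normal(np.log(median_sa), cov, size=5000)
spectra = np.exp(draws)

print(np.exp(np.median(np.log(spectra), axis=0)))  # close to median_sa
print(np.log(spectra).std(axis=0))                 # close to sigma_ln
```

Record selection would then score candidate records against these simulated spectra rather than against the median alone, which is how the uncertainty of the target spectrum enters the selection.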

Keywords: ground motion selection, scaling method, seismic fragility analysis, spectral shape

Procedia PDF Downloads 281
30361 Design and Implementation of PD-NN Controller Optimized Neural Networks for a Quad-Rotor

Authors: Chiraz Ben Jabeur, Hassene Seddik

Abstract:

In this paper, a full approach to modeling and control of a four-rotor unmanned air vehicle (UAV), known as a quad-rotor aircraft, is presented. A PD controller and a PD controller optimized by neural networks (PD-NN) are developed and applied to control a quad-rotor. The goal of this work is to design a smart self-tuning PD controller based on neural networks, able to supervise the quad-rotor for optimized behavior while tracking the desired trajectory. Many challenges arise when the quad-rotor navigates in hostile environments presenting irregular disturbances in the form of wind, added to the model on each axis; the quad-rotor is thus subject to three-dimensional unknown static/varying wind disturbances. The quad-rotor has to perform tasks quickly while ensuring stability and accuracy, and must react rapidly when facing disturbances. This technique offers advantages over conventional control methods such as the plain PD controller. Simulation results are obtained in the Matlab/Simulink environment and are based on a comparative study between the PD and PD-NN controllers under wind disturbances, which are applied at several strengths to test the quad-rotor behavior. The simulation results are satisfactory and demonstrate the effectiveness of the proposed PD-NN approach: this controller yields smaller errors than the PD controller and has a better capability to reject disturbances. In addition, it has proven to be highly robust and efficient when facing turbulence in the form of wind disturbances.
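A minimal sketch of the control idea, not the authors' model: a discrete PD loop stabilizing a double-integrator stand-in for one quad-rotor axis, z'' = u - d, where d is a wind-like disturbance. The gains kp and kd are assumptions; a PD-NN scheme would retune them online instead of keeping them fixed.

```python
kp, kd = 4.0, 3.0            # assumed fixed PD gains
dt, z, v = 0.01, 0.0, 0.0    # time step, position, velocity
target, dist = 1.0, 0.2      # step reference and constant wind disturbance

for _ in range(3000):        # 30 simulated seconds
    err = target - z
    u = kp * err - kd * v    # PD law on position error and velocity
    a = u - dist             # disturbance enters as an acceleration offset
    v += a * dt
    z += v * dt

print(round(z, 3))           # settles near target - dist/kp = 0.95
```

The residual steady-state offset dist/kp is exactly the weakness the abstract attributes to the plain PD controller: a fixed gain cannot fully reject a persistent wind disturbance, which motivates the neural-network tuning.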

Keywords: hostile environment, PD and PD-NN controllers, quad-rotor control, robustness against disturbance

Procedia PDF Downloads 122
30360 Simulation of Heat Exchanger Behavior during LOCA Accident in THTL Test Loop

Authors: R. Mahmoodi, A. R. Zolfaghari

Abstract:

In nuclear power plants, loss of coolant from the primary system is the type of reduced heat-removal capacity that is given the most attention; such an accident is referred to as a Loss of Coolant Accident (LOCA). In the current study, the behavior of the shell-and-tube THTL heat exchanger during a LOCA is investigated with the ANSYS CFX simulation software, in both steady-state and transient modes of turbulent fluid flow, according to the experimental conditions. Numerical results obtained from the ANSYS CFX simulation show good agreement with the experimental data of the THTL heat exchanger. The results illustrate that in a large-break LOCA, as a short-term accident, the heat exchanger cannot respond quickly to temperature variations, but in the long term the shell-side temperature of the heat exchanger will increase.

Keywords: shell-and-tube heat exchanger, shell-side, CFD, flow and heat transfer, LOCA

Procedia PDF Downloads 431
30359 Basic One-Dimensional Modelica®-Model for Simulation of Gas-Phase Adsorber Dynamics

Authors: Adrian Rettig, Silvan Schneider, Reto Tamburini, Mirko Kleingries, Ulf Christian Muller

Abstract:

Industrial adsorption processes are characterized by a high level of complexity, mainly due to simultaneous heat and mass transfer. The design of such processes often does not take place systematically; instead, scale-up/down or number-up/down methods based on existing systems are used. This paper shows how Modelica® can be used to develop a transient model enabling a more systematic design of such ad- and desorption components and processes. The core of this model is a lumped-element submodel of a single adsorbent grain, in which the thermodynamic equilibria and the kinetics of the ad- and desorption processes are implemented and solved on the basis of mass, momentum, and energy balances. For validation of this submodel, a fixed-bed adsorber, whose characteristics are described in detail in the literature, was modeled and simulated. The simulation results are in good agreement with the experimental results from the literature. Therefore, the model development will be continued, and the extended model will be applied to further adsorber types such as rotor adsorbers and moving-bed adsorbers.
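The "linear driving force" named in the keywords is the kinetic core of such a lumped grain submodel: dq/dt = k (q_eq - q). The sketch below integrates it with explicit Euler and checks against the analytic solution; the parameter values are illustrative, not from the paper, and the full model would couple this equation with the heat and mass balances.

```python
import math

k = 0.05          # 1/s, assumed LDF mass-transfer coefficient
q_eq = 2.0        # mol/kg, assumed equilibrium loading from the isotherm
q, dt = 0.0, 0.1  # initial loading and time step

for _ in range(int(600 / dt)):      # 10 minutes of uptake
    q += k * (q_eq - q) * dt        # explicit Euler on dq/dt = k*(q_eq - q)

# Analytic solution for comparison: q(t) = q_eq * (1 - exp(-k t))
q_exact = q_eq * (1 - math.exp(-k * 600))
print(round(q, 4), round(q_exact, 4))
```

In a Modelica® implementation the same equation would appear declaratively (`der(q) = k*(q_eq - q)`) and be solved by the tool's DAE solver together with the energy balance of the grain.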

Keywords: adsorption, desorption, linear driving force, dynamic model, Modelica®, integral equation approach

Procedia PDF Downloads 362
30358 Early Detection of Damages in Railway Steel Truss Bridges from Measured Dynamic Responses

Authors: Dinesh Gundavaram

Abstract:

This paper presents an investigation of bridge damage detection based on the dynamic responses estimated from a passing vehicle. A numerical simulation of a steel truss railway bridge was used in this investigation. The bridge response at different locations is computed using CSiBridge software. Several damage scenarios are considered, including different locations and severities. The potential of the dynamic properties of global modes for identifying structural changes in truss bridges is discussed based on the measured responses.

Keywords: bridge, damage, dynamic responses, detection

Procedia PDF Downloads 260
30357 Box-Behnken Design for the Biosorption of Cationic Dye from Aqueous Solution Using a Zero-Valent Iron Nano Algal Composite

Authors: V. Sivasubramanian, M. Jerold

Abstract:

A recent advance in adsorption is the development of nano-biocomposites for the sorption of dyes and heavy metal ions. Nanoscale zero-valent iron (NZVI) is a cost-effective reducing agent and a reliable biosorbent for dye biosorption. In this study, a nano zero-valent iron Sargassum swartzii (nZVI-SS) biocomposite, a novel marine-algal-based biosorbent, was used for the removal of simulated crystal violet (CV) in batch mode of operation. The Box-Behnken design (BBD) experimental results revealed that biosorption was maximal at pH 7.5, a biosorbent dosage of 0.1 g/L, and an initial CV concentration of 100 mg/L. The results therefore imply that the nZVI-SS biocomposite is a cheap and promising biosorbent for the removal of CV from wastewater.
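For readers unfamiliar with the design named in the title, a three-factor Box-Behnken design can be constructed directly: every pair of factors is run at the four corners of its ±1 square while the third factor is held at its center level, plus replicated center runs. The factor names below (pH, dose, concentration) are just stand-ins for the study's variables.

```python
from itertools import combinations

def box_behnken(n_factors, center_runs=3):
    """Coded (-1/0/+1) run list for a Box-Behnken design."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                run = [0] * n_factors
                run[i], run[j] = a, b       # two factors at +/-1, rest at center
                runs.append(run)
    runs.extend([[0] * n_factors for _ in range(center_runs)])
    return runs

# Factors (illustrative): pH, biosorbent dose, initial CV concentration
design = box_behnken(3)
print(len(design))  # 12 edge-midpoint runs + 3 center runs = 15
```

Each coded run is then decoded to physical levels (e.g., -1/0/+1 mapped to low/mid/high pH) and the responses fitted with a second-order polynomial to locate the optimum.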

Keywords: algae, biosorption, zero-valent, dye, waste water

Procedia PDF Downloads 233
30356 Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis

Authors: Hyun-Woo Cho

Abstract:

Unexpected events may occur with serious impacts on industrial processes. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. The use of a triangular representation of process data is evaluated on a simulated process. Furthermore, the effect of different pre-treatment techniques, based on linear or nonlinear reduced spaces, is compared. This work extracts the fault pattern in the reduced space, not in the original data space. The results show that the diagnosis method based on the nonlinear technique produces more reliable results and outperforms the linear method.

Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques

Procedia PDF Downloads 378
30355 FEM Based Numerical Simulation and Analysis of a Landslide Triggered by the Fluctuations of Ground-Water Levels

Authors: Deepak Raj Bhat, Akihiko Wakai, Shigeru Ogita, Yorihiro Tanaka, Kazushige Hayashi, Shinro Abe

Abstract:

In this study, newly developed finite element methods are used for numerical analysis of a landslide triggered by fluctuations of the groundwater level, in four cases (I-IV). In case I, the groundwater level is fixed such that the overall factor of safety (Fs) is greater than or equal to 1 (i.e., stable condition). The groundwater level is then gradually raised by up to 1.0 m, making the overall factor of safety (Fs) less than one (i.e., unstable or moving condition). The newly developed finite element model is then applied for numerical simulation of the slope in each case. Based on the numerical analysis results of Cases I-IV, the deformation patterns and shear strain patterns are compared with each other. Moreover, the change in mobilized shear strength and the local factor of safety along the slip surface of the landslide are discussed for each case, in order to understand the triggering behavior of a landslide due to an increase in groundwater level. It is expected that this study will help to better understand the role of groundwater fluctuation in triggering landslides and slope failures, and that it will also be helpful for judging countermeasure works for the prevention and mitigation of landslide and slope failure disasters in the near future.
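The mechanism studied here, Fs falling as the water table rises, can be illustrated with the classical infinite-slope formula rather than the paper's FEM model: pore pressure u reduces the effective normal stress and hence the mobilized shear strength. The soil parameters below are invented for the sketch (m is the water table ratio, 0 = dry, 1 = fully saturated).

```python
import math

def fs_infinite_slope(c, phi_deg, gamma, gamma_w, z, beta_deg, m):
    """Infinite-slope factor of safety with a partial water table (ratio m)."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    tau = gamma * z * math.sin(beta) * math.cos(beta)      # driving shear stress
    sigma_n = gamma * z * math.cos(beta) ** 2              # total normal stress
    u = m * gamma_w * z * math.cos(beta) ** 2              # pore water pressure
    return (c + (sigma_n - u) * math.tan(phi)) / tau

# Illustrative soil: c = 5 kPa, phi = 30 deg, slope 25 deg, depth 4 m
dry = fs_infinite_slope(5e3, 30, 19e3, 9.81e3, 4.0, 25, 0.0)
wet = fs_infinite_slope(5e3, 30, 19e3, 9.81e3, 4.0, 25, 1.0)
print(round(dry, 2), round(wet, 2))  # Fs drops below 1 once the slope saturates
```

The FEM analysis in the paper resolves the same effect locally along the slip surface instead of assuming a uniform infinite slope.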

Keywords: finite element method, ground water fluctuations, constitutive model, landslides, long-term disaster management system

Procedia PDF Downloads 111
30354 Effect of Natural Molecular Crowding on the Structure and Stability of DNA Duplex

Authors: Chaudhari S. G., Saxena, S.

Abstract:

We systematically and quantitatively investigated the effect of glucose, as a model natural molecular crowding agent, on the structure and thermodynamics of three Watson-Crick base-paired duplexes (named D1, D2, and D3) of different base compositions and lengths. Structural analyses demonstrated that duplexes D1 and D2 folded into the B-form with different cations in the absence and presence of glucose, while duplex D3 folded into a mixed A- and B-form. Moreover, we demonstrated that the duplexes were more stable in the absence of glucose and marginally destabilized in its presence, because glucose acts as a weak structure breaker on the tetrahedral network of water. In the absence of glucose, the values of ΔG°25 for duplex D1 were -13.56, -13.76, -12.46, and -12.36 kcal/mol; for duplex D2, -13.64, -12.93, -12.86, and -12.30 kcal/mol; and for duplex D3, -10.05, -11.76, -9.91, and -9.70 kcal/mol in the presence of Na+, K+, Na+ + Mg++, and K+ + Mg++, respectively. At a high concentration of glucose (1:10000), ΔG°25 increased: for duplex D1 to -12.47, -12.37, -11.96, and -11.55 kcal/mol; for duplex D2 to -12.37, -11.47, -11.98, and -11.01 kcal/mol; and for duplex D3 to -8.47, -9.17, -9.16, and -8.66 kcal/mol. Our results show that the structure and stability of a DNA duplex depend on the structure of the molecular crowding agent in its close vicinity. In this study, we take the hydration of a simple sugar as an essential model for understanding interactions between hydrophilic groups and interfacial water molecules and their effect on hydrogen-bonded DNA duplexes. On the basis of these relatively simple building blocks, we hope to gain some insight, more generally, into the properties of sugar-water-salt systems with DNA duplexes.

Keywords: natural molecular crowding, DNA Duplex, structure of DNA, bioengineering and life sciences

Procedia PDF Downloads 459
30353 Simulation-Based Parametric Study for the Hybrid Superplastic Forming of AZ31

Authors: Fatima Ghassan Al-Abtah, Naser Al-Huniti, Elsadig Mahdi

Abstract:

As the lightest structural metal on earth, magnesium alloys offer excellent potential for weight reduction in the transportation industry, and some magnesium alloys exhibit superior ductility and superplastic behavior at high temperatures. The main limitation of superplastic forming (SPF) is its low production rate, since a long forming time is needed for each part. In this study, an SPF process that starts with a mechanical pre-forming stage is developed to promote formability and reduce forming time. A two-dimensional finite element model is used to simulate the process, which consists of two steps: at the pre-forming step (deep drawing), the sheet is drawn into the die to a preselected level using a mechanical punch, and at the second step (SPF), pressurized gas is applied at a controlled rate. It is shown that a significant reduction in forming time and improved final thickness uniformity can be achieved with the hybrid forming technique, where the process achieved a fully formed part at 400°C. The impact of different forming process parameters was investigated by comparing the forming times and final thickness distributions obtained from the simulation analysis. Maximum thinning decreased from over 67% to less than 55%, forming time decreased by more than 6 minutes, and the gas pressure profile required for optimum forming was predicted based on a target constant strain rate of 0.001/s within the sheet.

Keywords: magnesium, plasticity, superplastic forming, finite element analysis

Procedia PDF Downloads 144
30352 Calculation of the Added Mass of a Submerged Object with Variable Sizes at Different Distances from the Wall via Lattice Boltzmann Simulations

Authors: Nastaran Ahmadpour Samani, Shahram Talebi

Abstract:

Added mass is an important quantity in the analysis of the motion of a submerged object, and can be calculated by solving the equation of potential flow around the object. Here, we consider systems in which a square object is submerged in a channel of fluid and moves parallel to the wall. The added mass at a given distance from the wall, d, and for a given object size, s (the side of the square object), is calculated via lattice Boltzmann simulation. By changing d and s separately, their effects on the added mass are studied systematically. The simulation results reveal that for systems in which d > 4s, the distance no longer influences the added mass. The added mass increases as the object approaches the wall and reaches its maximum value as the object moves along the wall (d → 0); in this case, the added mass is about 73% larger than that of the case d = 4s. In addition, the added mass increases with increasing object size s, and vice versa.
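In simulations of this kind, added mass is typically extracted by prescribing a small acceleration on the body and dividing the in-phase fluid force by that acceleration; a nondimensional coefficient then makes results for different sizes comparable. The force value below is invented for the sketch, not a lattice Boltzmann result.

```python
rho = 1000.0      # kg/m^3, fluid density
s = 0.1           # m, side of the square object (2D, per unit depth)
accel = 0.5       # m/s^2, prescribed acceleration of the body
f_fluid = 9.0     # N per unit depth, assumed in-phase fluid force from the solver

m_added = f_fluid / accel              # added mass per unit depth
coeff = m_added / (rho * s * s)        # nondimensional added-mass coefficient
print(m_added, round(coeff, 2))        # 18.0 kg/m, Ca = 1.8
```

The abstract's wall effect would appear as this coefficient growing by roughly 73% as d → 0 relative to its far-field (d > 4s) value.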

Keywords: lattice Boltzmann simulation, added mass, square object, variable size

Procedia PDF Downloads 459
30351 Cluster-Based Multi-Path Routing Algorithm in Wireless Sensor Networks

Authors: Si-Gwan Kim

Abstract:

Small-size, low-power sensors with sensing, signal processing, and wireless communication capabilities are suitable for wireless sensor networks. Due to limited resources and battery constraints, the complex routing algorithms used in ad-hoc networks cannot be employed in sensor networks. In this paper, we propose node-disjoint multi-path hexagon-based routing algorithms for wireless sensor networks. We present the details of the algorithm and compare it with other works. Simulation results show that the proposed scheme achieves better performance in terms of efficiency and message delivery ratio.

Keywords: clustering, multi-path, routing protocol, sensor network

Procedia PDF Downloads 388
30350 Simulation Analysis and Control of the Temperature Field in an Induction Furnace Based on Various Parameters

Authors: Sohaibullah Zarghoon, Syed Yousaf, Cyril Belavy, Stanislav Duris, Samuel Emebu, Radek Matusu

Abstract:

Induction heating is extensively employed in industrial furnaces due to its swift response and high energy efficiency. Designing and optimizing these furnaces necessitates computer-aided simulation. This study develops an accurate temperature field model for a rectangular steel billet in an induction furnace by leveraging various parameters in the COMSOL Multiphysics software. The simulation analysis incorporated the temperature dynamics, considering the skin depth and both temperature-dependent and constant material parameters of the steel billet. The resulting data-driven model was transformed into a state-space model using MATLAB's System Identification Toolbox for the purpose of designing a linear quadratic regulator (LQR). This controller was successfully implemented to regulate the core temperature of the billet from 1000°C to 1200°C, utilizing the distributed-parameter system circuit.
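The LQR step can be sketched on a deliberately reduced stand-in: a first-order thermal model dT/dt = a·T + b·u in place of the identified state-space model, with assumed plant constants and weights. For a scalar system the continuous-time algebraic Riccati equation has a closed-form positive root.

```python
import math

a, b = -0.01, 0.05    # assumed identified plant: dT/dt = a*T + b*u
q, r = 10.0, 1.0      # LQR state and input weights (assumed)

# Scalar continuous-time ARE: 2*a*p - (b**2/r)*p**2 + q = 0, positive root:
p = r * (a + math.sqrt(a * a + b * b * q / r)) / (b * b)
k = b * p / r          # optimal state-feedback gain, u = -k * (T - T_ref)

# The closed-loop pole a - b*k must be negative (stable)
print(round(k, 3), round(a - b * k, 4))
```

With the real multi-state model identified by the System Identification Toolbox, the same design is done by solving the matrix Riccati equation (e.g., MATLAB's `lqr`) instead of this scalar root.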

Keywords: induction heating, LQR controller, skin depth, temperature field

Procedia PDF Downloads 18
30349 Effect of Genuine Missing Data Imputation on Prediction of Urinary Incontinence

Authors: Suzan Arslanturk, Mohammad-Reza Siadat, Theophilus Ogunyemi, Ananias Diokno

Abstract:

Missing data is a common challenge in statistical analyses of most clinical survey datasets. A variety of methods have been developed to enable the analysis of survey data with missing values; imputation is the most commonly used among them. However, to minimize the bias introduced by imputation, one must choose the right imputation technique and apply it to the correct type of missing data. In this paper, we have identified different types of missing values: missing data due to skip patterns (SPMD), undetermined missing data (UMD), and genuine missing data (GMD), and applied rough set imputation only to the GMD portion of the missing data. We used rough set imputation to evaluate the effect of such imputation on prediction by generating several simulation datasets based on an existing epidemiological dataset (MESA). To measure how well each dataset lends itself to the prediction model (logistic regression), we used p-values from the Wald test. To evaluate the accuracy of the prediction, we considered the width of the 95% confidence interval for the probability of incontinence. Both imputed and non-imputed simulation datasets were fit to the prediction model, and both turned out to be significant (p-value < 0.05). However, the Wald score shows a better fit for the imputed than for the non-imputed datasets (28.7 vs. 23.4). The average confidence interval width decreased by 10.4% when the imputed dataset was used, meaning higher precision. The results show that using the rough set method for missing data imputation on GMD improves the predictive capability of the logistic regression. Further studies are required to generalize this conclusion to other clinical survey datasets.
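The two comparison metrics used above are standard and easy to reproduce: for a logistic regression coefficient beta with standard error se, the Wald statistic is (beta/se)^2 (compared to a chi-square with 1 df), and the 95% confidence interval has width 2·1.96·se. The beta and se below are illustrative, not the MESA estimates.

```python
import math

beta, se = 1.2, 0.25            # assumed coefficient and its standard error
wald = (beta / se) ** 2         # Wald statistic; larger means a better fit

# Two-sided p-value via the normal approximation for z = beta/se
z = beta / se
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

ci_width = 2 * 1.96 * se        # width of the 95% CI; narrower = higher precision
print(round(wald, 2), p_value < 0.05, ci_width)
```

In the study's terms, imputation raising the Wald score (23.4 to 28.7) and shrinking the average CI width by 10.4% corresponds to a larger `wald` and a smaller `ci_width` here.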

Keywords: rough set, imputation, clinical survey data simulation, genuine missing data, predictive index

Procedia PDF Downloads 155
30348 Modelling Home Appliances for Energy Management System: Comparison of Simulation Results with Measurements

Authors: Aulon Shabani, Denis Panxhi, Orion Zavalani

Abstract:

This paper presents the modelling and development of a simulator for residential electrical appliances. The simulator is developed in MATLAB, providing the possibility to analyze and simulate the energy consumption of home appliances frequently used in Albania. The modelling of the devices considers the impact of different factors, including occupant behavior and climatic conditions. Most devices are modeled as an electric circuit, and the electric energy consumption is estimated from the solutions of the governing differential equations. The provided models cover devices such as a dishwasher, oven, water heater, air conditioner, light bulbs, television, refrigerator, and water pump. The proposed model allows us to simulate beforehand the energetic behavior of the home devices with the largest consumption, in order to estimate peak consumption and improve its reduction. The simulated home prototype results are compared to real measurements of a typical home. The results obtained from the simulator framework, compared to a typical household monitored using the EmonTx V3, show the effectiveness of the proposed simulation. This conclusion will help future simulation of large groups of typical households for a better understanding of peak consumption.
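One appliance model of the kind described can be sketched as a first-order thermal circuit: a storage water heater with C·dT/dt = P_in - (T - T_amb)/R and on/off thermostat control, whose solution yields the consumption profile. All parameter values below are illustrative assumptions, not the paper's identified values.

```python
C = 1.2e6        # J/K, thermal capacitance of the tank (assumed)
R = 0.05         # K/W, insulation thermal resistance (assumed)
P = 2000.0       # W, heating element rating (assumed)
T_amb, T_set, hyst = 20.0, 60.0, 2.0   # ambient, setpoint, hysteresis (degC)

T, dt, on, energy = 55.0, 1.0, False, 0.0
for _ in range(3600 * 6):                # six simulated hours, 1 s steps
    if T < T_set - hyst:                 # thermostat with hysteresis band
        on = True
    elif T > T_set:
        on = False
    p_in = P if on else 0.0
    T += (p_in - (T - T_amb) / R) * dt / C   # explicit Euler on the RC circuit
    energy += p_in * dt                  # J consumed; feeds the peak estimate

print(round(T, 1), round(energy / 3.6e6, 2))  # final temperature and kWh
```

Summing such per-appliance power traces over a household, scheduled by occupant behavior, gives the aggregate profile from which peak consumption is estimated.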

Keywords: electrical appliances, energy management, modelling, peak estimation, simulation, smart home

Procedia PDF Downloads 147
30347 A Miniaturized Circular Patch Antenna Based on Metamaterial for WI-FI Applications

Authors: Fatima Zahra Moussa, Yamina Belhadef, Souheyla Ferouani

Abstract:

In this work, we present a new miniature circular patch antenna based on CSRR metamaterials, with an extended bandwidth, proposed for 5 GHz Wi-Fi applications. A reflection coefficient of -35 dB and a gain of 7.47 dB are obtained when simulating the initially proposed antenna with the CST Microwave Studio simulation software. A notch-insertion technique in the radiating element was used to match the antenna to the desired frequency in the [5150-5875] MHz band. The bandwidth was extended from 332 MHz to 1423 MHz by the DGS (defected ground structure) technique, to meet the user requirements in the 5 GHz Wi-Fi frequency band.

Keywords: patch antenna, miniaturisation, CSRR, notches, wifi, DGS

Procedia PDF Downloads 105
30346 Motherhood Practices and Symbolic Capital: A Study of Teen Mothers in Northeastern Thailand

Authors: Ampai Muensit, Maniemai Thongyou, Patcharin Lapanun

Abstract:

Teen mothers are often viewed as 'the powerless', facing numerous pressures including poverty, immaturity in motherhood, and especially social blame. This paper argues that, to endure as agents, they keep struggling to overcome the difficulties of their everyday lives by using certain symbols to negotiate the situations they encounter and to obtain a social position, without surrendering to the dominant socio-cultural structure. Guided by Bourdieu's theory of practice, this study looks at how teen mothers use symbolic capital in their motherhood practices. Although motherhood practices can be found in different contexts with various types of capital utilization, this paper focuses on the use of symbolic capital in teen mothers' practices within the context of the community. The study employs a qualitative methodology; data were collected from 12 informants through life history, in-depth interviews, and observation, and content analysis was employed for data analysis. The findings show that the child and motherhood itself were key symbolic capitals in motherhood practices. Employing such capitals, teen mothers can achieve acceptance from the community, particularly from a new community. These symbolic capitals were important sources of teen mothers' power to turn the tide by changing their status from 'the powerless' to 'the agent'. The use of symbolic capital also relates to the habitus of teen mothers in better negotiating an appropriate social position.

Keywords: teen mother, motherhood practice, symbolic capital, community

Procedia PDF Downloads 253
30345 Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models

Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini

Abstract:

The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely least squares (LS), which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim of this study is to produce a criterion that can effectively select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and an analysis of a real dataset.
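A hedged sketch of the underlying idea: SIC = n·log(scale²) + p·log(n), where the classical version plugs in the least-squares residual scale and a robust version substitutes a resistant scale estimate. The normalized MAD below stands in for the paper's MM-estimation scale, and the contaminated residuals are simulated, not real data.

```python
import math
import random

random.seed(1)
n, p = 100, 2
resid = [random.gauss(0, 1) for _ in range(n)]
resid[:5] = [15.0] * 5                     # inject a few vertical outliers

def sic(scale, n, p):
    """Schwarz information criterion for a given residual scale."""
    return n * math.log(scale ** 2) + p * math.log(n)

# Classical (least-squares) scale is blown up by the outliers...
ls_scale = math.sqrt(sum(e * e for e in resid) / n)
# ...while a resistant scale (normalized MAD) barely moves.
med = sorted(resid)[n // 2]
mad_scale = 1.4826 * sorted(abs(e - med) for e in resid)[n // 2]

print(round(sic(ls_scale, n, p), 1), round(sic(mad_scale, n, p), 1))
```

Because the LS scale inflates under contamination, the classical SIC would penalize a correct model fitted to outlier-laden data; a bounded scale keeps the criterion comparable across candidate models, which is the motivation for the MM-based version.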

Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion

Procedia PDF Downloads 129
30344 Comparison Between the Radiation Resistance of n/p and p/n InP Solar Cell

Authors: Mazouz Halima, Belghachi Abdrahmane

Abstract:

The effects of electron-irradiation-induced deep-level defects have been studied for both n/p and p/n indium phosphide solar cells with very thin emitters. The simulation results show that the n/p structure offers a somewhat better short-circuit current, but the p/n structure offers an improved open-circuit voltage, not only before electron irradiation but also after 1 MeV electron irradiation with a fluence of 5×10¹⁵. The simulation also shows that the n/p solar cell structure is more radiation-resistant than the p/n structure.

Keywords: InP solar cell, p/n and n/p structure, electron irradiation, output parameters

Procedia PDF Downloads 540
30343 Analyzing of Speed Disparity in Mixed Vehicle Technologies on Horizontal Curves

Authors: Tahmina Sultana, Yasser Hassan

Abstract:

Vehicle technologies are rapidly evolving due to their multifaceted advantages. Deploying different vehicle technologies, such as connectivity and automation, on the same roads as conventional vehicles controlled by human drivers may increase speed disparity in mixed vehicle technologies. Identifying relationships between the speed distribution measures of different vehicles and road geometry can serve as an indicator of speed disparity in mixed technologies. Previous studies have shown that speed disparity measures and traffic accidents are inextricably related. Horizontal curves from three geographic areas were selected based on relevant criteria, and speed data were collected at the midpoint of the preceding tangent and at the starting, middle, and end points of the curve. Multiple linear mixed-effect models (LME) were developed using the instantaneous speed measures, representing the speed of vehicles at different points of horizontal curves, to identify relationships between speed variance (standard deviation) and road geometry. A simulation-based framework (Monte Carlo) was introduced to check the speed disparity on horizontal curves in mixed vehicle technologies, with consideration given to the interactions among connected vehicles (CVs), autonomous vehicles (AVs), and non-connected vehicles (NCVs) on horizontal curves. The Monte Carlo method was used to randomly sample values for the various parameters from their respective distributions. The results show that NCVs had higher speed variation than CVs and AVs. In addition, AVs and CVs contributed to reducing speed disparity in the mixed vehicle technologies at any penetration rate.
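The Monte Carlo framework's core comparison can be sketched by sampling curve speeds for each vehicle class from assumed distributions and comparing the resulting standard deviations. The means and spreads below are invented for illustration; the study samples from distributions fitted to its field and simulation data.

```python
import random
import statistics

random.seed(42)
# Assumed (mean, std) curve speeds in km/h per vehicle class
dists = {"AV": (78.0, 1.5), "CV": (79.0, 2.5), "NCV": (81.0, 6.0)}

samples = {k: [random.gauss(mu, sd) for _ in range(20000)]
           for k, (mu, sd) in dists.items()}
sd = {k: statistics.stdev(v) for k, v in samples.items()}

print({k: round(v, 2) for k, v in sd.items()})  # NCVs show the largest spread
```

Mixing the classes at a given penetration rate and recomputing the pooled standard deviation reproduces the study's finding in miniature: more AVs/CVs in the mix shrinks the speed disparity.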

Keywords: autonomous vehicles, connected vehicles, non-connected vehicles, speed variance

Procedia PDF Downloads 138
30342 Modeling of Cf-252 and PuBe Neutron Sources by Monte Carlo Method in Order to Develop Innovative BNCT Therapy

Authors: Marta Błażkiewicz, Adam Konefał

Abstract:

Currently, boron neutron capture therapy (BNCT) is carried out mainly with neutron beams generated in research nuclear reactors. This fact limits the possibility of performing BNCT in centers distant from such reactors. Moreover, the number of nuclear research reactors in operation around the world is decreasing, due to the limited lifetime of their operation and the lack of new installations, so the possibilities of carrying out boron-neutron therapy based on a neutron beam from an experimental reactor are shrinking. The use of nuclear power reactors for BNCT purposes is impossible because their infrastructure is not intended for radiotherapy. A serious challenge is therefore to find ways to perform boron-neutron therapy based on neutrons generated outside a research nuclear reactor. This work meets this challenge: its goal is to develop a BNCT technique based on commonly available neutron sources such as Cf-252 and PuBe, which would enable the above-mentioned therapy in medical centers unrelated to nuclear research reactors. Advances in the field of neutron source fabrication make it possible to achieve strong neutron fluxes. The current stage of research focuses on the development of virtual models of the above-mentioned sources using the Monte Carlo simulation method. In this study, the GEANT4 toolkit was used, including its high-precision model for simulating neutron-matter interactions (NeutronHP). The models of the neutron sources were verified experimentally using the activation detector method with indium foil and the cadmium difference method, which separates the indium activation contributions of thermal and resonance neutrons. Due to the large number of factors affecting the result of the verification experiment, a 10% discrepancy between the simulation and experimental results was accepted.

Keywords: BNCT, virtual models, neutron sources, Monte Carlo, GEANT4, neutron activation detectors, gamma spectroscopy

Procedia PDF Downloads 174
30341 Non-Cognitive Skills Associated with Learning in a Serious Gaming Environment: A Pretest-Posttest Experimental Design

Authors: Tanja Kreitenweis

Abstract:

Lifelong learning is increasingly seen as essential for coping with the rapidly changing work environment. To this end, serious games can provide convenient and straightforward access to complex knowledge for all age groups. However, learning achievements depend largely on a learner’s non-cognitive skill disposition (e.g., motivation, self-belief, playfulness, and openness). With the aim of combining the fields of serious games and non-cognitive skills, this research focuses in particular on the use of a business simulation, which conveys change management insights. Business simulations are a subset of serious games and are perceived as a non-traditional learning method. The presented objectives of this work are versatile: (1) developing a scale, which measures learners’ knowledge and skills level before and after a business simulation was played, (2) investigating the influence of non-cognitive skills on learning in this business simulation environment and (3) exploring the moderating role of team preference in this type of learning setting. First, expert interviews have been conducted to develop an appropriate measure for learners’ skills and knowledge assessment. A pretest-posttest experimental design with German management students was implemented to approach the remaining objectives. By using the newly developed, reliable measure, it was found that students’ skills and knowledge state were higher after the simulation had been played, compared to before. A hierarchical regression analysis revealed two positive predictors for this outcome: motivation and self-esteem. Unexpectedly, playfulness had a negative impact. Team preference strengthened the link between grit and playfulness, respectively, and learners’ skills and knowledge state after completing the business simulation. Overall, the data underlined the potential of business simulations to improve learners’ skills and knowledge state. 
In addition, motivational factors were found to be predictors of benefitting most from the applied business simulation. Recommendations are provided on how pedagogues can use these findings.
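The hierarchical regression with a moderation (interaction) term described above can be sketched on synthetic data. All variable names, coefficients, and the data below are hypothetical and for illustration only, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical standardized predictors: motivation, self-esteem, playfulness,
# plus team preference as a moderator
motivation = rng.normal(size=n)
self_esteem = rng.normal(size=n)
playfulness = rng.normal(size=n)
team_pref = rng.normal(size=n)

# Synthetic post-test score consistent with the reported pattern:
# positive effects of motivation and self-esteem, negative playfulness,
# and a team-preference x playfulness interaction
post = (0.5 * motivation + 0.4 * self_esteem - 0.3 * playfulness
        + 0.2 * team_pref * playfulness + rng.normal(scale=0.5, size=n))

def r_squared(cols, y):
    """Ordinary least squares R^2 for the given predictor columns."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Step 1: main effects only; Step 2: add the moderation term.
# The R^2 increase from step 1 to step 2 indicates moderation.
r2_step1 = r_squared([motivation, self_esteem, playfulness], post)
r2_step2 = r_squared([motivation, self_esteem, playfulness,
                      team_pref * playfulness], post)
print(round(r2_step1, 3), round(r2_step2, 3))
```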

Keywords: business simulations, change management, (experiential) learning, non-cognitive skills, serious games

Procedia PDF Downloads 98
30340 Feasibility Study of Plant Design with Biomass Direct Chemical Looping Combustion for Power Generation

Authors: Reza Tirsadi Librawan, Tara Vergita Rakhma

Abstract:

The increasing demand for energy and concern over global warming are intertwined issues of critical importance. Given the pressing need for clean, efficient and cost-effective energy conversion processes, an alternative clean energy source is needed. Biomass is one of the preferable options because it is clean and renewable. The efficiency of biomass conversion is constrained by its relatively low energy density and high moisture content. This study, based on bio-based resources, presents the Biomass Direct Chemical Looping Combustion (BDCLC) process, an alternative process with the potential to convert biomass through thermal cracking to produce electricity and CO2. The BDCLC process using iron-based oxygen carriers has been developed as a biomass conversion process with in-situ CO2 capture. The BDCLC system cycles oxygen carriers between two reactors, a reducer and a combustor, in order to convert biomass for electric power generation. The reducer reactor features a unique design: a gas-solid counter-current moving bed configuration that reduces the Fe2O3 particles to a mixture of Fe and FeO while converting the biomass into CO2 and steam. The combustor reactor is a fluidized bed that oxidizes the reduced particles back to Fe2O3 with air. The oxidation of iron is an exothermic reaction, and the heat can be recovered for electricity generation. The plant design targets 5 MW of electricity, with the reducer operating at 900 °C and 2 atm and the combustor at 1200 °C and 16 atm. We conduct process simulation and analysis, developed in Aspen Plus, to illustrate the individual reactor performance and the overall mass and energy management scheme of the BDCLC process. Process simulation is then performed based on the reactor performance data obtained from the multistage model.
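As an illustration of the heat-recovery step, the following is a back-of-envelope estimate of the oxygen-carrier circulation rate needed for the 5 MW target. It assumes the standard enthalpy of Fe2O3 formation (about 824 kJ/mol) and a 35% heat-to-electricity efficiency; the efficiency figure is an assumption not stated in the abstract:

```python
# Back-of-envelope estimate of the Fe2O3 circulation rate for a 5 MW
# electric target, assuming the combustor heat comes from re-oxidizing
# fully reduced iron: 2 Fe + 1.5 O2 -> Fe2O3, dH ~ -824 kJ per mol Fe2O3.
# The 35% heat-to-electricity efficiency is an assumed value.

H_OXIDATION = 824e3      # J released per mol Fe2O3 formed (magnitude)
P_ELECTRIC = 5e6         # W, plant design target from the abstract
ETA = 0.35               # assumed thermal-to-electric conversion efficiency
M_FE2O3 = 0.1597         # kg/mol, molar mass of Fe2O3

heat_rate = P_ELECTRIC / ETA        # W of combustor heat required
mol_rate = heat_rate / H_OXIDATION  # mol Fe2O3 re-oxidized per second
mass_rate = mol_rate * M_FE2O3      # kg/s of oxygen carrier circulated
print(f"carrier circulation ~ {mass_rate:.2f} kg/s")
```

Under these assumptions the carrier circulation comes out on the order of a few kilograms per second; the real rate depends on the actual degree of reduction and heat-integration scheme.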

Keywords: biomass, CO2 capture, direct chemical looping combustion, power generation

Procedia PDF Downloads 494
30339 A Review on Water Models of Surface Water Environment

Authors: Shahbaz G. Hassan

Abstract:

Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting a model in a specific situation. Water quality models fall into two kinds: mechanistic models, and empirical models that simulate water quality without considering a mechanism. Mechanistic models can be widely applied and are capable of long-term simulation, but with high complexity. Therefore, more space is devoted to explaining the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes and estuaries, which limits their range of application; this paper introduces the principles and applications of water quality models for these three scenarios. On the other hand, empirical models are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to their mathematical algorithms: models based on artificial intelligence and models based on statistical methods.
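A minimal sketch of the statistical kind of empirical model, here an ordinary least squares fit of dissolved oxygen against water temperature on synthetic data. The variables, coefficients, and data are illustrative, not drawn from any study reviewed:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monitoring data: dissolved oxygen (mg/L) tends to fall as
# water temperature (deg C) rises; the coefficients are illustrative.
temp = rng.uniform(5, 30, size=100)
do = 12.0 - 0.25 * temp + rng.normal(scale=0.5, size=100)

# Empirical (statistical) model: least squares fit DO = a + b * T
b, a = np.polyfit(temp, do, 1)   # polyfit returns [slope, intercept]
pred = a + b * 20.0              # predicted DO at 20 deg C
print(f"DO(20 C) ~ {pred:.2f} mg/L, slope = {b:.3f} mg/L per deg C")
```

Unlike a mechanistic model, this fit encodes no oxygen-balance mechanism, so it should not be extrapolated beyond the conditions of the calibration data.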

Keywords: empirical models, mathematical, statistical, water quality

Procedia PDF Downloads 248
30338 The Proposal for a Framework to Face Opacity and Discrimination ‘Sins’ Caused by Consumer Creditworthiness Machines in the EU

Authors: Diogo José Morgado Rebelo, Francisco António Carneiro Pacheco de Andrade, Paulo Jorge Freitas de Oliveira Novais

Abstract:

Not everything in AI-powered consumer credit scoring turns out to be a wonder. When using AI in Creditworthiness Assessment (CWA), opacity and unfairness ‘sins’ must be addressed for the task to be deemed responsible. AI software is not always 100% accurate, which can lead to misclassification. Discrimination against some groups can be amplified. A hetero-personalized identity can be imposed on the individual(s) affected. Also, autonomous CWA sometimes lacks transparency when using black box models. However, for this intended purpose, human analysts ‘on-the-loop’ might not be the best remedy consumers are looking for in credit. This study seeks to explore the legality of implementing a Multi-Agent System (MAS) framework in consumer CWA to ensure compliance with the regulation outlined in Article 14(4) of the Proposal for an Artificial Intelligence Act (AIA), dated 21 April 2021 (as per the last corrigendum by the European Parliament on 19 April 2024). Especially with the adoption of Art. 18(8)(9) of EU Directive 2023/2225 of 18 October 2023, which will go into effect on 20 November 2026, there should be more emphasis on the need for hybrid oversight in AI-driven scoring to ensure fairness and transparency. In fact, the range of EU regulations on AI-based consumer credit will soon impact the AI lending industry locally and globally, as shown by the broad territorial scope of AIA’s Art. 2. Consequently, engineering the law of consumer CWA is imperative. Generally, the proposed MAS framework consists of several layers arranged in a specific sequence, as follows: first, the Data Layer gathers legitimate predictor sets from traditional sources; then, the Decision Support System Layer, whose Neural Network model is trained using k-fold Cross Validation, provides recommendations based on the feeder data; the eXplainability (XAI) multi-agent structure comprises Three-Step Agents; and, lastly, the Oversight Layer has a 'Bottom Stop' for analysts to intervene in a timely manner. 
From this analysis, a vital component of the software is clearly the XAI layer. It acts as a transparent curtain over the AI’s decision-making process, enabling comprehension, reflection, and further feasible oversight. Local Interpretable Model-agnostic Explanations (LIME) might act as a pillar by offering counterfactual insights. SHapley Additive exPlanations (SHAP), another agent in the XAI layer, could address potential discrimination issues by identifying the contribution of each feature to the prediction. Alternatively, for thin-file or no-file consumers, the Suggestion Agent can promote financial inclusion. It uses lawful alternative sources, such as share of wallet, among others, to search for more advantageous solutions to incomplete evaluation appraisals based on genetic programming. Overall, this research aspires to bring the concept of Machine-Centered Anthropocentrism to the table of EU policymaking. It acknowledges that, when put into service, credit analysts no longer exert full control over the data-driven entities programmers have given ‘birth’ to. With similar explanatory agents under supervision, AI itself can become self-accountable, prioritizing human concerns and values. AI decisions should not be vilified inherently. The issue lies in how they are integrated into decision-making and whether they align with non-discrimination principles and transparency rules.
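The core idea behind the SHAP agent, attributing a prediction to its input features via Shapley values, can be sketched by exact coalition enumeration over a toy linear scoring model. The features, weights, and baseline below are hypothetical; this is an illustration of the Shapley computation, not the paper's implementation:

```python
from itertools import combinations
from math import factorial

# Toy "credit score" model over three features; the weights are hypothetical.
WEIGHTS = {"income": 0.5, "debt_ratio": -0.3, "history_len": 0.2}
BASELINE = {"income": 0.0, "debt_ratio": 0.0, "history_len": 0.0}

def model(x):
    return sum(WEIGHTS[f] * x[f] for f in WEIGHTS)

def shapley(x, feature):
    """Exact Shapley contribution of one feature, enumerating all coalitions.
    Features outside the coalition are set to their baseline values."""
    others = [f for f in WEIGHTS if f != feature]
    n = len(WEIGHTS)
    phi = 0.0
    for k in range(len(others) + 1):
        for coalition in combinations(others, k):
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            with_f = {f: x[f] if f in coalition or f == feature else BASELINE[f]
                      for f in WEIGHTS}
            without_f = {f: x[f] if f in coalition else BASELINE[f]
                         for f in WEIGHTS}
            phi += weight * (model(with_f) - model(without_f))
    return phi

applicant = {"income": 1.2, "debt_ratio": 0.8, "history_len": 0.5}
contribs = {f: shapley(applicant, f) for f in WEIGHTS}
print(contribs)
```

For a linear model, each feature's Shapley value reduces to its weight times its deviation from baseline, and the contributions sum to the difference between the applicant's score and the baseline score, which is the efficiency property an oversight analyst can check.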

Keywords: creditworthiness assessment, hybrid oversight, machine-centered anthropocentrism, EU policymaking

Procedia PDF Downloads 25
30337 Production, Quality Control, and Biodistribution Studies of 141Ce-EDTMP as a Potential Bone Pain Palliation Agent

Authors: Fatemeh Soltani, Simindokht Shirvani Arani, Ali Bahrami Samani, Mahdi Sadeghi, Kamal Yavari

Abstract:

Cerium-141 [T1/2 = 32.501 days, Eβ(max) = 0.580 MeV (29.8%) and 0.435 MeV (70.2%), Eγ = 145.44 keV (48.2%)] possesses radionuclidic properties suitable for use in palliative therapy of bone metastases. The 145.44 keV gamma energy of 141Ce resembles that of 99mTc, so the energy window can be adjusted to the 99mTc setting for imaging studies. 141Ce can be produced through a relatively easy route involving thermal neutron bombardment of natural CeO2 in medium-flux research reactors (4–5×10¹³ neutrons/cm²·s); no enriched target is required. Ethylenediamine tetramethylene phosphonic acid (EDTMP) was synthesized and radiolabeled with 141Ce. Complexation parameters were optimized to achieve maximum yields (>99%). The radiochemical purity of 141Ce-EDTMP was evaluated by radio-thin-layer chromatography. The stability of the prepared formulation was monitored for one week at room temperature, and the results showed that the preparation remained stable during this period (>99%). Biodistribution studies of the complex, carried out in wild-type rats, exhibited significant bone uptake with rapid clearance from blood. The properties of the produced 141Ce-EDTMP suggest it as a new, efficient bone pain palliation agent for overcoming metastatic bone pain.
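The stated half-life fixes the exponential decay law A(t) = A0·e^(−λt) with λ = ln 2 / T½. A small sketch of the fraction of 141Ce activity remaining over the one-week stability window monitored in the study:

```python
import math

T_HALF_DAYS = 32.501                 # 141Ce half-life quoted in the text
LAMBDA = math.log(2) / T_HALF_DAYS   # decay constant, per day

def activity(a0, t_days):
    """Remaining activity after t_days for an initial activity a0."""
    return a0 * math.exp(-LAMBDA * t_days)

# Fraction of the initial 141Ce activity remaining after one week
frac_week = activity(1.0, 7.0)
print(f"after 7 days: {frac_week:.1%} of initial activity remains")
```

Because roughly 86% of the activity survives the one-week window, the observed >99% radiochemical stability is limited by the complex, not by decay of the label.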

Keywords: bone pain palliative, cerium-141, EDTMP, radiopharmaceutical

Procedia PDF Downloads 485
30336 Optimal Design of Multimachine Power System Stabilizers Using Improved Multi-Objective Particle Swarm Optimization Algorithm

Authors: Badr M. Alshammari, T. Guesmi

Abstract:

In this paper, the concept of a non-dominated sorting multi-objective particle swarm optimization with local search (NSPSO-LS) is presented for the optimal design of multimachine power system stabilizers (PSSs). The controller design is formulated as an optimization problem in order to shift the system electromechanical modes into a pre-specified region of the s-plane. A composite set of objective functions comprising the damping factor and the damping ratio of the undamped and lightly damped electromechanical modes is considered. The performance of the proposed optimization algorithm is verified on the 3-machine, 9-bus system. Simulation results based on eigenvalue analysis and nonlinear time-domain simulation show the potential and superiority of the NSPSO-LS algorithm in tuning PSSs over a wide range of loading conditions and large disturbances compared to the classic PSO technique and genetic algorithms.
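The particle swarm core underlying the proposed method can be sketched as a plain single-objective PSO on a test function. The paper's NSPSO-LS adds non-dominated sorting and local search on top of this core, which are omitted here, and all parameter values below are illustrative:

```python
import numpy as np

def pso(objective, dim=2, n_particles=30, iters=100, seed=0):
    """Minimal single-objective particle swarm optimizer (a sketch)."""
    rng = np.random.default_rng(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, acceleration weights
    x = rng.uniform(-5, 5, (n_particles, dim))  # positions
    v = np.zeros_like(x)                        # velocities
    pbest = x.copy()                            # personal best positions
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()    # global best position
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Sphere test function: global minimum 0 at the origin
best_x, best_val = pso(lambda z: float(np.sum(z ** 2)))
print(best_x, best_val)
```

In the PSS design setting, each particle would encode a vector of stabilizer gains and time constants, and the objective would be built from the eigenvalue-based damping criteria described above.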

Keywords: multi-objective optimization, particle swarm optimization, power system stabilizer, low frequency oscillations

Procedia PDF Downloads 422
30335 Using Artificial Vision Techniques for Dust Detection on Photovoltaic Panels

Authors: Gustavo Funes, Eduardo Peters, Jose Delpiano

Abstract:

It is widely known that photovoltaic technology has been massively deployed over the last decade despite its low efficiency. Dust deposition reduces this efficiency even further, lowering energy production and module lifespan. In this work, we developed an artificial vision algorithm based on the CIELAB color space to identify dust on panels autonomously. We performed several experiments photographing three different types of panels: 30 W, 340 W and 410 W. These panels were soiled artificially with uniformly and non-uniformly distributed dust. The proposed algorithm uses statistical tools to generate a simulation of a 100% soiled panel and then performs a comparison to obtain the percentage of dirt in the experimental data set. The simulation uses a seed obtained by taking a dust sample from the most heavily soiled region of the dataset. The final result is the dirt percentage and the likely distribution of dust over the panel. Dust deposition is a key factor for plant owners in determining cleaning cycles or identifying non-uniform depositions that could lead to module failure and hot spots.
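A sketch of the color-space step: a pure-Python sRGB to CIELAB conversion (D65 white point), followed by a simple L*/chroma heuristic for dust-like pixels. The threshold values in `looks_dusty` are assumptions for illustration, not the paper's calibrated parameters:

```python
def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIELAB (D65 white point), pure Python."""
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # Linear sRGB -> CIE XYZ (D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)  # L*, a*, b*

def looks_dusty(pixel, l_min=60.0, chroma_max=15.0):
    """Heuristic: dusty pixels tend toward bright, desaturated colors.
    The thresholds here are assumed, illustrative values."""
    L, a, b = srgb_to_lab(*pixel)
    return L > l_min and (a * a + b * b) ** 0.5 < chroma_max

print(looks_dusty((255, 255, 255)), looks_dusty((0, 0, 0)))
```

In practice such a per-pixel classification would be aggregated over the whole panel image to estimate the dirt percentage described above.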

Keywords: dust detection, photovoltaic, artificial vision, soiling

Procedia PDF Downloads 40
30334 A Lagrangian Hamiltonian Computational Method for Hyper-Elastic Structural Dynamics

Authors: Hosein Falahaty, Hitoshi Gotoh, Abbas Khayyer

Abstract:

The performance of a Hamiltonian-based particle method in the simulation of nonlinear structural dynamics is investigated in terms of stability and accuracy. The governing equation of motion is derived from Hamilton's principle of least action, while the deformation gradient is obtained according to the Weighted Least Squares method. The hyper-elasticity models of Saint Venant-Kirchhoff and a compressible version similar to Mooney-Rivlin are employed for the calculation of the second Piola-Kirchhoff stress tensor. The stability and accuracy of the numerical model are verified by reproducing critical stress fields in static and dynamic responses. Although the performance of the Hamiltonian-based model proves acceptable in dealing with intense extensional stress fields, instabilities appear in the case of violent collisions, which can most likely be attributed to zero-energy singular modes.
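For the Saint Venant-Kirchhoff model, the second Piola-Kirchhoff stress follows S = λ tr(E) I + 2μ E, with the Green-Lagrange strain E = ½(FᵀF − I) built from the deformation gradient F. A minimal sketch with illustrative Lamé parameters (the actual material constants are not given in the abstract):

```python
import numpy as np

def svk_stress(F, lam=1.0, mu=1.0):
    """Second Piola-Kirchhoff stress for a Saint Venant-Kirchhoff material:
    E = 0.5 (F^T F - I),  S = lam * tr(E) * I + 2 * mu * E.
    lam and mu are Lame parameters (illustrative defaults)."""
    I = np.eye(3)
    E = 0.5 * (F.T @ F - I)              # Green-Lagrange strain tensor
    return lam * np.trace(E) * I + 2.0 * mu * E

# Undeformed body (F = I) carries no stress:
print(svk_stress(np.eye(3)))

# Uniaxial stretch of 10% along x:
F = np.diag([1.1, 1.0, 1.0])
print(svk_stress(F))
```

Within a particle method, F at each particle would come from the Weighted Least Squares reconstruction of the displacement field, and this stress feeds the internal force term of the discretized equation of motion.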

Keywords: Hamilton's principle of least action, particle-based method, hyper-elasticity, analysis of stability

Procedia PDF Downloads 334
30333 Conjunctive Management of Surface and Groundwater Resources under Uncertainty: A Retrospective Optimization Approach

Authors: Julius M. Ndambuki, Gislar E. Kifanyi, Samuel N. Odai, Charles Gyamfi

Abstract:

Conjunctive management of surface and groundwater resources is a challenging task due to the spatial and temporal variability of the hydrology and hydrogeology of water storage systems. Surface water-groundwater hydrogeology is highly uncertain; it is therefore imperative that this uncertainty be explicitly accounted for when managing water resources. Various methodologies have been developed and applied by researchers in an attempt to account for this uncertainty. For example, simulation-optimization models are often used for conjunctive water resources management. However, direct application of such an approach, in which all realizations are considered at each iteration of the optimization process, leads to a very expensive optimization in terms of computational time, particularly when the number of realizations is large. The aim of this paper, therefore, is to introduce and apply an efficient approach, referred to as Retrospective Optimization Approximation (ROA), that can be used for optimizing the conjunctive use of surface water and groundwater over multiple hydrogeological model simulations. This work is based on a stochastic simulation-optimization framework using the recently emerged sample average approximation (SAA) technique, a sampling-based method implemented within the ROA approach. The ROA approach solves and evaluates a sequence of generated optimization sub-problems with an increasing number of realizations (sample size). The response matrix technique was used to link the simulation model with the optimization procedure, and the k-means clustering sampling technique was used to map the realizations. The methodology is demonstrated through application to a hypothetical example, in which the generated optimization sub-problems were solved and analysed using the 'Active-Set' core optimizer implemented in the MATLAB 2014a environment. 
Through the k-means clustering sampling technique, the ROA Active-Set procedure was able to arrive at a (nearly) converged maximum expected total optimal conjunctive water withdrawal rate within a relatively small number of iterations (6 to 7). The results indicate that the ROA approach is a promising technique for optimizing conjunctive surface water and groundwater withdrawal rates under hydrogeological uncertainty.
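The ROA idea of solving a sequence of SAA sub-problems with growing sample sizes can be sketched on a toy stochastic withdrawal problem. The objective, penalty, and uncertainty model below are entirely hypothetical, chosen only to illustrate the mechanics:

```python
import numpy as np

def solve_saa_subproblem(realizations, rates):
    """Pick the withdrawal rate maximizing the sample-average objective.
    Toy objective: benefit grows with the rate but is penalized whenever
    the rate exceeds the (uncertain) sustainable recharge realization."""
    def sample_avg(rate):
        penalty = np.maximum(rate - realizations, 0.0) ** 2
        return np.mean(rate - 5.0 * penalty)
    values = [sample_avg(r) for r in rates]
    return rates[int(np.argmax(values))]

rng = np.random.default_rng(42)
rates = np.linspace(0.0, 2.0, 201)      # candidate withdrawal rates
recharge_mean, recharge_sd = 1.0, 0.2   # hypothetical recharge uncertainty

# ROA-style sequence: re-solve sub-problems with increasing sample sizes,
# so early cheap solves guide the later, more accurate ones.
solution = None
for n in (10, 50, 250, 1000):
    realizations = rng.normal(recharge_mean, recharge_sd, size=n)
    solution = solve_saa_subproblem(realizations, rates)
    print(f"n={n:5d}  optimal rate ~ {solution:.2f}")
```

As the sample size grows, the SAA optimum stabilizes near the true expected-value optimum, which is the convergence behavior the ROA sequence exploits; the full method additionally uses response matrices and k-means sampling to keep each sub-problem cheap.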

Keywords: conjunctive water management, retrospective optimization approximation approach, sample average approximation, uncertainty

Procedia PDF Downloads 222