Search results for: models and modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9197

8957 The Direct Deconvolution Model for the Large Eddy Simulation of Turbulence

Authors: Ning Chang, Zelong Yuan, Yunpeng Wang, Jianchun Wang

Abstract:

Large eddy simulation (LES) has been extensively used in the investigation of turbulence. LES resolves the grid-scale large-scale motions and leaves the small scales to be represented by subfilter-scale (SFS) models. Among the existing SFS models, the deconvolution model has been used successfully in the LES of engineering and geophysical flows. Despite the wide application of deconvolution models, the effects of subfilter-scale dynamics and filter anisotropy on the accuracy of SFS modeling have not been investigated in depth. The results of LES are highly sensitive to the selection of filters and the anisotropy of the grid, which has been overlooked in previous research. In the current study, two critical aspects of LES are investigated. First, we analyze the influence of subfilter-scale (SFS) dynamics on the accuracy of direct deconvolution models (DDM) at varying filter-to-grid ratios (FGR) in isotropic turbulence. An array of invertible filters is employed, encompassing Gaussian, Helmholtz I and II, Butterworth, Chebyshev I and II, Cauchy, Pao, and rapidly decaying filters. The significance of FGR becomes evident, as it acts as a pivotal factor in error control for precise SFS stress prediction. When FGR is set to 1, the DDM models cannot accurately reconstruct the SFS stress due to the insufficient resolution of SFS dynamics. Notably, prediction capabilities are enhanced at an FGR of 2, resulting in accurate SFS stress reconstruction, except for cases involving Helmholtz I and II filters. A remarkable precision close to 100% is achieved at an FGR of 4 for all DDM models. Additionally, the exploration extends to filter anisotropy to address its impact on the SFS dynamics and LES accuracy. By employing the dynamic Smagorinsky model (DSM), the dynamic mixed model (DMM), and the direct deconvolution model (DDM) with anisotropic filters, aspect ratios (AR) ranging from 1 to 16 in LES filters are evaluated. The findings highlight the DDM's proficiency in accurately predicting SFS stresses under highly anisotropic filtering conditions. High correlation coefficients exceeding 90% are observed in the a priori study for the DDM's reconstructed SFS stresses, surpassing those of the DSM and DMM models. However, these correlations tend to decrease as filter anisotropy increases. In the a posteriori studies, the DDM model consistently outperforms the DSM and DMM models across various turbulence statistics, encompassing velocity spectra, probability density functions related to vorticity, SFS energy flux, velocity increments, strain-rate tensors, and SFS stress. It is observed that as filter anisotropy intensifies, the results of DSM and DMM become worse, while the DDM continues to deliver satisfactory results across all filter-anisotropy scenarios. These findings emphasize the DDM framework's potential as a valuable tool for advancing the development of sophisticated SFS models for LES of turbulence.
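For readers unfamiliar with the mechanics, the following minimal one-dimensional sketch illustrates the direct-deconvolution idea: a velocity field is filtered with an invertible Gaussian filter, the filter is inverted exactly in Fourier space, and the SFS stress is reconstructed from the deconvolved field. The synthetic field, filter width, and wavenumber setup are illustrative assumptions, not the paper's configuration.

```python
# 1D sketch of direct deconvolution: invert a Gaussian filter in Fourier
# space and reconstruct the SFS stress tau = filt(u*u) - filt(u)*filt(u).
import numpy as np

N = 256
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
u = np.sin(x) + 0.5 * np.sin(7 * x) + 0.2 * np.sin(25 * x)  # synthetic velocity

k = np.fft.fftfreq(N, d=2 * np.pi / N) * 2 * np.pi   # integer wavenumbers
delta = 8 * (2 * np.pi / N)                           # filter width = 8 cells
G = np.exp(-(k * delta) ** 2 / 24.0)                  # Gaussian filter kernel

def gfilter(f):
    return np.real(np.fft.ifft(np.fft.fft(f) * G))

def deconvolve(fbar):
    # direct inversion; a small floor keeps the division well conditioned
    return np.real(np.fft.ifft(np.fft.fft(fbar) / np.maximum(G, 1e-8)))

u_bar = gfilter(u)
u_rec = deconvolve(u_bar)                             # reconstructed field
tau_true = gfilter(u * u) - u_bar * u_bar             # exact SFS stress
tau_ddm = gfilter(u_rec * u_rec) - u_bar * u_bar      # DDM estimate

print(f"correlation of reconstructed SFS stress: "
      f"{np.corrcoef(tau_true, tau_ddm)[0, 1]:.4f}")
```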

Keywords: deconvolution model, large eddy simulation, subfilter scale modeling, turbulence

Procedia PDF Downloads 46
8956 An in silico Approach for Exploring the Intercellular Communication in Cancer Cells

Authors: M. Cardenas-Garcia, P. P. Gonzalez-Perez

Abstract:

Intercellular communication is a necessary condition for cellular functions, and it allows a group of cells to survive as a population. Throughout this interaction, the cells work in a coordinated and collaborative way, which facilitates their survival. Cancerous cells take advantage of intercellular communication to preserve their malignancy, since through these physical unions they can send signals of malignancy. The Wnt/β-catenin signaling pathway plays an important role in the formation of intercellular communications and is also involved in a large number of cellular processes such as proliferation, differentiation, adhesion, cell survival, and cell death. The modeling and simulation of cellular signaling systems have found valuable support in a wide range of modeling approaches, covering a spectrum that ranges from mathematical models (e.g., ordinary differential equations, statistical methods, and numerical methods) to computational models (e.g., process algebras for modeling behavior and variation in molecular systems). Based on these models, different simulation tools have been developed, from mathematical ones to computational ones. The study of cellular and molecular processes in cancer has likewise found valuable support in different simulation tools that, covering a similar spectrum, have allowed the in silico experimentation of this phenomenon at the cellular and molecular level. In this work, we simulate and explore the complex interaction patterns of intercellular communication in cancer cells using Cellulat, a computational simulation tool developed by us and motivated by two key elements: 1) a biochemically inspired model of self-organizing coordination in tuple spaces, and 2) Gillespie's algorithm, a stochastic simulation algorithm typically used to mimic systems of chemical/biochemical reactions in an efficient and accurate way. The main idea behind the Cellulat simulation tool is to provide an in silico experimentation environment that complements and guides in vitro experimentation on intra- and intercellular signaling networks. Unlike most cell signaling simulation tools, such as E-Cell, BetaWB and Cell Illustrator, which provide abstractions to model only intracellular behavior, Cellulat is appropriate for modeling both intracellular signaling and intercellular communication, providing the abstractions required to model, and as a result simulate, the interaction mechanisms that involve two or more cells, which is essential in the scenario discussed in this work. During the development of this work we demonstrated the application of our computational simulation tool (Cellulat) to the modeling and simulation of intercellular communication between normal and cancerous cells, and in this way proposed key molecules that may prevent the arrival of malignant signals at the cells that surround the tumor cells. In this manner, we could identify the significant role that the Wnt/β-catenin signaling pathway has in cellular communication, and therefore in the dissemination of cancer cells. We verified, using in silico experiments, how the inhibition of this signaling pathway prevents the cells that surround a cancerous cell from being transformed.
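As a point of reference for the stochastic engine mentioned above, here is a minimal sketch of Gillespie's algorithm for a toy two-reaction system (production and decay of a protein P); the species, reactions, and rate constants are illustrative and are not taken from the Cellulat model itself.

```python
# Gillespie stochastic simulation of S -> S + P (production), P -> 0 (decay)
import numpy as np

rng = np.random.default_rng(0)
k_prod, k_decay = 2.0, 0.1
P, t, t_end = 0, 0.0, 100.0
times, counts = [0.0], [0]

while t < t_end:
    a1 = k_prod                      # propensity of production
    a2 = k_decay * P                 # propensity of decay
    a0 = a1 + a2
    t += rng.exponential(1.0 / a0)   # waiting time to the next reaction
    if rng.random() < a1 / a0:       # choose which reaction fires
        P += 1
    else:
        P -= 1
    times.append(t)
    counts.append(P)

print(f"final P = {P}, trajectory mean = {np.mean(counts):.1f}")
```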

Keywords: cancer cells, in silico approach, intercellular communication, key molecules, modeling and simulation

Procedia PDF Downloads 229
8955 Using Flow Line Modelling and Remote Sensing for Reconstructing a Glacier Volume Loss Model for Athabasca Glacier, Canadian Rockies

Authors: Rituparna Nath, Shawn J. Marshall

Abstract:

Glaciers are among the most sensitive climatic indicators, as they respond strongly to small climatic shifts. We develop a flow line model of glacier dynamics to simulate the past and future extent of glaciers in the Canadian Rocky Mountains, with the aim of coupling this model within larger-scale regional climate models of glacier response to climate change. This paper focuses on glacier-climate modeling and reconstructions of glacier volume from the Little Ice Age (LIA) to the present for Athabasca Glacier, Alberta, Canada. Glacier thickness, volume, and mass change are reconstructed using flow line modelling and examination of different climate scenarios that are able to give good reconstructions of LIA ice extent. With the availability of SPOT 5 imagery, digital elevation models, and the GIS Arc Hydro tool, ice catchment properties (glacier width and LIA moraines) have been extracted using automated procedures. Simulation of glacier mass change will inform estimates of meltwater runoff over the historical period, and model calibration from the LIA reconstruction will aid in future projections of the effects of climate change on glacier recession. Furthermore, the model developed will be useful for future studies with ensembles of glaciers.
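One explicit time step of such a flow line model can be sketched as follows, using a shallow-ice diffusivity for the ice flux; the geometry, mass balance function, and constants are illustrative assumptions, not the Athabasca calibration.

```python
# One explicit step of a 1D shallow-ice flowline model:
# dH/dt = b_dot - d(q)/dx, with q from an SIA diffusivity.
import numpy as np

n, rho, g, A = 3, 917.0, 9.81, 2.4e-24        # Glen exponent, density, flow parameter
dx, dt = 200.0, 0.01 * 3.15e7                  # 200 m grid, ~0.01 yr time step (s)
x = np.arange(0, 10_000.0, dx)
bed = 2500.0 - 0.08 * x                        # linear bed profile (m)
H = np.maximum(0.0, 300.0 - 0.05 * x)          # initial thickness (m)
b_dot = (2000.0 - (bed + H)) * 5e-11           # toy elevation-dependent balance (m/s)

S = bed + H                                    # surface elevation
dSdx = np.gradient(S, dx)
D = 2.0 * A / (n + 2) * (rho * g) ** n * H ** (n + 2) * np.abs(dSdx) ** (n - 1)
q = -D * dSdx                                  # ice flux (m^2/s)
H = np.maximum(0.0, H + dt * (b_dot - np.gradient(q, dx)))
print(f"max thickness after one step: {H.max():.1f} m")
```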

Keywords: flow line modeling, Athabasca Glacier, glacier mass balance, remote sensing, Arc Hydro tool, Little Ice Age

Procedia PDF Downloads 245
8954 Determining the Number of Single Models in a Combined Forecast

Authors: Serkan Aras, Emrah Gulay

Abstract:

Combining various forecasting models is an important tool for researchers seeking more accurate forecasts. A great number of papers have shown that selecting single models that are as dissimilar as possible, or methods based on information as different as possible, leads to better forecasting performance. However, there is no established rule regarding the number of single models to be used in any combining method. This study focuses on determining the optimal or near-optimal number of single models with the help of statistical tests. An extensive experiment is carried out utilizing some well-known time series data sets from diverse fields. Furthermore, many rival forecasting methods and some of the commonly used combining methods are employed. The obtained results indicate that statistically significant performance differences can be found with respect to the number of single models in the combining methods under investigation.
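To make the combining setup concrete, the sketch below averages the first k of five synthetic single-model forecasts and reports the RMSE as k grows; the data and error scales are illustrative assumptions, not the study's data sets or test procedure.

```python
# Equal-weight combination of k single forecasts, evaluated by RMSE.
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(size=200)                       # "actual" series
# five single-model forecasts = truth + model-specific error
forecasts = [y + rng.normal(scale=s, size=200) for s in (0.5, 0.6, 0.7, 0.9, 1.2)]

def rmse(f):
    return np.sqrt(np.mean((f - y) ** 2))

for k in range(1, 6):
    combo = np.mean(forecasts[:k], axis=0)     # simple average of first k models
    print(f"k={k}: RMSE of combined forecast = {rmse(combo):.3f}")
```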

Keywords: combined forecast, forecasting, M-competition, time series

Procedia PDF Downloads 328
8953 A Study of Population Growth Models and Future Population of India

Authors: Sheena K. J., Jyoti Badge, Sayed Mohammed Zeeshan

Abstract:

India is the second most populous country in the world, just behind China, and is expected to take first place within a year. The Indian population has grown at a remarkably higher rate than that of other countries over the past 20 years. Many scientists and demographers have formulated various models of population growth in order to study and predict the future population. Some of these models are the Fibonacci population growth model, the exponential growth model, the logistic growth model, the Lotka-Volterra model, etc. These models have been effective in the past, to an extent, in predicting population. However, a detailed comparative study between population models is essential to arrive at a more accurate one. This research study therefore analyzes and compares the two population models under consideration, the exponential and logistic growth models, thereby identifying the more effective one. Using the census data of 2011, the approximate population for 2016 to 2031 is calculated for 20 Indian states using both models, compared, and recorded against the actual population. On comparing the results of both models, it is found that the logistic population model is more accurate than the exponential model, and using this model, we can predict the future population in a more effective way. This will give researchers an insight into effective population models and how well such models predict the future population.
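The two growth laws being compared have simple closed forms, sketched below with illustrative parameter values (P0, r, and K are assumptions for demonstration, not the paper's fitted values).

```python
# Exponential: P(t) = P0 * exp(r*t)
# Logistic:    P(t) = K / (1 + ((K - P0)/P0) * exp(-r*t))
import numpy as np

P0, r, K = 1210.0, 0.012, 1800.0   # millions: 2011 base, growth rate, carrying capacity

def exponential(t):
    return P0 * np.exp(r * t)

def logistic(t):
    return K / (1.0 + ((K - P0) / P0) * np.exp(-r * t))

for year in (2016, 2021, 2026, 2031):
    t = year - 2011
    print(f"{year}: exponential {exponential(t):7.1f} M, logistic {logistic(t):7.1f} M")
```

The logistic curve saturates toward the carrying capacity K, which is why it tracks a maturing population better than unbounded exponential growth.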

Keywords: population growth, population models, exponential model, logistic model, Fibonacci model, Lotka-Volterra model, future population prediction, demographers

Procedia PDF Downloads 90
8952 Logical-Probabilistic Modeling of the Reliability of Complex Systems

Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia

Abstract:

The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, on the basis of which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it formed the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, yielding a reliability estimation polynomial and a quantified reliability; 6) calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems, research, and design of optimal-structure systems are carried out.
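The logical-probabilistic idea can be illustrated on a small example: the sketch below evaluates the success-path logical function of a five-element bridge network exactly over all element states, weighting each state by its probability. The network and element reliabilities are illustrative; the paper's application uses the orthogonalization algorithm rather than this brute-force enumeration, which only scales to small systems.

```python
# System reliability from shortest success paths (DNF terms) by exact
# enumeration of all 2^n element states.
from itertools import product

p = {1: 0.9, 2: 0.8, 3: 0.7, 4: 0.9, 5: 0.85}     # element reliabilities
paths = [{1, 4}, {2, 5}, {1, 3, 5}, {2, 3, 4}]    # shortest success paths

def system_up(state):
    return any(all(state[e] for e in path) for path in paths)

R = 0.0
for bits in product([0, 1], repeat=5):
    state = {e: b for e, b in zip(p, bits)}       # element -> up/down
    prob = 1.0
    for e, b in state.items():
        prob *= p[e] if b else (1.0 - p[e])
    if system_up(state):
        R += prob

print(f"system reliability R = {R:.4f}")
```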

Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements

Procedia PDF Downloads 38
8951 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Modeling option pricing with Black-Scholes models extended with jumps ensures that market movements are taken into account. However, only numerical methods can solve such models, and not all numerical methods are efficient for them, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. A Fast Fourier Transform (FFT) algorithm is used as the matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial-fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools together yield efficient and accurate numerical solutions. We construct a parallel, easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
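The O(M log M) matrix-vector product works by embedding a Toeplitz matrix, of the kind the jump integral produces on a uniform grid, into a circulant matrix, whose action is a pointwise product in Fourier space. A minimal sketch with an arbitrary Toeplitz matrix:

```python
# Toeplitz matrix-vector product in O(M log M) via circulant embedding + FFT.
import numpy as np

M = 8
rng = np.random.default_rng(2)
c = rng.normal(size=M)                                  # first column of T
r = np.concatenate(([c[0]], rng.normal(size=M - 1)))    # first row of T
x = np.arange(1.0, M + 1)

# first column of the 2M circulant that embeds T
circ = np.concatenate((c, [0.0], r[:0:-1]))
y = np.fft.ifft(np.fft.fft(circ) * np.fft.fft(np.concatenate((x, np.zeros(M)))))
fast = np.real(y[:M])                                   # T @ x in O(M log M)

# dense check against the explicit Toeplitz matrix
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(M)]
              for i in range(M)])
print(np.allclose(fast, T @ x))                         # True
```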

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 116
8950 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method

Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri

Abstract:

Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression (LWPR) is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated in simulation; noise is then added to the data distributions to create differently disordered time series and to evaluate the algorithm's local prediction of nonlinearity. The performance of the algorithm is simulated, and its sensitivity to the data distribution, particularly when the distribution is broad or the number of data points is small, is explained, together with the influence of the algorithm's important local-validity parameter under different data distributions.
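The core of the locally weighted approach can be sketched in one dimension: each query point receives its own linear fit, weighted by a Gaussian receptive field whose width plays the role of the local-validity parameter discussed above. The data and widths below are illustrative assumptions; the full LWPR algorithm adds projection directions and incremental updates.

```python
# Locally weighted linear regression with a Gaussian receptive field.
import numpy as np

rng = np.random.default_rng(4)
X = np.sort(rng.uniform(-3, 3, 120))
y = np.sin(X) + 0.1 * rng.normal(size=X.size)     # nonlinear data series

def lwr_predict(xq, D):
    w = np.exp(-0.5 * ((X - xq) / D) ** 2)        # receptive-field weights
    sw = np.sqrt(w)                               # weighted least squares
    A = np.vstack([np.ones_like(X), X - xq]).T    # local linear model around xq
    beta, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return beta[0]                                # prediction at the query point

for D in (0.1, 0.5, 2.0):                         # local-validity width
    preds = np.array([lwr_predict(xq, D) for xq in X])
    print(f"D={D}: RMSE = {np.sqrt(np.mean((preds - y) ** 2)):.3f}")
```

Too small a width overfits the noise; too large a width flattens the local nonlinearity, which is exactly the sensitivity the paper examines.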

Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method

Procedia PDF Downloads 466
8949 Kinetic Modeling of Transesterification of Triacetin Using Synthesized Ion Exchange Resin (SIERs)

Authors: Hafizuddin W. Yussof, Syamsutajri S. Bahri, Adam P. Harvey

Abstract:

Strong anion exchange resins with QN+OH- groups have the potential to be developed and employed as heterogeneous catalysts for transesterification, as they are chemically stable against leaching of the functional group. Nine different SIERs (SIER1-9) with QN+OH- were prepared by suspension polymerization of vinylbenzyl chloride-divinylbenzene (VBC-DVB) copolymers in the presence of n-heptane (a pore-forming agent). The amine group was successfully grafted onto the polymeric resin beads through functionalization with trimethylamine. These SIERs were then used as catalysts for the transesterification of triacetin with methanol. A set of differential equations representing the Langmuir-Hinshelwood-Hougen-Watson (LHHW) and Eley-Rideal (ER) models for the transesterification reaction was developed, and both kinetic models were fitted to the experimental data. Overall, the reaction catalyzed by the synthesized ion exchange resins was better described by the Eley-Rideal model than by the LHHW model, with sums of squared errors (SSE) of 0.742 and 0.996, respectively.
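A hedged sketch of the model-fitting step follows, assuming a generic Eley-Rideal rate law in which adsorbed triacetin reacts with methanol from solution; the rate constants, concentrations, and synthetic "data" are illustrative, not the paper's measurements.

```python
# Fit an assumed ER rate law r = k*K*Ct*Cm / (1 + K*Ct) to concentration data.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

t_obs = np.linspace(0, 60, 13)                        # sampling times, minutes

def simulate(params, t):
    k, K = params
    def rhs(_, y):
        Ct, Cm = y
        r = k * K * Ct * Cm / (1.0 + K * Ct)          # assumed ER form
        return [-r, -3.0 * r]                         # 1 triacetin : 3 methanol
    sol = solve_ivp(rhs, (t[0], t[-1]), [0.5, 6.0], t_eval=t, rtol=1e-8)
    return sol.y[0]                                   # triacetin concentration

# synthetic "measurements" from known parameters plus noise
Ct_obs = simulate((0.01, 1.5), t_obs)
Ct_obs = Ct_obs + np.random.default_rng(5).normal(0, 0.005, t_obs.size)

fit = least_squares(lambda p: simulate(p, t_obs) - Ct_obs,
                    x0=[0.02, 0.5], bounds=(0, np.inf))
print(f"fitted k = {fit.x[0]:.4f}, K = {fit.x[1]:.2f}")  # near 0.01 and 1.5
```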

Keywords: anion exchange resin, Eley-Rideal, Langmuir-Hinshelwood-Hougen-Watson, transesterification

Procedia PDF Downloads 333
8948 The Direct Deconvolutional Model in the Large-Eddy Simulation of Turbulence

Authors: Ning Chang, Zelong Yuan, Yunpeng Wang, Jianchun Wang

Abstract:

The utilization of Large Eddy Simulation (LES) has been extensive in turbulence research. LES concentrates on resolving the significant grid-scale motions while representing smaller scales through subfilter-scale (SFS) models. The deconvolution model, among the available SFS models, has proven successful in LES of engineering and geophysical flows. Nevertheless, the thorough investigation of how sub-filter scale dynamics and filter anisotropy affect SFS modeling accuracy remains lacking. The outcomes of LES are significantly influenced by filter selection and grid anisotropy, factors that have not been adequately addressed in earlier studies. This study examines two crucial aspects of LES: Firstly, the accuracy of direct deconvolution models (DDM) is evaluated concerning sub-filter scale (SFS) dynamics across varying filter-to-grid ratios (FGR) in isotropic turbulence. Various invertible filters are employed, including Gaussian, Helmholtz I and II, Butterworth, Chebyshev I and II, Cauchy, Pao, and rapidly decaying filters. The importance of FGR becomes evident as it plays a critical role in controlling errors for precise SFS stress prediction. When FGR is set to 1, the DDM models struggle to faithfully reconstruct SFS stress due to inadequate resolution of SFS dynamics. Notably, prediction accuracy improves when FGR is set to 2, leading to accurate reconstruction of SFS stress, except for cases involving Helmholtz I and II filters. Remarkably high precision, nearly 100%, is achieved at an FGR of 4 for all DDM models. Furthermore, the study extends to filter anisotropy and its impact on SFS dynamics and LES accuracy. By utilizing the dynamic Smagorinsky model (DSM), dynamic mixed model (DMM), and direct deconvolution model (DDM) with anisotropic filters, aspect ratios (AR) ranging from 1 to 16 are examined in LES filters. The results emphasize the DDM’s proficiency in accurately predicting SFS stresses under highly anisotropic filtering conditions. Notably high correlation coefficients exceeding 90% are observed in the a priori study for the DDM’s reconstructed SFS stresses, surpassing those of the DSM and DMM models. However, these correlations tend to decrease as filter anisotropy increases. In the a posteriori analysis, the DDM model consistently outperforms the DSM and DMM models across various turbulence statistics, including velocity spectra, probability density functions related to vorticity, SFS energy flux, velocity increments, strain-rate tensors, and SFS stress. It is evident that as filter anisotropy intensifies, the results of DSM and DMM deteriorate, while the DDM consistently delivers satisfactory outcomes across all filter-anisotropy scenarios. These findings underscore the potential of the DDM framework as a valuable tool for advancing the development of sophisticated SFS models for LES in turbulence research.

Keywords: deconvolution model, large eddy simulation, subfilter scale modeling, turbulence

Procedia PDF Downloads 43
8947 Modeling Residual Modulus of Elasticity of Self-Compacted Concrete Using Artificial Neural Networks

Authors: Ahmed M. Ashteyat

Abstract:

Artificial Neural Network (ANN) models have been widely used in material modeling, inter-correlations, and behavior and trend prediction when the nonlinear relationship between system parameters cannot be quantified explicitly and mathematically. In this paper, an ANN was used to predict the residual modulus of elasticity (RME) of self-compacted concrete (SCC) damaged by heat. The ANN model was built, trained, tested, and validated using a total of 112 experimental data sets gathered from the available literature. The data used in model development included temperature, relative humidity conditions, mix proportions, filler types, and fiber type. The results of ANN training, testing, and validation indicated that the RME of SCC exposed to different temperature and relative humidity levels could be predicted accurately with ANN techniques. The reliability between the predicted outputs and the actual experimental data was 99%. This shows that the ANN has strong potential as a feasible tool for predicting the residual elastic modulus of SCC damaged by heat within the range of the input parameters. The ANN model could be used to estimate the RME of SCC as a rapid, inexpensive substitute for the much more complicated and time-consuming direct measurement of the RME of SCC.
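A minimal sketch of this workflow with scikit-learn's MLPRegressor is shown below; the paper built its own network, and the three input features and synthetic target here are illustrative stand-ins for the temperature, humidity, and mix-proportion inputs.

```python
# ANN regression sketch for a heat-damage property from mix/exposure features.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
# columns: temperature (C), relative humidity (%), water/cement ratio
X = rng.uniform([20, 30, 0.3], [800, 90, 0.6], size=(112, 3))
y = 30.0 - 0.03 * X[:, 0] + 0.05 * X[:, 1] - 10 * X[:, 2] + rng.normal(0, 0.5, 112)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_tr)

ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X_tr), y_tr)
print(f"R^2 on held-out data: {ann.score(scaler.transform(X_te), y_te):.3f}")
```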

Keywords: residual modulus of elasticity, artificial neural networks, self-compacted concrete, material modeling

Procedia PDF Downloads 506
8946 The Role of Uncertainty in the Integration of Environmental Parameters in Energy System Modeling

Authors: Alexander de Tomás, Miquel Sierra, Stefan Pfenninger, Francesco Lombardi, Ines Campos, Cristina Madrid

Abstract:

Environmental parameters are key to the definition of sustainable energy systems, yet they are excluded from most energy system optimization models. Still, decision-making may be misleading without considering them. Environmental analyses of the energy transition are a key part of industrial ecology but are often performed without any input from the users of the information. This work assesses the systemic impacts of energy transition pathways in Portugal. Using the Calliope energy modeling framework, more than 250 optimized energy system pathways are generated. A Delphi study helps to identify the criteria relevant to the stakeholders as regards the environmental assessment, which is performed with ENBIOS, a Python package that integrates life cycle assessment (LCA) with a metabolic analysis based on complex relations. Furthermore, this study focuses on how uncertainty propagates through the chain of models. To this end, a soft link between the Calliope/ENBIOS cascade and Brightway's data capabilities is built to perform Monte Carlo simulations. The findings highlight the relevance of reporting energy transition results as a range of values derived from an uncertainty analysis rather than as a single value.
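The Monte Carlo step can be sketched as follows: impact factors with assumed lognormal uncertainty are sampled many times and propagated to a pathway-level indicator, which is then reported as a range. All technologies, generation mixes, and factor values are illustrative assumptions, not ENBIOS or Brightway outputs.

```python
# Monte Carlo propagation of uncertain impact factors to a pathway total.
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
generation_twh = {"wind": 20.0, "solar": 15.0, "gas": 5.0}   # one pathway's mix
# impact factors (kt CO2-eq per TWh) as (median, lognormal sigma)
impact = {"wind": (11, 0.2), "solar": (40, 0.3), "gas": (450, 0.1)}

samples = np.zeros(n)
for tech, twh in generation_twh.items():
    mu, sigma = impact[tech]
    samples += twh * rng.lognormal(np.log(mu), sigma, n)

lo, med, hi = np.percentile(samples, [5, 50, 95])
print(f"total impact: median {med:,.0f} kt CO2-eq (90% range {lo:,.0f}-{hi:,.0f})")
```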

Keywords: energy transition, energy modeling, uncertainty, sustainability

Procedia PDF Downloads 52
8945 Modelling and Simulation of the Freezing Systems and Heat Pumps Using Unisim® Design

Authors: C. Patrascioiu

Abstract:

The paper describes the modeling and simulation of processes in the heat pump domain. The main objective of the study is the use of a heat pump in propene-propane distillation processes. The modeling and simulation instrument is the Unisim® Design simulator. The paper is structured in three parts: an overview of gas compression, the modeling and simulation of freezing systems, and the modeling and simulation of heat pumps. For each of these systems, the Unisim® Design simulation diagrams, the input-output structure of the system, and the numerical results are presented. Future studies will consider the modeling and simulation of the propene-propane distillation process with a heat pump.

Keywords: distillation, heat pump, simulation, Unisim Design

Procedia PDF Downloads 334
8944 Machine Learning Approach for Predicting Students’ Academic Performance and Study Strategies Based on Their Motivation

Authors: Fidelia A. Orji, Julita Vassileva

Abstract:

This research aims to develop machine learning models for predicting students' academic performance and study strategies, which could be generalized to all courses in higher education. The key learning attributes (intrinsic, extrinsic, autonomy, relatedness, competence, and self-esteem) used in building the models were chosen based on prior studies, which revealed that these attributes are essential in students' learning process. Previous studies revealed the individual effects of each of these attributes on students' learning progress; however, few studies have investigated their combined effect in predicting student study strategy and academic performance with a view to reducing the dropout rate. To bridge this gap, we used scikit-learn in Python to build five machine learning models (Decision Tree, K-Nearest Neighbour, Random Forest, Linear/Logistic Regression, and Support Vector Machine) for both regression and classification tasks. The models were trained, evaluated, and tested for accuracy using data from 924 university dentistry students collected by Chilean authors through a quantitative research design. A comparative analysis of the models revealed that the tree-based models, such as the random forest (with a prediction accuracy of 94.9%) and the decision tree, show the best results compared to the linear, support vector, and k-nearest neighbour models. The models built in this research can be used to predict student performance and study strategy so that appropriate interventions can be implemented to improve student learning progress. Thus, incorporating strategies that could improve diverse student learning attributes in the design of online educational systems may increase the likelihood of students continuing with their learning tasks as required. Moreover, the results show that the attributes can be modelled together and used to adapt and personalize the learning process.
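A hedged re-creation of the five-model comparison on synthetic data shaped like the study's six motivation attributes is sketched below; the real dataset is not reproduced here, so the accuracies will differ from the reported 94.9%.

```python
# Cross-validated comparison of the five classifier families named above.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

# 924 samples, 6 features standing in for the six motivation attributes
X, y = make_classification(n_samples=924, n_features=6, n_informative=5,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
    "k-NN": KNeighborsClassifier(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "SVM": SVC(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:20s} accuracy = {acc:.3f}")
```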

Keywords: classification models, learning strategy, predictive modeling, regression models, student academic performance, student motivation, supervised machine learning

Procedia PDF Downloads 95
8943 Dynamic vs. Static Bankruptcy Prediction Models: A Dynamic Performance Evaluation Framework

Authors: Mohammad Mahdi Mousavi

Abstract:

Bankruptcy prediction models have been implemented for the continuous evaluation and monitoring of firms. Given the huge number of bankruptcy models, an extensive number of studies have focused on answering the question of which of these models is superior in performance. In practice, one of the drawbacks of existing comparative studies is that the relative assessment of alternative bankruptcy models remains a mono-criterion exercise. Further, a very restricted number of criteria and measures has been applied to compare the performance of competing bankruptcy prediction models. In this research, we overcome these methodological gaps by implementing an extensive range of criteria and measures for comparison between dynamic and static bankruptcy models, and by proposing a multi-criteria framework to compare the relative performance of bankruptcy models in forecasting firm distress for UK firms.

Keywords: bankruptcy prediction, data envelopment analysis, performance criteria, performance measures

Procedia PDF Downloads 221
8942 User Intention Generation with Large Language Models Using Chain-of-Thought Prompting

Authors: Gangmin Li, Fan Yang

Abstract:

Personalized recommendation is crucial for any recommendation system. One technique for personalized recommendation is to identify the user's intention. Traditional user intention identification uses the user's selections when facing multiple items. This modeling relies primarily on historical behaviour data, resulting in challenges such as cold start, unintended choices, and failure to capture intention when items are new. Motivated by recent advancements in Large Language Models (LLMs) like ChatGPT, we present an approach for user intention identification that embraces LLMs with Chain-of-Thought (CoT) prompting. We use the initial user profile as input to the LLM and design a collection of prompts to align the LLM's responses across various recommendation tasks encompassing rating prediction, search and browse history, user clarification, etc. Our tests on real-world datasets demonstrate improvements in recommendation from explicit user intention identification and from merging that intention into a user model.
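A minimal sketch of how such a chain-of-thought prompt might be assembled from a user profile is given below; the prompt wording, profile fields, and task are illustrative assumptions, and the paper's actual prompt collection and LLM calls are not reproduced here.

```python
# Assemble an illustrative CoT prompt from a toy user profile.
profile = {
    "age_group": "25-34",
    "recent_views": ["trail shoes", "rain jacket"],
    "last_purchase": "hiking backpack",
}

cot_prompt = f"""You are inferring a shopper's current intention.
User profile: age group {profile['age_group']}.
Recent views: {', '.join(profile['recent_views'])}.
Last purchase: {profile['last_purchase']}.

Let's think step by step:
1. What activity do the viewed items and the purchase have in common?
2. What is the user likely preparing for?
3. State the single most likely shopping intention in one sentence."""

print(cot_prompt)  # send to an LLM; the response is parsed into the user model
```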

Keywords: personalized recommendation, generative user modelling, user intention identification, large language models, chain-of-thought prompting

Procedia PDF Downloads 9
8941 Parametric Analysis of Lumped Devices Modeling Using Finite-Difference Time-Domain

Authors: Felipe M. de Freitas, Icaro V. Soares, Lucas L. L. Fortes, Sandro T. M. Gonçalves, Úrsula D. C. Resende

Abstract:

SPICE-based simulators are quite robust and widely used for the simulation of electronic circuits; their algorithms support linear and non-linear lumped components, and they can handle a large number of encapsulated elements. Despite the great potential of SPICE-based simulators in the analysis of quasi-static electromagnetic field interaction, that is, at low frequency, these simulators are limited when applied to microwave hybrid circuits in which there are both lumped and distributed elements. Usually, the spatial discretization of the FDTD (Finite-Difference Time-Domain) method is done according to the actual size of the element under analysis. After spatial discretization, the Courant stability criterion gives the maximum time step accepted for that spatial discretization and for the propagation velocity of the wave. This criterion guarantees the stability conditions for the leapfrog time-stepping of the Yee algorithm; however, it is known that for the field update, the stability of the complete FDTD procedure depends on factors other than just the stability of the Yee algorithm, because an FDTD program needs other algorithms in order to be useful in engineering problems. Examples of these algorithms are absorbing boundary conditions (ABCs), excitation sources, subcellular techniques, lumped elements, and non-uniform or non-orthogonal meshes. In this work, the influence of the stability of the FDTD method on the modeling of lumped elements such as resistive sources, resistors, capacitors, inductors, and diodes is evaluated. This paper therefore proposes the electromagnetic modeling of electronic components in order to create models that satisfy the needs of circuit simulations at ultra-wide frequencies. The models of the resistive source, the resistor, the capacitor, the inductor, and the diode are evaluated, among the mathematical models for lumped components in the LE-FDTD (Lumped-Element Finite-Difference Time-Domain) method, through a parametric analysis of the size of the Yee cells that discretize the lumped components. In this way, we seek an ideal cell size such that the analysis in the FDTD environment agrees more closely with the expected circuit behavior while maintaining the stability conditions of the method. Based on the mathematical models and the theoretical basis of the required extensions of the FDTD method, the computational implementation of the models is carried out in the Matlab® environment. The Mur boundary condition is used as the absorbing boundary of the FDTD method. The model is validated by comparing the electric field values and the currents in the components obtained with the FDTD method against analytical results using circuit parameters.
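The Courant criterion referred to above has a simple closed form for a 3D Yee grid, sketched here with illustrative cell sizes:

```python
# CFL limit for a 3D Yee grid: dt <= 1 / (c * sqrt(1/dx^2 + 1/dy^2 + 1/dz^2)).
import math

c = 299_792_458.0                   # speed of light in vacuum, m/s
dx = dy = dz = 0.5e-3               # illustrative 0.5 mm cells

dt_max = 1.0 / (c * math.sqrt(1 / dx**2 + 1 / dy**2 + 1 / dz**2))
print(f"maximum stable time step: {dt_max:.3e} s")  # about 0.96 ps here
```

Shrinking the cells to resolve a lumped component therefore forces a smaller time step, which is the cost side of the parametric cell-size study described above.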

Keywords: hybrid circuits, LE-FDTD, lumped element, parametric analysis

Procedia PDF Downloads 123
8940 Three Dimensional Computational Fluid Dynamics Simulation of Wall Condensation inside Inclined Tubes

Authors: Amirhosein Moonesi Shabestary, Eckhard Krepper, Dirk Lucas

Abstract:

The current PhD project comprises CFD modeling and simulation of condensation and heat transfer inside horizontal pipes. Condensation plays an important role in the emergency cooling systems of reactors. The emergency cooling system consists of inclined horizontal pipes that are immersed in a tank of subcooled water. In the case of an accident, the water level in the core decreases, steam enters the emergency pipes, and due to the subcooled water around the pipes, this steam starts to condense. These horizontal pipes act as a strong heat sink, which is responsible for a quick depressurization of the reactor core when an accident happens. This project was defined in order to model all the processes happening in the emergency cooling systems. The main focus of the project is on the detection of different morphologies such as annular flow, stratified flow, slug flow, and plug flow. This is an ongoing project that started one year ago in the Fluid Dynamics department of Helmholtz-Zentrum Dresden-Rossendorf (HZDR). At HZDR, different models for multiphase flows are developed, mostly in cooperation with ANSYS. The inhomogeneous MUSIG model considers the bubble size distribution and is used for modeling the small-scale dispersed gas phase. The AIAD (Algebraic Interfacial Area Density) model is developed for the detection of the local morphology and the corresponding switching between morphologies. The most recent model, GENTOP, combines both concepts: it is able to simulate co-existing large-scale (continuous) and small-scale (polydispersed) structures. All these models are validated for adiabatic cases without any phase change. Therefore, the starting point of the current PhD project is to use the available models and integrate phase transition and wall condensation models into them. In order to simplify the idea of condensation inside horizontal tubes, three steps have been defined. The first step is the investigation of condensation inside a horizontal tube considering only direct contact condensation (DCC) and neglecting wall condensation; the inlet of the pipe is therefore considered to be annular flow, and the AIAD model is used to detect the interface. The second step is the extension of the model to consider wall condensation as well, which is closer to reality: the inlet is pure steam, and due to wall condensation, a liquid film forms near the wall, which leads to annular flow. The last step will be the modeling of the different morphologies occurring inside the tube during condensation using the GENTOP model, with which the dispersed phase can be considered and simulated. Finally, the results of the simulations will be validated against experimental data, which will also become available at HZDR.

Keywords: wall condensation, direct contact condensation, AIAD model, morphology detection

Procedia PDF Downloads 268
8939 Analyzing How Working From Home Can Lead to Higher Job Satisfaction for Employees Who Have Care Responsibilities Using Structural Equation Modeling

Authors: Christian Louis Kühner, Florian Pfeffel, Valentin Nickolai

Abstract:

Taking care of children, dependents, or pets can be a difficult and time-consuming task. Especially for part- and full-time employees, it can feel exhausting and overwhelming to meet these obligations besides working a job. Thus, working mostly from home and not having to commute to the company can save valuable time and reduce stress. This study aims to show the influence that the working model has on the job satisfaction of employees with care responsibilities in comparison to employees without such obligations. Using structural equation modeling (SEM), the three work models, working from home, working remotely, and a hybrid model, have been analyzed based on 13 constructs influencing job satisfaction. These 13 factors were further grouped into "classic influencing factors", "influencing factors changed by remote working", and "new remote working influencing factors". Based on the influencing factors on job satisfaction, an online survey was conducted with n = 684 employees from the service sector. Cronbach's alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. In addition, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is "identification with the work" (β = 0.540), followed by "appreciation" (β = 0.151), "compensation" (β = 0.124), "work-life balance" (β = 0.116), and "communication and exchange of information" (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant influencing factor. The study shows that, among employees with care responsibilities, the higher the proportion of working from home relative to working from the office, the more satisfied the employees are with their job. Since work models that accommodate comprehensive care led to higher job satisfaction among employees with such obligations, adapting as a company to such private obligations of employees can be crucial to sustained success. Conversely, the satisfaction level with the office-based work model is higher for workers without caregiving responsibilities.

Keywords: care responsibilities, home office, job satisfaction, structural equation modeling

Procedia PDF Downloads 54
8938 Modeling of Coupled Mechanical State and Diffusion in Composites with Impermeable Fibers

Authors: D. Gueribiz, F. Jacquemin, S. Fréour

Abstract:

During their service life, composite materials are subjected to humid environments. The moisture absorbed by the polymer matrix induces internal stresses that can lead to multi-scale damage and may reduce the lifetime of composite structures. The estimation of internal stresses rests, first of all, on a realistic evaluation of the diffusive behavior of composite materials. Generally, the modeling and simulation of the diffusive behavior of composite materials are investigated through decoupled models based on the assumption of Fickian behavior. In these approaches, the concentration and the deformation (or stresses), the two state variables of the problem, are governed by independent equations that are solved separately. In this study, a model coupling diffusive behavior with the stress state is proposed for a polymer matrix composite reinforced with impermeable fibers; the investigation of diffusive behavior is based on a more general thermodynamic approach that introduces a dependence of the diffusive behavior on the internal stress state. The coupled diffusive behavior model was established first for a homogeneous and isotropic matrix and is thereafter extended to impermeable unidirectional composites.
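For contrast with the coupled model proposed here, the decoupled Fickian baseline is easy to sketch: a 1D explicit finite-difference solution of moisture uptake in a slab with saturated surfaces. The geometry and diffusivity are illustrative assumptions, and the stress dependence that this paper introduces is deliberately omitted.

```python
# Explicit FD solution of Fick's second law, dc/dt = D * d2c/dx2, in a slab.
import numpy as np

L, nx, D = 2e-3, 51, 1e-12           # slab thickness (m), grid, diffusivity (m^2/s)
dx = L / (nx - 1)
dt = 0.4 * dx * dx / D               # stable explicit time step
steps = 2_000
c = np.zeros(nx)                     # initial moisture concentration
c_surf = 1.0                         # normalized boundary concentration

for _ in range(steps):
    c[0] = c[-1] = c_surf            # saturated surfaces
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])

print(f"mean moisture content after {steps * dt / 3600:.1f} h: {c.mean():.3f}")
```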

Keywords: composite materials, moisture diffusion, effective moisture diffusivity, coupled moisture diffusion

Procedia PDF Downloads 278
8937 Profitability Assessment of Granite Aggregate Production and the Development of a Profit Assessment Model

Authors: Melodi Mbuyi Mata, Blessing Olamide Taiwo, Afolabi Ayodele David

Abstract:

The purpose of this research is to create empirical models for assessing the profitability of granite aggregate production in quarries in Akure, Ondo State. In addition, an artificial neural network (ANN) model and multivariate prediction models for granite profitability were developed in the study. A formal survey questionnaire was used to collect data. The data extracted from the case study mine include granite marketing operations, royalty, production costs, and mine production information. Descriptive statistics, MATLAB 2017, and SPSS 16.0 were used to analyze and model the data collected from granite traders in the study areas. The prediction accuracy of the ANN and multivariate regression models was compared using the coefficient of determination (R²), root mean square error (RMSE), and mean square error (MSE). Owing to the high prediction error of the regression models, the model evaluation indices revealed that the ANN model was the more suitable for predicting generated profit in a typical quarry. More quarries in Nigeria's southwest region and other geopolitical zones should be considered to improve the ANN's prediction accuracy.
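The three evaluation indices are computed as below, on illustrative prediction vectors rather than the study's survey data:

```python
# MSE, RMSE, and R^2 for a set of predictions against observations.
import numpy as np

actual = np.array([120.0, 95.0, 150.0, 80.0, 110.0])      # observed profit
predicted = np.array([115.0, 99.0, 143.0, 85.0, 108.0])   # model output

mse = np.mean((actual - predicted) ** 2)
rmse = np.sqrt(mse)
ss_res = np.sum((actual - predicted) ** 2)
ss_tot = np.sum((actual - actual.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"MSE = {mse:.2f}, RMSE = {rmse:.2f}, R^2 = {r2:.3f}")
```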

Keywords: national development, granite, profitability assessment, ANN models

Procedia PDF Downloads 72
8936 Modeling User Departure Time Choice for Trips in Urban Streets

Authors: Saeed Sayyad Hagh Shomar

Abstract:

Modeling users' decisions on departure time is the main motivation for this research. In particular, it examines the impact of socio-demographic features, household and job characteristics, and trip qualities on individuals' departure time choice. Departure time alternatives are represented as adjacent discrete time periods, and the choice between these alternatives is modeled with a discrete choice model. Since work trips make up a great share of early-morning trips and of the traffic congestion at that time of day, the focus of this study is on the work trip over the entire day. Using a stated preference questionnaire, this study therefore models users' departure time choice as affected by the congestion pricing plan in downtown Tehran. The results demonstrate a significant socio-demographic impact on work-trip departure times. These findings have substantial implications for transportation planning analysis. In particular, the analysis shows that ignoring the effects of these variables could produce erroneous information; consequently, decisions in the fields of transportation planning and air quality would fail and waste financial resources.
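A minimal sketch of the discrete choice (multinomial logit) formulation over adjacent departure-time periods follows; the periods, travel times, tolls, and taste coefficients are illustrative assumptions, not the study's estimates.

```python
# Multinomial logit choice probabilities over departure-time periods.
import numpy as np

periods = ["6:00-6:30", "6:30-7:00", "7:00-7:30", "7:30-8:00"]
travel_time = np.array([25.0, 35.0, 50.0, 45.0])   # expected minutes per period
toll = np.array([0.0, 1.0, 2.5, 2.5])              # congestion charge per period

beta_time, beta_toll = -0.08, -0.60                # assumed taste parameters
V = beta_time * travel_time + beta_toll * toll     # systematic utilities

P = np.exp(V) / np.exp(V).sum()                    # logit choice probabilities
for period, prob in zip(periods, P):
    print(f"{period}: P = {prob:.2f}")
```

In the actual model, socio-demographic and household variables enter the utilities alongside travel time and price, which is precisely why omitting them biases the estimated shares.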

Keywords: modeling, departure time, travel timing, time of the day, congestion pricing, transportation planning

Procedia PDF Downloads 408
8935 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable

Authors: Xinyuan Y. Song, Kai Kang

Abstract:

Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption in conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach that deals with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is used to perform statistical inference. An application of the proposed joint model to a study from the Alzheimer's Disease Neuroimaging Initiative is presented.
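The third part of the model can be illustrated with the standard Cox partial likelihood for a single time-invariant predictor; the sketch below omits the latent trajectories and the Bayesian machinery and uses simulated data with a known coefficient.

```python
# Negative log partial likelihood of a Cox proportional hazards model.
import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(size=50)                          # time-invariant predictor
time = rng.exponential(1.0 / np.exp(0.7 * x))    # event times, true beta = 0.7
event = np.ones_like(time, dtype=bool)           # no censoring in this toy case

def neg_log_partial_likelihood(beta):
    nll = 0.0
    for i in np.where(event)[0]:
        at_risk = time >= time[i]                # subjects still at risk at t_i
        nll -= beta * x[i] - np.log(np.sum(np.exp(beta * x[at_risk])))
    return nll

betas = np.linspace(0.0, 1.5, 16)
best = betas[np.argmin([neg_log_partial_likelihood(b) for b in betas])]
print(f"grid-search estimate of beta: {best:.1f}")  # close to the true 0.7
```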

Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data

Procedia PDF Downloads 113
8934 Multivariate Dependent Frequency-Severity Modeling of Insurance Claims: A Vine Copula Approach

Authors: Islem Kedidi, Rihab Bedoui Bensalem, Faysal Manssouri

Abstract:

In traditional models of insurance data, the number and size of claims are assumed to be independent. Relaxing this independence assumption, this article explores vine copulas to model the dependence structure between the multivariate frequency and the average severity of insurance claims. To illustrate this approach, we use data from the Wisconsin Local Government Property Insurance Fund, which offers several insurance protections for motor vehicle, property, and contractor's equipment claims. Results show that the C-vine copula better characterizes the multivariate dependence structure between frequency and severity. Furthermore, we find significant dependencies, especially between frequency and average severity, among different coverage types.
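As a simplified illustration of copula-based dependence between frequency and severity, the sketch below uses a Gaussian copula with Poisson and gamma marginals; a vine copula, as used in the paper, decomposes higher-dimensional dependence into a cascade of such pair-copulas. All parameters are illustrative.

```python
# Sample dependent (frequency, severity) pairs through a Gaussian copula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
rho = 0.4                                         # dependence parameter
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=10_000)
u = stats.norm.cdf(z)                             # uniforms = the copula sample

freq = stats.poisson.ppf(u[:, 0], mu=2.0)         # claim counts
sev = stats.gamma.ppf(u[:, 1], a=2.0, scale=500)  # average claim severity

print(f"simulated Spearman corr: {stats.spearmanr(freq, sev).correlation:.2f}")
```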

Keywords: dependency modeling, government insurance, insurance claims, vine copula

Procedia PDF Downloads 172
8933 Building Information Modeling Implementation for Managing an Extra Large Governmental Building Renovation Project

Authors: Pornpote Nusen, Manop Kaewmoracharoen

Abstract:

In recent years, there has been an observable shift in fully developed countries from constructing new buildings to modifying existing buildings. The issue is that although an effective instrument like BIM (Building Information Modeling) is well developed for constructing new buildings, it is not widely used to renovate old buildings. BIM is accepted as an effective means to overcome common managerial problems such as project delay, cost overrun, and poor quality over the project life cycle. It was recently introduced in Thailand and is rarely used in renovation projects. Today, in Thailand, BIM is mostly used for creating aesthetic 3D models and for quantity takeoff purposes, though it can also be an effective project management tool for planning and scheduling. The governmental sector in Thailand is beginning to recognize the uses of BIM for managing construction projects, but knowledge about BIM implementation in governmental construction projects is underdeveloped, and further studies are needed to maximize its advantages for the governmental sector. An educational extra-large governmental building of 17,000 square meters, currently under construction in a two-year renovation project, was used in this research. BIM models of the exterior and interior areas of the building were created for all five floors. Then 4D BIM, combining the 3D BIM with time, was created for planning and scheduling. Three focus groups were conducted with the executive committee, the contractors, and the officers of the building to discuss the possible uses and the usefulness of the BIM approach over the traditional process. Several aspects were discussed positively, especially foreseen problems such as the inadequate accessibility of routes, the altered ceiling levels, the impractical construction plan created through the traditional approach, and the lack of constructability information. However, for some parties the cost of BIM implementation was a concern, though, this study believes, its uses outweigh the cost.

Keywords: building information modeling, extra large building, governmental building renovation, project management, renovation, 4D BIM

Procedia PDF Downloads 122
8932 Improving the Training for Civil Engineers by Introducing Virtual Reality Technique

Authors: Manar Al-Ateeq

Abstract:

The building construction industry plays a major role in the economy of the world and of the State of Kuwait. This paper evaluates the situation of new civil site engineers, describes a new system for improvement, and stresses the importance of prequalifying and developing new engineers. In order for engineers to have a strong base, educational institutes and workplaces should take responsibility for continuously training engineers and updating them on new methods and techniques in engineering. To achieve that, schools of engineering should constantly update the computational resources used in the profession. A survey of graduate engineers was prepared based on stated objectives to understand the status of graduate engineers in both the public and private sectors. Interviews were conducted with different sectors in Kuwait, and several visits were made to training centers within different workplaces in Kuwait to evaluate the training process and try to improve it. Virtual reality (VR) technology can be applied as a complement to three-dimensional (3D) modeling, leading to better communication, whether in job training, in education, or in professional practice. Techniques of 3D modeling and VR can be applied to develop models of the construction process. The 3D models can support rehabilitation design, as they are a great tool for monitoring failures and defects in structures; they can also support decisions based on the visual analysis of alternative solutions. Therefore, teaching computer-aided design (CAD) and VR techniques in school will help prepare engineering students for site work and will also encourage them to consider these technologies as important supports in their later professional practice. This teaching technique will show how construction work develops, allow the visual simulation of the progression of each type of work, and help students learn about the equipment needed for each task and how it works on site.

Keywords: three dimensional modeling (3DM), civil engineers (CE), professional practice (PP), virtual reality (VR)

Procedia PDF Downloads 144
8931 Modeling and Simulation for 3D Eddy Current Testing in Conducting Materials

Authors: S. Bennoud, M. Zergoug

Abstract:

The numerical simulation of electromagnetic interactions is still a challenging problem, especially for problems that result in fully three-dimensional mathematical models. The goal of this work is to use mathematical modeling to characterize the reliability and capacity of the eddy current technique to detect and characterize defects embedded in in-service aeronautical parts. The finite element method is used to describe the eddy current technique in a mathematical model by predicting the eddy current interaction with defects; this model is, however, an approximation of the full Maxwell equations. In this study, the analysis of the problem is based on a three-dimensional finite element model that computes directly the electromagnetic field distortions due to defects.

Keywords: eddy current, finite element method, non-destructive testing, numerical simulations

Procedia PDF Downloads 414
8930 Risk Based Building Information Modeling (BIM) for Urban Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

Building Information Modeling (BIM) is a holistic documentation process for operational visualization, design coordination, estimation, and project scheduling. BIM software defines objects parametrically and is a tool for virtual reality. The primary advantage of implementing BIM is the visual coordination of the building structure and systems such as mechanical, electrical, and plumbing (MEP); BIM also identifies possible conflicts between the building systems. This paper is an attempt to develop a risk-based BIM model that highlights the primary advantages of applying BIM to an urban infrastructure transportation project. It has been observed that about 40% of Architecture, Engineering and Construction (AEC) companies use BIM, but primarily for their outsourced projects. Also, 65% of the respondents agree that BIM will be used quite strongly for future construction projects in India. The 3D models developed with Revit 2015 software would reduce coordination problems among the architects, structural engineers, contractors, and building service providers (MEP). Integration of risk management with BIM would provide enhanced coordination and collaboration and a high probability of successful completion of a complex infrastructure transportation project within the stipulated time and cost frame.

Keywords: building information modeling (BIM), infrastructure transportation, project risk management, underground metro rail

Procedia PDF Downloads 279
8929 An Online 3D Modeling Method Based on a Lossless Compression Algorithm

Authors: Jiankang Wang, Hongyang Yu

Abstract:

This paper proposes a portable online 3D modeling method. The method first utilizes a depth camera to collect data and compresses the depth data using a frame-by-frame lossless compression method; the color image is encoded in the H.264 format. After the cloud receives the color and depth images, a 3D modeling method based on BundleFusion is used to complete the 3D reconstruction. The results of this study indicate that the method is portable, online, and highly efficient, and that it has a wide range of application prospects.
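A minimal sketch of frame-by-frame lossless depth compression is shown below, with zlib standing in for the paper's codec (which is not specified here): each 16-bit depth frame is delta-encoded against the previous frame and deflated, and the decoder reconstructs it exactly.

```python
# Lossless delta + deflate compression of a 16-bit depth frame.
import numpy as np
import zlib

rng = np.random.default_rng(10)
prev = rng.integers(500, 4000, size=(480, 640), dtype=np.uint16)     # depth, mm
curr = (prev + rng.integers(-2, 3, size=prev.shape)).astype(np.uint16)

residual = curr - prev                           # uint16 wrap-around is lossless
packet = zlib.compress(residual.tobytes(), level=6)
print(f"raw {curr.nbytes} B -> compressed {len(packet)} B "
      f"({curr.nbytes / len(packet):.1f}x)")

# decoder: exact reconstruction from the previous frame and the packet
restored = prev + np.frombuffer(zlib.decompress(packet),
                                dtype=np.uint16).reshape(480, 640)
assert np.array_equal(restored, curr)
```

Delta coding works well here because consecutive depth frames are highly similar, so the residual is mostly small values that deflate compresses efficiently.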

Keywords: 3D reconstruction, bundlefusion, lossless compression, depth image

Procedia PDF Downloads 50
8928 Landscape Genetic and Species Distribution Modeling of Date Palm (Phoenix dactylifera L.)

Authors: Masoud Sheidaei, Fahimeh Koohdar

Abstract:

Date palms are economically important trees with high nutritional and medicinal value. More than 400 date palm cultivars are cultivated in many regions of Iran, but no report is available on the landscape genetics and species distribution modeling of these trees in the country. Therefore, the present study provides a detailed insight into the genetic diversity and structure of date palm populations in Iran and investigates the effects of geographical and climatic variables on the structuring of their genetic diversity. We used different computational methods in the study, such as spatial principal components analysis (sPCA), redundancy analysis (RDA), the latent factor mixed model (LFMM), and the Maxent and Dismo species distribution models. We used a combination of different molecular markers for this study. The results showed that both global and local spatial features play an important role in the genetic structuring of date palms, and the genetic regions associated with local adaptation and climatic variables were identified. The effects of climate change on the distribution of these taxa, and the genetic regions adaptive to these changes, will be discussed.

Keywords: adaptive genetic regions, genetic diversity, isolation by distance, populations divergence

Procedia PDF Downloads 75