Search results for: 3D Models

4205 Modeling Methodologies for Optimization and Decision Support on Coastal Transport Information System (Co.Tr.I.S.)

Authors: Vassilios Moussas, Dimos N. Pantazis, Panagiotis Stratakis

Abstract:

The aim of this paper is to present the optimization methodology developed within the framework of a Coastal Transport Information System. The system will be used for the effective design of coastal transportation lines and incorporates subsystems that implement models, tools and techniques to support the design of improved networks. The role of the optimization and decision subsystem is to provide the user with better and optimal scenarios that best fulfil any constraints, goals or requirements posed. The complexity of the problem and the large number of parameters and objectives involved led to the adoption of an evolutionary method (Genetic Algorithms). The problem model and the subsystem structure are presented in detail, and the subsystem's support for simulation is also discussed.
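
The abstract names Genetic Algorithms as the optimization engine but does not give the encoding or objective; purely as an illustration, here is a minimal sketch of a GA selecting a subset of candidate coastal links, where all link costs, coverage values and the penalty weight are invented placeholders, not Co.Tr.I.S. data:

```python
import random

random.seed(42)

# Toy problem: choose a subset of candidate coastal links (binary genome)
# minimising operating cost while penalising unserved coverage.
N_LINKS = 12
LINK_COST = [random.uniform(1.0, 5.0) for _ in range(N_LINKS)]
LINK_COVERAGE = [random.randint(1, 4) for _ in range(N_LINKS)]
DEMAND = 20  # total coverage units required (illustrative)

def fitness(genome):
    cost = sum(c for g, c in zip(genome, LINK_COST) if g)
    coverage = sum(v for g, v in zip(genome, LINK_COVERAGE) if g)
    penalty = 10.0 * max(0, DEMAND - coverage)   # constraint-violation penalty
    return cost + penalty                        # lower is better

def evolve(pop_size=50, generations=200, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_LINKS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                    # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_LINKS)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best scenario:", best, "objective:", round(fitness(best), 2))
```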

Keywords: coastal transport, modeling, optimization

Procedia PDF Downloads 500
4204 Implementation of Green Deal Policies and Targets in Energy System Optimization Models: The TEMOA-Europe Case

Authors: Daniele Lerede, Gianvito Colucci, Matteo Nicoli, Laura Savoldi

Abstract:

The European Green Deal is the first internationally agreed set of measures to counter climate change and environmental degradation. Besides the main target of reducing emissions by at least 55% by 2030, it sets the target of accompanying European countries through an energy transition that makes the European Union a modern, resource-efficient, and competitive net-zero emissions economy by 2050, decoupling growth from the use of resources and ensuring a fair adaptation of all social categories to the transformation process. While the general commitment to realizing the purposes of the Green Deal dates back to 2019, strategies and policies keep being developed to cope with recent circumstances and achievements. However, general long-term measures like the Circular Economy Action Plan, the proposals to shift from fossil natural gas to renewable and low-carbon gases, in particular biomethane and hydrogen, and to end the sale of gasoline and diesel cars by 2035, will all have significant effects on energy supply and demand evolution across the next decades. The interactions between energy supply and demand over long-term time frames are usually assessed via energy system models to derive useful insights for policymaking and to address technological choices and research and development. TEMOA-Europe is a newly developed energy system optimization model instance based on the minimization of the total cost of the system under analysis, adopting a technologically integrated, detailed, and explicit formulation and considering the evolution of the system in partial equilibrium in competitive markets with perfect foresight. TEMOA-Europe is developed on the TEMOA platform, an open-source modeling framework entirely implemented in Python, therefore ensuring third-party verification even of large and complex models. TEMOA-Europe is based on a single-region representation of the European Union and EFTA countries on a time scale between 2005 and 2100, relying on a set of assumptions for socio-economic developments based on projections by the International Energy Outlook and a large technological dataset including 7 sectors: the upstream and power sectors for the production of all energy commodities and the end-use sectors, including industry, transport, residential, commercial and agriculture. TEMOA-Europe also includes an updated hydrogen module considering its production, storage, transportation, and utilization. Besides, it can rely on a wide set of innovative technologies – ranging from nuclear fusion and electricity plants equipped with CCS in the power sector to electrolysis-based steel production processes in the industrial sector, with a techno-economic characterization based on public literature – to produce insightful energy scenarios and especially to cope with the very long analyzed time scale. The aim of this work is to examine in detail the scheme of measures and policies for the realization of the purposes of the Green Deal and to translate them into a set of constraints and new socio-economic development pathways. Based on them, TEMOA-Europe will be used to produce and comparatively analyze scenarios to assess the consequences of Green Deal-related measures on the future evolution of the energy mix over the whole energy system in an economic optimization environment.
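
The abstract describes TEMOA-Europe as a cost-minimizing energy system model implemented in Python. As a drastically simplified illustration of that logic only (not TEMOA's formulation), a two-technology supply problem with an emissions cap can be posed as a linear programme; all costs, emission factors and the cap below are invented numbers:

```python
from scipy.optimize import linprog

# Decision variables: energy produced by a gas plant and a wind plant.
# Minimise total cost subject to meeting demand and a CO2 cap.
cost = [60.0, 75.0]            # unit costs (illustrative)
emission = [0.35, 0.0]         # tCO2 per unit of energy (illustrative)
demand = 1000.0                # energy to be supplied
co2_cap = 150.0                # allowed emissions

# linprog minimises c @ x subject to A_ub @ x <= b_ub
A_ub = [
    [-1.0, -1.0],              # -(gas + wind) <= -demand, i.e. supply >= demand
    emission,                  # emissions <= cap
]
b_ub = [-demand, co2_cap]

res = linprog(c=cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
gas, wind = res.x
print(f"gas: {gas:.1f}, wind: {wind:.1f}, total cost: {res.fun:.0f}")
```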

Keywords: European Green Deal, energy system optimization modeling, scenario analysis, TEMOA-Europe

Procedia PDF Downloads 106
4203 Implementation and Validation of a Damage-Friction Constitutive Model for Concrete

Authors: L. Madouni, M. Ould Ouali, N. E. Hannachi

Abstract:

Two constitutive models for concrete are available in ABAQUS/Explicit, the Brittle Cracking Model and the Concrete Damaged Plasticity Model, and their suitability and limitations are well known. The aim of the present paper is to implement a damage-friction concrete constitutive model and to evaluate its performance by comparing the predicted response with experimental data. The constitutive formulation of this material model is reviewed. In order to obtain consistent results, parameter identification and calibration for the model have been performed. Several numerical simulations are presented in this paper, and their results validate the capability of the proposed model to reproduce the typical nonlinear response of concrete structures under different monotonic and cyclic load conditions. The results of the evaluation will be used for recommendations concerning the application and further improvements of the investigated model.

Keywords: Abaqus, concrete, constitutive model, numerical simulation

Procedia PDF Downloads 365
4202 Simulation of Surge Protection for a Direct Current Circuit

Authors: Pedro Luis Ferrer Penalver, Edmundo da Silva Braga

Abstract:

In this paper, the performance of a simple surge protection circuit for a direct current circuit was simulated. The protection circuit was developed from modified electric macro models of a gas discharge tube and a transient voltage suppressor diode. Moreover, a combination wave generator circuit was used as the source of the energy surges. The simulations showed that the proposed circuit ensures immunity corresponding to test level IV of the IEC 61000-4-5:2014 international standard. The developed circuit can be modified to meet the requirements of any other equipment to be protected. Similarly, the parameters of the combination wave generator can be changed to provide different surge amplitudes.
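
The abstract does not reproduce the generator model; as an illustration only, the 1.2/50 µs open-circuit voltage surge of IEC 61000-4-5 is often approximated by a double-exponential waveform. The sketch below uses commonly quoted approximate constants and a 4 kV peak (the level usually associated with test level IV); it is not the authors' PSpice macro model:

```python
import numpy as np

# Double-exponential approximation of the 1.2/50 us open-circuit voltage surge.
# The constants A, TAU1, TAU2 are approximate/illustrative; the standard itself
# specifies the waveform by front time and time to half value.
V_PEAK = 4000.0                 # 4 kV peak, commonly cited for test level IV
A, TAU1, TAU2 = 1.037, 68.2e-6, 0.405e-6

t = np.linspace(0.0, 100e-6, 2001)
v = A * V_PEAK * (np.exp(-t / TAU1) - np.exp(-t / TAU2))

peak = v.max()
t_half = t[np.where(v >= 0.5 * peak)[0][-1]]   # last time above 50 % of peak
print(f"peak = {peak:.0f} V, time to half value = {t_half * 1e6:.1f} us")
```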

Keywords: combination wave generator, IEC 61000-4-5, Pspice simulation, surge protection

Procedia PDF Downloads 328
4201 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases

Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García

Abstract:

This paper presents an approach to reducing some of the current flaws in the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them and formalizes it as JSON documents. This formal model is lodged in a document-oriented NoSQL database, namely MongoDB, chosen for its advantages in flexibility and efficiency. In addition, the paper underlines the contributions of the detailed approach and shows some applications and benefits for future work in the field of automatic code generation using model-driven engineering tools.
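
To illustrate the kind of formalisation described, here is a minimal sketch of lodging one requirement as a JSON document in MongoDB via pymongo; the database, collection and field names are hypothetical and do not reflect the JREM schema:

```python
from pymongo import MongoClient

# Store a formalised requirement as a JSON/BSON document (illustrative schema).
client = MongoClient("mongodb://localhost:27017/")
collection = client["jrem_demo"]["requirements"]

requirement = {
    "id": "REQ-001",
    "type": "functional",
    "actor": "registered user",
    "action": "reset password",
    "conditions": ["valid email on record"],
    "priority": "high",
    "traces_to": ["UC-07"],
}

collection.insert_one(requirement)
found = collection.find_one({"id": "REQ-001"})
print(found["actor"], "->", found["action"])
```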

Keywords: conceptual modelling, JSON, NoSQL databases, requirements engineering, software development

Procedia PDF Downloads 379
4200 The Use of Computer-Aided Design in Small Contractors in a Local Area of Korea

Authors: Myunghoun Jang

Abstract:

A survey of small-size contractors in Jeju was conducted to investigate college graduates' computer-aided design (CAD) competence. Most small-size contractors use CAD software to review and update drawings submitted by an architect. This research also analyzed the architectural engineering curricula of several national universities. The CAD classes run for 4 or 6 hours per week and primarily use AutoCAD. This paper proposes that a CAD class needs 6 hours per week, that 2D drawing should remain the main theme of the curriculum, and that exercises to build 3D models should also be included in the CAD class. An improved method of evaluating reports and exercise results, for example an Internet cafe format with real-time feedback via smartphones, is also necessary.

Keywords: CAD (Computer Aided Design), CAD education, education improvement, small-size contractor

Procedia PDF Downloads 268
4199 Numerical Investigation of Flow Past in a Staggered Tube Bundle

Authors: Kerkouri Abdelkadir

Abstract:

Numerical calculation of turbulent flows is one of the most prominent modern interests in various engineering applications. Because such flows are difficult to predict and study with computational fluid dynamics (CFD), this paper presents a numerical study of the flow past a staggered tube bundle using the CFD code ANSYS FLUENT with several turbulence models: the k-ε, k-ω and SST approaches. The flow is modeled on the basis of experimental studies. The predictions of mean velocities are in very good agreement with detailed LDA (Laser Doppler Anemometry) measurements performed at 8 stations along the depth of the array. The sizes of the recirculation zones behind the cylinders are also predicted. The simulations are conducted for a Reynolds number of 12858, chosen to match the experimental conditions.

Keywords: flow, tube bundle, ANSYS Fluent, CFD, turbulence, LDA, RANS (k-ε, k-ω, SST)

Procedia PDF Downloads 166
4198 Development of Structural Deterioration Models for Flexible Pavement Using Traffic Speed Deflectometer Data

Authors: Sittampalam Manoharan, Gary Chai, Sanaul Chowdhury, Andrew Golding

Abstract:

The primary objective of this paper is to present a simplified approach to developing a structural deterioration model for flexible pavements using traffic speed deflectometer data. Maintaining assets only to meet functional performance is neither economical nor sustainable in the long term, and it would end up requiring much larger investments from road agencies and extra costs for road users. Performance models therefore have to include both structural and functional predictive capabilities in order to assess the needs, and the time frame of those needs. As such, structural modelling plays a vital role in the prediction of pavement performance. Structural condition is important for the prediction of remaining life and overall health of a road network and also has a major influence on the valuation of road pavements. Therefore, the structural deterioration model is a critical input into a pavement management system for accurately predicting pavement rehabilitation needs. The Traffic Speed Deflectometer (TSD) is a vehicle-mounted Doppler laser system that is capable of continuously measuring the structural bearing capacity of a pavement whilst moving at traffic speeds. The device's high accuracy, high speed, and continuous deflection profiles are useful for network-level applications such as predicting road rehabilitation needs and remaining structural service life. The methodology adopted utilizes time-series TSD maximum deflection (D0) data in conjunction with rutting, rutting progression, pavement age, subgrade strength and equivalent standard axle (ESA) data. Regression analyses were then undertaken to establish a correlation equation of structural deterioration as a function of rutting, pavement age, seal age and equivalent standard axles (ESA). This study developed a simple structural deterioration model which enables available TSD structural data to be incorporated into a pavement management system for developing network-level pavement investment strategies. Therefore, the available funding can be used effectively to minimize the whole-of-life cost of the road asset and also improve pavement performance. This study will contribute to narrowing the knowledge gap in structural data usage in network-level investment analysis and provide a simple methodology to use structural data effectively in the investment decision-making process for road agencies managing aging road assets.
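
As an illustration of the regression step, here is a sketch fitting maximum deflection D0 against rutting, pavement age, seal age and cumulative ESA on synthetic data; the coefficients and value ranges are invented, not the study's network data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic placeholder data standing in for the TSD time series.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(2, 15, n),        # rutting (mm)
    rng.uniform(0, 40, n),        # pavement age (years)
    rng.uniform(0, 15, n),        # seal age (years)
    rng.uniform(0.1, 20, n),      # cumulative ESA (millions)
])
d0 = 150 + 8 * X[:, 0] + 3 * X[:, 1] + 2 * X[:, 2] + 5 * X[:, 3] + rng.normal(0, 20, n)

model = LinearRegression().fit(X, d0)   # D0 as a linear function of the predictors
print("coefficients:", np.round(model.coef_, 2))
print("R^2:", round(model.score(X, d0), 3))
```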

Keywords: adjusted structural number (SNP), maximum deflection (D0), equivalent standard axle (ESA), traffic speed deflectometer (TSD)

Procedia PDF Downloads 151
4197 Analyzing Competitive Advantage of Internet of Things and Data Analytics in Smart City Context

Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue

Abstract:

The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people's behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives as part of the normal design, construction, and operation of cities provides a unique opportunity to improve connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked together, IoT solutions can only create value if the data generated by the IoT devices is analysed properly. By extracting relevant conclusions and actionable insights with established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today's marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise them, and which data to use to differentiate themselves from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company's competitive advantage through smart city solutions. The results of the research provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges these factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of it, can create a competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts which define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.

Keywords: internet of things, data analytics, smart cities, competitive advantage

Procedia PDF Downloads 95
4196 Analysing the Cost of Immigrants to the National Health System in Eastern Macedonia and Thrace

Authors: T. Theodosiou, P. Polychronidou, A. G. Karasavvoglou

Abstract:

In recent years, the number of immigrants in Greece has increased dramatically. Their impact on the National Health System (NHS) has not yet been thoroughly investigated. This paper analyses the cost of immigrants to the NHS hospitals of the region of Eastern Macedonia and Thrace. The data are collected from 2005 to 2011 from five different hospitals and are analysed using linear mixed effects models in order to investigate the effects of nationality and year on the cost of hospitalization and treatment. The results show that patients of Greek nationality generally have a higher mean cost of hospitalization compared to immigrants, and that there is an increasing trend in the cost, except for the year 2010.
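
A minimal sketch of the modelling approach with statsmodels, using synthetic data in place of the hospital records; the hospital labels, cost levels and effects are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the 2005-2011 hospital data: cost explained by
# nationality and year, with a random intercept per hospital.
rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    "hospital": rng.choice(["H1", "H2", "H3", "H4", "H5"], n),
    "nationality": rng.choice(["greek", "immigrant"], n),
    "year": rng.integers(2005, 2012, n),
})
df["cost"] = (
    1500
    + 200 * (df["nationality"] == "greek")   # assumed higher mean cost
    + 50 * (df["year"] - 2005)               # assumed increasing trend
    + rng.normal(0, 300, n)
)

model = smf.mixedlm("cost ~ nationality + year", df, groups=df["hospital"])
result = model.fit()
print(result.summary())
```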

Keywords: cost, Eastern Macedonia and Thrace, immigrants, national health system

Procedia PDF Downloads 245
4195 A Strategic Communication Design Model for Indigenous Knowledge Management

Authors: Dilina Janadith Nawarathne

Abstract:

This article presents the initial development of a communication model (Model_isi) as the means of gathering, preserving and transferring indigenous knowledge in the field of knowledge management. The article first discusses the need for an appropriate complimentary model for indigenous knowledge management which differs from the existing methods and models. Then the paper suggests the newly developed model for indigenous knowledge management which generate as result of blending key aspects of different disciplines, which can be implemented as a complementary approach for the existing scientific method. The paper further presents the effectiveness of the developed method in reflecting upon a pilot demonstration carried out on selected indigenous communities of Sri Lanka.

Keywords: indigenous knowledge management, knowledge transferring, tacit knowledge, research model, asian centric philosophy

Procedia PDF Downloads 480
4194 Evaluating Forecasting Strategies for Day-Ahead Electricity Prices: Insights From the Russia-Ukraine Crisis

Authors: Alexandra Papagianni, George Filis, Panagiotis Papadopoulos

Abstract:

The liberalization of the energy market and the increasing penetration of fluctuating renewables (e.g., wind and solar power) have heightened the importance of the spot market for ensuring efficient electricity supply. This is further emphasized by the EU’s goal of achieving net-zero emissions by 2050. The day-ahead market (DAM) plays a key role in European energy trading, accounting for 80-90% of spot transactions and providing critical insights for next-day pricing. Therefore, short-term electricity price forecasting (EPF) within the DAM is crucial for market participants to make informed decisions and improve their market positioning. Existing literature highlights out-of-sample performance as a key factor in assessing EPF accuracy, with influencing factors such as predictors, forecast horizon, model selection, and strategy. Several studies indicate that electricity demand is a primary price determinant, while renewable energy sources (RES) like wind and solar significantly impact price dynamics, often lowering prices. Additionally, incorporating data from neighboring countries, due to market coupling, further improves forecast accuracy. Most studies predict up to 24 steps ahead using hourly data, while some extend forecasts using higher-frequency data (e.g., half-hourly or quarter-hourly). Short-term EPF methods fall into two main categories: statistical and computational intelligence (CI) methods, with hybrid models combining both. While many studies use advanced statistical methods, particularly through different versions of traditional AR-type models, others apply computational techniques such as artificial neural networks (ANNs) and support vector machines (SVMs). Recent research combines multiple methods to enhance forecasting performance. Despite extensive research on EPF accuracy, a gap remains in understanding how forecasting strategy affects prediction outcomes. While iterated strategies are commonly used, they are often chosen without justification. This paper contributes by examining whether the choice of forecasting strategy impacts the quality of day-ahead price predictions, especially for multi-step forecasts. We evaluate both iterated and direct methods, exploring alternative ways of conducting iterated forecasts on benchmark and state-of-the-art forecasting frameworks. The goal is to assess whether these factors should be considered by end-users to improve forecast quality. We focus on the Greek DAM using data from July 1, 2021, to March 31, 2022. This period is chosen due to significant price volatility in Greece, driven by its dependence on natural gas and limited interconnection capacity with larger European grids. The analysis covers two phases: pre-conflict (January 1, 2022, to February 23, 2022) and post-conflict (February 24, 2022, to March 31, 2022), following the Russian-Ukraine conflict that initiated an energy crisis. We use the mean absolute percentage error (MAPE) and symmetric mean absolute percentage error (sMAPE) for evaluation, as well as the Direction of Change (DoC) measure to assess the accuracy of price movement predictions. Our findings suggest that forecasters need to apply all strategies across different horizons and models. Different strategies may be required for different horizons to optimize both accuracy and directional predictions, ensuring more reliable forecasts.
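
To make the iterated-versus-direct distinction concrete, here is a toy sketch with a simple autoregression; this is not the paper's benchmark or state-of-the-art framework, and the series and lag order are illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def make_lags(y, p):
    # Rows: [y[t-p], ..., y[t-1]] -> target y[t]
    X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
    return X, y[p:]

def iterated_forecast(y, p, h):
    X, target = make_lags(y, p)
    model = LinearRegression().fit(X, target)     # one 1-step model
    window = list(y[-p:])
    preds = []
    for _ in range(h):
        yhat = model.predict([window[-p:]])[0]    # feed predictions back in
        preds.append(yhat)
        window.append(yhat)
    return preds

def direct_forecast(y, p, h):
    preds = []
    for step in range(1, h + 1):                  # one model per horizon
        X = np.column_stack([y[i:len(y) - p - step + 1 + i] for i in range(p)])
        target = y[p + step - 1:]
        model = LinearRegression().fit(X, target)
        preds.append(model.predict([y[-p:]])[0])
    return preds

# Illustrative hourly-like price series with a daily cycle.
prices = 50 + 10 * np.sin(np.arange(300) * 2 * np.pi / 24) + np.random.normal(0, 1, 300)
print("iterated:", np.round(iterated_forecast(prices, p=24, h=6), 2))
print("direct:  ", np.round(direct_forecast(prices, p=24, h=6), 2))
```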

Keywords: short-term electricity price forecast, forecast strategies, forecast horizons, recursive strategy, direct strategy

Procedia PDF Downloads 11
4193 Comparison between Photogrammetric and Structure from Motion Techniques in Processing Unmanned Aerial Vehicles Imageries

Authors: Ahmed Elaksher

Abstract:

Over the last few years, significant progress has been made and new approaches have been proposed for the efficient collection of 3D spatial data from unmanned aerial vehicles (UAVs) at reduced cost compared to imagery from satellites or manned aircraft. In these systems, a low-cost GPS unit provides the position and velocity of the vehicle, a low-quality inertial measurement unit (IMU) determines its orientation, and off-the-shelf cameras capture the images. Structure from Motion (SfM) and photogrammetry are the main tools for 3D surface reconstruction from images collected by these systems. Unlike traditional techniques, SfM allows the computation of calibration parameters using point correspondences across images without performing a rigorous laboratory or field calibration process, and it is more flexible in that it does not require consistent image overlap or the same rotation angles between successive photos. These benefits make SfM ideal for UAV aerial mapping. In this paper, a direct comparison between SfM Digital Elevation Models (DEMs) and those generated through traditional photogrammetric techniques was performed. Data was collected by a 3DR IRIS+ quadcopter with a Canon PowerShot S100 digital camera. Twenty ground control points were randomly distributed on the ground and surveyed with a total station in a local coordinate system. Images were collected from an altitude of 30 meters with a ground resolution of nine mm/pixel. Data was processed with PhotoScan, VisualSFM, Imagine Photogrammetry, and a photogrammetric algorithm developed by the author. The algorithm starts by performing a laboratory camera calibration; the acquired imagery then undergoes an orientation procedure to determine the cameras' positions and orientations. After the orientation is attained, correlation-based image matching is conducted to automatically generate three-dimensional surface models, followed by a refining step using sub-pixel image information for high matching accuracy. Tests with different numbers and configurations of the control points were conducted. Camera calibration parameters estimated from commercial software and those obtained with laboratory procedures were comparable. Exposure station positions agreed to within a few centimeters, and only insignificant differences, within less than three arc-seconds, were found among the orientation angles. DEM differencing was performed between the generated DEMs, and vertical shifts of a few centimeters were found.
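
A sketch of the DEM differencing step on synthetic, co-registered grids; the grid values and offsets are invented, and real rasters would be loaded from the processed DEMs:

```python
import numpy as np

# Two co-registered elevation grids standing in for the SfM and photogrammetric DEMs.
rng = np.random.default_rng(2)
dem_photogrammetry = 100 + rng.normal(0, 0.5, (500, 500))
dem_sfm = dem_photogrammetry + rng.normal(0.02, 0.03, (500, 500))  # few-cm offset

diff = dem_sfm - dem_photogrammetry
print(f"mean vertical shift: {diff.mean() * 100:.1f} cm")
print(f"RMS difference:      {np.sqrt((diff ** 2).mean()) * 100:.1f} cm")
```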

Keywords: UAV, photogrammetry, SfM, DEM

Procedia PDF Downloads 295
4192 Epistemic Uncertainty Analysis of Queue with Vacations

Authors: Baya Takhedmit, Karim Abbas, Sofiane Ouazine

Abstract:

Vacation queues are often employed to model many real situations, such as computer systems, communication networks, manufacturing and production systems, transportation systems and so forth. These queueing models are usually solved at fixed parameter values. However, the parameter values themselves are determined from a finite number of observations and hence have uncertainty associated with them (epistemic uncertainty). In this paper, we consider the M/G/1/N queue with server vacation and exhaustive discipline, where we assume that the vacation parameter values are uncertain. We use the Taylor series expansion approach to estimate the expectation and variance of the model output due to epistemic uncertainties in the model input parameters.
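
The paper's output of interest is an M/G/1/N-with-vacations performance measure; as a generic illustration of the Taylor-series propagation idea only, the sketch below uses a placeholder performance function standing in for that measure:

```python
import numpy as np

# Placeholder performance measure of a queue as a function of an uncertain
# (vacation-rate-like) parameter theta; replace with the actual model output.
def performance(theta, rho=0.7):
    return rho / (1.0 - rho) + 1.0 / (2.0 * theta)

def taylor_moments(f, mu, sigma, h=1e-4):
    f0 = f(mu)
    d1 = (f(mu + h) - f(mu - h)) / (2 * h)            # central 1st derivative
    d2 = (f(mu + h) - 2 * f0 + f(mu - h)) / h ** 2    # central 2nd derivative
    mean = f0 + 0.5 * d2 * sigma ** 2                 # E[f] ~ f(mu) + f''(mu) sigma^2 / 2
    var = d1 ** 2 * sigma ** 2                        # Var[f] ~ f'(mu)^2 sigma^2
    return mean, var

mu_theta, sigma_theta = 2.0, 0.2     # estimated parameter and its epistemic uncertainty
mean, var = taylor_moments(performance, mu_theta, sigma_theta)

# Crude Monte Carlo check of the approximation.
samples = performance(np.random.default_rng(3).normal(mu_theta, sigma_theta, 100_000))
print(f"Taylor: mean={mean:.4f}, var={var:.6f}")
print(f"MC:     mean={samples.mean():.4f}, var={samples.var():.6f}")
```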

Keywords: epistemic uncertainty, M/G/1/N queue with vacations, non-parametric sensitivity analysis, Taylor series expansion

Procedia PDF Downloads 434
4191 Simulation of Wet Scrubbers for Flue Gas Desulfurization

Authors: Anders Schou Simonsen, Kim Sorensen, Thomas Condra

Abstract:

Wet scrubbers are used for flue gas desulfurization by injecting water directly into the flue gas stream from a set of sprayers. The water droplets flow freely inside the scrubber and flow down along the scrubber walls as a thin wall film while reacting with the gas phase to remove SO₂. This complex multiphase phenomenon can be divided into three main contributions: the continuous gas phase, the liquid droplet phase, and the liquid wall film phase. This study proposes a complete model, where all three main contributions are taken into account and resolved using OpenFOAM for the continuous gas phase and MATLAB for the liquid droplet and wall film phases. The 3D continuous gas phase is composed of five species: CO₂, H₂O, O₂, SO₂, and N₂, which are resolved along with momentum, energy, and turbulence. Source terms are present for four species, energy and momentum, which affect the steady-state solution. The liquid droplet phase experiences breakup, collisions, dynamics, internal chemistry, evaporation and condensation, species mass transfer, energy transfer and wall film interactions. Numerous sub-models have been implemented and coupled to realise the above-mentioned phenomena. The liquid wall film experiences impingement, acceleration, atomization, separation, internal chemistry, evaporation and condensation, species mass transfer, and energy transfer, which have all been resolved using numerous sub-models as well. The continuous gas phase has been coupled with the liquid phases through source terms, using an approach in which the two software packages are coupled via a link structure. The complete CFD model has been verified using 16 experimental tests from an existing scrubber installation, where a gradient-based pattern search optimization algorithm has been used to tune numerous model parameters to match the experimental results. The CFD model needed to be fast to evaluate in order to apply this optimization routine, as approximately 1000 simulations were needed. The results show that the complex multiphase phenomena governing wet scrubbers can be resolved in a single model. The optimization routine was able to tune the model to accurately predict the performance of an existing installation. Furthermore, the study shows that a coupling between OpenFOAM and MATLAB is realizable, where the data and source term exchange increases the computational requirements by approximately 5%. This allows for exploiting the benefits of both software programs.

Keywords: desulfurization, discrete phase, scrubber, wall film

Procedia PDF Downloads 268
4190 Attention Problems among Adolescents: Examining Educational Environments

Authors: Zhidong Zhang, Zhi-Chao Zhang, Georgianna Duarte

Abstract:

This study investigated attention problems using the Achenbach System of Empirically Based Assessment (ASEBA). Two thousand eight hundred and ninety-four adolescents were surveyed using a stratified sampling method. We examined the relationships between relevant background variables and attention problems. Multiple regression models were applied to analyze the data. Relevant variables such as sports activities, hobbies, age, grade and the number of close friends were included in this study as predictive variables. The analysis results indicated that educational environments and extracurricular activities are important factors which influence students' attention problems.

Keywords: adolescents, ASEBA, attention problems, educational environments, stratified sampling

Procedia PDF Downloads 286
4189 Aerodynamic Analysis of a Frontal Deflector for Vehicles

Authors: C. Malça, N. Alves, A. Mateus

Abstract:

This work was one of the tasks of the Manufacturing2Client project, whose objective was to develop a frontal deflector to be commercialized in the automotive industry, using new design and manufacturing methods. In this particular task, the goal was to develop the ability to predict computationally the aerodynamic influence of the flow around vehicles, in an effort to reduce fuel consumption in vehicles from class 3 to 8. With this aim, two deflector models were developed and their aerodynamic performance analyzed. The aerodynamic study was done using the Computational Fluid Dynamics (CFD) software Ansys CFX and allowed the calculation of the drag coefficient caused by the vehicle motion for the different configurations considered. Moreover, the reduction of diesel consumption and carbon dioxide (CO2) emissions associated with the optimized deflector geometry could be assessed.

Keywords: aerodynamic analysis, CFD, CO2 emissions, drag coefficient, frontal deflector, fuel consumption

Procedia PDF Downloads 407
4188 Symbolic Computation for the Multi-Soliton Solutions of a Class of Fifth-Order Evolution Equations

Authors: Rafat Alshorman, Fadi Awawdeh

Abstract:

By employing a simplified bilinear method, a class of generalized fifth-order KdV (gfKdV) equations which arise in nonlinear lattices, plasma physics and ocean dynamics is investigated. With the aid of symbolic computation, both solitary wave solutions and multiple-soliton solutions are obtained. These new exact solutions extend previous results and help explain the properties of nonlinear solitary waves in many physical models of shallow water. A parametric analysis is carried out in order to illustrate that the soliton amplitude, width and velocity are affected by the coefficient parameters in the equation.

Keywords: multiple soliton solutions, fifth-order evolution equations, Cole-Hopf transformation, Hirota bilinear method

Procedia PDF Downloads 323
4187 The Experimental and Modeling Adsorption Properties of Sr2+ on Raw and Purified Bentonite

Authors: A. A. Khodadadi, S. C. Ravaj, B. D. Tavildari, M. B. Abdolahi

Abstract:

The adsorption properties of a local bentonite (Semnan, Iran), and of purified bentonite prepared from it, towards Sr2+ were investigated by batch equilibration. The influence of equilibration time, adsorption isotherms, adsorption kinetics, solution pH, and the presence of EDTA and NaCl on these properties was studied and discussed. The kinetic data were found to be well fitted by a pseudo-second-order kinetic model. Sr2+ is preferably adsorbed by both the bentonite and the purified bentonite. The D-R isotherm model fits the experimental data better than the other adsorption isotherm models. The maximum adsorption of Sr2+, corresponding to the highest negative charge density on the surface of the adsorbent, was seen at pH 12. The presence of EDTA and NaCl decreased the amount of Sr2+ adsorbed.
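
As an illustration of a pseudo-second-order fit, here is a sketch with scipy.optimize.curve_fit on invented uptake data, not the reported Sr2+ measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Pseudo-second-order kinetic model: q(t) = qe^2 * k * t / (1 + qe * k * t)
def pso(t, qe, k):
    return qe ** 2 * k * t / (1.0 + qe * k * t)

# Illustrative uptake data (mg/g vs. minutes).
t = np.array([5, 10, 20, 30, 60, 90, 120, 180], dtype=float)
q = np.array([8.1, 12.5, 17.0, 19.2, 22.4, 23.3, 23.8, 24.2])

(qe_fit, k_fit), _ = curve_fit(pso, t, q, p0=[25.0, 0.01])
q_pred = pso(t, qe_fit, k_fit)
r2 = 1 - np.sum((q - q_pred) ** 2) / np.sum((q - q.mean()) ** 2)
print(f"qe = {qe_fit:.2f} mg/g, k = {k_fit:.4f} g/(mg*min), R2 = {r2:.4f}")
```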

Keywords: bentonite, purified bentonite, Sr2+, equilibrium isotherm, kinetics

Procedia PDF Downloads 375
4186 The Impact of Monetary Policy on Aggregate Market Liquidity: Evidence from Indian Stock Market

Authors: Byomakesh Debata, Jitendra Mahakud

Abstract:

The recent financial crisis was characterized by massive monetary policy interventions by the central bank, and it has amplified the importance of liquidity for the stability of the stock market. This paper empirically elucidates the actual impact of monetary policy interventions on stock market liquidity, covering all National Stock Exchange (NSE) stocks that were traded continuously from 2002 to 2015. The present study employs a multivariate VAR model along with VAR-Granger causality tests, impulse response functions, block exogeneity tests, and variance decomposition to analyze the direction as well as the magnitude of the relationship between monetary policy and market liquidity. Our analysis posits a unidirectional relationship between monetary policy (call money rate, base money growth rate) and aggregate market liquidity (traded value, turnover ratio, Amihud illiquidity ratio, turnover price impact, high-low spread). The impulse response function analysis clearly depicts the influence of monetary policy on stock liquidity for every unit innovation in the monetary policy variables. Our results suggest that an expansionary monetary policy increases aggregate stock market liquidity, and the reverse is documented during the tightening of monetary policy. To ascertain whether our findings are consistent across all periods, we divided the period of study into a pre-crisis period (2002 to 2007) and a post-crisis period (2007 to 2015) and ran the same set of models. Interestingly, all liquidity variables are highly significant in the post-crisis period, whereas the pre-crisis period witnessed only moderate predictability of monetary policy. To check the robustness of our results, we ran the same set of VAR models with different monetary policy variables and found similar results. Unlike previous studies, we found that most of the liquidity variables are significant throughout the sample period. This reveals the predictability of monetary policy for aggregate market liquidity. This study contributes to the existing body of literature by documenting a strong predictability of monetary policy for stock liquidity in an emerging economy with an order-driven market making system like India. Most of the previous studies have been carried out in developing economies with quote-driven or hybrid market making systems, and their results are ambiguous across different periods. In an eclectic sense, this study may be considered a baseline study for further identifying the macroeconomic determinants of liquidity of stocks at the individual as well as the aggregate level.
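
A sketch of the VAR workflow in statsmodels on synthetic series standing in for one policy variable and one liquidity measure; the data, lag order and variable names are illustrative, not the NSE data set:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic monthly series: a call-money-rate-like variable and a liquidity proxy.
rng = np.random.default_rng(4)
n = 160
call_rate = np.cumsum(rng.normal(0, 0.1, n)) + 6.0
liquidity = 1.0 - 0.3 * call_rate + np.cumsum(rng.normal(0, 0.05, n))
df = pd.DataFrame({"call_rate": call_rate, "liquidity": liquidity}).diff().dropna()

model = VAR(df)
results = model.fit(maxlags=12, ic="aic")          # lag order chosen by AIC
print(results.summary())

# Granger causality: does the policy variable help predict liquidity?
ctest = results.test_causality("liquidity", ["call_rate"], kind="f")
print("Granger test p-value:", ctest.pvalue)

# Impulse responses of liquidity to a call-rate innovation over 10 periods.
irf = results.irf(10)
print(irf.irfs[:, df.columns.get_loc("liquidity"), df.columns.get_loc("call_rate")])
```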

Keywords: market liquidity, monetary policy, order driven market, VAR, vector autoregressive model

Procedia PDF Downloads 375
4185 Adsorption Studies of Lead from Aqueous Solutions on Coconut Shell Activated Carbon

Authors: G. E. Sharaf El-Deen, S. E. A. Sharaf El-Deen

Abstract:

Activated carbon was prepared from coconut shell (ACS), a discarded agricultural waste, to produce a bioadsorbent through simple and environmentally friendly processes. This activated carbon-based biosorbent was evaluated for the adsorptive removal of lead from water. The characterisation results showed that this biosorbent had a very high specific surface area and surface functional groups. The adsorption equilibrium data were well described by the Langmuir model, whilst the kinetics data were described by pseudo-first-order, pseudo-second-order and intraparticle diffusion models. Overall, the adsorption process could be described by pseudo-second-order kinetics.

Keywords: coconut shell, activated carbon, adsorption isotherm and kinetics, lead removal

Procedia PDF Downloads 308
4184 On Periodic Integer-Valued Moving Average Models

Authors: Aries Nawel, Bentarzi Mohamed

Abstract:

This paper deals with the study of some probabilistic and statistical properties of a Periodic Integer-Valued Moving Average model (PINMA_{S}(q)). Closed forms of the mean, the second moment and the periodic autocovariance function are obtained. Furthermore, the time reversibility of the model is discussed in detail. Moreover, estimates of the underlying parameters are obtained by the Yule-Walker method, the Conditional Least Squares method (CLS) and the Weighted Conditional Least Squares method (WCLS). A simulation study is carried out to evaluate the performance of the estimation methods, and an application to a real data set is provided.

Keywords: periodic integer-valued moving average, periodically correlated process, time reversibility, count data

Procedia PDF Downloads 203
4183 Rheological Evaluation of Various Indigenous Gums

Authors: Yogita Weikey, Shobha Lata Sinha, Satish Kumar Dewangan

Abstract:

In the present investigation, the rheology of three different natural gums has been evaluated experimentally using an MCR 102 rheometer. Samples with varying concentrations of solid gum powder were prepared. Their non-Newtonian behavior was observed through consistency plots and plots of viscosity variation with respect to solid concentration. The viscosity-shear rate curves of the gums are similar, and the behavior is shear thinning: the gums show pseudoplastic behavior. The values of k and n are calculated using various models. The results show that the Herschel–Bulkley rheological model reliably describes the relationship of shear stress as a function of shear rate. R² values are also calculated to support the choice of gum.
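
As an illustration of fitting the Herschel–Bulkley model, tau = tau0 + k * gamma_dot**n, here is a sketch with invented flow-curve data, not the MCR 102 measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Herschel-Bulkley model: yield stress tau0, consistency k, flow index n.
def herschel_bulkley(gamma_dot, tau0, k, n):
    return tau0 + k * gamma_dot ** n

# Illustrative flow curve (shear rate in 1/s, shear stress in Pa).
gamma_dot = np.array([1, 5, 10, 50, 100, 300, 600, 1000], dtype=float)
tau = np.array([2.1, 3.4, 4.2, 7.6, 9.8, 15.2, 19.8, 24.0])

(tau0, k, n), _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=[1.0, 0.5, 0.5])
pred = herschel_bulkley(gamma_dot, tau0, k, n)
r2 = 1 - np.sum((tau - pred) ** 2) / np.sum((tau - tau.mean()) ** 2)
print(f"tau0 = {tau0:.2f} Pa, k = {k:.3f} Pa*s^n, n = {n:.3f}, R2 = {r2:.4f}")
```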

Keywords: bentonite, Indian gum, non-Newtonian model, rheology

Procedia PDF Downloads 310
4182 Closest Possible Neighbor of a Different Class: Explaining a Model Using a Neighbor Migrating Generator

Authors: Hassan Eshkiki, Benjamin Mora

Abstract:

The Neighbor Migrating Generator is a simple and efficient approach to finding the closest potential neighbor(s) with a different label for a given instance, without the need to calibrate any kernel settings at all. This allows determining and explaining the most important features that influence an AI model. It can be used either to migrate a specific sample to the class decision boundary of the original model within a close neighborhood of that sample or to identify global features that can help localise neighboring classes. The proposed technique works by minimizing a loss function that is divided into two components which are independently weighted according to three parameters α, β, and ω, with α being self-adjusting. Results show that this approach is superior to past techniques when detecting the smallest changes in the feature space and may also point out issues in models, such as over-fitting.

Keywords: explainable AI, EX AI, feature importance, counterfactual explanations

Procedia PDF Downloads 195
4181 Air Breakdown Voltage Prediction in Post-arcing Conditions for Compact Circuit Breakers

Authors: Jing Nan

Abstract:

The air breakdown voltage in compact circuit breakers is a critical factor in the design and reliability of electrical distribution systems. This voltage determines the threshold at which the air insulation between conductors will fail or 'break down,' leading to an arc. This phenomenon is highly sensitive to the conditions within the breaker, such as the temperature and the distance between electrodes. Typically, air breakdown voltage models have been reliable for predicting failure under standard operational temperatures. However, in conditions post-arcing, where temperatures can soar above 2000K, these models face challenges due to the complex physics of ionization and electron behaviour at such high-energy states. Building upon the foundational understanding that the breakdown mechanism is initiated by free electrons and propelled by electric fields, which lead to ionization and, potentially, to avalanche or streamer formation, we acknowledge the complexity introduced by high-temperature environments. Recognizing the limitations of existing experimental data, a notable research gap exists in the accurate prediction of breakdown voltage at elevated temperatures, typically observed post-arcing, where temperatures exceed 2000K.To bridge this knowledge gap, we present a method that integrates gap distance and high-temperature effects into air breakdown voltage assessment. The proposed model is grounded in the physics of ionization, accounting for the dynamic behaviour of free electrons which, under intense electric fields at elevated temperatures, lead to thermal ionization and potentially reach the threshold for streamer formation as Meek's criterion. Employing the Saha equation, our model calculates equilibrium electron densities, adapting to the atmospheric pressure and the hot temperature regions indicative of post-arc temperature conditions. Our model is rigorously validated against established experimental data, demonstrating substantial improvements in predicting air breakdown voltage in the high-temperature regime. This work significantly improves the predictive power for air breakdown voltage under conditions that closely mimic operational stressors in compact circuit breakers. Looking ahead, the proposed methods are poised for further exploration in alternative insulating media, like SF6, enhancing the model's utility for a broader range of insulation technologies and contributing to the future of high-temperature electrical insulation research.
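
As an illustration of the Saha-equation step described above, here is a sketch computing the equilibrium ionisation fraction of a single, nitrogen-like species at atmospheric pressure; the ionisation energy and partition-function ratio are rough illustrative values, and the paper's model is more complete:

```python
import numpy as np
from scipy.constants import k as k_B, m_e, h, e, atm

# Single-ionisation Saha equation; constants below are illustrative.
E_ion = 14.5 * e          # ionisation energy in J (~14.5 eV, nitrogen-like)
g_ratio = 2.0             # 2 * g_ion / g_neutral, taken as a constant here

def saha_ionisation_fraction(T, p=atm):
    n_total = p / (k_B * T)                                   # ideal-gas number density
    S = g_ratio * (2 * np.pi * m_e * k_B * T / h ** 2) ** 1.5 \
        * np.exp(-E_ion / (k_B * T))                          # n_e * n_i / n_neutral
    # With n_e = n_i and n_neutral = n_total - n_e:  x^2 + a*x - a = 0, a = S/n_total
    a = S / n_total
    return (-a + np.sqrt(a * a + 4 * a)) / 2                  # x = n_e / n_total

for T in (2000, 5000, 10000, 20000):
    print(f"T = {T:6d} K  ->  ionisation fraction ~ {saha_ionisation_fraction(T):.3e}")
```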

Keywords: air breakdown voltage, high-temperature insulation, compact circuit breakers, electrical discharge, Saha equation

Procedia PDF Downloads 84
4180 Forecasting Market Share of Electric Vehicles in Taiwan Using Conjoint Models and Monte Carlo Simulation

Authors: Li-hsing Shih, Wei-Jen Hsu

Abstract:

Recently, the sale of electric vehicles (EVs) has increased dramatically due to maturing technology development and decreasing cost. Governments of many countries have made regulations and policies in favor of EVs due to their long-term commitment to net-zero carbon emissions. However, due to uncertain factors such as the future price of EVs, forecasting the future market share of EVs is a challenging subject for both the auto industry and local government. This study forecasts the market share of EVs using conjoint models and Monte Carlo simulation. The research is conducted in three phases. (1) A conjoint model is established to represent the customer preference structure for purchasing vehicles, with five product attributes selected for both EVs and internal combustion engine vehicles (ICEVs). A questionnaire survey is conducted to collect responses from Taiwanese consumers and estimate the part-worth utility functions of all respondents. The resulting part-worth utility functions can be used to estimate the market share, assuming each respondent will purchase the product with the highest total utility. For example, given the attribute values of an ICEV and a competing EV, the two total utilities of the two vehicles are calculated for a respondent, which determines his/her choice. Once the choices of all respondents are known, an estimate of market share can be obtained. (2) Among the attributes, future price is the key attribute that dominates consumers' choice. This study adopts the assumption of a learning curve to predict the future price of EVs. Based on the learning curve method and past price data of EVs, a regression model is established and the probability distribution function of the price of EVs in 2030 is obtained. (3) Since the future price is a random variable from the results of phase 2, a Monte Carlo simulation is then conducted to simulate the choices of all respondents using their part-worth utility functions. For instance, using one thousand generated future prices of an EV together with other forecasted attribute values of the EV and an ICEV, one thousand market shares can be obtained with a Monte Carlo simulation. The resulting probability distribution of the market share of EVs provides more information than a single-number forecast, reflecting the uncertain nature of the future development of EVs. The research results can help the auto industry and local government make more appropriate decisions and future action plans.
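
A toy version of phase (3): propagating an assumed 2030 EV price distribution through invented part-worth utilities to a distribution of market share; all utilities, price units and distributions are placeholders, not the survey estimates:

```python
import numpy as np

rng = np.random.default_rng(5)
n_respondents, n_draws = 300, 1000

# Part-worth "base" utilities for EV vs. ICEV (all attributes except price),
# and a respondent-specific price sensitivity (negative coefficient).
base_ev = rng.normal(2.0, 1.5, n_respondents)
base_icev = rng.normal(2.5, 1.0, n_respondents)
price_coef = -rng.uniform(0.5, 2.0, n_respondents)   # utility per price unit (illustrative)

icev_price = 0.8                                     # assumed 2030 ICEV price
ev_price_draws = rng.normal(0.9, 0.15, n_draws)      # learning-curve-based price distribution

shares = []
for p_ev in ev_price_draws:
    u_ev = base_ev + price_coef * p_ev
    u_icev = base_icev + price_coef * icev_price
    shares.append(np.mean(u_ev > u_icev))            # first-choice (highest utility) rule
shares = np.array(shares)

print(f"EV share: mean {shares.mean():.1%}, 5th-95th percentile "
      f"{np.percentile(shares, 5):.1%}-{np.percentile(shares, 95):.1%}")
```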

Keywords: conjoint model, electrical vehicle, learning curve, Monte Carlo simulation

Procedia PDF Downloads 70
4179 Mathematical Modelling of Different Types of Body Support Surface for Pressure Ulcer Prevention

Authors: Mahbub C. Mishu, Venktesh N. Dubey, Tamas Hickish, Jonathan Cole

Abstract:

Pressure ulcers are a common problem for today's healthcare industry. They occur due to external load applied to the skin. When the subject is immobile for a long period of time and a continuous load is applied to a particular area of the body, blood flow is reduced and, as a result, a pressure ulcer develops. The body support surface has a significant role in preventing ulceration, so it is important to know the characteristics of the support surface under loading conditions. In this paper we present mathematical models of different types of viscoelastic materials and show the validation of our simulation results against experiments.

Keywords: pressure ulcer, viscoelastic material, mathematical model, experimental validation

Procedia PDF Downloads 311
4178 Sustainability Effect of Informality and Globalisation: Capturing Spatial Spillovers and Threshold Effects in African and European Economies

Authors: Segun Thompson Bolarinwa, Munacinga Simatele, Adedamola Victoria Adegbuyi

Abstract:

Using the World Bank's nascent measure of sustainability, this paper examines the relationship between informality and sustainability in 7 selected African and 7 European developing economies. Specifically, the work examines the role of informality in sustainability, the interactive effect of globalisation in the nexus, and the threshold of informality for sustainability, using spatial econometric and dynamic panel threshold models. Overall, the results indicate positive and negative effects of informality on sustainability in Africa and Europe, respectively. Recommendations are presented.

Keywords: spatial and dynamic, informality, Africa, Europe, globalisation, sustainability

Procedia PDF Downloads 24
4177 Methods for Business Process Simulation Based on Petri Nets

Authors: K. Shoylekova, K. Grigorova

Abstract:

Petri nets are the first standard for business process modeling. This is most probably one of the core reasons why all new standards created afterwards have to be transformed so that they can be mapped onto Petri nets. The paper presents a business process repository based on a universal database. The repository allows the data about a given process to be stored in three different ways. The business process repository is developed with the transformation of a given model into a Petri net in mind, so that the model can be easily simulated. Two different techniques for business process simulation based on Petri nets - Yasper and Woflan - are discussed, and their advantages and drawbacks are outlined. The way of simulating business process models stored in the business process repository is shown.
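
As an illustration of how a stored model can be executed as a Petri net, here is a minimal token-game simulator; the example net and place/transition names are invented, and Yasper and Woflan are far more capable tools:

```python
import random

# A Petri net as places with token counts and transitions with weighted
# input/output places; a transition is enabled when every input place holds
# enough tokens, and firing moves tokens to the output places.
places = {"received": 1, "in_review": 0, "approved": 0, "rejected": 0}
transitions = {
    # name: (inputs, outputs) as {place: weight}
    "start_review": ({"received": 1}, {"in_review": 1}),
    "approve":      ({"in_review": 1}, {"approved": 1}),
    "reject":       ({"in_review": 1}, {"rejected": 1}),
}

def enabled(marking, t):
    inputs, _ = transitions[t]
    return all(marking[p] >= w for p, w in inputs.items())

def fire(marking, t):
    inputs, outputs = transitions[t]
    for p, w in inputs.items():
        marking[p] -= w
    for p, w in outputs.items():
        marking[p] += w

random.seed(0)
marking = dict(places)
while True:
    candidates = [t for t in transitions if enabled(marking, t)]
    if not candidates:
        break                       # dead marking: no transition enabled
    t = random.choice(candidates)   # non-deterministic choice of enabled transition
    fire(marking, t)
    print(f"fired {t:13s} -> {marking}")
```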

Keywords: business process repository, petri nets, simulation, Woflan, Yasper

Procedia PDF Downloads 371
4176 Analysing Anime as the Narration of Resistance: Case Study of Japanese Vampire Anime

Authors: Patrycja Pichnicka

Abstract:

Anime is the Japanese art of animation and a kind of Japanese animated movie, distinguished from Western ones by its specific features. In a world dominated by live-action movies, mostly those produced in the United States, Japanese animated movies, which constitute a large part of the Japanese movie industry, play the role of the Other. They adapt elements of Western culture and technology to create something that resists global Western domination. This phenomenon is particularly interesting to observe in the case of narrations borrowed from Western culture yet transformed in a specific manner, such as the Vampire Narration. The phenomenon should be examined using the theory of cultural adaptation of Siergiei Arutiunow, as well as the theory of cultural hegemony and postcolonial theories, including the theory of the discourse of resistance. Relations of cultural hegemony and resistance have been mentioned in the works of Susan Napier; however, they deserve to be fully developed. Anime's relations to the globally dominating culture reveal a tension between submission and resistance in which a non-Western identity is constructed and performed. Nonetheless, the tension between the Global/Western and the Japanese is not the only one existing in contemporary Japanese society and culture. Sexual, gender, class, and ethnic issues are also expressed in and through pop culture narrations. Using the basic division of the types of cultural adaptation, we can trace the evolution of Japanese cultural attitudes towards the West, expressed in the Vampire Narration from the time of the American occupation until now. These attitudes changed from submissive assimilation or reproduction of cultural models, through simple opposition, to the more nuanced attitude of today. However, according to Kimberlé Crenshaw's intersectional theory, there is no single category of discrimination or submission. There are individuals or groups existing at the crossing of two or more categories of emancipation. If the Japanese were culturally subdued to the Westerner, the Japanese woman was doubly subdued: as a woman and as a Japanese. The emancipation of one group can deepen the submission of another, of an internal Other, of the group in which two or more categories of domination/submission intersect. That is why some Japanese female authors enthusiastically reproduce Western cultural models, even if this means a cultural hegemony of the West over the Japanese. They see, as women, more liberal attitudes towards their gender in Western culture than in Japanese culture as it is constructed and produced by Japanese men. Japanese anime is a realm in which sophisticated art meets social tendencies and cultural attitudes. Examining anime permits the study of the composite contemporary Japanese identity, as well as general rules of cultural relations.

Keywords: anime, cultural hegemony, intercultural relations, resistance, vampire narration

Procedia PDF Downloads 143