Search results for: NARX (Nonlinear Autoregressive Exogenous Model)
14436 Performance Evaluation of Al Jame’s Roundabout Using SIDRA
Authors: D. Muley, H. S. Al-Mandhari
Abstract:
This paper evaluates the performance of a multi-lane, four-legged modern roundabout operating in Muscat using the SIDRA model. The performance measures include Degree of Saturation (DOS), average delay, and queue lengths. Geometric and traffic data were used for model preparation, and the gap acceptance parameters, critical gap and follow-up headway, were used for calibration of the SIDRA model. The results of the analysis showed that the roundabout currently experiences delays of up to 610 seconds with a DOS of 1.67 during the peak hour. Further, a sensitivity analysis for general and roundabout parameters, covering lane width, cruise speed, inscribed diameter, entry radius, and entry angle, showed that inscribed diameter is the most crucial factor affecting delay and DOS. Upgrading the roundabout to a fully signalized junction was found to be the suitable solution; it will serve future years at LOS C, with a DOS of 0.9 and an average control delay of 51.9 seconds per vehicle for the design year.
Keywords: performance analysis, roundabout, sensitivity analysis, SIDRA
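SIDRA itself is proprietary, but the gap-acceptance quantities named in the abstract (critical gap, follow-up headway, degree of saturation) can be illustrated with a textbook Siegloch-style entry capacity calculation. This is a hedged sketch of the general gap-acceptance approach, not SIDRA's actual model, and all traffic volumes below are hypothetical.

```python
import math

def entry_capacity_vph(circulating_flow_vph, critical_gap_s, follow_up_s):
    """Siegloch-style gap-acceptance entry capacity (veh/h), a textbook
    stand-in for SIDRA's proprietary capacity model:
    C = (3600 / tf) * exp(-qc * (tc - tf/2) / 3600)."""
    qc, tc, tf = circulating_flow_vph, critical_gap_s, follow_up_s
    return (3600.0 / tf) * math.exp(-qc * (tc - tf / 2.0) / 3600.0)

def degree_of_saturation(entry_flow_vph, capacity_vph):
    """DOS = demand flow / capacity; values above 1.0 mean oversaturation."""
    return entry_flow_vph / capacity_vph

# Hypothetical peak-hour figures for one entry leg.
cap = entry_capacity_vph(circulating_flow_vph=1200,
                         critical_gap_s=4.0, follow_up_s=2.5)
dos = degree_of_saturation(entry_flow_vph=1500, capacity_vph=cap)
```

A DOS well above 1.0, as computed here, corresponds to the oversaturated conditions (long delays, growing queues) the abstract reports.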
Procedia PDF Downloads 384
14435 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances
Authors: Violeta Damjanovic-Behrendt
Abstract:
This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called the Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naive Q-Learning. Naive Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques, in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game theory (GT) and ML algorithms to discover optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
Keywords: security, internet of things, cloud computing, stackelberg game, machine learning, naive q-learning
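The Q-learning component can be sketched with a minimal tabular loop. This is a generic illustration of model-free Q-learning for a defender agent, not the paper's actual SSG/SHARP implementation; the states, actions, reward function, and transition function below are toy stand-ins.

```python
import random

def q_learning_defender(states, actions, reward_fn, transition_fn,
                        episodes=500, alpha=0.1, gamma=0.9, epsilon=0.2):
    """Minimal tabular Q-learning: epsilon-greedy action choice plus the
    standard update Q(s,a) += alpha * (r + gamma * max_b Q(s',b) - Q(s,a))."""
    q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s = random.choice(states)
        for _ in range(20):  # bounded episode length
            a = (random.choice(actions) if random.random() < epsilon
                 else max(actions, key=lambda x: q[(s, x)]))
            r, s_next = reward_fn(s, a), transition_fn(s, a)
            best_next = max(q[(s_next, b)] for b in actions)
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s_next
    return q

# Toy security setting: the defender is rewarded for patching the asset
# currently under attack; transitions are random attacker moves.
random.seed(0)
states, actions = ["asset0", "asset1"], ["patch0", "patch1"]
reward = lambda s, a: 1.0 if a == "patch" + s[-1] else -1.0
step = lambda s, a: random.choice(states)
q = q_learning_defender(states, actions, reward, step)
```

After training, the learned Q-values rank the matching defensive action highest in each state, which is the "optimal security policy" idea in miniature.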
Procedia PDF Downloads 356
14434 Integration of Big Data to Predict Transportation for Smart Cities
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
An intelligent transportation system is essential to build smarter cities. Machine learning based transportation prediction is a highly promising approach, as it makes invisible aspects of the city visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology; among urban transportation systems, it focuses on the bus system. The research problem is that the existing headway model cannot respond to dynamic transportation conditions, so bus delays often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built on real-time bus data gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time operating information of buses). The prototype model was designed with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase the prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on this analysis method, the research represents an effective use of machine learning and urban big data to understand urban dynamics.
Keywords: big data, machine learning, smart city, social cost, transportation network
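The "interval pattern model" idea can be illustrated by averaging historical delays per condition pattern. This is a deliberately simple stand-in for the RapidMiner workflow described in the abstract; the field names and records are hypothetical.

```python
from collections import defaultdict

def fit_delay_patterns(records):
    """Average observed delay per (hour, weather, road-speed band) pattern.
    Road speed is bucketed into 20 km/h bands so similar conditions share
    a pattern."""
    sums = defaultdict(lambda: [0.0, 0])
    for r in records:
        key = (r["hour"], r["weather"], r["road_speed_kmh"] // 20)
        sums[key][0] += r["delay_min"]
        sums[key][1] += 1
    return {k: total / n for k, (total, n) in sums.items()}

def predict_delay(patterns, hour, weather, road_speed_kmh, default=0.0):
    """Look up the mean delay for the matching pattern; fall back to a
    default when the condition has never been observed."""
    return patterns.get((hour, weather, road_speed_kmh // 20), default)

# Hypothetical historical observations.
history = [
    {"hour": 8, "weather": "rain", "road_speed_kmh": 25, "delay_min": 7.0},
    {"hour": 8, "weather": "rain", "road_speed_kmh": 30, "delay_min": 9.0},
    {"hour": 14, "weather": "clear", "road_speed_kmh": 55, "delay_min": 1.0},
]
patterns = fit_delay_patterns(history)
```

A query for a rainy 8 a.m. trip at 28 km/h falls into the same band as the two rainy records and returns their mean delay.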
Procedia PDF Downloads 263
14433 Thermal Comfort Evaluation in an Office Space Based on PMV-PPD Model
Authors: Kaoutar Jraida
Abstract:
Growing evidence demonstrates that thermal conditions in office buildings broadly influence the productivity of workers. The purpose of this study is to evaluate and analyze the indoor thermal comfort of an office space based on the calculation of the predicted mean vote and predicted percentage dissatisfied (PMV-PPD) model and a field survey.
Keywords: Office, Predicted Mean Vote (PMV), Percentage People Dissatisfied (PPD), Thermal comfort
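The PPD side of the PMV-PPD model is a closed-form function of the PMV. The relation below is the standard one from ISO 7730 (the full PMV calculation from air temperature, humidity, clothing, and metabolic rate is longer and omitted here).

```python
import math

def ppd_from_pmv(pmv):
    """Predicted Percentage of Dissatisfied from the Predicted Mean Vote,
    per the standard ISO 7730 relation:
    PPD = 100 - 95 * exp(-(0.03353*PMV^4 + 0.2179*PMV^2))."""
    return 100.0 - 95.0 * math.exp(-(0.03353 * pmv**4 + 0.2179 * pmv**2))
```

At PMV = 0 (thermal neutrality) the curve bottoms out at 5% dissatisfied, and it rises symmetrically for warm (positive) and cool (negative) votes.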
Procedia PDF Downloads 197
14432 The Role of Institutional Quality and Institutional Quality Distance on Trade: The Case of Agricultural Trade within the Southern African Development Community Region
Authors: Kgolagano Mpejane
Abstract:
The study applies a New Institutional Economics (NIE) analytical framework to trade in developing economies by assessing the impacts of institutional quality and institutional quality distance on agricultural trade, using panel data for 15 Southern African Development Community (SADC) countries from 1991 to 2010. The issue of institutions in agricultural trade has not been accorded the necessary attention in the literature, particularly for developing economies. Therefore, the paper empirically tests the gravity model of international trade by measuring the impact of political, economic, and legal institutions on intra-SADC agricultural trade. The gravity model is noted for its explanatory power and strong theoretical foundation. However, the model has statistical shortcomings in dealing with zero trade values and heteroscedastic residuals, leading to biased results. Therefore, this study employs a two-stage Heckman selection model with a Probit equation to estimate the influence of institutions on agricultural trade. The selection stages include the inverse Mills ratio to account for the variable bias of the gravity model. The Heckman model accounts for zero trade values and is robust in the presence of heteroscedasticity. The empirical results of the study support the NIE theory premise that institutions matter in trade. The results demonstrate that institutions determine bilateral agricultural trade on different margins, with political institutions having a positive and significant influence on bilateral agricultural trade flows within the SADC region. Legal and economic institutions have significant and negative effects on SADC trade. Furthermore, the results confirm that institutional quality distance influences agricultural trade: legal and political institutional distance have a positive and significant influence on bilateral agricultural trade, while the influence of economic institutional quality distance is negative and insignificant.
The results imply that non-trade barriers, in the form of institutional quality and institutional quality distance, are significant factors limiting intra-SADC agricultural trade. Therefore, gains from intra-SADC agricultural trade can be attained through the improvement of institutions within the region.
Keywords: agricultural trade, institutions, gravity model, SADC
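The key correction term in the Heckman second stage is the inverse Mills ratio computed from the first-stage Probit. It can be evaluated with only the standard normal density and distribution functions; this is the textbook construction, not the paper's specific estimates.

```python
import math

def standard_normal_pdf(z):
    """phi(z): standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def standard_normal_cdf(z):
    """Phi(z): standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def inverse_mills_ratio(z):
    """lambda(z) = phi(z) / Phi(z): the regressor added in the second
    (outcome) stage of a Heckman selection model to correct for the
    zero-trade selection bias described in the abstract; z is the fitted
    index from the first-stage Probit."""
    return standard_normal_pdf(z) / standard_normal_cdf(z)
```

In practice, each country pair's Probit index is plugged into `inverse_mills_ratio`, and the resulting column is appended to the gravity equation's regressors.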
Procedia PDF Downloads 149
14431 AI-Enabled Smart Contracts for Reliable Traceability in the Industry 4.0
Authors: Harris Niavis, Dimitra Politaki
Abstract:
Thanks to advances in the ICT sector, the manufacturing industry has been collecting vast amounts of data for monitoring product quality, and dedicated IoT infrastructure is deployed to track and trace the production line. However, industries have not yet managed to unleash the full potential of these data due to defective data collection methods and untrusted data storage and sharing. Blockchain is gaining increasing ground as a key technology enabler for Industry 4.0 and the smart manufacturing domain, as it enables the secure storage and exchange of data between stakeholders. On the other hand, AI techniques are increasingly used to detect anomalies in batch and time-series data, enabling the identification of unusual behaviors. The proposed scheme is based on smart contracts to enable automation and transparency in the data exchange, coupled with anomaly detection algorithms to enable reliable data ingestion in the system. Before sensor measurements are fed to the blockchain component and the smart contracts, the anomaly detection mechanism combines artificial intelligence models to effectively detect unusual values, such as outliers and extreme deviations, in the incoming data. Specifically, Autoregressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM) and dense autoencoders, as well as Generative Adversarial Network (GAN) models, are used to detect both point and collective anomalies. Towards the goal of preserving the privacy of industries' information, the smart contracts employ techniques to ensure that only anonymized pointers to the actual data are stored on the ledger, while sensitive information remains off-chain. In the same spirit, blockchain technology guarantees the security of the data storage through strong cryptography, as well as the integrity of the data through the decentralization of the network and the execution of the smart contracts by the majority of the blockchain network actors.
The blockchain component of the Data Traceability Software is based on the Hyperledger Fabric framework, which lays the ground for the deployment of smart contracts and APIs that expose the functionality to end-users. The results of this work demonstrate that such a system can increase the quality of the end-products and the trustworthiness of the monitoring process in the smart manufacturing domain. The proposed AI-enabled data traceability software can be employed by industries to accurately trace and verify records about quality through the entire production chain and to take advantage of the multitude of monitoring records in their databases.
Keywords: blockchain, data quality, Industry 4.0, product quality
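The gating step before ledger ingestion can be illustrated with a much simpler point-anomaly detector than the ARIMA/LSTM/autoencoder/GAN ensemble the abstract describes: a z-score filter on sensor readings. This is a hedged stand-in for the statistical role those models play (flagging values before they reach the smart contracts), with hypothetical readings.

```python
from statistics import mean, stdev

def point_anomalies(series, threshold=2.0):
    """Return indices of readings whose deviation from the series mean
    exceeds threshold * (sample standard deviation) -- a simple stand-in
    for the point-anomaly detectors in the ensemble. Collective anomalies
    would need a windowed variant and are not sketched here."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mu) > threshold * sigma]

# Hypothetical temperature sensor stream with one spurious spike.
readings = [20.1, 20.3, 19.9, 20.0, 20.2, 35.0, 20.1, 20.0]
flags = point_anomalies(readings)
```

Flagged indices would be withheld (or annotated) before their anonymized pointers are committed on-chain.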
Procedia PDF Downloads 191
14430 Soap Film Enneper Minimal Surface Model
Authors: Yee Hooi Min, Mohdnasir Abdul Hadi
Abstract:
A tensioned membrane structure in the form of an Enneper minimal surface can be considered a sustainable development for the green environment and technology; it can also support the effective use of energy and of the structure itself. A soap film in the form of the Enneper minimal surface model has been studied. The combination of shape and internal forces for the purpose of stiffness and strength is an important feature of a membrane surface. For this purpose, form-finding using a soap film model has been carried out for Enneper minimal surface models with the variables u = v = 0.6 and u = v = 1.0. These Enneper soap film models provide an alternative choice for structural engineers considering tensioned membrane structures in the form of an Enneper minimal surface in the building industry, and the form is expected to become an alternative for designers to consider.
Keywords: Enneper, minimal surface, soap film, tensioned membrane structure
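The Enneper surface the soap film approximates has a well-known closed-form parametrization, so the two study cases (u = v = 0.6 and u = v = 1.0) can be generated directly. The grid-sampling helper below is illustrative; only the parametrization itself is standard.

```python
def enneper_point(u, v):
    """Classical Enneper minimal-surface parametrization:
    x = u - u^3/3 + u*v^2,  y = v - v^3/3 + v*u^2,  z = u^2 - v^2."""
    x = u - u**3 / 3.0 + u * v**2
    y = v - v**3 / 3.0 + v * u**2
    z = u**2 - v**2
    return x, y, z

def enneper_patch(u_max, v_max, n=5):
    """Sample an (n+1) x (n+1) grid over [-u_max, u_max] x [-v_max, v_max],
    mirroring the u = v = 0.6 and u = v = 1.0 models in the abstract."""
    pts = []
    for i in range(n + 1):
        for j in range(n + 1):
            u = -u_max + 2.0 * u_max * i / n
            v = -v_max + 2.0 * v_max * j / n
            pts.append(enneper_point(u, v))
    return pts
```

Such sampled patches are the usual starting geometry for form-finding comparisons against the physical soap-film shape.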
Procedia PDF Downloads 558
14429 User-Perceived Quality Factors for Certification Model of Web-Based System
Authors: Jamaiah H. Yahaya, Aziz Deraman, Abdul Razak Hamdan, Yusmadi Yah Jusoh
Abstract:
One of the most essential issues for software products is maintaining their relevancy to the dynamics of users' requirements and expectations. Many studies have been carried out on the quality aspect of software products to overcome these problems, and previous software quality assessment models and metrics have been introduced, each with strengths and limitations. In order to enhance the assurance and buoyancy of software products, certification models have been introduced and developed. From our previous experience in certification exercises and case studies conducted in collaboration with several agencies in Malaysia, the requirement for a user-based software certification approach was identified and demanded. The emergence of social network applications, new development approaches such as agile methods, and the variety of software on the market have led to the domination of users over the software. As software becomes more accessible to the public through internet applications, users are becoming more critical of the quality of the services provided by the software. There are several categories of users in web-based systems, with different interests and perspectives. The classifications and metrics were identified through a brainstorming approach that included researchers, users, and experts in this area. This new paradigm in software quality assessment is the main focus of our research. This paper discusses the classifications of users in web-based software system assessment and their associated factors and metrics for quality measurement. The quality model is derived based on the IEEE structure and the FCM model. The developments are beneficial and valuable for overcoming the constraints on, and improving the application of, the software certification model in the future.
Keywords: software certification model, user centric approach, software quality factors, metrics and measurements, web-based system
Procedia PDF Downloads 406
14428 Soil-Structure Interaction in Stiffness and Strength Degrading Systems
Authors: Enrique Bazan-Zurita, Sittipong Jarernprasert, Jacobo Bielak
Abstract:
We study the effects of soil-structure interaction (SSI) on the inelastic seismic response of a single-degree-of-freedom system whose hysteretic behaviour exhibits stiffness and/or strength degrading characteristics. Two sets of accelerograms are used as seismic input: the first comprising 87 records from stiff to medium-stiff sites in California, and the second comprising 66 records from the soft lakebed of Mexico City. This study focuses on three seismic response parameters: ductility demand, inter-story drift, and total lateral displacement. The results allow quantitative estimates of the changes in these parameters in an SSI system in comparison with those of the associated fixed-base system. We found that degrading features significantly affect both the response of fixed-base structures and the impact of soil-structure interaction. We propose a procedure to incorporate the results of this and similar studies into seismic design regulations for SSI systems with anticipated nonlinear degrading behaviour.
Keywords: inelastic, seismic, building, foundation, interaction
Procedia PDF Downloads 287
14427 Mathematical Model for Progressive Phase Distribution of Ku-band Reflectarray Antennas
Authors: M. Y. Ismail, M. Inam, A. F. M. Zain, N. Misran
Abstract:
Progressive phase distribution is an important consideration in reflectarray antenna design, being required to form a planar wave in front of the reflectarray aperture. This paper presents a detailed mathematical model for determining the required reflection phase values of the individual elements of a reflectarray designed in the Ku-band frequency range. The proposed technique for obtaining the reflection phase can be applied to any geometrical design of the elements and is independent of the number of array elements. Moreover, the model also handles reflectarray antenna designs with both centre- and offset-feed configurations. The theoretical modeling has also been implemented for reflectarrays constructed on 0.508 mm thick substrates of different dielectric materials. The results show an increase in the slope of the phase curve from 4.61°/mm to 22.35°/mm as the material properties are varied.
Keywords: mathematical modeling, progressive phase distribution, reflectarray antenna, reflection phase
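The standard reflectarray phase equation that such models build on compensates the spatial delay from the feed to each element; the sketch below implements that textbook relation, not the paper's specific derivation, and the element positions, feed location, and 12 GHz Ku-band frequency are illustrative assumptions.

```python
import math

def required_phase_deg(x, y, feed_pos, theta_b_deg=0.0, phi_b_deg=0.0,
                       freq_hz=12e9):
    """Textbook reflectarray design equation:
    phi_R = k0 * (d_i - (x*cos(phi_b) + y*sin(phi_b)) * sin(theta_b)), mod 360,
    where d_i is the feed-to-element distance and (theta_b, phi_b) is the
    desired beam direction. Centre or offset feeds are both handled through
    feed_pos = (fx, fy, fz) in metres."""
    c = 299_792_458.0
    k0 = 2.0 * math.pi * freq_hz / c                    # free-space wavenumber
    fx, fy, fz = feed_pos
    d_i = math.sqrt((x - fx)**2 + (y - fy)**2 + fz**2)  # feed-element distance
    theta_b = math.radians(theta_b_deg)
    phi_b = math.radians(phi_b_deg)
    phase_rad = k0 * (d_i - (x * math.cos(phi_b) + y * math.sin(phi_b))
                      * math.sin(theta_b))
    return math.degrees(phase_rad) % 360.0

# Centre-fed example: two mirror-image elements see the same path delay.
ph1 = required_phase_deg(0.01, 0.0, feed_pos=(0.0, 0.0, 0.1))
ph2 = required_phase_deg(-0.01, 0.0, feed_pos=(0.0, 0.0, 0.1))
```

For a broadside beam with a centre feed, the required phase depends only on the feed-element distance, hence the mirror symmetry checked above.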
Procedia PDF Downloads 384
14426 Information Technology and Business Alignments among Different Divisions: A Comparative Analysis of Japan and South Korea
Authors: Michiko Miyamoto
Abstract:
This paper empirically investigates whether information technology (IT) strategies, business strategies, and divisions are aligned to meet overall business goals in Korean small and medium-sized enterprises (SMEs), based on the structure-based Strategic Alignment Model, and compares them with those of Japanese SMEs. Using 2,869 valid responses from the Korean Human Capital Corporate Panel survey, the results of this study suggest that Korean human resources (HR) departments have a major influence over IT strategy, as in Japanese SMEs, even though their management styles are quite different. IT strategy, however, is not related to the other departments at all for Korean SMEs. Korean management appears to hold great power over each division, such as Sales/Service, Research and Development/Technical Experts, HR, and Production.
Keywords: IT-business alignment, structure-based strategic alignment model, structural equation model, human resources department
Procedia PDF Downloads 272
14425 MGAUM—Towards a Mobile Government Adoption and Utilization Model: The Case of Saudi Arabia
Authors: Mohammed Alonazi, Natalia Beloff, Martin White
Abstract:
This paper presents a proposal for a mobile government adoption and utilization model (MGAUM), a framework designed to increase the adoption rate of m-government services in Saudi Arabia. Recent advances in mobile technologies, such as mobile compatibilities, the development of wireless communication, and mobile applications and devices, are enabling governments to deliver services to citizens in new ways, more efficiently and economically. In the last decade, many governments around the globe have utilized these advances effectively to develop their next generation of e-government services. However, a low adoption rate of m-government services by citizens is a common problem in Arabian countries, including Saudi Arabia. Yet, to our knowledge, very little research has been conducted on understanding the factors that influence citizen adoption of these m-government services in this part of the world. A set of social, cultural, and technological factors has been identified in the literature, which has led to the formulation of associated research questions and hypotheses. These hypotheses will be tested on Saudi citizens using questionnaire and interview methods based around the technology acceptance model. A key objective of the MGAUM framework is to investigate and understand Saudi citizens' perception of the adoption and utilization of m-government services.
Keywords: e-government, m-government, citizen services quality, technology acceptance model, Saudi Arabia, adoption framework
Procedia PDF Downloads 313
14424 Combining ASTER Thermal Data and Spatial-Based Insolation Model for Identification of Geothermal Active Areas
Authors: Khalid Hussein, Waleed Abdalati, Pakorn Petchprayoon, Khaula Alkaabi
Abstract:
In this study, we integrated ASTER thermal data with an area-based spatial insolation model to identify and delineate geothermally active areas in Yellowstone National Park (YNP). Two pairs of L1B ASTER day- and nighttime scenes were used to calculate land surface temperature. We employed the Emissivity Normalization Algorithm, which separates temperature from emissivity, to calculate surface temperature. We calculated the incoming solar radiation for the area covered by each of the four ASTER scenes using an insolation model and used this information to compute the temperature due to solar radiation. We then identified the statistical thermal anomalies using land surface temperature and the residuals calculated from the modeled temperatures and the ASTER-derived surface temperatures. Areas with temperatures or temperature residuals greater than 2σ, or between 1σ and 2σ, were considered ASTER-modeled thermal anomalies. The areas identified as thermal anomalies were in strong agreement with the thermal areas obtained from the YNP GIS database. Also, the YNP hot springs and geysers were located within areas identified as anomalous thermal areas. The consistency between our results and known geothermally active areas indicates that thermal remote sensing data, integrated with a spatial-based insolation model, provide an effective means of identifying and locating areas of geothermal activity over large areas and rough terrain.
Keywords: thermal remote sensing, insolation model, land surface temperature, geothermal anomalies
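The two-tier sigma thresholding step can be sketched on a list of temperature residuals (ASTER-derived surface temperature minus insolation-modelled temperature). This is a simplified per-pixel illustration of the classification rule stated in the abstract; the residual values are hypothetical.

```python
from statistics import mean, stdev

def classify_thermal_anomalies(residuals):
    """Label each residual following the abstract's thresholds:
    'strong' for deviations above 2 sigma, 'moderate' for deviations
    between 1 and 2 sigma, else 'normal'."""
    mu, sigma = mean(residuals), stdev(residuals)
    labels = []
    for r in residuals:
        dev = r - mu
        if dev > 2.0 * sigma:
            labels.append("strong")
        elif dev > sigma:
            labels.append("moderate")
        else:
            labels.append("normal")
    return labels
```

In the actual workflow this rule runs over whole raster scenes; here a single hot-spot residual against a flat background is enough to show the behaviour.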
Procedia PDF Downloads 372
14423 Kinetic Model to Interpret Whistler Waves in Multicomponent Non-Maxwellian Space Plasmas
Authors: Warda Nasir, M. N. S. Qureshi
Abstract:
Whistler waves are right-hand circularly polarized waves and are frequently observed in space plasmas. The low-frequency branch of whistler waves, with frequencies of around 100 Hz and known as lion roars, is frequently observed in the magnetosheath. Another feature of the magnetosheath is the observation of flat-top electron distributions with single as well as two electron populations. In the past, lion roars were studied by employing a kinetic model with the classical bi-Maxwellian distribution function; however, the observations could not be justified on either quantitative or qualitative grounds. We studied whistler waves by employing a kinetic model with a non-Maxwellian distribution function, the generalized (r,q) distribution, which is the generalized form of the kappa and Maxwellian distribution functions, with single or two electron populations. We compared our results with Cluster observations and found good quantitative and qualitative agreement between them. At times when lion roars are observed (or not observed) in the data and the bi-Maxwellian could not provide sufficient growth (damping) rates, we showed that when the generalized (r,q) distribution function is employed, the resulting growth (damping) rates exactly match the observations.
Keywords: kinetic model, whistler waves, non-maxwellian distribution function, space plasmas
Procedia PDF Downloads 316
14422 A Prediction Model of Adopting IPTV
Authors: Jeonghwan Jeon
Abstract:
With the advent of IPTV amid fierce competition with existing broadcasting systems, predicting how widely the IPTV service will be adopted has emerged as an important issue. This paper aims to suggest a prediction model for the adoption of IPTV using Classification and Ranking Belief Simplex (CaRBS). Its simplex plot method of representing data allows a clear visual representation of the degree of support contributed by the variables to the prediction of the objects. CaRBS is applied to survey data on IPTV adoption.
Keywords: prediction, adoption, IPTV, CaRBS
Procedia PDF Downloads 418
14421 Identification of Vessel Class with Long Short-Term Memory Using Kinematic Features in Maritime Traffic Control
Authors: Davide Fuscà, Kanan Rahimli, Roberto Leuzzi
Abstract:
Preventing abuse and illegal activities in a given area of the sea is a very difficult and expensive task. Artificial intelligence offers the possibility of implementing new methods to identify the vessel class type from the kinematic features of the vessel itself. The task strictly depends on the quality of the data. This paper explores the application of a deep long short-term memory model using only an AIS data flow of relatively low quality. The proposed model reaches high accuracy in detecting nine vessel classes representing the most common vessel types in the Ionian-Adriatic Sea. The model was applied during the Adriatic-Ionian trial period of the international EU ANDROMEDA H2020 project to identify vessels whose behavior departs from that expected for their declared type.
Keywords: maritime surveillance, artificial intelligence, behavior analysis, LSTM
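Before an LSTM (or any sequence model) can classify a vessel, the raw AIS track has to be turned into kinematic features. The sketch below shows one plausible, deliberately simple feature set (speed statistics and course-change magnitude); the exact features, track format, and example values are illustrative assumptions, not the paper's pipeline.

```python
from statistics import mean, stdev

def kinematic_features(track):
    """Summarise an AIS track, given as a list of (speed_knots, course_deg)
    samples, into a small kinematic feature vector. Course changes are
    wrapped into [-180, 180) so a 359->1 degree step counts as 2 degrees."""
    speeds = [s for s, _ in track]
    courses = [c for _, c in track]
    turns = [abs((courses[i + 1] - courses[i] + 180) % 360 - 180)
             for i in range(len(courses) - 1)]
    return {
        "speed_mean": mean(speeds),
        "speed_std": stdev(speeds) if len(speeds) > 1 else 0.0,
        "turn_mean": mean(turns) if turns else 0.0,
    }

# A fast, straight-running track vs. a slow, frequently turning one.
cargo = [(14.0, 90.0), (14.2, 91.0), (14.1, 90.5), (14.3, 90.0)]
fishing = [(3.0, 10.0), (2.5, 80.0), (3.2, 200.0), (2.8, 300.0)]
```

Feature vectors like these separate, for instance, a transiting cargo vessel (high speed, low turning) from a loitering fishing vessel, which is the signal the classifier exploits.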
Procedia PDF Downloads 234
14420 Modeling in the Middle School: Eighth-Grade Students' Construction of the Summer Job Problem
Authors: Neslihan Sahin Celik, Ali Eraslan
Abstract:
Mathematical models and modeling are among the topics that have been intensively discussed in recent years. In line with the results of the PISA studies, researchers in many countries have begun to question how well students in the school education system are prepared to solve the real-world problems they will encounter in their future professional lives. As a result, many mathematics educators have begun to emphasize the importance of new skills and understandings, such as constructing, hypothesizing, describing, manipulating, predicting, and working together on complex and multifaceted problems, for success beyond school. As students increasingly face these kinds of situations in their daily lives, it is important to make sure that they have enough experience of working together and interpreting mathematical situations that enable them to think in different ways and share their ideas with their peers. Thus, model-eliciting activities are one of the main tools that help students gain such experience and the new skills required. This research study was carried out in the town center of a big city located in the Black Sea region of Turkey. The participants were eighth-grade students in a middle school. After a six-week preliminary study, three students in an eighth-grade classroom were selected using a criterion sampling technique and placed in a focus group. The focus group of three students was videotaped as they worked on a model-eliciting activity, the Summer Job Problem. The conversation of the group was transcribed, examined together with the students' written work, and then qualitatively analyzed through the lens of Blum's (1996) modeling processing cycle. The study results showed that eighth-grade students can successfully work with the model-eliciting activity, develop a model based on two parameters, and review the whole process.
On the other hand, they had difficulty relating the parameters to each other and taking all parameters into account when establishing the model.
Keywords: middle school, modeling, mathematical modeling, summer job problem
Procedia PDF Downloads 340
14419 Thorium Resources of Georgia – Is It Its Future Energy?
Authors: Avtandil Okrostsvaridze, Salome Gogoladze
Abstract:
In light of the exhaustion of hydrocarbon reserves, the search for new energy resources is a problem of vital importance for modern civilization. At this time of energy resource crisis, the radioactive element thorium (232Th) is considered a main energy resource for the future of our civilization. Modern industry uses thorium in high-temperature and high-tech tools, but the most important property of thorium is that, like uranium, it can be used as fuel in nuclear reactors. Thorium has a number of advantages compared to uranium: its concentration in the earth's crust is 4-5 times higher; its extraction and enrichment are much cheaper; it is less radioactive; complete destruction of its waste products is possible; and it yields much more energy. Nowadays, developed countries, among them India and China, have started intensive work on the creation of thorium nuclear reactors and an intensive search for thorium reserves. It is not excluded that in the next 10 years these reactors will completely replace uranium reactors. Thorium ore mineralization is genetically related to alkaline-acidic magmatism, and thorium accumulations occur in both endogenic and exogenous conditions. Unfortunately, little is known about the reserves of this element in Georgia, as planned prospecting-exploration works for thorium have never been carried out here.
However, three ore occurrences of this element have been detected: 1) in the Greater Caucasus Kakheti segment, in the hydrothermally altered rocks of the Lower Jurassic clay shales, where thorium concentrations vary between 51 and 3882 g/t; 2) on the eastern periphery of the Dzirula massif, in the hydrothermally altered rocks of the Cambrian quartz-diorite gneisses, where thorium concentrations vary between 117 and 266 g/t; and 3) in the active contact zone of the Eocene volcanites and a syenitic intrusive in the Vakijvari ore field of the Guria region, where thorium concentrations vary between 185 and 428 g/t. In addition, the geological settings of the areas where thorium occurrences were found give a theoretical basis for the possible accumulation of thorium ores of practical importance. Besides, the Black Sea Guria region magnetite sand, which is transported from the Vakijvari ore field, should contain significant reserves of thorium: as the research shows, monazite (a thorium-containing mineral) is present in the magnetite in the form of the thinnest inclusions. In world-class thorium deposits, concentrations of this element vary within the limits of 50-200 g/t. Accordingly, on the basis of these data, the thorium resources found in Georgia should be considered prospective ore deposits. Generally, we consider that the complex investigation of thorium should be included in the sphere of strategic interests of the state, because the future energy of Georgia will probably be thorium.
Keywords: future energy, Georgia, ore field, thorium
Procedia PDF Downloads 494
14418 Analysis of Energy Flows as An Approach for The Formation of Monitoring System in the Sustainable Regional Development
Authors: Inese Trusina, Elita Jermolajeva
Abstract:
Global challenges require a transition from the existing linear economic model to a model that considers nature as a life support system for development on the way to social well-being, within the frame of the ecological economics paradigm. The article presents basic definitions for the development of a formalized description of sustainable development monitoring. It provides examples of calculating the monitoring parameters for the Baltic Sea region countries and their preliminary interpretation.
Keywords: sustainability, development, power, ecological economics, regional economic, monitoring
Procedia PDF Downloads 122
14417 Bayesian Locally Approach for Spatial Modeling of Visceral Leishmaniasis Infection in Northern and Central Tunisia
Authors: Kais Ben-Ahmed, Mhamed Ali-El-Aroui
Abstract:
This paper develops a Local Generalized Linear Spatial Model (LGLSM) to describe the spatial variation of visceral leishmaniasis (VL) infection risk in northern and central Tunisia. The response from each region is the number of affected children under five years of age, recorded from 1996 through 2006 in Tunisian pediatric departments and treated as Poisson county-level data. The model includes climatic factors, namely averages of annual rainfall, extreme values of low temperatures in winter and high temperatures in summer to characterize the climate of each region according to a continentality index, the pluviometric quotient of Emberger (Q2) to characterize bioclimatic regions, and a component for residual extra-Poisson variation. The statistical results show a progressive increase in the number of affected children in regions with a high continentality index and low mean yearly rainfall. On the other hand, an increase in the pluviometric quotient of Emberger contributed to a significant increase in the VL incidence rate. When compared with the original GLSM, Bayesian local modeling is an improvement and gives a better approximation of the Tunisian VL risk estimation. For inference under the Bayesian approach, we use vague priors for all model parameters and the Markov Chain Monte Carlo method.
Keywords: generalized linear spatial model, local model, extra-Poisson variation, continentality index, visceral leishmaniasis, Tunisia
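At the core of such a model is the Poisson likelihood with a log link and an exposure offset. The sketch below evaluates that likelihood for county-level counts; it deliberately omits the spatial random effects and priors that the full Bayesian GLSM adds, and the single-covariate data are hypothetical.

```python
import math

def poisson_log_likelihood(counts, exposures, beta, covariates):
    """Log-likelihood of a Poisson count model with log link,
    log(mu_i) = log(E_i) + x_i . beta, where E_i is the exposure
    (e.g., population under five) for county i. This is the likelihood
    at the core of the (local) generalized linear spatial model, before
    the spatial random effects of the full Bayesian model are added."""
    ll = 0.0
    for y, e, x in zip(counts, exposures, covariates):
        eta = math.log(e) + sum(b * xj for b, xj in zip(beta, x))
        mu = math.exp(eta)
        ll += y * eta - mu - math.lgamma(y + 1)  # log Poisson pmf at y
    return ll
```

In an MCMC run, this quantity (plus the log-priors) is what gets evaluated at each proposed coefficient vector.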
Procedia PDF Downloads 399
14416 Impact of Marangoni Stress and Mobile Surface Charge on Electrokinetics of Ionic Liquids Over Hydrophobic Surfaces
Authors: Somnath Bhattacharyya
Abstract:
The mobile adsorbed surface charge on hydrophobic surfaces can modify the velocity slip condition as well as create a Marangoni stress at the interface. The functionalized hydrophobic walls of micro/nanopores, e.g., graphene nanochannels, may possess physisorbed ions. The lateral mobility of these physisorbed ions creates a friction force as well as an electric force, leading to a modification of the velocity slip condition at the hydrophobic surface. In addition, the non-uniform distribution of these surface ions creates a surface tension gradient, leading to a Marangoni stress. The impact of the mobile surface charge on streaming potential and electrochemical energy conversion efficiency in a pressure-driven flow of ionized liquid through the nanopore is addressed. Enhanced electro-osmotic flow through the hydrophobic nanochannel is also analyzed. The mean-field electrokinetic model is modified to take into account the short-range non-electrostatic steric interactions and the long-range Coulomb correlations. The steric interaction is modeled by considering the ions as charged hard spheres of finite radius suspended in the electrolyte medium. The electrochemical potential is modified by including the volume exclusion effect, which is modeled based on the BMCSL equation of state. The electrostatic correlation is accounted for in the ionic self-energy. Extremization of the self-energy leads to a fourth-order Poisson equation for the electric field. The ion transport is governed by the modified Nernst-Planck equation, which includes the ion steric interactions, the Born force arising from the spatial variation of the dielectric permittivity, and the dielectrophoretic force on the hydrated ions. This ion transport equation is coupled with the Navier-Stokes equation describing the flow of the ionized fluid and the fourth-order Poisson equation for the electric field.
We numerically solve the coupled set of nonlinear governing equations along with the prescribed boundary conditions by adopting a control volume approach over a staggered grid arrangement. In the staggered grid arrangement, velocity components are stored at the midpoints of the cell faces to which they are normal, whereas the remaining scalar variables are stored at the center of each cell. The convection and electromigration terms are discretized at each interface of the control volumes using the total variation diminishing (TVD) approach to capture the strong convection resulting from the highly enhanced fluid flow under the modified model. To link pressure to the continuity equation, we adopt a pressure correction-based iterative SIMPLE (Semi-Implicit Method for Pressure-Linked Equations) algorithm, in which the discretized continuity equation is converted to a Poisson equation involving pressure correction terms. Our results show that the physisorbed ions on a hydrophobic surface create an enhanced slip velocity under a streaming potential, which enhances the convection current. However, the electroosmotic flow attenuates due to the mobile surface ions. Keywords: microfluidics, electroosmosis, streaming potential, electrostatic correlation, finite-sized ions
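The core idea behind the TVD discretization of convection mentioned above can be illustrated with a minimal one-dimensional sketch: a minmod-limited finite-volume step for linear advection. This is a generic textbook scheme, not the authors' full staggered-grid SIMPLE solver; the grid, CFL number, and pulse profile are illustrative assumptions.

```python
def minmod(a, b):
    # Classic minmod limiter: zero at extrema, smaller slope otherwise,
    # which is what keeps a TVD reconstruction free of new oscillations.
    if a * b <= 0:
        return 0.0
    return a if abs(a) < abs(b) else b

def advect_step(u, c):
    # One finite-volume step for u_t + a*u_x = 0 (a > 0) with minmod-limited
    # linear reconstruction; c = a*dt/dx is the CFL number (0 <= c <= 1).
    n = len(u)
    slopes = [minmod(u[i] - u[i - 1], u[(i + 1) % n] - u[i]) for i in range(n)]
    # Face value at i+1/2 extrapolated from the upwind (left) cell
    flux = [u[i] + 0.5 * (1 - c) * slopes[i] for i in range(n)]
    return [u[i] - c * (flux[i] - flux[i - 1]) for i in range(n)]

# Advect a square pulse on a periodic grid; the TVD property means the
# solution never overshoots its initial max/min.
u = [1.0 if 4 <= i < 8 else 0.0 for i in range(20)]
for _ in range(10):
    u = advect_step(u, 0.5)
```

In the paper's solver the same limiting idea is applied face-by-face to the convection and electromigration fluxes of the Nernst-Planck and momentum equations.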
Procedia PDF Downloads 72
14415 Mathematical Modelling and AI-Based Degradation Analysis of the Second-Life Lithium-Ion Battery Packs for Stationary Applications
Authors: Farhad Salek, Shahaboddin Resalati
Abstract:
The production of electric vehicles (EVs) featuring lithium-ion battery technology has substantially escalated over the past decade, demonstrating a steady and persistent upward trajectory. The imminent retirement of EV batteries after approximately eight years underscores the critical need for their redirection towards recycling, a task complicated by the current inadequacy of recycling infrastructures globally. A potential solution involves extending the operational lifespan of EV batteries by utilizing them in stationary energy storage systems during secondary applications. Such adoption, however, requires addressing the safety concerns associated with batteries' knee points and thermal runaway. This paper develops an accurate mathematical model representative of second-life battery packs at cell-to-pack scale using an equivalent circuit model (ECM) methodology. Neural network algorithms are employed to forecast the degradation parameters based on the EV batteries' aging history to develop a degradation model. The degradation model is integrated with the ECM to reflect the impacts of the cycle aging mechanism on battery parameters during operation. The developed model is tested under real-life load profiles to evaluate the life span of the batteries in various operating conditions. The methodology and the algorithms introduced in this paper can be considered the basis for battery management system (BMS) design and techno-economic analysis of such technologies. Keywords: second life battery, electric vehicles, degradation, neural network
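A first-order Thevenin ECM of the kind the abstract builds on can be stepped in discrete time as below. The parameter values (R0, R1, C1, capacity) and the linear OCV curve are illustrative placeholders, not fitted pack values; the degradation model described in the abstract would grow the resistances and shrink the capacity with cycle count.

```python
import math

def ocv(soc):
    # Toy open-circuit-voltage curve (V) as a linear function of SOC
    return 3.0 + 1.2 * soc

def simulate(current_a, dt_s, soc0=1.0, r0=0.02, r1=0.015, c1=2000.0, cap_ah=50.0):
    soc, u1 = soc0, 0.0
    volts = []
    alpha = math.exp(-dt_s / (r1 * c1))          # RC-branch decay per step
    for i in current_a:                          # i > 0 means discharge
        soc -= i * dt_s / (cap_ah * 3600.0)      # coulomb counting
        u1 = alpha * u1 + r1 * (1 - alpha) * i   # polarization voltage
        volts.append(ocv(soc) - r0 * i - u1)     # terminal voltage
    return soc, volts

# One hour of 25 A discharge in 1 s steps (~0.5C on a 50 Ah cell)
soc, volts = simulate([25.0] * 3600, 1.0)
```

A neural-network degradation model as in the abstract would periodically update `r0`, `r1`, and `cap_ah` from the predicted state of health.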
Procedia PDF Downloads 66
14414 Seismic Fragility Curves for Shallow Circular Tunnels under Different Soil Conditions
Authors: Siti Khadijah Che Osmi, Syed Mohd Ahmad
Abstract:
This paper presents a methodology to develop fragility curves for shallow tunnels that describe the relationship between seismic hazard and tunnel vulnerability. Emphasis is given to the influence of the surrounding soil's material properties, because the dynamic behaviour of the tunnel depends mostly on them. Four soil profiles ranging from stiff to soft are selected. A 3D nonlinear time history analysis is used to evaluate the seismic response of the tunnel when subjected to five real earthquake ground motions of varying intensity. The derived curves give the probabilistic performance of the tunnels based on the predicted damage states corresponding to the peak ground acceleration. A comparison of the obtained results with the previous literature is provided to validate the reliability of the proposed fragility curves. Results show the significant role of soil properties and input motions in evaluating the seismic performance and response of shallow tunnels. Keywords: fragility analysis, seismic performance, tunnel lining, vulnerability
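Fragility curves of the kind described here are typically expressed as a two-parameter lognormal CDF, P(damage ≥ ds | IM) = Φ(ln(im/θ)/β), with median capacity θ and dispersion β. A minimal sketch follows; the numbers are illustrative, not the study's fitted values for any soil class.

```python
import math

def fragility(im, theta, beta):
    # Lognormal fragility: standard normal CDF of ln(im/theta)/beta,
    # evaluated via the error function.
    return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))

# Example: assumed median capacity 0.45 g PGA, dispersion 0.5
curve = [(pga, fragility(pga, 0.45, 0.5)) for pga in (0.1, 0.3, 0.45, 0.6, 1.0)]
```

By construction the curve passes through 0.5 at the median capacity and rises monotonically with the intensity measure, which is the behaviour the derived curves in the paper exhibit against PGA.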
Procedia PDF Downloads 316
14413 Robust Fuzzy PID Stabilizer: Modified Shuffled Frog Leaping Algorithm
Authors: Oveis Abedinia, Noradin Ghadimi, Nasser Mikaeilvand, Roza Poursoleiman, Asghar Poorfaraj
Abstract:
In this paper, a robust fuzzy proportional-integral-derivative (PID) controller is applied to a multi-machine power system based on the Modified Shuffled Frog Leaping (MSFL) algorithm. The proposed controller is more efficient because it copes with oscillations across different operating points. In this strategy, the gains of the PID controller are optimized using the proposed technique. The nonlinear problem is formulated as an optimization problem over wide ranges of operating conditions using the MSFL algorithm. The simulation results demonstrate the effectiveness, good robustness, and validity of the proposed method through performance indices such as ITAE and FD under a wide range of operating conditions, in comparison with TS and GSA techniques. The single-machine infinite bus system and the New England 10-unit, 39-bus standard power system are employed to illustrate the performance of the proposed method. Keywords: fuzzy PID, MSFL, multi-machine, low frequency oscillation
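The object being tuned, a discrete PID control law, can be sketched as below. The first-order lag plant and the gain values are illustrative assumptions, not the multi-machine model or the MSFL-optimized gains of the paper; MSFL would search over (kp, ki, kd) to minimize an index such as ITAE.

```python
def run_pid(kp, ki, kd, setpoint=1.0, dt=0.05, steps=400):
    # Closed loop of a discrete PID controller around a toy plant y' = -y + u
    y, integ, prev_err = 0.0, 0.0, setpoint
    history = []
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt                        # integral term
        deriv = (err - prev_err) / dt            # derivative term
        u = kp * err + ki * integ + kd * deriv   # PID control law
        prev_err = err
        y += dt * (-y + u)                       # forward-Euler plant update
        history.append(y)
    return history

out = run_pid(kp=2.0, ki=1.0, kd=0.1)
```

An optimizer like MSFL would repeatedly call `run_pid` with candidate gains and score each run, e.g. by the time-weighted absolute error (ITAE) of `setpoint - y`.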
Procedia PDF Downloads 434
14412 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection
Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew
Abstract:
Detecting email spam is an important task in the era of digital technology, demanding effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable, and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for specific classifications of emails to help users understand the model's decision-making process. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows the creation of simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out the key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all baseline models, achieving an accuracy of 96.59% and a precision of 99.12%. Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression
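LIME itself fits a local surrogate model on perturbed copies of the input; as a self-contained stand-in, the leave-one-word-out occlusion score below captures the same idea of attributing a black-box classifier's output to individual terms. The weighted-word scoring function is a toy placeholder for a trained spam classifier's predict_proba, not the paper's model.

```python
import math

# Toy word weights standing in for a trained spam classifier (assumption)
SPAM_WEIGHTS = {"free": 0.9, "winner": 0.8, "click": 0.6, "meeting": -0.7}

def spam_score(text):
    # Placeholder black-box model: sum word weights, squash to (0, 1)
    s = sum(SPAM_WEIGHTS.get(w, 0.0) for w in text.lower().split())
    return 1.0 / (1.0 + math.exp(-s))

def explain(text):
    # Influence of each word = score drop when that word is removed,
    # sorted by absolute contribution (most influential first)
    words = text.split()
    base = spam_score(text)
    contributions = []
    for i, w in enumerate(words):
        reduced = " ".join(words[:i] + words[i + 1:])
        contributions.append((w, base - spam_score(reduced)))
    return sorted(contributions, key=lambda t: -abs(t[1]))

top = explain("click here for free winner prize")
```

The real LIME pipeline additionally fits a sparse linear model over many random perturbations, so its term weights come with a local-fidelity guarantee that this simple occlusion score lacks.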
Procedia PDF Downloads 49
14411 Heat Source Temperature for Centered Heat Source on Isotropic Plate with Lower Surface Forced Cooling Using Neural Network and Three Different Materials
Authors: Fadwa Haraka, Ahmad Elouatouati, Mourad Taha Janan
Abstract:
In this study, we propose a neural network-based method to calculate the heat source temperature of an isotropic plate with lower-surface forced cooling. To validate the proposed model, the heat source temperature values are compared to those from the analytical method (separation of variables) and a finite element model. The mathematical simulation is carried out as a 3D numerical simulation in COMSOL, considering three different materials: aluminum, copper, and graphite. The proposed method leads to a formulation of the heat source temperature based on the thermal and geometric properties of the base plate. Keywords: thermal model, thermal resistance, finite element simulation, neural network
Procedia PDF Downloads 361
14410 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge
Authors: M. F. Yilmaz, B. Ö. Çağlayan
Abstract:
The fragility curve is an effective and commonly used tool to determine the earthquake performance of structural and nonstructural components, as well as the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network whose earthquake performance needs to be investigated. To derive a fragility curve, intensity measures (IMs) and engineering demand parameters (EDPs) must be determined, and the relation between IMs and EDPs derived. In this study, a typical simply supported steel girder riveted railway bridge is studied. Fragility curves for this bridge are derived with a two-parameter lognormal distribution. Time history analyses are carried out for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed: PGA, Sa(0.2s), and Sa(1s), the IM parameters most commonly used for fragility curves in the literature, are considered in terms of efficiency, practicality, and sufficiency. Keywords: railway bridges, earthquake performance, fragility analyses, selection of intensity measures
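The "efficiency" of an IM is commonly judged by the dispersion of residuals in the log-log ("cloud") regression ln(EDP) = a + b·ln(IM): the smaller the dispersion, the more efficiently the IM predicts demand. A minimal sketch follows; the paired IM/EDP values are synthetic, not the study's 60 time-history results.

```python
import math

def cloud_dispersion(ims, edps):
    # Least-squares fit of ln(EDP) = a + b*ln(IM); returns the standard
    # deviation of residuals (the dispersion, often called beta).
    lx = [math.log(v) for v in ims]
    ly = [math.log(v) for v in edps]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    sxx = sum((x - mx) ** 2 for x in lx)
    sxy = sum((x - mx) * (y - my) for x, y in zip(lx, ly))
    b = sxy / sxx
    a = my - b * mx
    resid = [y - (a + b * x) for x, y in zip(lx, ly)]
    return math.sqrt(sum(r * r for r in resid) / (n - 2))

# Synthetic records: drift was constructed to track sa1 closely, so sa1
# should come out as the more efficient IM (smaller dispersion).
pga   = [0.10, 0.15, 0.22, 0.30, 0.41, 0.55]
sa1   = [0.08, 0.14, 0.20, 0.33, 0.45, 0.60]
drift = [0.0024, 0.0042, 0.0060, 0.0099, 0.0135, 0.0180]
```

Practicality and sufficiency are judged separately, from the regression slope and from the independence of residuals from magnitude and distance, respectively.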
Procedia PDF Downloads 361
14409 Effects of Magnetization Patterns on Characteristics of Permanent Magnet Linear Synchronous Generator for Wave Energy Converter Applications
Authors: Sung-Won Seo, Jang-Young Choi
Abstract:
The rare earth magnets used in synchronous generators offer many advantages, including high efficiency and greatly reduced size and weight. The permanent magnet linear synchronous generator (PMLSG) allows for direct drive without the need for a mechanical transmission, so the PMLSG is well suited to translational applications such as wave energy converters and free piston energy converters. This manuscript compares the effects of different magnetization patterns on the characteristics of double-sided PMLSGs with slotless stator structures. The Halbach array has a higher air-gap flux density than the vertical array, and its advantages in performance and efficiency are widely known. To verify the advantage of the Halbach array, we apply a finite element method (FEM) and an analytical method. In general, both FEM and analytical methods are used in electromagnetic analysis to determine model characteristics, and FEM is preferable for magnetic field analysis; however, it is often slow and inflexible, whereas the analytical method requires little time and produces an accurate analysis of the magnetic field. Therefore, the air-gap flux density and the back-EMF are obtained by FEM, and the results from the analytical method correspond well with the FEM results. The Halbach-array model shows less copper loss than the vertical-array model because of the Halbach array's higher output power density. The vertical-array model has lower core loss than the Halbach-array model because of its lower air-gap flux density; consequently, the current density in the vertical model is higher for identical power output.
The completed manuscript will include the magnetic field characteristics and structural features of both models, compare the various results, and present a specific comparative analysis to determine the best model for application in a wave energy conversion system. Keywords: wave energy converter, permanent magnet linear synchronous generator, finite element method, analytical method
Procedia PDF Downloads 304
14408 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models
Authors: R. Hellmuth
Abstract:
The method of factory planning has changed a lot, especially with respect to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new areas, shorter life cycles of products and production technology, as well as a VUCA world (volatility, uncertainty, complexity, and ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is therefore: what kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for use cases are analysed. Within the scope of the investigation, point cloud models, building information models, photogrammetry models, and these models enriched with sensor data are examined. It is investigated which digital models allow a simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to the application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model through the provision of information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and display of sensor data.
In summary, the paper shows a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. Thus, the systematic selection of digital factory models with the corresponding application cases is evaluated. Keywords: building information modeling, digital factory model, factory planning, maintenance
Procedia PDF Downloads 111
14407 Stock Price Prediction with 'Earnings' Conference Call Sentiment
Authors: Sungzoon Cho, Hye Jin Lee, Sungwhan Jeon, Dongyoung Min, Sungwon Lyu
Abstract:
Major public corporations worldwide use conference calls to report their quarterly earnings, and these 'earnings' conference calls allow for questions from stock analysts. We investigated whether it is possible to identify sentiment from the call script and use it to predict stock price movement. We analyzed call scripts from six companies, two each from Korea, China, and Indonesia, over six years, 2011Q1 – 2017Q2. A random forest using frequency-based sentiment scores from the Loughran-McDonald dictionary did better than a control model with only financial indicators. When stock prices went up 20 days after the earnings release, our model predicted correctly 77% of the time; when the model predicted 'up', actual stock prices went up 65% of the time. This preliminary result encourages us to investigate advanced sentiment scoring methodologies such as topic modeling, auto-encoders, and word2vec variants. Keywords: earnings call script, random forest, sentiment analysis, stock price prediction
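The frequency-based sentiment scoring the abstract describes can be sketched with a small word list in the spirit of the Loughran-McDonald dictionary. The handful of entries below is illustrative only; the actual dictionary contains thousands of finance-specific positive and negative words.

```python
# Tiny illustrative word lists (assumption; the real dictionary is much larger)
POSITIVE = {"growth", "improved", "strong", "record", "exceeded"}
NEGATIVE = {"decline", "weak", "loss", "impairment", "restructuring"}

def sentiment_score(transcript):
    # Net tone = (positive count - negative count) normalized by length,
    # a standard frequency-based dictionary score.
    words = [w.strip(".,!?").lower() for w in transcript.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = len(words)
    return (pos - neg) / total if total else 0.0

score = sentiment_score("Revenue growth was strong and margins improved, "
                        "despite a small decline in the hardware segment.")
```

In the study's setup, per-call scores like this (alongside financial indicators) would form the feature vector fed to the random forest that predicts the 20-day price direction.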
Procedia PDF Downloads 294