Search results for: Grey prediction model
Paper Count: 18046

15166 Developing a Green Strategic Management Model with Regard to HSE-MS

Authors: Amin Padash, Gholam Reza Nabi Bid Hendi, Hassan Hoveidi

Abstract:

Purpose: The aim of this research is to develop a model for green management based on the Health, Safety and Environmental Management System (HSE-MS). An HSE-MS can be a powerful tool for organizations to improve their environmental, health and safety performance and to steer their business efficiency towards green management. Model: The model developed in this study can be used by industries as a guideline for implementing green management issues with consideration of the Health, Safety and Environmental Management System. Case Study: The Pars Special Economic / Energy Zone Organization, on behalf of Iran’s Petroleum Ministry and the National Iranian Oil Company (NIOC), manages and develops the South and North oil and gas fields in the region. Methodology: In terms of its objective, this research is applied; in terms of implementation, it is descriptive as well as prescriptive. The MCDM (Multiple Criteria Decision-Making) technique was used to determine the priorities of the factors. Following a process approach, the model consists of the following steps and components: first, the factors involved in green issues are determined; based on them, a framework is constructed; then, using the MCDM algorithm TOPSIS, the priorities of the basic variables are determined. The authors believe that the proposed model and the results of this research can help industry managers implement green subjects according to the Health, Safety and Environmental Management System in a more efficient and effective manner. Findings and conclusion: The basic factors involved in green issues and their weights are the main finding; the model and the relations between the factors are the other findings of this research. The case study considers a petrochemical company in order to promote ecological-industrial thinking.
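As a rough illustration of the prioritisation step, the sketch below shows a generic TOPSIS ranking in Python; the criteria, weights and decision matrix are hypothetical placeholders, not the factors or values used in the study.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit_mask):
    """Rank alternatives with the classic TOPSIS procedure."""
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)

    # 1. Vector-normalise each criterion column, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)

    # 2. Ideal and anti-ideal points (benefit criteria: higher is better).
    ideal = np.where(benefit_mask, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit_mask, V.min(axis=0), V.max(axis=0))

    # 3. Distances to both points and the closeness coefficient.
    d_plus  = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    return d_minus / (d_plus + d_minus)   # higher score = higher priority

# Hypothetical example: four green/HSE factors scored against three criteria.
scores = topsis(
    decision_matrix=[[7, 9, 6], [8, 7, 8], [6, 8, 9], [9, 6, 7]],
    weights=[0.5, 0.3, 0.2],
    benefit_mask=[True, True, True],
)
print(scores.round(3))
```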

Keywords: Fuzzy-AHP method, green management, health, safety and environmental management system, MCDM technique, TOPSIS

Procedia PDF Downloads 416
15165 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm

Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei

Abstract:

In this research work, a sophisticated yield criterion known as BBC2003, capable of describing the planar anisotropic behavior of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve the nonlinear differential and algebraic equations, a line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of the anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with the experimental results, which showed excellent agreement. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.
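The return-mapping equations of the BBC2003 UMAT cannot be reconstructed from the abstract; the standalone sketch below only illustrates the general strategy of pairing a Newton-Raphson iteration with a backtracking line search on the residual norm to enlarge the convergence domain, applied to a small hypothetical system.

```python
import numpy as np

def newton_line_search(residual, jacobian, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson with a simple backtracking line search on the residual norm.

    Schematic only: the actual UMAT solves the backward-Euler return-mapping
    equations of the BBC2003 yield criterion, not this toy system.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            return x
        dx = np.linalg.solve(jacobian(x), -r)   # full Newton step
        alpha = 1.0
        # Backtrack until the residual norm actually decreases.
        while np.linalg.norm(residual(x + alpha * dx)) > np.linalg.norm(r) and alpha > 1e-4:
            alpha *= 0.5
        x = x + alpha * dx
    raise RuntimeError("Newton iteration did not converge")

# Tiny hypothetical nonlinear system used only to exercise the solver.
res = lambda x: np.array([x[0]**3 - x[1], np.tanh(x[1]) - 0.3 * x[0]])
jac = lambda x: np.array([[3 * x[0]**2, -1.0],
                          [-0.3, 1.0 / np.cosh(x[1])**2]])
print(newton_line_search(res, jac, x0=[1.0, 1.0]))
```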

Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes

Procedia PDF Downloads 78
15164 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built as networks of autonomous participants (e.g., software components, web services, online resources) and that involve collaboration among a diverse set of participant services hosted by different providers. The complexity of coordinating service interactions reflects how important suitable techniques and approaches are for designing and coordinating the interaction between participant services so that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability to steer a complex service interaction towards a desired outcome. Therefore, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using a service choreography approach and focuses on a declarative style, advocating the Object Management Group (OMG) standard Semantics of Business Vocabulary and Rules (SBVR). This model, namely the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which can be particularly useful when coordinating service interactions. The generated SBVR model is then formalised and transformed into an Alloy model, which is verified with the Alloy Analyzer. The transformation of SBVR into Alloy allows the corresponding coordination of service interactions (the service choreography) to be generated automatically, hence producing an immediate execution instance that satisfies the constraints of the specification and verifying whether a specific request can be realised in the generated choreography.

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 159
15163 A Phase Field Approach to Model Crack Interface Interaction in Ceramic Matrix Composites

Authors: Dhaladhuli Pranavi, Amirtham Rajagopal

Abstract:

There are various failure modes in ceramic matrix composites; notable ones are fiber breakage, matrix cracking and fiber-matrix debonding. Crack nucleation and propagation in the microstructure of such composites require an understanding of how a crack interacts with the multi-inclusion heterogeneous system and its interfaces. In order to assess structural integrity, the material parameters, especially those of the interface that governs crack growth, should be determined. In the present work, a nonlocal phase field approach is proposed to model the crack-interface interaction in such composites. Nonlocal approaches help in understanding the complex mechanisms of delamination growth and mitigation and operate at a material length scale. The performance of the proposed formulation is illustrated through representative numerical examples. The proposed model is implemented in the framework of the finite element method. Several parametric studies on crack-interface interaction are conducted. The proposed model is easy and simple to implement and works very well in modeling fracture in composite systems.
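The abstract does not give the specific functional used; purely for orientation, a commonly used regularised (AT2-type) phase-field fracture energy, to which interface terms can be added, reads

\[
\mathcal{E}(\mathbf{u},\phi)=\int_{\Omega} g(\phi)\,\psi_e\big(\boldsymbol{\varepsilon}(\mathbf{u})\big)\,\mathrm{d}\Omega
+\frac{G_c}{2}\int_{\Omega}\left(\frac{\phi^{2}}{\ell}+\ell\,|\nabla\phi|^{2}\right)\mathrm{d}\Omega,
\qquad g(\phi)=(1-\phi)^{2},
\]

where \(\psi_e\) is the elastic strain energy density, \(\phi\in[0,1]\) the damage (phase-field) variable, \(\ell\) the nonlocal length scale, and \(G_c\) the critical energy release rate; in a composite, \(G_c\) (or a cohesive counterpart) takes different values in the fibre, the matrix and the interface. Whether the present formulation uses exactly this form is an assumption.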

Keywords: composite, interface, nonlocal, phase field

Procedia PDF Downloads 146
15162 Application of Seasonal Autoregressive Integrated Moving Average Model for Forecasting Monthly Flows in Waterval River, South Africa

Authors: Kassahun Birhanu Tadesse, Megersa Olumana Dinka

Abstract:

Reliable future river flow information is essential for the planning and management of any river system. For a data-scarce river system with only river flow records, such as the Waterval River, univariate time series models are appropriate for river flow forecasting. In this study, a univariate Seasonal Autoregressive Integrated Moving Average (SARIMA) model was applied to forecast Waterval River flow using the GRETL statistical software. Mean monthly river flows from 1960 to 2016 were used for modeling. Different unit root tests and a Mann-Kendall trend analysis were performed to test the stationarity of the observed flow time series. The time series was differenced to remove the seasonality. Using the correlogram of the seasonally differenced time series, different SARIMA models were identified, their parameters were estimated, and diagnostic checking of the model forecasts was performed using white noise and heteroscedasticity tests. Finally, based on the minimum Akaike Information Criterion (AIC) and Hannan-Quinn Criterion (HQC), SARIMA (3, 0, 2) x (3, 1, 3)12 was selected as the best model for Waterval River flow forecasting. Therefore, this model can be used to generate future river flow information for water resources development and management in the Waterval River system. The SARIMA model can also be used to forecast other, similar univariate time series with seasonal characteristics.
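The study itself was carried out in GRETL; as an illustration only, an equivalent specification of the selected SARIMA(3,0,2)x(3,1,3)12 model can be fitted in Python with statsmodels. The file name and column below are placeholders, not the actual data source.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly flow series indexed by month (1960-2016).
flows = pd.read_csv("waterval_monthly_flows.csv", index_col=0, parse_dates=True)["flow"]

# SARIMA(3,0,2)x(3,1,3)12: the seasonal difference removes the annual cycle.
model = SARIMAX(flows, order=(3, 0, 2), seasonal_order=(3, 1, 3, 12))
result = model.fit(disp=False)

print(result.summary())           # coefficients plus AIC/HQIC for model comparison
print(result.forecast(steps=24))  # two years of monthly flow forecasts
```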

Keywords: heteroscedasticity, stationarity test, trend analysis, validation, white noise

Procedia PDF Downloads 209
15161 Exploring the Effect of Using Lesh Model in Enhancing Prospective Mathematics Teachers’ Number Sense

Authors: Areej Isam Barham

Abstract:

Developing students’ number sense is an essential element in the learning of mathematics. Number sense is one of the foundational ideas in mathematics: students need to understand numbers, represent them in different ways, and realize the relationships among numbers. Number sense also reflects students’ understanding of the meaning of operations, how they relate to one another, how to compute fluently, and how to make reasonable estimates. Developing students’ number sense in the mathematics classroom requires good preparation of mathematics teachers, who will direct their students towards a real understanding of numbers and its implementation in the learning of mathematics. This study describes the development of elementary prospective mathematics teachers’ number sense through a mathematics teaching methods course at Qatar University. The study examined the effect of using the Lesh model on enhancing prospective mathematics teachers’ number sense. Thirty-nine elementary prospective mathematics teachers were involved in the current study. The study followed an experimental research approach, and quantitative research methods were used to answer the research questions. A pre-post number sense test was constructed and administered before and after teaching with the Lesh model. Data were analyzed using the Statistical Package for the Social Sciences (SPSS). Descriptive data analysis and a t-test were used to examine the impact of using the Lesh model on enhancing prospective teachers’ number sense. The findings of the study indicated poor number sense and limited numeracy skills before implementing the Lesh model, which clearly demonstrates the importance of the study. The results also revealed a positive impact of the use of the Lesh model on enhancing prospective teachers’ number sense, with statistically significant differences. The discussion of the study addresses different features and issues related to the participants’ number sense. In light of the study, the research presents recommendations and suggestions for the future development of prospective mathematics teachers’ number sense.
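The pre-post comparison was run in SPSS; an equivalent paired-samples t-test can be sketched in Python with SciPy. The scores below are invented for illustration and are not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical pre- and post-test number sense scores for the same teachers.
pre  = np.array([42, 55, 38, 61, 47, 50, 44, 58])
post = np.array([63, 70, 52, 74, 60, 68, 59, 71])

t_stat, p_value = stats.ttest_rel(post, pre)                # paired-samples t-test
cohens_d = (post - pre).mean() / (post - pre).std(ddof=1)   # effect size of the gain

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```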

Keywords: number sense, Lesh model, prospective mathematics teachers, development of number sense

Procedia PDF Downloads 144
15160 Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates

Authors: Rosa Tsegaye Aga, Xuan Jiang, Pavel Vazquez Faci, Siqing Liu, Simon Rayner, Endalkachew Alemu, Markos Abebe

Abstract:

Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. Drug-resistant TB occurs when the bacteria become resistant to the drugs that are used to treat TB. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and it takes a long time to identify the drug-resistant bacteria and treat the patient accordingly. Machine learning (ML) and data science can offer new approaches to the problem. In this study, we propose to develop an ML-based model that predicts the antibiotic resistance phenotypes of TB isolates in minutes so that the right treatment can be given to the patient immediately. The study uses whole-genome sequences (WGS) of TB isolates, extracted from the NCBI repository and containing samples from different countries, as training data to build the ML models. Samples from different countries were included in order to generalize over the large group of TB isolates from different regions of the world; this lets the model learn the different behaviors of the TB bacteria and makes it robust. The model training considered three pieces of information extracted from the WGS data: all variants found within the candidate genes (F1), predetermined resistance-associated variants (F2), and the resistance-associated gene information for the particular drug. Two major datasets were constructed using these three types of information: F1 and F2 were treated as two independent datasets, and the third type of information was used as the class label for both datasets. Five machine learning algorithms were considered to train the model: Support Vector Machine (SVM), Random Forest (RF), Logistic Regression (LR), Gradient Boosting, and AdaBoost. The models were trained on the datasets F1, F2, and F1F2, i.e., the merged F1 and F2 dataset. Additionally, an ensemble approach was used: the F1 and F2 datasets were run through the Gradient Boosting algorithm, the outputs were combined into one dataset, called the F1F2 ensemble dataset, and models were trained on this dataset with the five algorithms. As the experiments show, the ensemble model trained with the Gradient Boosting algorithm outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, that is, the RF + Gradient Boosting model, to predict the antibiotic resistance phenotypes of TB isolates, as it outperformed the rest of the models.
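One possible reading of the ensemble described above (Gradient Boosting applied separately to the F1 and F2 feature sets, with a second-level Random Forest on the combined outputs) can be sketched in scikit-learn as follows; the feature matrices and labels are random placeholders standing in for the WGS-derived datasets.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_predict, cross_val_score

rng = np.random.default_rng(0)
n = 500
X_f1 = rng.normal(size=(n, 40))    # placeholder for candidate-gene variants (F1)
X_f2 = rng.normal(size=(n, 25))    # placeholder for known resistance variants (F2)
y    = rng.integers(0, 2, size=n)  # resistant / susceptible phenotype labels

# First level: one Gradient Boosting model per feature set; out-of-fold
# probabilities avoid training the second level on leaked predictions.
p_f1 = cross_val_predict(GradientBoostingClassifier(), X_f1, y, cv=5, method="predict_proba")
p_f2 = cross_val_predict(GradientBoostingClassifier(), X_f2, y, cv=5, method="predict_proba")

# Second level: Random Forest on the combined "F1F2 ensemble" dataset.
X_ens = np.hstack([p_f1, p_f2])
scores = cross_val_score(RandomForestClassifier(n_estimators=200), X_ens, y, cv=5)
print("ensemble accuracy:", scores.mean().round(3))
```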

Keywords: machine learning, MTB, WGS, drug resistant TB

Procedia PDF Downloads 56
15159 Investigations on the Influence of Web Openings on the Load Bearing Behavior of Steel Beams

Authors: Felix Eyben, Simon Schaffrath, Markus Feldmann

Abstract:

A building should maximize its potential for use through its design. Therefore, flexible use is always important when designing a steel structure. To create flexibility, steel beams with web openings are increasingly used, because they offer the advantage that cables, pipes and other technical equipment can easily be routed through without detours, allowing for more space-saving and aesthetically pleasing construction. This can also significantly reduce the height of ceiling systems. Until now, beams with web openings have not been explicitly considered in the European standard. However, this is to be done with the new EN 1993-1-13, in which design rules for different opening forms are defined. In order to further develop the design concepts, beams with web openings under bending are therefore investigated in terms of damage mechanics as part of a German national research project aiming to optimize the verifications for steel structures based on a wider database and a validated damage prediction. For this purpose, the fundamental factors influencing the load-bearing behavior of girders with web openings under bending load were first investigated numerically, without taking material damage into account. Various parameter studies were carried out for this purpose; the factors under study were the opening shape, size and position as well as structural aspects such as the span length, the arrangement of stiffeners and the loading situation. The load-bearing behavior is evaluated using the resulting load-deformation curves. These results are compared with the design rules and critically analyzed. Experimental tests are also planned based on these results. Moreover, the implementation of damage mechanics in the form of the modified Bai-Wierzbicki model was examined. Once the experimental tests have been carried out, the numerical models will be validated and further influencing factors will be investigated on the basis of parametric studies.

Keywords: damage mechanics, finite element, steel structures, web openings

Procedia PDF Downloads 177
15158 An Application of the Single Equation Regression Model

Authors: S. K. Ashiquer Rahman

Abstract:

Recently, oil has become more influential as a key material in almost every economic sector. As can be seen from the news, when the oil price changes or OPEC announces a new strategy, the effect spreads to every part of the economy, directly and indirectly. That is why people always observe the oil price and try to forecast its changes. The most important factor affecting the price is supply, which is determined by the number of wildcats drilled. Therefore, a study of the number of wellheads and other economic variables may give us some understanding of the mechanism behind the amount of oil supplied. In this paper, we consider the relationship between the number of wellheads and three key factors: the price at the wellhead, domestic output, and GNP in constant dollars. We also add trend variables to the model because the consumption of oil varies over time. Moreover, this paper uses an econometric method to estimate the parameters of the model, applies tests to verify the results obtained, and then draws conclusions from the model.
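The exact functional form and estimation software are not given in the abstract; a minimal sketch of such a single-equation regression with a linear trend, using statsmodels, is shown below. The file name and column names are placeholders.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical annual data: wildcats drilled, wellhead price, domestic output,
# GNP in constant dollars.
df = pd.read_csv("wildcat_activity.csv")
df["trend"] = range(1, len(df) + 1)   # linear time trend

# Single-equation model: wildcat activity explained by price, output, GNP and trend.
model = smf.ols("wildcats ~ price + domestic_output + gnp_constant + trend", data=df)
result = model.fit()

print(result.summary())   # t-tests, R-squared, Durbin-Watson and other diagnostics
```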

Keywords: price, domestic output, GNP, trend variable, wildcat activity

Procedia PDF Downloads 66
15157 An Enhanced Digital Forensic Model for Internet of Things Forensic

Authors: Tina Wu, Andrew Martin

Abstract:

The expansion of the Internet of Things (IoT) brings a new level of threat. Attacks on IoT are already being used by criminals to form botnets, launch Distributed Denial of Service (DDoS) attacks and distribute malware. This opens a whole new digital forensic arena in which forensic methodologies must be developed in order to have the capability to investigate IoT-related crimes. However, existing proposed IoT forensic models are still premature, requiring further improvement and validation, and many lack details on the acquisition and analysis phases. This paper proposes an enhanced theoretical IoT digital forensic model focused on identifying and acquiring the main sources of evidence in a methodical way. In addition, this paper presents a theoretical acquisition framework of the different stages required in order to be capable of acquiring evidence from IoT devices.

Keywords: acquisition, Internet of Things, model, zoning

Procedia PDF Downloads 273
15156 Building Information Modeling Applied for the Measurement of Water Footprint of Construction Supplies

Authors: Julio Franco

Abstract:

Water is used, directly and indirectly, in all activities of the construction production chain, making it a subject of worldwide relevance for sustainable development. The ongoing expansion of urban areas leads to a high demand for natural resources, which in turn causes significant environmental impacts. The present work proposes the application of BIM tools to assist the measurement of the water footprint (WF) of civil construction supplies. Data were inserted into the model as element properties, allowing them to be analyzed per element or over the whole model. The WF calculation was automated using parameterization in the Autodesk Revit software. The parameterization was associated with the materials of each element in the model so that any change in these elements directly alters the results of the WF calculations. As a case study, the approach was applied to a building project model to test the parameterized calculation of the WF. The results show that the proposed parameterization successfully automated the WF calculations according to design changes. We envision this tool assisting the measurement and rationalization of the environmental impact of construction projects in terms of WF.
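Outside Revit, the same parameterised calculation can be sketched as follows; the element quantities and the per-material WF coefficients are invented placeholders (real coefficients would come from a water footprint database), so this only illustrates the idea of tying the WF total to element properties.

```python
# Hypothetical water footprint coefficients, litres per unit of material.
WF_COEFF = {"concrete": 180.0,   # L per m3   (placeholder value)
            "steel":     95.0,   # L per kg   (placeholder value)
            "brick":      1.9}   # L per unit (placeholder value)

# Elements as they might be exported from a BIM model: material plus quantity.
elements = [
    {"id": "wall-01",   "material": "brick",    "quantity": 4200},
    {"id": "slab-02",   "material": "concrete", "quantity": 65.0},
    {"id": "column-03", "material": "steel",    "quantity": 1500.0},
]

def water_footprint(elements):
    """Sum the WF of every element; editing any quantity changes the total directly."""
    return sum(e["quantity"] * WF_COEFF[e["material"]] for e in elements)

print(f"Total WF: {water_footprint(elements):,.0f} L")
```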

Keywords: building information modeling, BIM, sustainable development, water footprint

Procedia PDF Downloads 152
15155 Operation Cycle Model of ASz62IR Radial Aircraft Engine

Authors: M. Duk, L. Grabowski, P. Magryta

Abstract:

A very important element of air transport today is its environmental impact. There are currently no emission standards for the turbine and piston engines used in air transport. Nevertheless, the environmental effect of aircraft engine exhaust gases should be as small as possible. For this purpose, R&D centers often use special software to simulate and estimate the negative effects of the engine working process. Within the cooperation between the Lublin University of Technology and the Polish aviation company WSK "PZL-KALISZ" S.A., aimed at more effective operation of the ASz62IR engine, one such tool has been used. The AVL Boost software allows 1D simulations of the combustion process of piston engines. The ASz62IR is a nine-cylinder aircraft engine in a radial configuration. In order to analyze the impact of its working process on the environment, a mathematical model was built in the AVL Boost software. This model contains, among others, a model of the operation cycle of the cylinders. It is based on the change of the combustion chamber volume resulting from the reciprocating movement of the piston, with the simplifying assumption that all pistons move identically. The changes in cylinder volume during an operating cycle were specified; these changes are needed to determine the energy balance of a cylinder in an internal combustion engine, which is fundamental for a model of the operating cycle. The calculation of the cylinder thermodynamic state was based on the first law of thermodynamics. The change of mass in the cylinder was calculated from the sum of inflowing and outflowing masses, and the balance accounted for the cylinder internal energy, the heat from the fuel, heat losses, the mass in the cylinder, cylinder pressure and volume, blowdown enthalpy, evaporation heat, etc. The model assumed that the amount of heat released in the combustion process was calculated from the pace of combustion using the Vibe model. For the gas exchange, it was also important to consider heat transfer in the inlet and outlet channels, because the values there are much higher than for flow in a straight pipe; this results from the high heat exchange coefficients and temperature coefficients near the valves and valve seats. A modified Zapf model of heat exchange was used. To use the model with flight scenarios, the impact of flight altitude on engine performance was analyzed, assuming that the pressure and temperature at the inlet and outlet correspond to the values of the International Standard Atmosphere (ISA) model. Combining this operation cycle model with the other submodels of the ASz62IR engine, a full analysis of the performance of the engine under ISA conditions can be made. This work has been financed by the Polish National Centre for Research and Development, INNOLOT, under
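The Vibe (Wiebe) function mentioned above gives the cumulative mass fraction burned as a function of crank angle; a standard single-zone form is sketched below with illustrative parameter values, since the parameters actually used in the AVL Boost model are not given in the abstract.

```python
import numpy as np

def vibe_burn_fraction(theta, theta_start, duration, a=6.9, m=2.0):
    """Cumulative mass fraction burned according to the Vibe (Wiebe) function.

    theta        crank angle [deg]
    theta_start  start of combustion [deg]
    duration     combustion duration [deg]
    a, m         efficiency and shape parameters (illustrative defaults)
    """
    x = np.clip((theta - theta_start) / duration, 0.0, 1.0)
    return 1.0 - np.exp(-a * x ** (m + 1))

# The heat release rate follows as dQ/dtheta = Q_total * d(burn fraction)/dtheta.
theta = np.linspace(-30, 90, 7)   # crank angles around top dead centre
print(vibe_burn_fraction(theta, theta_start=-10, duration=60).round(3))
```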

Keywords: aviation propulsion, AVL Boost, engine model, operation cycle, aircraft engine

Procedia PDF Downloads 296
15154 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Model updating methods have received increasing attention in damage detection of structures based on measured modal parameters. A probability-based damage detection (PBDD) procedure built on a model updating procedure is therefore presented in this paper, in which a one-stage, model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that accounts for the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multi-agent algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in different directions and cover all the available space; this behaviour stems from the high speed of the molecules and the collisions between them and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, an initial population of gas molecules is randomly generated, and the governing equations for the molecules' velocities and the collisions between them are utilized. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is applied to two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive for PBDD of structures.

Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification

Procedia PDF Downloads 282
15153 A Comparative Assessment of Some Algorithms for Modeling and Forecasting Horizontal Displacement of Ialy Dam, Vietnam

Authors: Kien-Trinh Thi Bui, Cuong Manh Nguyen

Abstract:

In order to simulate and reproduce the operational characteristics of a dam visually, it is necessary to capture the displacement at different measurement points and analyze the observed movement data promptly in order to forecast dam safety. The accuracy of the forecasts can be further improved by applying machine learning methods in the data analysis process. In this study, the horizontal displacement monitoring data of the Ialy hydropower dam (Vietnam) were analysed with three machine learning algorithms: Gaussian processes (GP), multi-layer perceptron neural networks (MLPs), and the M5-Rules algorithm, for modelling and forecasting the horizontal displacement of the dam. The database used in this research was built by collecting time series data from 2006 to 2021 and was divided into two parts: a training dataset and a validation dataset. The final results show that all three algorithms have high performance for both training and model validation, with the MLP being the best model. Their usability is further investigated by comparison with a benchmark model created by multiple linear regression. The results show that the performance obtained with the GP, MLP, and M5-Rules models is much better; therefore, these three models should be used to analyze and predict the horizontal displacement of the dam.
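M5-Rules is typically run in Weka rather than scikit-learn, so the sketch below compares only Gaussian process and MLP regressors against a multiple-linear-regression benchmark; the predictors and displacement series are synthetic placeholders, not the Ialy monitoring data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 4))   # placeholder predictors (e.g. water level, temperature, time)
y = X @ np.array([1.5, -0.8, 0.3, 0.1]) + 0.2 * np.sin(X[:, 0]) + rng.normal(scale=0.1, size=600)
X_train, X_val, y_train, y_val = X[:480], X[480:], y[:480], y[480:]

models = {
    "GP":  make_pipeline(StandardScaler(), GaussianProcessRegressor()),
    "MLP": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)),
    "MLR benchmark": LinearRegression(),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, "validation R2:", round(r2_score(y_val, model.predict(X_val)), 3))
```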

Keywords: Gaussian processes, horizontal displacement, hydropower dam, Ialy dam, M5-Rules, multi-layer perception neural networks

Procedia PDF Downloads 218
15152 A Pedagogical Study of Computational Design in a Simulated Building Information Modeling-Cloud Environment

Authors: Jaehwan Jung, Sung-Ah Kim

Abstract:

Building Information Modeling (BIM) provides project stakeholders with various information about the properties and geometry of every component as a 3D object-based parametric building model. BIM represents a set of information and solutions that are expected to improve the collaborative work process and the quality of the building design. To improve collaboration among project participants, the BIM model should provide the necessary information to remote participants in real time and manage that information throughout the process. The purpose of this paper is to propose a process model that supports an effective collaborative architectural design work process in architectural design education in a BIM-Cloud environment.

Keywords: BIM, cloud computing, collaborative design, digital design education

Procedia PDF Downloads 438
15151 LORA: A Learning Outcome Modelling Approach for Higher Education

Authors: Aqeel Zeid, Hasna Anees, Mohamed Adheeb, Mohamed Rifan, Kalpani Manathunga

Abstract:

To achieve constructive alignment in a higher education program, a clear set of learning outcomes must be defined. Traditional learning outcome definition techniques such as Bloom’s taxonomy are not written to be utilized by the student. This might be disadvantageous for students in student-centric learning settings where the students are expected to formulate their own learning strategies. To solve the problem, we propose the learning outcome relation and aggregation (LORA) model. To achieve alignment, we developed learning outcome, assessment, and resource authoring tools which help teachers to tag learning outcomes during creation. A pilot study was conducted with an expert panel consisting of experienced professionals in the education domain to evaluate whether the LORA model and tools present an improvement over the traditional methods. The panel unanimously agreed that the model and tools are beneficial and effective. Moreover, it helped them model learning outcomes in a more student centric and descriptive way.

Keywords: learning design, constructive alignment, Bloom’s taxonomy, learning outcome modelling

Procedia PDF Downloads 191
15150 Model of Application of Blockchain Technology in Public Finances

Authors: M. Vlahovic

Abstract:

This paper presents a model of public finances which combines three concepts: participatory budgeting, crowdfunding and blockchain technology. Participatory budgeting is defined as a process in which community members decide how to spend a part of the community's budget. Crowdfunding is the practice of funding a project by collecting small monetary contributions from a large number of people via an Internet platform. Blockchain technology is a distributed ledger that enables efficient and reliable transactions that are secure and transparent. In this hypothetical model, the government or authorities at the local/regional level would set up a platform where they would propose public projects to citizens. Citizens would browse through the projects and support or vote for those which they consider justified and necessary. In return, they would be entitled to a tax relief in the amount of their monetary contribution. Since blockchain technology enables the tracking of transactions, it can be used to mitigate corruption, money laundering and lack of transparency in public finances. Models of its application have already been created for e-voting, health records and land registries. By presenting a model of the application of blockchain technology in public finances, this paper takes into consideration the potential of blockchain technology to disrupt governments and make processes more democratic, secure, transparent and efficient. The framework for this paper consists of multiple streams of research, including key concepts of direct democracy, public finance (especially the voluntary theory of public finance), information and communication technology, especially blockchain technology, and crowdfunding. The framework defines the rules of the game, basic conditions for the implementation of the model, benefits, potential problems and development perspectives. As an oversimplified map of a new form of public finances, the proposed model identifies the primary factors that influence the possibility of implementing the model and that could be tracked, measured and controlled in case of experimentation with the model.

Keywords: blockchain technology, distributed ledger, participatory budgeting, crowdfunding, direct democracy, internet platform, e-government, public finance

Procedia PDF Downloads 155
15149 Econophysical Approach on Predictability of Financial Crisis: The 2001 Crisis of Turkey and Argentina Case

Authors: Arzu K. Kamberli, Tolga Ulusoy

Abstract:

Technological developments and the resulting global communication have made the 21st century an era in which large amounts of capital can be moved from one end of the world to the other at the push of a button. As a result, capital flows have accelerated, and they have brought crisis contagion with them. Given irrational human behaviour, financial crises that affect the whole world have become a fundamental problem for countries and have increased researchers' interest in the causes of crises and the periods in which they occur. The complex nature of financial crises and their structure, which cannot be explained linearly, have therefore also been taken up by the new discipline of econophysics. Although prediction mechanisms for financial crises exist, there is, as is well known, no definite method. In this context, this study develops an early econophysical approach to global financial crises using the concept of the electric field from electrostatics. The aim is to define a model that can act before a financial crisis occurs, identify financial fragility at an earlier stage, and help public- and private-sector actors, policy makers and economists with an econophysical approach. The 2001 Turkey crisis was assessed with data from the Turkish Central Bank covering 1992 to 2007, and for the 2001 Argentina crisis, data were taken from the IMF and the Central Bank of Argentina covering 1997 to 2007. As an econophysical method, an analogy is drawn between Gauss's law, used in the calculation of the electric field, and the forecasting of financial crises. Taking advantage of this analogy, the concept of Φ (Financial Flux), based on currency movements and money mobility, has been adopted for the pre-warning of crises. The Φ (Financial Flux) values, used for the first time in this study and obtained from this formula, were analyzed with Matlab software, and in this context the Φ (Financial Flux) values were confirmed to give a pre-warning for the 2001 crises in Turkey and Argentina.
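The abstract does not spell out the Φ formula itself; for reference, the electrostatic Gauss's law on which the analogy is built is

\[
\Phi_E \;=\; \oint_{S} \mathbf{E}\cdot \mathrm{d}\mathbf{A} \;=\; \frac{Q_{\text{enc}}}{\varepsilon_0},
\]

and the Financial Flux Φ is constructed, by analogy, as a net "flow" of capital (currency movements and money mobility) through the boundary of an economy. The precise financial counterparts of E, A and Q are not specified in the abstract, so this mapping is only indicative.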

Keywords: econophysics, financial crisis, Gauss's Law, physics

Procedia PDF Downloads 158
15148 Facility Anomaly Detection with Gaussian Mixture Model

Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho

Abstract:

The Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range for a sensor value, defined between a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, thus leading to suboptimal performance. We propose a multivariate approach which takes many sensor values into account at the same time. In particular, a Gaussian Mixture Model is used, trained to maximize the likelihood using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion. The negative log-likelihood is used as the anomaly score. The actual usage scenario is as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
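A minimal version of the scoring pipeline described above, with BIC-based selection of the number of components and the negative log-likelihood as anomaly score, can be sketched with scikit-learn; the sensor data and threshold rule below are synthetic placeholders.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
X_train = rng.normal(size=(2000, 5))   # placeholder multivariate sensor readings

# Pick the number of Gaussian components with the Bayesian Information Criterion.
candidates = [GaussianMixture(n_components=k, random_state=0).fit(X_train) for k in range(1, 8)]
gmm = min(candidates, key=lambda m: m.bic(X_train))

def anomaly_score(x):
    """Negative log-likelihood of a sensor vector; larger means more anomalous."""
    return -gmm.score_samples(np.atleast_2d(x))

threshold = np.quantile(anomaly_score(X_train), 0.99)   # e.g. flag the top 1%
new_reading = rng.normal(size=5) * 4                    # an unusually large reading
if anomaly_score(new_reading)[0] > threshold:
    print("alarm: human expert should check the facility")
```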

Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm

Procedia PDF Downloads 276
15147 Developing an ANN Model to Predict Anthropometric Dimensions Based on Real Anthropometric Database

Authors: Waleed A. Basuliman, Khalid S. AlSaleh, Mohamed Z. Ramadan

Abstract:

Applying anthropometric dimensions is considered one of the important factors when designing any human-machine system. In this study, the estimation of anthropometric dimensions has been improved by developing an artificial neural network that aims to predict the anthropometric measurements of males in Saudi Arabia. A total of 1427 Saudi males aged 6 to 60 participated, and twenty anthropometric dimensions were measured. These anthropometric measurements are important for designing the majority of work and life applications in Saudi Arabia. The data were collected over 8 months from different locations in Riyadh City. Five of these dimensions were used as predictor variables (inputs) of the model, and the remaining fifteen dimensions were set to be the measured variables (outcomes). The hidden layers were varied during the structuring stage, and the best performance was achieved with the network structure 6-25-15. The results showed that the developed neural network model was able to predict the body dimensions of the population of Saudi Arabia significantly well. The network mean absolute percentage error (MAPE) and root mean squared error (RMSE) were found to be 0.0348 and 3.225, respectively. The accuracy of the developed neural network was evaluated by comparing the predicted outcomes with those of a multiple regression model. The ANN model performed better and produced excellent correlation coefficients between the predicted and actual dimensions.
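The original network was trained on the measured Saudi database; the sketch below only reproduces the reported 6-25-15 feed-forward architecture in scikit-learn on synthetic stand-in data, so the inputs, targets and error values are placeholders rather than the study's results.

```python
import numpy as np
from sklearn.metrics import mean_absolute_percentage_error, mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
X = rng.normal(size=(1427, 6))                               # 6 placeholder predictor dimensions
W = rng.normal(size=(6, 15))
y = 100 + 5 * (X @ W) + rng.normal(scale=0.5, size=(1427, 15))  # 15 predicted body dimensions (cm-like)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# One hidden layer with 25 neurons, i.e. a 6-25-15 feed-forward network.
net = MLPRegressor(hidden_layer_sizes=(25,), max_iter=5000, random_state=0)
net.fit(X_tr, y_tr)

pred = net.predict(X_te)
print("MAPE:", round(mean_absolute_percentage_error(y_te, pred), 4))
print("RMSE:", round(float(np.sqrt(mean_squared_error(y_te, pred))), 3))
```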

Keywords: artificial neural network, anthropometric measurements, backpropagation, real anthropometric database

Procedia PDF Downloads 582
15146 Evolved Bat Algorithm Based Adaptive Fuzzy Sliding Mode Control with LMI Criterion

Authors: P.-W. Tsai, C.-Y. Chen, C.-W. Chen

Abstract:

In this paper, the stability analysis of a GA-based adaptive fuzzy sliding mode controller for a nonlinear system is discussed. First, a nonlinear plant is well-approximated and described with a reference model and a fuzzy model, both involving FLC rules. Then, the FLC rules and the consequent parameters are decided via an Evolved Bat Algorithm (EBA). After this, we guarantee a new tracking performance inequality for the control system. The tracking problem is characterized as solving an eigenvalue problem (EVP). Next, an adaptive fuzzy sliding mode controller (AFSMC) is proposed to stabilize the system so as to achieve good control performance. Lyapunov's direct method can be used to ensure the stability of the nonlinear system. It is shown that the stability analysis can reduce nonlinear systems to a linear matrix inequality (LMI) problem. Finally, a numerical simulation is provided to demonstrate the control methodology.

Keywords: adaptive fuzzy sliding mode control, Lyapunov direct method, swarm intelligence, evolved bat algorithm

Procedia PDF Downloads 446
15145 Modeling the Impacts of Road Construction on Land Values

Authors: Maha Almumaiz, Harry Evdorides

Abstract:

A change in land value typically occurs when a new interurban road construction causes an increase in accessibility; this change in adjacent land values differs according to land characteristics such as geographic location, land use type, land area and sale time (appraisal time). A multiple regression model is obtained to predict the percent change in land value (CLV) based on four independent variables, namely the land's distance from the constructed road, the area of the land, the nature of the land use and the time since completion of the road works. Random values of the percent change in land value were generated using Microsoft Excel with a range of up to 35%. The trend of the change in land value with the four independent variables was determined from literature references. The statistical analysis and model-building process were carried out using the IBM SPSS V23 software. The regression model suggests that, for land located within 3 miles straight-line distance from the road, the percent CLV is between 0 and 35%, depending on many factors, including the distance from the constructed road, land use, land area and time since completion of the new road.

Keywords: interurban road, land use types, new road construction, percent CLV, regression model

Procedia PDF Downloads 268
15144 On the Utility of Bidirectional Transformers in Gene Expression-Based Classification

Authors: Babak Forouraghi

Abstract:

A genetic circuit is a collection of interacting genes and proteins that enable individual cells to implement and perform vital biological functions such as cell division, growth, death, and signaling. In cell engineering, synthetic gene circuits are engineered networks of genes specifically designed to implement functionalities that are not evolved by nature. These engineered networks enable scientists to tackle complex problems such as engineering cells to produce therapeutics within the patient's body, altering T cells to target cancer-related antigens for treatment, improving antibody production using engineered cells, tissue engineering, and the production of genetically modified plants and livestock. The construction of computational models to realize genetic circuits is an especially challenging task since it requires the discovery of the flow of genetic information in complex biological systems. Building synthetic biological models is also a time-consuming process with relatively low prediction accuracy for highly complex genetic circuits. The primary goal of this study was to investigate the utility of a pre-trained bidirectional encoder transformer that can accurately predict gene expressions in genetic circuit designs. The main reason for using transformers is their innate ability (the attention mechanism) to take account of the semantic context present in long DNA chains, which is heavily dependent on the spatial representation of their constituent genes. Previous approaches to gene circuit design, such as CNN and RNN architectures, are unable to capture semantic dependencies in long contexts, as required in most real-world applications of synthetic biology. For instance, RNN models (LSTM, GRU), although able to learn long-term dependencies, greatly suffer from vanishing gradients and low efficiency when they sequentially process past states and compress contextual information into a bottleneck for long input sequences. In other words, these architectures are not equipped with the attention mechanisms necessary to follow a long chain of genes with thousands of tokens. To address the above-mentioned limitations, a transformer model was built in this work as a variation of the existing DNA Bidirectional Encoder Representations from Transformers (DNABERT) model. It is shown that the proposed transformer is capable of capturing contextual information from long input sequences with an attention mechanism. In previous works on genetic circuit design, traditional approaches to classification and regression, such as Random Forest, Support Vector Machines, and Artificial Neural Networks, were able to achieve reasonably high R2 accuracy levels of 0.95 to 0.97. However, the transformer model utilized in this work, with its attention-based mechanism, was able to achieve a perfect accuracy level of 100%. Further, it is demonstrated that the efficiency of the transformer-based gene expression classifier is not dependent on the presence of large amounts of training examples, which may be difficult to compile in many real-world gene circuit designs.
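A schematic of fine-tuning a BERT-style encoder for sequence classification with the Hugging Face Transformers library is given below; the checkpoint name, the k-mer tokenisation, and the toy sequences and labels are placeholders, not the DNABERT variant actually built in this work.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

CHECKPOINT = "bert-base-uncased"   # placeholder; a DNA-specific encoder would be used in practice

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(CHECKPOINT, num_labels=2)

def kmerize(seq, k=6):
    """Split a DNA string into overlapping k-mers so it resembles 'words' for the tokenizer."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

# Toy circuit sequences with binary expression labels (placeholders).
sequences = ["ATGCGTACGTTAGC", "GGCTTACGATCGAT"]
labels = torch.tensor([1, 0])
batch = tokenizer([kmerize(s) for s in sequences], padding=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):                       # a few illustrative optimisation steps
    out = model(**batch, labels=labels)  # the model computes the classification loss internally
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
print("final loss:", out.loss.item())
```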

Keywords: machine learning, classification and regression, gene circuit design, bidirectional transformers

Procedia PDF Downloads 65
15143 Your First Step to Understanding Research Ethics: Psychoneurolinguistic Approach

Authors: Sadeq Al Yaari, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari

Abstract:

Objective: This research aims to investigate research ethics in the field of science. Method: It is an exploratory study wherein the researchers attempted to cover the phenomenon at hand from the viewpoints of all specialists. Results: The discussion is based upon the findings resulting from the analysis the researchers undertook. Concerning the prediction of results, the researcher first needs to seek highly qualified people in the field of research, as well as in the field of statistics, who share the philosophy of the research. Then s/he should make sure that s/he is adequately trained in the specific techniques, methods and statistical programs that are used in the study. S/he should also be committed to continually analysing the data with the most current methods.

Keywords: research ethics, legal, rights, psychoneurolinguistics

Procedia PDF Downloads 50
15142 [Keynote Speech]: Simulation Studies of Pulsed Voltage Effects on Cells

Authors: Jiahui Song

Abstract:

In order to predict or explain a complicated biological process, it is important first to construct mathematical models that can be used to yield analytical solutions. Through numerical simulation, mathematical model results can be used to test scenarios that might not be easily attained in a laboratory experiment, or to predict parameters or phenomena. High-intensity, nanosecond pulse electroporation has been a recent development in bioelectrics. A dynamic pore model can be obtained by including a dynamic aspect and a dependence on the pore population density in the pore formation energy equation in order to analyze and predict such electroporation effects. For greater accuracy, with the inclusion of atomistic details, molecular dynamics (MD) simulations were also carried out during this study. Besides inducing pores in cells, external voltages could in principle also be used to modulate action potential generation in nerves. This could have an application in electrically controlled 'pain management'. A simple model-based rate-equation treatment of the various cellular biochemical processes has also been used to predict the pulse-number-dependent cell survival trends.

Keywords: model, high-intensity, nanosecond, bioelectrics

Procedia PDF Downloads 230
15141 Assessment of Alternative Water Resources and Growing Media in Green Roofs

Authors: Hamideh Nouri, Sattar Chavoshi Borujeni

Abstract:

Grey infrastructure is an unavoidable part of urbanisation that is threatening local microclimates. Sustainable urbanisation requires more green infrastructure in cities, such as green roofs, to minimise the impacts of urbanisation. The environmental, social and economic benefits of green roofs are widely discussed. However, there is still a lack of assessment of the water management of green roofs. This paper aimed to assess the irrigation management of green roofs in a semi-arid region where blue water scarcity is one of the primary challenges in urban water management. To determine the appropriate water source and growing medium for green roofs, an experiment was established at the University of South Australia, Australia. This study compared the effects of two growing media and three water sources on the drainage quality, medium weight and survival rate of potted Tussock grass (Poa labillardierei), a plant endemic to Australia and recommended for green roofs. The three irrigation sources were tap water, a wastewater-stormwater mix, and rainwater. The growing media were a natural sandy loam soil and Scoria, one of the most widely used commercial growing media for green roofs. The drainage quality of these media was tested by analysing leachate samples. The medium weight was measured before and after watering, and all pots were monitored for their survival rates. The results showed that although plant growth was significantly higher in Scoria, the survival rate was lower. For all three water sources, the EC and pH of the leachate were significantly lower for Scoria than for the sandy loam soil. Across the water sources, however, the wastewater-stormwater mix had the highest EC, and rainwater the lowest. The results did not show a significant difference in pH between the different water sources in the same medium. Our experimental results identified Scoria and rainwater as the best medium and water source for green roofs.

Keywords: green smart cities, urban water, green roofs, green walls, wastewater, stormwater

Procedia PDF Downloads 161
15140 The Log S-fbm Nested Factor Model

Authors: Othmane Zarhali, Cécilia Aubrun, Emmanuel Bacry, Jean-Philippe Bouchaud, Jean-François Muzy

Abstract:

The Nested factor model was introduced by Bouchaud et al., where asset return fluctuations are explained by common factors representing the market economic sectors and by residuals (noises) that share with the factors a common dominant volatility mode, in addition to the idiosyncratic mode proper to each residual. This construction implies that the factor and residual log-volatilities are correlated. Here, we consider the case of a single factor where the only dominant common mode is an S-fbm process (introduced by Peng, Bacry and Muzy) with Hurst exponent H around 0.11, and where the residuals have, in addition to this common mode, idiosyncratic components with Hurst exponents H around 0. The reason for considering this configuration is twofold: to preserve the characteristics of the Nested factor model introduced by Bouchaud et al., and to propose a framework that reproduces the stylized fact reported by Peng et al., namely that the Hurst exponents of stock indices are large compared to those of individual stocks. In this work, we show that the construction of the Log S-fbm Nested factor model leads to the Hurst exponent of single stocks being that of the idiosyncratic volatility modes and the Hurst exponent of the index being that of the common volatility mode. Furthermore, we propose a statistical procedure, with theoretical guarantees, to estimate the factor Hurst exponent from the stock return dynamics, with good results in the limit where the number of stocks N goes to infinity. Last but not least, we show that the factor can be seen as an index constructed from the single stocks weighted by specific coefficients.
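The estimation procedure proposed in the paper is not reproduced here; the snippet below only illustrates the generic scaling idea behind a Hurst-exponent estimate, fitting the second-order structure function of a path on a log-log scale. The synthetic path is a plain random walk (true H = 0.5) used as a placeholder for a log-volatility trajectory.

```python
import numpy as np

def hurst_exponent(path, scales=(1, 2, 4, 8, 16, 32, 64)):
    """Estimate H from the scaling of mean squared increments: E[|X(t+s)-X(t)|^2] ~ s^(2H)."""
    path = np.asarray(path, dtype=float)
    msd = [np.mean((path[s:] - path[:-s]) ** 2) for s in scales]
    slope, _ = np.polyfit(np.log(scales), np.log(msd), 1)
    return slope / 2.0

rng = np.random.default_rng(42)
walk = np.cumsum(rng.normal(size=100_000))   # Brownian-like path, true H = 0.5
print(round(hurst_exponent(walk), 3))        # should come out close to 0.5
```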

Keywords: hurst exponent, log S-fbm model, nested factor model, small intermittency approximation

Procedia PDF Downloads 59
15139 Modeling of Bipolar Charge Transport through Nanocomposite Films for Energy Storage

Authors: Meng H. Lean, Wei-Ping L. Chu

Abstract:

The effects of ferroelectric nanofiller size, shape, loading, and polarization on bipolar charge injection, transport, and recombination through amorphous and semicrystalline polymers are studied. A 3D particle-in-cell model extends the classical electrical double layer representation to treat ferroelectric nanoparticles. Metal-polymer charge injection assumes Schottky emission and Fowler-Nordheim tunneling, migration through a field-dependent Poole-Frenkel mobility, and recombination with Monte Carlo selection based on collision probability. A boundary integral equation method is used for the solution of the Poisson equation, coupled with a second-order predictor-corrector scheme for robust time integration of the equations of motion. The stability criterion of the explicit algorithm conforms to the Courant-Friedrichs-Lewy limit. The trajectories of charges that make it through the film are curvilinear paths that meander through the interspaces. Results indicate that the charge transport behavior depends on the nanoparticle polarization, with anti-parallel orientation showing the highest leakage conduction and the lowest level of charge trapping in the interaction zone. The simulation prediction of a size range of 80 to 100 nm to minimize attachment and maximize conduction is validated by theory. Attached charge fractions go from 2.2% to 97% as the nanofiller size is decreased from 150 nm to 60 nm. The computed conductivity of 0.4 × 10⁻¹⁴ S/cm is in agreement with published data for plastics. Charge attachment is increased with spheroids due to the increase in surface area, especially for oblate spheroids, showing the influence of larger cross-sections. Charge attachment to nanofillers and nanocrystallites increases with vol.% loading or degree of crystallinity, and saturates at about 40 vol.%.
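For reference, the standard textbook forms of the injection and transport laws named above are (the parameter values used in the simulations are not given in the abstract):

\[
J_{\text{Schottky}} = A^{*}T^{2}\exp\!\left(-\frac{q\left(\phi_B-\sqrt{qE/4\pi\varepsilon}\right)}{k_BT}\right),
\qquad
\mu_{\text{PF}}(E)=\mu_0\exp\!\left(\frac{\beta_{\text{PF}}\sqrt{E}}{k_BT}\right),
\quad \beta_{\text{PF}}=\sqrt{\frac{q^{3}}{\pi\varepsilon}},
\]

where \(A^{*}\) is the effective Richardson constant, \(\phi_B\) the injection barrier, \(E\) the local field, \(\varepsilon\) the permittivity, and \(\mu_0\) the zero-field mobility; Fowler-Nordheim tunneling adds a contribution of the form \(J \propto E^{2}\exp(-B/E)\) at high fields.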

Keywords: nanocomposites, nanofillers, electrical double layer, bipolar charge transport

Procedia PDF Downloads 360
15138 The Effects of Quality of Web-Based Applications on Competitive Advantage: An Empirical Study in Commercial Banks in Jordan

Authors: Faisal Asad Aburub

Abstract:

Many organizations are investing in web applications and technologies in order to be competitive; however, some of them have not been able to achieve their goals. The quality of web-based applications could play an important role in helping organizations be competitive. The aim of this study is therefore to investigate the impact of the quality of web-based applications on achieving a competitive advantage. A new model has been developed, and an empirical investigation was performed on the banking sector in Jordan to test it. The results show that the impact of web-based applications on competitive advantage is significant. Finally, further work is planned to validate and evaluate the proposed model using several domains.

Keywords: competitive advantage, web-based applications, empirical investigation, commercial banks in Jordan

Procedia PDF Downloads 345
15137 Comparison of the Anthropometric Obesity Indices in Prediction of Cardiovascular Disease Risk: Systematic Review and Meta-analysis

Authors: Saeed Pourhassan, Nastaran Maghbouli

Abstract:

Statement of the problem: The relationship between obesity and cardiovascular diseases has been studied widely (1). The distribution of fat tissue has gained attention in relation to cardiovascular risk factors during long-term research (2). The American College of Cardiology/American Heart Association (ACC/AHA) score is the most widely used and most reliable cardiovascular risk (CVR) assessment tool (3). This study aimed to determine which anthropometric index is better at discriminating high-CVR patients from low-risk ones, using the ACC/AHA score, and to find the best index as a CVR predictor for both genders in different races and countries. Methodology & theoretical orientation: The literature in PubMed, Scopus, Embase, Web of Science, and Google Scholar was searched by two independent investigators using the keywords "anthropometric indices," "cardiovascular risk," and "obesity." The search strategy was limited to full-text studies published in English prior to Jan 2022. Studies using the ACC/AHA risk assessment tool for CVR and including at least two anthropometric indices (traditional and novel) were included. Study characteristics and data were extracted. The relative risks were pooled with the use of a random-effects model, and the analysis was repeated in subgroups. Findings: The pooled relative risks for 7 studies with 16,348 participants were 1.56 (1.35-1.72) for BMI, 1.67 (1.36-1.83) for WC [waist circumference], 1.72 (1.54-1.89) for WHR [waist-to-hip ratio], 1.60 (1.44-1.78) for WHtR [waist-to-height ratio], 1.61 (1.37-1.82) for ABSI [A Body Shape Index] and 1.63 (1.32-1.89) for CI [Conicity Index]. Considering gender, WC had the highest RR among females and WHR among men. The heterogeneity of the studies was moderate (I²: 56%), which was not reduced by subgroup analysis. Some indices, such as VAI and LAP, were evaluated in just one study. Conclusion & significance: This meta-analysis showed that WHR predicts CVR better than BMI or WHtR. Some new indices, such as CI and ABSI, are less accurate than WHR and WC. Among women, WC seems to be a better choice to predict cardiovascular disease risk.
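A minimal random-effects pooling of log relative risks is sketched below using the common DerSimonian-Laird estimator (the abstract does not state which estimator was used); the per-study relative risks and confidence intervals are placeholders, not the extracted study data.

```python
import numpy as np

def pool_random_effects(rr, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of relative risks given 95% CIs."""
    y = np.log(rr)                                        # log relative risks
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # standard errors from the CIs
    w = 1 / se**2                                         # fixed-effect weights

    # Between-study variance tau^2 via the DerSimonian-Laird estimator.
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    i2 = max(0.0, (q - df) / q) * 100                     # heterogeneity, percent

    w_star = 1 / (se**2 + tau2)                           # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    pooled_se = np.sqrt(1 / np.sum(w_star))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci, i2

# Hypothetical per-study relative risks for one index (e.g. WHR).
rr_pooled, ci, i2 = pool_random_effects(
    rr=[1.8, 1.6, 1.9, 1.5], ci_low=[1.4, 1.2, 1.5, 1.1], ci_high=[2.3, 2.1, 2.4, 2.0])
print(f"pooled RR = {rr_pooled:.2f} ({ci[0]:.2f}-{ci[1]:.2f}), I2 = {i2:.0f}%")
```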

Keywords: obesity, cardiovascular disease, risk assessment, anthropometric indices

Procedia PDF Downloads 108