Search results for: Weibull distribution model
18302 Modelling of Cavity Growth in Underground Coal Gasification
Authors: Preeti Aghalayam, Jay Shah
Abstract:
Underground coal gasification (UCG) is the in-situ gasification of unmineable coals to produce syngas. In UCG, gasifying agents are injected into the coal seam, and a reactive cavity is formed as the coal is consumed. The cavity formed is typically hemispherical, and this work presents a MATLAB model of the UCG cavity to predict the composition of the output gases. There are seven radial and two time-variant ODEs. A MATLAB solver (ode15s) is used to solve the radial ODEs. Two for-loops are implemented in the model: one for time variation and another for radial variation. In the time loop, the radial ODEs are solved using the MATLAB solver. The radial loop is nested inside the time loop, and the density ODEs are solved numerically using the Euler method. The model is validated by comparing it with literature results from laboratory-scale experiments. The model predicts the radial and time variation of the product gases inside the cavity.
Keywords: gasification agent, MATLAB model, syngas, underground coal gasification (UCG)
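The abstract does not give the governing equations, but the numerical scheme it describes (a time loop wrapping a stiff radial ODE solve, with densities advanced by an explicit Euler step) can be sketched in Python with SciPy's stiff solver as an analogue of MATLAB's ode15s. The rate law, parameters, and grid below are placeholders, not the authors' model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder kinetics: 7 radial state variables (e.g., gas species), illustrative only.
def radial_rhs(r, y, rho_coal):
    k = 1e-3 * rho_coal          # hypothetical rate constant tied to local coal density
    return -k * y                # stand-in for the seven coupled radial balances

n_r, n_t = 50, 100               # radial nodes and time steps (assumed)
r = np.linspace(0.01, 1.0, n_r)  # cavity radius coordinate [m]
dt = 10.0                        # time step [s]
rho = np.full(n_r, 1300.0)       # coal density profile [kg/m^3] (assumed)
y0 = np.ones(7)                  # inlet composition of the 7 tracked species (assumed)

for step in range(n_t):                      # time loop
    # stiff radial ODE solve, analogous to MATLAB's ode15s (BDF method)
    sol = solve_ivp(radial_rhs, (r[0], r[-1]), y0, t_eval=r,
                    args=(rho.mean(),), method="BDF")
    consumption = 1e-5 * sol.y.sum(axis=0)   # hypothetical local consumption rate
    rho = rho - dt * consumption             # explicit Euler update of the density ODEs

print("final mean coal density:", rho.mean())
```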
18301 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models
Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah
Abstract:
In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emergent stock markets as the basis, we argue that incorporating stochastic volatility into the time-varying parameter estimation significantly improves forecasting performance via Maximum Likelihood Estimation. Using the technique, we observe the long-memory behavior of the data sets and the one-step-ahead-predicted log-volatility with ±2 standard errors, even though the observed noise follows a Normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, which makes our estimation algorithm suitable for large data sets because it has good convergence properties.
Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model
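As a rough illustration of the kind of Ornstein-Uhlenbeck dynamics described here, the sketch below simulates an OU process (a stand-in for log-volatility) with an Euler-Maruyama scheme and recovers its parameters by maximising the exact Gaussian transition likelihood. The parameter values are invented for the example and do not come from the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
theta, mu, sigma, dt, n = 2.0, -1.0, 0.5, 1/252, 2000   # assumed "true" values

# Euler-Maruyama simulation of dX = theta*(mu - X) dt + sigma dW (log-volatility proxy)
x = np.empty(n); x[0] = mu
for t in range(1, n):
    x[t] = x[t-1] + theta*(mu - x[t-1])*dt + sigma*np.sqrt(dt)*rng.standard_normal()

def neg_loglik(params):
    th, m, s = params
    if th <= 0 or s <= 0:
        return np.inf
    a = np.exp(-th*dt)                              # exact OU transition X_t | X_{t-1}
    var = s**2 * (1 - a**2) / (2*th)
    mean = m + (x[:-1] - m)*a
    resid = x[1:] - mean
    return 0.5*np.sum(np.log(2*np.pi*var) + resid**2/var)

fit = minimize(neg_loglik, x0=[1.0, 0.0, 0.3], method="Nelder-Mead")
print("MLE (theta, mu, sigma):", fit.x)
```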
18300 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage
Authors: Oh Hyeon Jeon, WooYoung Jung
Abstract:
In this study, seepage analysis was performed for the level difference between the upstream and downstream sides of a weir structure, for the safety evaluation of the weir against flooding. The Monte Carlo simulation method was employed by considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir structure. Moreover, modeling of the weir structure was carried out using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, with consideration of the uncertainty of the corresponding permeability coefficient. Subsequently, a fragility function could be constructed from the response of the numerical analysis; these fragility results can be used to determine the weakness of a weir structure subjected to a flooding disaster. They can also be used as reference data to comprehensively predict the probability of failure and the degree of damage of a weir structure.
Keywords: weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo simulation, permeability coefficient
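A stripped-down version of the workflow described here (sampling an uncertain permeability coefficient, counting exceedances of a seepage limit at each water level, and reading the result as a fragility curve) might look like the following. The lognormal permeability statistics, the seepage formula, and the failure criterion are placeholders, since the actual response in the study comes from the ABAQUS finite element model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 10_000
# Assumed lognormal permeability coefficient [m/s]: median 1e-6, log-std 0.5
k = rng.lognormal(mean=np.log(1e-6), sigma=0.5, size=n_samples)

levels = np.linspace(1.0, 6.0, 11)       # upstream-downstream head difference [m]
q_limit = 5e-6                           # hypothetical allowable seepage flux [m/s]

fragility = []
for h in levels:
    # crude Darcy-type stand-in for the FE seepage response: q ~ k * i
    q = k * (h / 10.0)                   # hydraulic gradient with an assumed 10 m seepage path
    fragility.append(np.mean(q > q_limit))   # probability of exceeding the limit

for h, pf in zip(levels, fragility):
    print(f"head difference {h:4.1f} m -> P(failure) = {pf:.3f}")
```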
18299 Prediction Compressive Strength of Self-Compacting Concrete Containing Fly Ash Using Fuzzy Logic Inference System
Authors: Belalia Douma Omar, Bakhta Boukhatem, Mohamed Ghrici
Abstract:
Self-compacting concrete (SCC), developed in Japan in the late 80s, has enabled the construction industry to reduce demand on resources, improve working conditions, and reduce environmental impact by eliminating the need for compaction. Fuzzy logic (FL) approaches have recently been used to model some human activities in many areas of civil engineering applications, and these systems have produced very good results in modelling experimental studies. In the present study, a model for predicting the compressive strength of SCC containing various proportions of fly ash, as partial replacement of cement, has been developed by using an Adaptive Neuro-Fuzzy Inference System (ANFIS). For the purpose of building this model, a database of experimental data was gathered from the literature and used for training and testing the model. The data used as inputs to the fuzzy logic model are arranged as five parameters covering the total binder content, fly ash replacement percentage, water content, superplasticizer, and age of specimens. The training and testing results of the fuzzy logic model have shown a strong potential for predicting the compressive strength of SCC containing fly ash in the considered range.
Keywords: self-compacting concrete, fly ash, strength prediction, fuzzy logic
18298 Natural Ventilation around and through Building: A Numerical Study
Authors: A. Kaddour, S. M. A. Bekkouche
Abstract:
Limiting heat losses during the ventilation of indoor building spaces has become a basic aim for architects. Much experience has been gained in the ventilation of indoor spaces. Nevertheless, due to the complexity of the applications, attempts to create a theoretical basis for solving the related problems are limited, especially for determining the minimum ventilation period required within a designated space. In this paper, we have approached this matter both theoretically and computationally. The conclusion we reached was that controlled ventilation of spaces through vent holes that successively open and close at regular time intervals can limit the excessive circulation of air masses, which in turn limits heat losses. Air change rates through open and tilted windows in rooms of residential buildings, driven by atmospheric motions, are investigated to evaluate natural ventilation concepts. A thermal building simulation model is used. A separate sample storey and a sample single room at larger scales were used to measure air transport through window openings under the influence of the external pressure distribution.
Keywords: natural ventilation, temperature factor, air change rates, air circulation
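The "air change rate" metric evaluated in the study can be illustrated with the usual definition, converting a measured volume flow through a window opening into air changes per hour; the flow and room volume below are assumed example values, not data from the paper.

```python
def air_changes_per_hour(volume_flow_m3_s: float, room_volume_m3: float) -> float:
    """Air change rate ACH: volumetric airflow converted to m^3/h, divided by room volume."""
    return volume_flow_m3_s * 3600.0 / room_volume_m3

# Assumed example: 0.035 m^3/s measured through a tilted window of a 50 m^3 room
print(air_changes_per_hour(0.035, 50.0), "air changes per hour")
```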
18297 A Mathematical Description of a Growing Cell Colony Based on the Mechanical Bidomain Model
Authors: Debabrata Auddya, Bradley J. Roth
Abstract:
The mechanical bidomain model is used to describe a colony of cells growing on a substrate. Analytical expressions are derived for the intracellular and extracellular displacements. Mechanotransduction events are driven by the difference between the displacements in the two spaces, corresponding to the force acting on integrins. The equation for the displacement consists of two terms: one proportional to the radius that is the same in the intracellular and extracellular spaces (the monodomain term) and one that is proportional to a modified Bessel function that is responsible for mechanotransduction (the bidomain term). The model predicts that mechanotransduction occurs within a few length constants of the colony’s edge, and an expression for the length constant contains the intracellular and extracellular shear moduli and the spring constant of the integrins coupling the two spaces. The model predictions are qualitatively consistent with experiments on human embryonic stem cell colonies, in which differentiation is localized near the edge.
Keywords: cell colony, integrin, mechanical bidomain model, stem cell, stress-strain, traction force
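The displacement structure described above (a monodomain term proportional to radius plus a bidomain term governed by a modified Bessel function of radius over a length constant) can be evaluated numerically as in the sketch below. The coefficients, moduli, spring constant, and the exact functional form are illustrative guesses at the paper's analytical solution, not a reproduction of it.

```python
import numpy as np
from scipy.special import iv   # modified Bessel function of the first kind

# Assumed material parameters (arbitrary units)
nu_i, nu_e = 1.0, 0.5          # intracellular / extracellular shear moduli
K = 10.0                       # integrin spring constant coupling the two spaces
lam = np.sqrt(nu_i * nu_e / (K * (nu_i + nu_e)))   # length constant (assumed form)

R = 1.0                        # colony radius
r = np.linspace(0.0, R, 200)

A, B = 0.01, 0.002             # assumed amplitudes of the two terms
u_mono = A * r                              # monodomain term, identical in both spaces
u_bi = B * iv(1, r / lam)                   # bidomain term driving mechanotransduction
u_intra = u_mono + u_bi
u_extra = u_mono - (nu_i / nu_e) * u_bi     # difference u_intra - u_extra loads the integrins

print("length constant:", lam)
print("intra-extra displacement difference at the edge:", u_intra[-1] - u_extra[-1])
```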
18296 Different Stages for the Creation of Electric Arc Plasma through Slow Rate Current Injection to Single Exploding Wire, by Simulation and Experiment
Authors: Ali Kadivar, Kaveh Niayesh
Abstract:
This work simulates the voltage drop and resistance during the explosion of copper wires of diameters 25, 40, and 100 µm, surrounded by 1 bar nitrogen and exposed to a 150 A current, before plasma formation. The absorption of electrical energy in an exploding wire is greatly diminished once the plasma is formed. This study shows the importance of considering radiation and heat conductivity for the accuracy of the circuit simulations. The radiation of the dense plasma formed on the wire surface is modeled with the Net Emission Coefficient (NEC) and is combined with heat conductivity through the PLASIMO® software. A time-transient code for analyzing wire explosions driven by a slow current rise rate is developed. It solves a circuit equation coupled with one-dimensional (1D) equations for the copper electrical conductivity as a function of its physical state and Net Emission Coefficient (NEC) radiation. First, the initial voltage drop over the copper wire, the current, and the temperature distribution at the time of expansion are derived. The experiments have demonstrated that wires remain rather uniform lengthwise during the explosion and can be simulated using 1D simulations. Data from the first stage are then used as the initial conditions of the second stage, in which a simplified 1D model for high-Mach-number flows is adopted to describe the expansion of the core. The current was carried by the vaporized wire material before it was dispersed in nitrogen by the shock wave. In the third stage, using a three-dimensional model of the test bench, the streamer threshold is estimated. The electrical breakdown voltage is calculated without solving a full-blown plasma model by integrating Townsend growth coefficients (TdGC) along electric field lines. The BOLSIG⁺ and LAPLACE databases are used to calculate the TdGC at different mixture ratios of nitrogen/copper vapor. The simulations show that both radiation and heat conductivity should be considered for an adequate description of wire resistance, and that gaseous discharges start at lower voltages than expected due to ultraviolet radiation and the exploding shocks, which may have ionized the nitrogen.
Keywords: exploding wire, Townsend breakdown mechanism, streamer, metal vapor, shock waves
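The streamer-threshold step described for the third stage, integrating Townsend growth coefficients along an electric field line and comparing the integral with a breakdown criterion, can be sketched as below. The field profile and the effective ionization coefficient are simple placeholders; in the paper these come from the 3D field solution and from the BOLSIG⁺/LAPLACE data for nitrogen/copper vapor mixtures.

```python
import numpy as np

def alpha_eff(E_Vm):
    """Hypothetical effective ionization coefficient [1/m] as a function of field strength."""
    E0, a0 = 3.0e6, 4.0e3          # placeholder scaling constants, not BOLSIG+ data
    return a0 * np.exp(-E0 / np.maximum(E_Vm, 1.0))

# Assumed field profile along one field line from the wire surface outwards
s = np.linspace(0.0, 0.02, 500)                # arc length along the field line [m]
E = 8.0e6 * np.exp(-s / 0.004)                 # decaying field magnitude [V/m] (assumed)

K = np.trapz(alpha_eff(E), s)                  # integral of alpha_eff along the line
K_crit = 18.0                                  # commonly used streamer-criterion value

print(f"ionization integral K = {K:.1f}, streamer onset expected: {K >= K_crit}")
```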
18295 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling
Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed
Abstract:
The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm on its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) single-hidden-layer and (2) double-hidden-layer feedforward back-propagation networks. The results revealed that, in general, the GDM optimisation algorithm, with its adaptive learning capability, used a relatively shorter time in both the training and validation phases compared to the Levenberg-Marquardt (LM) and Bayesian regularisation (Br) algorithms, though learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions for 1-day and 5-day ahead forecasts. In specific statistical terms, average model performance efficiency measured by the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, respectively, for the training and validation phases. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, it is imperative to state that the adoption of ANNs for real-time forecasting should employ training algorithms that do not carry computational overhead, unlike LM, which requires computation of the Hessian matrix, protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and quality of the forecast, as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
Keywords: streamflow, neural network, optimisation, algorithm
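The performance statistics quoted here, the coefficient of efficiency and the relative error measures MAE, MAPE, and MSRE, can be computed as in the sketch below; the formulas follow their usual hydrological definitions, and the sample arrays are invented.

```python
import numpy as np

def coefficient_of_efficiency(obs, sim):
    """Nash-Sutcliffe coefficient of efficiency (CE)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def error_stats(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    mae = np.mean(np.abs(obs - sim))                       # mean absolute error
    mape = np.mean(np.abs((obs - sim) / obs)) * 100.0      # mean absolute percentage error
    msre = np.mean(((obs - sim) / obs) ** 2)               # mean squared relative error
    return mae, mape, msre

obs = np.array([12.0, 35.0, 80.0, 22.0, 15.0])   # invented streamflow observations
sim = np.array([10.5, 38.0, 74.0, 25.0, 14.0])   # invented model output
print("CE:", coefficient_of_efficiency(obs, sim))
print("MAE, MAPE, MSRE:", error_stats(obs, sim))
```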
18294 Numerical Modeling of Air Pollution with PM-Particles and Dust
Authors: N. Gigauri, A. Surmava, L. Intskirveli, V. Kukhalashvili, S. Mdivani
Abstract:
The subject of this study is the numerical modeling of atmospheric air pollution. Tbilisi, the capital of Georgia, with a population of one and a half million and difficult terrain, was chosen as the object of research. The main sources of pollution in Tbilisi are currently vehicles and construction dust. The concentrations of dust and PM (Particulate Matter) were determined in the air of Tbilisi and in its vicinity, and their monthly maximum, minimum, and average concentrations were estimated. The processes of dust propagation in the atmosphere of the city and its surrounding territory are modelled using a 3D regional model of atmospheric processes and an admixture transfer-diffusion equation. The distributions of the polluted cloud and the dust concentrations were obtained for different areas of the city, at different heights, and at different time intervals, with background stationary westward and eastward winds. It is accepted that the difficult terrain and mountain-bar circulation affect the deformation of the cloud and its spread, and the time periods are determined during which the dust concentration in the city is greater than the MAC (Maximum Allowable Concentration, MAC = 0.5 mg/m³).
Keywords: air pollution, dust, numerical modeling, PM-particles
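A one-dimensional finite-difference analogue of the admixture transfer-diffusion equation used in the study, advecting and diffusing a dust concentration with a background wind and checking it against the MAC of 0.5 mg/m³, is sketched below; the wind speed, diffusivity, source strength, and grid are illustrative and are not the 3D regional model.

```python
import numpy as np

nx, dx, dt, nsteps = 200, 100.0, 1.0, 3600     # grid cells, spacing [m], time step [s], steps
u, K = 3.0, 50.0                               # wind speed [m/s], eddy diffusivity [m^2/s]
MAC = 0.5                                      # maximum allowable concentration [mg/m^3]

c = np.zeros(nx)                               # dust concentration [mg/m^3]
source_cell, source_rate = 20, 0.01            # assumed emission cell and rate [mg/m^3 per s]

for _ in range(nsteps):
    c[source_cell] += source_rate * dt
    # upwind advection + central diffusion (explicit; dt chosen to keep the scheme stable)
    adv = -u * (c - np.roll(c, 1)) / dx
    dif = K * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    c = c + dt * (adv + dif)
    c[0] = 0.0                                 # clean upwind boundary

print("cells exceeding MAC:", int(np.sum(c > MAC)), "max concentration:", c.max())
```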
18293 A Model of Preventing Global Financial Crisis: Gauss Law Model Proposal Used in Electrical Field Calculations
Authors: Arzu K. Kamberli
Abstract:
This article examines the relationship between economics and physics, starting with Adam Smith, with a new econophysics approach that applies the Gauss law model used for electric field calculations, which may allow us to anticipate a global financial crisis. For this purpose, the similarities between the Gauss law as used in electric field calculations and the global financial crisis have been explained through the formula, and a model has been suggested to predict the risks of financial systems from electric field calculations. Thus, this study is expected to help prevent a global financial crisis with the contribution of the sciences of economics and physics, from the perspective of econophysics.
Keywords: econophysics, electric field, financial system, Gauss law, global financial crisis
18292 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review
Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha
Abstract:
The main purpose and focus of this paper are to determine the Interoperability Maturity Models to consider when using School Management Systems (SMS). The importance of this is to inform and help schools know which Interoperability Maturity Model is best suited for their SMS. To address this purpose, the paper applies a scoping review to ensure that all aspects are covered. The scoping review includes papers written from 2012-2019, and a comparison of the different types of Interoperability Maturity Models is discussed in detail, including the background information, the levels of interoperability, and the areas for consideration in each Maturity Model. The literature was obtained from the following databases: IEEE Xplore and Scopus; the following search engines were used: Harzing's and Google Scholar. The topic of the paper was used as a search term for the literature, and the term 'Interoperability Maturity Models' was used as a keyword. The data were analyzed in terms of the definition of interoperability, Interoperability Maturity Models, and levels of interoperability. The results provide a table that shows the focus area of concern for each Maturity Model (based on the scoping review, in which only 24 papers were found to be best suited out of the 740 publications initially identified in the field). This resulted in the most discussed Interoperability Maturity Models for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organizational Interoperability Maturity Model for C2 (OIM).
Keywords: interoperability, interoperability maturity model, school management system, scoping review
18291 Co-integration for Soft Commodities with Non-Constant Volatility
Authors: E. Channol, O. Collet, N. Kostyuchyk, T. Mesbah, Quoc Hoang Long Nguyen
Abstract:
In this paper, a pricing model is proposed for co-integrated commodities, extending the Larsson model. The futures formulae have been derived, and tests have been performed with non-constant volatility. The model has been applied to energy commodities (gas, CO2, energy) and soft commodities (corn, wheat). Results show that non-constant volatility leads to more accurate short-term prices, which provides a better evaluation of value-at-risk and, more generally, improves risk management.
Keywords: co-integration, soft commodities, risk management, value-at-risk
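The kind of co-integration relationship the pricing model builds on can be checked on price series with an Engle-Granger test, as in the sketch below using statsmodels; the simulated corn/wheat-like series are synthetic and stand in for actual futures data.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(42)
n = 1000
common = np.cumsum(rng.normal(0, 1, n))            # shared stochastic trend
corn = 100 + common + rng.normal(0, 1, n)          # synthetic "corn" price level
wheat = 50 + 0.8 * common + rng.normal(0, 1, n)    # synthetic "wheat" price level

t_stat, p_value, crit = coint(corn, wheat)         # Engle-Granger co-integration test
print(f"t-statistic {t_stat:.2f}, p-value {p_value:.3f}")
print("co-integrated at the 5% level:", p_value < 0.05)
```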
18290 Comparative Analysis of Patent Protection between Health System and Enterprises in Shanghai, China
Authors: Na Li, Yunwei Zhang, Yuhong Niu
Abstract:
The study discussed the patent protection of the health system and enterprises in Shanghai. Comparisons of the technical distribution and scope of patent protection between the Shanghai health system and enterprises were made using the methods of IPC classification, co-word analysis, and visual social network analysis. The results reflected a decreasing order within the IPC A61 area, namely A61B, A61K, A61M, and A61F, with A61B requiring further investigation. Within A61B, the subclass A61B17 was found to contain the largest number of authorized patents. Within A61B17, fracture fixation, ligament reconstruction, cardiac surgery, and biopsy detection were common fields of concern for both the Shanghai health system and enterprises. However, compared with cardiac closure, to which Shanghai enterprises paid attention, the Shanghai health system was more inclined towards blockages and hemostatic tools. The results also revealed that the scope of patent protection of Shanghai enterprises was relatively centralized, and Shanghai enterprises had a series of comprehensive strategies for protecting core patents. In contrast, the Shanghai health system was considered to lack strategic protection for core patents.
Keywords: co-words analysis, IPC classification, patent protection, technical distribution
18289 Modeling Sustainable Truck Rental Operations Using Closed-Loop Supply Chain Network
Authors: Khaled S. Abdallah, Abdel-Aziz M. Mohamed
Abstract:
Moving industries consume numerous resources and dispose of masses of used packaging materials. Proper sorting, recycling, and disposal of the packaging materials are necessary to avoid a severe pollution disaster. This research paper presents a conceptual model that proposes sustainable truck rental operations instead of the regular ones. An optimization model was developed to select the locations of truck rental centers, collection sites, and maintenance and repair sites, and to identify the rental fees to be charged for all routes so as to maximize the total closed-loop supply chain profits. Fixed costs of vehicle purchasing, costs of constructing collection centers and repair centers, as well as the fixed costs paid to use disposal and recycling centers, are considered. Operating costs include truck maintenance and repair costs, the cost of recycling and disposing of the packing materials, and the costs of relocating the trucks. A mixed-integer model is developed, followed by a simulation model to examine the factors affecting the operation of the model.
Keywords: modeling, truck rental, supply chain management
18288 Evaluation of Biochemical Oxygen Demand and Dissolved Oxygen for Thames River by Using Stream Water Quality Model
Authors: Ghassan Al-Dulaimi
Abstract:
This paper studied the biochemical parameters BOD5 and DO for the Thames River (Ontario, Canada). Water samples were collected from the Thames River at different points between Chatham and Woodstock and were analysed for various water quality parameters during the low-flow season (April). The study involves the application of the stream water quality model QUAL2K to simulate and predict the dissolved oxygen (DO) and biochemical oxygen demand (BOD5) profiles for the Thames River over a stretch of 251 kilometers. The model output showed that DO in the entire river was within the limit of not less than 4 mg/L. For carbonaceous biochemical oxygen demand (CBOD), the river may be divided into two main reaches: the first extends from Chatham (0 km) to London (150 km) and has a CBOD concentration of 2 mg/L, and the second has a CBOD range of 2-4 mg/L and extends from London to near Woodstock (73 km).
Keywords: biochemical oxygen demand, dissolved oxygen, Thames river, QUAL2K model
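QUAL2K itself is a full modelling framework, but the BOD-DO interaction it simulates along a river reach can be illustrated with the classical Streeter-Phelps oxygen-sag solution, as sketched below; the deoxygenation and reaeration rates, travel velocity, and boundary values are assumed and are not the Thames River calibration.

```python
import numpy as np

kd, ka = 0.3, 0.6          # deoxygenation and reaeration rates [1/day] (assumed)
u = 15.0                   # mean travel velocity [km/day] (assumed)
L0, D0 = 4.0, 1.0          # upstream CBOD [mg/L] and initial DO deficit [mg/L] (assumed)
DO_sat = 9.0               # saturation dissolved oxygen [mg/L] (assumed)

x = np.linspace(0.0, 251.0, 252)       # distance along the 251 km study stretch [km]
t = x / u                              # travel time [days]

L = L0 * np.exp(-kd * t)                                         # CBOD decay
D = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
    + D0 * np.exp(-ka * t)                                       # Streeter-Phelps DO deficit
DO = DO_sat - D

print(f"minimum DO {DO.min():.2f} mg/L at km {x[np.argmin(DO)]:.0f}")
print("DO stays above 4 mg/L:", bool(DO.min() > 4.0))
```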
18287 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications
Authors: H. Hruschka
Abstract:
This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables. Note that variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer, hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves. One half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing, as the holdout log likelihood of the correlated topic model (which is better than Dirichlet allocation) is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine, and the deep belief net. Hidden variables, characterized by the product categories to which they are related, differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications, as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research by appropriate extensions. To include predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models
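scikit-learn's BernoulliRBM can serve as a rough stand-in for the first layer of the architecture described here, fitting binary basket vectors and scoring a holdout half; stacking a second RBM on the hidden activations gives a crude deep-belief-style hierarchy. The random binary data below replace the 9,835 real baskets, and score_samples returns a pseudo-likelihood rather than the exact holdout log likelihood used in the paper.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(7)
X = (rng.random((9835, 169)) < 0.05).astype(float)   # synthetic binary baskets
train, holdout = X[:4917], X[4917:]                  # random half split

rbm1 = BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=30, random_state=0)
rbm1.fit(train)

# Second layer trained on the first layer's hidden activations (deep-belief-style stack)
h_train = rbm1.transform(train)
rbm2 = BernoulliRBM(n_components=10, learning_rate=0.05, n_iter=30, random_state=0)
rbm2.fit(h_train)

print("holdout pseudo-likelihood (layer 1):", rbm1.score_samples(holdout).sum())
```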
18286 Diabetes Diagnosis Model Using Rough Set and K-Nearest Neighbor Classifier
Authors: Usiobaifo Agharese Rosemary, Osaseri Roseline Oghogho
Abstract:
Diabetes is a complex group of diseases with a variety of causes; it is a disorder of the body's metabolism in the digestion of carbohydrate foods. The application of machine learning in the field of medical diagnosis has been the focus of many researchers, and the use of recognition and classification models as decision support tools has helped medical experts in the diagnosis of diseases. Considering the large volume of medical data, which requires special techniques, experience, and high diagnostic skill, the application of an artificial intelligence system to assist medical personnel and enhance their efficiency and accuracy in diagnosis will be an invaluable tool. This study proposes a diabetes diagnosis model using a rough set and a K-nearest neighbor classifier algorithm. The system consists of two modules, the feature extraction module and the predictor module: the rough set is used to preprocess the attributes, while the K-nearest neighbor classifier is used to classify the given data. The dataset used for this model was taken from the University of Benin Teaching Hospital (UBTH) database. Half of the data was used for training, while the other half was used for testing the system. The proposed model was able to achieve over 80% accuracy.
Keywords: classifier algorithm, diabetes, diagnostic model, machine learning
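The classifier module described here can be approximated with scikit-learn's k-nearest-neighbour classifier, as in the sketch below; the feature matrix is synthetic (the UBTH data are not public), and the rough-set attribute-reduction step is represented simply by selecting a subset of columns.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 8))                     # synthetic patient attributes
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 1, 400) > 0).astype(int)  # synthetic label

X_reduced = X[:, [0, 3, 5]]   # stand-in for the rough-set attribute reduction step
X_tr, X_te, y_tr, y_te = train_test_split(X_reduced, y, test_size=0.5, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_tr, y_tr)
print("holdout accuracy:", accuracy_score(y_te, knn.predict(X_te)))
```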
18285 Design, Development and Analysis of Combined Darrieus and Savonius Wind Turbine
Authors: Ashish Bhattarai, Bishnu Bhatta, Hem Raj Joshi, Nabin Neupane, Pankaj Yadav
Abstract:
This report concerns the design, development, and analysis of a combined Darrieus and Savonius wind turbine. Vertical axis wind turbines (VAWTs) are of two types, viz. Darrieus (lift type) and Savonius (drag type). The problem associated with the Darrieus is the lack of self-starting, while the Savonius has low efficiency. There are three straight Darrieus blades with a NACA (National Advisory Committee for Aeronautics) 0018 cross-section placed circumferentially, and a helically twisted Savonius blade to obtain an even torque distribution. This unique design allows the Savonius to be used as a method of self-starting the wind turbine, which the Darrieus cannot achieve on its own. All the parts of the wind turbine were designed in CAD software, and simulation data were obtained via a CFD (Computational Fluid Dynamics) approach. The design was also imported into FlashForge Finder to 3D print the wind turbine profile, and finally, testing was carried out. The plastic material used for the Savonius was ABS (acrylonitrile butadiene styrene), and that for the Darrieus was PLA (polylactic acid). From the data obtained experimentally, the hybrid VAWT so fabricated was found to operate at a low cut-in speed of 3 m/s, and the maximum power output was found to be 7.5537 watts at a wind speed of 6 m/s. The maximum rotor speed recorded was 431 rpm (revolutions per minute) at a wind velocity of 6 m/s, signifying its potential for wind power production. Besides, the data obtained from both processes, when analyzed through graph plots, show similar trends in slope, and the difference between the experimental and theoretical data reflects mechanical losses. The objective is to eliminate the need for external motors for self-starting purposes and to study the performance of the model. The testing of the model was carried out for different wind velocities.
Keywords: VAWT, Darrieus, Savonius, helical blades, CFD, flash forge finder, ABS, PLA
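The reported operating point (7.5537 W at 6 m/s and 431 rpm) can be turned into the usual non-dimensional turbine metrics, the power coefficient and the tip speed ratio, as below; the rotor radius and height are assumed since they are not given in the abstract.

```python
import math

P = 7.5537        # measured power output [W]
v = 6.0           # wind speed [m/s]
rpm = 431.0       # rotor speed [rev/min]
rho = 1.225       # air density [kg/m^3]
R = 0.20          # assumed rotor radius [m]
H = 0.40          # assumed rotor height [m]

A = 2 * R * H                       # swept area of a straight-bladed VAWT [m^2]
P_wind = 0.5 * rho * A * v**3       # kinetic power available in the wind
Cp = P / P_wind                     # power coefficient
omega = rpm * 2 * math.pi / 60.0    # angular speed [rad/s]
tsr = omega * R / v                 # tip speed ratio

print(f"Cp = {Cp:.3f}, tip speed ratio = {tsr:.2f}")
```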
18284 Phytopathology Prediction in Dry Soil Using Artificial Neural Networks Modeling
Authors: F. Allag, S. Bouharati, M. Belmahdi, R. Zegadi
Abstract:
The rapid expansion of deserts in recent decades, as a result of human actions combined with climatic changes, has highlighted the necessity of understanding biological processes in arid environments. Whereas physical processes and the biology of flora and fauna have been relatively well studied in marginally used arid areas, knowledge of desert soil micro-organisms remains fragmentary. The objective of this study is to conduct a diversity analysis of bacterial communities in unvegetated arid soils. Several biological phenomena in hot deserts are related to microbial populations, as is the potential use of micro-organisms for restoring hot desert environments. Dryland ecosystems have a highly heterogeneous distribution of resources, with greater nutrient concentrations and microbial densities occurring in vegetated than in bare soils. In this work, we found it useful to apply artificial intelligence techniques, especially artificial neural networks (ANNs), in their treatment. The use of the ANN model demonstrates its capability for addressing complex problems involving uncertain data.
Keywords: desert soil, climatic changes, bacteria, vegetation, artificial neural networks
18283 Validation Study of Radial Aircraft Engine Model
Authors: Lukasz Grabowski, Tytus Tulwin, Michal Geca, P. Karpinski
Abstract:
This paper presents a radial aircraft engine model created in the AVL Boost software. This is a one-dimensional physical model of the engine, which enables us to investigate the impact of ignition system design on engine performance (power, torque, fuel consumption). In addition, the model allows research under variable environmental conditions to reflect varied flight conditions (altitude, humidity, cruising speed). Before the simulation research, parameter identification and validation of the model were carried out. In order to verify the take-off power of the gasoline radial aircraft engine model, a validation study was conducted. The first stage of the identification was completed with reference to the technical documentation provided by the engine manufacturer and experiments on the test stand of the real engine. The second stage involved a comparison of simulation results with the results of engine stand tests performed on a WSK ’PZL-Kalisz’. The engine was loaded by a propeller in a special test bench. Identifying the model parameters involved comparing the test results to the simulation in terms of pressure behind the throttles, pressure in the inlet pipe, the time course of pressure in the first inlet pipe, power, and specific fuel consumption. Accordingly, the required coefficients and the error of the simulation relative to the real-object experiments were determined. The obtained time course of pressure and its values are compatible with the experimental results. Additionally, the engine power and specific fuel consumption agree closely with the bench tests. The mapping error does not exceed 1.5%, which positively verifies the combustion model and allows us to predict engine performance if the combustion process is modified. The subsequent tests verified the complete model. The maximum mapping error for the pressure behind the throttles and the inlet pipe pressure is 4%, which proves the model of the inlet duct in the engine with the charging compressor to be correct.
Keywords: 1D-model, aircraft engine, performance, validation
18282 Evaluation of Low-Reducible Sinter in Blast Furnace Technology by Mathematical Model Developed at Centre ENET, VSB: Technical University of Ostrava
Authors: S. Jursová, P. Pustějovská, S. Brožová, J. Bilík
Abstract:
The paper deals with the possibilities of interpreting iron ore reducibility tests. It presents a mathematical model developed at Centre ENET, VŠB-Technical University of Ostrava, Czech Republic, for the evaluation of metallurgical blast furnace feedstock such as iron ore, sinter, or pellets. Based on the data from the test, the model predicts the material's usage in blast furnace technology and its effects on the production parameters of the shaft aggregate. At the beginning, the paper sums up the general concepts and experience in the mathematical modelling of iron ore reduction, and presents the basic equations for the calculation and the main parts of the developed model. The experimental part gives an example of the use of the mathematical model and describes the use of the data for predictive calculations. The material and the method of the iron ore reducibility test carried out are presented. The effects of the material used on carbon consumption, the rate of direct reduction, and the whole reduction process are then interpreted graphically.
Keywords: blast furnace technology, iron ore reduction, mathematical model, prediction of iron ore reduction
18281 A Model for Operating Rooms Scheduling
Authors: Jose Francisco Ferreira Ribeiro, Alexandre Bevilacqua Leoneti, Andre Lucirton Costa
Abstract:
This paper presents a mathematical model in binary (0/1) variables for assigning surgical procedures to the operating rooms of a hospital. The proposed mathematical model is based on the generalized assignment problem, which maximizes the sum of preferences for the use of the operating rooms by doctors while respecting the time available in each room. The corresponding program was written in Visual Basic for Microsoft Excel and tested by scheduling surgeries at St. Lydia Hospital in Ribeirao Preto, Brazil.
Keywords: generalized assignment problem, logistics, optimization, scheduling
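A compact rendering of the generalized-assignment structure described here, binary variables assigning procedures to rooms, preference scores in the objective, and available room time as the capacity constraint, is sketched below with PuLP; the preference scores, durations, and capacities are invented, not the hospital's data.

```python
import pulp

procedures = ["p1", "p2", "p3", "p4"]
rooms = ["OR1", "OR2"]
pref = {("p1", "OR1"): 5, ("p1", "OR2"): 2, ("p2", "OR1"): 3, ("p2", "OR2"): 4,
        ("p3", "OR1"): 4, ("p3", "OR2"): 4, ("p4", "OR1"): 1, ("p4", "OR2"): 5}
duration = {"p1": 120, "p2": 90, "p3": 180, "p4": 60}      # minutes (invented)
capacity = {"OR1": 240, "OR2": 240}                        # available room time (invented)

model = pulp.LpProblem("or_scheduling", pulp.LpMaximize)
x = pulp.LpVariable.dicts("assign", (procedures, rooms), cat="Binary")

model += pulp.lpSum(pref[p, r] * x[p][r] for p in procedures for r in rooms)
for p in procedures:                                       # each procedure gets exactly one room
    model += pulp.lpSum(x[p][r] for r in rooms) == 1
for r in rooms:                                            # respect the available room time
    model += pulp.lpSum(duration[p] * x[p][r] for p in procedures) <= capacity[r]

model.solve(pulp.PULP_CBC_CMD(msg=False))
for p in procedures:
    for r in rooms:
        if x[p][r].value() == 1:
            print(p, "->", r)
```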
18280 Voltage Sag Characteristics during Symmetrical and Asymmetrical Faults
Authors: Ioannis Binas, Marios Moschakis
Abstract:
Electrical faults in transmission and distribution networks can have a great impact on the electrical equipment used. Fault effects depend on the characteristics of the fault as well as of the network itself. It is important to anticipate the network's behavior during faults when planning a new equipment installation, as well as during troubleshooting. Moreover, working backwards, we can estimate the characteristics of the fault from the observed effects. The transformer winding connections dominantly used in the Greek power transmission and distribution networks, and the effects of single-phase-to-neutral, phase-to-phase, two-phase-to-neutral, and three-phase faults at different locations of the network, were simulated in order to present the voltage sag characteristics. The study was performed on a generic network with three step-down transformers on two voltage-level buses (one 150 kV/20 kV transformer and two 20 kV/0.4 kV transformers). We found that, during faults, there are significant changes both in voltage magnitudes and in phase angles. The simulations and short-circuit analysis were performed using the PSCAD simulation package. This paper presents the voltage characteristics calculated for the simulated network with different transformer winding connections during symmetrical and asymmetrical faults at various locations.
Keywords: phase angle shift, power quality, transformer winding connections, voltage sag propagation
18279 Improving the Run Times of Existing and Historical Demand Models Using Simple Python Scripting
Authors: Abhijeet Ostawal, Parmjit Lall
Abstract:
The run times for a large strategic model that we were managing had become too long, leading to delays in project delivery, increased costs, and a loss in productivity. Software developers are continuously working towards developing more efficient tools by changing their algorithms and processes. The issue faced by our team was how to apply the latest technologies to validated existing models that are based on much older software versions lacking the latest capabilities. The multi-modal transport model that we had could only be run in sequential assignment order. Recent upgrades to the software now allow the assignment to be run in parallel, a concept called parallelisation; here, parallelisation is implemented as a Python script working only within the latest version of the software. A full model transfer to the latest version was not possible due to time, budget, and the potential changes in trip assignment. This article shows how to adapt and update the Python script in such a way that it can be used in older software versions, by calling the latest version and then recalling the old version for the assignment model, without affecting the results. Through a process of trial and error, run time savings of up to 30-40% have been achieved. Assignment results were maintained within the older version, and through this learning process we have applied the methodology to other, even older versions of the software, resulting in huge time savings and greater productivity and efficiency for both client and consultant.
Keywords: model run time, demand model, parallelisation, python scripting
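The abstract does not show the script itself, but the general pattern, a Python wrapper that launches the assignment runs through the newer software executable in parallel and then hands control back to the older, validated version, can be sketched as below; the executable paths, command-line flags, and scenario names are entirely hypothetical placeholders and do not refer to any real product's interface.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

NEW_EXE = r"C:\modelling\new_version\runner.exe"   # hypothetical newer-version executable
OLD_EXE = r"C:\modelling\old_version\runner.exe"   # hypothetical older-version executable
scenarios = ["am_peak", "inter_peak", "pm_peak"]   # hypothetical assignment scenarios

def run_assignment(scenario: str) -> int:
    # Launch one assignment in the newer software version (hypothetical CLI interface)
    return subprocess.call([NEW_EXE, "--assign", scenario])

# Run the assignments in parallel instead of the old sequential order
with ThreadPoolExecutor(max_workers=len(scenarios)) as pool:
    results = list(pool.map(run_assignment, scenarios))

# Hand the assigned demand back to the validated older version for the remaining steps
subprocess.call([OLD_EXE, "--post-process", "all_scenarios"])
print("assignment return codes:", results)
```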
18278 Detection of Change Points in Earthquakes Data: A Bayesian Approach
Authors: F. A. Al-Awadhi, D. Al-Hulail
Abstract:
In this study, we applied a Bayesian hierarchical model to detect single and multiple change points in daily earthquake body-wave magnitudes. Change point analysis is used in both backward (off-line) and forward (on-line) statistical research; in this study, it is used with the backward approach. Different types of change parameters are considered (mean, variance, or both). The posterior model and the conditional distributions for single and multiple change points are derived and implemented using the BUGS software. The model is applicable to any set of data. The sensitivity of the model is tested using different prior and likelihood functions. Using Mb data, we concluded that between January 2002 and December 2003, three changes occurred in the mean magnitude of Mb in Kuwait and its vicinity.
Keywords: multiple change points, Markov Chain Monte Carlo, earthquake magnitude, hierarchical Bayesian model
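For a single change in the mean of a magnitude series, the backward (off-line) Bayesian calculation the study implements in BUGS can be approximated by enumerating the posterior over the change location under a flat prior, as in the sketch below; the synthetic data and the assumption of known, equal variance are simplifications of the hierarchical model.

```python
import numpy as np

rng = np.random.default_rng(11)
# Synthetic daily body-wave magnitudes with one shift in the mean
y = np.concatenate([rng.normal(4.2, 0.3, 120), rng.normal(4.6, 0.3, 80)])
n, sigma2 = len(y), 0.3 ** 2                     # known variance assumed for simplicity

log_post = np.full(n, -np.inf)
for tau in range(1, n - 1):                      # candidate change points, flat prior
    seg1, seg2 = y[:tau], y[tau:]
    # profile log-likelihood with segment means replaced by their MLEs (an approximation)
    log_post[tau] = (-np.sum((seg1 - seg1.mean())**2) / (2*sigma2)
                     - np.sum((seg2 - seg2.mean())**2) / (2*sigma2))

post = np.exp(log_post - log_post.max())
post /= post.sum()
print("posterior mode for the change point:", int(np.argmax(post)))
```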
18277 Green IT-Outsourcing Assurance Model for IT-Outsourcing Vendors
Authors: Siffat Ullah Khan, Rahmat Ullah Khan, Rafiq Ahmad Khan, Habibullah Khan
Abstract:
Green IT, or green computing, has emerged as a fast-growing business paradigm in recent years, driven by the need to develop energy-efficient software and peripheral devices. With the constant evolution of technology and the world's critical environmental status, all private and public information technology (IT) businesses are moving towards sustainability. Through a systematic literature review and a questionnaire survey, we identified a total of 9 motivators faced by vendors in IT-outsourcing relationships. Amongst these motivators, 7 were ranked as critical. We also identified a total of 21 practices for addressing these critical motivators. Based on these inputs, we have developed the Green IT-Outsourcing Assurance Model (GITAM) for IT-outsourcing vendors. The model comprises four levels, i.e., Initial, White, Green, and Grey. Each level comprises different critical motivators and their relevant practices. We conclude that our model, GITAM, will assist IT-outsourcing vendors in gauging their level in order to manage IT-outsourcing activities in a green and sustainable fashion, to assist the environment, and to reduce carbon emissions. The model will assist vendors in improving their current level by suggesting various practices, and it will contribute to the body of knowledge in the field of Green IT.
Keywords: Green IT-outsourcing Assurance Model (GITAM), systematic literature review, empirical study, case study
18276 Distribution and Comparative Diversity of Nematocera within Four Livestock Types in the Plain of Mitidja Algeria
Authors: Nebri Rachid, Berrouane Fatima, Doumandji Salah Eddine
Abstract:
Over six months, from November 2013 to May 2014, a census of Nematocera insects was conducted on four livestock types: cattle, sheep, equine, and cameline. The census, which took place at a station located in the Mitidja plain, Algeria, revealed thirteen Nematocera species that were observed and identified: Scatopse notata, Chironomus sp., Sciara bicolor, Psychoda phalaenoïdes, Culex pipiens, Orthocladius sp., Psychoda alternata, Trichocera regelationis, Culicoïdes sp., Contarinia sp., Ectaetia sp., Tipula sp., and Culicoïdes coprosus. A factorial correspondence analysis was performed to study the distribution of the different species captured in colored traps placed on the four farms. The results showed the presence of three groups of Nematocera related to the breeding type, with the highest availability associated with the equine and the cattle. The analysis of the comparative diversity of the Nematocera specimens revealed a taxonomic structure indifferent to the hosts. However, in terms of individuals, the equine is predominant. On the ecological scale, Psychoda alternata is undeniably the most predominant species on the equines as well as on the cattle.
Keywords: Algeria, availability, biodiversity, census, livestock, nematocera
18275 Theoretical-Methodological Model to Study Vulnerability of Death in the Past from a Bioarchaeological Approach
Authors: Geraldine G. Granados Vazquez
Abstract:
Every human being is exposed to the risk of dying, and some are more susceptible than others depending on the cause. The cause can be understood as the hazard of death to which a group or individual is exposed, and this irreversible damage constitutes the condition of vulnerability. Risk is a dynamic concept, which means that it depends on environmental, social, economic, and political conditions. Thus, vulnerability may only be evaluated in terms of relative parameters. This research focuses specifically on building a model that evaluates the risk or propensity of death in past urban societies in connection with the everyday life of individuals, considering that death can be a consequence of two coexisting issues: hazard and the deterioration of the resistance to destruction. One of the most important discussions in bioarchaeology concerns health and life conditions in ancient groups, and researchers are looking for more flexible models to evaluate these topics. Accordingly, this research proposes a theoretical-methodological model that assesses the vulnerability of death in past urban groups. The model aims to be useful for evaluating the risk of death, considering the sociohistorical context of these groups and their intrinsic biological features. The model proposes four areas in which to assess vulnerability. The first three areas use statistical methods or quantitative analysis, while the fourth and last area, which corresponds to embodiment, is based on qualitative analysis. The four areas and their proposed techniques are: a) Demographic dynamics. From the distribution of age at the time of death, the analysis of mortality is performed using life tables. From here, four aspects may be inferred: population structure, fertility, mortality-survival, and productivity-migration. b) Frailty. Selective mortality and heterogeneity in frailty can be assessed through the relationship between individual characteristics and the age at death. Two indicators used in contemporary populations to evaluate stress are height and linear enamel hypoplasias: height estimates may account for the individual's nutrition and health history in specific groups, while enamel hypoplasias record the individual's first years of life. c) Inequality. Space reflects the various sectors of society, also in ancient cities. In general terms, the spatial analysis uses measures of association to show the relationship between frailty variables and space. d) Embodiment. Everyone's story leaves some evidence on the body, even in the bones. That leads us to think about the dynamics of individuals' relations in terms of time and space; consequently, the micro-analysis of persons assesses vulnerability from everyday life, where symbolic meaning also plays a major role. In sum, using some Mesoamerican examples as case studies, this research demonstrates that not only the intrinsic characteristics related to the age and sex of individuals are conducive to vulnerability, but also the social and historical context that determines their state of frailty before death. An attenuating factor for past groups is that some basic aspects, such as the roles they played in everyday life, escape our comprehension and are still under discussion.
Keywords: bioarchaeology, frailty, Mesoamerica, vulnerability
18274 Study of Polychlorinated Dibenzo-P-Dioxins and Dibenzofurans Dispersion in the Environment of a Municipal Solid Waste Incinerator
Authors: Gómez R. Marta, Martín M. Jesús María
Abstract:
The general aim of this paper is to identify the areas of highest concentration of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) around the incinerator through the use of dispersion models. Atmospheric dispersion models are useful tools for estimating and preventing the impact of emissions from a particular source on air quality. These models allow different factors that influence air pollution to be considered, namely source characteristics, the topography of the receiving environment, and weather conditions, in order to predict pollutant concentrations. After their emission into the atmosphere, PCDD/Fs are deposited on water or land, near or far from the emission source, depending on the size of the associated particles and the climatology. In this way, they are transferred and mobilized through environmental compartments. The modelling of PCDD/Fs was carried out with the following tools: the Atmospheric Dispersion Modelling System (ADMS) and Surfer. ADMS is a Gaussian plume dispersion model used to model the air quality impact of industrial facilities, and Surfer is a surface-mapping program used to represent the dispersion of pollutants on a map. For the modelling of emissions, the ADMS software mainly requires the following input parameters: characterization of the emission sources (source type, height, diameter, release temperature, flow rate, etc.) and meteorological and topographical data (coordinate system). The study area was set at 5 km around the incinerator, and the nearest population center to the PCDD/F emission source is approximately 2.5 km away. Data were collected during one year (2013) for both the PCDD/F emissions of the incinerator and the meteorology in the study area. The study has been carried out over the averaging periods that legislation establishes; that is to say, the output parameters take the current legislation into account. Once all the data required by the ADMS software, described previously, were entered, the modelling was run in order to represent the spatial distribution of PCDD/F concentrations and the areas affected by them. In general, the dispersion plume follows the direction of the predominant winds (southwest and northeast). Total levels of PCDD/Fs usually found in air samples range from <2 pg/m³ in remote rural areas, to 2-15 pg/m³ in urban areas, and 15-200 pg/m³ in areas near important sources, such as an incinerator. The dispersion maps show that the maximum concentrations are of the order of 10⁻⁸ ng/m³, well below the values considered typical for areas close to an incinerator, as in this case.
Keywords: atmospheric dispersion, dioxin, furan, incinerator
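The Gaussian plume formulation behind ADMS-type models can be illustrated with the textbook ground-level, centre-line concentration expression below; the emission rate, stack height, and dispersion coefficients are invented and far simpler than ADMS's parameterisations, so this only conveys the shape of the calculation.

```python
import numpy as np

Q = 1.0e-6      # assumed PCDD/F emission rate [ng/s]
u = 3.0         # wind speed [m/s] (assumed)
H = 40.0        # effective stack height [m] (assumed)

x = np.linspace(100.0, 5000.0, 200)        # downwind distance [m], out to the 5 km study area
# Very crude power-law dispersion coefficients (stand-ins for stability-class curves)
sigma_y = 0.08 * x ** 0.9
sigma_z = 0.06 * x ** 0.85

# Ground-level, centre-line concentration of a Gaussian plume with total ground reflection
C = (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2.0 * sigma_z**2))

print(f"maximum centre-line concentration: {C.max():.2e} ng/m^3 "
      f"at {x[np.argmax(C)]:.0f} m downwind")
```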
18273 The Investigation of Oil Price Shocks by Using a Dynamic Stochastic General Equilibrium: The Case of Iran
Authors: Bahram Fathi, Karim Alizadeh, Azam Mohammadbagheri
Abstract:
The aim of this paper is to investigate the role of oil price shocks in explaining business cycles in Iran using a dynamic stochastic general equilibrium (DSGE) approach. The model incorporates both productivity and oil revenue shocks. The results indicate that productivity shocks are relatively more important to business cycles than oil shocks. The model with two shocks produces different values for volatility, but these values have the same ranking as those of the actual data for most variables. In addition, the ratios of standard deviations to output obtained from the model with two shocks are close to those of the actual data. The model with only a productivity shock produces figures whose volatility magnitudes are the most similar to those of the actual data. Next, we use impulse response functions (IRF) to evaluate the capability of the model. The IRF shows no effect of an oil shock on the capital stock and on labor hours, which is a feature of the model: when the log-linearized system of equations is solved numerically, investment and labor hours are not found to be functions of the oil shock. This research recommends using different techniques to check the model's robustness. One method is to make all decision variables functions of the oil shock by inducing stationarity in the model differently; another method is to impose a bond adjustment cost. This study intends to fill that gap. To achieve this objective, we first derive a DSGE model that allows for world oil price and productivity shocks. Second, we calibrate the model to the Iranian economy. Next, we compare the moments from the theoretical model, with both single and multiple shocks, with those obtained from the actual data to see the extent to which business cycles in Iran can be explained by total oil revenue shocks. Then, we use impulse response functions to evaluate the role of world oil price shocks. Finally, we present the implications of the findings and interpretations in accordance with economic theory.
Keywords: oil price, shocks, dynamic stochastic general equilibrium, Iran