Search results for: interval forecasts
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 984

624 Industrial Assessment of the Exposed Rocks on Peris Anticline Kurdistan Region of Iraq for Cement Industry

Authors: Faroojan Khajeek Sisak Siakian, Aayda Dikran Abdulahad

Abstract:

The Peris Mountain is one of the main mountains of the Iraqi Kurdistan Region; it forms one of the long anticlines trending almost east–west. The formations exposed at the top of the mountain are Bekhme and Shiranish, with carbonate rocks of different types and thicknesses. The sampling site was selected to be suitable for a quarry, taking into consideration the thickness of the exposed rocks, the absence of overburden, favorable quarrying faces, the hardness of the rocks, the bedding nature, the good extension of the outcrops, and a favorable location for constructing a cement plant. We sampled the exposed rocks at the top of the mountain where a road crosses it; a total of 15 samples were collected at 5 m intervals, with each sample representing its sampling interval. The samples were analyzed by X-ray fluorescence spectroscopy (XRF) to determine the main oxide percentages in each sample. The results showed that the studied rocks can be used in the cement industry.
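Suitability of carbonate rocks for cement manufacturing is commonly screened from XRF oxide percentages through standard raw-mix indices. As an illustrative sketch only (the oxide values below are hypothetical, not taken from the study), the lime saturation factor (LSF) can be computed as follows:

```python
def lime_saturation_factor(cao, sio2, al2o3, fe2o3):
    """Standard cement-chemistry index; inputs are oxide wt% from XRF."""
    return 100.0 * cao / (2.8 * sio2 + 1.2 * al2o3 + 0.65 * fe2o3)

# Hypothetical XRF result for one limestone sample (wt%).
sample = {"CaO": 52.4, "SiO2": 4.1, "Al2O3": 1.2, "Fe2O3": 0.6}
lsf = lime_saturation_factor(sample["CaO"], sample["SiO2"],
                             sample["Al2O3"], sample["Fe2O3"])
print(round(lsf, 1))
```

An LSF far above 100, as here, simply indicates a high-grade carbonate that would be blended with silica- and alumina-bearing materials to reach the 92–98 range typical of cement raw mixes.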

Keywords: limestone, quarry, CaO, MgO, overburden

Procedia PDF Downloads 58
623 Economic Development Impacts of Connected and Automated Vehicles (CAV)

Authors: Rimon Rafiah

Abstract:

This paper will present a combination of two seemingly unrelated models: one for estimating the economic development impacts of transportation investment, and another for increasing CAV penetration in order to reduce congestion. Measuring the economic development impacts of transportation investments is becoming more widely recognized around the world; examples include the UK’s Wider Economic Benefits (WEB) model, Economic Impact Assessments in the USA, various input-output models, and additional models around the world. The economic impact model is based on WEB and rests on the following premise: investment in transportation will reduce the cost of personal travel, enabling firms to be more competitive, creating additional throughput (the same road allows more people to travel), and reducing the cost of workers’ travel to a new workplace. This reduction in travel costs was estimated in out-of-pocket terms for a given localized area and then translated into additional employment based on regional labor supply elasticity. This additional employment was conservatively assumed to be at minimum wage levels, translated into GDP terms, and from there into direct taxation (i.e., an increase in tax taken by the government). The CAV model is based on economic principles such as CAV usage, supply, and demand. Usage of CAVs can increase capacity by a variety of means: increased automation (Level 1 through Level 4) and increased penetration and usage, which several forecasts predict will reach 50% by 2030, with possible full conversion by 2045-2050. Several countries have passed policies and/or legislation ending sales of gasoline-powered vehicles from 2030 onward. Supply was measured via increased capacity on given infrastructure as a function of both CAV penetration and the implemented technologies.
The CAV model, as implemented in the USA, has shown significant savings in travel time and in vehicle operating costs, which can be translated into economic development impacts in terms of job creation, GDP growth, and salaries. The models have policy implications and can be adapted for use in Japan.
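The WEB-style chain described above (travel-cost saving → employment via labor supply elasticity → GDP at minimum wage → direct taxation) is simple enough to sketch in code. All numbers below are hypothetical placeholders, not figures from the paper:

```python
def web_impact(cost_saving_share, baseline_jobs, labor_supply_elasticity,
               min_annual_wage, tax_rate):
    """Toy WEB-style calculation: travel-cost savings to jobs, GDP, and tax."""
    extra_jobs = baseline_jobs * labor_supply_elasticity * cost_saving_share
    gdp_gain = extra_jobs * min_annual_wage   # conservative: minimum wage
    tax_gain = gdp_gain * tax_rate            # increase in direct taxation
    return extra_jobs, gdp_gain, tax_gain

# 5% travel-cost saving, 100k baseline jobs, elasticity 0.2,
# $20k minimum annual wage, 25% tax rate (all assumed values).
jobs, gdp, tax = web_impact(0.05, 100_000, 0.2, 20_000, 0.25)
print(round(jobs), round(gdp), round(tax))
```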

Keywords: CAV, economic development, WEB, transport economics

Procedia PDF Downloads 49
622 A New Concept for Deriving the Expected Value of Fuzzy Random Variables

Authors: Liang-Hsuan Chen, Chia-Jung Chang

Abstract:

Fuzzy random variables (FRVs) have been introduced as an imprecise concept of numeric values for characterizing imprecise knowledge. Descriptive parameters can be used to describe the primary features of a set of fuzzy random observations. In fuzzy environments, expected values are usually represented as fuzzy-valued, interval-valued, or numeric-valued descriptive parameters using various metrics. Instead of the area metric usually adopted in the relevant studies, this study proposes a numeric expected value based on a distance metric, reflecting the two characteristics of FRVs (fuzziness and randomness). Compared with existing measures, the proposed numeric expected value coincides with those obtained using other metrics when only triangular membership functions are used; however, the proposed approach has the advantages of intuitiveness and computational efficiency when the membership functions are not of triangular type. An example with three datasets is provided to verify the proposed approach.
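For triangular fuzzy numbers, where the abstract notes that the distance-based value agrees with metric-based alternatives, a numeric expected value can be illustrated with the classic midpoint formula (a + 2b + c)/4. This is a general illustration, not the authors' distance metric, and the observations below are hypothetical:

```python
def ev_triangular(a, b, c):
    """Numeric expected value of a triangular fuzzy number (a, b, c)."""
    return (a + 2 * b + c) / 4.0

# Hypothetical fuzzy random observations, each a triangle (left, peak, right).
observations = [(1, 2, 4), (2, 3, 5), (0, 1, 2)]
# Expected value of the fuzzy random sample: average the per-observation EVs.
ev = sum(ev_triangular(*o) for o in observations) / len(observations)
print(ev)
```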

Keywords: fuzzy random variables, distance measure, expected value, descriptive parameters

Procedia PDF Downloads 316
621 Fractal Analysis of Polyacrylamide-Graphene Oxide Composite Gels

Authors: Gülşen Akın Evingür, Önder Pekcan

Abstract:

Fractal analysis is a bridge between the microstructure and the macroscopic properties of gels. Fractal structure is usually invoked to describe the complexity of crosslinked molecules, and the complexity of gel systems is characterized by the fractal dimension (Df). In this study, polyacrylamide-graphene oxide (GO) composite gels were prepared by free radical crosslinking copolymerization. The fractal structure of the composite gels was analyzed at various GO contents during gelation using the fluorescence technique, and the analysis was applied to estimate the Df values of the composite gels. Fractal dimensions of the polymer composite gels were estimated from the power-law exponent values using scaling models. In addition, we aimed to present the geometrical distribution of GO during gelation. We observed that as gelation proceeded, the GO plates first organized themselves into a 3D percolation cluster with Df = 2.52, then passed to diffusion-limited clusters with Df = 1.4, and finally lined up into a Von Koch curve with random interval with Df = 1.14. Our goal here is to interpret the low conductivity and/or broad forbidden gap of GO-doped PAAm gels in terms of the distribution of GO in the final form of the produced gel.
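Estimating Df from a power-law exponent, as the scaling models above do, reduces to a slope in log-log coordinates. A minimal sketch with synthetic, noise-free data generated at Df = 2.52 (the percolation-cluster value quoted in the abstract):

```python
import math

def fractal_dimension(r, m):
    """Least-squares slope of log(m) vs log(r) for a mass scaling m ~ r**Df."""
    xs = [math.log(v) for v in r]
    ys = [math.log(v) for v in m]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

radii = [1.0, 2.0, 4.0, 8.0, 16.0]
mass = [ri ** 2.52 for ri in radii]              # synthetic mass-fractal data
print(round(fractal_dimension(radii, mass), 2))  # recovers 2.52
```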

Keywords: composite gels, fluorescence, fractal, scaling

Procedia PDF Downloads 282
620 Thermal Degradation Kinetics of Field-Dried and Pelletized Switchgrass

Authors: Karen E. Supan

Abstract:

Thermal degradation kinetics of switchgrass (Panicum virgatum) from the field, as well as in pellet form, are presented. Thermogravimetric analysis tests were performed at heating rates of 10-40 K min⁻¹ in an inert atmosphere. The activation energy and the pre-exponential factor were calculated using the Ozawa/Flynn/Wall method as suggested by the ASTM Standard Test Method for Decomposition Kinetics by Thermogravimetry. Four stages were seen in the degradation: dehydration, active pyrolysis of hemicellulose, active pyrolysis of cellulose, and passive pyrolysis. The derivative mass loss peak for active pyrolysis of cellulose in the field-dried sample was much higher than that of the pelletized sample. The activation energy over the 0.15-0.70 conversion interval ranged from 191 to 242 kJ mol⁻¹ for the field-dried samples and from 130 to 192 kJ mol⁻¹ for the pellets. The highest activation energies were reached at a conversion of 0.50 and were 242 kJ mol⁻¹ for the field-dried samples and 192 kJ mol⁻¹ for the pellets. The thermal degradation behavior and activation energies were comparable to values reported for switchgrass and other biomass in the literature.
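In the Ozawa/Flynn/Wall method, the activation energy at a fixed conversion follows from the slope of ln(β) against 1/T across heating rates (Doyle approximation: ln β = C − 1.052·Ea/(R·T)). The sketch below recovers Ea from four synthetic (β, T) pairs generated with an assumed Ea of 200 kJ/mol; the numbers are illustrative, not the paper's data:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def ofw_activation_energy(betas, temps):
    """Ea (J/mol) from the slope of ln(beta) vs 1/T at fixed conversion."""
    x = [1.0 / t for t in temps]
    y = [math.log(b) for b in betas]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return -slope * R / 1.052   # Doyle approximation factor

# Synthetic data generated from the OFW relation itself (Ea = 200 kJ/mol).
Ea_true, C = 200_000.0, 40.0
temps = [580.0, 590.0, 600.0, 610.0]                       # K
betas = [math.exp(C - 1.052 * Ea_true / (R * t)) for t in temps]
print(round(ofw_activation_energy(betas, temps) / 1000))   # ~200 kJ/mol
```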

Keywords: biomass, switchgrass, thermal degradation, thermogravimetric analysis

Procedia PDF Downloads 87
619 Extractive Desulfurization of Atmospheric Gasoil with N,N-Dimethylformamide

Authors: Kahina Bedda, Boudjema Hamada

Abstract:

Environmental regulations have been introduced in many countries around the world to reduce the sulfur content of diesel fuel to ultra-low levels, with the intention of lowering harmful diesel engine exhaust emissions and improving air quality. Removal of sulfur-containing compounds from diesel feedstocks by extraction with selective solvents, to produce ultra-low sulfur diesel fuel, has received increasing attention in recent years. This is because sulfur extraction technologies could reduce the cost of desulfurization substantially compared to hydrotreating processes, since they do not demand hydrogen and are carried out at atmospheric pressure. In this work, the desulfurization of distillate gasoil by liquid-liquid extraction with N,N-dimethylformamide was investigated. This fraction was recovered from a mixture of Hassi Messaoud crude oils and Hassi R'Mel gas condensate in the Algiers refinery; its sulfur content is 281 ppm. Experiments were performed in six stages with a solvent:feed ratio of 3:1. The effect of the extraction temperature was investigated over the interval 30-110 °C. At 110 °C, the yield of refined gasoil was 82% and its sulfur content was 69 ppm.

Keywords: desulfurization, gasoil, N,N-dimethylformamide, sulfur content

Procedia PDF Downloads 356
618 Decomposing the Socio-Economic Inequalities in Utilization of Antenatal Care in South Asian Countries: Insight from Demographic and Health Survey

Authors: Jeetendra Yadav, Geetha Menon, Anita Pal, Rajkumar Verma

Abstract:

Despite encouraging maternal and child wellness programs worldwide, lower-middle-income nations have not yet reached the goals set by the UN. This study quantified the contribution of socioeconomic determinants of inequality to the utilization of antenatal care in South Asian countries. Data from the Demographic and Health Surveys (DHS) of the selected countries were used, and Oaxaca decomposition was applied to the socioeconomic inequalities in utilization of antenatal care. Findings from the multivariate analysis show that mother's age at the time of birth, birth order and interval, mother's education, mass media exposure, and economic status were significant determinants of the utilization of antenatal care services in South Asian countries. Considering the concentration index curve, the deviation from the line of equality was greatest in Pakistan, followed by India and Nepal.
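The concentration index behind such curves is typically computed as CI = 2·cov(h, r)/mean(h), where r is the fractional rank of households by wealth. A minimal sketch with hypothetical data (not the DHS microdata used in the study):

```python
def concentration_index(h):
    """CI = 2*cov(h, rank)/mean(h); h ordered from poorest to richest."""
    n = len(h)
    ranks = [(i + 0.5) / n for i in range(n)]   # fractional wealth ranks
    mean_h = sum(h) / n
    mean_r = sum(ranks) / n
    cov = sum((hi - mean_h) * (ri - mean_r) for hi, ri in zip(h, ranks)) / n
    return 2 * cov / mean_h

# Hypothetical households sorted poorest -> richest; 1 = mother used ANC.
use = [0, 0, 1, 0, 1, 1, 1, 1]
print(round(concentration_index(use), 3))  # positive => pro-rich inequality
```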

Keywords: antenatal care, decomposition, inequalities, South Asian countries

Procedia PDF Downloads 156
617 Study on Sharp V-Notch Problem under Dynamic Loading Condition Using Symplectic Analytical Singular Element

Authors: Xiaofei Hu, Zhiyu Cai, Weian Yao

Abstract:

V-notch problems under dynamic loading conditions are considered in this paper. In the time domain, the precise time domain expanding algorithm is employed, in which a self-adaptive technique is carried out to improve computing accuracy. By expanding variables in each time interval, the recursive finite element formulas are derived. In the space domain, a Symplectic Analytical Singular Element (SASE) for the V-notch problem is constructed to address the stress singularity at the notch tip. Combined with conventional finite elements, the proposed SASE can be used to solve for the dynamic stress intensity factors (DSIFs) in a simple way. Numerical results show that the proposed SASE for V-notch problems subjected to dynamic loading is effective and efficient.

Keywords: V-notch, dynamic stress intensity factor, finite element method, precise time domain expanding algorithm

Procedia PDF Downloads 152
616 Application of Stochastic Models on the Portuguese Population and Distortion to Workers Compensation Pensioners Experience

Authors: Nkwenti Mbelli Njah

Abstract:

This research was motivated by a project requested by AXA on the topic of pensions payable under the workers' compensation (WC) line of business. There are two types of pensions: the compulsorily recoverable and the not compulsorily recoverable. A pension is compulsorily recoverable for a victim when there is less than 30% disability and the pension amount per year is less than six times the minimum national salary. The law defines that the mathematical provisions for compulsorily recoverable pensions must be calculated by applying the following bases: mortality table TD88/90 and an interest rate of 5.25% (possibly with a management rate). Managing pensions which are not compulsorily recoverable is a more complex task because the technical bases are not defined by law and much more complex computations are required. In particular, companies have to predict the discounted amount of payments, reflecting the mortality effect, for all pensioners (a task monitored monthly in AXA). The purpose of this research was thus to develop a stochastic model for the future mortality of the workers' compensation pensioners of both the Portuguese market and the AXA portfolio. Not only is past mortality modeled; projections of future mortality are also made for the general population of Portugal as well as for the two portfolios mentioned earlier. The global model was split into two parts: a stochastic model for population mortality which allows for forecasts, combined with a point estimate from a portfolio mortality model obtained through three different relational models (Cox Proportional, Brass Linear and Workgroup PLT). The one-year death probabilities for ages 0-110 for the period 2013-2113 are obtained for the general population and the portfolios. These probabilities are used to compute different life table functions as well as the not compulsorily recoverable reserves for each of the models, for the pensioners, their spouses, and children under 21.
The results obtained are compared with the not compulsorily recoverable reserves computed using the static mortality table (TD 73/77) currently used by AXA, to see the impact on this reserve if AXA adopted the dynamic tables.
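Of the three relational models named above, the Brass linear model is the simplest to sketch: logits of portfolio survival are assumed to be a linear function of the logits of a standard table's survival, with coefficients fitted by least squares. The survival values below are hypothetical, and sign conventions for the logit vary across texts:

```python
import math

def logit(l):
    """Brass logit of a survival probability l."""
    return 0.5 * math.log((1.0 - l) / l)

def brass_fit(standard, portfolio):
    """Least-squares fit of logit(portfolio) = a + b * logit(standard)."""
    xs = [logit(l) for l in standard]
    ys = [logit(l) for l in portfolio]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical survival columns; the portfolio shows lighter mortality.
standard  = [0.99, 0.97, 0.93, 0.85, 0.70, 0.50]
portfolio = [0.995, 0.985, 0.96, 0.91, 0.80, 0.62]
a, b = brass_fit(standard, portfolio)
print(a < 0)  # negative level shift here reflects the lighter mortality
```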

Keywords: compulsorily recoverable, life table functions, relational models, worker’s compensation pensioners

Procedia PDF Downloads 142
615 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy.
To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
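The model-averaging step described in the abstract can be sketched as follows: three hypothetical classifiers each output a probability that air quality will be unhealthy, and the framework averages them before thresholding. The probabilities are invented for illustration:

```python
def ensemble_predict(prob_lists, threshold=0.5):
    """Average per-sample probabilities across models, then threshold."""
    averaged = [sum(ps) / len(ps) for ps in zip(*prob_lists)]
    return [p >= threshold for p in averaged], averaged

# Hypothetical probabilities from three trained models on three days.
logreg = [0.82, 0.40, 0.55]
forest = [0.90, 0.35, 0.45]
neural = [0.78, 0.20, 0.62]
labels, probs = ensemble_predict([logreg, forest, neural])
print(labels)  # [True, False, True]
```

Averaging dampens any single model's bias: here the random forest alone would have flagged day 3 as healthy (0.45), but the ensemble does not.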

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 94
614 Software Verification of Systematic Resampling for Optimization of Particle Filters

Authors: Osiris Terry, Kenneth Hopkinson, Laura Humphrey

Abstract:

Systematic resampling is the most popular resampling method in particle filters. This paper seeks to further the understanding of systematic resampling by defining a formula made up of variables from the sampling equation and the particle weights. The formula is then verified via SPARK, a software verification language. The verified systematic resampling formula states that the minimum/maximum number of possible samples taken of a particle is equal to the floor/ceiling value of the particle weight divided by the sampling interval, respectively. This allows for the creation of a randomness spectrum within which each resampling method falls. Methods at the lower end, e.g., systematic resampling, have less randomness and are thus quicker to reach an estimate. Although lower randomness limits sampling error through a larger bias towards the size of the weight, this bias creates vulnerabilities to noise in the environment, e.g., jamming. In conclusion, this is the first step in characterizing each resampling method, which will allow target-tracking engineers to pick the best resampling method for their environment instead of choosing the most popular one.
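The verified bound can be demonstrated directly: with normalized weights and sampling interval 1/N, the number of copies of particle i lies between floor(w_i·N) and ceil(w_i·N). The weights below are illustrative, and the starting offset u0 is fixed for reproducibility (in practice u0 ~ Uniform(0, 1/N)):

```python
import math

def systematic_resample(weights, u0):
    """Systematic resampling: evenly spaced pointers through the CDF."""
    n = len(weights)
    positions = [u0 + i / n for i in range(n)]   # one pointer per sample
    cumsum, total = [], 0.0
    for w in weights:
        total += w
        cumsum.append(total)
    indexes, j = [], 0
    for p in positions:
        while cumsum[j] < p:
            j += 1
        indexes.append(j)
    return indexes

weights = [0.05, 0.35, 0.10, 0.50]
idx = systematic_resample(weights, u0=0.1)
counts = [idx.count(i) for i in range(len(weights))]
print(counts)  # [0, 2, 0, 2]
# Each count respects the floor/ceil bound from the verified formula.
for w, c in zip(weights, counts):
    assert math.floor(w * len(weights)) <= c <= math.ceil(w * len(weights))
```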

Keywords: SPARK, software verification, resampling, systematic resampling, particle filter, tracking

Procedia PDF Downloads 53
613 Standard and Processing of Photodegradable Polyethylene

Authors: Nurul-Akidah M. Yusak, Rahmah Mohamed, Noor Zuhaira Abd Aziz

Abstract:

The introduction of degradable plastic materials into agricultural sectors has represented a promising alternative to promote green agriculture and environmentally friendly modern farming practices. Major challenges in developing degradable agricultural films are identifying the most feasible types of degradation mechanisms, the composition of the degradable polymers, and the related processing techniques. An incorrect choice of degradation mechanism will cause premature loss of mechanical performance and strength. In order to achieve a controlled degradation process, the composition of the degradable agricultural film is also important, so as to trigger the degradation reaction at the required interval of time and achieve sustainability of modern agricultural practices. A set of photodegradable polyethylene-based agricultural films was developed and produced, following selective optimization of the processing parameters of the film manufacturing system. An example of applying the agricultural films to oil palm seedling cultivation is presented.

Keywords: photodegradable polyethylene, plasticulture, processing schemes

Procedia PDF Downloads 479
612 Effect of Recreational Soccer on Health Indices and Diseases Prevention

Authors: Avinash Kharel

Abstract:

Recreational soccer (RS), a form of small-sided soccer game (SSG), has an immense positive effect on physical health, mental health, and wellbeing. RS has shown both acute responses and long-term training effects in sedentary, trained, and clinical populations of any age, gender, or health status. The enjoyable mode of training elicits greater adherence by optimising intrinsic motivation, while offering health benefits that match those achieved by treadmill and cycle ergometer programmes in both continuous and interval forms of training. Additionally, recreational soccer is an effective and efficient regimen whose social, motivational, and competitive components help overcome barriers such as cost, time, access to facilities, and lack of intrinsic motivation. Further, it can be applied as an effective broad-spectrum non-pharmacological treatment for lifestyle diseases, producing positive physiological responses in healthy subjects, patients, and elderly people regardless of age, gender, or training experience.

Keywords: recreational soccer, health benefits, diseases prevention, physiology

Procedia PDF Downloads 60
611 Wind Tunnel Tests on Ground-Mounted and Roof-Mounted Photovoltaic Array Systems

Authors: Chao-Yang Huang, Rwey-Hua Cherng, Chung-Lin Fu, Yuan-Lung Lo

Abstract:

Solar energy is one of the alternatives available to reduce the CO2 emissions produced by conventional power plants in modern society. As an island frequently visited by strong typhoons and earthquakes, Taiwan urgently needs to revise its local regulations to strengthen the safety design of photovoltaic systems. Currently, the Taiwanese code for wind-resistant design of structures does not have a clear provision for photovoltaic systems, especially when the systems are arranged in arrayed format. Furthermore, when an arrayed photovoltaic system is mounted on a rooftop, the approaching flow is significantly altered by the building, leading to different pressure patterns in different areas of the photovoltaic system. In this study, an L-shaped arrayed photovoltaic system is first mounted on the ground of the wind tunnel and then on the building rooftop. The system consists of 60 panel models, each equivalent to a full-scale panel 3.0 m in depth and 10.0 m in length. Six pressure taps are installed on the upper surface of each panel model and another six on the bottom surface to measure the net pressures. The wind attack angle is varied from 0° to 360° in 10° intervals to capture the worst case with respect to wind direction. The sampling rate of the pressure scanning system is set high enough to precisely estimate the peak pressure, and at least 20 samples are recorded for good ensemble average stability; each sample is equivalent to a 10-minute record in full scale. All the scale factors, including time scale, length scale, and velocity scale, are properly verified by similarity rules in the low-wind-speed wind tunnel environment. The purpose of the L-shaped arrayed system is to understand the pressure characteristics in the corner area. Extreme value analysis is applied to obtain the design pressure coefficient for each net pressure.
The commonly utilized Cook-and-Mayne non-exceedance probability, 78%, is set as the target for design pressure coefficients under the Gumbel distribution, and the best linear unbiased estimator method is utilized for the Gumbel parameter identification. Careful moving-average processing of the time series is also applied. Results show that when the arrayed photovoltaic system is mounted on the ground, the first row of panels experiences stronger positive pressure than when mounted on the rooftop. Due to flow separation at the building edge, the first row of panels on the rooftop mostly experiences negative pressures, while the last row shows positive pressures because of flow reattachment. Different areas also show different pressure patterns, corresponding well to the area divisions for design values prescribed in ASCE 7-16. Several minor observations arise from the parametric studies, such as rooftop edge effects, parapet effects, building aspect-ratio effects, row interval effects, and so on. General comments are then made for a proposed revision of the Taiwanese code.
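The extreme-value step above can be sketched as follows. For simplicity this uses a method-of-moments Gumbel fit rather than the best linear unbiased estimator used in the study, and the per-sample peak suction coefficients are hypothetical:

```python
import math

def gumbel_design_value(peaks, p=0.78):
    """Method-of-moments Gumbel fit; value at non-exceedance probability p."""
    n = len(peaks)
    mean = sum(peaks) / n
    var = sum((x - mean) ** 2 for x in peaks) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi        # Gumbel scale
    mu = mean - 0.5772 * beta                    # Gumbel mode
    return mu - beta * math.log(-math.log(p))    # quantile at p

# Hypothetical peak suction coefficients from repeated 10-minute samples.
peaks = [-2.1, -1.8, -2.4, -2.0, -1.9, -2.6, -2.2, -1.7]
cp_design = -gumbel_design_value([-x for x in peaks])  # fit magnitudes
print(round(cp_design, 2))
```

The 78% quantile deliberately sits beyond the sample mean of the peaks, so the design coefficient is more severe than a simple average of the observed extremes.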

Keywords: aerodynamic force coefficient, ground-mounted, roof-mounted, wind tunnel test, photovoltaic

Procedia PDF Downloads 103
610 Discrete Estimation of Spectral Density for Alpha Stable Signals Observed with an Additive Error

Authors: R. Sabre, W. Horrigue, J. C. Simon

Abstract:

This paper addresses two difficulties encountered in practice when observing a continuous-time process. The first is that we cannot observe a process continuously over a time interval; we only take discrete observations. The second is that the process is frequently observed with a constant additive error. It is important to give an estimator of the spectral density of such a process that takes into account the additive observation error and the choice of the discrete observation times. In this work, we propose an estimator based on spectral smoothing of the periodogram by the polynomial Jackson kernel, reducing the additive error. In order to overcome the aliasing phenomenon, this estimator is constructed from observations taken at well-chosen times, so as to restrict the estimator to the band where the spectral density is non-zero. We show that the proposed estimator is asymptotically unbiased and consistent. We thus obtain an estimate that resolves the two difficulties concerning the choice of the observation instants of a continuous-time process and observations affected by a constant error.
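The smoothing idea can be illustrated on discrete observations of a noisy sinusoid. The paper's polynomial Jackson kernel is replaced here by a simple triangular kernel purely for illustration, and the data are synthetic:

```python
import cmath
import math
import random

def periodogram(x):
    """Raw periodogram |DFT|^2 / n over the first n//2 frequency bins."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n for k in range(n // 2)]

def smooth(p, kernel=(0.25, 0.5, 0.25)):
    """Kernel-smoothed periodogram with clamped edges."""
    half = len(kernel) // 2
    return [sum(kernel[j + half] * p[min(max(i + j, 0), len(p) - 1)]
                for j in range(-half, half + 1)) for i in range(len(p))]

random.seed(0)
n = 64
# Discrete observations of a sinusoid (frequency bin 8) plus additive noise.
x = [math.sin(2 * math.pi * 8 * t / n) + 0.1 * random.gauss(0, 1)
     for t in range(n)]
sp = smooth(periodogram(x))
print(sp.index(max(sp)))  # the smoothed spectrum peaks at the true bin, 8
```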

Keywords: spectral density, stable processes, aliasing, periodogram

Procedia PDF Downloads 115
609 Estimating Precipitable Water Vapour Using the Global Positioning System and Radio Occultation over Ethiopian Regions

Authors: Asmamaw Yehun, Tsegaye Gogie, Martin Vermeer, Addisu Hunegnaw

Abstract:

The Global Positioning System (GPS) is a space-based radio positioning system capable of providing continuous position, velocity, and time information to users anywhere on or near the surface of the Earth. The main objective of this work was to estimate the integrated precipitable water vapour (IPWV) using ground GPS and Low Earth Orbit (LEO) radio occultation (RO) to study its spatial-temporal variability. For LEO-GPS RO, we used Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) datasets. We estimated the daily and monthly means of IPWV using six selected ground-based GPS stations over the five-year period from 2012 to 2016; this period was selected because continuous data were available at all Ethiopian GPS stations during these years. We studied the temporal, seasonal, diurnal, and vertical variations of precipitable water vapour using GPS observables processed with the precise geodetic GAMIT-GLOBK software package. Finally, we determined the cross-correlation of our GPS-derived IPWV values with those of the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis and of the second-generation National Oceanic and Atmospheric Administration (NOAA) Global Ensemble Forecast System Reforecast (GEFS/R) for validation and statistical comparison. Higher IPWV values, ranging from 30 to 37.5 millimetres (mm), occur in the Gambela and Southern regions of Ethiopia, while some parts of the Tigray, Amhara, and Oromia regions have low IPWV values ranging from 8.62 to 15.27 mm. The correlation coefficient between the GPS-derived IPWV and both ECMWF and GEFS/R exceeds 90%. We conclude that precipitable water vapour in the study area shows strong temporal, seasonal, diurnal, and vertical variations.
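The standard GPS-meteorology conversion behind such IPWV estimates maps the zenith wet delay (ZWD) of the GPS signal to precipitable water vapour through a dimensionless factor Π of roughly 0.15, which varies weakly with the atmosphere's weighted mean temperature. A minimal sketch with a hypothetical delay value:

```python
def pwv_from_zwd(zwd_mm, pi_factor=0.15):
    """PWV = Pi * ZWD; Pi is typically ~0.14-0.16 depending on mean temp."""
    return pi_factor * zwd_mm

# A hypothetical 200 mm zenith wet delay maps to ~30 mm of PWV,
# consistent with the upper range reported for Gambela above.
print(round(pwv_from_zwd(200.0), 1))  # 30.0
```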

Keywords: GNSS, radio occultation, atmosphere, precipitable water vapour

Procedia PDF Downloads 58
608 Data Analytics of Electronic Medical Records Shows an Age-Related Differences in Diagnosis of Coronary Artery Disease

Authors: Maryam Panahiazar, Andrew M. Bishara, Yorick Chern, Roohallah Alizadehsani, Dexter Hadleye, Ramin E. Beygui

Abstract:

Early detection plays a crucial role in enhancing outcomes for patients with coronary artery disease (CAD). We utilized a big data analytics platform on ~23,000 patients with CAD, drawn from a total of 960,129 UCSF patients over 8 years. We traced the patients from their first encounter with a physician through the diagnosis and treatment of CAD. Characteristics such as demographic information, comorbidities, vital signs, lab tests, medications, and procedures are included. There are statistically significant gender-based differences in patients younger than 60 years old in the time from the first physician encounter to coronary artery bypass grafting (CABG), with a p-value of 0.03. There are no significant differences for patients between 60 and 80 years old (p-value = 0.8) or older than 80 (p-value = 0.4) at a 95% confidence level. This finding supports changes to the guidelines for referring younger patients for diagnostic tests expeditiously, improving outcomes by avoiding delays in treatment.

Keywords: electronic medical records, coronary artery disease, data analytics, young women

Procedia PDF Downloads 124
607 Application of the Fourier Transform for Dynamic Control of Structures with the Global Positioning System

Authors: J. M. de Luis Ruiz, P. M. Sierra García, R. P. García, R. P. Álvarez, F. P. García, E. C. López

Abstract:

Given the evolution of viaducts, structural health monitoring requires increasingly sophisticated techniques to characterize their state. Two alternatives can be distinguished: experimental and operational modal analysis. Although accelerometers and the Global Positioning System (GPS) have been applied to the monitoring of structures in service, dynamic monitoring during the construction stage is not common. This research analyzes whether GPS data can be applied to certain dynamic geometric controls of evolving structures. The fundamentals of this work were applied to the New Bridge of Cádiz (Spain), a worldwide milestone in bridge building. GPS data were recorded at an interval of 1 second during the erection of segments and transformed to the frequency domain with the Fourier transform. The vibration period and amplitude were contrasted with those provided by the finite element model, with differences of less than 10%, which is admissible. This process provides a vibration record of the structure with GPS alone, avoiding specialized equipment.
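The core of the method, extracting the dominant vibration period from a 1 Hz GPS displacement record via the Fourier transform, can be sketched as follows. The record here is synthetic (an assumed 5 s oscillation plus noise), since the bridge data are not public.

```python
import numpy as np

fs = 1.0                          # GPS positions recorded every second
t = np.arange(0, 600, 1.0 / fs)   # a 10-minute displacement record
f_true, amp = 0.2, 3.0            # assumed 5 s period, 3 mm amplitude
rng = np.random.default_rng(0)
disp = amp * np.sin(2 * np.pi * f_true * t) + rng.normal(0, 0.5, t.size)

# move to the frequency domain and locate the spectral peak
spectrum = np.fft.rfft(disp - disp.mean())
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak_idx = np.abs(spectrum)[1:].argmax() + 1   # skip the DC bin
peak_freq = freqs[peak_idx]
period = 1.0 / peak_freq                       # recovered vibration period
```

The recovered period (5 s here) is what would then be compared against the finite element model's prediction.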

Keywords: Fourier transform, global position system, operational modal analysis, structural health monitoring

Procedia PDF Downloads 216
606 Analysis of Exponential Distribution under Step Stress Partially Accelerated Life Testing Plan Using Adaptive Type-I Hybrid Progressive Censoring Schemes with Competing Risks Data

Authors: Ahmadur Rahman, Showkat Ahmad Lone, Ariful Islam

Abstract:

In this article, we estimate the parameters of the failure-time distribution of units sampled under an adaptive type-I progressive hybrid censoring scheme in step-stress partially accelerated life tests with competing risks. The failure times of the units are assumed to follow an exponential distribution. The maximum likelihood estimation technique is used to estimate the unknown parameters of the distribution and the tampered coefficient. Confidence intervals are also obtained for the parameters. A simulation study using the Monte Carlo method is performed to check the validity of the model and its assumptions.
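As an illustration of the estimation step, the sketch below computes the maximum likelihood estimate of an exponential failure rate under plain type-I censoring (a deliberate simplification of the adaptive progressive hybrid scheme in the paper), together with an asymptotic confidence interval; the data are simulated, and all parameter values are invented.

```python
import math
import random

random.seed(1)
LAM_TRUE = 0.5          # true failure rate of the simulated units
N, TAU = 200, 5.0       # units on test and type-I censoring time

lifetimes = [random.expovariate(LAM_TRUE) for _ in range(N)]
observed = [(min(t, TAU), t <= TAU) for t in lifetimes]  # (time, failed?)

failures = sum(failed for _, failed in observed)
total_time = sum(t for t, _ in observed)      # total time on test
lam_hat = failures / total_time               # MLE under type-I censoring

# asymptotic 95% CI from the observed Fisher information
se = lam_hat / math.sqrt(failures)
ci = (lam_hat - 1.96 * se, lam_hat + 1.96 * se)
```

A Monte Carlo study of the kind described would repeat this simulation many times and check the coverage of the interval.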

Keywords: adaptive type-I hybrid progressive censoring, competing risks, exponential distribution, simulation, step-stress partially accelerated life tests

Procedia PDF Downloads 319
605 Assessment of Ground Water Potential Zone: A Case Study of Paramakudi Taluk, Ramanathapuram, Tamilnadu, India

Authors: Shri Devi

Abstract:

This study was conducted to delineate the groundwater potential zones in Paramakudi taluk, Ramanathapuram, Tamil Nadu, India, with a total areal extent of 745 sq. km. Thematic maps of soil, geology, geomorphology, drainage, and land use of the study area were prepared from the 1:50,000 toposheet. A digital elevation model (DEM) was generated from contours at a 10 m interval, and a slope map was also prepared. The groundwater potential zones of the region were obtained using weighted overlay analysis, for which all the thematic maps were overlaid in ArcGIS 10.2. Ranks were assigned to the parameters of each thematic layer, with weightages of 25% for soil, 25% for geomorphology, 25% for land use/land cover, 15% for slope, 5% for lineaments, and 5% for drainage streams. From these, a potential zone map was prepared and overlaid with the village map to identify regions of good, moderate, and low groundwater potential.
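The weighted overlay step can be sketched in a few lines: each thematic layer is ranked cell by cell, the ranks are combined with the stated weights (25/25/25/15/5/5), and the composite score is sliced into potential classes. The tiny 2x2 rasters and class breaks below are invented for illustration; the actual analysis runs on full ArcGIS raster layers.

```python
import numpy as np

# invented 2x2 ranked rasters (1 = poor ... 5 = excellent suitability)
layers = {
    "soil":       np.array([[5, 3], [2, 4]]),
    "geomorph":   np.array([[4, 4], [1, 5]]),
    "lulc":       np.array([[3, 5], [2, 3]]),
    "slope":      np.array([[5, 2], [3, 4]]),
    "lineament":  np.array([[1, 4], [5, 2]]),
    "drainage":   np.array([[2, 3], [4, 5]]),
}
# weightages from the study: 25/25/25/15/5/5 percent
weights = {"soil": 0.25, "geomorph": 0.25, "lulc": 0.25,
           "slope": 0.15, "lineament": 0.05, "drainage": 0.05}

score = sum(weights[name] * grid.astype(float) for name, grid in layers.items())
# slice the composite score into low / moderate / good potential zones
zones = np.digitize(score, bins=[2.5, 3.5])   # 0 = low, 1 = moderate, 2 = good
```

Each cell of `zones` then labels the corresponding area's groundwater potential, exactly as the overlay produces the final potential zone map.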

Keywords: GIS, ground water, Paramakudi, weighted overlay analysis

Procedia PDF Downloads 316
604 Developing New Algorithm and Its Application on Optimal Control of Pumps in Water Distribution Network

Authors: R. Rajabpour, N. Talebbeydokhti, M. H. Ahmadi

Abstract:

In recent years, new techniques have been proposed for solving complex engineering problems. One of these is the JPSO algorithm. With innovative changes to the jump mechanism of JPSO, it is possible to construct graph-based solutions with a new algorithm called G-JPSO. In this paper, the new algorithm is evaluated on the Fletcher-Powell optimal control problem and on the optimal control of pumps in a water distribution network. Optimal pump control consists of finding the optimal operating timetable (on/off status) for each pump over the desired time interval. The maximum number of on/off switches for each pump is imposed on the objective function as an additional constraint. To determine the optimal operation of the pumps, a model-based optimization-simulation algorithm was developed based on the G-JPSO and JPSO algorithms. The results of the proposed algorithm compare well with those of the ant colony, genetic, and JPSO algorithms, which shows the robustness of the proposed algorithm in finding near-optimal solutions at reasonable computational cost.
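The objective being optimized can be illustrated with a toy evaluation function: a candidate on/off timetable is scored by its pumping energy cost, with a penalty when the switch-count constraint is violated. The tariff, pump rating, and switch limit below are invented; a search algorithm such as G-JPSO would explore the space of such timetables.

```python
# invented 24-hour tariff ($/kWh) and a single 45 kW pump
TARIFF = [0.08] * 7 + [0.15] * 10 + [0.22] * 4 + [0.08] * 3
PUMP_KW = 45.0
MAX_SWITCHES = 4     # constraint on the number of on/off transitions

def schedule_cost(status):
    """Energy cost of a 24-entry on/off timetable, penalized if it
    switches state more often than the constraint allows."""
    switches = sum(a != b for a, b in zip(status, status[1:]))
    energy_cost = sum(PUMP_KW * price
                      for on, price in zip(status, TARIFF) if on)
    penalty = 1e6 if switches > MAX_SWITCHES else 0.0
    return energy_cost + penalty

off_peak = [1] * 7 + [0] * 14 + [1] * 3   # pump only in cheap hours
always_on = [1] * 24
```

Running the pump only off-peak costs 36.0 here versus 143.1 for continuous operation, and only timetables with at most four switches avoid the constraint penalty.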

Keywords: G-JPSO, operation, optimization, pumping station, water distribution networks

Procedia PDF Downloads 375
603 Behavioral Finance: Anomalies at Real Markets, Weekday Effect

Authors: Vera Jancurova

Abstract:

Financial theory is dominated by the belief that the weekday effect has disappeared from current markets. The purpose of this article is to study anomalies, especially the weekday effect, that disrupt the efficiency of financial markets. The research is based on the analysis of historical daily exchange rates of significant world indices to determine the presence of weekday effects on financial markets. The methodology is based on analyzing the daily averages of particular indices over different time periods. Average daily gains were analyzed over the whole time interval and then over particular five- and ten-year periods, with the aim of detecting the presence of the effect on current financial markets. The results confirm the presence of the weekday effect in the most significant indices, for example the Nasdaq, S&P 500, FTSE 100, and Hang Seng. It was confirmed that over the last ten years the weekday effect largely disappeared from financial markets; however, in the most recent years the indicators show that it is coming back. The study shows that the weekday effect has to be taken into consideration on financial markets, especially in recent years.
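The core computation, comparing average daily gains by weekday, can be sketched on synthetic data. The negative Monday drift below is built in purely to make the effect visible; real tests would use historical index returns and a significance test.

```python
import random
import statistics

random.seed(0)
WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri"]

# synthetic daily returns (%) with an artificial negative-Monday drift
returns = []
for _ in range(1000):                     # roughly twenty years of weeks
    for day in WEEKDAYS:
        drift = -0.15 if day == "Mon" else 0.05
        returns.append((day, drift + random.gauss(0.0, 1.0)))

means = {d: statistics.mean(r for day, r in returns if day == d)
         for d in WEEKDAYS}
# a weekday effect shows up as a systematically different Monday average
```

On real index data, `means` would be computed per five-year or ten-year window to track whether the gap between weekdays shrinks or reappears over time.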

Keywords: indices, anomalies, behavioral finance, weekday effect

Procedia PDF Downloads 308
602 New Machine Learning Optimization Approach Based on Input Variables Disposition Applied for Time Series Prediction

Authors: Hervice Roméo Fogno Fotsoa, Germaine Djuidje Kenmoe, Claude Vidal Aloyem Kazé

Abstract:

One of the main applications of machine learning is the prediction of time series, but more accurate prediction requires a better-optimized machine learning model. Several optimization techniques have been developed, but without considering the disposition of the input variables of the system. Thus, this work presents a new machine learning architecture optimization technique based on the optimal disposition of the input variables. The validation is done on the prediction of wind time series, using data collected in Cameroon. The number of possible dispositions with four input variables is determined, i.e., twenty-four. Each of the dispositions is used to perform the prediction, the main criteria being the training and prediction performances. Results obtained from a static architecture and a dynamic architecture of neural networks show that these performances are a function of the input variable disposition, and in a different way for each architecture. This analysis reveals that the input variable disposition must be taken into account when developing a more optimal neural network model. Thus, a new neural network training algorithm is proposed by introducing the search for the optimal input variable disposition into the traditional back-propagation algorithm. The results of applying this new optimization approach to the two single neural network architectures are compared step by step with the previously obtained results. Moreover, the proposed approach is validated in a collaborative optimization method with a single-objective optimization technique, i.e., genetic algorithm back-propagation neural networks. From these comparisons, it is concluded that each proposed model outperforms its traditional counterpart in the training and prediction performance of time series. Thus, the proposed optimization approach can be useful in improving the accuracy of time series forecasts.
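The search space of the proposed technique is simply the set of orderings of the network inputs: with four lagged inputs there are 4! = 24 dispositions, each trained and scored. A skeletal sketch, in which the scoring function is a placeholder and not the paper's actual network:

```python
from itertools import permutations

# four hypothetical lagged wind-speed inputs feeding the predictor
VARIABLES = ("lag1", "lag2", "lag3", "lag4")
dispositions = list(permutations(VARIABLES))   # 4! = 24 input orderings

def train_and_score(order):
    """Placeholder: train the network with inputs in this order and
    return its validation error. A real implementation would run
    back-propagation here and report the prediction performance."""
    return sum(i * (hash(name) % 97) for i, name in enumerate(order))

best_disposition = min(dispositions, key=train_and_score)
```

The paper's contribution is to fold this search into the back-propagation training loop itself rather than running it as an outer exhaustive sweep.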

Keywords: input variable disposition, machine learning, optimization, performance, time series prediction

Procedia PDF Downloads 69
601 An Experimental Study on Some Conventional and Hybrid Models of Fuzzy Clustering

Authors: Jeugert Kujtila, Kristi Hoxhalli, Ramazan Dalipi, Erjon Cota, Ardit Murati, Erind Bedalli

Abstract:

Clustering is a versatile instrument in the analysis of collections of data, providing insights into the underlying structures of the dataset and enhancing modeling capabilities. The fuzzy approach to the clustering problem increases flexibility by introducing partial memberships (values in the continuous interval [0, 1]) of the instances in the clusters. Several fuzzy clustering algorithms have been devised, such as FCM, Gustafson-Kessel, Gath-Geva, kernel-based FCM, and PCM. Each of these algorithms has its own advantages and drawbacks, so none of them performs superiorly on all datasets. In this paper we experimentally compare the FCM, GK, and GG algorithms and a hybrid two-stage fuzzy clustering model combining the FCM and Gath-Geva algorithms. First, we theoretically discuss the advantages and drawbacks of each of these algorithms and describe the hybrid clustering model, which exploits the advantages and diminishes the drawbacks of each algorithm. Second, we experimentally compare the accuracy of the hybrid model by applying it to several benchmark and synthetic datasets.
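Of the algorithms compared, FCM is the simplest to sketch: alternate between updating cluster centers from membership-weighted means and updating memberships from inverse distances. The minimal NumPy version below (fixed iteration count, Euclidean distances, fuzzifier m = 2, invented test data) is illustrative only; GK and GG additionally adapt a covariance-based distance per cluster.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns memberships U (n x c) and centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)            # each row sums to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                 # guard against zero distance
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True) # membership update rule
    return U, centers

# two well-separated blobs; hard labels come from the largest membership
X = np.array([[0., 0.], [0., 1.], [1., 0.], [5., 5.], [5., 6.], [6., 5.]])
U, centers = fcm(X)
labels = U.argmax(axis=1)
```

In the hybrid two-stage model described, such an FCM pass would supply the initial partition that the Gath-Geva stage then refines.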

Keywords: fuzzy clustering, fuzzy c-means algorithm (FCM), Gustafson-Kessel algorithm, hybrid clustering model

Procedia PDF Downloads 481
600 Some Observations on the Analysis of Four Performances of the Allemande from J.S. Bach's Partita for Solo Flute (BWV 1013) in Terms of Zipf's Law

Authors: Douglas W. Scott

Abstract:

The Allemande from J. S. Bach's Partita for solo flute (BWV 1013) presents many unique challenges for any flautist, especially in terms of segmentation analysis required to select breathing places in the first half. Without claiming to identify a 'correct' solution to this problem, this paper analyzes the section in terms of a set of techniques based around a statistical property commonly (if not ubiquitously) found in music, namely Zipf’s law. Specifically, the paper considers violations of this expected profile at various levels of analysis, an approach which has yielded interesting insights in previous studies. The investigation is then grounded by considering four actual solutions to the problem found in recordings made by different flautists, which opens up the possibility of expanding Zipfian analysis to include a consideration of inter-onset-intervals (IOIs). It is found that significant deviations from the expected Zipfian distributions can reveal and highlight stylistic choices made by different performers.
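The kind of check the analysis performs can be sketched as a rank-frequency slope test: count how often each quantized inter-onset-interval occurs, fit a line in log-log space, and flag deviation from the ideal Zipfian slope of -1. The IOI counts below are invented for illustration, not taken from any of the four recordings.

```python
import math
from collections import Counter

# invented quantized inter-onset-intervals (ms) from one performance
iois = [250] * 32 + [500] * 16 + [125] * 8 + [750] * 4 + [1000] * 2
freqs = sorted(Counter(iois).values(), reverse=True)

# Zipf's law predicts log(frequency) ~ -1 * log(rank); fit the slope
xs = [math.log(rank) for rank in range((1), len(freqs) + 1)]
ys = [math.log(f) for f in freqs]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
deviation = slope - (-1.0)    # how far the performance departs from Zipf
```

Here the counts halve with rank, so the fitted slope is roughly -1.68, a marked departure from -1 that, in the paper's terms, could highlight a stylistic choice by the performer.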

Keywords: inter-onset-interval, Partita for solo flute, BWV 1013, segmentation analysis, Zipf’s law

Procedia PDF Downloads 149
599 The Analysis of Exhaust Emission from Single Cylinder Non-Mobile Spark Ignition Engine Using Ethanol-Gasoline Blend as Fuel

Authors: Iyiola Olusola Oluwaleye, Ogbevire Umukoro

Abstract:

In view of prevailing pollution problems and their consequences for the environment, efforts are being made to lower the concentration of toxic components in combustion products and to decrease fossil fuel consumption by using renewable alternative fuels. In this work, the impact of ethanol-gasoline blends on the exhaust emissions of a single-cylinder non-mobile spark ignition engine was investigated. Gasoline was blended with 5 - 20% ethanol sourced from the open market (bought off the shelf), in intervals of 5%. The emission characteristics of the exhaust gas from the combustion of the ethanol-gasoline blends showed that increasing the percentage of ethanol in the blend decreased CO emissions by between 2.12% and 52.29% and HC emissions by between 12.14% and 53.24%, but increased CO2 and NOx emissions by between 25% and 56% and between 59% and 60%, respectively. The E15 blend is preferred over the other blends at no-load and across all load variations; however, its NOx emission was the highest of all the samples. This will negatively affect human health and the environment, but the drawback can be remedied by adequate treatment with appropriate additives.

Keywords: blends, emission, ethanol, gasoline, spark ignition engine

Procedia PDF Downloads 169
598 Mechanical Properties and Microstructural Analysis of Al6061-Red Mud Composites

Authors: M. Gangadharappa, M. Ravi Kumar, H. N. Reddappa

Abstract:

The mechanical properties and morphology of Al6061-red mud particulate composites were investigated. The composites comprise an Al6061 matrix reinforced with red mud particles of 53-75 micron size, ranging from 0% to 12% in intervals of 2%. The stir casting technique was used to fabricate the Al6061-red mud composites. Density, percentage porosity, tensile properties, fracture toughness, hardness, impact energy, percentage elongation, and percentage reduction in area were measured. Further, microstructural and SEM examinations were carried out to characterize the composites produced. The results show a uniform dispersion of the red mud particles along the grain boundaries of the Al6061 alloy. The tensile strength and hardness values increase with the addition of red mud particles, but there is a slight decrease in the impact energy, percentage elongation, and percentage reduction in area as the reinforcement increases. From these results, we conclude that red mud, an industrial waste, can be used to enhance the properties of Al6061 alloy for engineering applications.

Keywords: Al6061, red mud, tensile strength, hardness and microstructures

Procedia PDF Downloads 540
597 Expression of ACSS2 Genes in Peripheral Blood Mononuclear Cells of Patients with Alzheimer’s Disease

Authors: Ali Bayram, Burak Uz, Remzi Yiğiter

Abstract:

The impairment of lipid metabolism in the central nervous system has been suggested as a critical factor in Alzheimer’s disease (AD) pathogenesis. The Homo sapiens acyl-coenzyme A synthetase short-chain family member 2 (ACSS2) gene encodes the enzyme acetyl-coenzyme A synthetase (AMP forming; AceCS), providing acetyl-coenzyme A (Ac-CoA) for various physiological processes, such as cholesterol and fatty acid synthesis, as well as the citric acid cycle. We investigated ACSS2, transcript variant 1 (ACSS2*1), mRNA levels in the peripheral blood mononuclear cells (PBMC) of patients with AD and compared them with controls. The study group comprised 50 patients with a diagnosis of AD who presented to the Department of Neurology, Gaziantep University Faculty of Medicine; 49 healthy individuals without any neurodegenerative disease were included as controls. The ACSS2 mRNA expression ratio in PBMC of AD patients relative to controls was 0.495 (95% confidence interval: 0.410-0.598, p = 0.000000001902). Further studies are needed to better clarify this association.

Keywords: Alzheimer’s disease, ACSS2 Genes, mRNA expression, RT-PCR

Procedia PDF Downloads 358
596 Spatial Analysis of Park and Ride Users’ Dynamic Accessibility to Train Station: A Case Study in Perth

Authors: Ting (Grace) Lin, Jianhong (Cecilia) Xia, Todd Robinson

Abstract:

Accessibility analysis, which examines people’s ability to reach facilities and destinations, is a fundamental assessment for transport planning, policy making, and social exclusion research. Dynamic accessibility, which measures accessibility under real-time traffic conditions, has become an advanced accessibility indicator in transport research. It helps travelers understand the daily variability of travel times, assists traffic engineers in monitoring congestion, and ultimately supports the development of effective congestion-mitigation strategies. This research used real-time traffic information, collecting travel time data at 15-minute intervals via the TomTom® API. A framework for measuring dynamic accessibility was then developed, based on gravity theory and accessibility dichotomy theory, through space and time interpolation. With this framework, dynamic accessibility can be derived at any given time and location.
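A gravity-based dynamic accessibility measure of the kind described can be sketched as opportunities at each station discounted by a decay function of the current drive time. All numbers below (drive times, parking-bay counts, decay parameter) are invented; the study interpolates real TomTom® travel times in space and time.

```python
import math

# invented drive times (minutes) from one origin, at two times of day
travel_times = {
    "peak":     {"StationA": 22, "StationB": 35, "StationC": 18},
    "off_peak": {"StationA": 12, "StationB": 20, "StationC": 11},
}
parking_bays = {"StationA": 400, "StationB": 250, "StationC": 600}
BETA = 0.1   # assumed impedance-decay parameter

def accessibility(times_min):
    """Gravity measure: parking supply discounted by travel-time decay."""
    return sum(parking_bays[s] * math.exp(-BETA * t)
               for s, t in times_min.items())

peak = accessibility(travel_times["peak"])
off_peak = accessibility(travel_times["off_peak"])
# congestion shrinks the measure: dynamic accessibility is lower at peak
```

Evaluating this at every 15-minute slice of the travel-time feed yields the time profile of accessibility that the framework is designed to expose.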

Keywords: dynamic accessibility, hot spot, transport research, TomTom® API

Procedia PDF Downloads 358
595 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference

Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov

Abstract:

Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for verifying the power performance of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, non-parametric statistical inference is applied to design a simple, inexpensive method of verifying the power performance of a wind turbine. A statistical test is explained, examined, and its adequacy tested on real data. The method uses the information collected by the SCADA system (Supervisory Control and Data Acquisition) from sensors embedded in the wind turbines to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia over the measurement interval from January 1, 2016, to December 31, 2016. From the test, it was concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of the proposed methods showed that the power performance of the specific wind farm under assessment was acceptable.
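A minimal non-parametric check in this spirit is an exact sign test of monthly energy against the expectation from the warranted power curve: under the null hypothesis of acceptable performance, each month is equally likely to land above or below expectation. The monthly figures below are invented, and the paper's actual test statistic may differ.

```python
import math

# invented monthly energy (MWh): measured vs. power-curve expectation
measured = [310, 295, 402, 288, 350, 331, 298, 376, 342, 365, 301, 322]
expected = [305, 300, 395, 292, 344, 328, 305, 370, 338, 360, 306, 318]

above = [m > e for m, e in zip(measured, expected) if m != e]
k, n = sum(above), len(above)

# exact two-sided binomial p-value for the sign test
tail = sum(math.comb(n, i) for i in range(min(k, n - k) + 1)) / 2 ** n
p_value = min(1.0, 2.0 * tail)
# p_value is about 0.39 here: no significant deviation from the curve,
# matching the paper's conclusion of acceptable performance
```

The appeal of such a test is exactly what the abstract claims: it needs only SCADA monthly totals and makes no distributional assumption about the errors.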

Keywords: canonical correlation analysis, power curve, power performance, wind energy

Procedia PDF Downloads 308