Search results for: Baseline estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1172

302 Human Growth Curve Estimation through a Combination of Longitudinal and Cross-sectional Data

Authors: Sedigheh Mirzaei S., Debasis Sengupta

Abstract:

Parametric models have been quite popular for studying human growth, particularly in relation to biological parameters such as peak size velocity and age at peak size velocity. Longitudinal data are generally considered to be vital for fitting a parametric model to individual-specific data, and for studying the distribution of these biological parameters in a human population. However, cross-sectional data are easier to obtain than longitudinal data. In this paper, we present a method of combining longitudinal and cross-sectional data for the purpose of estimating the distribution of the biological parameters. We demonstrate, through simulations in the special case of the Preece-Baines model, how estimates based on longitudinal data can be improved upon by harnessing the information contained in cross-sectional data. We study the extent of improvement for different mixes of the two types of data, and finally illustrate the use of the method through data collected by the Indian Statistical Institute.

Keywords: Preece-Baines growth model, MCMC method, Mixed effect model

301 Bandwidth Estimation Algorithms for the Dynamic Adaptation of Voice Codec

Authors: Davide Pierattoni, Ivan Macor, Pier Luca Montessoro

Abstract:

In recent years, multimedia traffic, and in particular VoIP services, has been growing dramatically. We present a new algorithm to control resource utilization and to optimize voice codec selection during SIP call setup, based on the traffic conditions estimated on the network path. The most suitable methodologies and tools that perform real-time evaluation of the available bandwidth on a network path have been integrated with our proposed algorithm, which selects the best codec for a VoIP call as a function of the instantaneous available bandwidth on the path. The algorithm does not require any explicit feedback from the network, which makes it easily deployable over the Internet. We have also performed intensive tests on real network scenarios with a software prototype, verifying the algorithm's efficiency with different network topologies and traffic patterns between two SIP PBXs. The promising results obtained during the experimental validation of the algorithm are now the basis for the extension towards a larger set of multimedia services and the integration of our methodology with existing PBX appliances.
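To illustrate the core selection step described in this abstract, the following is a minimal sketch, assuming a fixed table of per-call codec bandwidth requirements; the codec names and bitrate figures are illustrative assumptions, not values taken from the paper, and the actual algorithm also integrates a real-time bandwidth estimation stage that is not reproduced here.

```python
# Hypothetical per-call bandwidth requirements (kbps), including packet overhead.
CODEC_BITRATES_KBPS = {
    "G.711": 87.2,
    "G.726": 55.2,
    "G.729": 31.2,
    "G.723.1": 21.9,
}

def select_codec(available_bandwidth_kbps, margin=0.9):
    """Pick the highest-bitrate codec that fits within the estimated available
    bandwidth, keeping a safety margin; otherwise fall back to the lightest codec."""
    usable = available_bandwidth_kbps * margin
    candidates = [(rate, name) for name, rate in CODEC_BITRATES_KBPS.items() if rate <= usable]
    if candidates:
        return max(candidates)[1]
    return min((rate, name) for name, rate in CODEC_BITRATES_KBPS.items())[1]

print(select_codec(100.0))  # -> "G.711"
print(select_codec(40.0))   # -> "G.729"
```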

Keywords: Integrated voice-data communication, computer network performance, resource optimization.

300 Identification of Aircraft Gas Turbine Engines Temperature Condition

Authors: Pashayev A., Askerov D., C. Ardil, Sadiqov R., Abdullayev P.

Abstract:

The groundlessness of applying probability-statistical methods is especially evident at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the available information is fuzzy, limited and uncertain; the efficiency of applying the new soft computing technology, using fuzzy logic and neural network methods, at these diagnosing stages is shown. Multiple linear and nonlinear models (regression equations) obtained on the basis of statistical fuzzy data are trained with high accuracy. When the information is sufficient, it is proposed to use a recurrent algorithm for identifying the aviation GTE technical condition from measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive least squares method (LSM)). As an application of the given technique, the technical condition of a new operating aviation engine D30KU-154 was estimated at an altitude of H = 10,600 m.
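For readers unfamiliar with recursive least squares, the following is a minimal sketch of the standard RLS recursion for identifying the coefficients of a linear model from streaming input/output measurements; the paper proposes a modified recursive LSM for noisy GTE data, which is not reproduced here, and the toy data are purely illustrative.

```python
import numpy as np

def rls_identify(X, y, forgetting=1.0, delta=1e3):
    """Standard recursive least squares over rows of X and targets y."""
    n_features = X.shape[1]
    theta = np.zeros(n_features)          # current coefficient estimate
    P = delta * np.eye(n_features)        # inverse-covariance-like matrix
    for x_k, y_k in zip(X, y):
        Px = P @ x_k                                  # shape (n_features,)
        gain = Px / (forgetting + x_k @ Px)           # update gain vector
        error = y_k - x_k @ theta                     # one-step prediction error
        theta = theta + gain * error                  # coefficient update
        P = (P - np.outer(gain, x_k) @ P) / forgetting
    return theta

# Toy usage: recover the coefficients of y = 2*x1 - 3*x2 + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -3.0]) + 0.01 * rng.normal(size=200)
print(rls_identify(X, y))  # close to [ 2. -3.]
```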

Keywords: Identification of a technical condition, aviation gas turbine engine, fuzzy logic and neural networks.

299 Identification of Aircraft Gas Turbine Engine's Temperature Condition

Authors: Pashayev A., Askerov D., C. Ardil, Sadiqov R., Abdullayev P.

Abstract:

The groundlessness of applying probability-statistical methods is especially evident at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the available information is fuzzy, limited and uncertain; the efficiency of applying the new soft computing technology, using fuzzy logic and neural network methods, at these diagnosing stages is shown. Multiple linear and nonlinear models (regression equations) obtained on the basis of statistical fuzzy data are trained with high accuracy. When the information is sufficient, it is proposed to use a recurrent algorithm for identifying the aviation GTE technical condition from measurements of the input and output parameters of the multiple linear and nonlinear generalized models in the presence of measurement noise (a new recursive least squares method (LSM)). As an application of the given technique, the technical condition of a new operating aviation engine D30KU-154 was estimated at an altitude of H = 10,600 m.

Keywords: Identification of a technical condition, aviation gas turbine engine, fuzzy logic and neural networks.

298 Estimation of the Upper Tail Dependence Coefficient for Insurance Loss Data Using an Empirical Copula-Based Approach

Authors: Adrian O’Hagan, Robert McLoughlin

Abstract:

Considerable focus in the world of insurance risk quantification is placed on modeling loss values from lines of business (LOBs) that possess upper tail dependence. Copulas such as the Joe, Gumbel and Student-t copula may be used for this purpose. The copula structure imparts a desired level of tail dependence on the joint distribution of claims from the different LOBs. Alternatively, practitioners may possess historical or simulated data that already exhibit upper tail dependence, through the impact of catastrophe events such as hurricanes or earthquakes. In these circumstances, it is not desirable to induce additional upper tail dependence when modeling the joint distribution of the loss values from the individual LOBs. Instead, it is of interest to accurately assess the degree of tail dependence already present in the data. The empirical copula and its associated upper tail dependence coefficient are presented in this paper as robust, efficient means of achieving this goal.
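As a concrete illustration of the quantity being estimated, the following is a minimal sketch of an empirical upper tail dependence estimate computed from rank-based pseudo-observations at a high threshold; the paper's exact empirical-copula-based estimator may differ in detail, and the simulated loss data for two lines of business are purely illustrative.

```python
import numpy as np

def pseudo_observations(values):
    """Rank-transform a sample to approximately uniform (0, 1) margins."""
    n = len(values)
    ranks = np.argsort(np.argsort(values)) + 1
    return ranks / (n + 1)

def empirical_upper_tail_dependence(x, y, u=0.95):
    """Estimate lambda_U(u) = P(V > u | U > u) from joint rank exceedances."""
    ux, uy = pseudo_observations(x), pseudo_observations(y)
    joint = np.sum((ux > u) & (uy > u))
    single = np.sum(ux > u)
    return joint / single if single > 0 else 0.0

# Toy usage: a shared heavy-tailed shock induces upper tail dependence.
rng = np.random.default_rng(1)
shock = rng.pareto(2.0, size=5000)
lob1 = rng.lognormal(size=5000) + shock
lob2 = rng.lognormal(size=5000) + shock
print(empirical_upper_tail_dependence(lob1, lob2, u=0.95))
```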

Keywords: Empirical copula, extreme events, insurance loss reserving, upper tail dependence coefficient.

297 A Hybrid Scheme for on-Line Diagnostic Decision Making Using Optimal Data Representation and Filtering Technique

Authors: Hyun-Woo Cho

Abstract:

Early diagnostic decision making in industrial processes is absolutely necessary to produce high-quality final products. It helps to provide early warning of a special event in a process and to find its assignable cause. This work presents a hybrid diagnostic scheme for batch processes. Nonlinear representation of raw process data is combined with classification tree techniques. A nonlinear kernel-based dimension reduction is executed to obtain nonlinear classification decision boundaries for the fault classes. In order to enhance diagnosis performance for batch processes, the data are filtered to remove irrelevant information from the process data. To compare the diagnosis performance of several representation, filtering, and future observation estimation methods, four diagnostic schemes are evaluated. In this work, the performance of the presented diagnosis schemes is demonstrated using batch process data.

Keywords: Diagnostics, batch process, nonlinear representation, data filtering, multivariate statistical approach

296 Discrete Estimation of Spectral Density for Alpha Stable Signals Observed with an Additive Error

Authors: R. Sabre, W. Horrigue, J. C. Simon

Abstract:

This paper is interested in two difficulties encountered in practice when observing a continuous-time process. The first is that the process cannot be observed continuously over a time interval; only discrete observations are taken. The second is that the process is frequently observed with a constant additive error. It is important to give an estimator of the spectral density of such a process that takes into account the additive observation error and the choice of the discrete observation times. In this work, we propose an estimator based on spectral smoothing of the periodogram by the polynomial Jackson kernel, reducing the additive error. In order to solve the aliasing phenomenon, this estimator is constructed from observations taken at well-chosen times, so as to restrict the estimator to the band where the spectral density is not zero. We show that the proposed estimator is asymptotically unbiased and consistent. We thus obtain an estimator that resolves the two difficulties concerning the choice of the observation instants of a continuous-time process and the observations affected by a constant error.

Keywords: Spectral density, stable processes, aliasing, periodogram.

295 Energy Communities from Municipality Level to Province Level: A Comparison Using Autoregressive Integrated Moving Average Model

Authors: Amro Issam Hamed Attia Ramadan, Marco Zappatore, Pasquale Balena, Antonella Longo

Abstract:

Considering the energy crisis that is hitting Europe, it becomes increasingly necessary to change energy policies to depend less on fossil fuels and replace them with energy from renewable sources. This has triggered the urge to use clean energy, not only to satisfy energy needs and fulfill the required consumption, but also to decrease the danger of climatic changes due to harmful emissions. Many countries have already started creating energy communities based on renewable energy sources. The first step to understanding energy needs in any place is to know the consumption precisely. In this work, we aim to estimate the electricity consumption of a municipality that is part of a rural area in southern Italy, using forecast models that allow the estimation of electricity consumption for the next 10 years. We then apply the same model to the province where the municipality is located and estimate its future consumption for the same period, to examine whether it is possible to start from the municipality level and reach the province level when creating energy communities.
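The following is a minimal sketch of the kind of ARIMA forecast the abstract describes, using statsmodels; the synthetic monthly consumption series and the order (1, 1, 1) are illustrative assumptions, not the data or the model actually selected in the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly electricity-consumption series with trend and seasonality.
rng = np.random.default_rng(0)
index = pd.date_range("2015-01-01", periods=96, freq="MS")
consumption = pd.Series(
    1000 + 2.0 * np.arange(96)
    + 50 * np.sin(2 * np.pi * np.arange(96) / 12)
    + rng.normal(scale=10, size=96),
    index=index,
)

model = ARIMA(consumption, order=(1, 1, 1))
fitted = model.fit()
forecast = fitted.forecast(steps=120)   # ten years of monthly predictions
print(forecast.head())
```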

Keywords: ARIMA, electricity consumption, forecasting models, time series.

294 Predicting Extrusion Process Parameters Using Neural Networks

Authors: Sachin Man Bajimaya, SangChul Park, Gi-Nam Wang

Abstract:

The objective of this paper is to estimate realistic principal extrusion process parameters by means of an artificial neural network. Conventionally, finite element analysis is used to derive process parameters. However, the finite element analysis of the extrusion model does not consider the manufacturing process constraints in its modeling. Therefore, the process parameters obtained through such an analysis remain highly theoretical. Alternatively, process development in industrial extrusion is to a great extent based on trial and error and often involves full-size experiments, which are both expensive and time-consuming. The artificial neural network-based estimation of the extrusion process parameters prior to plant execution helps to make the actual extrusion operation more efficient because more realistic parameters may be obtained. It thus bridges the gap between simulation and the real manufacturing execution system. In this work, a suitable neural network is designed and trained using an appropriate learning algorithm. The network so trained is used to predict the manufacturing process parameters.
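A minimal sketch of the general idea, training a small neural network regressor to map process descriptors to an extrusion parameter; the feature names (billet diameter, extrusion ratio, billet temperature), the target, and the synthetic relationship below are hypothetical placeholders, not the network architecture or dataset used in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical inputs: billet diameter (mm), extrusion ratio, billet temperature (°C).
X = rng.uniform([50.0, 10.0, 400.0], [200.0, 60.0, 500.0], size=(500, 3))
# Hypothetical target: required ram force (arbitrary synthetic relationship).
y = 0.5 * X[:, 0] * np.log(X[:, 1]) - 0.1 * X[:, 2] + rng.normal(scale=2.0, size=500)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0),
)
model.fit(X[:400], y[:400])
print("test R^2:", model.score(X[400:], y[400:]))
```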

Keywords: Artificial Neural Network (ANN), Indirect Extrusion, Finite Element Analysis, MES.

293 Prediction of the Thermal Parameters of a High-Temperature Metallurgical Reactor Using Inverse Heat Transfer

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents an inverse analysis for predicting the thermal conductivities and the heat flux of a high-temperature metallurgical reactor simultaneously. Once these thermal parameters are predicted, the time-varying thickness of the protective phase-change bank that covers the inside surface of the brick walls of a metallurgical reactor can be calculated. The enthalpy method is used to solve the melting/solidification process of the protective bank. The inverse model rests on the Levenberg-Marquardt Method (LMM) combined with the Broyden method (BM). A statistical analysis for the thermal parameter estimation is carried out. The effect of the position of the temperature sensors, total number of measurements and measurement noise on the accuracy of inverse predictions is investigated. Recommendations are made concerning the location of temperature sensors.
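The following is a minimal sketch of the inverse-estimation idea with a Levenberg-Marquardt solver: recover unknown thermal parameters by least-squares matching of simulated and measured sensor temperatures. The forward model here is a deliberately simple one-dimensional steady-conduction stand-in with an assumed convective boundary, not the enthalpy/phase-change model or the LMM-Broyden combination used in the paper, and all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, x):
    """Toy wall model: T(x) from a guessed conductivity and heat flux."""
    conductivity, heat_flux = params
    coolant_temperature, h_conv = 300.0, 250.0          # assumed known boundary condition
    surface_temperature = coolant_temperature + heat_flux / h_conv
    return surface_temperature + heat_flux * x / conductivity

x_sensors = np.array([0.01, 0.03, 0.05, 0.08])          # sensor positions, m
true_params = np.array([2.5, 4000.0])                   # conductivity (W/m/K), flux (W/m^2)
rng = np.random.default_rng(0)
measured = forward_model(true_params, x_sensors) + rng.normal(scale=0.5, size=4)

def residuals(params):
    return forward_model(params, x_sensors) - measured

solution = least_squares(residuals, x0=[1.0, 1000.0], method="lm")
print(solution.x)   # estimates close to the true conductivity and heat flux
```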

Keywords: Inverse heat transfer, phase change, metallurgical reactor, Levenberg–Marquardt method, Broyden method, bank thickness.

292 The Effects of Detector Spacing on Travel Time Prediction on Freeways

Authors: Piyali Chaudhuri, Peter T. Martin, Aleksandar Z. Stevanovic, Chongkai Zhu

Abstract:

Loop detectors report traffic characteristics in real time. They are at the core of traffic control process. Intuitively, one would expect that as density of detection increases, so would the quality of estimates derived from detector data. However, as detector deployment increases, the associated operating and maintenance cost increases. Thus, traffic agencies often need to decide where to add new detectors and which detectors should continue receiving maintenance, given their resource constraints. This paper evaluates the effect of detector spacing on freeway travel time estimation. A freeway section (Interstate-15) in Salt Lake City metropolitan region is examined. The research reveals that travel time accuracy does not necessarily deteriorate with increased detector spacing. Rather, the actual location of detectors has far greater influence on the quality of travel time estimates. The study presents an innovative computational approach that delivers optimal detector locations through a process that relies on Genetic Algorithm formulation.
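To make the optimization step concrete, here is a minimal sketch of a genetic algorithm that selects k detector locations from candidate mileposts; the fitness function is a toy stand-in (it rewards even coverage of the corridor rather than evaluating travel-time estimates against simulation data, as the study does), and the corridor length, candidate set, and GA settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
CANDIDATES = np.linspace(0.0, 16.0, 33)      # candidate mileposts on the corridor
K = 6                                        # number of detectors to place

def travel_time_error(locations):
    """Toy error proxy: the largest gap left uncovered along the corridor."""
    pts = np.concatenate(([0.0], np.sort(locations), [16.0]))
    return np.max(np.diff(pts))

def fitness(individual):
    return -travel_time_error(CANDIDATES[individual])

def random_individual():
    return rng.choice(len(CANDIDATES), size=K, replace=False)

def crossover(a, b):
    pool = np.unique(np.concatenate((a, b)))
    return rng.choice(pool, size=K, replace=False)

def mutate(individual, rate=0.2):
    individual = individual.copy()
    for i in range(K):
        if rng.random() < rate:
            individual[i] = rng.integers(len(CANDIDATES))
    unique = np.unique(individual)
    return unique if len(unique) == K else random_individual()

population = [random_individual() for _ in range(40)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(parents[rng.integers(10)], parents[rng.integers(10)]))
                for _ in range(30)]
    population = parents + children

best = max(population, key=fitness)
print(sorted(CANDIDATES[best]), travel_time_error(CANDIDATES[best]))
```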

Keywords: Detector, Freeway, Genetic algorithm, Travel time estimate.

291 Effect of Density on the Shear Modulus and Damping Ratio of Saturated Sand in Small Strain

Authors: M. Kakavand, S. A. Naeini

Abstract:

The dynamic properties of soil at small strains are important, especially for geotechnical engineers, for describing soil behavior and estimating the deformations of earth structures and of other structures, particularly significant ones. This paper presents the effect of density on the shear modulus and damping ratio of saturated clean sand at various isotropic confining pressures. For this purpose, specimens were prepared at two different relative densities, loose (Dr = 30%) and dense (Dr = 70%). Dynamic parameters were obtained from a series of consolidated undrained, fixed-free type torsional resonant column tests at small strain. Sand No. 161 was selected for this paper. The experiments show that, with increasing sand density and confining pressure, the shear modulus increases and the damping ratio decreases.

Keywords: Dynamic properties, shear modulus, damping ratio, clean sand, density, confining pressure, resonant column/torsional simple shear.

290 Air Dispersion Model for Prediction Fugitive Landfill Gaseous Emission Impact in Ambient Atmosphere

Authors: Moustafa Osman Mohammed

Abstract:

This paper explores the formation of HCl aerosol in the atmospheric boundary layer and encourages the uptake of environmental modeling systems (EMSs) as a practical evaluation of gaseous emissions ("framework measures") from small and medium-sized enterprises (SMEs). The conceptual model predicts greenhouse gas emissions at ecological points beyond landfill site operations. It focuses on incorporating traditional knowledge into baseline information for both the measurement data and the mathematical results, regarding the parameters that influence the model's input variables. The paper simplifies the parameters of the aerosol processes on the basis of more complex aerosol process computations. The simple model can be implemented in both Gaussian and Eulerian rural dispersion models. The aerosol processes considered in this study were (i) the coagulation of particles, (ii) the condensation and evaporation of organic vapors, and (iii) dry deposition. The chemical transformation of gas-phase compounds is taken into account through a photochemical formulation with exposure effects, with HCl concentrations as the starting point of the risk assessment. The discussion sets out distinct aspects of sustainability in reflecting the inputs, outputs, and modes of impact on the environment. The models thereby incorporate abiotic and biotic species to broaden the scope of integration for both impact quantification and risk assessment. The resulting environmental obligations suggest either a recommendation or a decision on what legislation should ultimately achieve as mitigation measures for landfill gas (LFG).

Keywords: Air dispersion model, landfill management, spatial analysis, environmental impact and risk assessment.

289 An In-depth Experimental Study of Wax Deposition in Pipelines

Authors: M. L. Arias, J. D’Adamo, M. N. Novosad, P. A. Raffo, H. P. Burbridge, G. O. Artana

Abstract:

Shale oils are highly paraffinic and, consequently, can create wax deposits that foul pipelines during transportation. Several factors must be considered when designing pipelines or treatment programs that prevent wax deposition, including the chemical species in the crude oils, flowrates, pipe diameters, and temperature. This paper describes the wax deposition study carried out within the framework of YPF Tecnología S.A. (Y-TEC) flow assurance projects, as part of the process to achieve a better understanding of wax deposition issues. Laboratory experiments were performed on a medium-size wax deposition loop, 1 inch in diameter and 15 meters long, equipped with a solid detector system, an online microscope to visualize crystals, and temperature and pressure sensors along the loop pipe. A baseline test was performed with diesel with no added paraffin or additive content. Tests were undertaken with different temperatures of the circulating and cooling fluids at different flow conditions. Then, a solution formed by incorporating a paraffin into the diesel was considered, and tests varying the flowrate and cooling rate were run again. Viscosity, density, WAT (Wax Appearance Temperature) with DSC (Differential Scanning Calorimetry), pour point, and cold finger measurements were carried out to determine the physical properties of the working fluids. The results obtained in the loop were analyzed through momentum balance and heat transfer models. To determine possible paraffin deposition scenarios, the temperature and pressure output signals of the loop were studied and compared with static laboratory WAT methods.

Keywords: Paraffin deposition, wax, oil pipelines, experimental pipe loop.

288 Knowledge Representation Based On Interval Type-2 CFCM Clustering

Authors: Myung-Won Lee, Keun-Chang Kwak

Abstract:

This paper is concerned with knowledge representation and extraction of fuzzy if-then rules using Interval Type-2 Context-based Fuzzy C-Means clustering (IT2-CFCM) with the aid of fuzzy granulation. This proposed clustering algorithm is based on information granulation in the form of IT2 based Fuzzy C-Means (IT2-FCM) clustering and estimates the cluster centers by preserving the homogeneity between the clustered patterns from the IT2 contexts produced in the output space. Furthermore, we can obtain the automatic knowledge representation in the design of Radial Basis Function Networks (RBFN), Linguistic Model (LM), and Adaptive Neuro-Fuzzy Networks (ANFN) from the numerical input-output data pairs. We shall focus on a design of ANFN in this paper. The experimental results on an estimation problem of energy performance reveal that the proposed method showed a good knowledge representation and performance in comparison with the previous works.

Keywords: IT2-FCM, IT2-CFCM, context-based fuzzy clustering, adaptive neuro-fuzzy network, knowledge representation.

287 Effect of Specimen Thickness on Probability Distribution of Grown Crack Size in Magnesium Alloys

Authors: Seon Soon Choi

Abstract:

Fatigue crack growth is stochastic because fatigue behavior involves uncertainty and randomness. Therefore, it is necessary to determine the probability distribution of the grown crack size at a specific fatigue crack propagation life, both for the maintenance of structures and for reliability estimation. The essential purpose of this study is to present the probability distribution that best fits the grown crack size at a specified fatigue life in a rolled magnesium alloy under different specimen thickness conditions. Fatigue crack propagation experiments are carried out in laboratory air under three specimen thickness conditions using AZ31 to investigate the stochastic crack growth behavior. The goodness-of-fit test for the probability distribution of the grown crack size under different specimen thickness conditions is performed with the Anderson-Darling test. The effect of specimen thickness on the variability of the grown crack size is also investigated.
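A minimal sketch of an Anderson-Darling goodness-of-fit check on a sample of grown crack sizes follows; the crack-size data are synthetic placeholders, and only distributions supported by scipy.stats.anderson are compared here, which need not coincide with the candidate distributions examined in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
crack_sizes = rng.lognormal(mean=1.0, sigma=0.15, size=60)   # synthetic crack sizes, mm

# Compare the A^2 statistic against the 5% critical value for each candidate.
for dist in ("norm", "logistic", "gumbel_r"):
    result = stats.anderson(crack_sizes, dist=dist)
    print(dist, "A^2 =", round(result.statistic, 3),
          "5% critical value =", result.critical_values[2])
```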

Keywords: Crack size, Fatigue crack propagation, Magnesium alloys, Probability distribution, Specimen thickness.

286 Video Super-Resolution Using Classification ANN

Authors: Ming-Hui Cheng, Jyh-Horng Jeng

Abstract:

In this study, a classification-based video super-resolution method using artificial neural network (ANN) is proposed to enhance low-resolution (LR) to high-resolution (HR) frames. The proposed method consists of four main steps: classification, motion-trace volume collection, temporal adjustment, and ANN prediction. A classifier is designed based on the edge properties of a pixel in the LR frame to identify the spatial information. To exploit the spatio-temporal information, a motion-trace volume is collected using motion estimation, which can eliminate unfathomable object motion in the LR frames. In addition, temporal lateral process is employed for volume adjustment to reduce unnecessary temporal features. Finally, ANN is applied to each class to learn the complicated spatio-temporal relationship between LR and HR frames. Simulation results show that the proposed method successfully improves both peak signal-to-noise ratio and perceptual quality.

Keywords: Super-resolution, classification, spatio-temporal information, artificial neural network.

285 Applicability of Diatom-Based Water Quality Assessment Indices in Dari Stream, Isparta, Turkey

Authors: Hasan Kalyoncu, Burcu Şerbetci

Abstract:

Diatoms are an important group of organisms in aquatic ecosystems, and diatom-based indices are increasingly becoming important tools for the assessment of ecological conditions in lotic systems. Although studies of Turkish rivers are very limited, diatom indices have been used for monitoring rivers in different basins. In the present study, we used the OMNIDIA program for the estimation of stream quality. Some indices showed lower sensitivity (IDP, WAT, LOBO, GENRE, TID, CEE, PT), some intermediate sensitivity (IDSE, DESCY, IPS, DI-CH, SLA, IDAP), and others higher sensitivity (SID, IBD, SHE, EPI-D). Among the investigated diatom communities, only a few taxa indicated alpha-mesosaprobity and polysaprobity. Most of the sites were characterized by a great relative contribution of eutraphent and tolerant taxa as well as oligosaprobic and beta-mesosaprobic diatoms. In general, the SID and IBD indices gave the best results. This study suggests that the structure of benthic diatom communities and diatom indices, especially SID, can be applied for monitoring rivers in southern Turkey.

Keywords: Diatom, Darı stream, OMNIDIA, Turkey, Water quality.

284 Feature Point Detection by Combining Advantages of Intensity-based Approach and Edge-based Approach

Authors: Sungho Kim, Chaehoon Park, Yukyung Choi, Soon Kwon, In So Kweon

Abstract:

In this paper, a novel corner detection method is presented to stably extract geometrically important corners. Intensity-based corner detectors such as the Harris corner detector can detect corners in noisy environments but have inaccurate corner positions and miss corners with obtuse angles. Edge-based corner detectors such as Curvature Scale Space can detect structural corners but show unstable corner detection due to incomplete edge detection in noisy environments. The proposed image-based direct curvature estimation can overcome the limitations of both: the inaccurate structural corner detection of the Harris corner detector (intensity-based) and the unstable corner detection of Curvature Scale Space caused by incomplete edge detection. Various experimental results validate the robustness of the proposed method.

Keywords: Feature, intensity, contour, hybrid.

283 An Optical Flow Based Segmentation Method for Objects Extraction

Authors: C. Lodato, S. Lopes

Abstract:

This paper describes a segmentation algorithm based on the cooperation of an optical flow estimation method with edge detection and region growing procedures. The proposed method has been developed as a pre-processing stage to be used in methodologies and tools for video/image indexing and retrieval by content. The addressed problem consists in extracting whole objects from background for producing images of single complete objects from videos or photos. The extracted images are used for calculating the object visual features necessary for both indexing and retrieval processes. The first task of the algorithm exploits the cues from motion analysis for moving area detection. Objects and background are then refined using respectively edge detection and region growing procedures. These tasks are iteratively performed until objects and background are completely resolved. The developed method has been applied to a variety of indoor and outdoor scenes where objects of different type and shape are represented on variously textured background.

Keywords: Motion Detection, Object Extraction, Optical Flow, Segmentation.

282 The Effects of a Circuit Training Program on Muscle Strength, Agility, Anaerobic Performance and Cardiovascular Endurance

Authors: Wirat Sonchan, Pratoom Moungmee, Anek Sootmongkol

Abstract:

This study aimed to examine the effects of a circuit training program on muscle strength, agility, anaerobic performance, and cardiovascular endurance. The study involved 24 freshman (age 18.87 ± 0.68 yr) male students of the Faculty of Sport Science, Burapha University. The study sample was randomly divided into two groups: a Circuit Training group (CT; n = 12) and a Control group (C; n = 12). Baseline data on height, weight, muscle strength (hand grip dynamometer and leg strength dynamometer), agility (agility T-test), anaerobic performance (Running-based Anaerobic Sprint Test), and cardiovascular endurance (20 m Endurance Shuttle Run Test) were collected. The circuit training program included one circuit of eight stations, with a 30/60-second work/rest interval and two cycles in Weeks 1-4, and a 60/90-second work/rest interval with three cycles in Weeks 5-8, performed three times per week. Data were analyzed using paired t-tests and independent-sample t-tests. The statistical significance level was set at 0.05. The results show that after 8 weeks of the training program, muscle strength, agility, anaerobic capacity, and cardiovascular endurance increased significantly in the CT group (p < 0.05), while no significant increase was observed in the C group. The results of this study suggest that the circuit training program improved the muscle strength, agility, anaerobic capacity, and cardiovascular endurance of the study subjects. This program may be used as a guideline for selecting a set of exercises to improve physical fitness.
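A minimal sketch of the statistical comparison described above: a paired t-test on pre- versus post-training scores within a group and an independent-samples t-test between groups; the score values are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
ct_pre = rng.normal(30.0, 4.0, size=12)            # e.g., strength scores before training
ct_post = ct_pre + rng.normal(3.0, 1.5, size=12)   # scores after 8 weeks of training
control_post = rng.normal(30.0, 4.0, size=12)      # control group after 8 weeks

t_paired, p_paired = stats.ttest_rel(ct_post, ct_pre)
t_ind, p_ind = stats.ttest_ind(ct_post, control_post)
print(f"within CT group:  t = {t_paired:.2f}, p = {p_paired:.4f}")
print(f"between groups:   t = {t_ind:.2f}, p = {p_ind:.4f}")
```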

Keywords: Cardiovascular endurance, circuit training, physical fitness, anaerobic performance.

281 On the EM Algorithm and Bootstrap Approach Combination for Improving Satellite Image Fusion

Authors: Tijani Delleji, Mourad Zribi, Ahmed Ben Hamida

Abstract:

This paper discusses the combination of the EM algorithm and the Bootstrap approach applied to improve the satellite image fusion process. This novel satellite image fusion method, based on estimation theory (the EM algorithm) and reinforced by the Bootstrap approach, was successfully implemented and tested. The sensor images are first split by a Bayesian segmentation method to determine a joint region map for the fused image. Then, we use the EM algorithm in conjunction with the Bootstrap approach to develop the bootstrap EM fusion algorithm, hence producing the fused targeted image. In this research, we propose to estimate the statistical parameters from the iterative equations of the EM algorithm relying on a reference of representative Bootstrap samples of images. The sizes of those samples are determined from a new criterion called the 'hybrid criterion'. Consequently, the results of our work show that using the Bootstrap EM (BEM) in image fusion improves the performance of the estimated parameters, which leads to an improvement of the fused image quality and reduces the computing time during the fusion process.

Keywords: Satellite image fusion, Bayesian segmentation, Bootstrap approach, EM algorithm.

280 Time Series Forecasting Using Various Deep Learning Models

Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan

Abstract:

Time Series Forecasting (TSF) is used to predict the target variables at a future time point based on the learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (Recurrent Neural Network (RNN), Long Short-term Memory (LSTM), Gated Recurrent Units (GRU), and Transformer) along with a baseline method. The dataset (hourly) we used is the Beijing Air Quality Dataset from the website of the University of California, Irvine (UCI), which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window sizes and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131) for most of our single-step and multi-step predictions. The best size for the look-back window to predict 1 hour into the future appears to be one day, while 2 or 4 days perform the best to predict 3 hours into the future.
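The following is a minimal sketch of how a fixed-length look-back window turns a multivariate time series into supervised (X, y) pairs for models such as those compared above; the random data, the choice of the first feature as the target, and the window/horizon sizes are illustrative assumptions rather than the paper's preprocessing.

```python
import numpy as np

def make_windows(series, look_back, horizon):
    """series: array of shape (T, n_features); the target is the first feature."""
    X, y = [], []
    for t in range(len(series) - look_back - horizon + 1):
        X.append(series[t:t + look_back])                    # past window of all features
        y.append(series[t + look_back + horizon - 1, 0])     # value `horizon` steps ahead
    return np.stack(X), np.array(y)

hourly = np.random.default_rng(0).normal(size=(1000, 5))     # stand-in for hourly data
X, y = make_windows(hourly, look_back=24, horizon=1)         # one-day window, 1 h ahead
print(X.shape, y.shape)   # (976, 24, 5) (976,)
```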

Keywords: Air quality prediction, deep learning algorithms, time series forecasting, look-back window.

279 Estimation of the Drought Index Based on the Climatic Projections of Precipitation of the Uruguay River Basin

Authors: José Leandro Melgar Néris, Claudinéia Brazil, Luciane Teresa Salvi, Isabel Cristina Damin

Abstract:

The impact of climate change is not recent; the main variable in the hydrological cycle is the sequence and shortage of precipitation that defines a drought, which has a significant impact on the socioeconomic, agricultural, and environmental spheres. This study aims to characterize and quantify, based on climatic projections of precipitation, the rainy and dry events in the region of the Uruguay River Basin, through the Standardized Precipitation Index (SPI). The database comes from the Coupled Model Intercomparison Project, Phase 5 (CMIP5), which provides climate prediction models organized according to the Representative Concentration Pathways (RCPs). Compared to the climatological normal of the Uruguay River Basin, the precipitation projections show seasonal precipitation increases for all proposed scenarios, with a low climatic trend. Based on the data of this research, the intention is that this article can be used to support further research and that the responsible bodies can use it to inform mitigation measures in other hydrographic basins.
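For context, here is a minimal sketch of a Standardized Precipitation Index computation: fit a gamma distribution to accumulated precipitation and map the cumulative probabilities to a standard normal variate. Zero-precipitation handling, accumulation over multi-month windows, and the per-calendar-month fitting used in practice are omitted, and the monthly totals below are synthetic.

```python
import numpy as np
from scipy import stats

def spi(precip_totals):
    """Simplified SPI: gamma fit (location fixed at 0) followed by a normal-quantile transform."""
    shape, loc, scale = stats.gamma.fit(precip_totals, floc=0)
    cumulative_prob = stats.gamma.cdf(precip_totals, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cumulative_prob)

rng = np.random.default_rng(0)
monthly_precip = rng.gamma(shape=2.0, scale=60.0, size=360)   # 30 years of monthly totals, mm
index = spi(monthly_precip)
print("fraction of months with SPI <= -1 (moderate drought or worse):",
      np.mean(index <= -1.0))
```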

Keywords: Drought index, climatic projections, precipitation of the Uruguay River Basin, Standardized Precipitation Index.

278 Economic Loss due to Ganoderma Disease in Oil Palm

Authors: K. Assis, K. P. Chong, A. S. Idris, C. M. Ho

Abstract:

Oil palm, or Elaeis guineensis, is considered the golden crop in Malaysia. However, the oil palm industry in this country is now facing its most devastating disease, Ganoderma basal stem rot. The objective of this paper is to analyze the economic loss due to this disease. Three commercial oil palm sites were selected for collecting the data required for the economic analysis. The yield parameter used to measure the loss was the total weight of fresh fruit bunches over six months. The predictors include disease severity, change in disease severity, number of infected neighboring palms, age of palm, planting generation, topography, and first-order interaction variables. The yield loss estimation model was identified by using a backward-elimination-based regression method. Diagnostic checking was conducted on the residuals of the best yield loss model. The mean absolute percentage error (MAPE) was used to measure the forecast performance of the model. The best yield loss model was then used to estimate the economic loss by using the current monthly price of fresh fruit bunches at the mill gate.

Keywords: Ganoderma, oil palm, regression model, yield loss, economic loss.

277 The Accuracy of the Flight Derivative Estimates Derived from Flight Data

Authors: Jung-hoon Lee, Eung Tai Kim, Byung-hee Chang, In-hee Hwang, Dae-sung Lee

Abstract:

The accuracy of the stability and control derivatives of a light aircraft estimated from flight test data was evaluated. The light aircraft, named ChangGong-91, is the first aircraft certified by the Korean government. The output error method, a maximum likelihood estimation technique that considers measurement noise only, was used to analyze the measured aircraft responses. Multi-step control inputs were applied in order to excite the short-period mode for the longitudinal motion and the Dutch-roll mode for the lateral-directional motion. The estimated stability/control derivatives of the ChangGong-91 were analyzed for the assessment of handling qualities, comparing them with those of similar aircraft. The accuracy of the flight derivative estimates derived from the flight test measurements was examined in terms of engineering judgment, scatter, and the Cramer-Rao bound, and turned out to be satisfactory with minor defects.

Keywords: Light Aircraft, Flight Test, Accuracy, Engineering Judgment, Scatter, Cramer-Rao Bound

276 System Security Impact on the Dynamic Characteristics of Measurement Sensors in Smart Grids

Authors: Yiyang Su, Jörg Neumann, Jan Wetzlich, Florian Thiel

Abstract:

Smart grid is a term used to describe the next-generation power grid. New challenges, such as the integration of renewable and decentralized energy sources, the requirement for continuous grid estimation and optimization, as well as the use of two-way flows of energy, have been brought to the power grid. In order to achieve efficient, reliable, sustainable, and secure delivery of electric power, more and more information and communication technologies are used for the monitoring and control of power grids. Consequently, the need for cybersecurity has increased dramatically and has converged into several standards, which will be presented here. These standards for the smart grid must be designed to satisfy both performance and reliability requirements. An in-depth investigation of the effect of retrospectively embedded security in existing grids on their dynamic behavior is required. Therefore, a retrofitting plan for existing meters is offered, and its performance in a test low-voltage microgrid is investigated. As a result, the integration of security measures into the measurement architectures of smart grids at the design phase is strongly recommended.

Keywords: Cyber security, performance, protocols, security standards, smart grid.

275 The Recreation Technique Model from the Perspective of Environmental Quality Elements

Authors: G. Gradinaru, S. Olteanu

Abstract:

Quality improvements of the environmental elements could increase the recreational opportunities in a certain area (destination). The need-for-recreation technique focuses on the choice of certain destinations for recreational purposes. The basic exchange taken into consideration is the one between the satisfaction gained after staying in that area and the value expressed in the money and time allocated. The number of tourists in the respective area, the duration of their stay, and the money spent, including transportation, provide information on how individuals rank the place or certain aspects of the area (such as the quality of the environmental elements). For the statistical analysis of the environmental benefits offered by an area through the need-for-recreation technique, the following stages are suggested: characterization of the reference area based on the statistical variables considered; and estimation of the environmental benefit by comparing the reference area with other similar areas (having the same environmental characteristics) from the perspective of the statistical variables considered. The comparison model in the recreation technique faces a series of difficulties, which refer to the reference area and the correct transformation of time into money.

Keywords: Comparison in recreation technique, the quality of the environmental elements, statistical analysis model.

274 Color Image Segmentation Using SVM Pixel Classification Image

Authors: K. Sakthivel, R. Nallusamy, C. Kavitha

Abstract:

The goal of image segmentation is to cluster pixels into salient image regions. Segmentation can be used for object recognition, occlusion boundary estimation within motion or stereo systems, image compression, image editing, or image database lookup. In this paper, we present a color image segmentation method using support vector machine (SVM) pixel classification. First, the pixel-level color and texture features of the image are extracted and used as input to the SVM classifier. These features are extracted using the homogeneity model and the Gabor filter. With the extracted pixel-level features, the SVM classifier is trained by using FCM (Fuzzy C-Means). The image segmentation takes advantage of both the pixel-level information of the image and the ability of the SVM classifier. The experiments show that the proposed method gives a very good segmentation result with better efficiency, and increases the quality of the image segmentation compared with the other segmentation methods proposed in the literature.

Keywords: Image Segmentation, Support Vector Machine, Fuzzy C–Means, Pixel Feature, Texture Feature, Homogeneity model, Gabor Filter.

273 Absorbed Dose Estimation of 177Lu-DOTATOC in Adenocarcinoma Breast Cancer Bearing Mice

Authors: S. Zolghadri, M. Mousavi-Daramoroudi, H. Yousefnia, F. Abbasi-Davani

Abstract:

In this study, the absorbed dose to human organs after injection of 177Lu-DOTATOC was studied based on the biodistribution of the complex in adenocarcinoma breast cancer bearing mice. For this purpose, the biodistribution of the radiolabelled complex was studied and compartmental modeling was applied to calculate the absorbed dose with high precision. As expected, 177Lu-DOTATOC showed a notable specific uptake in the tumor and pancreas, organs with a high level of somatostatin receptors on their surface, and the effectiveness of the radio-conjugate for targeting breast adenocarcinoma tumors was indicated. The results of the modeling were exponential equations, which are utilized for obtaining the cumulated activity data by taking their integral. The results also showed that the non-target absorbed doses, such as those to the liver, spleen and pancreas, were approximately 0.008, 0.004, and 0.039, respectively. Since these values are much lower than the target (tumor) absorbed dose, it seems that, owing to this low toxicity, this complex is a good agent for therapy.
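The following is a minimal sketch of the step the abstract describes: fit a mono-exponential to an organ's time-activity data and integrate it analytically to obtain the cumulated activity. The time points and uptake values are synthetic placeholders rather than the paper's biodistribution measurements, and the full compartmental model is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exponential(t, a0, decay_rate):
    return a0 * np.exp(-decay_rate * t)

time_h = np.array([1.0, 4.0, 24.0, 48.0, 96.0])   # hours post-injection
uptake = np.array([5.1, 4.6, 2.9, 1.8, 0.7])      # % injected activity in the organ

params, _ = curve_fit(mono_exponential, time_h, uptake, p0=[5.0, 0.02])
a0, decay_rate = params
cumulated_activity = a0 / decay_rate               # integral of a0*exp(-k*t) from 0 to infinity
print(f"fitted A0 = {a0:.2f}, k = {decay_rate:.4f} 1/h, cumulated activity ~ {cumulated_activity:.1f}")
```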

Keywords: Breast cancer, compartmental modeling, 177Lu, dosimetry.
