Search results for: guaranteed estimates
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 310

280 Consumption Insurance against the Chronic Illness: Evidence from Thailand

Authors: Yuthapoom Thanakijborisut

Abstract:

This paper studies consumption insurance against chronic illness in Thailand. The study estimates the impact of chronic illness within the household on consumption growth. Chronic illness imposes long-term health care and treatment costs on a person or household, which affects the household's ability to smooth consumption. Chronic illness is measured through health status, i.e. whether at least one member of the household faces a chronic illness. The data come from the Household Socio-Economic Panel Survey conducted between 2007 and 2012, which collected data from approximately 6,000 households in every province of Thailand, both inside and outside municipal areas. The study estimates the change in household consumption using an ordinary least squares (OLS) regression model. The result shows that households with a member facing chronic illness reduce consumption by around 4%. This indicates that consumption insurance in Thailand is reasonably sufficient against chronic illness.
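
The core estimation step described here is a regression of consumption growth on a chronic-illness indicator. A minimal sketch of that step, using simulated data and hypothetical variable names rather than the survey's actual variables:

```python
# Minimal OLS sketch (not the authors' code); data and variable names are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 6000
chronic = rng.integers(0, 2, size=n)           # 1 if any member has a chronic illness
income_growth = rng.normal(0.02, 0.05, n)      # placeholder control variable
# Simulated outcome: consumption growth with an illustrative -4% illness effect
cons_growth = 0.03 - 0.04 * chronic + 0.5 * income_growth + rng.normal(0, 0.05, n)

X = sm.add_constant(np.column_stack([chronic, income_growth]))
fit = sm.OLS(cons_growth, X).fit()
print(fit.params)   # coefficient on `chronic` approximates the consumption drop
```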

Keywords: Consumption insurance, chronic illness, health care, Thailand.

279 Kernel Matching versus Inverse Probability Weighting: A Comparative Study

Authors: Andy Handouyahia, Tony Haddad, Frank Eaton

Abstract:

Recent quasi-experimental evaluation of the Canadian Active Labour Market Policies (ALMP) by Human Resources and Skills Development Canada (HRSDC) has provided an opportunity to examine alternative methods of estimating the incremental effects of Employment Benefits and Support Measures (EBSMs) on program participants. The focus of this paper is to assess the efficiency and robustness of inverse probability weighting (IPW) relative to kernel matching (KM) in the estimation of program effects. To accomplish this objective, the authors compare 1,080 pairs of estimates, along with their associated standard errors, to assess which type of estimate is generally more efficient and robust. In the interest of practicality, the authors also document the computational time it took to produce the IPW and KM estimates, respectively.
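
Both estimators rest on an estimated propensity score. The sketch below, on simulated data and with an assumed Gaussian kernel and bandwidth, illustrates the two approaches being compared; it is not the HRSDC implementation:

```python
# Illustrative IPW vs. kernel matching sketch on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=(n, 3))                                  # observed covariates
p_true = 1 / (1 + np.exp(-x @ np.array([0.5, -0.3, 0.2])))
d = rng.binomial(1, p_true)                                  # program participation
y = 1.0 * d + x @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=n)   # true effect = 1

ps = LogisticRegression(max_iter=1000).fit(x, d).predict_proba(x)[:, 1]

# Inverse probability weighting estimate of the effect on participants (ATT)
w = ps / (1 - ps)
att_ipw = y[d == 1].mean() - np.average(y[d == 0], weights=w[d == 0])

# Kernel matching: weight each non-participant by a Gaussian kernel on the
# propensity-score distance to each participant, then average (bandwidth assumed).
h = 0.06
k = np.exp(-((ps[d == 1][:, None] - ps[d == 0][None, :]) / h) ** 2 / 2)
y0_hat = (k * y[d == 0]).sum(axis=1) / k.sum(axis=1)
att_km = (y[d == 1] - y0_hat).mean()
print(att_ipw, att_km)   # both should be close to the simulated effect of 1
```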

Keywords: Treatment effect, causal inference, observational studies, Propensity score based matching, Kernel Matching, Inverse Probability Weighting, Estimation methods for incremental effect.

278 Empirical Survey of the Solar System Based on the Fusion of GPS and Image Processing

Authors: S. Divya Gnanarathinam, S. Sundaramurthy

Abstract:

The tremendous increase in the world's population creates an immediate need for energy resources. People everywhere need sustainable, low-cost energy resources. Solar energy is regarded as one of the main energy resources in warm countries. Areas in the west of India, such as Rajasthan and Gujarat, are immensely rich in solar energy resources. This paper deals with the development of a dual axis solar tracker using an Arduino board. Depending on the astronomical estimates of the sun's position derived from GPS data and on the image processing outcomes from the sensors, a methodology is proposed to locate the position of the sun so as to obtain the maximum solar energy. Based on these outcomes, the solar tracking system decides whether to use the image processing outcomes or the astronomical estimates to attain the maximum efficiency of the solar panel. Finally, the experimental values obtained from the solar tracker for both sunny and rainy days are tabulated.

Keywords: Dual axis solar tracker, Arduino board, LDR sensors, global positioning system.

277 A Bayesian Hierarchical 13COBT to Correct Estimates Associated with a Delayed Gastric Emptying

Authors: Leslie J. C. Bluck, Sarah J. Jackson, Georgios Vlasakakis, Adrian Mander

Abstract:

The use of a Bayesian Hierarchical Model (BHM) to interpret breath measurements obtained during a 13C Octanoic Breath Test (13COBT) is demonstrated. The statistical analysis was implemented using WinBUGS, a widely used computer package for Bayesian inference. A hierarchical setting was adopted in which poorly defined parameters associated with delayed Gastric Emptying (GE) were able to "borrow" strength from global distributions. This proved to be a sufficient tool for correcting the model failures and data inconsistencies apparent in conventional analyses employing a non-linear least squares (NLS) technique. Direct comparison of two parameters describing gastric emptying (tlag, the lag phase, and t1/2, the half-emptying time) revealed a strong correlation between the two methods. Despite our large dataset (n = 164), Bayesian modeling was fast and provided a successful fit for all subjects. On the contrary, NLS failed to return acceptable estimates in cases where GE was delayed.

Keywords: Bayesian hierarchical analysis, 13COBT, gastric emptying, WinBUGS.

276 The Effects of Detector Spacing on Travel Time Prediction on Freeways

Authors: Piyali Chaudhuri, Peter T. Martin, Aleksandar Z. Stevanovic, Chongkai Zhu

Abstract:

Loop detectors report traffic characteristics in real time and are at the core of the traffic control process. Intuitively, one would expect that as the density of detection increases, so would the quality of estimates derived from detector data. However, as detector deployment increases, the associated operating and maintenance costs increase. Thus, traffic agencies often need to decide where to add new detectors and which detectors should continue receiving maintenance, given their resource constraints. This paper evaluates the effect of detector spacing on freeway travel time estimation. A freeway section (Interstate-15) in the Salt Lake City metropolitan region is examined. The research reveals that travel time accuracy does not necessarily deteriorate with increased detector spacing. Rather, the actual location of detectors has a far greater influence on the quality of travel time estimates. The study presents an innovative computational approach that delivers optimal detector locations through a Genetic Algorithm formulation.
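
A genetic-algorithm search over candidate detector sites can be sketched as below; the candidate sites, fitness function and travel-time data are synthetic placeholders, not the paper's I-15 setup:

```python
# Simplified GA sketch: pick k detector sites minimizing a synthetic travel-time RMSE.
import numpy as np

rng = np.random.default_rng(2)
n_sites, k, pop_size, n_gen = 40, 8, 30, 60
true_tt = rng.uniform(30, 60, size=200)              # synthetic "ground truth" runs
site_error = rng.uniform(0.5, 5.0, size=n_sites)     # error contributed per site

def fitness(individual):
    # Lower is better: RMSE of travel-time estimates built from the chosen sites.
    chosen = np.flatnonzero(individual)
    est = true_tt[:, None] + rng.normal(0, site_error[chosen], size=(true_tt.size, chosen.size))
    return np.sqrt(np.mean((est.mean(axis=1) - true_tt) ** 2))

def random_individual():
    ind = np.zeros(n_sites, dtype=bool)
    ind[rng.choice(n_sites, size=k, replace=False)] = True
    return ind

def repair(ind):
    # Keep exactly k detectors after crossover/mutation.
    on = np.flatnonzero(ind)
    if on.size > k:
        ind[rng.choice(on, size=on.size - k, replace=False)] = False
    while ind.sum() < k:
        ind[rng.choice(np.flatnonzero(~ind))] = True
    return ind

pop = [random_individual() for _ in range(pop_size)]
for _ in range(n_gen):
    scores = np.array([fitness(ind) for ind in pop])
    parents = [pop[i] for i in np.argsort(scores)[: pop_size // 2]]   # truncation selection
    children = []
    while len(children) < pop_size - len(parents):
        a, b = rng.choice(len(parents), size=2, replace=False)
        cut = rng.integers(1, n_sites)
        child = np.concatenate([parents[a][:cut], parents[b][cut:]])  # one-point crossover
        if rng.random() < 0.2:                                        # mutation: flip one site
            child[rng.integers(n_sites)] ^= True
        children.append(repair(child))
    pop = parents + children

best = min(pop, key=fitness)
print("best detector sites:", np.flatnonzero(best))
```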

Keywords: Detector, freeway, genetic algorithm, travel time estimate.

275 The Accuracy of the Flight Derivative Estimates Derived from Flight Data

Authors: Jung-hoon Lee, Eung Tai Kim, Byung-hee Chang, In-hee Hwang, Dae-sung Lee

Abstract:

The accuracy of the stability and control derivatives of a light aircraft estimated from flight test data was evaluated. The light aircraft, named ChangGong-91, is the first certified aircraft from the Korean government. The output error method, which is a maximum likelihood estimation technique that considers measurement noise only, was used to analyze the measured aircraft responses. Multi-step control inputs were applied in order to excite the short-period mode for the longitudinal motion and the Dutch-roll mode for the lateral-directional motion. The estimated stability and control derivatives of the ChangGong-91 were analyzed for the assessment of handling qualities by comparing them with those of similar aircraft. The accuracy of the flight derivative estimates derived from the flight test measurements was examined in terms of engineering judgment, scatter and the Cramer-Rao bound, and turned out to be satisfactory with minor defects.

Keywords: Light Aircraft, Flight Test, Accuracy, Engineering Judgment, Scatter, Cramer-Rao Bound

274 Improving Production Traits for El-Salam and Mandarah Chicken Strains by Crossing II-Estimation of Crossbreeding Effects on Egg Production and Egg Quality Traits

Authors: Ayman E. Taha, Fawzy A. Abd El-Ghany

Abstract:

A crossbreeding experiment was carried out between two Egyptian strains of chickens, namely Mandarah (MM) and El-Salam (SS). The two purebred strains and their reciprocal crosses (MS and SM) were used to estimate the effect of crossing on egg laying and egg quality parameters, direct additive and maternal additive effects, as well as heterosis and direct heterosis percentages for the studied traits. Results revealed that the SM cross recorded the highest significant averages for most egg production traits, including body weight at sexual maturity (BW1), egg numbers at the first 90 days, 42 weeks and 65 weeks of age (EN1, EN2 and EN3, respectively), egg weight at 90 days and 42 weeks of age (EW1 and EW2), egg mass at 90 days, 42 weeks and 65 weeks of age (EM1, EM2 and EM3, respectively), feed conversion ratio to egg production at 90 days, 42 weeks and 65 weeks of age (FCR1, FCR2 and FCR3, respectively), and fertility and commercial hatchability percentages. Moreover, the SM line reached the age of sexual maturity (ASM) and the period to the first ten eggs (Pf10 egg) at an earlier age than the other lines. On the other hand, crossing did not substantially improve egg quality parameters. Estimates and percentages of the direct additive effect (GI) were negative for most of the studied traits, except for EN1, EN2, EN3, FCR3, fertility, and scientific and commercial hatchability percentages, which were positive. Estimates and percentages of maternal heterosis (Gm) were positive for all the studied egg production traits, except for BW2, BW3, ASM, Pf10, FCR1, FCR2, FCR3 and scientific hatchability, which were negative. Positive estimates and percentages of heterosis were also recorded for most egg production and egg quality traits. It was concluded that using the SS strain as a sire line and the MM strain as a dam line results in the best new commercial egg line (SM), which is of great interest to poultry breeders in Egypt.
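
For readers unfamiliar with the crossbreeding components, heterosis is usually expressed as the deviation of the reciprocal-cross average from the midparent average, and the difference between reciprocal crosses is commonly attributed largely to maternal (dam-line) effects. A back-of-the-envelope sketch with invented trait means, not the study's data:

```python
# Invented trait means (e.g. egg number) purely to illustrate the calculations.
purebred_MM = 230.0          # Mandarah purebred (hypothetical value)
purebred_SS = 210.0          # El-Salam purebred (hypothetical value)
cross_SM = 245.0             # SS sire x MM dam (hypothetical value)
cross_MS = 235.0             # MM sire x SS dam (hypothetical value)

midparent = (purebred_MM + purebred_SS) / 2
cross_avg = (cross_SM + cross_MS) / 2
heterosis_pct = 100 * (cross_avg - midparent) / midparent
reciprocal_diff = cross_SM - cross_MS    # largely reflects maternal (dam-line) effects

print(f"heterosis = {heterosis_pct:.1f}%, reciprocal-cross difference = {reciprocal_diff}")
```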

Keywords: Mandarah and El-Salam chickens, crossing, egg production, egg quality, crossbreeding components.

273 A Sequential Approach to Random-Effects Meta-Analysis

Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya

Abstract:

The objective of meta-analysis is to combine results from several independent studies in order to generalize findings and provide an evidence base for decision making. However, recent studies show that the magnitude of effect size estimates reported in many areas of research has changed significantly over time, and this can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring the effect size estimates in meta-analysis, but they are based on statistical theory applicable only to the fixed-effect model (FEM) of meta-analysis. For the random-effects model (REM), the analysis incorporates the heterogeneity variance τ², whose estimation creates complications. In this paper we study the use of a truncated CUSUM-type test with asymptotically valid critical values for sequential monitoring in the REM. Simulation results show that the test does not control the Type I error well and is not recommended. Further work is required to derive an appropriate test in this important area of application.
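
The complication mentioned above is the estimation of the between-study variance τ². A common (non-sequential) choice is the DerSimonian-Laird estimator, sketched below on made-up study results; this is background for the REM, not the paper's CUSUM-type procedure:

```python
# DerSimonian-Laird tau^2 and random-effects pooled estimate on hypothetical studies.
import numpy as np

effects = np.array([0.30, 0.45, 0.12, 0.50, 0.26])    # study effect sizes (invented)
variances = np.array([0.02, 0.05, 0.03, 0.04, 0.01])  # within-study variances (invented)

w = 1 / variances                                     # fixed-effect weights
theta_fe = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - theta_fe) ** 2)             # Cochran's Q
k = effects.size
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (variances + tau2)                         # random-effects weights
theta_re = np.sum(w_re * effects) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"tau^2 = {tau2:.4f}, pooled effect = {theta_re:.3f} (SE {se_re:.3f})")
```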

Keywords: Meta-analysis, random-effects model, sequential testing, temporal changes in effect sizes.

272 Operational Risk – Scenario Analysis

Authors: Milan Rippel, Petr Teply

Abstract:

This paper focuses on operational risk measurement techniques and on economic capital estimation methods. A data sample of operational losses provided by an anonymous Central European bank is analyzed using several approaches. Loss Distribution Approach and scenario analysis method are considered. Custom plausible loss events defined in a particular scenario are merged with the original data sample and their impact on capital estimates and on the financial institution is evaluated. Two main questions are assessed – What is the most appropriate statistical method to measure and model operational loss data distribution? and What is the impact of hypothetical plausible events on the financial institution? The g&h distribution was evaluated to be the most suitable one for operational risk modeling. The method based on the combination of historical loss events modeling and scenario analysis provides reasonable capital estimates and allows for the measurement of the impact of extreme events on banking operations.
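
The Loss Distribution Approach referred to above convolves a frequency distribution with a severity distribution and reads capital off a high quantile of the simulated annual loss. A minimal Monte Carlo sketch, assuming Poisson frequency and lognormal severity purely for simplicity (the paper itself favours the g&h distribution for severities):

```python
# Toy LDA sketch: annual aggregate loss and a 99.9% capital estimate.
import numpy as np

rng = np.random.default_rng(3)
n_years = 50_000           # simulated one-year scenarios
lam = 25                   # expected loss events per year (assumption)
mu, sigma = 10.0, 2.0      # lognormal severity parameters (assumption)

counts = rng.poisson(lam, size=n_years)
annual_loss = np.array([rng.lognormal(mu, sigma, size=c).sum() for c in counts])

expected_loss = annual_loss.mean()
var_999 = np.quantile(annual_loss, 0.999)      # 99.9% quantile of aggregate loss
economic_capital = var_999 - expected_loss     # unexpected loss above the mean
print(f"EL = {expected_loss:,.0f}, VaR99.9 = {var_999:,.0f}, capital = {economic_capital:,.0f}")
```

A plausible scenario event can be merged into the simulated annual losses in the same way the paper merges scenario losses with the historical sample, shifting the tail quantile upwards.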

Keywords: operational risk, scenario analysis, economic capital, loss distribution approach, extreme value theory, stress testing

271 First Cracking Moments of Hybrid Fiber Reinforced Polymer-Steel Reinforced Concrete Beams

Authors: Saruhan Kartal, Ilker Kalkan

Abstract:

The present paper reports the cracking moment estimates of a set of steel-reinforced, Fiber Reinforced Polymer (FRP)-reinforced and hybrid steel-FRP reinforced concrete beams, calculated from different analytical formulations in the codes, together with the experimental cracking load values. A total of three steel-reinforced, four FRP-reinforced, 12 hybrid FRP-steel over-reinforced and five hybrid FRP-steel under-reinforced concrete beam tests were analyzed within the scope of the study. Glass FRP (GFRP) and Basalt FRP (BFRP) bars were used in the beams as FRP bars. In under-reinforced hybrid beams, rupture of the FRP bars preceded crushing of concrete, while concrete crushing preceded FRP rupture in over-reinforced beams. In both types, steel yielding took place long before the FRP rupture and concrete crushing. The cracking moment mainly depends on two quantities, namely the moment of inertia of the section at the initiation of cracking and the flexural tensile strength of concrete, i.e. the modulus of rupture. In the present study, two different definitions of uncracked moment of inertia, i.e. the gross and the uncracked transformed moments of inertia, were adopted. Two analytical equations for the modulus of rupture (ACI 318M and Eurocode 2) were utilized in the calculations as well as the experimental tensile strength of concrete from prismatic specimen tests. The ACI 318M modulus of rupture expression produced cracking moment estimates closer to the experimental cracking moments of FRP-reinforced and hybrid FRP-steel reinforced concrete beams when used in combination with the uncracked transformed moment of inertia, yet the Eurocode 2 modulus of rupture expression gave more accurate cracking moment estimates in steel-reinforced concrete beams. All of the analytical definitions produced analytical values considerably different from the experimental cracking load values of the solely FRP-reinforced concrete beam specimens.
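
The two modulus-of-rupture expressions compared above can be illustrated numerically. The sketch below uses an invented rectangular section and concrete strength, and the gross moment of inertia; the uncracked transformed inertia would instead be computed from the reinforcement layout:

```python
# Rough cracking-moment illustration; section properties and fck are assumptions.
import math

fck = 30.0            # concrete cylinder strength, MPa (assumption)
b, h = 200.0, 300.0   # rectangular section width and depth, mm (assumption)

I_gross = b * h**3 / 12          # gross moment of inertia, mm^4
y_t = h / 2                      # distance from neutral axis to tension face, mm

fr_aci = 0.62 * math.sqrt(fck)           # ACI 318M modulus of rupture, MPa
fr_ec2 = 0.30 * fck ** (2.0 / 3.0)       # Eurocode 2 mean tensile strength, MPa (fck <= 50)

for name, fr in (("ACI 318M", fr_aci), ("Eurocode 2", fr_ec2)):
    M_cr = fr * I_gross / y_t            # cracking moment, N*mm
    print(f"{name}: fr = {fr:.2f} MPa, Mcr = {M_cr / 1e6:.1f} kN*m")
```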

Keywords: Cracking moment, four-point bending, hybrid use of reinforcement, polymer reinforcement.

270 A Self-stabilizing Algorithm for Maximum Popular Matching of Strictly Ordered Preference Lists

Authors: Zhengnan Shi

Abstract:

In this paper, we consider the problem of Popular Matching of strictly ordered preference lists. A Popular Matching is not guaranteed to exist in every network. We propose an ID-based, constant-space, self-stabilizing algorithm that converges to a Maximum Popular Matching, an optimal solution, if one exists. We show that the algorithm stabilizes in O(n^5) moves under any scheduler (daemon).

Keywords: self-stabilization, popular matching, algorithm, distributed computing, fault tolerance

269 Seismic Directionality Effects on In-Structure Response Spectra in Seismic Probabilistic Risk Assessment

Authors: S. Jarernprasert, E. Bazan-Zurita, P. C. Rizzo

Abstract:

Currently, seismic probabilistic risk assessments (SPRA) for nuclear facilities use In-Structure Response Spectra (ISRS) in the calculation of fragilities for systems and components. ISRS are calculated via dynamic analyses of the host building subjected to two orthogonal components of horizontal ground motion. Each component is defined as the median motion in any horizontal direction. Structural engineers apply the components along selected X and Y Cartesian axes, and the ISRS at different locations in the building are also calculated in the X and Y directions. The choice of the directions of X and Y is not specified by the ground motion model with respect to geographic coordinates, and is rather arbitrarily selected by the structural engineer. Normally, X and Y coincide with the "principal" axes of the building, in the understanding that this practice is generally conservative. For SPRA purposes, however, it is desirable to remove any conservatism in the estimates of median ISRS. This paper examines the effects of the direction of horizontal seismic motion on the ISRS of a typical nuclear structure. We also evaluate the variability of ISRS calculated along different horizontal directions. Our results indicate that some central measures of the ISRS provide robust estimates that are practically independent of the selection of the directions of the horizontal Cartesian axes.

Keywords: Seismic, Directionality, In-Structure Response Spectra, Probabilistic Risk Assessment.

268 Estimating Bridge Deterioration for Small Data Sets Using Regression and Markov Models

Authors: Yina F. Muñoz, Alexander Paz, Hanns De La Fuente-Mella, Joaquin V. Fariña, Guilherme M. Sales

Abstract:

The primary approaches for estimating bridge deterioration use Markov-chain models and regression analysis. Traditional Markov models have problems in estimating the required transition probabilities when a small sample size is used. Often, reliable bridge data have not been collected over long periods, so large data sets may not be available. This study presents an important change to the traditional approach by using the Small Data Method to estimate transition probabilities. The results illustrate that the Small Data Method and the traditional approach both provide similar estimates; however, the former provides results that are more conservative. That is, the Small Data Method provided slightly lower bridge condition ratings than the traditional approach. Considering that bridges are critical infrastructure, the Small Data Method, which uses more information and provides more conservative estimates, may be more appropriate when the available sample size is small. In addition, regression analysis was used to calculate bridge deterioration. Condition ratings were determined for bridge groups, and the best regression model was selected for each group. The results obtained were very similar to those obtained when using Markov chains; however, it is desirable to use more data for better results.
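
Once transition probabilities are in hand (however estimated), the Markov side of the approach is just repeated matrix multiplication. A minimal sketch with an invented four-state transition matrix:

```python
# Propagate a bridge condition-state distribution with an assumed transition matrix.
import numpy as np

# Hypothetical annual transition probabilities between condition states 9..6
# (each row: probability of staying vs. dropping one state).
P = np.array([
    [0.90, 0.10, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],   # lowest state is absorbing in this sketch
])
states = np.array([9, 8, 7, 6])
dist = np.array([1.0, 0.0, 0.0, 0.0])     # a new bridge starts in the best state

for year in (5, 10, 20):
    d = dist @ np.linalg.matrix_power(P, year)
    print(f"year {year}: expected condition rating = {d @ states:.2f}")
```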

Keywords: Concrete bridges, deterioration, Markov chains, probability matrix.

267 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average of order one (BINARMA(1,1)) process with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively by using some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is also proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step ahead forecasts.
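
The binomial thinning operator on which this model class is built is easy to illustrate. The sketch below simulates a simple univariate INAR(1) process with Poisson innovations, not the full bivariate BINARMA(1,1) model of the paper:

```python
# Binomial thinning illustration via a univariate INAR(1) process.
import numpy as np

rng = np.random.default_rng(4)

def thin(alpha, x):
    """Binomial thinning: alpha o x is a Binomial(x, alpha) draw."""
    return rng.binomial(x, alpha)

alpha, lam, T = 0.5, 3.0, 500       # survival probability, innovation mean, series length
x = np.zeros(T, dtype=int)
for t in range(1, T):
    x[t] = thin(alpha, x[t - 1]) + rng.poisson(lam)   # X_t = alpha o X_{t-1} + eps_t

print("sample mean:", x.mean(), "theoretical stationary mean:", lam / (1 - alpha))
```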

Keywords: Non-stationary, BINARMA(1, 1) model, Poisson Innovations, CML

266 Development of a Software about Calculating the Production Parameters in Knitted Garment Plants

Authors: Ender Bulgun, Arzu Vuruskan

Abstract:

Apparel product development is an important stage in the life cycle of a product, and shortening this stage helps to reduce the cost of a garment. The aim of this study is to examine the production parameters in knitwear apparel companies by defining the unit costs, and to develop software that calculates the unit costs of garments and produces cost estimates. In this study, with the help of a questionnaire, the unit cost estimating and cost calculation systems of different companies were analyzed. Within the scope of the questionnaire, the importance of the cost estimating process for apparel companies and the expectations from a new cost estimating program were investigated. According to the results of the questionnaire, the majority of the companies that participated use manual cost calculating methods or simple Microsoft Excel spreadsheets to make cost estimates. Furthermore, many companies have difficulty archiving cost data for future use; as a solution to that problem, the sub-units of garment cost, namely fabric, accessory and labor costs, should be analyzed and added to the database of the program before a cost estimate is made. Another feature of the cost estimating software prepared in this study is that it consists of two main units, one of which handles the product specification and the other the cost calculation. The program is prepared as a web-based application so that the supplier, the manufacturer and the customer can communicate through the same platform.

Keywords: Apparel, cost estimating, design archive.

265 Trimmed Mean as an Adaptive Robust Estimator of a Location Parameter for Weibull Distribution

Authors: Carolina B. Baguio

Abstract:

One of the purposes of robust estimation is to reduce the influence of outliers in the data on the estimates. Outliers arise from gross errors or contamination from distributions with long tails. The trimmed mean is a robust estimate; that is, it is not sensitive to violations of the distributional assumptions of the data. It is called an adaptive estimate when the trimming proportion is determined from the data rather than being fixed a priori. The main objective of this study is to find out the robustness properties of adaptive trimmed means in terms of efficiency, high breakdown point and influence function. Specifically, it seeks to find the magnitude of the trimming proportion of the adaptive trimmed mean which will yield efficient and robust estimates of the parameter for data that follow a modified Weibull distribution with parameter λ = 1/2, where the trimming proportion is determined by a ratio of two trimmed means defined as the tail length. Secondly, the asymptotic properties of the tail length and the trimmed means are investigated. Finally, a comparison is made of the efficiency of the adaptive trimmed means, in terms of the standard deviation, for data-determined trimming proportions and for trimming proportions fixed a priori. The asymptotic tail lengths, defined as the ratio of two trimmed means, and the asymptotic variances were computed using the formulas derived. The values of the standard deviations of the derived tail lengths, for data of size 40 simulated from a Weibull distribution, were computed over 100 iterations using a computer program written in Pascal. The findings of the study revealed that the tail lengths of the Weibull distribution increase in magnitude as the trimming proportions increase; the measure of the tail length and the adaptive trimmed mean are asymptotically independent as the number of observations n approaches infinity; the tail length is asymptotically distributed as the ratio of two independent normal random variables; and the asymptotic variances decrease as the trimming proportions increase. The simulation study revealed empirically that the standard error of the adaptive trimmed mean using the ratio of tail lengths is relatively smaller, for different values of the trimming proportions, than its counterpart when the trimming proportions are fixed a priori.
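
The estimator itself is straightforward to sketch: compute a tail-length ratio from two trimmed means and let it choose the trimming proportion. The ratio definition and cut-off values below are illustrative assumptions, not the paper's exact rule:

```python
# Adaptive trimmed mean sketch on a heavy-tailed Weibull sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.weibull(0.5, size=40)                  # Weibull sample with shape 1/2

# Tail length measured as a ratio of two trimmed means (illustrative definition).
tail_length = stats.trim_mean(data, 0.05) / stats.trim_mean(data, 0.25)

# Adapt the trimming proportion to the observed tail heaviness (assumed cut-offs).
if tail_length > 2.0:
    trim = 0.25
elif tail_length > 1.5:
    trim = 0.15
else:
    trim = 0.05

estimate = stats.trim_mean(data, trim)
print(f"tail length = {tail_length:.2f}, trim = {trim}, trimmed mean = {estimate:.3f}")
```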

Keywords: Adaptive robust estimate, asymptotic efficiency, breakdown point, influence function, L-estimates, location parameter, tail length, Weibull distribution.

264 Feedback Stabilization Based on Observer and Guaranteed Cost Control for Lipschitz Nonlinear Systems

Authors: A. Thabet, G. B. H. Frej, M. Boutayeb

Abstract:

This paper presents the design of an observer-based dynamic feedback control for a class of large-scale Lipschitz nonlinear systems. The Differential Mean Value Theorem (DMVT) is used to introduce a general condition on the nonlinear functions. To ensure asymptotic stability, sufficient conditions are expressed in terms of linear matrix inequalities (LMIs). High performance is demonstrated through a real-time implementation on an Arduino Duemilanove board applied to a one-link flexible-joint robot.
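
LMI conditions of this kind are typically checked with a semidefinite programming solver. The sketch below solves a plain Lyapunov LMI for a hypothetical system matrix using cvxpy; the paper's actual DMVT-based observer conditions carry additional structure:

```python
# Generic Lyapunov LMI feasibility check (illustrative, not the paper's conditions).
import numpy as np
import cvxpy as cp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])          # stable test matrix (assumption)
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),                    # P positive definite
               A.T @ P + P @ A << -eps * np.eye(n)]     # Lyapunov decrease condition
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print("LMI feasible:", prob.status, "\nP =\n", P.value)
```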

Keywords: Feedback stabilization, DMVT, Lipschitz nonlinear systems, nonlinear observer, real time implementation.

263 Complex Wavelet Transform Based Image Denoising and Zooming Under the LMMSE Framework

Authors: T. P. Athira, Gibin Chacko George

Abstract:

This paper proposes a dual-tree complex wavelet transform (DT-CWT) based directional interpolation scheme for noisy images. The problems of denoising and interpolation are modelled as estimating the noiseless and missing samples under the same optimal estimation framework. Initially, the DT-CWT is used to decompose an input low-resolution noisy image into low- and high-frequency subbands. The high-frequency subband images are interpolated by linear minimum mean square error (LMMSE) estimation based interpolation, which preserves the edges of the interpolated images. For each noisy LR image sample, we compute multiple estimates of it along different directions and then fuse those directional estimates for a more accurate denoised LR image. The estimation parameters calculated in the denoising processing can be readily used to interpolate the missing samples. The inverse DT-CWT is applied to the denoised input and interpolated high-frequency subband images to obtain the high-resolution image. Compared with conventional schemes that perform denoising and interpolation in tandem, the proposed DT-CWT based noisy image interpolation method can reduce many noise-caused interpolation artifacts and preserve the image edge structures well. The visual and quantitative results show that the proposed technique outperforms many of the existing denoising and interpolation methods.
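
The core of the LMMSE step, in scalar form, is a shrinkage of each noisy sample toward a local mean in proportion to the estimated signal-to-noise ratio. The sketch below shows that generic step on a 1-D signal; it is not the paper's full directional, wavelet-domain scheme:

```python
# Scalar LMMSE shrinkage: x_hat = mu + s^2 / (s^2 + n^2) * (y - mu).
import numpy as np

def lmmse(y, local_mean, signal_var, noise_var):
    gain = signal_var / (signal_var + noise_var)
    return local_mean + gain * (y - local_mean)

rng = np.random.default_rng(6)
x = np.sin(np.linspace(0, 4 * np.pi, 256))          # clean "subband" signal
noise_var = 0.1
y = x + rng.normal(0, np.sqrt(noise_var), x.size)   # noisy observation

mu = np.convolve(y, np.ones(9) / 9, mode="same")    # local mean from a moving window
signal_var = np.maximum(np.convolve((y - mu) ** 2, np.ones(9) / 9, mode="same") - noise_var, 0)
x_hat = lmmse(y, mu, signal_var, noise_var)
print("noisy MSE:", np.mean((y - x) ** 2), "LMMSE MSE:", np.mean((x_hat - x) ** 2))
```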

Keywords: Dual-tree complex wavelet transform (DT-CWT), denoising, interpolation, optimal estimation, super resolution.

262 Allometric Models for Biomass Estimation in Savanna Woodland Area, Niger State, Nigeria

Authors: Abdullahi Jibrin, Aishetu Abdulkadir

Abstract:

The development of allometric models is crucial to accurate forest biomass/carbon stock assessment. The aim of this study was to develop a set of biomass prediction models that enable the determination of total tree aboveground biomass for a savannah woodland area in Niger State, Nigeria. Based on data collected through biometric measurements of 1,816 trees and destructive sampling of 36 trees, five species-specific models and one site-specific model were developed. The sample size was distributed equally between the five most dominant species in the study site (Vitellaria paradoxa, Irvingia gabonensis, Parkia biglobosa, Anogeissus leiocarpus, Pterocarpus erinaceous). Firstly, equations were developed for the five individual species. Secondly, these five species were pooled to develop an allometric equation for mixed species. Overall, there was a strong positive relationship between total tree biomass and stem diameter. Coefficients of determination (R² values ranging from 0.93 to 0.99, P < 0.001) were realised for the models, with considerably low standard errors of the estimates (SEE), which confirms that total tree aboveground biomass has a significant relationship with the diameter at breast height (dbh). F-test values for the biomass prediction models were also significant at P < 0.001, which indicates that the biomass prediction models are valid. This study recommends that, for improved biomass estimates in the study site, the site-specific biomass models should preferably be used instead of generic models.
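
Allometric models of this kind are commonly fitted as a power law B = a * dbh^b by regressing ln(B) on ln(dbh). A minimal sketch on simulated measurements, not the study's field data:

```python
# Log-log allometric regression sketch with synthetic tree measurements.
import numpy as np

rng = np.random.default_rng(7)
dbh = rng.uniform(5, 60, size=36)                                    # stem diameters, cm
biomass = 0.08 * dbh ** 2.4 * np.exp(rng.normal(0, 0.1, dbh.size))   # kg, synthetic

b_slope, ln_a = np.polyfit(np.log(dbh), np.log(biomass), 1)
pred = np.exp(ln_a) * dbh ** b_slope
ss_res = np.sum((np.log(biomass) - np.log(pred)) ** 2)
ss_tot = np.sum((np.log(biomass) - np.log(biomass).mean()) ** 2)
print(f"B = {np.exp(ln_a):.3f} * dbh^{b_slope:.2f},  R^2 = {1 - ss_res / ss_tot:.3f}")
```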

Keywords: Allometry, biomass, carbon stock, model, regression equation, woodland, inventory.

261 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test

Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman

Abstract:

At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study considering six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than when the LP3 distribution is used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.

Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.

260 Security of Mobile Agent in Ad hoc Network using Threshold Cryptography

Authors: S.M. Sarwarul Islam Rizvi, Zinat Sultana, Bo Sun, Md. Washiqul Islam

Abstract:

In its simplest form, a Mobile Agent is an independent piece of code with mobility and autonomous behavior. One of the main advantages of using a Mobile Agent in a network is that it reduces the network traffic load. In an ad hoc network, a Mobile Agent can be used to protect the network by means of agent-based IDS or IPS. A Mobile Agent can also be useful for deploying dynamic software in the network or retrieving information from network nodes. But in an ad hoc network the Mobile Agent itself needs some security. Security services should be guaranteed both for the Mobile Agent and for the Agent Server. In this paper, to protect the Mobile Agent and the Agent Server in an ad hoc network, we propose a solution based on Threshold Cryptography, an approach in which trust is distributed among multiple nodes in the network.
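
The threshold idea can be illustrated with the textbook Shamir secret-sharing scheme over a prime field: any t shares reconstruct the secret, while fewer reveal nothing. This is a generic illustration of threshold cryptography, not the paper's full agent-protection protocol:

```python
# Shamir (t, n) secret sharing over GF(P) as a toy threshold-cryptography example.
import random

P = 2**127 - 1          # a large prime modulus

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
print(reconstruct(shares[:3]))   # any 3 of the 5 shares recover 123456789
```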

Keywords: Ad hoc network, Mobile Agent, Security, Threats, Threshold Cryptography.

259 Motion Planning and Posture Control of the General 3-Trailer System

Authors: K. Raghuwaiya, B. Sharma, J. Vanualailai

Abstract:

This paper presents a set of artificial potential field functions that improves upon, in general, the motion planning and posture control, with theoretically guaranteed point and posture stabilities, convergence and collision avoidance properties, of the general 3-trailer system in an a priori known environment. We design and introduce two new concepts, ghost walls and the distance optimization technique (DOT), to strengthen point and posture stabilities, in the sense of Lyapunov, of our dynamical model. This new combination of techniques emerges as a convenient mechanism for obtaining feasible orientations at the target positions, with an overall reduction in the complexity of the navigation laws. Simulations are provided to demonstrate the effectiveness of the control laws.
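
The underlying idea of artificial potential field navigation is easiest to see for a single point robot: descend the gradient of an attractive goal potential plus a repulsive obstacle potential. The sketch below uses assumed gains and a capped step; the 3-trailer kinematics, ghost walls and DOT of the paper are not reproduced here:

```python
# Point-robot potential field sketch (attractive + repulsive potentials).
import numpy as np

goal = np.array([10.0, 10.0])
obstacle = np.array([5.0, 5.5])
k_att, k_rep, rho0 = 1.0, 50.0, 2.0      # gains and repulsion radius (assumptions)

def gradient(q):
    grad = k_att * (q - goal)                        # attractive part
    d = np.linalg.norm(q - obstacle)
    if d < rho0:                                     # repulsive part, active near obstacle
        grad += k_rep * (1.0 / rho0 - 1.0 / d) * (q - obstacle) / d**3
    return grad

q = np.array([0.0, 0.0])
for _ in range(3000):
    g = gradient(q)
    q = q - 0.05 * g / max(np.linalg.norm(g), 1.0)   # capped gradient-descent step
    if np.linalg.norm(q - goal) < 0.1:
        break
print("final position:", np.round(q, 2))
```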

Keywords: Artificial potential fields, 3-trailer systems, motion planning, posture.

258 Adaptive Impedance Control for Unknown Non-Flat Environment

Authors: Norsinnira Zainul Azlan, Hiroshi Yamaura

Abstract:

This paper presents a new adaptive impedance control strategy, based on the Function Approximation Technique (FAT), to compensate for an unknown non-flat environment shape or a time-varying environment location. The target impedance in the force-controllable direction is modified by incorporating adaptive compensators, and the uncertainties are represented by FAT, allowing the update law to be derived easily. The force error feedback is utilized in the estimation, and accurate knowledge of the environment parameters is not required by the algorithm. It is shown mathematically that the stability of the controller is guaranteed based on Lyapunov theory. Simulation results are presented to demonstrate the validity of the proposed controller.

Keywords: Adaptive impedance control, Function Approximation Technique (FAT), impedance control, unknown environment position.

257 Quality of Service in Multioperator GPON Access Networks with Triple-Play Services

Authors: Germán Santos-Boada, Jordi Domingo-Pascual

Abstract:

Recently, in some places, optical-fibre access networks have been used with GPON technology belonging to organizations (in most cases public bodies) that act as neutral operators. These operators simultaneously provide network services to various telecommunications operators that offer integrated voice, data and television services. This situation creates new problems related to quality of service, since the interests of the users are intermingled with the interests of the operators. In this paper, we analyse this problem and consider solutions that make it possible to provide guaranteed quality of service for voice over IP, data services and interactive digital television.

Keywords: GPON networks, multioperator, quality of service, triple-play services.

256 Likelihood Estimation for Stochastic Epidemics with Heterogeneous Mixing Populations

Authors: Yilun Shang

Abstract:

We consider a heterogeneously mixing SIR stochastic epidemic process in populations described by a general graph. Likelihood theory is developed to facilitate statistical inference for the parameters of the model under complete observation. We show that these estimators are asymptotically unbiased and Gaussian by using a martingale central limit theorem.

Keywords: Statistical inference, maximum likelihood, epidemic model, heterogeneous mixing.

255 Takagi-Sugeno Fuzzy Control of Induction Motor

Authors: Allouche Moez, Souissi Mansour, Chaabane Mohamed, Mehdi Driss

Abstract:

This paper deals with the synthesis of a fuzzy state feedback controller for an induction motor with optimal performance. First, the Takagi-Sugeno (T-S) fuzzy model is employed to approximate the nonlinear system in the field-oriented synchronous d-q frame. Next, a fuzzy controller is designed to stabilise the induction motor and guarantee a minimum disturbance attenuation level for the closed-loop system. The gains of the fuzzy controller are obtained by solving a set of Linear Matrix Inequalities (LMIs). Finally, simulation results are given to demonstrate the controller's effectiveness.

Keywords: Disturbance rejection, fuzzy modelling, open-loop control, fuzzy feedback controller, fuzzy observer, Linear Matrix Inequality (LMI).

254 Adaptive Neural Network Control of Autonomous Underwater Vehicles

Authors: Ahmad Forouzantabar, Babak Gholami, Mohammad Azadi

Abstract:

An adaptive neural network controller for autonomous underwater vehicles (AUVs) is presented in this paper. The AUV model is highly nonlinear because of many factors, such as hydrodynamic drag, damping and lift forces, Coriolis and centripetal forces, gravity and buoyancy forces, as well as thruster forces. In this regard, a nonlinear neural network is used to approximate the nonlinear uncertainties of the AUV dynamics, thus overcoming some limitations of conventional controllers and ensuring good performance. The uniform ultimate boundedness of the AUV tracking errors and the stability of the proposed control system are guaranteed based on Lyapunov theory. Numerical simulation studies of the motion control of an AUV are performed to demonstrate the effectiveness of the proposed controller.

Keywords: Autonomous Underwater Vehicle (AUV), Neural Network Controller, Composite Adaptation.

253 Computable Function Representations Using Effective Chebyshev Polynomial

Authors: Mohammed A. Abutheraa, David Lester

Abstract:

We show that Chebyshev Polynomials are a practical representation of computable functions on the computable reals. The paper presents error estimates for common operations and demonstrates that Chebyshev Polynomial methods would be more efficient than Taylor Series methods for evaluation of transcendental functions.
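
As a quick illustration of the efficiency claim for one transcendental function, the sketch below compares a degree-8 Chebyshev fit of exp(x) on [-1, 1] with the degree-8 Taylor polynomial at 0. It uses ordinary floating point rather than the exact computable-real arithmetic of the paper:

```python
# Chebyshev vs. Taylor approximation error for exp(x) on [-1, 1].
import numpy as np
from numpy.polynomial import chebyshev as C
from math import factorial

deg = 8
xs = np.cos(np.pi * (np.arange(deg + 1) + 0.5) / (deg + 1))   # Chebyshev nodes
coef = C.chebfit(xs, np.exp(xs), deg)                          # Chebyshev coefficients

grid = np.linspace(-1, 1, 2001)
cheb_err = np.max(np.abs(C.chebval(grid, coef) - np.exp(grid)))
taylor = sum(grid**k / factorial(k) for k in range(deg + 1))
taylor_err = np.max(np.abs(taylor - np.exp(grid)))
print(f"max error on [-1,1]: Chebyshev {cheb_err:.2e} vs Taylor {taylor_err:.2e}")
```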

Keywords: Approximation Theory, Chebyshev Polynomial, Computable Functions, Computable Real Arithmetic, Integration, Numerical Analysis.

252 Pervasive Differentiated Services: A QoS Model for Pervasive Systems

Authors: Sherif G. Aly

Abstract:

In this article, we introduce a mechanism by which the same concept of differentiated services used in network transmission can be applied to provide quality of service levels to pervasive systems applications. The classical DiffServ model, including marking and classification, assured forwarding, and expedited forwarding, is utilized to create quality of service guarantees for various pervasive applications requiring different levels of quality of service. Through a collection of various sensors, personal devices, and data sources, the transmission of context-sensitive data can automatically occur within a pervasive system at a given quality of service level. Triggers, initiators, sources, and receivers are the four entities labeled in our mechanism. An explanation of the role of each is provided, along with how quality of service is guaranteed.

Keywords: Pervasive systems, quality of service, differentiated services, mobile devices.

251 Constitutional Complaint as an Instrument of Fulfilling the Worker's Rights in the Croatian Legal System

Authors: Dragana Bjelić, Mirela Mezak Stastny

Abstract:

This paper begins with a formal definition of human rights and freedoms; the basic document regarding this subject is undoubtedly the French Declaration of the Rights of Man and of the Citizen of 1789. The paper furthermore examines the legal sources relevant to workers' rights in the legal system of the Republic of Croatia, international agreements and the Labour Act, which is the principal statute regarding workers' rights. The authors also deal with the Constitutional Court of the Republic of Croatia and its position in the judicial system of the Republic of Croatia, as well as with the specifics of the Constitutional Complaint. The crucial part of the paper is based on research conducted with the aim of determining the implementation, by means of the Constitutional Complaint, of the rights and liberties guaranteed by Articles 54 and 55 of the Constitution of the Republic of Croatia.

Keywords: a right to work, a freedom of work, Constitutional Court of Republic of Croatia, Constitutional Complaint.
