Search results for: maximizer of the posterior marginal estimate
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 890

740 A Robust and Adaptive Unscented Kalman Filter for the Air Fine Alignment of the Strapdown Inertial Navigation System/GPS

Authors: Jian Shi, Baoguo Yu, Haonan Jia, Meng Liu, Ping Huang

Abstract:

To adapt to the flexibility of modern warfare, a large number of guided weapons are launched from aircraft, so the inertial navigation system carried by the weapon must undergo an alignment process in the air. This article addresses the problems of inaccurate system modeling under large misalignment angles, the loss of filtering accuracy caused by outliers, and changes in GPS measurement noise, as follows: first, considering the large misalignment errors of the Strapdown Inertial Navigation System (SINS)/GPS, a more accurate model is built instead of making a small-angle approximation, and the Unscented Kalman Filter (UKF) is used to estimate the state; then, to account for the impact of GPS noise changes on the fine alignment algorithm, an innovation-based adaptive filtering algorithm is introduced to estimate the GPS noise in real time; at the same time, to improve the anti-interference ability of the air fine alignment algorithm, a robust filtering algorithm based on outlier detection is combined with it to improve its robustness. Simulation results verify that the algorithm improves alignment accuracy and robustness under interference conditions.
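
To make the adaptive and robust elements concrete, the Python sketch below shows an innovation-based estimate of the measurement-noise covariance together with a chi-square outlier gate, inside a generic linear Kalman measurement update. It is a minimal illustration only: the linear update stands in for the paper's UKF, and the window length, gate probability and matrix shapes are assumptions rather than the authors' implementation.

```python
import numpy as np
from collections import deque
from scipy.stats import chi2

def adaptive_robust_update(x, P, z, H, R_est, innov_buf, window=20, gate_p=0.99):
    """One measurement update with innovation-based R estimation and a
    chi-square outlier gate (illustrative linear-KF stand-in for the UKF)."""
    v = z - H @ x                      # innovation
    S = H @ P @ H.T + R_est            # innovation covariance
    # Outlier gate: skip measurements whose Mahalanobis distance is too large
    d2 = float(v @ np.linalg.inv(S) @ v)
    if d2 > chi2.ppf(gate_p, df=len(z)):
        return x, P, R_est             # reject the update, keep the prior
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ v
    P = (np.eye(len(x)) - K @ H) @ P
    # Innovation-adaptive noise estimate over a sliding window:
    # R ~= mean(v v^T) - H P H^T, clipped to stay positive semi-definite
    innov_buf.append(np.outer(v, v))
    if len(innov_buf) > window:
        innov_buf.popleft()
    C_v = sum(innov_buf) / len(innov_buf)
    R_new = C_v - H @ P @ H.T
    w, V = np.linalg.eigh(0.5 * (R_new + R_new.T))
    R_est = V @ np.diag(np.clip(w, 1e-9, None)) @ V.T
    return x, P, R_est

# Example wiring (illustrative; ukf_predict is a hypothetical prediction step):
# innov_buf = deque(); x, P, R = x0, P0, R0
# for z in gps_measurements:
#     x, P = ukf_predict(x, P)
#     x, P, R = adaptive_robust_update(x, P, z, H, R, innov_buf)
```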

Keywords: Air alignment, fine alignment, inertial navigation system, integrated navigation system, UKF.

739 Optimization of the Characteristic Straight Line Method by a "Best Estimate" of Observed, Normal Orthometric Elevation Differences

Authors: Mahmoud M. S. Albattah

Abstract:

In this paper, to optimize the "Characteristic Straight Line Method" used in soil displacement analysis, a "best estimate" of the geodetic leveling observations has been obtained by taking into account the concept of height systems. This concept, and consequently the concept of "height", is discussed in detail. In landslide dynamic analysis, the soil is considered a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the "Characteristic Straight Line Method", whose characteristic components were constructed from a "best estimate" of the topometric observations. The elevation differences were measured with the most modern leveling equipment available, and observational procedures were designed to provide the most effective way of acquiring data. In addition, systematic errors that cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale conforming to the international length standard; and the concept of height systems is introduced, in which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) are investigated. The "Characteristic Straight Line Method" is slightly more convenient than the "Characteristic Circle Method": it permits the evaluation of displacements of very small, even infinitesimal, magnitude. The inclination of the landslide is given by the inverse of the distance from reference point O to the characteristic straight line, and its direction by the bearing of the normal directed from point O to that line (Fig. 6). A "best estimate" of the topometric observations was used to measure the elevation of carefully selected points before and after the deformation. Gross errors were eliminated by statistical analyses and by comparing heights within local neighborhoods. The results of a test over an area where very interesting land-surface deformation occurs are reported, together with monitoring under different options and a qualitative comparison of results based on a sufficient number of check points.

Keywords: Characteristic straight line method, dynamic height, landslides, orthometric height, systematic errors.

738 Influence of Fibre Content on Crack Propagation Rate in Fibre-Reinforced Concrete Beams

Authors: Amir M. Alani, Morteza Aboutalebi, Martin J. King

Abstract:

An experimental study of the influence of fibre content on crack behaviour and propagation in synthetic-fibre reinforced beams is reported in this paper. The tensile behaviour of metallic fibre concrete is evaluated in terms of residual flexural tensile strength values determined from the load-crack mouth opening displacement curve or load-deflection curve obtained by applying a centre-point load to a simply supported notched prism. The results demonstrate that an increase in fibre content has an almost negligible effect on compressive and tensile splitting properties, causes a marginal increase in flexural tensile strength, and increases the Re3 value.

Keywords: Fibre-Reinforced Concrete, Crack, Flexural Test, Ductility, Fibre Content, Experimental Study.

737 A Renovated Cook's Distance Based on the Buckley-James Estimate in Censored Regression

Authors: Nazrina Aziz, Dong Q. Wang

Abstract:

Various methods based on regression ideas have been created to handle data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some of the others, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's distance, RD*_i, and has been developed from Cook's idea. The renovated Cook's distance RD*_i has advantages (depending on the analyst's demands) over (i) the change in the fitted value for a single case, DFIT*_i, since it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*_i does), and (ii) the change in the coefficient estimate when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*_i in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
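
For orientation, the classical (uncensored, OLS) Cook's distance on which the renovated RD*_i is modelled can be computed as in the short Python sketch below; the Buckley-James fit for censored responses, which the paper substitutes for OLS, is not reproduced here.

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance D_i for an OLS fit y ~ X (intercept column
    already included in X). The renovated RD*_i follows the same idea but
    uses the Buckley-James estimator for censored responses."""
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    H = X @ XtX_inv @ X.T              # hat matrix
    h = np.diag(H)                     # leverages h_ii
    beta = XtX_inv @ X.T @ y
    e = y - X @ beta                   # residuals
    s2 = e @ e / (n - p)               # residual variance estimate
    # D_i = e_i^2 / (p * s^2) * h_ii / (1 - h_ii)^2
    return (e ** 2 / (p * s2)) * h / (1.0 - h) ** 2
```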

Keywords: Buckley-James estimators, censored regression, censored data, diagnostic analysis, product-limit estimator, renovated Cook's Distance.

736 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When biometric technology is used in forensic applications, a Likelihood Ratio (LR) must be computed to quantify the strength of evidence under two competing hypotheses, the prosecution and the defense hypotheses, and this computation involves a set of assumptions and methods for a given data set. It is therefore important to know how repeatable and reproducible the estimated LR is. This paper evaluates the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so that an incorrect estimate is not used to deliver a wrong judgment in a court of law. The LR estimate is fundamentally a Bayesian concept, and two LR estimators are used in this paper, namely Logistic Regression (LoR) and the Kernel Density Estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Although LoR performed better than KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when it is used to compute the strength of handwriting evidence in forensics.
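
A minimal sketch of this kind of LR computation is given below in Python: a kernel density estimate of one-dimensional comparison scores under each hypothesis, with a percentile-bootstrap confidence interval. The score representation, sample sizes and bootstrap settings are illustrative assumptions rather than the authors' pipeline.

```python
import numpy as np
from scipy.stats import gaussian_kde

def lr_kde(score, same_writer_scores, diff_writer_scores):
    """LR = p(score | prosecution) / p(score | defence), with both densities
    estimated by KDE from training comparison scores."""
    num = gaussian_kde(same_writer_scores)(score)[0]
    den = gaussian_kde(diff_writer_scores)(score)[0]
    return num / den

def lr_bootstrap_ci(score, same, diff, n_boot=1000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the estimated LR."""
    rng = np.random.default_rng(seed)
    lrs = []
    for _ in range(n_boot):
        s = rng.choice(same, size=len(same), replace=True)
        d = rng.choice(diff, size=len(diff), replace=True)
        lrs.append(lr_kde(score, s, d))
    return np.quantile(lrs, [alpha / 2, 1 - alpha / 2])
```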

Keywords: Logistic Regression (LoR), Kernel Density Estimator (KDE), Handwriting, Confidence Interval, Repeatability, Reproducibility.

735 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance

Authors: Rajinder Singh, Ram Valluru

Abstract:

The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium- to longer-term liabilities. The relative strengths and weaknesses of the various alternative approaches revolve around stability in the recent loss development pattern, sufficiency and reliability of loss development data, and agreement or disagreement between reported losses to date and the ultimate loss estimate. The CL method produces volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good tradeoff between the loss development approach (CL) and ELR, it generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years with more development experience, and it relies on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, such as the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period). The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend but levels varied substantially depending upon the economic and operational conditions during a development period spanning many years. The proposed approach makes it possible to incorporate such exogenous factors indirectly and produces more stable loss forecasts for reserving purposes than the traditional CL and BF methods.
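
The parametric idea of fitting a sigmoidal growth curve to a cohort's cumulative loss development and taking its asymptote as the ultimate loss can be sketched as follows in Python; the three-parameter logistic form and the sample figures are assumptions chosen for illustration, not the paper's exact specification.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_development(t, ultimate, k, t0):
    """Cumulative losses as a sigmoid of development age t;
    'ultimate' is the asymptote used as the ultimate loss estimate."""
    return ultimate / (1.0 + np.exp(-k * (t - t0)))

# Illustrative cohort: development ages (quarters) and cumulative reported losses
t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
cum_loss = np.array([12, 30, 61, 95, 118, 131, 138, 141], dtype=float)

params, _ = curve_fit(logistic_development, t, cum_loss,
                      p0=[cum_loss[-1] * 1.1, 1.0, t.mean()], maxfev=10000)
ultimate, k, t0 = params
reserve = ultimate - cum_loss[-1]   # indicated reserve = ultimate minus reported to date
print(f"ultimate ~ {ultimate:.1f}, reserve ~ {reserve:.1f}")
```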

Keywords: Actuarial loss reserving techniques, logistic regression, parametric function, volatility.

734 Population Structure of European Pond Turtles, Emys orbicularis (Linnaeus, 1758) in Narta Lagoon (Vlora Bay, Albania)

Authors: Enerit Saçdanaku, Idriz Haxhiu

Abstract:

In this study, the population of the European Pond Turtle, Emys orbicularis (Linnaeus, 1758), was monitored in the area of Narta Lagoon, Vlora Bay (Albania), from August to October 2014. A total of 54 individuals of E. orbicularis were studied using different methodologies. Curved Carapace Length (CCL), Plastron Length (PL) and Curved Carapace Width (CCW) were measured for each individual and statistically analyzed. All captured turtles were separated into seven size classes based on their carapace length (CCL). Each individual was marked by notching the carapace (marginal scutes). Of all individuals captured, 37 were females (68.5%), 14 were males (25.9%) and 3 were juveniles (5.5%), while 18 individuals of E. orbicularis were recaptured for the first time and some for a second time.

Keywords: Emys orbicularis, female, juvenile, male, population, size classes.

733 Targeting the Life Cycle Stages of the Diamond Back Moth (Plutella xylostella) with Three Different Parasitoid Wasps

Authors: F. O. Faithpraise, J. Idung, C. R. Chatwin, R. C. D. Young, P. Birch

Abstract:

A continuous-time model of the interaction between crop insect pests and naturally beneficial pest enemies is created using a set of simultaneous, non-linear, ordinary differential equations incorporating natural death rates based on the Weibull distribution. The crop pest is present in all its life-cycle stages: egg, larva, pupa and adult. The beneficial insects, parasitoid wasps, may parasitize any or all of the egg, larval and pupal stages. Population modelling is used to estimate the quantity of natural pest enemies that should be introduced into the pest-infested environment to suppress the pest population density to an economically acceptable level within a prescribed number of days. The results obtained illustrate the effect of different combinations of parasitoid wasps, using the Pascal distribution to estimate their success in parasitizing different pest developmental stages, in delivering pest control to a sustainable level. Effective control, within a prescribed number of days, is established by the deployment of two or all three species of wasps, which partially destroy the pest egg, larval and pupal stages. The selected scenarios demonstrate effective, sustainable control of the pest in less than thirty days.
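
A stage-structured system of this kind can be sketched as below; constant death, maturation and parasitism rates are used purely for illustration, in place of the paper's Weibull mortality and Pascal-distributed parasitism success.

```python
import numpy as np
from scipy.integrate import solve_ivp

def pest_stages(t, y, b, d, m, a, W):
    """y = [eggs, larvae, pupae, adults]; W = parasitoid densities attacking
    (egg, larva, pupa). Constant rates stand in for the Weibull/Pascal terms."""
    E, L, P, A = y
    dE = b * A - (d[0] + m[0] + a[0] * W[0]) * E
    dL = m[0] * E - (d[1] + m[1] + a[1] * W[1]) * L
    dP = m[1] * L - (d[2] + m[2] + a[2] * W[2]) * P
    dA = m[2] * P - d[3] * A
    return [dE, dL, dP, dA]

b = 2.0                              # adult fecundity (eggs per adult per day)
d = [0.05, 0.05, 0.05, 0.10]         # stage death rates
m = [0.20, 0.15, 0.12]               # maturation rates between stages
a = [0.02, 0.02, 0.02]               # per-wasp parasitism rates
W = [30.0, 30.0, 30.0]               # released wasp densities (the decision variable)

sol = solve_ivp(pest_stages, (0, 30), [100, 50, 20, 10], args=(b, d, m, a, W))
print("pest adults after 30 days:", sol.y[3, -1])
```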

Keywords: Biological control, Diamondback moth, Parasitoid wasps, Population modeling.

732 Gaussian Particle Flow Bernoulli Filter for Single Target Tracking

Authors: Hyeongbok Kim, Lingling Zhao, Xiaohong Su, Junjie Wang

Abstract:

The Bernoulli filter is a precise Bayesian filter for single-target tracking based on random finite set theory. The standard Bernoulli filter often underestimates the number of targets. This study proposes a Gaussian particle flow (GPF) Bernoulli filter that employs particle flow to migrate particles from prior to posterior positions in order to improve the performance of the standard Bernoulli filter. By employing the particle flow filter, the computational speed of the Bernoulli filter is significantly improved; in addition, the GPF Bernoulli filter provides more accurate estimation than the standard Bernoulli filter. Simulation results confirm the improved tracking performance and computational speed in two- and three-dimensional scenarios compared with other algorithms.

Keywords: Bernoulli filter, particle filter, particle flow filter, random finite sets, target tracking.

731 Comparison of Different Techniques to Estimate Surface Soil Moisture

Authors: S. Farid F. Mojtahedi, Ali Khosravi, Behnaz Naeimian, S. Adel A. Hosseini

Abstract:

Land subsidence is a gradual settling or sudden sinking of the land surface caused by changes that take place underground. There are different causes of land subsidence, most notably groundwater overdraft and severe weather conditions. Subsidence of the land surface due to groundwater overdraft is caused by an increase in the intergranular pressure in unconsolidated aquifers, which results in a loss of buoyancy of solid particles in the zone dewatered by the falling water table and, accordingly, compaction of the aquifer. On the other hand, exploitation of underground water may result in significant changes in the degree of saturation of soil layers above the water table, increasing the effective stress in these layers and causing considerable soil settlement. This study focuses on the estimation of soil moisture at the surface using different methods. Specifically, different methods for estimating the moisture content at the soil surface, an important term in solving Richards' equation and estimating the soil moisture profile, are presented, and their results are discussed through comparison with field measurements obtained from the Yanco1 station in south-eastern Australia. Surface soil moisture is not easy to measure at the spatial scale of a catchment: due to the heterogeneity of soil type, land use, and topography, it may change considerably in space and time.

Keywords: Artificial neural network, empirical method, remote sensing, surface soil moisture, unsaturated soil.

730 A Family Cars' Life Cycle Cost (LCC)-Oriented Hybrid Modelling Approach Combining ANN and CBR

Authors: Xiaochuan Chen, Jianguo Yang, Beizhi Li

Abstract:

Design for cost (DFC) is a method that reduces life cycle cost (LCC) from the designer's perspective. The multiple domain features mapping (MDFM) methodology is used in DFC: with MDFM, design features can be used to estimate the LCC. From the DFC viewpoint, the design features of family cars were obtained, such as overall dimensions, engine power and emission volume. At the conceptual design stage, the cars' LCC was estimated using the back-propagation (BP) artificial neural network (ANN) method and case-based reasoning (CBR). Hamming space was used to measure the similarity among cases in the CBR method, while the Levenberg-Marquardt (LM) algorithm and a genetic algorithm (GA) were used in the ANN. The differences between the CBR and ANN LCC estimation models are discussed. Each method has its shortcomings when used separately, and improved accuracy was obtained by combining ANN and CBR. Firstly, the ANN is used to select the design features that affect LCC. Secondly, the LCC estimation results of the ANN are used to raise the accuracy of the LCC estimation in the CBR method. Thirdly, the ANN is used to estimate LCC errors and to correct the errors in the CBR estimation results when the accuracy is insufficient. Finally, economy family cars and a sport utility vehicle (SUV) are given as LCC estimation cases for this hybrid approach combining ANN and CBR.
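
The CBR half of the hybrid, retrieving the most similar past cases by Hamming distance over binary-coded design features and reusing their LCC, might look like the Python sketch below; the feature coding and LCC figures are hypothetical, and the ANN feature-selection and error-correction steps are not reproduced.

```python
import numpy as np

def hamming_retrieve(query, case_features, case_lcc, k=3):
    """Retrieve the k nearest cases in Hamming distance over binary-coded
    design features and return their mean LCC as the estimate."""
    dist = np.sum(case_features != query, axis=1)   # Hamming distance per case
    nearest = np.argsort(dist)[:k]
    return case_lcc[nearest].mean(), nearest

# Illustrative case base: rows = cars, columns = binarized design features
cases = np.array([[1, 0, 1, 1, 0],
                  [1, 1, 0, 1, 0],
                  [0, 1, 1, 0, 1]])
lcc = np.array([21000.0, 24500.0, 31000.0])          # hypothetical LCC values
estimate, idx = hamming_retrieve(np.array([1, 0, 0, 1, 0]), cases, lcc, k=2)
print(f"estimated LCC ~ {estimate:.0f} from cases {idx}")
```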

Keywords: Case-based reasoning, life cycle cost (LCC), artificial neural networks (ANN), family cars.

729 Characterization and Development of Anthropomorphic Phantoms Liver for Use in Nuclear Medicine

Authors: Ferreira F. C. L., Souza D. N., Rodrigues T. M. A., Cunha C. J., Dullius M. A., Andrade J. E., Sousa A. H., Vieira J. P. C., Carvalho Júnior A. B., Santos L. P. B., Passos R. O.

Abstract:

The objective of this study was to characterize and develop anthropomorphic liver phantoms for hepatic tomography procedures, for quality control and the improvement of nuclear medicine professionals. The anthropomorphic phantoms were formed from plaster and acrylic. Three phantoms representing livers with cirrhosis were constructed and filled with 99mTc diluted with water to obtain the scintigraphic images. Anterior and posterior tomography images of the phantom representing the most severely cirrhotic liver were analyzed. It was noted that the phantoms allow the acquisition of images similar to those of a real cirrhotic liver. Simulations of hemangiomas may contribute to the continuing professional education of nuclear medicine staff in image acquisition, allowing the study of parameters such as the matrix, energy window and count statistics.

Keywords: Nuclear medicine, liver phantom, quality control.

728 A Model for Estimation of Efforts in Development of Software Systems

Authors: Parvinder S. Sandhu, Manisha Prashar, Pourush Bassi, Atul Bisht

Abstract:

Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and/or noisy input. Effort estimates may be used as input to project plans, iteration plans and budgets. Various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty and GA-based models, have already been used to estimate software effort for projects. In this study, statistical models, a Fuzzy-GA system and a Neuro-Fuzzy (NF) inference system are tested for estimating software effort. The performance of the developed models was evaluated on NASA software project datasets, and the results are compared with the Halstead, Walston-Felix, Bailey-Basili, Doty and genetic-algorithm-based models reported in the literature. The NF model shows the best results, with the lowest MMRE and RMSE values, compared with the Fuzzy-GA-based hybrid inference system and the other existing models used for effort prediction.
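
The two evaluation criteria mentioned above, MMRE and RMSE, are standard; a minimal Python sketch of their computation on hypothetical effort figures is shown below.

```python
import numpy as np

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean(np.abs(actual - predicted) / actual)

def rmse(actual, predicted):
    """Root Mean Squared Error."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.sqrt(np.mean((actual - predicted) ** 2))

# Hypothetical person-month efforts for a few projects
actual    = [24.0, 62.0, 170.5, 11.3]
predicted = [20.1, 70.4, 155.0, 13.0]
print(f"MMRE = {mmre(actual, predicted):.3f}, RMSE = {rmse(actual, predicted):.2f}")
```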

Keywords: Neuro-Fuzzy Model, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model, GA Based Model, Genetic Algorithm.

727 Decision Tree for Competing Risks Survival Probability in Breast Cancer Study

Authors: N. A. Ibrahim, A. Kudus, I. Daud, M. R. Abu Bakar

Abstract:

Competing risks survival data, which comprise more than one type of event, arise in many applications, one of which is clinical studies (e.g. breast cancer studies). The decision tree method can be extended to competing risks survival data by modifying the split function so as to accommodate two or more risks that might depend on each other. Recently, researchers have constructed decision trees for recurrent survival time data using frailty and marginal modelling; we further extend the method to the case of competing risks. In this paper, we develop a decision tree method for competing risks survival time data based on the proportional hazards model for the subdistribution of competing risks. In particular, we grow a tree by using the deviance statistic. An application to breast cancer data is presented. Finally, to investigate the performance of the proposed method, simulation studies on the identification of the true groups of observations were carried out.

Keywords: Competing risks, Decision tree, Simulation, Subdistribution Proportional Hazard.

726 An Observer-Based Direct Adaptive Fuzzy Sliding Control with Adjustable Membership Functions

Authors: Alireza Gholami, Amir H. D. Markazi

Abstract:

In this paper, an observer-based direct adaptive fuzzy sliding mode (OAFSM) algorithm is proposed. In the proposed algorithm, the zero-input dynamics of the plant may be unknown. The input connection matrix is used to combine the sliding surfaces of the individual subsystems, and an adaptive fuzzy algorithm is used to estimate an equivalent sliding mode control input directly. The fuzzy membership functions, which were determined by time-consuming trial-and-error processes in previous works, are adjusted by adaptive algorithms. Another advantage of the proposed controller is that the input gain matrix is not limited to being diagonal, i.e. the plant may be over- or under-actuated, provided that controllability and observability are preserved. An observer is constructed to estimate the state tracking error directly, and the nonlinear part of the observer is constructed by an adaptive fuzzy algorithm. The main advantage of the proposed observer is that the measured outputs are not limited to the first entry of a canonical-form state vector. The closed-loop stability of the proposed method is proved using a Lyapunov-based approach. The proposed method is applied numerically to a multi-link robot manipulator, which verifies the performance of the closed-loop control. Moreover, the performance of the proposed algorithm is compared with some conventional control algorithms.

Keywords: Adaptive algorithm, fuzzy systems, membership functions, observer.

725 2D Human Motion Regeneration with Stick Figure Animation Using Accelerometers

Authors: Alpha Agape Gopalai, S. M. N. Arosha Senanayake

Abstract:

This paper explores the opportunity of using tri-axial wireless accelerometers for supervised monitoring of sports movements. A motion analysis system for the upper extremities of lawn bowlers in particular is developed. Accelerometers are placed on parts of the human body: on the chest to represent the shoulder movements, on the back to capture the trunk motion, and on the back of the hand, the wrist and above the elbow to capture arm movements. These sensor placements are carefully designed to avoid restricting the bowler's movements. Data are acquired from these sensors in soft real time using virtual instrumentation; the acquired data are then conditioned and converted into the parameters required for motion regeneration. A user interface was also created to facilitate the acquisition of data and the broadcasting of commands to the wireless accelerometers. All motion regeneration in this paper deals with the motion of the human body segments in the X and Y directions, corresponding to the anterior/posterior and lateral directions respectively.

Keywords: Motion Regeneration, Virtual Instrumentation, Wireless Accelerometers.

724 A Rule-based Approach for Anomaly Detection in Subscriber Usage Pattern

Authors: Rupesh K. Gopal, Saroj K. Meher

Abstract:

In this report we present a rule-based approach to detect anomalous telephone calls. The method described here uses subscriber usage CDR (call detail record) data sampled over two observation periods: a study period and a test period. The study period contains call records of customers' non-anomalous behaviour. Customers are first grouped according to their similar usage behaviour (e.g. average number of local calls per week). For the customers in each group, we develop a probabilistic model to describe their usage. Next, we use maximum likelihood estimation (MLE) to estimate the parameters of the calling behaviour, and then determine thresholds by calculating the acceptable change within a group. MLE is applied to the data in the test period to estimate the parameters of the calling behaviour, and these parameters are compared against the thresholds; any deviation beyond a threshold is used to raise an alarm. This method has the advantage of identifying local anomalies, as compared to techniques that identify global anomalies. The method is tested on 90 days of study data and 10 days of test data for telecom customers. For medium to large deviations in the data in the test window, the method is able to identify 90% of anomalous usage with less than a 1% false alarm rate.
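
A stripped-down version of this rule might look like the sketch below: a per-group MLE of a Poisson calling rate from the study period, a simple threshold on the acceptable deviation, and flagging of test-period weeks that exceed it. The Poisson assumption, the k-sigma threshold and the counts are illustrative, not the paper's exact rules.

```python
import numpy as np

def fit_group_rate(weekly_counts):
    """MLE of a Poisson calling rate is simply the sample mean of weekly counts."""
    return np.mean(weekly_counts)

def anomaly_flags(study_counts, test_counts, k=3.0):
    """Flag test-period weeks whose count deviates from the study-period
    rate by more than k standard deviations (Poisson: std = sqrt(rate))."""
    lam = fit_group_rate(study_counts)
    threshold = k * np.sqrt(lam)
    test_counts = np.asarray(test_counts, float)
    return np.abs(test_counts - lam) > threshold

study = [12, 9, 14, 11, 10, 13, 12, 9, 11, 12, 10, 13]   # study-period weekly calls
test  = [11, 35, 12]                                      # test weeks; week 2 is suspicious
print(anomaly_flags(study, test))    # -> [False  True False]
```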

Keywords: Subscription fraud, fraud detection, anomaly detection, maximum likelihood estimation, rule-based systems.

723 Monotonicity of Dependence Concepts from Independent Random Vector into Dependent Random Vector

Authors: Guangpu Chen

Abstract:

When the failure function is monotone, monotonic reliability methods can be used to greatly simplify and facilitate reliability computations. However, these methods often work in a transformed iso-probabilistic space. To this end, a monotonic simulator or transformation is needed so that the transformed failure function is still monotone. This note first proves that the output distribution of the failure function is invariant under the transformation. It then presents conditions under which the transformed function is still monotone in the newly obtained space; these conditions concern the copulas and the dependence concepts. In many engineering applications, Gaussian copulas are used to approximate the real-world copulas when the available information on the random variables is limited to the set of marginal distributions and the covariances. This note therefore focuses on the conditional monotonicity of the often-used transformation from an independent random vector into a dependent random vector with Gaussian copulas.
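
The transformation in question, mapping an independent standard normal vector into a dependent vector with prescribed marginals and a Gaussian copula, can be sketched as follows; the marginals and correlation matrix are illustrative, and the Nataf correction of the correlation coefficients is omitted for brevity.

```python
import numpy as np
from scipy.stats import norm, lognorm, expon

def gaussian_copula_transform(u, corr, marginal_ppfs):
    """Map independent standard normals u into a dependent vector whose copula
    is Gaussian with matrix `corr` and whose marginals are given by the
    inverse CDFs in `marginal_ppfs`."""
    L = np.linalg.cholesky(corr)       # correlated standard normals z = L u
    z = L @ u
    p = norm.cdf(z)                    # uniform scores carrying the Gaussian copula
    return np.array([ppf(pi) for ppf, pi in zip(marginal_ppfs, p)])

corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
marginals = [lognorm(s=0.25, scale=10.0).ppf,   # e.g. a resistance-type variable
             expon(scale=2.0).ppf]              # e.g. a load-type variable
u = np.random.default_rng(0).standard_normal(2)
print(gaussian_copula_transform(u, corr, marginals))
```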

Keywords: Monotonic, Rosenblatt, Nataf transformation, dependence concepts, completely positive matrices, Gaussian copulas.

722 Human Intraocular Thermal Field in Action with Different Boundary Conditions Considering Aqueous Humor and Vitreous Humor Fluid Flow

Authors: Dara Singh, Keikhosrow Firouzbakhsh, Mohammad Taghi Ahmadian

Abstract:

In this study, a validated 3D finite volume model of the human eye is developed to study the fluid flow and heat transfer in the human eye at steady-state conditions. For this purpose, the discretized bio-heat transfer equation coupled with the Boussinesq equation is analyzed under different anatomical, environmental, and physiological conditions. It is demonstrated that fluid circulation forms as a result of thermal gradients in various regions of the eye. It is also shown that the posterior region of the human eye is less affected by ambient conditions than the anterior segment, which is sensitive both to the ambient conditions and to how the gravitational field is oriented relative to the geometry of the eye, making the circulations and the thermal field complicated in transient states. The effect of variations in material and boundary conditions leads to the conclusion that the thermal fields of a healthy and a non-healthy eye can be distinguished via computer simulations.

Keywords: Bio-heat, Boussinesq, conduction, convection, eye.

721 Trimmed Mean as an Adaptive Robust Estimator of a Location Parameter for Weibull Distribution

Authors: Carolina B. Baguio

Abstract:

One of the purposes of robust estimation is to reduce the influence of outliers in the data on the estimates. Outliers arise from gross errors or from contamination by distributions with long tails. The trimmed mean is a robust estimate, meaning that it is not sensitive to violations of the distributional assumptions of the data. It is called an adaptive estimate when the trimming proportion is determined from the data rather than being fixed a priori. The main objective of this study is to determine the robustness properties of adaptive trimmed means in terms of efficiency, high breakdown point and influence function. Specifically, it seeks the magnitude of the trimming proportion of the adaptive trimmed mean that yields efficient and robust estimates of the parameter for data following a modified Weibull distribution with parameter λ = 1/2, where the trimming proportion is determined by a ratio of two trimmed means defined as the tail length. Secondly, the asymptotic properties of the tail length and the trimmed means are investigated. Finally, a comparison is made of the efficiency of the adaptive trimmed means, in terms of the standard deviation, for data-driven trimming proportions and for proportions fixed a priori. The asymptotic tail lengths, defined as the ratio of two trimmed means, and the asymptotic variances were computed from the derived formulas, while the standard deviations of the derived tail lengths for data of size 40 simulated from a Weibull distribution were computed over 100 iterations using a computer program written in Pascal. The findings of the study revealed that the tail lengths of the Weibull distribution increase in magnitude as the trimming proportions increase; that the tail-length measure and the adaptive trimmed mean are asymptotically independent as the number of observations n becomes very large (approaching infinity); that the tail length is asymptotically distributed as the ratio of two independent normal random variables; and that the asymptotic variances decrease as the trimming proportions increase. The simulation study revealed empirically that the standard error of the adaptive trimmed mean based on the ratio of tail lengths is smaller, for different values of the trimming proportion, than that of its counterpart with trimming proportions fixed a priori.
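
The basic ingredients, trimmed means at two trimming proportions and a tail-length measure formed as their ratio, are simple to compute, as the sketch below shows; the particular proportions and the adaptive rule used here are illustrative stand-ins for the study's exact definitions.

```python
import numpy as np
from scipy.stats import trim_mean

rng = np.random.default_rng(1)
# Data of size 40 from a Weibull distribution with shape parameter 1/2, as in the study
x = rng.weibull(0.5, size=40)

t_light = trim_mean(x, 0.05)     # lightly trimmed mean
t_heavy = trim_mean(x, 0.25)     # heavily trimmed mean
tail_length = t_light / t_heavy  # one way to form a tail-length ratio (illustrative)

# A simple adaptive rule (illustrative): trim more when the tail looks heavier
alpha = 0.10 if tail_length < 2.0 else 0.25
adaptive_tm = trim_mean(x, alpha)
print(f"tail length ~ {tail_length:.2f}, adaptive trimmed mean ~ {adaptive_tm:.3f}")
```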

Keywords: Adaptive robust estimate, asymptotic efficiency, breakdown point, influence function, L-estimates, location parameter, tail length, Weibull distribution.

720 Comparative Study of Seismic Isolation as Retrofit Method for Historical Constructions

Authors: Carlos H. Cuadra

Abstract:

Seismic isolation can be used as a retrofit method for historical buildings, with the advantage that minimal intervention on the superstructure is required. However, the selection of isolation devices depends on the weight and stiffness of the upper structure. In this study, two buildings are analyzed to evaluate the applicability of this retrofitting methodology. Both buildings are located in Akita prefecture in the northern part of Japan. One is a wooden structure that corresponds to the old council meeting hall of Noshiro city; the second is a brick masonry structure that was used as the house of a foreign mining engineer and is located in Ani town. Ambient vibration measurements were performed on both buildings to estimate their dynamic characteristics. A target period of vibration of 3 seconds is then selected for the isolated systems in order to estimate the required stiffness of the isolation devices. For the wooden structure, which is a light construction, it was found that natural rubber isolators in combination with friction bearings are suitable for seismic isolation, while for the masonry building elastomeric isolators can be used. Lumped-mass systems are used for the seismic response analysis, and it is verified in both cases that seismic isolation can be used as a retrofitting method for historical constructions. However, in the case of the light building, most of the weight corresponds to the reinforced concrete slab that is required to install the isolation devices.

Keywords: Historical building, finite element method, masonry structure, seismic isolation, wooden structure.

719 Spatio-Temporal Analysis and Mapping of Malaria in Thailand

Authors: Krisada Lekdee, Sunee Sammatat, Nittaya Boonsit

Abstract:

This paper proposes a GLMM with spatial and temporal effects for malaria data in Thailand. A Bayesian method is used for parameter estimation via Gibbs sampling MCMC. A conditional autoregressive (CAR) model is assumed to represent the spatial effects, and the temporal correlation is represented through the covariance matrix of the random effects. The quarterly malaria data were extracted from the Bureau of Epidemiology, Ministry of Public Health of Thailand. The factors considered are rainfall and temperature. The results show that rainfall and temperature are positively related to the malaria morbidity rate. The posterior means of the estimated morbidity rates are used to construct the malaria maps. The top 5 highest morbidity rates (per 100,000 population) are in Trat (Q3, 111.70), Chiang Mai (Q3, 104.70), Narathiwat (Q4, 97.69), Chiang Mai (Q2, 88.51), and Chanthaburi (Q3, 86.82). According to the DIC criterion, the proposed model performs better than the GLMM with spatial effects but without temporal terms.

Keywords: Bayesian method, generalized linear mixed model (GLMM), malaria, spatial effects, temporal correlation.

718 Human Motion Regeneration in 2-Dimension as Stick Figure Animation with Accelerometers

Authors: Alpha Agape Gopalai, Darwin Gouwanda, S.M.N. Arosha Senanayake

Abstract:

This paper explores the opportunity of using tri-axial wireless accelerometers for supervised monitoring of sports movements. A motion analysis system for the upper extremities of lawn bowlers in particular is developed. Accelerometers are placed on parts of the human body: on the chest to represent the shoulder movements, on the back to capture the trunk motion, and on the back of the hand, the wrist and above the elbow to capture arm movements. These sensor placements are carefully designed to avoid restricting the bowler's movements. Data are acquired from these sensors in soft real time using virtual instrumentation; the acquired data are then conditioned and converted into the parameters required for motion regeneration. A user interface was also created to facilitate the acquisition of data and the broadcasting of commands to the wireless accelerometers. All motion regeneration in this paper deals with the motion of the human body segments in the X and Y directions, corresponding to the anterior/posterior and lateral directions respectively.

Keywords: Motion Regeneration, Virtual Instrumentation, Wireless Accelerometers

717 Instant Location Detection of Objects Moving at High Speed in C-OTDR Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

A practical, efficient approach is suggested for estimating the instant bounds of high-speed objects in C-OTDR monitoring systems. In the case of super-dynamic objects (trains, cars), it is difficult to obtain an adequate estimate of the instantaneous object localization because of estimation lag: reliable estimation of the coordinates of a monitored object requires time for the C-OTDR system to collect observation data, and only once the required sample volume has been collected can the final decision be issued. This is contrary to the requirements of many real applications. For example, in rail traffic management systems we need the localization of dynamic objects in real time. The way to solve this problem is to use a set of statistically independent parameters of the C-OTDR signals to obtain the most reliable solution in real time. Parameters of this type can be called "signaling parameters" (SPs). There are several SPs that carry information about the instant localization of dynamic objects in each C-OTDR channel. The problem is that some of these parameters are very sensitive to the dynamics of seismoacoustic emission sources but are unstable, whereas, as a rule, a very stable SP becomes insensitive. This report describes a method for co-processing the SPs that is designed to obtain the most effective estimates of dynamic object localization in the C-OTDR monitoring system framework.

Keywords: C-OTDR-system, co-processing of signaling parameters, high-speed objects localization, multichannel monitoring systems.

716 Estimation of Attenuation and Phase Delay in Driving Voltage Waveform of a Digital-Noiseless, Ultra-High-Speed Image Sensor

Authors: V. T. S. Dao, T. G. Etoh, C. Vo Le, H. D. Nguyen, K. Takehara, T. Akino, K. Nishi

Abstract:

Since 2004, we have been developing an in-situ storage image sensor (ISIS) that captures more than 100 consecutive images at a frame rate of 10 Mfps with ultra-high sensitivity, as well as the video camera for use with this ISIS. Currently, basic research is continuing in an attempt to increase the frame rate to 100 Mfps and above. In order to suppress electro-magnetic noise at such high frequencies, a digital-noiseless image transfer scheme has been developed that utilizes solely sinusoidal driving voltages. This paper presents highly efficient yet accurate expressions to estimate the attenuation and the phase delay of the driving voltages through the RC networks of an ultra-high-speed image sensor. The Elmore metric for a fundamental RC chain is employed as the first-order approximation. By applying dimensional analysis to SPICE data, we found a simple expression that significantly improves the accuracy of the approximation. Similarly, another simple closed-form model to estimate the phase delay through fundamental RC networks is also obtained. The estimation error of both expressions is much smaller than in previous works, less than 2% in most cases. The framework of this analysis can be extended to address similar issues in other VLSI structures.
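
As background for the first-order approximation mentioned above, the Elmore delay of a simple RC chain can be computed as in the sketch below; the component values, drive frequency and single-pole attenuation estimate are illustrative, and the paper's dimensional-analysis correction is not reproduced.

```python
import numpy as np

def elmore_delay_chain(R, C):
    """Elmore delay at the far end of an RC ladder: capacitance C_i is charged
    through all upstream resistors, so tau = sum_i C_i * (R_1 + ... + R_i)."""
    R, C = np.asarray(R, float), np.asarray(C, float)
    return float(np.sum(C * np.cumsum(R)))

# Illustrative 5-segment chain (ohms and farads)
R = [100.0] * 5
C = [2e-15] * 5
tau = elmore_delay_chain(R, C)

# Rough attenuation of a sinusoidal drive at frequency f (single-pole approximation)
f = 100e6
atten = 1.0 / np.sqrt(1.0 + (2 * np.pi * f * tau) ** 2)
print(f"Elmore delay ~ {tau * 1e12:.3f} ps, |H| ~ {atten:.6f} at {f / 1e6:.0f} MHz")
```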

Keywords: Dimensional Analysis, ISIS, Digital-noiseless, RC network, Attenuation, Phase Delay, Elmore model

715 Towards Creation of Sustainable Enclaves for Small and Medium-Size Enterprises in Kumasi, Ghana

Authors: Paul Amoateng, Patrick B. Cobbinah, Kwasi Ofori-Kumah

Abstract:

Although the importance of small and medium-size enterprises (SMEs) to local development is globally recognized, less attention is given to their design, development and promotion, particularly in developing countries. The main focus of this paper is to examine the process of designing, developing and promoting SMEs in developing countries. Results of a study conducted in an SME enclave in Kumasi (Ghana) are presented and discussed. They show that although SMEs in developing countries remain a major source of livelihood for many individuals, their potential contribution to local development can be enhanced and sustained through the creation of common geographical enclaves for related SMEs. Findings indicated that the concentration of SMEs involved in wood processing in one location in Kumasi has reduced the cost of production (e.g., transportation) and resulted in a marginal increase in sales for many SMEs, despite the widespread challenges of lack of access to credit and low promotion of products.

Keywords: Developing countries, Kumasi, local development, small and medium-size enterprises.

714 The Design Optimization for Sound Absorption Material of Multi-Layer Structure

Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Tae-Hyeon Oh, Dae-Kyu Park

Abstract:

Sound absorbing material is used as an automotive interior material, and its sound absorption coefficient should be predicted in order to design it. This is difficult, however, because the material is composed of several layers, so in practice targets are achieved through many rounds of experimental tuning, which costs considerable time and money. In this paper, we propose a process to estimate the sound absorption coefficient of a multi-layer structure. The physical properties of each material are used to estimate the coefficient; these properties are themselves predicted by the Foam-X software from sound absorption coefficient data measured with an impedance tube. Since there are many physical properties and the measurement equipment is expensive, the software-predicted values are used: by measuring the sound absorption coefficient of each material, its physical properties are calculated inversely, and the properties of each material are then used to calculate the sound absorption coefficient of the multi-layer material. Since the absorption coefficient of the multi-layer structure can be calculated, design optimization is possible through simulation. We then compare and analyze the calculated sound absorption coefficient with data measured in a scaled reverberation chamber and with impedance tubes for a prototype. If this method is used when developing automotive interior materials with multi-layer structures, the development effort can be reduced because the design can be optimized by simulation, saving cost and time.

Keywords: Optimization design, multi-layer nonwoven, sound absorption coefficient, scaled reverberation chamber, impedance tubes.

713 Short Term Tests on Performance Evaluation of Water-washed and Dry-washed Biodiesel from Used Cooking Oil

Authors: Shumani Ramuhaheli, Christopher C. Enweremadu, Hilary L. Rutto

Abstract:

In this study, biodiesel from used cooking oil was produced and purified either by washing with water (water wash) or with Amberlite (dry wash). The work presents the results of short-term tests on the performance characteristics of a diesel engine using both biodiesel fuel samples. In this investigation, the water-wash biodiesel, the dry-wash biodiesel and diesel were compared for performance using a four-cylinder diesel engine, and the torque, brake power, specific fuel consumption and brake thermal efficiency were analyzed. The tests showed that in all cases dry-wash biodiesel performed marginally worse than water-wash biodiesel. Except for brake thermal efficiency, diesel fuel had better engine performance characteristics than the biodiesel fuel samples. According to these results, dry washing of biodiesel has a marginal effect on engine performance.

Keywords: Biodiesel, engine performance, used cooking oil, water wash, dry wash.

712 The Distance between a Point and a Bezier Curve on a Bezier Surface

Authors: Wen-Haw Chen, Sheng-Gwo Chen

Abstract:

The distance between two objects is an important problem in CAGD, CAD, CG, etc. This paper presents a simple and quick method to estimate the distance between a point and a Bezier curve on a Bezier surface.
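
One quick way to estimate such a distance (not necessarily the authors' method) is to evaluate the curve with de Casteljau's algorithm and minimize the point-to-curve distance over the parameter, as in the sketch below; the control points are illustrative and the Bezier-surface embedding is omitted.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def de_casteljau(ctrl, t):
    """Evaluate a Bezier curve with control points `ctrl` (n x d) at parameter t."""
    pts = np.array(ctrl, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def point_to_bezier_distance(p, ctrl):
    """Minimize ||p - B(t)|| over t in [0, 1] (a simple numerical estimate;
    may return a local minimum for strongly curved cases)."""
    p = np.asarray(p, float)
    res = minimize_scalar(lambda t: np.linalg.norm(p - de_casteljau(ctrl, t)),
                          bounds=(0.0, 1.0), method="bounded")
    return res.fun, res.x

ctrl = [[0, 0, 0], [1, 2, 0.5], [2, 2, 1.0], [3, 0, 1.5]]   # illustrative 3D control points
d, t_star = point_to_bezier_distance([1.5, 0.5, 0.8], ctrl)
print(f"distance ~ {d:.4f} at t ~ {t_star:.3f}")
```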

Keywords: Geodesic-like curve, distance, projection, Bezier.

711 Fuzzy Numbers and MCDM Methods for Portfolio Optimization

Authors: Thi T. Nguyen, Lee N. Gordon-Brown

Abstract:

A new deployment of two multiple criteria decision making (MCDM) techniques, Simple Additive Weighting (SAW) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), for portfolio allocation is demonstrated in this paper. Rather than referring exclusively to mean and variance, as in the traditional mean-variance method, the criteria used in this demonstration are the first four moments of the portfolio distribution. Each asset is evaluated based on its marginal impacts on the portfolio's higher moments, which are characterized by trapezoidal fuzzy numbers. Centroid-based defuzzification is then applied to convert the fuzzy numbers into the crisp numbers on which SAW and TOPSIS can operate. Experimental results suggest that these MCDM approaches are similarly efficient in selecting dominant assets for an optimal portfolio under higher moments. The proposed approaches allow investors to flexibly adjust their risk preferences regarding higher moments via different schemes adapted to various kinds of investors, from conservative to risky. Another significant advantage is that, compared to mean-variance analysis, the portfolio weights obtained by SAW and TOPSIS are consistently well diversified.
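
The centroid defuzzification and SAW aggregation steps can be illustrated briefly in Python; the trapezoidal numbers, weights and normalization below are illustrative assumptions, and the TOPSIS step is omitted.

```python
import numpy as np

def trapezoid_centroid(a, b, c, d):
    """Centroid (x-coordinate) of a trapezoidal fuzzy number with support [a, d]
    and core [b, c], computed piecewise: left triangle, core rectangle, right triangle."""
    areas = np.array([(b - a) / 2.0, (c - b), (d - c) / 2.0])
    centroids = np.array([(a + 2 * b) / 3.0, (b + c) / 2.0, (2 * c + d) / 3.0])
    return float(np.dot(areas, centroids) / areas.sum())

# Illustrative: 3 assets scored on 4 criteria (impacts on mean, variance, skewness,
# kurtosis), each score given as a trapezoidal fuzzy number (a, b, c, d)
fuzzy_scores = [
    [(0.5, 0.6, 0.7, 0.8), (0.2, 0.3, 0.4, 0.5), (0.4, 0.5, 0.6, 0.7), (0.1, 0.2, 0.3, 0.4)],
    [(0.3, 0.4, 0.5, 0.6), (0.5, 0.6, 0.7, 0.8), (0.2, 0.3, 0.4, 0.5), (0.3, 0.4, 0.5, 0.6)],
    [(0.6, 0.7, 0.8, 0.9), (0.1, 0.2, 0.3, 0.4), (0.5, 0.6, 0.7, 0.8), (0.2, 0.3, 0.4, 0.5)],
]
crisp = np.array([[trapezoid_centroid(*s) for s in row] for row in fuzzy_scores])
weights = np.array([0.4, 0.3, 0.2, 0.1])              # investor's preference over moments
saw_scores = (crisp / crisp.max(axis=0)) @ weights    # per-criterion normalization, weighted sum
print("SAW ranking (best first):", np.argsort(-saw_scores))
```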

Keywords: Fuzzy numbers, SAW, TOPSIS, portfolio optimization, higher moments, risk management.
