Search results for: effort estimation
2979 Non-Linear Regression Modeling for Composite Distributions
Authors: Mostafa Aminzadeh, Min Deng
Abstract:
Modeling loss data is an important part of actuarial science. Actuaries use models to predict future losses and manage financial risk, which can also be beneficial for marketing purposes. In the insurance industry, small claims happen frequently while large claims are rare. Traditional distributions such as the Normal, Exponential, and inverse-Gaussian are not suitable for describing insurance data, which often show skewness and fat tails. Several authors have studied classical and Bayesian inference for the parameters of composite distributions, such as Exponential-Pareto, Weibull-Pareto, and Inverse Gamma-Pareto. These models separate small to moderate losses from large losses using a threshold parameter. This research introduces a computational approach using a nonlinear regression model for loss data that relies on multiple predictors. Simulation studies were conducted to assess the accuracy of the proposed estimation method and confirmed that it provides precise estimates of the regression parameters. It is important to note that this approach can be applied to a dataset only if goodness-of-fit tests confirm that the composite distribution under study fits the data well. To demonstrate the computations, a real data set from the insurance industry is analyzed. A Mathematica implementation uses the Fisher scoring algorithm as the iteration method to obtain the maximum likelihood estimates (MLE) of the regression parameters.
Keywords: maximum likelihood estimation, Fisher scoring method, non-linear regression models, composite distributions
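As a minimal sketch of the Fisher scoring iteration the abstract refers to, the snippet below applies it to a toy Poisson log-link regression rather than the composite-distribution likelihood of the paper; the data, coefficients and model are illustrative assumptions, not the authors' Mathematica code.

```python
import numpy as np

# Toy data: two predictors plus an intercept (illustrative, not the paper's insurance data).
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([0.5, 0.8, -0.3])
y = rng.poisson(np.exp(X @ beta_true))

# Fisher scoring: beta <- beta + I(beta)^{-1} U(beta),
# where U is the score vector and I the expected (Fisher) information.
beta = np.zeros(X.shape[1])
for _ in range(25):
    mu = np.exp(X @ beta)            # mean under the log link
    score = X.T @ (y - mu)           # score U(beta)
    info = X.T @ (X * mu[:, None])   # Fisher information X' diag(mu) X
    step = np.linalg.solve(info, score)
    beta += step
    if np.max(np.abs(step)) < 1e-10:
        break

print("MLE via Fisher scoring:", beta)
```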
Procedia PDF Downloads 36
2978 A Semiparametric Approach to Estimate the Mode of Continuous Multivariate Data
Authors: Tiee-Jian Wu, Chih-Yuan Hsu
Abstract:
Mode estimation is an important task because it has applications to data from a wide variety of sources. We propose a semi-parametric approach to estimate the mode of an unknown continuous multivariate density function. Our approach is based on a weighted average of a parametric density estimate using the Box-Cox transform and a non-parametric kernel density estimate. Our semi-parametric mode estimate improves on both the parametric and non-parametric mode estimates. Specifically, it resolves the inconsistency of parametric mode estimates (at large sample sizes) and reduces the variability of non-parametric mode estimates (at small sample sizes). The performance of our method at practical sample sizes is demonstrated by simulation examples and two real examples from the fields of climatology and image recognition.
Keywords: Box-Cox transform, density estimation, mode seeking, semiparametric method
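The non-parametric half of such an estimate can be illustrated with a kernel density mode search; the grid search below is a minimal sketch on synthetic data and omits the Box-Cox parametric component and the weighting scheme of the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic bivariate sample (illustrative only).
rng = np.random.default_rng(1)
data = rng.multivariate_normal(mean=[2.0, -1.0], cov=[[1.0, 0.3], [0.3, 0.5]], size=400)

kde = gaussian_kde(data.T)

# Crude non-parametric mode estimate: evaluate the kernel density on a grid and take the argmax.
xs = np.linspace(data[:, 0].min(), data[:, 0].max(), 100)
ys = np.linspace(data[:, 1].min(), data[:, 1].max(), 100)
XX, YY = np.meshgrid(xs, ys)
grid = np.vstack([XX.ravel(), YY.ravel()])
mode = grid[:, np.argmax(kde(grid))]
print("Kernel density mode estimate:", mode)
```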
Procedia PDF Downloads 285
2977 Lipschitz Classifiers Ensembles: Usage for Classification of Target Events in C-OTDR Monitoring Systems
Authors: Andrey V. Timofeev
Abstract:
This paper introduces an original method for guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers. The solution was obtained as a finite closed set of alternative hypotheses which contains the true class of the object with probability not less than a specified value. Thus, the classification is represented by a set of hypothetical classes. In this case, the smaller the cardinality of the discrete set of hypothetical classes, the higher the classification accuracy. Experiments have shown that if the cardinality of the classifiers ensemble is increased, then the cardinality of this set of hypothetical classes is reduced. The problem of guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers is relevant to the multichannel classification of target events in C-OTDR monitoring systems. Results from practical use of the suggested approach for accuracy control in C-OTDR monitoring systems are presented.
Keywords: Lipschitz classifiers, confidence set, C-OTDR monitoring, classifiers accuracy, classifiers ensemble
Procedia PDF Downloads 492
2976 GIS Application in Surface Runoff Estimation for Upper Klang River Basin, Malaysia
Authors: Suzana Ramli, Wardah Tahir
Abstract:
Estimation of surface runoff depth is a vital part of any rainfall-runoff modeling. It leads to stream flow calculation and, later, prediction of flood occurrences. GIS (Geographic Information System) is an advanced and apposite tool for simulating hydrological models due to its realistic treatment of topography. The paper discusses the calculation of surface runoff depth for two selected events by using GIS with the Curve Number method for the Upper Klang River basin. GIS enables the intersection of soil type and land use maps, which then produces a curve number map. The results show good correlation between simulated and observed values, with R² above 0.7. Acceptable values of the statistical measures, namely mean error, absolute mean error, RMSE, and bias, are also reported in the paper.
Keywords: surface runoff, geographic information system, curve number method, environment
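For reference, the core of the SCS Curve Number method used in the paper can be written in a few lines; the rainfall depth and composite curve number below are placeholder values, and the GIS overlay that produces the curve number map is not reproduced.

```python
def scs_runoff_depth(p_mm: float, cn: float) -> float:
    """SCS Curve Number direct runoff depth (mm) for rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0    # potential maximum retention S (mm)
    ia = 0.2 * s                # initial abstraction (standard 0.2*S assumption)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Example: an 80 mm storm over an area with a composite curve number of 78 (illustrative values).
print(round(scs_runoff_depth(80.0, 78.0), 1), "mm of direct runoff")
```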
Procedia PDF Downloads 282
2975 Nonparametric Sieve Estimation with Dependent Data: Application to Deep Neural Networks
Authors: Chad Brown
Abstract:
This paper establishes general conditions for the convergence rates of nonparametric sieve estimators with dependent data. We present two key results: one for nonstationary data and another for stationary mixing data. Previous theoretical results often lack practical applicability to deep neural networks (DNNs). Using these conditions, we derive convergence rates for DNN sieve estimators in nonparametric regression settings with both nonstationary and stationary mixing data. The DNN architectures considered adhere to current industry standards, featuring fully connected feedforward networks with rectified linear unit activation functions, unbounded weights, and a width and depth that grow with sample size.
Keywords: sieve extremum estimates, nonparametric estimation, deep learning, neural networks, rectified linear unit, nonstationary processes
Procedia PDF Downloads 44
2974 An Efficient Fundamental Matrix Estimation for Moving Object Detection
Authors: Yeongyu Choi, Ju H. Park, S. M. Lee, Ho-Youl Jung
Abstract:
In this paper, an improved method for estimating the fundamental matrix is proposed. The method is applied effectively to monocular camera based moving object detection. The method consists of corner point detection, estimation of the moving objects' motion, and fundamental matrix calculation. The corner points are obtained using the Harris corner detector, and the motion of moving objects is calculated with the pyramidal Lucas-Kanade optical flow algorithm. Through epipolar geometry analysis using RANSAC, the fundamental matrix is calculated. In this method, we improve the performance of moving object detection by using two threshold values that determine whether a point is an inlier or an outlier. Through simulations, we compare performance while varying the two threshold values.
Keywords: corner detection, optical flow, epipolar geometry, RANSAC
Procedia PDF Downloads 409
2973 Controlling the Expense of Political Contests Using a Modified N-Players Tullock’s Model
Abstract:
This work introduces a generalization of the classical Tullock model of one-stage contests under complete information with an unrestricted number of contestants. In the classical Tullock model, the contest winner is not necessarily the highest bidder. Instead, the winner is determined by a draw in which the winning probabilities are proportional to the contestants' relative efforts. The Tullock model fits political contests well, in which the winner is not necessarily the contestant exerting the highest effort. This work presents a modified model which uses a simple non-discriminating rule, namely a parameter through which the contest designer can influence the total costs planned for an election and thereby control the contestants' efforts. The winner pays a fee, and the losers are reimbursed the same amount. Our proposed model includes a mechanism that controls the efforts exerted and balances competition, creating a tighter, less predictable and more interesting contest. Additionally, the proposed model follows the fairness criterion in the sense that it does not alter the contestants' probabilities of winning compared to the classical Tullock model. We provide an analytic solution for the contestants' optimal effort and expected reward.
Keywords: contests, Tullock's model, political elections, control expenses
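The classical Tullock success function, in which each contestant wins with probability proportional to effort, is easy to state in code; the effort vector and the simulated draw below are illustrative, and the modified fee/reimbursement mechanism of the paper is not implemented.

```python
import numpy as np

efforts = np.array([4.0, 2.0, 1.0, 1.0])      # illustrative contestant efforts
win_prob = efforts / efforts.sum()            # classical Tullock winning probabilities
print("Winning probabilities:", win_prob)     # [0.5, 0.25, 0.125, 0.125]

# Simulate the contest draw many times to confirm the empirical win shares.
rng = np.random.default_rng(42)
winners = rng.choice(len(efforts), size=100_000, p=win_prob)
print("Empirical win shares:", np.bincount(winners) / winners.size)
```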
Procedia PDF Downloads 145
2972 Cycle Number Estimation Method on Fatigue Crack Initiation Using Voronoi Tessellation and the Tanaka Mura Model
Authors: Mohammad Ridzwan Bin Abd Rahim, Siegfried Schmauder, Yupiter HP Manurung, Peter Binkele, Meor Iqram B. Meor Ahmad, Kiarash Dogahe
Abstract:
This paper deals with short crack initiation in the material P91 under cyclic loading at two different temperatures, concluding with the estimation of the short crack initiation Wöhler (S/N) curve. An artificial but representative model microstructure was generated using Voronoi tessellation and the Finite Element Method, and the non-uniform stress distribution was calculated accordingly. The number of cycles needed for crack initiation is estimated on the basis of the stress distribution in the model by applying the physically-based Tanaka-Mura model. Initial results show that the number of cycles to crack initiation is strongly correlated with temperature.
Keywords: short crack initiation, P91, Wöhler curve, Voronoi tessellation, Tanaka-Mura model
Procedia PDF Downloads 101
2971 Residual Lifetime Estimation for Weibull Distribution by Fusing Expert Judgements and Censored Data
Authors: Xiang Jia, Zhijun Cheng
Abstract:
The residual lifetime of a product is the operating time between the current time and the time point at which failure happens. Residual lifetime estimation is important in reliability analysis. To predict the residual lifetime, it is necessary to assume or verify a particular distribution that the lifetime of the product follows, and the two-parameter Weibull distribution is frequently adopted to describe lifetimes in reliability engineering. Due to time constraints and cost reduction, a life testing experiment is usually terminated before all the units have failed, so censored data are usually collected. In addition, other information can be used for reliability analysis: expert judgements are considered, as experts can often provide useful information concerning reliability. Therefore, in this paper the residual lifetime is estimated for the Weibull distribution by fusing censored data and expert judgements. First, closed forms for the point estimate and confidence interval of the residual lifetime under the Weibull distribution are presented. Next, the expert judgements are regarded as prior information, and a method to determine the prior distribution of the Weibull parameters is developed. For completeness, the cases of a single expert judgement and of multiple expert judgements are both considered. Further, the posterior distribution of the Weibull parameters is derived. Since it is difficult to derive the posterior distribution of the residual lifetime directly, a sample-based method is proposed to generate posterior samples of the Weibull parameters based on the Markov Chain Monte Carlo (MCMC) method, and these samples are used to obtain the Bayes estimate and credible interval for the residual lifetime. Finally, an illustrative example is discussed to show the application. It demonstrates that the proposed method is simple, satisfactory, and robust.
Keywords: expert judgements, information fusion, residual lifetime, Weibull distribution
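A small sketch of the residual-lifetime quantities for a two-parameter Weibull model follows: the conditional survival of the remaining life given survival to the current age, and its mean obtained by numerical integration. The parameter values are placeholders, and the Bayesian fusion with expert judgements and the MCMC step of the paper are not shown.

```python
import numpy as np
from scipy.integrate import quad

shape, scale = 2.0, 1000.0   # illustrative Weibull parameters (k, lambda)
t0 = 600.0                   # current operating time

def survival(t):
    return np.exp(-(t / scale) ** shape)

def residual_survival(x):
    """P(T > t0 + x | T > t0): survival function of the residual lifetime."""
    return survival(t0 + x) / survival(t0)

# Mean residual lifetime: integral of the residual survival function over x >= 0.
mrl, _ = quad(residual_survival, 0.0, np.inf)
print("P(remaining life > 300):", residual_survival(300.0))
print("Mean residual lifetime:", round(mrl, 1))
```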
Procedia PDF Downloads 142
2970 Effective Dose and Size Specific Dose Estimation with and without Tube Current Modulation for Thoracic Computed Tomography Examinations: A Phantom Study
Authors: S. Gharbi, S. Labidi, M. Mars, M. Chelli, F. Ladeb
Abstract:
The purpose of this study is to reduce the radiation dose for chest CT examinations by adding Tube Current Modulation (TCM) to a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in both scans, with and without mAs modulation, was estimated by multiplying the Dose Length Product (DLP) by a conversion factor; the results were compared to those obtained with the CT-Expo software. The size-specific dose estimate (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a conversion factor related to the phantom's effective diameter. Objective assessment of image quality was performed with Signal to Noise Ratio (SNR) measurements in the phantom. SPSS software was used for data analysis. Results showed that, with CARE Dose 4D included, ED was lowered by 48.35% and 51.51% using DLP and CT-Expo, respectively. In addition, ED ranged between 7.01 mSv and 6.6 mSv for the standard protocol, and between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE: the dose was 16.25 mGy without TCM and 48.8% lower with TCM. The calculated SNR values were significantly different (p=0.03<0.05); the highest was measured on images acquired with TCM and reconstructed with filtered back projection (FBP). In conclusion, this study demonstrates the potential of the TCM technique for SSDE and ED reduction while preserving image quality at a high diagnostic reference level for thoracic CT examinations.
Keywords: anthropomorphic phantom, computed tomography, CT-expo, radiation dose
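The dose quantities compared in the study reduce to simple products; the sketch below uses placeholder numbers, and the DLP-to-ED conversion coefficient for the chest and the size-dependent SSDE factor are assumed values that would normally be taken from the relevant ICRP/AAPM tables rather than from this abstract.

```python
# Effective dose from dose-length product: ED (mSv) = k * DLP (mGy*cm),
# with k the anatomical conversion coefficient (assumed ~0.014 mSv/(mGy*cm) for an adult chest).
dlp = 470.0        # mGy*cm, illustrative
k_chest = 0.014    # mSv per mGy*cm, assumed conversion coefficient
ed = k_chest * dlp

# Size-specific dose estimate: SSDE (mGy) = f_size * CTDIvol (mGy),
# with f_size the factor for the phantom's effective diameter (assumed value).
ctdi_vol = 10.5    # mGy, illustrative
f_size = 1.55      # assumed size conversion factor
ssde = f_size * ctdi_vol

print(f"ED   = {ed:.2f} mSv")
print(f"SSDE = {ssde:.2f} mGy")
```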
Procedia PDF Downloads 222
2969 Design Flood Estimation in Satluj Basin-Challenges for Sunni Dam Hydro Electric Project, Himachal Pradesh-India
Authors: Navneet Kalia, Lalit Mohan Verma, Vinay Guleria
Abstract:
Introduction: Design flood studies are essential for the effective planning and functioning of water resource projects. Design flood estimation for the Sunni Dam Hydro Electric Project, located in the State of Himachal Pradesh, India, on the river Satluj, was a big challenge in view of the river flowing in the Himalayan region from Tibet to India and having a large catchment area of varying topography, climate, and vegetation. No discharge data were available for the part of the river in Tibet, whereas for India they were available only at Khab, Rampur, and Luhri. Estimation of the design flood using standard methods alone was not possible. This challenge was met using two different approaches for the upper (snow-fed) and lower (rain-fed) catchments: a flood frequency approach and a hydro-meteorological approach. i) For the catchment up to the Khab gauging site (sub-catchment C1), the flood frequency approach was used. Around 90% of the catchment area (46300 sq km) up to Khab is snow-fed and lies above 4200 m. In view of the predominantly snow-fed area, the 1-in-10000-year return period flood estimated using flood frequency analysis at Khab was considered as the Probable Maximum Flood (PMF). The flood peaks were taken from daily observed discharges at Khab, which were increased by 10% to make them instantaneous. The design flood of 4184 cumec thus obtained was considered as the PMF at Khab. ii) For the catchment between Khab and the Sunni Dam (sub-catchment C2), the hydro-meteorological approach was used. This method is based upon the catchment response to the rainfall pattern (Probable Maximum Precipitation, PMP) observed in a particular catchment area. The design flood computation mainly involves the estimation of a design storm hyetograph and the derivation of the catchment response function. A unit hydrograph is assumed to represent the response of the entire catchment area to a unit rainfall. The main advantage of the hydro-meteorological approach is that it gives a complete flood hydrograph, which allows a realistic determination of its moderation effect while passing through a reservoir or a river reach. These studies were carried out to derive the PMF for the catchment area between Khab and the Sunni Dam site using 1-day and 2-day PMP values of 232 and 416 cm, respectively. The PMF so obtained was 12920.60 cumec. Final Result: As the catchment area up to the Sunni Dam has been divided into two sub-catchments, the flood hydrograph for catchment C1 was routed through the connecting channel reach (river Satluj) using the Muskingum method, and the design flood was computed by adding the routed flood ordinates to the flood ordinates of catchment C2. The total design flood (i.e., the 2-day PMF) with a peak of 15473 cumec was obtained. Conclusion: Even though several factors are relevant when deciding the method to be used for design flood estimation, data availability and the purpose of the study are the most important. Since, generally, we cannot wait for hydrological data of adequate quality and quantity to become available, flood estimation has to be done using whatever data are available. Depending upon the type of data available for a particular catchment, the appropriate method is to be selected.
Keywords: design flood, design storm, flood frequency, PMF, PMP, unit hydrograph
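The Muskingum channel routing used to carry the upper-catchment (C1) hydrograph down to the dam site follows the standard recurrence sketched below; K, X, the time step and the inflow hydrograph are illustrative stand-ins, not the study's calibrated values for the Satluj reach.

```python
def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
    """Route an inflow hydrograph with Muskingum parameters K, X and time step dt (same units as K)."""
    denom = K - K * X + 0.5 * dt
    c0 = (-K * X + 0.5 * dt) / denom
    c1 = (K * X + 0.5 * dt) / denom
    c2 = (K - K * X - 0.5 * dt) / denom    # c0 + c1 + c2 == 1
    outflow = [inflow[0]]                  # assume initial outflow equals initial inflow
    for i in range(1, len(inflow)):
        outflow.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[-1])
    return outflow

# Illustrative triangular inflow hydrograph (cumec); not the Satluj data.
inflow = [100, 800, 2500, 4184, 3200, 1800, 900, 400, 150, 100]
print([round(q) for q in muskingum_route(inflow)])
```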
Procedia PDF Downloads 327
2968 Volume Estimation of Trees: An Exploratory Study on Rosewood Logging Within Forest Transition and Savannah Ecological Zones of Ghana
Authors: Albert Kwabena Osei Konadu
Abstract:
One of the endemic forest species of the savannah transition zones listed in Appendix II of the Convention on International Trade in Endangered Species (CITES) is rosewood, also known as Pterocarpus erinaceus or Krayie. Its economic viability has made it increasingly popular and in high demand. Ghana's forest resource management regime for these ecozones focuses mainly on conservation and very little on resource utilization. Consequently, commercial logging management standards are at a teething stage and not fully developed, leading to deficiencies in the monitoring of logging operations and in the quantification of harvested tree volumes. The tree information form (TIF), a volume estimation and tracking regime, has proven to be an effective sustainable management tool for regulating timber resource extraction in the high forest zones of the country. This work aims to generate a TIF that can track and capture the requisite parameters to accurately estimate the volume of harvested rosewood within the forest savannah transition zones. Tree information forms were created for three scenarios: individual billets, stacked billets, and conveying vessels. The study was limited by the use of the regulator's assigned volumes as the benchmark and was also affected by potential volume measurement error in the stacked billet scenario due to spaces within the packed billets. These TIFs were field-tested to deduce the most viable option for tracking and estimating harvested volumes of rosewood using the Smalian and cubic volume estimation formulae. Overall, four districts were covered, with the individual billet, stacked billet, and conveying vessel scenarios registering mean volumes of 25.83 m3, 45.08 m3 and 32.6 m3, respectively. These deduced volumes were validated by benchmarking against the assigned volumes of the Forestry Commission of Ghana and the known standard volumes of conveying vessels. The results indicated an underestimation of extracted volumes under the quota regime, a situation that could lead to unintended overexploitation of the species. The research revealed that the conveying vessel route is the most viable volume estimation and tracking regime for the sustainable management of Pterocarpus erinaceus, as it provided a more practical volume estimate and data extraction protocol.
Keywords: cubic volume formula, Smalian volume formula, Pterocarpus erinaceus, tree information form, forest transition and savannah zones, harvested tree volume
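Smalian's formula, referred to in the abstract, computes a log's volume from its two end cross-sections; the billet dimensions below are illustrative, not measurements from the study.

```python
import math

def smalian_volume(d1_cm: float, d2_cm: float, length_m: float) -> float:
    """Log volume (m^3) from end diameters (cm) and length (m) using Smalian's formula V = L*(A1+A2)/2."""
    a1 = math.pi * (d1_cm / 100.0) ** 2 / 4.0   # cross-sectional area of one end (m^2)
    a2 = math.pi * (d2_cm / 100.0) ** 2 / 4.0   # cross-sectional area of the other end (m^2)
    return (a1 + a2) / 2.0 * length_m

# Illustrative rosewood billet: 32 cm and 26 cm end diameters, 2.4 m long.
print(round(smalian_volume(32, 26, 2.4), 4), "m^3")
```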
Procedia PDF Downloads 44
2967 Applications of Analytical Probabilistic Approach in Urban Stormwater Modeling in New Zealand
Authors: Asaad Y. Shamseldin
Abstract:
The analytical probabilistic approach is an innovative approach to urban stormwater modeling. It can provide information about the long-term performance of a stormwater management facility without being computationally very demanding. This paper explores the application of the analytical probabilistic approach in New Zealand. It presents the results of a case study aimed at developing an objective way of identifying what constitutes a rainfall storm event and estimating the corresponding statistical properties of storms, using two selected automatic rainfall stations located in the Auckland region of New Zealand. The storm identification and the estimation of storm statistical properties are regarded as the first step in the development of analytical probabilistic models. The paper provides a recommendation on the definition of the storm inter-event time to be used in conjunction with the analytical probabilistic approach.
Keywords: hydrology, rainfall storm, storm inter-event time, New Zealand, stormwater management
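Identifying independent storm events from a rainfall record with a chosen inter-event time definition (IETD) can be sketched as below; the hourly series and the 6-hour IETD are illustrative assumptions, not the Auckland station data or the threshold recommended by the paper.

```python
def split_storms(rain_mm, ietd_hours=6):
    """Group an hourly rainfall series into storms: wet hours separated by >= ietd_hours dry hours start a new event."""
    wet = [i for i, r in enumerate(rain_mm) if r > 0]
    storms = []
    for i in wet:
        if storms and i - storms[-1][-1] - 1 < ietd_hours:
            storms[-1].append(i)    # gap shorter than the IETD: same storm
        else:
            storms.append([i])      # at least IETD dry hours: new storm
    # Report (duration in hours, total depth in mm) for each storm.
    return [(idx[-1] - idx[0] + 1, sum(rain_mm[idx[0]:idx[-1] + 1])) for idx in storms]

hourly = [0, 2, 5, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 4, 6, 2, 0]
print(split_storms(hourly, ietd_hours=6))   # e.g. [(5, 8), (3, 12)]
```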
Procedia PDF Downloads 344
2966 On the Optimality of Blocked Main Effects Plans
Authors: Rita SahaRay, Ganesh Dutta
Abstract:
In this article, experimental situations are considered where a main effects plan is to be used to study m two-level factors using n runs partitioned into b blocks, not necessarily of the same size. Assuming the block sizes to be even for all blocks, for the case n ≡ 2 (mod 4), optimal designs are obtained with respect to type 1 and type 2 optimality criteria in the class of designs providing estimation of all main effects orthogonal to the block effects. In practice, such orthogonal estimation of main effects is often a desirable condition. In the wider class of all available m two-level blocked main effects plans with even block sizes, where the factors do not occur at high and low levels equally often in each block, E-optimal designs are also characterized. Simple construction methods based on Hadamard matrices and the Kronecker product for these optimal designs are presented.
Keywords: design matrix, Hadamard matrix, Kronecker product, type 1 criteria, type 2 criteria
Procedia PDF Downloads 366
2965 Estimation of Stress-Strength Parameter for Burr Type XII Distribution Based on Progressive Type-II Censoring
Authors: A. M. Abd-Elfattah, M. H. Abu-Moussa
Abstract:
In this paper, the estimation of the stress-strength parameter R = P(Y < X) is considered, where X and Y, the strength and stress respectively, are two independent random variables following the Burr Type XII distribution. The samples taken for X and Y are progressively Type-II censored. The maximum likelihood estimator (MLE) of R is obtained when the common parameter is unknown. When the common parameter is known, the MLE, the uniformly minimum variance unbiased estimator (UMVUE) and the Bayes estimator of R = P(Y < X) are obtained. The exact confidence interval of R based on the MLE is obtained. The performance of the proposed estimators is compared using computer simulation.
Keywords: Burr Type XII distribution, progressive type-II censoring, stress-strength model, unbiased estimator, maximum-likelihood estimator, uniformly minimum variance unbiased estimator, confidence intervals, Bayes estimator
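A quick Monte Carlo check of R = P(Y < X) for two independent Burr Type XII variables with a common first shape parameter, using inverse-transform sampling from F(x) = 1 - (1 + x^c)^(-k); the shape values are illustrative, and the progressive Type-II censoring scheme and the MLE/UMVUE/Bayes estimators of the paper are not reproduced.

```python
import numpy as np

c = 2.0            # common Burr XII shape parameter (assumed known, as in one case of the paper)
k_strength = 1.5   # shape of the strength variable X (illustrative)
k_stress = 3.0     # shape of the stress variable Y (illustrative)

def rburr12(size, c, k, rng):
    """Inverse-transform sampling from Burr XII: F(x) = 1 - (1 + x^c)^(-k)."""
    u = rng.uniform(size=size)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

rng = np.random.default_rng(7)
x = rburr12(200_000, c, k_strength, rng)   # strength
y = rburr12(200_000, c, k_stress, rng)     # stress
print("Monte Carlo R = P(Y < X):   ", np.mean(y < x))
print("Closed form k_y/(k_x + k_y):", k_stress / (k_strength + k_stress))
```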
Procedia PDF Downloads 457
2964 Results of EPR Dosimetry Study of Population Residing in the Vicinity of the Uranium Mines and Uranium Processing Plant
Authors: K. Zhumadilov, P. Kazymbet, A. Ivannikov, M. Bakhtin, A. Akylbekov, K. Kadyrzhanov, A. Morzabayev, M. Hoshi
Abstract:
The aim of the study is to evaluate the possible excess dose received by uranium processing plant workers. The possible excess dose of the workers was evaluated by comparison with a population pool (Stepnogorsk) and a control pool (Astana city). The measured teeth samples were extracted according to medical indications. In total, twenty-seven tooth enamel samples from residents of Stepnogorsk city (180 km from Astana city, Kazakhstan) were analyzed, and about six tooth samples were collected from workers of the uranium processing plant. The results of the tooth enamel dose estimation show a small influence of working conditions on the workers; the maximum excess dose is less than 100 mGy. This is a pilot study of EPR dose estimation, and additional samples are required for a final conclusion.
Keywords: EPR dose, workers, uranium mines, tooth samples
Procedia PDF Downloads 412
2963 A Method to Determine Cutting Force Coefficients in Turning Using Mechanistic Approach
Authors: T. C. Bera, A. Bansal, D. Nema
Abstract:
During turning operations, the cutting force plays a significant role in the metal cutting process, affecting tool-workpiece deflection, vibration and, eventually, part quality. The present research work aims to develop a mechanistic cutting force model and to study the mechanistic constants used in the force model for turning operations. The proposed model can be used for reliable and accurate estimation of the cutting forces by establishing the relationship of the force components (cutting force and feed force) with the uncut chip thickness. Accurate estimation of the cutting force is required to improve thin-walled part accuracy by controlling the surface errors induced by tool-workpiece deflection and the tool-workpiece vibration.
Keywords: turning, cutting forces, cutting constants, uncut chip thickness
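In a mechanistic force model the force components are typically written as linear functions of the uncut chip thickness (a shear term plus an edge term), so the constants can be identified by a straight-line fit to measured forces; the synthetic measurements and coefficient names below are assumptions standing in for real turning data, not the paper's calibration.

```python
import numpy as np

# Synthetic "measured" cutting forces at several uncut chip thicknesses (illustrative, in N).
h = np.array([0.05, 0.10, 0.15, 0.20, 0.25])                    # uncut chip thickness (mm)
b = 2.0                                                          # chip width / depth of cut (mm)
Fc_measured = np.array([310.0, 585.0, 880.0, 1150.0, 1430.0])    # cutting (tangential) force

# Mechanistic model Fc = Ktc * b * h + Kte * b  ->  linear in h.
slope, intercept = np.polyfit(h, Fc_measured, 1)
Ktc = slope / b        # shear-specific cutting coefficient (N/mm^2)
Kte = intercept / b    # edge coefficient (N/mm)
print(f"Ktc = {Ktc:.0f} N/mm^2, Kte = {Kte:.1f} N/mm")
```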
Procedia PDF Downloads 523
2962 Use of Dendrochronology in Estimation of Creep Velocity and Its Dependence on the Bulk Density of Soils
Authors: Mohammad Amjad Sabir, Ishtiaq Khan, Shahid Ali, Umar Shabbir, Aneel Ahmad
Abstract:
Creep, the main silt contributor to rivers, is a slow, downhill flow of soils. Creep velocity is measured in millimeters to a couple of centimeters per year and is conventionally determined from the tilt that creep causes in vertical objects, which requires at least ten years of observation to obtain a reliable value. This project was devised to calculate creep velocity using dendrochronology and to look for differences in the creep velocity registered by different trees on the same slope. It was concluded that dendrochronology provides a very reliable procedure for creep velocity estimation if 'J'-shaped trees are studied for their horizontal movement and age. The age of these trees was measured by tree coring, and the horizontal movement was measured with a conventional tape. This procedure does not require decades of observation, and the data reveal the creep velocity for up to 150 years or more instead of just a decade. It was also concluded that creep velocity does not depend solely on the bulk density of the soil; indeed, no pronounced effect of bulk density was detected.
Keywords: creep velocity, Galiyat, Pakistan, dendrochronology, Nagri Bala
Procedia PDF Downloads 315
2961 A Generalized Family of Estimators for Estimation of Unknown Population Variance in Simple Random Sampling
Authors: Saba Riaz, Syed A. Hussain
Abstract:
This paper addresses the estimation of the unknown population variance of the variable of interest. A new generalized class of estimators of the finite population variance is suggested using auxiliary information. To improve the precision of the proposed class, the known population variance of the auxiliary variable is used. Mathematical expressions for the biases and the asymptotic variances of the suggested class are derived under a large sample approximation. Theoretical and numerical comparisons are made to investigate the performance of the proposed class of estimators. The empirical study reveals that the suggested class of estimators performs better than the usual estimator, the classical ratio estimator, the classical product estimator and the classical linear regression estimator. It has also been found that the suggested class of estimators is more efficient than some recently published estimators.
Keywords: study variable, auxiliary variable, finite population variance, bias, asymptotic variance, percent relative efficiency
Procedia PDF Downloads 226
2960 Detection of Concrete Reinforcement Damage Using Piezoelectric Materials: Analytical and Experimental Study
Authors: C. P. Providakis, G. M. Angeli, M. J. Favvata, N. A. Papadopoulos, C. E. Chalioris, C. G. Karayannis
Abstract:
An effort for the detection of damage in the reinforcement bars of reinforced concrete members using PZTs is presented. The damage can be the result of excessive elongation of the steel bar due to steel yielding or due to local steel corrosion. In both cases, the damage is simulated by considering a reduced diameter of the rebar along the damaged part of its length. An integration approach based on both the electromechanical admittance methodology and the guided wave propagation technique is used to evaluate the artificial damage on the examined longitudinal steel bar. Two actuator PZTs and a sensor PZT are considered to be bonded on the examined steel bar. The admittance of the sensor PZT is calculated using COMSOL 3.4a, and the Fast Fourier Transform is employed for a better evaluation of the results. The damage detection is quantified using the root mean square deviation (RMSD) between the healthy condition and the damaged state of the sensor PZT. The numerical value of the RMSD yields a level for the difference between the healthy and the damaged admittance computation, indicating in this way the presence of damage in the structure. Experimental measurements are also presented.
Keywords: concrete reinforcement, damage detection, electromechanical admittance, experimental measurements, finite element method, guided waves, PZT
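The RMSD damage index mentioned in the abstract compares the admittance signatures of the healthy and damaged states over a frequency sweep; the two signatures below are synthetic stand-ins for the COMSOL output, used only to show the metric.

```python
import numpy as np

def rmsd_percent(healthy, damaged):
    """Root-mean-square deviation (%) between baseline and damaged admittance signatures."""
    healthy, damaged = np.asarray(healthy, float), np.asarray(damaged, float)
    return 100.0 * np.sqrt(np.sum((damaged - healthy) ** 2) / np.sum(healthy ** 2))

freq = np.linspace(10e3, 100e3, 200)                     # Hz, illustrative frequency sweep
baseline = 1e-4 + 2e-5 * np.sin(freq / 7e3)              # healthy-state signature (synthetic)
damaged = baseline * 1.03 + 4e-6 * np.sin(freq / 5e3)    # shifted signature after "damage"
print(f"RMSD = {rmsd_percent(baseline, damaged):.2f} %")
```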
Procedia PDF Downloads 255
2959 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, attracts more and more attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. Since model transformation has become widely used, a new requirement has emerged: to define the transformation process effectively and efficiently and to reduce the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons, and focuses particularly on the granularity issue that exists in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measures are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, so that manual effort is replaced in this way.
Keywords: automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons
Procedia PDF Downloads 398
2957 Shark Resources in the Iranian Waters of the Persian Gulf
Authors: Nassir Niamaimandi, Mehrdad Hosaini Shabankareh
Abstract:
This study analyzed the annual catch and trawl survey data on sharks in the northern part of the Persian Gulf (26˚ 30΄ to 30˚ 00΄N and 49˚ 00΄ to 56˚ 00΄E) from 2004 to 2009. The trawl survey was conducted by the research vessel Ferdous, equipped with a bottom trawl net with mesh sizes of 400 mm at the body and 80 mm at the cod-end. Ten strata were selected in the study area, and 199 stations were randomly trawled. The density (CPUA) of shark resources was estimated using the swept area method. The annual total catch was obtained from the Iranian fisheries organization (Shilat). The catch per unit area declined from 250.7 kg/nm2 in 2004 to 49.7 kg/nm2 in 2009. There was a high degree of variability of CPUA among the different areas, and the maximum, 1870.8 kg/nm2, was estimated in Nayband and Mogham. In the catch composition data, sharks show a decreasing trend from 4.2% in 2004 to 2.9% in 2009, an average annual decline of 1.3% during 2004-2009. These results suggest that the shark resource is overexploited and that the current effort is far higher than the effort required to harvest optimum yields.
Keywords: shark resources, Iranian waters, Persian Gulf, trawl survey
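The swept-area density calculation behind the CPUA figures reduces to the ratio of catch to the area covered by the trawl; the haul values below are illustrative, and the wing-spread fraction is an assumed constant rather than the study's value.

```python
def cpua_kg_per_nm2(catch_kg, tow_distance_nm, headrope_length_m, wing_fraction=0.5):
    """Swept-area density: catch divided by (tow distance * effective trawl width)."""
    headrope_nm = headrope_length_m / 1852.0            # metres to nautical miles
    swept_area_nm2 = tow_distance_nm * headrope_nm * wing_fraction
    return catch_kg / swept_area_nm2

# Illustrative haul: 25 kg of shark over a 3 nm tow with a 40 m headrope.
print(round(cpua_kg_per_nm2(25.0, 3.0, 40.0), 1), "kg/nm^2")
```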
Procedia PDF Downloads 561
2956 Parameter Estimation with Uncertainty and Sensitivity Analysis for the SARS Outbreak in Hong Kong
Authors: Afia Naheed, Manmohan Singh, David Lucy
Abstract:
This work is based on a mathematical as well as statistical study of an SEIJTR deterministic model for the interpretation of the transmission of severe acute respiratory syndrome (SARS). Based on the SARS epidemic in 2003, the parameters are estimated using Runge-Kutta (Dormand-Prince pairs) and least squares methods. Possible graphical and numerical techniques are used to validate the estimates. Then the effect of the model parameters on the dynamics of the disease is examined using sensitivity and uncertainty analysis. Sensitivity and uncertainty analysis techniques are used to analyze the effect of the uncertainty in the obtained parameter estimates and to determine which parameters have the largest impact on controlling the disease dynamics.
Keywords: infectious disease, severe acute respiratory syndrome (SARS), parameter estimation, sensitivity analysis, uncertainty analysis, Runge-Kutta methods, Levenberg-Marquardt method
Procedia PDF Downloads 361
2955 Evaluating Accuracy of Foetal Weight Estimation by Clinicians in Christian Medical College Hospital, India and Its Correlation to Actual Birth Weight: A Clinical Audit
Authors: Aarati Susan Mathew, Radhika Narendra Patel, Jiji Mathew
Abstract:
A retrospective study was conducted at Christian Medical College (CMC) Teaching Hospital, Vellore, India, on 14 August 2014 to assess the accuracy of clinically estimated foetal weight upon labour admission. Estimating foetal weight is a crucial factor in assessing maternal and foetal complications during and after labour. The medical notes of ninety-eight postnatal women who fulfilled the inclusion criteria were studied to evaluate the correlation between their recorded estimated foetal weight (EFW) on admission and the actual birth weight (ABW) of the newborn after delivery. Data concerning maternal and foetal demographics were also noted. Accuracy was determined by the absolute percentage error and the proportion of estimates within 10% of ABW. Actual birth weights ranged from 950-4080 g. A strong positive correlation between EFW and ABW (r=0.904) was noted. Term deliveries (≥40 weeks) in the normal weight range (2500-4000 g) had a 59.5% estimation accuracy (n=74), compared to an estimation accuracy of 0% for pre-term deliveries (<40 weeks) (n=2). Among the term deliveries, macrosomic babies (>4000 g) were underestimated by 25% (n=3) and low birthweight (LBW) babies were overestimated by 12.7% (n=9). The registrars who estimated foetal weight were accurate for babies within the normal weight range; however, there needs to be an improvement in predicting the weight of macrosomic and LBW foetuses. We have suggested the use of an amended version of Johnson's formula for the Indian population and the need to re-audit once it is implemented.
Keywords: clinical palpation, estimated foetal weight, pregnancy, India, Johnson's formula
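Johnson's formula, mentioned in the conclusion, estimates fetal weight from the symphysiofundal height; the sketch below uses the common textbook form with illustrative inputs, and the constants (155, and n = 11 or 12 depending on head engagement) are assumed from that textbook form rather than taken from this audit. The amended constants proposed for the Indian population are not part of the sketch.

```python
def johnson_efw_grams(fundal_height_cm: float, head_engaged: bool) -> float:
    """Textbook Johnson's formula: EFW (g) = (fundal height in cm - n) * 155,
    with n = 11 if the fetal head is engaged and n = 12 if it is not (assumed constants)."""
    n = 11 if head_engaged else 12
    return (fundal_height_cm - n) * 155.0

print(johnson_efw_grams(34.0, head_engaged=True))    # 3565.0 g
print(johnson_efw_grams(34.0, head_engaged=False))   # 3410.0 g
```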
Procedia PDF Downloads 364
2954 Copula Markov Switching Multifractal Models for Forecasting Value-at-Risk
Authors: Giriraj Achari, Malay Bhattacharyya
Abstract:
In this paper, the effectiveness of Copula Markov Switching Multifractal (MSM) models at forecasting the Value-at-Risk of a two-stock portfolio is studied. The innovations are allowed to be drawn from distributions that can capture skewness and leptokurtosis, which are well documented empirical characteristics of financial returns. The candidate distributions considered for this purpose are the Johnson-SU, Pearson Type-IV and α-Stable distributions. The two univariate marginal distributions are combined using the Student-t copula. The estimation of all parameters is performed by maximum likelihood estimation. Finally, the models are compared in terms of accurate Value-at-Risk (VaR) forecasts using tests of unconditional coverage and independence. It is found that Copula-MSM models with leptokurtic innovation distributions perform slightly better than the Copula-MSM model with Normal innovations. Copula-MSM models, in general, produce better VaR forecasts than traditional methods like the Historical Simulation method, the Variance-Covariance approach and Copula-Generalized Autoregressive Conditional Heteroscedasticity (Copula-GARCH) models.
Keywords: copula, Markov switching, multifractal, value-at-risk
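The unconditional coverage backtest the abstract refers to is commonly Kupiec's proportion-of-failures likelihood-ratio test; the sketch below shows that test with an illustrative exception count and sample size, not the paper's backtest results, and the test of independence is omitted.

```python
import math
from scipy.stats import chi2

def kupiec_pof(n_obs: int, n_exceptions: int, var_level: float = 0.01):
    """Kupiec unconditional coverage LR test; returns the LR statistic and its p-value (chi-square, 1 df)."""
    p, x, n = var_level, n_exceptions, n_obs
    phat = x / n
    log_l0 = (n - x) * math.log(1 - p) + x * math.log(p)          # likelihood under the nominal level
    log_l1 = (n - x) * math.log(1 - phat) + x * math.log(phat)    # likelihood under the observed rate
    lr = -2.0 * (log_l0 - log_l1)
    return lr, 1.0 - chi2.cdf(lr, df=1)

# Illustrative backtest: 250 trading days, 5 VaR exceptions at the 1% level.
lr, pval = kupiec_pof(250, 5, 0.01)
print(f"LR = {lr:.3f}, p-value = {pval:.3f}")
```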
Procedia PDF Downloads 165
2953 The Estimation Method of Stress Distribution for Beam Structures Using the Terrestrial Laser Scanning
Authors: Sang Wook Park, Jun Su Park, Byung Kwan Oh, Yousok Kim, Hyo Seon Park
Abstract:
This study suggests a method for estimating the stress distribution of beam structures based on TLS (Terrestrial Laser Scanning). The main components of the method are the creation of lattices from the TLS raw data, so as to satisfy suitable conditions, and the application of CSSI (Cubic Smoothing Spline Interpolation) for estimating the stress distribution. Estimation of the stress distribution for a structural member or a whole structure is one of the important factors in the safety evaluation of a structure. Existing sensors, such as the ESG (electric strain gauge) and LVDT (Linear Variable Differential Transformer), can be categorized as contact-type sensors which must be installed on the structural members; they also have various limitations, such as the need for separate space for network cables and the difficulty of access for sensor installation in real buildings. To overcome these problems inherent in contact-type sensors, the TLS system, a form of LiDAR (light detection and ranging) which can measure the displacement of a target at long range without the influence of the surrounding environment and can also capture the whole shape of the structure, has been applied to the field of structural health monitoring. The important characteristic of TLS measurement is the formation of point clouds, which consist of many points with local coordinates. Point clouds are not linearly distributed but dispersed, so interpolation is vital for their analysis. Through the formation of averaged lattices and CSSI on the raw data, a method that can estimate the displacement of a simple beam was developed. The developed method can be extended to calculate the strain and, finally, is applicable to estimating the stress distribution of a structural member. To verify the validity of the method, a loading test on a simple beam was conducted and measured with TLS. Through a comparison of the estimated stress and the reference stress, the validity of the method is confirmed.
Keywords: structural health monitoring, terrestrial laser scanning, estimation of stress distribution, coordinate transformation, cubic smoothing spline interpolation
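Once the point cloud has been reduced to a deflection profile, a cubic smoothing spline gives the curvature, from which the bending stress at the extreme fibre follows from Euler-Bernoulli theory; the deflection data, elastic modulus and section depth below are placeholders, not the scanned beam of the paper.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Averaged lattice of deflections along the beam (illustrative, metres).
x = np.linspace(0.0, 4.0, 21)                   # positions along a 4 m beam
w = -0.004 * np.sin(np.pi * x / 4.0)            # synthetic deflected shape (max 4 mm)

spline = UnivariateSpline(x, w, k=3, s=1e-9)    # cubic smoothing spline fit
curvature = spline.derivative(n=2)(x)           # w''(x), small-deflection curvature

E = 200e9        # Pa, assumed elastic modulus (steel)
c = 0.15         # m, assumed distance from neutral axis to extreme fibre
stress = E * c * np.abs(curvature)              # Euler-Bernoulli bending stress sigma = E*c*|w''|

print(f"Peak estimated bending stress: {stress.max() / 1e6:.1f} MPa")
```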
Procedia PDF Downloads 433
2952 Controller Design and Experimental Evaluation of a Motorized Assistance for a Patient Transfer Floor Lift
Authors: Donatien Callon, Ian Lalonde, Mathieu Nadeau, Alexandre Girard
Abstract:
Patient transfer is a challenging, critical task because it exposes caregivers to injury risks. Available transfer devices, like floor lifts, lead to improvements but are far from perfect: they do not eliminate the caregivers' risk of musculoskeletal disorders, and they can be burdensome to use due to their poor maneuverability. This paper presents a new motorized floor lift with a single central motorized wheel connected to an instrumented handle. Admittance controllers are designed to 1) improve the device's maneuverability, 2) reduce the required caregiver effort, and 3) ensure the security and comfort of patients. Two controller designs, one with a linear admittance law and one with a non-linear admittance law with variable damping, were developed and implemented on a prototype. Tests were performed on seven participants to evaluate the performance of the assistance system and the controllers. The experimental results show that the motorized assistance with the variable damping controller 1) improves maneuverability by 28%, 2) reduces the amount of effort required to push the lift by 66%, and 3) provides the same level of patient comfort compared to a standard unassisted floor lift.
Keywords: floor lift, human robot interaction, admittance controller, variable admittance
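A minimal discrete-time admittance law of the kind described: the handle force commands a wheel velocity through a virtual mass and damping, with the damping optionally scheduled on speed to mimic the variable-damping variant. The gains, time step and force profile are illustrative assumptions, not the prototype's tuning.

```python
def admittance_step(v, f_handle, dt=0.01, mass=30.0, b0=25.0, b1=40.0, variable=True):
    """One update of the admittance law  M*dv/dt + B(v)*v = f_handle  for the motorized wheel."""
    damping = b0 + b1 * abs(v) if variable else b0    # variable damping grows with speed
    dv = (f_handle - damping * v) / mass
    return v + dv * dt

# Simulate a caregiver pushing with 40 N for 2 s, then releasing.
v, trace = 0.0, []
for k in range(400):
    force = 40.0 if k < 200 else 0.0
    v = admittance_step(v, force)
    trace.append(v)
print(f"speed after push: {trace[199]:.2f} m/s, 2 s after release: {trace[-1]:.2f} m/s")
```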
Procedia PDF Downloads 113
2951 Age Estimation from Upper Anterior Teeth by Pulp/Tooth Ratio Using Peri-Apical X-Rays among Egyptians
Authors: Fatma Mohamed Magdy Badr El Dine, Amr Mohamed Abd Allah
Abstract:
Introduction: Age estimation of individuals is one of the crucial steps in forensic practice. Traditional methods rely on the length of the diaphysis of the long bones of the limbs, epiphyseal-diaphyseal union, fusion of the primary ossification centers, as well as dental eruption. However, there is a growing need for the development of precise and reliable methods to estimate age, especially in cases where dismembered corpses, burnt bodies, putrefied or fragmented parts are recovered. Teeth are the hardest and most indestructible structures in the human body. In recent years, the assessment of the pulp/tooth area ratio, as an indirect quantification of secondary dentine deposition, has received considerable attention. However, scanty work has been done in Egypt on the applicability of the pulp/tooth ratio for age estimation. Aim of the Work: The present work was designed to assess Cameriere's method for age estimation from the pulp/tooth ratio of maxillary canines, central and lateral incisors in a sample of the Egyptian population, and to formulate regression equations to be used as population-based standards for age determination. Material and Methods: The present study was conducted on 270 peri-apical X-rays of maxillary canines, central and lateral incisors (collected from 131 males and 139 females aged between 19 and 52 years). The pulp and tooth areas were measured using the Adobe Photoshop software, and the pulp/tooth area ratio was computed. Linear regression equations were determined separately for canines, central and lateral incisors. Results: A significant correlation was recorded between the pulp/tooth area ratio and chronological age. The linear regression analysis revealed coefficients of determination of R² = 0.824 for canine, 0.588 for central incisor and 0.737 for lateral incisor teeth. Three regression equations were derived. Conclusion: In conclusion, the pulp/tooth ratio is a useful technique for estimating age among Egyptians. Additionally, the regression equation derived from canines gave better results than those from the incisors.
Keywords: age determination, canines, central incisors, Egypt, lateral incisors, pulp/tooth ratio
Procedia PDF Downloads 184
2950 Sentiment Classification of Documents
Authors: Swarnadip Ghosh
Abstract:
Sentiment analysis is the process of detecting the contextual polarity of text; in other words, it determines whether a piece of writing is positive, negative or neutral. Sentiment analysis of documents holds great importance in today's world, when vast amounts of information are stored in databases and on the world wide web. An efficient algorithm to elicit such information would be beneficial for social, economic as well as medical purposes. In this project, we have developed an algorithm to classify a document as positive or negative. Using our algorithm, we obtained a feature set from the data and classified the documents based on this feature set. It is important to note that, in the classification, we have not used the independence assumption relied on by many procedures such as Naive Bayes; this makes the algorithm more general in scope. Moreover, because of the sparsity and high dimensionality of such data, we did not use the empirical distribution for estimation, but developed a method based on the degree of close clustering of the data points. We applied our algorithm to a movie review data set obtained from IMDb and obtained satisfactory results.
Keywords: sentiment, runs test, cross validation, higher dimensional pmf estimation
Procedia PDF Downloads 404