Search results for: dental age estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2215

1705 Use of Dendrochronology in Estimation of Creep Velocity and Its Dependence on the Bulk Density of Soils

Authors: Mohammad Amjad Sabir, Ishtiaq Khan, Shahid Ali, Umar Shabbir, Aneel Ahmad

Abstract:

Creep, the main contributor of silt to rivers, is a slow, downhill flow of soil. Creep velocity is on the order of millimetres to a few centimetres per year; it is conventionally determined from the tilt that creep induces in vertical objects, which requires at least ten years of observation to yield a reliable value. This project was devised to calculate creep velocity using dendrochronology and to look for differences in the creep velocity registered by different trees on the same slope. It was concluded that dendrochronology provides a very reliable procedure for estimating creep velocity if ‘J’-shaped trees are studied for their horizontal movement and age. The age of these trees was measured by tree coring, and the horizontal movement was measured with a conventional tape. This procedure does not require decades of observation, and the data reveal the creep velocity over periods of up to 150 years or more instead of just a decade. It was also concluded that creep velocity does not depend solely on the bulk density of the soil; indeed, no pronounced effect of bulk density was detected.
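
As a rough, hypothetical illustration of the calculation implied above (the figures below are not taken from the paper), the creep velocity follows directly from the tree age obtained by coring and the measured horizontal displacement:

```python
def creep_velocity_mm_per_year(horizontal_movement_mm: float, tree_age_years: float) -> float:
    """Creep velocity from a 'J'-shaped tree: horizontal displacement divided by tree age."""
    return horizontal_movement_mm / tree_age_years

# e.g. a tree cored at 120 years whose trunk base has moved 540 mm downslope
print(creep_velocity_mm_per_year(540.0, 120.0))   # 4.5 mm/year
```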

Keywords: creep velocity, Galiyat, Pakistan, dendrochronology, Nagri Bala

Procedia PDF Downloads 298
1704 A Generalized Family of Estimators for Estimation of Unknown Population Variance in Simple Random Sampling

Authors: Saba Riaz, Syed A. Hussain

Abstract:

This paper addresses the estimation of the unknown population variance of the variable of interest. A new generalized class of estimators of the finite population variance is suggested using auxiliary information. To improve the precision of the proposed class, the known population variance of the auxiliary variable is used. Mathematical expressions for the biases and the asymptotic variances of the suggested class are derived under a large-sample approximation. Theoretical and numerical comparisons are made to investigate the performance of the proposed class of estimators. The empirical study reveals that the suggested class of estimators performs better than the usual estimator, the classical ratio estimator, the classical product estimator and the classical linear regression estimator. The suggested class is also found to be more efficient than some recently published estimators.
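
For context, a minimal sketch (with simulated data, not the paper's) of one standard form of the classical ratio estimator of the finite population variance mentioned above, which rescales the sample variance of the study variable by the ratio of the known to the sampled variance of the auxiliary variable:

```python
import numpy as np

def ratio_variance_estimator(y_sample, x_sample, Sx2_known):
    """Classical ratio-type estimator of the population variance of y using auxiliary x."""
    sy2 = np.var(y_sample, ddof=1)
    sx2 = np.var(x_sample, ddof=1)
    return sy2 * (Sx2_known / sx2)

rng = np.random.default_rng(1)
x = rng.normal(50, 10, size=5000)            # auxiliary variable (whole population)
y = 2.0 * x + rng.normal(0, 5, size=5000)    # study variable, correlated with x
idx = rng.choice(5000, size=200, replace=False)   # simple random sample
print("usual estimator :", np.var(y[idx], ddof=1))
print("ratio estimator :", ratio_variance_estimator(y[idx], x[idx], np.var(x, ddof=1)))
print("true variance   :", np.var(y, ddof=1))
```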

Keywords: study variable, auxiliary variable, finite population variance, bias, asymptotic variance, percent relative efficiency

Procedia PDF Downloads 212
1703 Parameter Estimation with Uncertainty and Sensitivity Analysis for the SARS Outbreak in Hong Kong

Authors: Afia Naheed, Manmohan Singh, David Lucy

Abstract:

This work presents a mathematical and statistical study of an SEIJTR deterministic model for interpreting the transmission of severe acute respiratory syndrome (SARS). Based on the 2003 SARS epidemic, the parameters are estimated using Runge-Kutta (Dormand-Prince pairs) and least-squares methods. Graphical and numerical techniques are used to validate the estimates. The effect of the model parameters on the dynamics of the disease is then examined using sensitivity and uncertainty analysis. These analytical techniques are used to analyse the effect of the uncertainty in the obtained parameter estimates and to determine which parameters have the largest impact on controlling the disease dynamics.
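
As a hedged illustration of the estimation workflow (Runge-Kutta integration inside a least-squares fit), the sketch below fits a simplified SEIR model to synthetic case counts; the paper's full SEIJTR compartments, parameter values and data are not reproduced here:

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def seir(t, y, beta, sigma, gamma):
    S, E, I, R = y
    N = S + E + I + R
    return [-beta * S * I / N,
            beta * S * I / N - sigma * E,
            sigma * E - gamma * I,
            gamma * I]

def residuals(params, t_obs, i_obs, y0):
    sol = solve_ivp(seir, (t_obs[0], t_obs[-1]), y0, args=tuple(params),
                    t_eval=t_obs, method="RK45")   # RK45 uses a Dormand-Prince pair
    return sol.y[2] - i_obs                        # misfit of the infective curve

t_obs = np.arange(0.0, 60.0)                       # days (synthetic example)
y0 = [6.9e6, 20.0, 10.0, 0.0]                      # assumed initial compartments
truth = (0.40, 1 / 5, 1 / 7)                       # beta, sigma, gamma used to fake the data
i_true = solve_ivp(seir, (0.0, 59.0), y0, args=truth, t_eval=t_obs).y[2]
i_obs = i_true + np.random.default_rng(0).normal(0.0, 5.0, i_true.size)

fit = least_squares(residuals, x0=[0.3, 0.25, 0.10], args=(t_obs, i_obs, y0),
                    bounds=(1e-6, 5.0))
print("estimated (beta, sigma, gamma):", fit.x)
```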

Keywords: infectious disease, severe acute respiratory syndrome (SARS), parameter estimation, sensitivity analysis, uncertainty analysis, Runge-Kutta methods, Levenberg-Marquardt method

Procedia PDF Downloads 347
1702 Evaluating Accuracy of Foetal Weight Estimation by Clinicians in Christian Medical College Hospital, India and Its Correlation to Actual Birth Weight: A Clinical Audit

Authors: Aarati Susan Mathew, Radhika Narendra Patel, Jiji Mathew

Abstract:

A retrospective study was conducted at Christian Medical College (CMC) Teaching Hospital, Vellore, India on 14th August 2014 to assess the accuracy of clinically estimated foetal weight upon labour admission. Estimating foetal weight is a crucial factor in assessing maternal and foetal complications during and after labour. The medical notes of ninety-eight postnatal women who fulfilled the inclusion criteria were studied to evaluate the correlation between the Estimated Foetal Weight (EFW) recorded on admission and the actual birth weight (ABW) of the newborn after delivery. Data concerning maternal and foetal demographics were also noted. Accuracy was determined by the absolute percentage error and the proportion of estimates within 10% of ABW. Actual birth weights ranged from 950-4080 g. A strong positive correlation between EFW and ABW (r=0.904) was noted. Term deliveries (≥40 weeks) in the normal weight range (2500-4000 g) had a 59.5% estimation accuracy (n=74) compared with pre-term deliveries (<40 weeks), which had an estimation accuracy of 0% (n=2). Among the term deliveries, macrosomic babies (>4000 g) were underestimated by 25% (n=3) and low birthweight (LBW) babies were overestimated by 12.7% (n=9). Registrars who estimated foetal weight were accurate for babies within the normal weight range; however, there needs to be an improvement in predicting the weight of macrosomic and LBW foetuses. We have suggested the use of an amended version of Johnson’s formula for the Indian population and a need to re-audit once it is implemented.
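
A brief sketch of the accuracy metrics used above, together with the commonly cited form of Johnson's formula (EFW in grams ≈ (symphysio-fundal height in cm − n) × 155, with n depending on station). The formula constants and the sample values below are assumptions for illustration, not figures from the audit or the amended Indian-population version:

```python
import numpy as np

def johnsons_efw(fundal_height_cm, vertex_engaged):
    """Commonly cited Johnson's formula: EFW (g) = (SFH - n) * 155, n = 11 if engaged else 12."""
    n = 11 if vertex_engaged else 12
    return (fundal_height_cm - n) * 155.0

efw = np.array([3100.0, 2600.0, 3900.0])   # clinical estimates (g), illustrative only
abw = np.array([3250.0, 2480.0, 4150.0])   # actual birth weights (g), illustrative only
ape = np.abs(efw - abw) / abw * 100.0      # absolute percentage error
print("mean APE             :", round(ape.mean(), 1), "%")
print("within 10% of ABW    :", round(np.mean(ape <= 10.0) * 100.0, 1), "%")
print("Johnson's EFW (34 cm):", johnsons_efw(34, vertex_engaged=False), "g")
```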

Keywords: clinical palpation, estimated foetal weight, pregnancy, India, Johnson’s formula

Procedia PDF Downloads 356
1701 Copula Markov Switching Multifractal Models for Forecasting Value-at-Risk

Authors: Giriraj Achari, Malay Bhattacharyya

Abstract:

In this paper, the effectiveness of Copula Markov Switching Multifractal (MSM) models for forecasting the Value-at-Risk of a two-stock portfolio is studied. The innovations are allowed to be drawn from distributions that can capture skewness and leptokurtosis, which are well documented empirical characteristics of financial returns. The candidate distributions considered for this purpose are the Johnson-SU, Pearson Type-IV and α-Stable distributions. The two univariate marginal distributions are combined using the Student-t copula. All parameters are estimated by maximum likelihood. Finally, the models are compared in terms of accurate Value-at-Risk (VaR) forecasts using tests of unconditional coverage and independence. It is found that Copula-MSM models with leptokurtic innovation distributions perform slightly better than the Copula-MSM model with Normal innovations. Copula-MSM models, in general, produce better VaR forecasts than traditional methods such as the Historical Simulation method, the Variance-Covariance approach and Copula-Generalized Autoregressive Conditional Heteroscedasticity (Copula-GARCH) models.
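
As a side note on the backtesting step, here is a minimal sketch of Kupiec's unconditional coverage likelihood-ratio test, one of the tests named above; the violation counts are made up for illustration:

```python
import numpy as np
from scipy.stats import chi2

def kupiec_uc_test(n_obs, n_violations, p):
    """Kupiec unconditional coverage test for a VaR exceedance probability p."""
    x, T = n_violations, n_obs
    pi_hat = x / T
    log_lik_null = (T - x) * np.log(1 - p) + x * np.log(p)          # violations ~ Bernoulli(p)
    log_lik_alt = (T - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)  # observed violation rate
    lr = -2.0 * (log_lik_null - log_lik_alt)
    return lr, chi2.sf(lr, df=1)   # statistic and p-value (1 degree of freedom)

print(kupiec_uc_test(n_obs=1000, n_violations=17, p=0.01))
```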

Keywords: Copula, Markov Switching, multifractal, value-at-risk

Procedia PDF Downloads 155
1700 The Estimation Method of Stress Distribution for Beam Structures Using the Terrestrial Laser Scanning

Authors: Sang Wook Park, Jun Su Park, Byung Kwan Oh, Yousok Kim, Hyo Seon Park

Abstract:

This study suggests a method for estimating the stress distribution of beam structures based on TLS (Terrestrial Laser Scanning). The main components of the method are the creation of lattices of averaged raw TLS data that satisfy a suitable condition, and the application of CSSI (Cubic Smoothing Spline Interpolation) for estimating the stress distribution. Estimating the stress distribution of a structural member or of the whole structure is an important factor in the safety evaluation of the structure. Existing sensors, such as the ESG (electrical strain gauge) and LVDT (Linear Variable Differential Transformer), are contact-type sensors that must be installed on the structural members; they also involve various limitations, such as the need for separate space to install the network cables and the difficulty of access for sensor installation in real buildings. To overcome these problems inherent in contact-type sensors, the TLS form of LiDAR (light detection and ranging), which can measure the displacement of a target over a long range without being influenced by the surrounding environment and can also capture the whole shape of the structure, has been applied to the field of structural health monitoring. An important characteristic of TLS measurement is that it produces point clouds containing many points with local coordinates. A point cloud is not a linear distribution but a dispersed shape; thus, interpolation is essential for its analysis. Through the formation of averaged lattices and CSSI applied to the raw data, a method that can estimate the displacement of a simple beam was developed. The developed method can be extended to calculate the strain and, finally, to estimate the stress distribution of a structural member. To verify the validity of the method, a loading test on a simple beam was conducted and measured with TLS. The validity of the method is confirmed through a comparison of the estimated stress with the reference stress.
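
A minimal sketch of the displacement-to-stress chain implied above, assuming Euler-Bernoulli beam behaviour (strain = −c × curvature, stress = E × strain) and a synthetic half-sine deflection shape in place of real TLS lattice data; the spline parameters and material values are illustrative assumptions, not the paper's:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

L = 6.0                                        # beam length [m]
x = np.linspace(0.0, L, 200)                   # lattice positions along the beam
w = 0.002 * np.sin(np.pi * x / L)              # synthetic deflection shape [m]
w += np.random.default_rng(0).normal(0, 2e-5, x.size)         # TLS-like measurement noise

spline = UnivariateSpline(x, w, k=3, s=x.size * (2e-5) ** 2)   # cubic smoothing spline
curvature = spline.derivative(2)(x)            # w''(x), small-slope approximation

E = 200e9                                      # assumed Young's modulus [Pa]
c = 0.15                                       # assumed distance to extreme fibre [m]
strain = -c * curvature
stress = E * strain                            # bending stress distribution [Pa]
print("peak bending stress ~ %.1f MPa" % (np.abs(stress).max() / 1e6))
```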

Keywords: structural health monitoring, terrestrial laser scanning, estimation of stress distribution, coordinate transformation, cubic smoothing spline interpolation

Procedia PDF Downloads 427
1699 Sentiment Classification of Documents

Authors: Swarnadip Ghosh

Abstract:

Sentiment analysis is the process of detecting the contextual polarity of text; in other words, it determines whether a piece of writing is positive, negative or neutral. Sentiment analysis of documents holds great importance in today's world, when vast amounts of information are stored in databases and on the world wide web. An efficient algorithm to elicit such information would be beneficial for social, economic as well as medical purposes. In this project, we have developed an algorithm to classify a document as positive or negative. Using our algorithm, we obtained a feature set from the data and classified the documents based on this feature set. It is important to note that the classification does not use the independence assumption relied on by many procedures such as Naive Bayes, which makes the algorithm more general in scope. Moreover, because of the sparsity and high dimensionality of such data, we did not use the empirical distribution for estimation but developed a method based on the degree of close clustering of the data points. We applied our algorithm to a movie review data set obtained from IMDb and obtained satisfactory results.

Keywords: sentiment, Run's Test, cross validation, higher dimensional pmf estimation

Procedia PDF Downloads 387
1698 Facial Pose Classification Using Hilbert Space Filling Curve and Multidimensional Scaling

Authors: Mekamı Hayet, Bounoua Nacer, Benabderrahmane Sidahmed, Taleb Ahmed

Abstract:

Pose estimation is an important task in computer vision. Though the majority of existing solutions provide good accuracy, they are often overly complex and computationally expensive. In this perspective, we propose the use of dimensionality reduction techniques to address the problem of facial pose estimation. Firstly, a face image is converted into a one-dimensional time series using the Hilbert space-filling curve; the approach then converts this time series into a symbolic representation. Furthermore, a distance matrix is calculated between the symbolic series of an input learning dataset of images to generate classifiers of frontal vs. profile face pose. The proposed method is evaluated on three public datasets. Experimental results have shown that our approach is able to achieve a correct classification rate exceeding 97% with the K-NN algorithm.
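
For reference, a small sketch of the first step described above: traversing a square image along the Hilbert space-filling curve to obtain a one-dimensional series. It uses the standard iterative distance-to-(x, y) mapping; the 8×8 random image is a placeholder, not data from the paper:

```python
import numpy as np

def hilbert_d2xy(n, d):
    """Map a distance d along the Hilbert curve to (x, y) on an n x n grid (n a power of 2)."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate/flip the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def image_to_series(img):
    n = img.shape[0]
    return np.array([img[y, x] for x, y in (hilbert_d2xy(n, d) for d in range(n * n))])

face = np.random.default_rng(0).integers(0, 256, size=(8, 8))   # placeholder image
print(image_to_series(face))
```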

Keywords: machine learning, pattern recognition, facial pose classification, time series

Procedia PDF Downloads 339
1697 Rainfall Estimation Using Himawari-8 Meteorological Satellite Imagery in Central Taiwan

Authors: Chiang Wei, Hui-Chung Yeh, Yen-Chang Chen

Abstract:

The objective of this study is to estimate rainfall using the new-generation Himawari-8 meteorological satellite, with its multi-band, high-bit-format and high-spatiotemporal-resolution imagery, together with ground rainfall data at the Chen-Yu-Lan watershed of the Joushuei River Basin (443.6 square kilometers) in Central Taiwan. Accurate and fine-scale rainfall information is essential for rugged terrain with high local variation in order to provide early warning of flood, landslide and debris-flow disasters. Rainfall at 10-minute intervals and 2 km pixel resolution for Typhoon Megi in 2016 and the meiyu event of June 1-4, 2017 was tested to demonstrate that the new-generation Himawari-8 meteorological satellite can capture rainfall variation in the rugged mountainous area both at fine scale and at watershed scale. The results provide valuable rainfall information for early warning of future disasters.

Keywords: estimation, Himawari-8, rainfall, satellite imagery

Procedia PDF Downloads 182
1696 Adaptive Multipath Mitigation Acquisition Approach for Global Positioning System Software Receivers

Authors: Animut Meseret Simachew

Abstract:

The Parallel Code Phase Search Acquisition (PCSA) algorithm has been considered a promising method in GPS software receivers for detecting and estimating the correlation peak between the received Global Positioning System (GPS) signal and locally generated replicas. GPS signal acquisition in highly dense multipath environments is the main research challenge. In this work, we propose a robust variable step-size (RVSS) PCSA algorithm based on a fast Fourier transform (FFT) filtering technique to mitigate short-time-delay multipath signals. Simulation results reveal the effectiveness of the proposed algorithm over the conventional PCSA algorithm. The proposed RVSS-PCSA algorithm equalizes the received carrier wiped-off signal with the locally generated C/A code.
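
For orientation, a compact sketch of the conventional parallel code-phase search that the proposed RVSS variant builds upon: one FFT-based circular correlation per Doppler bin. The signal, replica and bin spacing passed in are placeholders, not values from the paper:

```python
import numpy as np

def pcsa_acquire(signal, ca_code, doppler_bins_hz, fs):
    """Conventional parallel code-phase search: returns (peak power, Doppler, code phase)."""
    n = len(signal)
    t = np.arange(n) / fs
    code_fft_conj = np.conj(np.fft.fft(ca_code, n))
    best = (0.0, None, None)
    for fd in doppler_bins_hz:
        wiped = signal * np.exp(-2j * np.pi * fd * t)           # carrier wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft_conj)) ** 2
        k = int(np.argmax(corr))
        if corr[k] > best[0]:
            best = (float(corr[k]), fd, k)                      # code phase in samples
    return best
```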

Keywords: adaptive PCSA, detection and estimation, GPS signal acquisition, GPS software receiver

Procedia PDF Downloads 107
1695 Vehicle Speed Estimation Using Image Processing

Authors: Prodipta Bhowmik, Poulami Saha, Preety Mehra, Yogesh Soni, Triloki Nath Jha

Abstract:

In India, the smart city concept is growing day by day, and a better traffic management and monitoring system is a very important requirement for smart city development. Road accidents are increasing as more vehicles take to the road, and reckless driving is responsible for a large share of them, so an efficient traffic management system is required on all kinds of roads to control traffic speed; the speed limit varies from road to road. Radar systems have been used previously, but their high cost and limited precision have kept them from becoming the favoured option, and how to solve the problems faced daily by traffic management systems has become a research topic in its own right. This paper proposes a computer vision and machine learning-based automated system for multiple vehicle detection, tracking and speed estimation using image processing. Detecting vehicles and estimating their speed from real-time video is a difficult task, and the objective of this paper is to do so as accurately as possible. A real-time video is first captured, frames are extracted from the video, vehicles are detected in those frames, tracking of the vehicles then begins and, finally, the speed of the moving vehicles is estimated. The goal is to develop a cost-friendly system that is able to detect multiple types of vehicles at the same time.
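
The final step reduces to simple kinematics once a vehicle's centroid is tracked across frames; a minimal sketch follows, where the pixel-to-metre scale factor and frame rate are assumptions for illustration:

```python
import math

def speed_kmh(centroid_prev, centroid_curr, fps, metres_per_pixel):
    """Speed of a tracked vehicle from consecutive centroid positions (in pixels)."""
    dx = centroid_curr[0] - centroid_prev[0]
    dy = centroid_curr[1] - centroid_prev[1]
    metres_per_frame = math.hypot(dx, dy) * metres_per_pixel
    return metres_per_frame * fps * 3.6          # m/s -> km/h

# e.g. a centroid that moves 25 px between frames at 30 fps, with 0.05 m per pixel
print(speed_kmh((400, 220), (415, 240), fps=30, metres_per_pixel=0.05))
```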

Keywords: OpenCV, Haar Cascade classifier, DLIB, YOLOV3, centroid tracker, vehicle detection, vehicle tracking, vehicle speed estimation, computer vision

Procedia PDF Downloads 68
1694 Channel Estimation/Equalization with Adaptive Modulation and Coding over Multipath Faded Channels for WiMAX

Authors: B. Siva Kumar Reddy, B. Lakshmi

Abstract:

WiMAX has adopted Adaptive Modulation and Coding (AMC) in OFDM to support higher data rates and error-free transmission. AMC schemes employ Channel State Information (CSI) to utilize the channel efficiently, maximize throughput and improve spectral efficiency; this CSI is provided to the transmitter by channel estimators. In this paper, LSE (Least Square Error) and MMSE (Minimum Mean Square Error) estimators are suggested, and the BER (Bit Error Rate) performance is analyzed. Channel equalization is also integrated with the AMC-OFDM system and presented with the Constant Modulus Algorithm (CMA) and Least Mean Square (LMS) algorithms, together with an analysis of their convergence rates. Simulation results show that increasing the modulation order improves throughput but also raises the BER; there is a trade-off among modulation size, throughput, BER and spectral efficiency. The results also indicate the necessity of channel estimation and equalization in high data rate systems.
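
For reference, a small sketch of the two pilot-based estimates named above: the LS estimate and a simplified per-tone MMSE shrinkage of it. The unit channel power and uncorrelated-subcarrier assumptions are simplifications of the full MMSE estimator, and all values are synthetic:

```python
import numpy as np

def ls_channel_estimate(y_pilots, x_pilots):
    """Least-squares estimate at pilot subcarriers: H_LS = Y / X."""
    return y_pilots / x_pilots

def mmse_shrink(h_ls, snr_linear):
    """Per-tone LMMSE shrinkage of the LS estimate (unit channel power assumed)."""
    return h_ls * (snr_linear / (snr_linear + 1.0))

rng = np.random.default_rng(0)
x = np.exp(1j * np.pi / 4) * np.ones(8)                          # known pilot symbols
h = (rng.normal(size=8) + 1j * rng.normal(size=8)) / np.sqrt(2)  # true channel taps
y = h * x + 0.1 * (rng.normal(size=8) + 1j * rng.normal(size=8))
h_ls = ls_channel_estimate(y, x)
print("LS error  :", np.abs(h - h_ls).mean())
print("MMSE error:", np.abs(h - mmse_shrink(h_ls, snr_linear=50.0)).mean())
```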

Keywords: AMC, CSI, CMA, OFDM, OFDMA, WiMAX

Procedia PDF Downloads 384
1693 Human Motion Capture: New Innovations in the Field of Computer Vision

Authors: Najm Alotaibi

Abstract:

Human motion capture has become one of the major areas of interest in the field of computer vision. Some of the major application areas that have been evolving rapidly include advanced human interfaces, virtual reality and security/surveillance systems. This study provides a brief overview of the techniques and applications used for markerless human motion capture, which deals with analysing human motion in the form of mathematical formulations. The major contribution of this research is that it classifies the computer-vision-based techniques of human motion capture according to a taxonomy and then breaks them down into four systematically different categories: tracking, initialization, pose estimation and recognition. Detailed descriptions, and descriptions of the relationships among them, are given for the tracking and pose estimation techniques, and the subcategories of each process are further described. The various hypotheses used by researchers in this domain are surveyed and the evolution of these techniques is explained. The survey concludes that most researchers have focused on using mathematical body models for markerless motion capture.

Keywords: human motion capture, computer vision, vision-based, tracking

Procedia PDF Downloads 304
1692 Series Network-Structured Inverse Models of Data Envelopment Analysis: Pitfalls and Solutions

Authors: Zohreh Moghaddas, Morteza Yazdani, Farhad Hosseinzadeh

Abstract:

Nowadays, data envelopment analysis (DEA) models featuring network structures are widely used to evaluate the performance of production systems and activities (Decision-Making Units, DMUs) across diverse fields. By examining the relationships between the internal stages of the network, these models offer valuable insights to managers and decision-makers regarding the performance of each stage and its impact on the overall network. To further empower system decision-makers, the inverse data envelopment analysis (IDEA) model has been introduced. This model allows crucial information to be estimated for parameters while keeping the efficiency score unchanged or improved, enabling analysis of the sensitivity of system inputs or outputs according to managers' preferences. This allows managers to apply their preferences and policies to resources, such as inputs and outputs, and to analyse aspects such as production, resource allocation processes and resource efficiency enhancement within the system; the results obtained can then inform future decisions. The main result of this study is an analysis of the infeasibility and incorrect estimation that may arise in the theory and application of inverse data envelopment analysis models with network structures. To address these pitfalls, novel protocols are proposed that circumvent the shortcomings effectively. Several theoretical and applied problems are then examined and resolved through insightful case studies.

Keywords: inverse models of data envelopment analysis, series network, estimation of inputs and outputs, efficiency, resource allocation, sensitivity analysis, infeasibility

Procedia PDF Downloads 31
1691 Application of UAS in Forest Firefighting for Detecting Ignitions and 3D Fuel Volume Estimation

Authors: Artur Krukowski, Emmanouela Vogiatzaki

Abstract:

The article presents results from the AF3 project “Advanced Forest Fire Fighting”, focused on Unmanned Aircraft Systems (UAS)-based 3D surveillance and 3D area mapping using high-resolution photogrammetric methods from multispectral imaging, also taking advantage of the 3D scanning techniques developed in the SCAN4RECO project. We also present a proprietary embedded sensor system used for the detection of fire ignitions in the forest: a near-infrared scanner whose weight and form factor allow it to be easily deployed on standard commercial micro-UAVs, such as the DJI Inspire or Mavic. Results from real-life pilot trials in Greece, Spain and Israel demonstrated the added value of UAS for precise and reliable detection of forest fires, as well as for high-resolution 3D aerial modelling for accurate quantification of the human resources and equipment required for firefighting.

Keywords: forest wildfires, surveillance, fuel volume estimation, firefighting, ignition detectors, 3D modelling, UAV

Procedia PDF Downloads 133
1690 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models

Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo De Magalhães

Abstract:

This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to the threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. To face this task, the proposed resolution method adopts a smoothing strategy using a special C∞ differentiable class of functions. The final estimation solution is obtained by solving a sequence of differentiable subproblems which gradually approach the original conceptual problem. The use of this technique, called the Hyperbolic Smoothing Method (HSM), makes it possible to apply the most powerful minimization algorithms and allows the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments is presented to illustrate both the reliability and the efficiency of the proposed approach.
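
As a hedged illustration of the smoothing idea: the abstract does not give the authors' exact smoothing function, so the sketch below uses the standard hyperbolic smoothing of max(y, 0), and the reservoir spill threshold is an assumed example of the kind of non-differentiable term found in CRR models:

```python
import numpy as np

def smooth_max0(y, tau):
    """C-infinity approximation of max(y, 0): phi(y, tau) = (y + sqrt(y^2 + tau^2)) / 2."""
    return 0.5 * (y + np.sqrt(y * y + tau * tau))

# Example: a reservoir spill term Q = max(S - S_max, 0), a typical CRR threshold structure
S, S_max = np.linspace(40.0, 60.0, 5), 50.0
for tau in (4.0, 1.0, 0.1):
    print(f"tau={tau:>4}:", np.round(smooth_max0(S - S_max, tau), 3))
# As tau -> 0 the smooth function approaches the non-differentiable threshold.
```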

Keywords: rainfall-runoff models, automatic calibration, hyperbolic smoothing method

Procedia PDF Downloads 139
1689 Estimation Model for Concrete Slump Recovery by Using Superplasticizer

Authors: Chaiyakrit Raoupatham, Ram Hari Dhakal, Chalermchai Wanichlamlert

Abstract:

This paper introduces a practical solution for concrete slump recovery using a type-F chemical admixture (superplasticizer, naphthalene based), in order to solve the problem of concrete becoming unusable because it has lost its slump, which is especially acute in tropical countries with faster slump loss rates. On the other hand, randomly adding superplasticizer to concrete can cause it to segregate. Therefore, this paper also develops an estimation model for calculating the amount of the second dose of superplasticizer needed for slump recovery. The fresh properties of ordinary Portland cement concrete with a volumetric ratio of paste to void between aggregate (paste content) of 1.1-1.3, a water-cement ratio of 0.30 to 0.67 and an initial superplasticizer (naphthalene base) dose of 0.25%-1.6% were tested for initial slump and slump loss every 30 minutes for one and a half hours using the slump cone test. Concretes with slump loss ranging from 10% to 90% were re-dosed and successfully recovered to their initial slump, and the slump after re-dosing was measured with the slump cone test. From the results it was concluded that slump loss was slower for mixes with a high initial dose of superplasticizer, because the added superplasticizer disturbs cement hydration. The required second dose of superplasticizer was affected by two major parameters, the water-cement ratio and the paste content: a lower water-cement ratio and a lower paste content increase the required second dose. The amount of the second dose is also higher as the solid content within the system increases, whether the solids come from cement particles or from aggregate. The data were analysed to derive an equation for estimating the second superplasticizer dose required to recover the slump to its original value.

Keywords: estimation model, second superplasticizer dosage, slump loss, slump recovery

Procedia PDF Downloads 187
1688 Evaluation of the Relation between Serum and Saliva Levels of Sodium and Glucose in Healthy Referred Patients to Tabriz Faculty of Dentistry

Authors: Samaneh Nazemi, Ayla Bahramian, Marzieh Aghazadeh

Abstract:

Saliva is a clear liquid composed of water, electrolytes, glucose, amylase, glycoproteins and antimicrobial enzymes. The presence of a wide range of molecules and proteins in saliva has made this fluid valuable for screening for some diseases as well as for epidemiological studies, and saliva is easier than serum to collect in large populations. Given the importance of sodium and glucose levels in many biological processes, this study investigates the relationship between sodium and glucose levels in salivary and serum samples of healthy individuals referred to Tabriz Dental School. This descriptive-analytical study was performed on 40 healthy individuals referred to the Oral Diseases Department of Tabriz Dental School. Serum and saliva samples were taken from these patients according to standard protocols. Data are presented as mean (standard deviation) and frequency (percentage) for quantitative and qualitative variables. The Pearson test, paired-samples t-test and SPSS 24 software were used to determine the correlation between serum and salivary levels of these biomarkers; p less than 0.05 is considered significant. Of the 40 participants, 14 (35%) were male and 26 (65%) were female. According to the results, the mean salivary sodium (127.53 ml/dl) was lower than the mean serum sodium (141.2725 ml/dl); likewise, the mean salivary glucose (4.55 ml/dl) was lower than the mean serum glucose (89.7575 ml/dl). The paired-samples t-test (p-value<0.05) showed statistically significant differences between mean serum and salivary sodium, as well as between serum and salivary glucose. The Pearson correlation test showed no significant correlation between serum sodium and salivary sodium (p-value>0.05), but a positive correlation between serum glucose and salivary glucose (p-value<0.001). Both serum sodium and glucose were higher than salivary sodium and glucose. In conclusion, this study found no statistically significant relationship between salivary glucose and serum glucose, or between salivary sodium and serum sodium, in healthy individuals; salivary samples may therefore not be suitable for measuring glucose and sodium in these individuals.
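
A small sketch of the statistical comparisons described above (paired-samples t-test and Pearson correlation); the arrays are placeholder values, not the study's data:

```python
import numpy as np
from scipy import stats

serum_glucose  = np.array([92.0, 85.5, 88.1, 94.3, 90.2])   # placeholder values
saliva_glucose = np.array([ 4.8,  4.1,  4.6,  5.0,  4.3])   # placeholder values

t_stat, p_paired = stats.ttest_rel(serum_glucose, saliva_glucose)   # paired-samples t-test
r, p_corr = stats.pearsonr(serum_glucose, saliva_glucose)           # Pearson correlation
print(f"paired t-test: t={t_stat:.2f}, p={p_paired:.4f}")
print(f"Pearson:       r={r:.2f}, p={p_corr:.4f}")
```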

Keywords: glucose, saliva, serum, sodium

Procedia PDF Downloads 237
1687 UWB Channel Estimation Using an Efficient Sub-Nyquist Sampling Scheme

Authors: Yaacoub Tina, Youssef Roua, Radoi Emanuel, Burel Gilles

Abstract:

Recently, low-complexity sub-Nyquist sampling schemes based on the Finite Rate of Innovation (FRI) theory have been introduced to sample parametric signals at minimal rates. The multichannel modulating waveforms (MCMW) scheme is one such efficient approach, in which the received signal is mixed with an appropriate set of arbitrary waveforms, integrated and sampled at rates far below the Nyquist rate. In this paper, the MCMW scheme is adapted to the special case of ultra-wideband (UWB) channel estimation, characterized by dense multipath. First, an appropriate structure, which accounts for the bandpass spectrum of UWB signals, is defined. Then, a novel approach to decrease the number of processing channels and reduce the complexity of this sampling scheme is presented. Finally, the proposed concepts are validated by simulation results, obtained with real filters, in the framework of a coherent Rake receiver.

Keywords: coherent rake receiver, finite rate of innovation, sub-nyquist sampling, ultra wideband

Procedia PDF Downloads 242
1686 Acceleration-Based Motion Model for Visual Simultaneous Localization and Mapping

Authors: Daohong Yang, Xiang Zhang, Lei Li, Wanting Zhou

Abstract:

Visual Simultaneous Localization and Mapping (VSLAM) is a technology that gathers information about the environment for self-positioning and mapping. It is widely used in computer vision, robotics and other fields. Many visual SLAM systems, such as ORB-SLAM3, employ a constant-velocity motion model that provides the initial pose of the current frame to improve the speed and accuracy of feature matching. However, in practice the constant-velocity assumption is often not satisfied, which can lead to a large deviation between the predicted initial pose and the true value and, in turn, to errors in the nonlinear optimization results. This paper therefore proposes a motion model based on acceleration that can be applied to most SLAM systems. To better describe the acceleration of the camera pose, we decouple the pose transformation matrix and compute the rotation and the translation vector separately, with the rotation represented by a rotation vector. We assume that, over a short period of time, the change in angular velocity and the change in translational velocity remain constant, and the initial pose of the current frame is estimated on this basis. In addition, the error of the constant-velocity model is analysed theoretically. Finally, we applied the proposed approach to the ORB-SLAM3 system and evaluated two sequences from the TUM dataset. The results show that our proposed method provides a more accurate initial pose estimate, and the accuracy of the ORB-SLAM3 system is improved by 6.61% and 6.46% on the two test sequences, respectively.
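
A hedged sketch of one possible reading of this prediction step: extrapolate the next pose from the last three poses, treating the frame-to-frame rotation increment (handled as a rotation vector) and the translation increment as changing at a constant rate. The composition order and the use of SciPy's Rotation class are assumptions for illustration, not the paper's implementation:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def predict_pose(R0, t0, R1, t1, R2, t2):
    """Constant-acceleration extrapolation from poses (R0,t0) -> (R1,t1) -> (R2,t2)."""
    d1 = R2 * R1.inv()            # most recent rotation increment
    d0 = R1 * R0.inv()            # previous rotation increment
    dd = d1 * d0.inv()            # change of increment ("rotational acceleration")
    R_pred = dd * d1 * R2         # apply the accelerated increment to the latest pose

    v1, v0 = t2 - t1, t1 - t0     # translational velocities
    t_pred = t2 + v1 + (v1 - v0)  # constant translational acceleration
    return R_pred, t_pred

rots = [R.from_rotvec([0.0, 0.0, a]) for a in (0.00, 0.02, 0.05)]    # placeholder poses
ts = [np.array([0.00, 0, 0]), np.array([0.10, 0, 0]), np.array([0.25, 0, 0])]
R_next, t_next = predict_pose(rots[0], ts[0], rots[1], ts[1], rots[2], ts[2])
print(R_next.as_rotvec(), t_next)   # ~[0, 0, 0.09] and [0.45, 0, 0]
```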

Keywords: error estimation, constant acceleration motion model, pose estimation, visual SLAM

Procedia PDF Downloads 79
1685 Reservoir Properties Effect on Estimating Initial Gas in Place Using Flowing Material Balance Method

Authors: Yousef S. Kh. S. Hashem

Abstract:

Accurate estimation of the initial gas in place (IGIP) is an important factor in the decision to develop a gas field. One of the methods available in the industry to estimate the IGIP is material balance. This method requires the well to be shut in while pressure is measured as it builds up to the average reservoir pressure. Since gas demand is high and shut-in well surveys are very expensive, flowing gas material balance (FGMB) is sometimes used instead of conventional material balance. This work investigated the effect of reservoir properties (pressure, permeability, and reservoir size) on the estimation of IGIP when using FGMB. A gas reservoir simulator that accounts for friction loss, wellbore storage, and the non-Darcy effect was used to simulate 165 possible cases (3 pressures, 5 reservoir sizes, and 11 permeabilities). Both tubing pressure and bottom-hole pressure were analyzed using FGMB. The results showed that the FGMB method is very sensitive for tight reservoirs (k < 10). They also showed which method is best to use for different reservoir properties, so this study can serve as a guideline for the application of the FGMB method.
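
For background, a minimal sketch of the material-balance straight line that both the shut-in and flowing variants rely on, p/z = (p_i/z_i)(1 − Gp/G); the pressure-survey values below are placeholders, not results from the paper:

```python
import numpy as np

# Placeholder (p, z, Gp) survey points for a volumetric dry-gas reservoir
p  = np.array([3500.0, 3200.0, 2900.0, 2600.0])   # average reservoir pressure [psia]
z  = np.array([0.870, 0.862, 0.856, 0.851])       # gas deviation factor
Gp = np.array([0.0, 30.0, 60.0, 90.0]) * 1e9      # cumulative gas produced [scf]

slope, intercept = np.polyfit(Gp, p / z, 1)       # fit the p/z vs Gp straight line
G = -intercept / slope                            # IGIP: Gp where p/z extrapolates to zero
print(f"estimated IGIP ~ {G / 1e9:.0f} Bscf")
```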

Keywords: flowing material balance, gas reservoir, reserves, gas simulator

Procedia PDF Downloads 142
1684 Nonparametric Copula Approximations

Authors: Serge Provost, Yishan Zang

Abstract:

Copulas are currently utilized in finance, reliability theory, machine learning, signal processing, geodesy, hydrology and biostatistics, among several other fields of scientific investigation. It follows from Sklar's theorem that the joint distribution function of a multidimensional random vector can be expressed in terms of its associated copula and marginals. Since marginal distributions can easily be determined by making use of a variety of techniques, we address the problem of securing the distribution of the copula. This will be done by using several approaches. For example, we will obtain bivariate least-squares approximations of the empirical copulas, modify the kernel density estimation technique and propose a criterion for selecting appropriate bandwidths, differentiate linearized empirical copulas, secure Bernstein polynomial approximations of suitable degrees, and apply a corollary to Sklar's result. Illustrative examples involving actual observations will be presented. The proposed methodologies will also be applied to a sample generated from a known copula distribution in order to validate their effectiveness.
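
Since every approach listed above starts from the empirical copula, here is a minimal sketch of its computation from pseudo-observations (ranks scaled to the unit interval); the data are simulated, not from the paper:

```python
import numpy as np
from scipy.stats import rankdata

def empirical_copula(x, y, u, v):
    """Empirical copula C_n(u, v) from paired observations (x_i, y_i)."""
    n = len(x)
    U = rankdata(x) / (n + 1)          # pseudo-observations of the first margin
    V = rankdata(y) / (n + 1)          # pseudo-observations of the second margin
    return float(np.mean((U <= u) & (V <= v)))

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.7 * x + rng.normal(size=500)     # positively dependent pair
print(empirical_copula(x, y, 0.5, 0.5))   # noticeably above 0.25, the independence value
```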

Keywords: copulas, Bernstein polynomial approximation, least-squares polynomial approximation, kernel density estimation, density approximation

Procedia PDF Downloads 56
1683 Effectiveness of Computer Video Games on the Levels of Anxiety of Children Scheduled for Tooth Extraction

Authors: Marji Umil, Miane Karyle Urolaza, Ian Winston Dale Uy, John Charle Magne Valdez, Karen Elizabeth Valdez, Ervin Charles Valencia, Cheryleen Tan-Chua

Abstract:

Objective: Distraction techniques can be successful in reducing children's anxiety during medical procedures. Dental procedures, in particular, are associated with dental anxiety, which has been identified as a significant and common problem in children; however, only limited studies have addressed this problem. This study therefore determined the effectiveness of computer video games on the anxiety levels of children aged 5-12 years scheduled for tooth extraction. Methods: A pre-test post-test quasi-experimental study was conducted involving 30 randomly assigned subjects, 15 in the experimental group and 15 in the control group. Subjects in the experimental group played computer video games for a maximum of 15 minutes, while no intervention was given to the control group. The modified Yale Pre-operative Anxiety Scale (m-YPAS), with a Cronbach’s alpha of 0.9, was used to assess anxiety at two points: upon arrival in the clinic (pre-test anxiety) and 15 minutes after the first measurement (post-test anxiety). Paired t-test and ANCOVA were used to analyse the gathered data. Results: There was a significant difference between the pre-test and post-test anxiety scores of the control group (p=0.0002), indicating increased anxiety, and a significant difference between the pre-test and post-test scores of the experimental group (p=0.0002), indicating decreased anxiety. Comparatively, the experimental group showed a lower anxiety score (p<0.0001) than the control group. Conclusion: The use of computer video games is effective in reducing pre-operative anxiety among children and can be an alternative non-pharmacological approach to pre-operative care.

Keywords: play therapy, preoperative anxiety, tooth extraction, video games

Procedia PDF Downloads 433
1682 Community Pharmacist's Perceptions, Attitude and Role in Oral Health Promotion and Diseases Prevention

Authors: Bushra Alghamdi, Alla Alsharif, Hamzah Aljohani, Saba Kassim

Abstract:

Introduction: Collaborative work has always been acknowledged as a fundamental concept in delivering oral health care. Aim: This study aimed to assess the perception and attitude of pharmacists towards oral health promotion and to determine pharmacists' confidence levels in delivering advice on oral health problems. Methods: An observational cross-sectional survey, using self-administered anonymous questionnaires, was conducted between March and April 2017. The study recruited a convenience sample of registered community pharmacists working in local private pharmaceutical stores in the urban area of Madinah, Kingdom of Saudi Arabia (KSA). A preliminary descriptive analysis was performed. Results: Thirty-five pharmacists completed the survey. All participants were male, with a mean age of 35.5 (±6.92) years. Eighty-six percent of the participants reported that pharmacists should have a role in oral health promotion. Eighty percent reported adequate confidence when giving advice on most common oral health problems, including oral-health-related risk behaviours such as tobacco cessation (46%), bleeding gums (63%) and sensitive teeth (60%). However, higher percentages of pharmacists reported low confidence when giving advice on specific dental problems such as lost dental fillings (57%), loose crowns (60%), trauma to teeth (40%), denture-related problems (51%) and oral cancer (6.9%). Conclusion: Community pharmacists recognized their potential role in promoting oral health in KSA and had varying levels of ability and confidence to offer support for oral health. The study highlights that inter-professional collaboration between pharmacists and dental healthcare providers should be enhanced.

Keywords: community, oral health, promotion, pharmacist

Procedia PDF Downloads 185
1681 Allocating Channels and Flow Estimation at Flood Prone Area in Desert, Example from AlKharj City, Saudi Arabia

Authors: Farhan Aljuaidi

Abstract:

The rapid expansion of AlKharj city, Saudi Arabia, towards the outlet of Wadi AlAin is critical for planners and decision makers. Two major projects, the Salman bin Abdulaziz University compound and a new industrial area, are now being developed in this flood-prone area, where no channels are clear and identified. The main contribution of this study is to divert the flow away from these vital projects by reconstructing new channels. To do so, Lidar data were used to generate contour lines for the actual elevation of the highways and local roads; these data were analyzed and compared with the contour lines derived from 1:50,000 topographical maps. The magnitude of the expected flow was estimated using Snyder's model, based on the morphometric data acquired from a DEM of the catchment area. The results indicate that the peak discharge reaches a maximum of 2694.3 m³/s, with a mean of 303.7 m³/s and a minimum of 74.3 m³/s. The runoff was estimated at a maximum of 252.2 × 10⁶ m³, a mean of 41.5 × 10⁶ m³ and a minimum of 12.4 × 10⁶ m³.

Keywords: Desert flood, Saudi Arabia, Snyder's Model, flow estimation

Procedia PDF Downloads 301
1680 Travel Time Estimation of Public Transport Networks Based on Commercial Incidence Areas in Quito Historic Center

Authors: M. Fernanda Salgado, Alfonso Tierra, David S. Sandoval, Wilbert G. Aguilar

Abstract:

Public transportation buses usually vary their speed depending on the location and the number of passengers, so they require efficient travel planning that helps them choose the fastest route. Initially, an estimation tool is needed to determine the travel time of each route and clearly establish the possibilities. In this work, we give a practical solution that makes use of a concept we define as areas of commercial incidence. These areas are based on the hypothesis that in commercial places there is a greater flow of people, and therefore buses spend more time at the stops. Each area contains one or more route segments, which carry an incidence factor that allows the travel times to be estimated. In addition, initial results are presented that verify the hypothesis and give adequate travel time estimates. In future work, we will use this approach to build an efficient travel planning system.

Keywords: commercial incidence, planning, public transport, speed travel, travel time

Procedia PDF Downloads 231
1679 A Study on the Relation among Primary Care Professionals Serving Disadvantaged Community, Socioeconomic Status, and Adverse Health Outcome

Authors: Chau-Kuang Chen, Juanita Buford, Colette Davis, Raisha Allen, John Hughes, James Tyus, Dexter Samuels

Abstract:

During the post-Civil War era, the city of Nashville, Tennessee, had the highest mortality rate in the country. The elevated death and disease among ex-slaves were attributable to the unavailability of healthcare. To address the paucity of healthcare services, the College, an institution with the mission of educating minority professionals and serving the underserved population, was established in 1876. This study was designed to assess whether the College has accomplished its mission of serving underserved communities and contributed to the elimination of health disparities in the United States. The study objective was to quantify the impact of socioeconomic status and adverse health outcomes on primary care professionals serving disadvantaged communities, which, in turn, was significantly associated with a health professional shortage score partly designated by the U.S. Department of Health and Human Services. Various statistical methods were used to analyze the alumni data for the years 1975-2013. K-means cluster analysis was utilized to classify individual medical and dental graduates into cluster groups of practice communities (Disadvantaged or Non-disadvantaged Communities). Discriminant analysis was implemented to verify the classification accuracy of the cluster analysis. The independent t-test was performed to detect significant mean differences in the clustering and criterion variables between Disadvantaged and Non-disadvantaged Communities, which confirms the “content” validity of the cluster analysis model. A chi-square test was used to assess whether the proportion of cluster groups (Disadvantaged vs. Non-disadvantaged Communities) was consistent with that of practicing specialties (primary care vs. non-primary care). Finally, a partial least squares (PLS) path model was constructed to explore the “construct” validity of the analytics model by providing the magnitude of the effects of socioeconomic status and adverse health outcomes on primary care professionals serving disadvantaged communities. Social ecological theory, together with the statistical models mentioned, was used to establish the relationship between medical and dental graduates (primary care professionals serving disadvantaged communities) and their social environments (socioeconomic status, adverse health outcome, health professional shortage score). Based on the social ecological framework, it was hypothesized that the impact of socioeconomic status and adverse health outcomes on primary care professionals serving disadvantaged communities could be quantified, and that the association between primary care professionals serving disadvantaged communities and the health professional shortage score could be measured. Adverse health outcome (adult obesity rate, age-adjusted premature mortality rate, and percent of people diagnosed with diabetes) could be affected by the latent variable socioeconomic status (unemployment rate, poverty rate, percent of children in free lunch programs, and percent of uninsured adults). The study results indicated that approximately 83% (3,192/3,864) of the College’s medical and dental graduates from 1975 to 2013 were practicing in disadvantaged communities. In addition, the PLS path modeling demonstrated that primary care professionals serving disadvantaged communities were significantly associated with socioeconomic status and adverse health outcome (p < .001). In summary, the majority of medical and dental graduates from the College provide primary care services to disadvantaged communities with low socioeconomic status and high adverse health outcomes, which demonstrates that the College has fulfilled its mission.

Keywords: disadvantaged community, K-means cluster analysis, PLS path modeling, primary care

Procedia PDF Downloads 539
1678 Hybrid Robust Estimation via Median Filter and Wavelet Thresholding with Automatic Boundary Correction

Authors: Alsaidi M. Altaher, Mohd Tahir Ismail

Abstract:

Wavelet thresholding has been a powerful tool in curve estimation and data analysis, but in the presence of outliers this nonparametric estimator cannot suppress them. This study proposes a new two-stage combined method based on the use of the median filter as a primary step before applying wavelet thresholding. After the outliers in a signal have been suppressed by the median filter, classical wavelet thresholding is applied to remove the remaining noise. We use automatic boundary corrections, employing a low-order polynomial model or a local polynomial model as a more realistic rule for correcting the bias in the boundary region, instead of the classical assumptions such as periodic or symmetric extension. A simulation experiment was conducted to evaluate the numerical performance of the proposed method. The results show strong evidence that the proposed method is extremely effective in correcting the boundary bias and eliminating sensitivity to outliers.
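
A minimal sketch of the two-stage idea (median filter, then universal soft wavelet thresholding), leaving out the automatic boundary correction that is the paper's main contribution; the wavelet, decomposition level and threshold rule are conventional choices assumed here, not taken from the paper:

```python
import numpy as np
import pywt
from scipy.signal import medfilt

def hybrid_denoise(signal, kernel=5, wavelet="db4", level=4):
    # Stage 1: median filter suppresses isolated outliers
    cleaned = medfilt(signal, kernel_size=kernel)
    # Stage 2: universal soft thresholding of the detail coefficients
    coeffs = pywt.wavedec(cleaned, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # robust noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(cleaned)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

t = np.linspace(0, 1, 1024)
noisy = np.sin(6 * np.pi * t) + 0.2 * np.random.default_rng(0).normal(size=t.size)
noisy[::100] += 5.0                                           # inject outliers
print(np.abs(hybrid_denoise(noisy) - np.sin(6 * np.pi * t)).mean())
```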

Keywords: boundary correction, median filter, simulation, wavelet thresholding

Procedia PDF Downloads 419
1677 A New Method to Estimate the Low Income Proportion: Monte Carlo Simulations

Authors: Encarnación Álvarez, Rosa M. García-Fernández, Juan F. Muñoz

Abstract:

Estimation of a proportion has many applications in economics and social studies. A common application is the estimation of the low income proportion, which gives the proportion of people in a population classified as poor. In this paper, we present this poverty indicator and propose using the logistic regression estimator for the problem of estimating the low income proportion. Various sampling designs are presented. Using a real data set obtained from the European Survey on Income and Living Conditions, Monte Carlo simulation studies are carried out to analyse the empirical performance of the logistic regression estimator under the various sampling designs considered in this paper. The results derived from the Monte Carlo simulation studies indicate that the logistic regression estimator can be more accurate than the customary estimator under these sampling designs, and that the stratified sampling design can also provide more accurate results.
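
One common model-assisted reading of a logistic regression estimator of a proportion is sketched below: fit a logistic model of the poverty indicator on an auxiliary variable observed in the sample, then average the predicted probabilities over the whole population. This construction and the simulated data are assumptions for illustration, not the paper's exact estimator or data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N, n = 20000, 400
x = rng.lognormal(mean=10.0, sigma=0.5, size=N)      # auxiliary variable, known for every unit
y = x * rng.lognormal(mean=0.0, sigma=0.2, size=N)   # income, observed only in the sample
poor = (y < 0.6 * np.median(y)).astype(int)          # low income indicator (0.6 x median rule)

s = rng.choice(N, size=n, replace=False)             # simple random sample without replacement
model = LogisticRegression().fit(np.log(x[s]).reshape(-1, 1), poor[s])
p_logistic = model.predict_proba(np.log(x).reshape(-1, 1))[:, 1].mean()

print("customary estimator:", poor[s].mean())
print("logistic estimator :", round(p_logistic, 4))
print("true proportion    :", poor.mean())
```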

Keywords: poverty line, risk of poverty, auxiliary variable, ratio method

Procedia PDF Downloads 444
1676 Synthesis and Characterization of Silver/Graphene Oxide Co-Decorated TiO2 Nanotubular Arrays for Biomedical Applications

Authors: Alireza Rafieerad, Bushroa Abd Razak, Bahman Nasiri Tabrizi, Jamunarani Vadivelu

Abstract:

Recently, reports on the fabrication of nanotubular arrays have generated considerable scientific interest, owing to the broad range of applications of oxide nanotubes in solar cells, orthopaedic and dental implants, photocatalytic devices and lithium-ion batteries. An attractive approach for the fabrication of oxide nanotubes with controllable morphology is the electrochemical anodization of the substrate in a fluoride-containing electrolyte. Consequently, titanium dioxide nanotubes (TiO2 NTs) have received considerable attention as an applicable material, particularly in the field of artificial implants. In addition, concerns about the long-term efficacy of currently used dental implants, and about their failure and infection after surgery, call for enhancing the cytocompatibility of Ti-based implant surfaces in bone-like tissue. Graphene oxide (GO), with its favourable biocompatibility at tissue sites, support of osseointegration and drug-delivery functionalization, is likewise well understood. Moreover, the well-known antibacterial ability of silver (Ag) is remarkably useful for providing implantable devices without symptoms of infection. Here, the surface modification of Ti–6Al–7Nb implants (Ti67IMP) through the development of Ag/GO co-decorated TiO2 NTs was examined. Initially, the anodic TiO2 nanotubes obtained at a constant potential of 60 V were annealed at 600 °C for 2 h to improve the adhesion of the coating. Afterwards, the Ag/GO co-decorated TiO2 NTs were developed by spin coating on the Ti67IMP substrate. The microstructural features, phase composition and wettability behaviour of the nanostructured coating were characterized comparatively. Overall, the results of the present study may contribute to the development of nanostructured Ti67IMP with improved surface properties.

Keywords: anodic tio2 nanotube, biomedical applications, graphene oxide, silver, spin coating

Procedia PDF Downloads 317