Search results for: Bayesian estimation
1631 Volume Estimation of Trees: An Exploratory Study on Rosewood Logging Within Forest Transition and Savannah Ecological Zones of Ghana
Authors: Albert Kwabena Osei Konadu
Abstract:
One of the endemic forest species of the savannah transition zones listed in Appendix II of the Convention on International Trade in Endangered Species (CITES) is the rosewood, also known as Pterocarpus erinaceus or Krayie. Its economic viability has made it increasingly popular and in high demand. Ghana’s forest resource management regime for these ecozones focuses mainly on conservation and very little on resource utilization. Consequently, commercial logging management standards are at a teething stage and not fully developed, leading to a deficiency in the monitoring of logging operations and the quantification of harvested tree volumes. The tree information form (TIF), a volume estimation and tracking regime, has proven to be an effective sustainable management tool for regulating timber resource extraction in the high forest zones of the country. This work aims to generate a TIF that can track and capture the requisite parameters to accurately estimate the volume of harvested rosewood within forest-savannah transition zones. Tree information forms were created for three scenarios: individual billets, stacked billets and conveying vessels. The study was limited by the use of regulator-assigned volumes as the benchmark and was also fraught with potential volume measurement error in the stacked-billet scenario due to spaces within packed billets. These TIFs were field-tested to deduce the most viable option for tracking and estimating harvested volumes of rosewood using the Smalian and cubic volume estimation formulae. Overall, four districts were covered, with the individual billet, stacked billet and conveying vessel scenarios registering mean volumes of 25.83 m³, 45.08 m³ and 32.6 m³, respectively. These volumes were validated by benchmarking against the assigned volumes of the Forestry Commission of Ghana and the known standard volumes of conveying vessels. The results indicated an underestimation of extracted volumes under the quota regime, a situation that could lead to unintended overexploitation of the species. The research revealed that the conveying vessel route is the most viable volume estimation and tracking regime for the sustainable management of Pterocarpus erinaceus, as it provided a more practical volume estimate and data extraction protocol.
Keywords: cubic volume formula, Smalian volume formula, Pterocarpus erinaceus, tree information form, forest transition and savannah zones, harvested tree volume
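The abstract does not reproduce the volume formulae themselves; below is a minimal sketch of how Smalian's formula is typically applied to individual billets, assuming end diameters and billet length are recorded on the TIF (the field layout and sample values are illustrative, not taken from the study).

```python
import math

def smalian_volume(d1_cm, d2_cm, length_m):
    """Smalian's formula: billet volume as the mean of the two end
    cross-sectional areas times the billet length (result in m^3)."""
    a1 = math.pi * (d1_cm / 100.0) ** 2 / 4.0  # area of large end, m^2
    a2 = math.pi * (d2_cm / 100.0) ** 2 / 4.0  # area of small end, m^2
    return length_m * (a1 + a2) / 2.0

# Illustrative billet records: (large-end diameter, small-end diameter, length)
billets = [(32.0, 27.5, 2.4), (29.0, 24.0, 2.2), (35.5, 30.0, 2.6)]
total = sum(smalian_volume(*b) for b in billets)
print(f"Total consignment volume: {total:.3f} m^3")
```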
Procedia PDF Downloads 441
1630 Applications of Analytical Probabilistic Approach in Urban Stormwater Modeling in New Zealand
Authors: Asaad Y. Shamseldin
Abstract:
The analytical probabilistic approach is an innovative approach to urban stormwater modeling. It can provide information about the long-term performance of a stormwater management facility without being computationally very demanding. This paper explores the application of the analytical probabilistic approach in New Zealand. It presents the results of a case study aimed at developing an objective way of identifying what constitutes a rainfall storm event and estimating the corresponding statistical properties of storms, using two selected automatic rainfall stations located in the Auckland region of New Zealand. Storm identification and the estimation of storm statistical properties are regarded as the first step in the development of analytical probabilistic models. The paper provides a recommendation about the definition of the storm inter-event time to be used in conjunction with the analytical probabilistic approach.
Keywords: hydrology, rainfall storm, storm inter-event time, New Zealand, stormwater management
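As a rough illustration of the storm-identification step, the sketch below partitions a rainfall record into independent storm events using a minimum inter-event time; the threshold value and record format are assumptions, not the paper's recommendation.

```python
import numpy as np

def identify_storms(rain_mm, dt_hours=1.0, min_interevent_hours=6.0):
    """Split an hourly rainfall series into storm events: a dry spell at
    least `min_interevent_hours` long ends the current storm."""
    gap_steps = int(round(min_interevent_hours / dt_hours))
    storms, current, dry_run = [], [], 0
    for r in rain_mm:
        if r > 0:
            current.append(r)
            dry_run = 0
        elif current:
            dry_run += 1
            current.append(0.0)
            if dry_run >= gap_steps:
                storms.append(current[:-dry_run])  # trim trailing dry steps
                current, dry_run = [], 0
    if current:
        storms.append(current[:len(current) - dry_run])
    depths = np.array([sum(s) for s in storms])
    durations = np.array([len(s) * dt_hours for s in storms])
    return depths, durations

rain = np.array([0, 2, 5, 0, 0, 1, 0, 0, 0, 0, 0, 0, 3, 4, 0, 0])
depths, durations = identify_storms(rain)
print("storm depths (mm):", depths, "durations (h):", durations)
```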
Procedia PDF Downloads 344
1629 On the Optimality of Blocked Main Effects Plans
Authors: Rita SahaRay, Ganesh Dutta
Abstract:
In this article, experimental situations are considered where a main effects plan is to be used to study m two-level factors using n runs which are partitioned into b blocks, not necessarily of the same size. Assuming the block sizes to be even for all blocks, for the case n ≡ 2 (mod 4), optimal designs are obtained with respect to type 1 and type 2 optimality criteria in the class of designs providing estimation of all main effects orthogonal to the block effects. In practice, such orthogonal estimation of main effects is often a desirable condition. In the wider class of all available m two-level blocked main effects plans with even block sizes, where the factors do not occur at high and low levels equally often in each block, E-optimal designs are also characterized. Simple construction methods based on Hadamard matrices and the Kronecker product for these optimal designs are presented.
Keywords: design matrix, Hadamard matrix, Kronecker product, type 1 criteria, type 2 criteria
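To illustrate the kind of construction alluded to (not the authors' specific designs), the sketch below builds a two-level design matrix from a Hadamard matrix and a Kronecker product and checks that the factor columns are orthogonal to a simple block contrast.

```python
import numpy as np
from scipy.linalg import hadamard

# Illustrative construction: take an 8x8 Hadamard matrix, use its
# non-constant columns as +/-1 factor settings, and replicate the runs
# over two equal blocks via a Kronecker product.
H = hadamard(8)                              # rows = runs, columns = contrasts
factors = H[:, 1:]                           # drop the all-ones column -> 7 factors
design = np.kron(np.ones((2, 1)), factors)   # 16 runs in 2 blocks of 8
blocks = np.repeat([1, -1], 8)               # +/-1 block contrast

# Orthogonality to blocks: each factor column sums to zero against the
# block contrast, so main effects are estimable free of block effects.
print(blocks @ design)                       # all zeros for this construction
```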
Procedia PDF Downloads 366
1628 Learning the Dynamics of Articulated Tracked Vehicles
Authors: Mario Gianni, Manuel A. Ruiz Garcia, Fiora Pirri
Abstract:
In this work, we present a Bayesian non-parametric approach to model the motion control of articulated tracked vehicles (ATVs). The motion control model is based on a Dirichlet Process-Gaussian Process (DP-GP) mixture model. The DP-GP mixture model provides a flexible representation of patterns of control manoeuvres along trajectories of different lengths and discretizations. The model also estimates the number of patterns sufficient for modeling the dynamics of the ATV.
Keywords: Dirichlet processes, Gaussian mixture models, learning motion patterns, tracked robots for urban search and rescue
Procedia PDF Downloads 449
1627 Estimation of Stress-Strength Parameter for Burr Type XII Distribution Based on Progressive Type-II Censoring
Authors: A. M. Abd-Elfattah, M. H. Abu-Moussa
Abstract:
In this paper, the estimation of the stress-strength parameter R = P(Y < X) is considered, where X and Y, the strength and stress respectively, are two independent random variables following the Burr Type XII distribution. The samples taken for X and Y are progressively Type-II censored. The maximum likelihood estimator (MLE) of R is obtained when the common parameter is unknown. When the common parameter is known, the MLE, the uniformly minimum variance unbiased estimator (UMVUE) and the Bayes estimator of R = P(Y < X) are obtained. The exact confidence interval of R based on the MLE is derived. The performance of the proposed estimators is compared using computer simulation.
Keywords: Burr Type XII distribution, progressive Type-II censoring, stress-strength model, unbiased estimator, maximum likelihood estimator, uniformly minimum variance unbiased estimator, confidence intervals, Bayes estimator
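For orientation, under the common Burr Type XII parameterization F(x) = 1 - (1 + x^c)^(-k) with a shared inner shape c, the stress-strength parameter has the closed form R = k_Y / (k_X + k_Y). The sketch below checks this against a plain Monte Carlo estimate using complete, uncensored samples (a simplification of the paper's progressively censored setting); the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def rburr12(n, c, k):
    """Sample Burr XII via inverse CDF: F(x) = 1 - (1 + x**c)**(-k)."""
    u = rng.uniform(size=n)
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

c, k_x, k_y, n = 2.0, 3.0, 1.5, 200_000
x = rburr12(n, c, k_x)   # strength
y = rburr12(n, c, k_y)   # stress

r_closed = k_y / (k_x + k_y)   # closed form under a shared inner shape c
r_mc = np.mean(y < x)          # Monte Carlo check
print(f"closed form R = {r_closed:.4f},  Monte Carlo R = {r_mc:.4f}")
```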
Procedia PDF Downloads 456
1626 Estimation of the External Force for a Co-Manipulation Task Using the Drive Chain Robot
Authors: Sylvain Devie, Pierre-Philippe Robet, Yannick Aoustin, Maxime Gautier
Abstract:
The aim of this paper is to show that the observation of the external effort and the sensor-less control of a system are limited by the mechanical system. First, the model of a one-joint robot with a prismatic joint is presented. Based on this model, two different procedures were performed in order to identify the mechanical parameters of the system and observe the external effort applied to it. Experiments have proven that the accuracy of the force observer, based on the DC motor current, is limited by the mechanics of the robot. Sensor-less control is limited by the accuracy of the estimated mechanical parameters and by the maximum static friction force, which is the minimum force that can be observed in this case. The consequence of this limitation is that industrial robots without specific design are not well adapted to performing sensor-less precision tasks. Finally, an efficient control law is presented for high-effort applications.
Keywords: control, identification, robot, co-manipulation, sensor-less
Procedia PDF Downloads 161
1625 Results of EPR Dosimetry Study of Population Residing in the Vicinity of the Uranium Mines and Uranium Processing Plant
Authors: K. Zhumadilov, P. Kazymbet, A. Ivannikov, M. Bakhtin, A. Akylbekov, K. Kadyrzhanov, A. Morzabayev, M. Hoshi
Abstract:
The aim of the study is to evaluate the possible excess dose received by uranium processing plant workers. The possible excess dose of workers was evaluated by comparison with a population pool (Stepnogorsk) and a control pool (Astana city). The measured teeth samples were extracted according to medical indications. In total, twenty-seven tooth enamel samples were analyzed from residents of Stepnogorsk city (180 km from Astana city, Kazakhstan). About six tooth samples were collected from workers of the uranium processing plant. The results of the tooth enamel dose estimation show a small influence of working conditions on the workers; the maximum excess dose is less than 100 mGy. This is a pilot study of EPR dose estimation, and additional samples are required for a final conclusion.
Keywords: EPR dose, workers, uranium mines, tooth samples
Procedia PDF Downloads 411
1624 Yield Loss Estimation Using Multiple Drought Severity Indices
Authors: Sara Tokhi Arab, Rozo Noguchi, Tofeal Ahamed
Abstract:
Drought is a natural disaster that occurs in a region due to a lack of precipitation and high temperatures over a continuous period, or in a single season, as a consequence of climate change. Precipitation deficits and prolonged high temperatures mostly affect the agricultural sector, water resources, socioeconomics, and the environment. Consequently, drought causes agricultural product loss, food shortages, famine, migration, and natural resource degradation in a region. Agriculture is the first sector affected by drought. Therefore, it is important to develop an agricultural drought risk and loss assessment to mitigate the drought impact on the agriculture sector. In this context, the main purpose of this study was to assess yield loss using a composite drought index (CDI) in drought-affected vineyards. The CDI was developed for the years 2016 to 2020 by combining five indices: the vegetation condition index (VCI), the temperature condition index (TCI), the deviation of NDVI from the long-term mean (NDVI DEV), the normalized difference moisture index (NDMI) and the precipitation condition index (PCI). Moreover, a quantitative principal component analysis (PCA) approach was used to assign a weight to each input parameter, and the weighted indices were then combined into one composite drought index. Finally, Bayesian regularized artificial neural networks (BRANNs) were used to evaluate the yield variation in each affected vineyard. The composite drought index indicated that moderate to severe droughts occurred across Kabul Province during 2016 and 2018. The results also showed that no vineyard was in extreme drought condition; therefore, only the severe and moderate conditions were considered. According to the BRANN results, R = 0.87 and R = 0.94 in severe drought conditions for 2016 and 2018, and R = 0.85 and R = 0.91 in moderate drought conditions for 2016 and 2018, respectively. Within the two drought years, there was a significant yield deficit in the vineyards of Kabul Province. According to the findings, 2018 had the highest rate of loss, almost -7 ton/ha, whereas in 2016 the loss rate was about -1.2 ton/ha. This research will support stakeholders in identifying drought-affected vineyards and help farmers during severe droughts.
Keywords: grapes, composite drought index, yield loss, satellite remote sensing
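A rough sketch of the PCA-based weighting step is given below: the index layers are standardized, the loadings of the first principal component are rescaled to weights, and the composite index is their weighted sum. The five column names match the abstract, but the data, the rescaling choice and the use of a single component are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-ins for the five per-pixel drought indices (rows = pixels).
names = ["VCI", "TCI", "NDVI_DEV", "NDMI", "PCI"]
X = rng.normal(size=(500, 5))

# Standardize each index, then take the loadings of the first principal
# component as the basis for the weights.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
pc1 = eigvecs[:, -1]                          # loadings of the largest eigenvalue
weights = np.abs(pc1) / np.abs(pc1).sum()     # rescaled to sum to one (assumed)

cdi = Z @ weights                             # composite drought index per pixel
for n, w in zip(names, weights):
    print(f"{n}: weight = {w:.3f}")
print("CDI range:", cdi.min(), "to", cdi.max())
```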
Procedia PDF Downloads 157
1623 A Method to Determine Cutting Force Coefficients in Turning Using Mechanistic Approach
Authors: T. C. Bera, A. Bansal, D. Nema
Abstract:
During turning operations, the cutting force plays a significant role in the metal cutting process, affecting tool-workpiece deflection, vibration and, eventually, part quality. The present research work aims to develop a mechanistic cutting force model and to study the mechanistic constants used in the force model in the case of turning operations. The proposed model can be used for reliable and accurate estimation of the cutting forces, establishing the relationship of the various force components (cutting force and feed force) with the uncut chip thickness. Accurate estimation of the cutting force is required to improve thin-walled part accuracy by controlling the surface errors induced by tool-workpiece deflection and tool-workpiece vibration.
Keywords: turning, cutting forces, cutting constants, uncut chip thickness
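A common mechanistic form of such a model (assumed here, not necessarily the authors' exact formulation) expresses each force component as a linear function of the uncut chip thickness, with cutting and edge coefficients identified by regression on measured forces. The sketch below fits the cutting-force coefficients from made-up measurements.

```python
import numpy as np

# Assumed mechanistic form:
#   Fc = Ktc * b * h + Kte * b    (cutting force)
# with b = depth of cut and h = uncut chip thickness (feed per revolution).
b = 2.0                                            # depth of cut, mm (illustrative)
h = np.array([0.05, 0.10, 0.15, 0.20])             # uncut chip thickness, mm
Fc_meas = np.array([210.0, 395.0, 588.0, 770.0])   # measured forces, N (made up)

# Identify Ktc and Kte by linear least squares.
A = np.column_stack([b * h, b * np.ones_like(h)])
(Ktc, Kte), *_ = np.linalg.lstsq(A, Fc_meas, rcond=None)
print(f"Ktc = {Ktc:.1f} N/mm^2, Kte = {Kte:.1f} N/mm")
print("predicted Fc at h = 0.12 mm:", Ktc * b * 0.12 + Kte * b, "N")
```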
Procedia PDF Downloads 522
1622 Use of Dendrochronology in Estimation of Creep Velocity and Its Dependence on the Bulk Density of Soils
Authors: Mohammad Amjad Sabir, Ishtiaq Khan, Shahid Ali, Umar Shabbir, Aneel Ahmad
Abstract:
Creep, the main silt contributor to rivers, is a slow, downhill flow of soil. Creep velocity is measured in millimetres to a couple of centimetres per year and is conventionally determined from the tilt that creep causes in vertical objects, which requires at least ten years of observation to yield a reliable creep velocity. This project was devised to calculate creep velocity using dendrochronology and to look for differences in the creep velocity registered by different trees on the same slope. It was concluded that dendrochronology provides a very reliable procedure for creep velocity estimation if ‘J’-shaped trees are studied for their horizontal movement and age. The age of these trees was measured by tree coring, and the horizontal movement was measured with a conventional tape. This procedure does not require decades of observation; additionally, the data reveal the creep velocity for up to 150 years and even more, instead of just a decade. It was also concluded that creep velocity does not depend only on the bulk density of the soil; indeed, no pronounced effect of bulk density was detected.
Keywords: creep velocity, Galiyat, Pakistan, dendrochronology, Nagri Bala
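The estimation itself reduces to a simple ratio: the horizontal displacement of the trunk base divided by the ring-count age of the tree. The sketch below computes per-tree and slope-mean creep velocities; the tree records are invented for illustration.

```python
# Creep velocity per tree: horizontal displacement divided by tree age.
# The records below are illustrative, not field data from the study.
trees = [
    {"id": "T1", "displacement_cm": 38.0, "age_years": 62},
    {"id": "T2", "displacement_cm": 51.5, "age_years": 80},
    {"id": "T3", "displacement_cm": 22.0, "age_years": 45},
]

velocities = []
for t in trees:
    v_mm_per_year = 10.0 * t["displacement_cm"] / t["age_years"]
    velocities.append(v_mm_per_year)
    print(f"{t['id']}: creep velocity = {v_mm_per_year:.2f} mm/year")

print(f"slope mean = {sum(velocities) / len(velocities):.2f} mm/year")
```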
Procedia PDF Downloads 315
1621 A Generalized Family of Estimators for Estimation of Unknown Population Variance in Simple Random Sampling
Authors: Saba Riaz, Syed A. Hussain
Abstract:
This paper addresses the estimation of the unknown population variance of the variable of interest. A new generalized class of estimators of the finite population variance is suggested using auxiliary information. To improve the precision of the proposed class, the known population variance of the auxiliary variable is used. Mathematical expressions for the biases and the asymptotic variances of the suggested class are derived under a large-sample approximation. Theoretical and numerical comparisons are made to investigate the performance of the proposed class of estimators. The empirical study reveals that the suggested class of estimators performs better than the usual estimator, the classical ratio estimator, the classical product estimator and the classical linear regression estimator. It has also been found that the suggested class of estimators is more efficient than some recently published estimators.
Keywords: study variable, auxiliary variable, finite population variance, bias, asymptotic variance, percent relative efficiency
Procedia PDF Downloads 225
1620 Parameter Estimation with Uncertainty and Sensitivity Analysis for the SARS Outbreak in Hong Kong
Authors: Afia Naheed, Manmohan Singh, David Lucy
Abstract:
This work is based on a mathematical and statistical study of an SEIJTR deterministic model for the interpretation of the transmission of severe acute respiratory syndrome (SARS). Based on the SARS epidemic of 2003, the parameters are estimated using Runge-Kutta (Dormand-Prince pairs) and least squares methods. Graphical and numerical techniques are used to validate the estimates. The effect of the model parameters on the dynamics of the disease is then examined using sensitivity and uncertainty analysis. Sensitivity and uncertainty analytical techniques are used to analyze the effect of the uncertainty in the obtained parameter estimates and to determine which parameters have the largest impact on controlling the disease dynamics.
Keywords: infectious disease, severe acute respiratory syndrome (SARS), parameter estimation, sensitivity analysis, uncertainty analysis, Runge-Kutta methods, Levenberg-Marquardt method
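The estimation loop pairs an adaptive Runge-Kutta integrator with a least-squares objective on observed case counts. The sketch below shows this pattern on a reduced SIR model, used here only as a stand-in for the SEIJTR model, with synthetic data; scipy's RK45 integrator is the Dormand-Prince (4)5 pair.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def sir(t, y, beta, gamma):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

t_obs = np.arange(0, 60, 2.0)
y0 = [0.999, 0.001, 0.0]

def simulate(params):
    beta, gamma = params
    sol = solve_ivp(sir, (0, 60), y0, t_eval=t_obs, args=(beta, gamma),
                    method="RK45")            # Dormand-Prince (4)5 pair
    return sol.y[1]                            # infectious fraction over time

# Synthetic "observations" generated from known parameters plus noise.
rng = np.random.default_rng(3)
i_obs = simulate([0.45, 0.12]) + rng.normal(0, 0.002, size=t_obs.size)

fit = least_squares(lambda p: simulate(p) - i_obs, x0=[0.3, 0.2],
                    bounds=([0.0, 0.0], [2.0, 1.0]))
print("estimated beta, gamma:", fit.x)
```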
Procedia PDF Downloads 361
1619 Evaluating Accuracy of Foetal Weight Estimation by Clinicians in Christian Medical College Hospital, India and Its Correlation to Actual Birth Weight: A Clinical Audit
Authors: Aarati Susan Mathew, Radhika Narendra Patel, Jiji Mathew
Abstract:
A retrospective study was conducted at Christian Medical College (CMC) Teaching Hospital, Vellore, India on 14th August 2014 to assess the accuracy of clinically estimated foetal weight upon labour admission. Estimating foetal weight is a crucial factor in assessing maternal and foetal complications during and after labour. The medical notes of ninety-eight postnatal women who fulfilled the inclusion criteria were studied to evaluate the correlation between the estimated foetal weight (EFW) recorded on admission and the actual birth weight (ABW) of the newborn after delivery. Data concerning maternal and foetal demographics were also noted. Accuracy was determined by the absolute percentage error and the proportion of estimates within 10% of ABW. Actual birth weights ranged from 950-4080 g. A strong positive correlation between EFW and ABW (r = 0.904) was noted. Term deliveries (≥40 weeks) in the normal weight range (2500-4000 g) had a 59.5% estimation accuracy (n = 74), compared to pre-term (<40 weeks) deliveries with an estimation accuracy of 0% (n = 2). Among the term deliveries, macrosomic babies (>4000 g) were underestimated by 25% (n = 3) and low birth weight (LBW) babies were overestimated by 12.7% (n = 9). Registrars who estimated foetal weight were accurate for babies within the normal weight range; however, there needs to be an improvement in predicting the weight of macrosomic and LBW foetuses. We have suggested the use of an amended version of Johnson’s formula for the Indian population and a need to re-audit once implemented.
Keywords: clinical palpation, estimated foetal weight, pregnancy, India, Johnson’s formula
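For reference, one commonly quoted form of Johnson’s formula (assumed here; the audited clinicians' exact practice and the suggested Indian amendment are not given in the abstract) estimates foetal weight from the symphysio-fundal height and the station of the presenting part, as sketched below.

```python
def johnson_efw_grams(sfh_cm, station_below_spines):
    """One commonly quoted form of Johnson's formula (an assumption here):
    EFW (g) = (symphysio-fundal height in cm - n) * 155,
    with n = 11 if the presenting part is below the ischial spines
    (engaged) and n = 12 otherwise."""
    n = 11 if station_below_spines else 12
    return (sfh_cm - n) * 155

# Illustrative use: 34 cm fundal height, head not yet engaged.
print(johnson_efw_grams(34, station_below_spines=False), "g")  # 3410 g
```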
Procedia PDF Downloads 363
1618 Copula Markov Switching Multifractal Models for Forecasting Value-at-Risk
Authors: Giriraj Achari, Malay Bhattacharyya
Abstract:
In this paper, the effectiveness of Copula Markov Switching Multifractal (MSM) models in forecasting the Value-at-Risk of a two-stock portfolio is studied. The innovations are allowed to be drawn from distributions that can capture skewness and leptokurtosis, which are well-documented empirical characteristics of financial returns. The candidate distributions considered for this purpose are the Johnson-SU, Pearson Type-IV and α-stable distributions. The two univariate marginal distributions are combined using the Student-t copula. The estimation of all parameters is performed by maximum likelihood estimation. Finally, the models are compared in terms of accurate Value-at-Risk (VaR) forecasts using tests of unconditional coverage and independence. It is found that Copula-MSM models with leptokurtic innovation distributions perform slightly better than the Copula-MSM model with normal innovations. Copula-MSM models, in general, produce better VaR forecasts than traditional methods such as the Historical Simulation method, the Variance-Covariance approach and Copula-Generalized Autoregressive Conditional Heteroscedasticity (Copula-GARCH) models.
Keywords: copula, Markov switching, multifractal, value-at-risk
Procedia PDF Downloads 165
1617 The Estimation Method of Stress Distribution for Beam Structures Using the Terrestrial Laser Scanning
Authors: Sang Wook Park, Jun Su Park, Byung Kwan Oh, Yousok Kim, Hyo Seon Park
Abstract:
This study suggests a method for estimating the stress distribution of beam structures based on TLS (terrestrial laser scanning). The main components of the method are the creation of lattices of raw TLS data that satisfy suitable conditions and the application of CSSI (cubic smoothing spline interpolation) for estimating the stress distribution. Estimation of the stress distribution of a structural member or the whole structure is one of the important factors in the safety evaluation of a structure. Existing sensors, including the ESG (electric strain gauge) and LVDT (linear variable differential transformer), can be categorized as contact-type sensors, which must be installed on the structural members; they also have various limitations, such as the need for separate space where network cables are installed and the difficulty of access for sensor installation in real buildings. To overcome these problems inherent in contact-type sensors, the TLS system of LiDAR (light detection and ranging), which can measure the displacement of a target at long range without the influence of the surrounding environment and can also capture the whole shape of the structure, has been applied to the field of structural health monitoring. An important characteristic of TLS measurement is the formation of point clouds, which contain many points with local coordinates. A point cloud is not a linear distribution but a dispersed shape, so interpolation is essential for its analysis. Through the formation of averaged lattices and CSSI of the raw data, a method that can estimate the displacement of a simple beam was developed. The developed method can be extended to calculate the strain and, finally, applied to estimate the stress distribution of a structural member. To verify the validity of the method, a loading test on a simple beam was conducted and measured by TLS. The validity of the method is confirmed through a comparison of the estimated stress and the reference stress.
Keywords: structural health monitoring, terrestrial laser scanning, estimation of stress distribution, coordinate transformation, cubic smoothing spline interpolation
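A minimal sketch of the spline step is given below: a cubic smoothing spline is fitted to a (synthetic) scanned deflection profile, its second derivative approximates the curvature, and the bending stress follows from simple beam theory. The beam properties, noise level and smoothing factor are assumptions, not values from the study.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy deflection profile of a simply supported beam, standing in for the
# averaged TLS lattice (values are synthetic).
L, E, c = 4.0, 2.0e11, 0.1          # span (m), Young's modulus (Pa), fibre distance (m)
x = np.linspace(0.0, L, 80)
w_true = -0.004 * np.sin(np.pi * x / L)        # deflection, m
rng = np.random.default_rng(5)
w_meas = w_true + rng.normal(0, 1e-4, x.size)

# Cubic smoothing spline through the lattice points, curvature from its
# second derivative, then bending stress sigma = E * c * w''.
spline = UnivariateSpline(x, w_meas, k=3, s=len(x) * 1e-8)
curvature = spline.derivative(2)(x)
sigma = E * c * curvature
print("max estimated bending stress (MPa):", np.abs(sigma).max() / 1e6)
```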
Procedia PDF Downloads 433
1616 Age Estimation from Upper Anterior Teeth by Pulp/Tooth Ratio Using Peri-Apical X-Rays among Egyptians
Authors: Fatma Mohamed Magdy Badr El Dine, Amr Mohamed Abd Allah
Abstract:
Introduction: Age estimation of individuals is one of the crucial steps in forensic practice. Traditional methods rely on the length of the diaphysis of the long bones of the limbs, epiphyseal-diaphyseal union, fusion of the primary ossification centers, and dental eruption. However, there is a growing need for the development of precise and reliable methods to estimate age, especially in cases where dismembered corpses, burnt bodies, or putrefied or fragmented parts are recovered. Teeth are the hardest and most indestructible structures in the human body. In recent years, assessment of the pulp/tooth area ratio, as an indirect quantification of secondary dentine deposition, has received considerable attention. However, scanty work has been done in Egypt on the applicability of the pulp/tooth ratio for age estimation. Aim of the Work: The present work was designed to assess Cameriere’s method for age estimation from the pulp/tooth ratio of maxillary canines and central and lateral incisors in a sample from the Egyptian population, and to formulate regression equations to be used as population-based standards for age determination. Material and Methods: The present study was conducted on 270 peri-apical X-rays of maxillary canines and central and lateral incisors (collected from 131 males and 139 females aged between 19 and 52 years). The pulp and tooth areas were measured using the Adobe Photoshop software program, and the pulp/tooth area ratio was computed. Linear regression equations were determined separately for canines, central incisors and lateral incisors. Results: A significant correlation was recorded between the pulp/tooth area ratio and chronological age. The linear regression analysis revealed coefficients of determination of R² = 0.824 for canine, 0.588 for central incisor and 0.737 for lateral incisor teeth. Three regression equations were derived. Conclusion: In conclusion, the pulp/tooth ratio is a useful technique for estimating age among Egyptians. Additionally, the regression equation derived from canines gave better results than those from the incisors.
Keywords: age determination, canines, central incisors, Egypt, lateral incisors, pulp/tooth ratio
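The population-specific standard is an ordinary least-squares fit of age on the pulp/tooth area ratio. The sketch below derives and applies such an equation; the ratio and age values are synthetic, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic canine data: the pulp/tooth area ratio shrinks with age as
# secondary dentine is deposited (values are illustrative only).
age = rng.uniform(19, 52, size=120)
ratio = 0.14 - 0.0022 * age + rng.normal(0, 0.006, size=age.size)

# Fit age = a + b * ratio by ordinary least squares.
A = np.column_stack([np.ones_like(ratio), ratio])
(a, b), *_ = np.linalg.lstsq(A, age, rcond=None)
pred = a + b * ratio
r2 = 1 - np.sum((age - pred) ** 2) / np.sum((age - age.mean()) ** 2)
print(f"age = {a:.1f} + {b:.1f} * (pulp/tooth ratio),  R^2 = {r2:.3f}")

# Applying the derived equation to a new radiograph measurement:
print("estimated age for ratio 0.07:", a + b * 0.07)
```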
Procedia PDF Downloads 184
1615 Sentiment Classification of Documents
Authors: Swarnadip Ghosh
Abstract:
Sentiment analysis is the process of detecting the contextual polarity of text; in other words, it determines whether a piece of writing is positive, negative or neutral. Sentiment analysis of documents holds great importance in today's world, when a vast amount of information is stored in databases and on the world wide web. An efficient algorithm to elicit such information would be beneficial for social, economic and medical purposes. In this project, we have developed an algorithm to classify a document as positive or negative. Using our algorithm, we obtained a feature set from the data and classified the documents based on this feature set. It is important to note that, in the classification, we have not used the independence assumption that is made by many procedures, such as Naive Bayes; this makes the algorithm more general in scope. Moreover, because of the sparsity and high dimensionality of such data, we did not use the empirical distribution for estimation, but developed a method based on the degree of close clustering of the data points. We applied our algorithm to a movie review data set obtained from IMDb and obtained satisfactory results.
Keywords: sentiment, runs test, cross validation, higher dimensional pmf estimation
Procedia PDF Downloads 402
1614 Facial Pose Classification Using Hilbert Space Filling Curve and Multidimensional Scaling
Authors: Mekamı Hayet, Bounoua Nacer, Benabderrahmane Sidahmed, Taleb Ahmed
Abstract:
Pose estimation is an important task in computer vision. Though the majority of existing solutions provide good accuracy, they are often overly complex and computationally expensive. In this perspective, we propose the use of dimensionality reduction techniques to address the problem of facial pose estimation. Firstly, a face image is converted into a one-dimensional time series using a Hilbert space-filling curve; the approach then converts these time series data to a symbolic representation. Furthermore, a distance matrix is calculated between the symbolic series of an input learning dataset of images to generate classifiers of frontal vs. profile face pose. The proposed method is evaluated on three public datasets. Experimental results have shown that our approach is able to achieve a correct classification rate exceeding 97% with the K-NN algorithm.
Keywords: machine learning, pattern recognition, facial pose classification, time series
Procedia PDF Downloads 350
1613 Rainfall Estimation Using Himawari-8 Meteorological Satellite Imagery in Central Taiwan
Authors: Chiang Wei, Hui-Chung Yeh, Yen-Chang Chen
Abstract:
The objective of this study is to estimate rainfall using the new-generation Himawari-8 meteorological satellite, with its multi-band, high-bit-format and high-spatiotemporal-resolution imagery, together with ground rainfall data, at the Chen-Yu-Lan watershed of the Joushuei River Basin (443.6 square kilometers) in Central Taiwan. Accurate and fine-scale rainfall information is essential in rugged terrain with high local variation for early warning of flood, landslide and debris flow disasters. Ten-minute, 2 km pixel-based rainfall estimates for Typhoon Megi in 2016 and the meiyu event of June 1-4, 2017 were tested to demonstrate that the new-generation Himawari-8 meteorological satellite can capture rainfall variation in rugged mountainous areas at both fine scale and watershed scale. The results provide valuable rainfall information for early warning of future disasters.
Keywords: estimation, Himawari-8, rainfall, satellite imagery
Procedia PDF Downloads 194
1612 Adaptive Multipath Mitigation Acquisition Approach for Global Positioning System Software Receivers
Authors: Animut Meseret Simachew
Abstract:
The Parallel Code Phase Search Acquisition (PCSA) algorithm has been considered a promising method in GPS software receivers for detecting and estimating the accurate correlation peak between the received Global Positioning System (GPS) signal and locally generated replicas. GPS signal acquisition in highly dense multipath environments is the main research challenge. In this work, we propose a robust variable step-size (RVSS) PCSA algorithm based on a fast Fourier transform (FFT) filtering technique to mitigate short-time-delay multipath signals. Simulation results reveal the effectiveness of the proposed algorithm over the conventional PCSA algorithm. The proposed RVSS-PCSA algorithm equalizes the received carrier-wiped-off signal with the locally generated C/A code.
Keywords: adaptive PCSA, detection and estimation, GPS signal acquisition, GPS software receiver
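For orientation, the conventional PCSA baseline that the proposed RVSS variant improves upon searches all code phases at once with an FFT-based circular correlation for each Doppler bin, as sketched below with a synthetic code and signal (the sampling rate, code and Doppler grid are illustrative assumptions).

```python
import numpy as np

def pcsa(signal, ca_code, fs, doppler_bins):
    """Conventional parallel code phase search: for each Doppler bin, wipe
    off the carrier and correlate with the local C/A code over all code
    phases at once via circular (FFT-based) correlation."""
    n = len(signal)
    t = np.arange(n) / fs
    code_fft_conj = np.conj(np.fft.fft(ca_code))
    best = (0.0, None, None)
    for fd in doppler_bins:
        wiped = signal * np.exp(-2j * np.pi * fd * t)      # carrier wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft_conj))
        k = int(np.argmax(corr))
        if corr[k] > best[0]:
            best = (corr[k], fd, k)
    return best  # (peak value, Doppler estimate, code-phase estimate in samples)

# Toy check with a synthetic +/-1 code and a known delay/Doppler (illustrative).
rng = np.random.default_rng(11)
fs, n = 1.023e6, 1023
code = rng.choice([-1.0, 1.0], size=n)
true_delay, true_fd = 300, 2000.0
rx = np.roll(code, true_delay) * np.exp(2j * np.pi * true_fd * np.arange(n) / fs)
rx += 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))
print(pcsa(rx, code, fs, np.arange(-5000, 5001, 500)))
```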
Procedia PDF Downloads 117
1611 Vehicle Speed Estimation Using Image Processing
Authors: Prodipta Bhowmik, Poulami Saha, Preety Mehra, Yogesh Soni, Triloki Nath Jha
Abstract:
In India, the smart city concept is growing day by day, and for smart city development, a better traffic management and monitoring system is a very important requirement. Nowadays, road accidents are increasing as more vehicles are on the road, and reckless driving is responsible for a large number of these accidents, so an efficient traffic management system is required for all kinds of roads to control traffic speed. The speed limit varies from road to road. Previously, radar systems were used, but due to their high cost and limited precision, radar has not become favorable for traffic management. Traffic management systems face different types of problems every day, and how to solve them has become a research topic. This paper proposes a computer vision and machine learning-based automated system for multiple vehicle detection, tracking and speed estimation using image processing. Detecting vehicles and estimating their speed from real-time video is a difficult task; the objective of this paper is to detect vehicles and estimate their speed as accurately as possible. To this end, a real-time video is first captured, frames are extracted from the video, vehicles are detected in the frames, tracking of the vehicles starts, and finally the speed of the moving vehicles is estimated. The goal of this method is to develop a cost-friendly system that is able to detect multiple types of vehicles at the same time.
Keywords: OpenCV, Haar cascade classifier, dlib, YOLOv3, centroid tracker, vehicle detection, vehicle tracking, vehicle speed estimation, computer vision
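Once a vehicle's centroid is tracked across frames, the speed estimate is the pixel displacement scaled to metres and divided by the elapsed time. The sketch below shows this final step only, assuming centroid tracks and a metres-per-pixel calibration are already available; the detector/tracker stages and the calibration value are outside this snippet and are illustrative assumptions.

```python
import math

def estimate_speed_kmh(track, fps, metres_per_pixel):
    """Speed from a tracked centroid: pixel displacement between the first
    and last observation, scaled to metres, divided by elapsed time."""
    (x0, y0, f0), (x1, y1, f1) = track[0], track[-1]
    dist_m = math.hypot(x1 - x0, y1 - y0) * metres_per_pixel
    elapsed_s = (f1 - f0) / fps
    return 3.6 * dist_m / elapsed_s            # m/s -> km/h

# Illustrative centroid track: (x_px, y_px, frame_index) from a tracker.
track = [(120, 400, 10), (150, 360, 20), (185, 315, 30), (230, 260, 40)]
print(f"{estimate_speed_kmh(track, fps=30, metres_per_pixel=0.05):.1f} km/h")
```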
Procedia PDF Downloads 84
1610 Channel Estimation/Equalization with Adaptive Modulation and Coding over Multipath Faded Channels for WiMAX
Authors: B. Siva Kumar Reddy, B. Lakshmi
Abstract:
WiMAX has adopted Adaptive Modulation and Coding (AMC) in OFDM to support higher data rates and error-free transmission. AMC schemes employ Channel State Information (CSI) to utilize the channel efficiently, maximize the throughput and achieve better spectral efficiency. This CSI is given to the transmitter by the channel estimators. In this paper, LSE (least square error) and MMSE (minimum mean square error) estimators are suggested, and the BER (bit error rate) performance is analyzed. Channel equalization is also integrated with the AMC-OFDM system and presented with the Constant Modulus Algorithm (CMA) and Least Mean Square (LMS) algorithms, together with a convergence rate analysis. Simulation results show that increasing the modulation scheme size improves the throughput and also affects the BER value. There is a trade-off among modulation size, throughput, BER value and spectral efficiency. The results also highlight the need for channel estimation and equalization in high data rate systems.
Keywords: AMC, CSI, CMA, OFDM, OFDMA, WiMAX
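A minimal per-pilot sketch of the two estimators follows: the LS estimate simply divides the received pilot by the transmitted pilot, and a simplified MMSE estimate shrinks the LS estimate using the (assumed known) channel and noise powers. The uncorrelated-subcarrier simplification and all numerical values are assumptions, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(13)
n_sc = 64                                  # pilot subcarriers
# Unit-energy QPSK pilots.
x = (rng.choice([-1.0, 1.0], n_sc) + 1j * rng.choice([-1.0, 1.0], n_sc)) / np.sqrt(2)

# Rayleigh-fading channel taps (unit power) and complex Gaussian noise.
h = (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)) / np.sqrt(2)
sigma_n = 0.3
noise = sigma_n / np.sqrt(2) * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
y = h * x + noise

h_ls = y / x                               # least squares estimate per pilot
# Per-subcarrier MMSE simplification: Wiener shrinkage of the LS estimate
# assuming uncorrelated taps with known channel power 1 and noise power sigma_n**2.
h_mmse = (1.0 / (1.0 + sigma_n ** 2)) * h_ls

mse = lambda est: np.mean(np.abs(est - h) ** 2)
print(f"LS MSE = {mse(h_ls):.4f},  MMSE MSE = {mse(h_mmse):.4f}")
```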
Procedia PDF Downloads 393
1609 Human Motion Capture: New Innovations in the Field of Computer Vision
Authors: Najm Alotaibi
Abstract:
Human motion capture has become one of the major areas of interest in the field of computer vision. Some of the major application areas that have been evolving rapidly include advanced human interfaces, virtual reality and security/surveillance systems. This study provides a brief overview of the techniques and applications used for markerless human motion capture, which deals with analyzing human motion in the form of mathematical formulations. The major contribution of this research is that it classifies the computer vision-based techniques of human motion capture according to a taxonomy and then breaks them down into four systematically different categories: tracking, initialization, pose estimation and recognition. Detailed descriptions, and descriptions of the relationships between them, are given for the tracking and pose estimation techniques, and the subcategories of each process are further described. The various hypotheses used by researchers in this domain are surveyed, and the evolution of these techniques is explained. The survey concludes that most researchers have focused on using mathematical body models for markerless motion capture.
Keywords: human motion capture, computer vision, vision-based, tracking
Procedia PDF Downloads 319
1608 Series Network-Structured Inverse Models of Data Envelopment Analysis: Pitfalls and Solutions
Authors: Zohreh Moghaddas, Morteza Yazdani, Farhad Hosseinzadeh
Abstract:
Nowadays, data envelopment analysis (DEA) models featuring network structures have gained widespread use for evaluating the performance of production systems and activities (decision-making units, DMUs) across diverse fields. By examining the relationships between the internal stages of the network, these models offer valuable insights to managers and decision-makers regarding the performance of each stage and its impact on the overall network. To further empower system decision-makers, the inverse data envelopment analysis (IDEA) model has been introduced. This model allows the estimation of crucial parameter information while keeping the efficiency score unchanged or improved, enabling analysis of the sensitivity of system inputs or outputs according to managers' preferences. This empowers managers to apply their preferences and policies to resources, such as inputs and outputs, and to analyze aspects such as production, resource allocation processes and resource efficiency enhancement within the system. The results obtained can be instrumental in making informed decisions in the future. The main result of this study is an analysis of the infeasibility and incorrect estimation that may arise in the theory and application of inverse data envelopment analysis models with network structures. To address these pitfalls, novel protocols are proposed to circumvent these shortcomings effectively. Subsequently, several theoretical and applied problems are examined and resolved through insightful case studies.
Keywords: inverse models of data envelopment analysis, series network, estimation of inputs and outputs, efficiency, resource allocation, sensitivity analysis, infeasibility
Procedia PDF Downloads 51
1607 Application of UAS in Forest Firefighting for Detecting Ignitions and 3D Fuel Volume Estimation
Authors: Artur Krukowski, Emmanouela Vogiatzaki
Abstract:
The article presents results from the AF3 project “Advanced Forest Fire Fighting”, focused on Unmanned Aircraft Systems (UAS)-based 3D surveillance and 3D area mapping using high-resolution photogrammetric methods from multispectral imaging, also taking advantage of the 3D scanning techniques from the SCAN4RECO project. We also present a proprietary embedded sensor system used for the detection of fire ignitions in the forest using a near-infrared scanner, with weight and form factors allowing it to be easily deployed on standard commercial micro-UAVs, such as the DJI Inspire or Mavic. Results from real-life pilot trials in Greece, Spain and Israel demonstrated the added value of using UAS for precise and reliable detection of forest fires, as well as high-resolution 3D aerial modeling for accurate quantification of the human resources and equipment required for firefighting.
Keywords: forest wildfires, surveillance, fuel volume estimation, firefighting, ignition detectors, 3D modelling, UAV
Procedia PDF Downloads 142
1606 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models
Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo De Magalhães
Abstract:
This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to the threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. In order to face this enormous task, the proposed resolution method adopts a smoothing strategy using a special class C∞ differentiable function. The final estimation solution is obtained by solving a sequence of differentiable subproblems which gradually approach the original conceptual problem. The use of this technique, called the Hyperbolic Smoothing Method (HSM), makes possible the application of the most powerful minimization algorithms and also allows the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments is presented to illustrate both the reliability and the efficiency of the proposed approach.
Keywords: rainfall-runoff models, automatic calibration, hyperbolic smoothing method
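The standard hyperbolic smoothing of the max(·, 0) operator replaces the kink with the C∞ function (y + sqrt(y² + τ²))/2, which converges to max(y, 0) as τ → 0. The sketch below applies it to a CRR-style storage overflow threshold; the threshold example and values are illustrative, not the paper's model.

```python
import numpy as np

def smooth_pos(y, tau):
    """Hyperbolic smoothing of max(y, 0): a C-infinity approximation that
    converges to the kink as tau -> 0."""
    return 0.5 * (y + np.sqrt(y * y + tau * tau))

# A CRR-style threshold term, e.g. overflow above a storage capacity:
# overflow = max(storage - cap, 0) is non-differentiable at storage == cap.
storage = np.linspace(0.0, 2.0, 5)
cap = 1.0
for tau in (0.5, 0.1, 0.01):
    print(tau, np.round(smooth_pos(storage - cap, tau), 4))
print("exact ", np.maximum(storage - cap, 0.0))
```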
Procedia PDF Downloads 149
1605 Estimation Model for Concrete Slump Recovery by Using Superplasticizer
Authors: Chaiyakrit Raoupatham, Ram Hari Dhakal, Chalermchai Wanichlamlert
Abstract:
This paper aims to introduce to practice a solution for concrete slump recovery using a type-F chemical admixture (superplasticizer, naphthalene-based), in order to solve the problem of unusable concrete due to slump loss, especially in tropical countries that have a faster slump loss rate. On the other hand, randomly adding superplasticizer to concrete can cause the concrete to segregate. Therefore, this paper also develops an estimation model used to calculate the amount of the second dose of superplasticizer needed for concrete slump recovery. The fresh properties of ordinary Portland cement concrete with a volumetric ratio of paste to void between aggregates (paste content) of 1.1-1.3, a water-cement ratio of 0.30 to 0.67 and an initial superplasticizer (naphthalene-based) dose of 0.25%-1.6% were tested for initial slump and slump loss every 30 minutes for one and a half hours by the slump cone test. Those concretes with slump loss ranging from 10% to 90% were re-dosed and successfully recovered back to their initial slump; the slump after re-dosing was measured by the slump cone test. From the results, it was concluded that slump loss was slower for mixes with a high initial dose of superplasticizer, because the addition of superplasticizer disturbs cement hydration. The required second dose of superplasticizer was affected by two major parameters, the water-cement ratio and the paste content, where a lower water-cement ratio and paste content cause an increase in the required second dose. The amount of the second dose of superplasticizer is higher as the solid content within the system increases, whether the solids come from cement particles or aggregate. The data were analyzed to form an equation used to estimate the amount of the second dose of superplasticizer required to recover the slump to its original value.
Keywords: estimation model, second superplasticizer dosage, slump loss, slump recovery
Procedia PDF Downloads 199
1604 UWB Channel Estimation Using an Efficient Sub-Nyquist Sampling Scheme
Authors: Yaacoub Tina, Youssef Roua, Radoi Emanuel, Burel Gilles
Abstract:
Recently, low-complexity sub-Nyquist sampling schemes based on the Finite Rate of Innovation (FRI) theory have been introduced to sample parametric signals at minimum rates. The multichannel modulating waveforms (MCMW) scheme is one such efficient scheme, where the received signal is mixed with an appropriate set of arbitrary waveforms, integrated and sampled at rates far below the Nyquist rate. In this paper, the MCMW scheme is adapted to the special case of ultra-wideband (UWB) channel estimation, characterized by dense multipath. First, an appropriate structure, which accounts for the bandpass spectrum feature of UWB signals, is defined. Then, a novel approach to decrease the number of processing channels and reduce the complexity of this sampling scheme is presented. Finally, the proposed concepts are validated by simulation results, obtained with real filters, in the framework of a coherent Rake receiver.
Keywords: coherent Rake receiver, finite rate of innovation, sub-Nyquist sampling, ultra wideband
Procedia PDF Downloads 256
1603 Acceleration-Based Motion Model for Visual Simultaneous Localization and Mapping
Authors: Daohong Yang, Xiang Zhang, Lei Li, Wanting Zhou
Abstract:
Visual Simultaneous Localization and Mapping (VSLAM) is a technology that obtains information about the environment for self-positioning and mapping. It is widely used in computer vision, robotics and other fields. Many visual SLAM systems, such as ORB-SLAM3, employ a constant-velocity motion model that provides the initial pose of the current frame to improve the speed and accuracy of feature matching. However, in practice, the constant-velocity motion model is often difficult to satisfy, which may lead to a large deviation between the obtained initial pose and the real value and, in turn, to errors in the nonlinear optimization results. Therefore, this paper proposes a motion model based on acceleration, which can be applied to most SLAM systems. In order to better describe the acceleration of the camera pose, we decouple the pose transformation matrix and calculate the rotation matrix and the translation vector separately, where the rotation matrix is represented by a rotation vector. We assume that, over a short period of time, the changes in the rotational angular velocity and the translation vector remain the same. Based on this assumption, the initial pose of the current frame is estimated. In addition, the error of the constant-velocity model is analyzed theoretically. Finally, we applied the proposed approach to the ORB-SLAM3 system and evaluated two sequences of the TUM dataset. The results show that our proposed method produces a more accurate initial pose estimate, and the accuracy of the ORB-SLAM3 system is improved by 6.61% and 6.46%, respectively, on the two test sequences.
Keywords: error estimation, constant acceleration motion model, pose estimation, visual SLAM
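A minimal sketch of the idea, not the paper's exact formulation, is given below: the relative motions of the last two frame pairs are decoupled into a rotation vector and a translation, their change is extrapolated one step forward (constant acceleration), and the result is composed onto the current pose to give the initial guess for the next frame. The pose convention and the test trajectory are assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def predict_pose(T_prev2, T_prev1, T_curr):
    """Constant-acceleration prediction of the next camera pose.
    Each pose is a 4x4 world-from-camera transform (assumed convention)."""
    def relative(Ta, Tb):                      # motion taking frame a to frame b
        M = np.linalg.inv(Ta) @ Tb
        return R.from_matrix(M[:3, :3]).as_rotvec(), M[:3, 3]

    r1, t1 = relative(T_prev2, T_prev1)
    r2, t2 = relative(T_prev1, T_curr)
    r_pred = 2.0 * r2 - r1                     # rotation-vector extrapolation
    t_pred = 2.0 * t2 - t1                     # translation extrapolation

    M = np.eye(4)
    M[:3, :3] = R.from_rotvec(r_pred).as_matrix()
    M[:3, 3] = t_pred
    return T_curr @ M                          # initial guess for the next frame

# Illustrative check with poses from a smoothly accelerating trajectory.
def pose(theta, x):
    T = np.eye(4)
    T[:3, :3] = R.from_euler("z", theta).as_matrix()
    T[0, 3] = x
    return T

print(predict_pose(pose(0.00, 0.0), pose(0.02, 0.10), pose(0.05, 0.22))[:3, 3])
```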
Procedia PDF Downloads 94
1602 Reservoir Properties Effect on Estimating Initial Gas in Place Using Flowing Material Balance Method
Authors: Yousef S. Kh. S. Hashem
Abstract:
Accurate estimation of the initial gas in place (IGIP) plays an important role in the decision to develop a gas field. One of the methods available in the industry to estimate the IGIP is material balance. This method requires that the well be shut in while the pressure is measured as it builds up to the average reservoir pressure. Since gas demand is high and shut-in well surveys are very expensive, flowing gas material balance (FGMB) is sometimes used instead of conventional material balance. This work investigated the effect of reservoir properties (pressure, permeability, and reservoir size) on the estimation of IGIP when using FGMB. A gas reservoir simulator that accounts for friction loss, wellbore storage, and the non-Darcy effect was used to simulate 165 different possible cases (3 pressures, 5 reservoir sizes, and 11 permeabilities). Both tubing pressure and bottom-hole pressure were analyzed using FGMB. The results showed that the FGMB method is very sensitive for tight reservoirs (k < 10). They also showed which method is best to use for different reservoir properties. This study can be used as a guideline for the application of the FGMB method.
Keywords: flowing material balance, gas reservoir, reserves, gas simulator
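For context, the material balance estimate rests on the volumetric gas relation that p/z declines linearly with cumulative production, so the x-intercept of the fitted line is the IGIP; the flowing variant derives the same straight line from flowing pressures without shutting in the well. The sketch below performs this extrapolation on synthetic, illustrative data (not the study's simulator output).

```python
import numpy as np

# Classic volumetric gas material balance: average reservoir p/z declines
# linearly with cumulative production Gp; the x-intercept is the IGIP.
Gp = np.array([0.0, 1.2, 2.5, 3.9, 5.4])                         # Bscf (synthetic)
p_over_z = np.array([4950.0, 4655.0, 4340.0, 3990.0, 3620.0])    # psia (synthetic)

slope, intercept = np.polyfit(Gp, p_over_z, 1)
igip = -intercept / slope                                        # Gp at p/z = 0
print(f"estimated IGIP ~ {igip:.1f} Bscf")
```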
Procedia PDF Downloads 155