Search results for: Depth estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4951

4291 Nanomechanical Characterization of Titanium Alloy Modified by Nitrogen Ion Implantation

Authors: Josef Sepitka, Petr Vlcak, Tomas Horazdovsky, Vratislav Perina

Abstract:

An ion implantation technique was used for designing the surface area of a titanium alloy and for irradiation-enhanced hardening of the surface. The Ti6Al4V alloy was treated by nitrogen ion implantation at fluences of 2·10¹⁷ and 4·10¹⁷ cm⁻² and at an ion energy of 90 keV. The depth distribution of the nitrogen was investigated by Rutherford Backscattering Spectroscopy. The gradient of mechanical properties was investigated by nanoindentation. The continuous measurement mode was used to obtain depth profiles of the indentation hardness and the reduced storage modulus of the modified surface area. The reduced storage modulus and the hardness increase with increasing fluence. Increased fluence shifts the peak of the mechanical properties as well as the peak of nitrogen concentration towards the surface. This effect suggests a direct relationship between mechanical properties and nitrogen distribution.

Keywords: nitrogen ion implantation, titanium-based nanolayer, storage modulus, hardness, microstructure

Procedia PDF Downloads 323
4290 Laser Corneoplastique™: A Refractive Surgery for Corneal Scars

Authors: Arun C. Gulani, Aaishwariya A. Gulani, Amanda Southall

Abstract:

Background: Laser Corneoplastique™, a minimally interventional, visually promising technique for patients with vision disability from corneal scars of varied causes, has been retrospectively reviewed and represents a paradigm shift in the mindset and approach towards corneal scars as a refractive surgery aiming for emmetropic, unaided vision of 20/20 in most cases. Three decades of work on this technique has been compiled in this 15-year study. Subject and Methods: The objective of this study was to determine the success of Laser Corneoplastique™ surgery as a treatment of corneal scar cases. Corneal scar cases of various etiologies that had undergone Laser Corneoplastique™ surgery over the past twenty years by a single surgeon, Arun C. Gulani, M.D., were retrospectively reviewed. The details of each case were retrieved from the medical records and analyzed. Each patient had been examined thoroughly at their preoperative appointments for stability of refraction and vision, depth of scar, pachymetry, topography, pattern of the scar, and uncorrected and best-corrected vision potential, all of which were taken into account in the patients' treatment plans. Results: 64 eyes of 53 patients were investigated for scar etiology, keratometry, visual acuity, and complications. There were 25 different etiologies seen, with the most common being a herpetic scar. The post-operative visual acuity averaged 20/23.55 (±7.05). Laser parameters used were depth and pulses. Overall, the mean laser ablation depth was 30.67 µm (±19.05), ranging from 2 to 73 µm, and the number of laser pulses averaged 191.85 (±112.02). Conclusion: Refractive Laser Corneoplastique™ surgery, when practiced as an art, can address all levels of ametropia while reversing complex corneas and scars from refractive surgery complications back to 20/20 vision.

Keywords: corneal scar, refractive surgery, corneal transplant, laser corneoplastique

Procedia PDF Downloads 163
4289 Technology of Gyro Orientation Measurement Unit (Gyro Omu) for Underground Utility Mapping Practice

Authors: Mohd Ruzlin Mohd Mokhtar

Abstract:

At present, most operators working on projects for utilities such as power, water, oil, gas, telecommunication, and sewerage use technologies such as the total station, the Global Positioning System (GPS), the Electromagnetic Locator (EML), and Ground Penetrating Radar (GPR) to perform underground utility mapping. With the increase in popularity of the Horizontal Directional Drilling (HDD) method among local authorities and asset owners, most newly installed underground utilities use the HDD method. The HDD method is seen as simple and creates little disturbance to the public and traffic; thus, it has become the preferred utility installation method in most areas, especially urban areas. HDD utilities are installed much deeper than existing utilities (some reports indicate HDD installations average 5 meters in depth). However, this affects the accuracy or ability of existing underground utility mapping technologies. In most Malaysian underground soil conditions, those technologies are limited to a maximum depth of 3 meters. Thus, utilities installed deeper than 3 meters cannot be detected by existing detection tools, and the accuracy and reliability of existing underground utility mapping technologies and work procedures are in doubt. A mitigation action plan is therefore required. While installing a new utility using the Horizontal Directional Drilling (HDD) method, more accurate underground utility mapping can be achieved by using the Gyro OMU compared to existing practice using, e.g., EML and GPR. The Gyro OMU is a method to accurately identify the location of the HDD installation; this mapping can be used or referred to in order to avoid the cost of breakdowns caused by future HDD works due to inaccurate underground utility mapping.

Keywords: Gyro Orientation Measurement Unit (Gyro OMU), Horizontal Directional Drilling (HDD), Ground Penetrating Radar (GPR), Electromagnetic Locator (EML)

Procedia PDF Downloads 120
4288 A Remedy for the Confusing Occlusal Principles - An Approach to a Passionate, In-Depth Understanding of Tooth Surfaces Dynamics

Authors: Kariem Elhelow

Abstract:

The task of optimizing tooth surface relations remains perplexing for many dental practitioners. The well-being of the teeth, periodontium, and musculoskeletal system is closely associated with occlusal stability. Dental occlusion is far beyond the simple contact of the occlusal surfaces of the opposing jaws, a fact that has turned the word “occlusion” into one of the most complicated puzzles in dentistry. The literature describing the pathological approaches has made the practice of occlusion even more intimidating. Understanding the biomechanics of teeth and jaw movements makes the goals of occlusal rehabilitation clear and simple to practice. The purpose of this article is to establish a path for understanding and practicing the fundamental occlusal principles in a simple yet in-depth way. Relying on the evidence-based core provides a context for showing that occlusion is not as complicated as the literature might suggest. Conclusion: Maintaining a well-defined picture of what a healthy occlusion should look like is very gratifying to both the operator and the patient, adding the value of predictability, esthetics, and function to the whole treatment.

Keywords: occlusal, temporomandibular joint, prosthetic, dentition

Procedia PDF Downloads 109
4287 Is Privatization Related with Macroeconomic Management? Evidence from Some Selected African Countries

Authors: E. O. George, P. Ojeaga, D. Odejimi, O. Mattehws

Abstract:

Has macroeconomic management succeeded in making privatization promote growth in Africa? What are the probable strategies that should accompany the privatization reform process to promote growth in Africa? To what extent has the privatization process succeeded in attracting foreign direct investment to Africa? The study investigates the relationship between macroeconomic management and privatization. Many African countries have embarked on one form of privatization reform or another since 1980 as one of the stringent conditions for accessing capital from the IMF and the World Bank. Secondly, globalization and the gradual integration of the African economy into the global economy also mean that Africa has to strategically develop its domestic market to cushion itself from the fluctuations and probable contagion associated with global economic crises that, as Stiglitz argues, are always inevitable. The methods of estimation used are OLS, linear mixed effects (LME), 2SLS, and GMM. It was found that macroeconomic management has the capacity to affect the success of the privatization reform process. It was also found that privatization was not promoting growth in Africa; privatization could promote growth if long-run growth strategies are implemented together with the privatization reform process. Privatization was also found not to have the capacity to attract foreign investment to many African countries.

Keywords: Africa, political economy, game theory, macroeconomic management and privatization

Procedia PDF Downloads 312
4286 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we have analyzed maximum precipitation data during a particular period of time obtained from different stations in the Global Historical Climatological Network of the USA. One important point to mention is that some stations are shut down on certain days for one reason or another; hence, the maximum values are recorded by excluding those readings. It is assumed that the number of stations that operate follows a zero-truncated Poisson distribution and that the daily precipitation follows a lognormal distribution. We call this model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters, and it can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, it is observed that the proposed model provides a better fit than some of the existing models.
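
The compound model described above has a closed-form distribution function: if N is zero-truncated Poisson(λ) and the individual readings are lognormal(μ, σ), the recorded maximum X satisfies F(x) = (exp(λ·Φ((ln x − μ)/σ)) − 1)/(exp(λ) − 1). As a rough illustration (not the authors' EM implementation), the sketch below simulates such maxima and fits the three parameters by direct numerical maximum likelihood; all values are placeholders invented for the example.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulate(lam, mu, sigma, size):
    """Maximum of a zero-truncated Poisson number of lognormal readings."""
    out = np.empty(size)
    for i in range(size):
        n = 0
        while n == 0:                      # zero-truncated Poisson draw
            n = rng.poisson(lam)
        out[i] = np.exp(mu + sigma * rng.standard_normal(n)).max()
    return out

def neg_log_lik(theta, x):
    lam, mu, sigma = theta
    if lam <= 0 or sigma <= 0:
        return np.inf
    z = (np.log(x) - mu) / sigma
    # log-pdf of the compound truncated Poisson lognormal maximum
    log_f = (np.log(lam) + norm.logpdf(z) - np.log(x * sigma)
             + lam * norm.cdf(z) - np.log(np.expm1(lam)))
    return -log_f.sum()

data = simulate(lam=3.0, mu=1.0, sigma=0.5, size=2000)
fit = minimize(neg_log_lik, x0=[1.0, 0.0, 1.0], args=(data,), method="Nelder-Mead")
print("MLE (lambda, mu, sigma):", fit.x)
```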

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 95
4285 Assessment of the Response of Seismic Refraction Tomography and Resistivity Imaging to the Same Geologic Environment: A Case Study of Zaria Basement Complex in North Central Nigeria

Authors: Collins C. Chiemeke, I. B. Osazuwa, S. O. Ibe, G. N. Egwuonwu, C. D. Ani, E. C. Chii

Abstract:

The study area is Zaria, located in the basement complex of northern Nigeria. The rock type forming the major part of the Zaria batholith is granite. This research work was carried out to compare the responses of seismic refraction tomography and resistivity tomography in the same geologic environment and under the same conditions. Hence, the choice of a site with a visible granitic outcrop that extends across a narrow stream channel and is flanked by unconsolidated overburden, a neutral profile covered by plain overburden, and a site with thick lateritic cover became necessary. The results of the seismic and resistivity tomography models reveal that seismic velocity and resistivity do not always increase simultaneously with depth; rather, their responses in any geologic environment are determined by changes in the mechanical and chemical content of the rock types rather than by depth.

Keywords: environment, resistivity, response, seismic, velocity

Procedia PDF Downloads 334
4284 Analysis of Earthquake Potential and Shock Level Scenarios in South Sulawesi

Authors: Takhul Bakhtiar

Abstract:

In South Sulawesi Province, there is an active Walanae Fault, which causes this area to frequently experience earthquakes. This study aims to determine the seismicity level of the region in order to estimate the potential for future earthquakes. The estimated earthquake potential is then used in a scenario model to determine the expected level of shaking, as an effort to mitigate earthquake disasters in the region. The method used in this study is the Gutenberg-Richter method through the statistical likelihood approach. This study used earthquake data for the South Sulawesi region from 1972 to 2022. The research location lies at 3.5° – 5.5° South Latitude and 119.5° – 120.5° East Longitude and is divided into two segments: the northern segment at 3.5° – 4.5° South Latitude and 119.5° – 120.5° East Longitude, and the southern segment at 4.5° – 5.5° South Latitude and 119.5° – 120.5° East Longitude. This study uses earthquake parameters with a magnitude > 1 and a depth < 50 km. The results of the analysis show that the potential for an earthquake with magnitude M = 7 in the next ten years in the northern segment is estimated at 98.81%, with an estimated shaking level of VI-VII MMI around the cities of Pare-Pare, Barru, Pinrang and Soppeng and IV-V MMI in the cities of Bulukumba, Selayar, Makassar and Gowa. In the southern segment, the potential for an earthquake with magnitude M = 7 in the next ten years is estimated at 32.89%, with an estimated shaking level of VI-VII MMI in the cities of Bulukumba, Selayar, Makassar and Gowa, and III-IV MMI around the cities of Pare-Pare, Barru, Pinrang and Soppeng.
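
To make the Gutenberg-Richter workflow concrete, the sketch below estimates the b-value with the Aki-Utsu maximum-likelihood formula, derives the a-value from the catalog rate, and converts the annual rate of M ≥ 7 events into a ten-year Poisson probability. The catalog, completeness magnitude, and observation window are synthetic placeholders, not the study's data.

```python
import numpy as np

# Illustrative catalog: magnitudes above an assumed completeness magnitude Mc
rng = np.random.default_rng(1)
Mc = 3.0
b_true = 1.0
mags = Mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=500)
years_of_catalog = 50.0

# Aki/Utsu maximum-likelihood b-value
b = np.log10(np.e) / (mags.mean() - Mc)

# a-value from the annual rate of events with M >= Mc: log10 N(>=M) = a - b*M
annual_rate_Mc = len(mags) / years_of_catalog
a = np.log10(annual_rate_Mc) + b * Mc

# Annual rate of M >= 7 and Poisson probability of at least one event in 10 years
M_target, T = 7.0, 10.0
rate_M7 = 10 ** (a - b * M_target)
prob_10yr = 1.0 - np.exp(-rate_M7 * T)
print(f"b = {b:.2f}, annual rate(M>=7) = {rate_M7:.4f}, P(>=1 event in 10 yr) = {prob_10yr:.2%}")
```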

Keywords: Gutenberg Richter, likelihood method, seismicity, shakemap and MMI scale

Procedia PDF Downloads 107
4283 Estimation of Fragility Curves Using Proposed Ground Motion Selection and Scaling Procedure

Authors: Esra Zengin, Sinan Akkar

Abstract:

Reliable and accurate prediction of nonlinear structural response requires the specification of appropriate earthquake ground motions to be used in nonlinear time history analysis. Current research has mainly focused on the selection and manipulation of real earthquake records, which can be seen as the most critical step in the performance-based seismic design and assessment of structures. Utilizing amplitude-scaled ground motions that match the target spectra is a commonly used technique for the estimation of nonlinear structural response. Representative ground motion ensembles are selected to match a target spectrum such as a scenario-based spectrum derived from ground motion prediction equations, the Uniform Hazard Spectrum (UHS), the Conditional Mean Spectrum (CMS), or the Conditional Spectrum (CS). Different sets of criteria exist among the developed methodologies to select and scale ground motions with the objective of obtaining a robust estimate of structural performance. This study presents a ground motion selection and scaling procedure that considers the spectral variability at the target demand together with the level of ground motion dispersion. The proposed methodology provides a set of ground motions whose response spectra match the target median and the corresponding variance within a specified period interval. An efficient and simple algorithm is used to assemble the ground motion sets. The scaling stage is based on the minimization of the error between the scaled median and the target spectrum, while the dispersion of the earthquake shaking is preserved along the period interval. The impact of the spectral variability on the nonlinear response distribution is investigated at the level of inelastic single-degree-of-freedom systems. In order to see the effect of different selection and scaling methodologies on fragility curve estimates, results are compared with those obtained by the CMS-based scaling methodology. The variability in fragility curves due to the consideration of dispersion in the ground motion selection process is also examined.
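
A minimal sketch of the scaling idea follows: a single amplitude factor is chosen by least squares in log space so that the ensemble median spectrum matches the target over the period interval, which leaves the record-to-record dispersion untouched. The spectra and target below are synthetic stand-ins, and the study's full selection algorithm is not reproduced.

```python
import numpy as np

# Illustrative target spectrum and record spectra sampled at common periods (s)
periods = np.linspace(0.1, 2.0, 20)
target_median = 0.8 * np.exp(-periods)                 # assumed target spectrum (g)
rng = np.random.default_rng(2)
record_spectra = target_median * rng.lognormal(mean=-0.4, sigma=0.3, size=(30, periods.size))

# Single scale factor for the whole set: least-squares match of the ensemble
# median (geometric mean of the spectra) to the target in log space
ensemble_median = np.exp(np.log(record_spectra).mean(axis=0))
s = np.exp(np.mean(np.log(target_median) - np.log(ensemble_median)))
scaled = s * record_spectra

# A common factor preserves the record-to-record dispersion (log standard deviation)
disp_before = np.log(record_spectra).std(axis=0).mean()
disp_after = np.log(scaled).std(axis=0).mean()
print(f"scale factor = {s:.3f}, dispersion before/after = {disp_before:.3f}/{disp_after:.3f}")
```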

Keywords: ground motion selection, scaling, uncertainty, fragility curve

Procedia PDF Downloads 570
4282 FPGA Based Vector Control of PM Motor Using Sliding Mode Observer

Authors: Hanan Mikhael Dawood, Afaneen Anwer Abood Al-Khazraji

Abstract:

The paper presents an investigation of the field oriented control strategy of a Permanent Magnet Synchronous Motor (PMSM) based on hardware-in-the-loop (HIL) simulation over a wide speed range. A sensorless rotor position estimation using a sliding mode observer for the permanent magnet synchronous motor is illustrated, considering the effects of magnetic saturation between the d and q axes. The cross saturation between the d and q axes has been calculated by finite-element analysis. Therefore, the inductance measurement accounts for saturation and cross saturation, which are used to obtain the suitable id-characteristics in the base and flux-weakening regions. Real-time matrix multiplication is implemented in a Field Programmable Gate Array (FPGA) using a floating-point number system; the Quartus-II environment is used to develop the FPGA designs, which are then downloaded into the development kit. A dSPACE DS1103 is utilized for Pulse Width Modulation (PWM) switching and the controller. The hardware-in-the-loop results are compared to those from the Matlab simulation. Various dynamic conditions have been investigated.

Keywords: magnetic saturation, rotor position estimation, sliding mode observer, hardware in the loop (HIL)

Procedia PDF Downloads 508
4281 Numerical Analysis of Stainless Steel Beam-to-Column Joints with Bolted Flush End Plates

Authors: Takwiir Tahriim Khan, Tausif Khalid, Mohammad Redwan Ahamed, Md Soebur Rahman

Abstract:

The mutual connection in joints has a significant impact on the safe and cost-effective design of steel structures. Generally, the end plates are welded at the ends of the beam, and the columns are bolted to the end plates. Thus, the moment is transferred at the interface, which is a critical segment of the connection. 3-D Finite Element Models (FEM) have been developed using ABAQUS 2017 software to predict the yield capacity of the end plate connections. The parameters used in this study are the depth, width, and thickness of the end plate, the dimensions of the bolt, and the sectional and material properties of the beams and columns. The influence of the width, depth, and thickness of the end plate on yield capacity was investigated through parametric studies. The results showed that, for an increase in plate thickness from 0.3 inch to 0.8 inch in increments of 0.1 inch, the yield capacity increased by 2.85% on average; for a decrease in the end plate depth from 13 inches to 11 inches, the yield capacity increased by 25.4%; and for a decrease in the end plate width from 6.5 inches to 5.75 inches, the yield capacity increased by 35.4%. Variation in yield capacity was also found when changing the beam and column sections. In addition, the numerical results showed good agreement with published experimental literature, with an average variation of less than 8.3% in yield capacity. The study thus allows for a more effective combination of beam, column, and end plate dimensions.

Keywords: steel beam-column joints, finite element analysis, yield moment capacity, parametric study, ABAQUS, bolted joints, flush end plates, moment vs rotation curves

Procedia PDF Downloads 92
4280 Offline Parameter Identification and State-of-Charge Estimation for Healthy and Aged Electric Vehicle Batteries Based on the Combined Model

Authors: Xiaowei Zhang, Min Xu, Saeid Habibi, Fengjun Yan, Ryan Ahmed

Abstract:

Recently, Electric Vehicles (EVs) have received extensive consideration since they offer a more sustainable and greener transportation alternative compared to fossil-fuel propelled vehicles. Lithium-Ion (Li-ion) batteries are increasingly being deployed in EVs because of their high energy density, high cell-level voltage, and low rate of self-discharge. Since Li-ion batteries represent the most expensive component in the EV powertrain, accurate monitoring and control strategies must be executed to ensure their prolonged lifespan. The Battery Management System (BMS) has to accurately estimate parameters such as the battery State-of-Charge (SOC), State-of-Health (SOH), and Remaining Useful Life (RUL). In order for the BMS to estimate these parameters, an accurate and control-oriented battery model has to work collaboratively with a robust state and parameter estimation strategy. Since battery physical parameters, such as the internal resistance and diffusion coefficient, change depending on the battery state-of-life (SOL), the BMS has to be adaptive to accommodate this change. In this paper, an extensive battery aging study has been conducted over a 12-month period on 5.4 Ah, 3.7 V lithium polymer cells. Instead of using fixed charging/discharging aging cycles at a fixed C-rate, a set of real-world driving scenarios has been used to age the cells. The test has been interrupted at every 5% of capacity degradation by a set of reference performance tests to assess the battery degradation and track model parameters. As the battery ages, the combined model parameters are optimized and tracked in an offline mode over the entire battery lifespan. Based on the optimized model, a state and parameter estimation strategy based on the Extended Kalman Filter (EKF) and the relatively new Smooth Variable Structure Filter (SVSF) has been applied to estimate the SOC at various states of life.
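
As a simplified illustration of the state-estimation side (not the paper's combined model or the SVSF), the sketch below runs a scalar extended Kalman filter on a coulomb-counting state equation with an assumed linear OCV curve and ohmic resistance; the 5.4 Ah capacity matches the cells described above, while all other numbers are placeholders.

```python
import numpy as np

# Assumed cell data (illustrative): capacity in As, ohmic resistance, linear OCV curve
Q, R0 = 5.4 * 3600, 0.02
ocv  = lambda soc: 3.2 + 1.0 * soc          # OCV(SOC), assumed linear for the sketch
docv = lambda soc: 1.0                      # d(OCV)/d(SOC)

def ekf_soc(current, voltage, dt, soc0=0.9, P=1e-2, q=1e-7, r=1e-3):
    """Scalar extended Kalman filter: state = SOC, measurement = terminal voltage."""
    soc = soc0
    history = []
    for i_k, v_k in zip(current, voltage):
        # Predict: coulomb counting (discharge current taken positive)
        soc = soc - i_k * dt / Q
        P = P + q
        # Update with the voltage measurement
        H = docv(soc)                       # Jacobian of the measurement model
        v_pred = ocv(soc) - R0 * i_k
        K = P * H / (H * P * H + r)
        soc = soc + K * (v_k - v_pred)
        P = (1 - K * H) * P
        history.append(soc)
    return np.array(history)

# Synthetic discharge at 1 C with noisy voltage readings
dt, n = 1.0, 1800
i = np.full(n, 5.4)
true_soc = 0.9 - np.cumsum(i) * dt / Q
v = ocv(true_soc) - R0 * i + np.random.default_rng(3).normal(0, 0.01, n)
est = ekf_soc(i, v, dt)
print("final SOC estimate vs truth:", est[-1], true_soc[-1])
```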

Keywords: lithium-ion batteries, genetic algorithm optimization, battery aging test, parameter identification

Procedia PDF Downloads 252
4279 Support Vector Machine Based Retinal Therapeutic for Glaucoma Using Machine Learning Algorithm

Authors: P. S. Jagadeesh Kumar, Mingmin Pan, Yang Yung, Tracy Lin Huan

Abstract:

Glaucoma is a group of visual maladies characterized by progressive optic nerve neuropathy, leading to a dwindling visual field and, ultimately, loss of sight. In this paper, a support vector machine based retinal therapeutic for glaucoma using a machine learning algorithm is presented. The algorithm is built on a correlation clustering mode and visualizes its computations in a multi-dimensional space. Support vector clustering turns out to be comparable to the scale-space approach that investigates the cluster organization by means of a kernel density estimation of the likelihood distribution, where cluster midpoints are identified by the neighborhood maxima of the density. The proposed approach achieves a 91% attainment rate on a data set consisting of 500 realistic images of healthy and glaucomatous retinas; therefore, the computational benefit of relying on the cluster overlapping system, based on the machine learning algorithm, yields complete performance in glaucoma therapeutics.

Keywords: machine learning algorithm, correlation clustering mode, cluster overlapping system, glaucoma, kernel density estimation, retinal therapeutic

Procedia PDF Downloads 225
4278 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases

Authors: Daniel C. Bonzo

Abstract:

Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, one can think of extrapolation of data as extending information and conclusions from one estimand to another. This approach induces hierarchically clustered data with varying cluster sizes. Extrapolation of clinical trial data is increasingly being accepted by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, a different but related indication, or a different but similar product. We consider here the problem of estimation (point and interval) using a mixed-models approach under extrapolation. It is proposed that estimators (point and interval) be constructed using weighting schemes for the clusters, e.g., equal weights and weights proportional to cluster size; a small sketch of these two schemes is given below. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. In conclusion, the evaluation results showed that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but risk-benefit evaluation as well.
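
The weighting idea can be illustrated directly: with cluster (subject) means x̄_i and weights w_i, the point estimate is Σ w_i x̄_i, with w_i = 1/k for equal weighting or w_i = n_i/Σ n_j for size-proportional weighting. The sketch below contrasts the two schemes on simulated hierarchical data and adds a simple normal-approximation interval; it is an illustration under assumed data, not the mixed-models estimator used in the study.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative hierarchical data: subjects (clusters) with varying numbers of episodes
n_subjects = 30
cluster_sizes = rng.integers(1, 12, size=n_subjects)         # varying cluster sizes
subject_effects = rng.normal(0.0, 0.5, size=n_subjects)      # between-subject variability
clusters = [1.0 + u + rng.normal(0, 1.0, size=m)             # episode-level outcomes
            for u, m in zip(subject_effects, cluster_sizes)]

cluster_means = np.array([c.mean() for c in clusters])
sizes = cluster_sizes.astype(float)

# Point estimates under the two weighting schemes
equal_w = np.full(n_subjects, 1.0 / n_subjects)
size_w = sizes / sizes.sum()
est_equal = np.sum(equal_w * cluster_means)
est_size = np.sum(size_w * cluster_means)

# Rough interval from the weighted between-cluster variance (normal approximation)
def ci(est, w, means, level=1.96):
    var = np.sum(w**2 * (means - est) ** 2) / (1.0 - np.sum(w**2))
    half = level * np.sqrt(var)
    return est - half, est + half

print("equal weights:", est_equal, ci(est_equal, equal_w, cluster_means))
print("size-proportional:", est_size, ci(est_size, size_w, cluster_means))
```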

Keywords: clustered data, estimand, extrapolation, mixed model

Procedia PDF Downloads 121
4277 Plot Scale Estimation of Crop Biophysical Parameters from High Resolution Satellite Imagery

Authors: Shreedevi Moharana, Subashisa Dutta

Abstract:

The present study focuses on the estimation of crop biophysical parameters such as crop chlorophyll, nitrogen, and water stress at the plot scale in crop fields. To achieve this, we have used high-resolution LISS IV satellite imagery. A new methodology has been proposed in this research work: the spectral shape function of the paddy crop is employed to identify the significant wavelengths sensitive to paddy crop parameters. From the shape functions, regression index models were established for the critical wavelength together with the minimum and maximum wavelengths of the multi-spectral high-resolution LISS IV data. Moreover, these functional relationships were utilized to develop the index models. From these index models, crop biophysical parameters were estimated and mapped from LISS IV imagery at the plot scale at the crop field level. The results showed that the nitrogen content of the paddy crop varied from 2-8%, chlorophyll from 1.5-9%, and water content from 40-90%, respectively. It was observed that the variability in the rice agriculture system in India was purely a function of field topography.

Keywords: crop parameters, index model, LISS IV imagery, plot scale, shape function

Procedia PDF Downloads 149
4276 Determination and Distribution of Formation Thickness Using Seismic and Well Data in Baga/Lake Sub-basin, Chad Basin Nigeria

Authors: Gabriel Efomeh Omolaiye, Olatunji Seminu, Jimoh Ajadi, Yusuf Ayoola Jimoh

Abstract:

The Nigerian part of the Chad Basin has, to date, been one of the least critically studied basins, with few published scholarly works compared to other basins such as the Niger Delta, Dahomey, etc. This work was undertaken through the integration of 3D seismic interpretation and the analysis of well data from eight wells fairly distributed in block A, Baga/Lake sub-basin in the Borno basin, with the aim of determining the thickness of the Chad, Kerri-Kerri, Fika, and Gongila Formations in the sub-basin. The Da-1 well (type well) used in this study was subdivided into stratigraphic units based on the regional stratigraphic subdivision of the Chad basin and was later correlated with the other wells using similarity of observed log responses. The combined density and sonic logs were used to generate synthetic seismograms for seismic-to-well ties. Five horizons were mapped, representing the tops of the formations on the 3D seismic data covering the block; an average velocity function with a maximum error/residual of 0.48% was adopted in the time-to-depth conversion of all the generated maps. There is a general thickening of sediments from west to east, and the estimated thicknesses of the formations in the Baga/Lake sub-basin are: Chad Formation (400-750 m), Kerri-Kerri Formation (300-1200 m), Fika Formation (300-2200 m), and Gongila Formation (100-1300 m). The thickness of the Bima Formation could not be established because the deepest well (Da-1) terminates within the formation. This is a modification to previous and widely referenced studies of over four decades that based the estimation of formation thickness within the study area on outcrops observed at different locations and on the use of few well data.

Keywords: Baga/Lake sub-basin, Chad basin, formation thickness, seismic, velocity

Procedia PDF Downloads 157
4275 Games behind Bars: A Longitudinal Study of Inmates Pro-Social Preferences

Authors: Mario A. Maggioni, Domenico Rossignoli, Simona Beretta, Sara Balestri

Abstract:

The paper presents the results of a longitudinal Randomized Control Trial implemented in 2016 in two state prisons in California (USA). The subjects were randomly assigned to a 10-month program (GRIP, Guiding Rage Into Power) aiming at undoing the destructive behavioral patterns that lead to criminal actions by raising the individual's 'mindfulness'. This study tests whether participation in this program (treatment), based on strong relationships and mutual help, affects the pro-social behavior of participants, in particular with reference to trust and inequality aversion. The research protocol entails the administration of two questionnaires including a set of behavioral situations ('games') - widely used in the relevant literature in the field - to 80 inmates: 42 treated (enrolled in the program) and 38 controls. The first questionnaire was administered before treatment and randomization took place; the second questionnaire at the end of the program. The results of a Difference-in-Differences estimation procedure show that trust increases significantly among GRIP participants compared to the control group. The result is robust to alternative estimation techniques and to the inclusion of a set of covariates that further control for idiosyncratic characteristics of the prisoners.
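
As a sketch of the estimation step, the snippet below sets up the canonical two-period difference-in-differences regression, y = β0 + β1·treated + β2·post + β3·treated×post, where β3 is the treatment effect, on simulated data with the study's group sizes (42 treated, 38 controls) and inmate-clustered standard errors. The outcome values and effect size are invented for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)

# Illustrative panel: trust-game transfers for treated (GRIP) and control inmates,
# measured before (post=0) and after (post=1) the program
n_treat, n_ctrl, effect = 42, 38, 0.8
rows = []
for i in range(n_treat + n_ctrl):
    treated = int(i < n_treat)
    baseline = rng.normal(5.0, 1.0)
    for post in (0, 1):
        y = baseline + 0.2 * post + effect * treated * post + rng.normal(0, 0.5)
        rows.append({"id": i, "treated": treated, "post": post, "y": y})
df = pd.DataFrame(rows)

# Difference-in-differences: the coefficient on treated:post is the treatment effect
model = smf.ols("y ~ treated + post + treated:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["id"]})   # cluster SEs by inmate
print(model.params["treated:post"], model.bse["treated:post"])
```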

Keywords: behavioral economics, difference in differences, longitudinal study, pro-social preferences

Procedia PDF Downloads 373
4274 Evaluation of Expected Annual Loss Probabilities of RC Moment Resisting Frames

Authors: Saemee Jun, Dong-Hyeon Shin, Tae-Sang Ahn, Hyung-Joon Kim

Abstract:

Building loss estimation methodologies, which have advanced considerably in recent decades, are usually used to estimate the socio-economic impacts resulting from seismic structural damage. In accordance with these methods, this paper presents the evaluation of the annual loss probability of a reinforced concrete moment resisting frame designed according to the Korean Building Code. The annual loss probability is defined by (1) a fragility curve obtained from a capacity spectrum method similar to the method adopted in HAZUS, and (2) a seismic hazard curve derived from annual frequencies of exceedance per peak ground acceleration. Seismic fragilities are computed to calculate the annual loss probability of a given structure using functions depending on structural capacity, seismic demand, structural response, and the probability of exceeding the damage state thresholds. This study carried out a nonlinear static analysis to obtain the capacity of an RC moment resisting frame selected as a prototype building. The analysis results show that the probability of extensive structural damage in the prototype building is expected to be 0.004% in a year.
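
The two ingredients named above combine in a standard way: the annual rate of reaching a damage state is the fragility curve integrated against the slope of the hazard curve, λ_DS = ∫ P(DS | im) · |dλ(im)/d im| d im. The sketch below evaluates that integral numerically for an assumed lognormal fragility and an illustrative power-law hazard curve; none of the numbers are taken from the study.

```python
import numpy as np
from scipy.stats import lognorm

# Assumed inputs (illustrative): lognormal fragility for "extensive" damage and a
# hazard curve giving the annual frequency of exceedance of peak ground acceleration
pga = np.linspace(0.01, 2.0, 400)                       # PGA grid (g)
median, beta = 0.9, 0.5                                 # fragility parameters
fragility = lognorm.cdf(pga, s=beta, scale=median)      # P(damage >= extensive | PGA)
hazard = 1e-3 * (pga / 0.1) ** -2.5                     # annual frequency of exceedance

# Annual probability of damage: integrate fragility against the hazard-curve slope
d_lambda = -np.gradient(hazard, pga)                    # occurrence density of PGA
annual_rate = np.sum(fragility * d_lambda) * (pga[1] - pga[0])
annual_prob = 1.0 - np.exp(-annual_rate)                # Poisson occurrence assumption
print(f"annual probability of extensive damage ~ {annual_prob:.3e}")
```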

Keywords: expected annual loss, loss estimation, RC structure, fragility analysis

Procedia PDF Downloads 386
4273 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis

Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro

Abstract:

If risk management of the assets owned by companies, risk assessment of real estate portfolios, and risk identification for an entire region are to be implemented, it is necessary to consider simultaneous damage to multiple buildings. This research focuses on the Sagami Trough earthquake tsunami, which could have a significant effect on the Japanese capital region, and proposes a method for simultaneous damage assessment using copulas that can take into consideration the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at two sites were simulated by using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depths (five cases) for each of 10 sources of the Sagami Trough. For each source, the frequency distributions of the tsunami inundation depth were evaluated by using the response surface method. Then, Monte-Carlo simulation was conducted, and frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough; these are the marginal distributions. Kendall's tau for the tsunami inundation simulation at the two sites was 0.83. Based on this value, Gaussian, t, Clayton, and Gumbel copulas (n = 10,000) were generated. Then, the simultaneous distributions of the damage rate were evaluated using the marginal distributions and the copulas. When the correlation of the tsunami inundation depth at the two sites was taken into account, the expected value hardly changed compared with the no-correlation case, but the ninety-ninth percentile of the damage rate was approximately 2%, and the maximum value was approximately 6% when using the Gumbel copula.
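
As an illustration of the copula step, the sketch below builds a bivariate Gaussian copula whose parameter is derived from the reported Kendall's tau (ρ = sin(πτ/2)), pushes the correlated uniforms through assumed lognormal depth marginals, and inspects the tail of a two-building damage rate. The marginals and the depth-to-damage curve are placeholders, not the study's response-surface results.

```python
import numpy as np
from scipy.stats import norm, lognorm, kendalltau

tau = 0.83                                   # Kendall's tau from the tsunami simulations
rho = np.sin(np.pi * tau / 2.0)              # Gaussian-copula parameter from tau

# Assumed lognormal marginals for the inundation depth at the two sites (illustrative)
marg1 = lognorm(s=0.6, scale=2.0)
marg2 = lognorm(s=0.5, scale=1.5)

rng = np.random.default_rng(7)
n = 10_000
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
u = norm.cdf(z)                              # correlated uniforms (the Gaussian copula)
depth1, depth2 = marg1.ppf(u[:, 0]), marg2.ppf(u[:, 1])

# Map depth to a damage rate with an assumed vulnerability curve, then look at the joint tail
damage = lambda d: np.clip(d / 5.0, 0.0, 1.0)
joint_rate = 0.5 * (damage(depth1) + damage(depth2))     # portfolio of two buildings
print("sample Kendall tau:", round(float(kendalltau(depth1, depth2)[0]), 2))
print("99th percentile joint damage rate:", round(float(np.quantile(joint_rate, 0.99)), 3))
```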

Keywords: copulas, Monte-Carlo simulation, probabilistic risk assessment, tsunamis

Procedia PDF Downloads 124
4272 High Speed Motion Tracking with Magnetometer in Nonuniform Magnetic Field

Authors: Jeronimo Cox, Tomonari Furukawa

Abstract:

Magnetometers have become more popular in inertial measurement units (IMUs) for their ability to correct estimates using the earth's magnetic field. Accelerometer and gyroscope-based packages fail with dead-reckoning errors accumulated over time. Localization in robotic applications with magnetometer-inclusive IMUs has become popular as a way to track the odometry of slower-speed robots. With high-speed motions, the accumulated error increases over smaller periods of time, making them difficult to track with an IMU. Tracking a high-speed motion is especially difficult with limited observability: visual obstruction of motion leaves motion-tracking cameras unusable, and when motions are too dynamic for estimation techniques reliant on the observability of the gravity vector, the use of magnetometers is further justified. As available magnetometer calibration methods are limited by the assumption that background magnetic fields are uniform, estimation in nonuniform magnetic fields is problematic. Hard iron distortion is a distortion of the magnetic field by other objects that produce magnetic fields. This kind of distortion is often observed as the offset of the center of the data points from the origin when a magnetometer is rotated. The magnitude of hard iron distortion depends on proximity to the distortion sources. Soft iron distortion is more related to the scaling of the axes of the magnetometer sensors; hard iron distortion is the larger contributor to the error of attitude estimation with magnetometers. Indoor environments or spaces inside ferrite-based structures, such as building reinforcements or a vehicle, often cause distortions that vary with proximity. As positions correlate with areas of distortion, methods of magnetometer localization include the production of spatial maps of the magnetic field and the collection of distortion signatures to better aid location tracking. The goal of this paper is to compare magnetometer methods that do not need pre-produced magnetic field maps, since mapping the magnetic field in some spaces can be costly and inefficient. In this work, dynamic measurement fusion is used to track the motion of a multi-link system. Conventional calibration by collecting data during rotation at a static point, real-time estimation of the calibration parameters at each time step, and the use of two magnetometers to determine local hard iron distortion are compared to confirm the robustness and accuracy of each technique. With opposite-facing magnetometers, hard iron distortion can be accounted for regardless of position, rather than assuming that hard iron distortion is constant under positional change. The motion measured is a repeatable planar motion of a two-link system connected by revolute joints. The links are translated on a moving base to impulse rotation of the links. The joints are equipped with absolute encoders, and the motion is recorded with cameras to enable ground-truth comparison with each of the magnetometer methods. While the two-magnetometer method accounts for local hard iron distortion, the method fails where the magnetic field direction in space is inconsistent.
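
The rotation-based calibration mentioned above amounts to fitting a sphere to the raw readings: the sphere centre is the hard-iron offset and the radius approximates the local field magnitude. A minimal least-squares version is sketched below on synthetic data; the offset and noise levels are invented for the example.

```python
import numpy as np

def hard_iron_offset(m):
    """Least-squares sphere fit: returns the hard-iron offset (sphere centre) and radius.

    Solves |m - c|^2 = r^2 as the linear system 2 m.c + (r^2 - |c|^2) = |m|^2.
    """
    A = np.column_stack([2.0 * m, np.ones(len(m))])
    b = np.sum(m**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = sol[:3]
    radius = np.sqrt(sol[3] + centre @ centre)
    return centre, radius

# Illustrative rotation data: 50 uT field, offset by a nearby ferrous source
rng = np.random.default_rng(8)
true_offset = np.array([12.0, -7.0, 20.0])
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
readings = 50.0 * dirs + true_offset + rng.normal(0, 0.5, size=(500, 3))

offset, field = hard_iron_offset(readings)
print("estimated offset:", offset.round(2), " field magnitude:", round(field, 1))
```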

Keywords: motion tracking, sensor fusion, magnetometer, state estimation

Procedia PDF Downloads 65
4271 An Approach to Apply Kernel Density Estimation Tool for Crash Prone Location Identification

Authors: Kazi Md. Shifun Newaz, S. Miaji, Shahnewaz Hazanat-E-Rabbi

Abstract:

In this study, the kernel density estimation tool has been used to identify the most crash-prone locations on a national highway of Bangladesh. Like other developing countries, Bangladesh has seen road traffic crashes (RTC) become a great social alarm, and the situation is deteriorating day by day. Today's black spot identification process is not based on modern technical tools and in most cases provides wrong output. In this situation, characteristic analysis and black spot identification by spatial analysis would be an effective and low-cost approach to ensuring road safety. The methodology of this study incorporates a framework, on the basis of a spatial-temporal study, to identify the locations where most RTCs occur. In this study, a very important economic corridor, the Dhaka to Sylhet highway, has been chosen to apply the method. This research proposes that the KDE method for identification of Hazardous Road Locations (HRL) could be used for all other national highways in Bangladesh and also for other developing countries. Some recommendations have been suggested for policy makers to reduce RTCs on the Dhaka-Sylhet highway, especially at black spots.
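
A minimal version of the KDE step might look like the sketch below: crash locations along the corridor are turned into a smooth intensity surface with a Gaussian kernel, and the top few percent of the surface are flagged as candidate hazardous road locations. The coordinates, bandwidth, and threshold are illustrative choices, not the study's calibrated values.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(9)

# Illustrative crash records along a highway corridor: (chainage km, offset km)
background = np.column_stack([rng.uniform(0, 240, 300), rng.normal(0, 0.3, 300)])
black_spot = np.column_stack([rng.normal(95, 2.0, 60), rng.normal(0, 0.1, 60)])
crashes = np.vstack([background, black_spot])

# Kernel density estimate of crash intensity over the corridor
kde = gaussian_kde(crashes.T, bw_method=0.15)
grid_x = np.linspace(0, 240, 481)
grid = np.column_stack([grid_x, np.zeros_like(grid_x)])   # evaluate along the centreline
density = kde(grid.T)

# Flag hazardous road locations as the top 5% of the density surface
threshold = np.quantile(density, 0.95)
hrl_km = grid_x[density >= threshold]
print("candidate hazardous road locations (km):", hrl_km.min(), "-", hrl_km.max())
```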

Keywords: hazardous road location (HRL), crash, GIS, kernel density

Procedia PDF Downloads 297
4270 The Construction of Research-Oriented/Practice-Oriented Engineering Testing and Measurement Technology Course under the Condition of New Technology

Authors: He Lingsong, Wang Junfeng, Tan Qiong, Xu Jiang

Abstract:

The paper describes efforts to reconstruct the engineering testing and measurement technology course by applying new techniques and applications. Firstly, a flipped classroom was introduced: in-class time was used for in-depth discussions and interactions, while theory and concept teaching was done through self-study outside of class. Secondly, two hands-on practices of technique application, including program design for MATLAB signal analysis and a measurement application with an Arduino sensor, have been covered in class. The class was transformed from an instructor-centered teaching process into an active, student-centered learning process consisting of a pre-class massive open online course (MOOC), in-class discussion, and after-class practice. The third effort is to replace purely written homework with research-oriented application practice assignments, so as to enhance the breadth and depth of the course.

Keywords: testing and measurement, flipped classroom, MOOC, research-oriented learning, practice-oriented learning

Procedia PDF Downloads 127
4269 Nondestructive Testing for Reinforced Concrete Buildings with Active Infrared Thermography

Authors: Huy Q. Tran, Jungwon Huh, Kiseok Kwak, Choonghyun Kang

Abstract:

The infrared thermography (IRT) technique has been proven to be a good method for nondestructive evaluation of concrete material. In buildings, it has been used for a broad range of applications such as subsurface defect inspection, energy loss assessment, and moisture detection. The purpose of this research is to assess the qualitative and quantitative performance of the active infrared thermography technique for reinforced concrete deterioration. An experiment with three different heating regimes was conducted on a concrete slab in the laboratory. The thermal characteristics of the IRT method, i.e., absolute contrast and observation time, were investigated. A linear relationship between the observation time and the real depth was established, with a linear regression R-squared of 0.931. The results showed that the absolute contrast above the defective area increases with the size of the delamination and the heating time. In addition, the depth of delamination can be predicted by using the relationship proposed in this study.

Keywords: concrete building, infrared thermography, nondestructive evaluation, subsurface delamination

Procedia PDF Downloads 269
4268 Estimating View-Through Ad Attribution from User Surveys Using Convex Optimization

Authors: Yuhan Lin, Rohan Kekatpure, Cassidy Yeung

Abstract:

In Digital Marketing, robust quantification of View-through attribution (VTA) is necessary for evaluating channel effectiveness. VTA occurs when a product purchase is aided by an Ad but without an explicit click (e.g. a TV ad). A lack of a tracking mechanism makes VTA estimation challenging. Most prevalent VTA estimation techniques rely on post-purchase in-product user surveys. User surveys enable the calculation of channel multipliers, which are the ratio of the view-attributed to the click-attributed purchases of each marketing channel. Channel multipliers thus provide a way to estimate the unknown VTA for a channel from its known click attribution. In this work, we use Convex Optimization to compute channel multipliers in a way that enables a mathematical encoding of the expected channel behavior. Large fluctuations in channel attributions often result from overfitting the calculations to user surveys. Casting channel attribution as a Convex Optimization problem allows an introduction of constraints that limit such fluctuations. The result of our study is a distribution of channel multipliers across the entire marketing funnel, with important implications for marketing spend optimization. Our technique can be broadly applied to estimate Ad effectiveness in a privacy-centric world that increasingly limits user tracking.
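
A hedged sketch of the optimization follows: the channel multipliers are decision variables pulled toward the survey-implied view-to-click ratios, with a quadratic penalty on channel-to-channel fluctuation and box constraints standing in for the "expected channel behavior" constraints described above. All figures are invented, and the exact constraint set used in the study is not reproduced.

```python
import numpy as np
import cvxpy as cp

# Illustrative survey-implied multipliers (view-attributed / click-attributed purchases)
# for channels ordered along the marketing funnel; small samples make them noisy.
survey_ratio = np.array([3.2, 0.4, 2.9, 1.1, 2.5])
survey_n = np.array([40, 15, 220, 80, 130])           # survey responses per channel
clicks = np.array([1000, 800, 5000, 2500, 3000])      # click-attributed purchases

w = survey_n / survey_n.sum()                         # trust larger samples more
m = cp.Variable(survey_ratio.size)

# Stay close to the survey ratios, but penalise channel-to-channel fluctuation and
# keep multipliers inside an expected range; both encode prior channel behaviour.
objective = cp.Minimize(cp.sum(cp.multiply(w, cp.square(m - survey_ratio)))
                        + 0.5 * cp.sum_squares(cp.diff(m)))
constraints = [m >= 0, m <= 4.0]
cp.Problem(objective, constraints).solve()

vta = m.value * clicks                                # view-through attributed purchases
print("channel multipliers:", np.round(m.value, 2))
print("estimated VTA purchases:", np.round(vta, 0))
```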

Keywords: digital marketing, survey analysis, operational research, convex optimization, channel attribution

Procedia PDF Downloads 162
4267 Stereo Motion Tracking

Authors: Yudhajit Datta, Hamsi Iyer, Jonathan Bandi, Ankit Sethia

Abstract:

Motion tracking and stereo vision are complicated, albeit well-understood, problems in computer vision. Existing software packages that combine the two approaches to perform stereo motion tracking typically employ complicated and computationally expensive procedures. The purpose of this study is to create a simple and effective solution capable of combining the two approaches. The study aims to explore a strategy to combine two-dimensional motion tracking using a Kalman filter with depth detection of objects using stereo vision. In conventional approaches, objects in the scene of interest are observed using a single camera; for stereo motion tracking, however, the scene of interest is observed using video feeds from two calibrated cameras. Using two simultaneous measurements from the two cameras, the depth of the object from the plane containing the cameras is calculated. The approach attempts to capture the entire three-dimensional spatial information of each object in the scene and represent it through a software estimator object. In discrete intervals, the estimator tracks object motion in the plane parallel to the plane containing the cameras and updates the perpendicular distance of the object from the plane containing the cameras as depth. The ability to efficiently track the motion of objects in three-dimensional space using a simplified approach could prove to be an indispensable tool in a variety of surveillance scenarios. The approach may find application in settings ranging from high-security surveillance scenes such as bank vaults, prisons, or other detention facilities to low-cost applications in supermarkets and car parking lots.
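
Two formulas carry most of the weight here: depth from a rectified stereo pair is Z = f·B/d (focal length times baseline over disparity), and the per-object estimator is a constant-velocity Kalman filter over the resulting (x, y, Z) measurements. The sketch below wires the two together on simulated measurements; the camera parameters and noise levels are assumptions, not values from the study.

```python
import numpy as np

# Stereo depth from disparity: Z = f * B / d (assumed rectified, calibrated cameras)
f_px, baseline_m = 700.0, 0.12            # illustrative focal length (px) and baseline (m)

def depth_from_disparity(x_left, x_right):
    return f_px * baseline_m / (x_left - x_right)

print("depth at 21 px disparity:", round(depth_from_disparity(321.0, 300.0), 2), "m")

# Constant-velocity Kalman filter over the 3-D position estimated each frame
dt = 1.0 / 30.0
F = np.eye(6); F[:3, 3:] = dt * np.eye(3)     # state: [x, y, z, vx, vy, vz]
H = np.hstack([np.eye(3), np.zeros((3, 3))])  # we measure position only
Q = 1e-3 * np.eye(6)
R = np.diag([1e-3, 1e-3, 5e-3])               # depth is noisier than the image plane

x = np.zeros(6); P = np.eye(6)
rng = np.random.default_rng(11)
for k in range(90):                            # 3 seconds of simulated measurements
    true_pos = np.array([0.5 + 0.3 * k * dt, 0.2, 4.0 - 0.5 * k * dt])
    z = true_pos + rng.normal(0, [0.01, 0.01, 0.05])
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
print("tracked position:", x[:3].round(3), " velocity:", x[3:].round(3))
```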

Keywords: kalman filter, stereo vision, motion tracking, matlab, object tracking, camera calibration, computer vision system toolbox

Procedia PDF Downloads 307
4266 3D Design of Orthotic Braces and Casts in Medical Applications Using Microsoft Kinect Sensor

Authors: Sanjana S. Mallya, Roshan Arvind Sivakumar

Abstract:

Orthotics is the branch of medicine that deals with the provision and use of artificial casts or braces to alter the biomechanical structure of the limb and provide support for the limb. Custom-made orthoses provide more comfort and can correct issues better than those available over the counter. However, they are expensive and require intricate modelling of the limb. Traditional methods of modelling involve creating a plaster of Paris mould of the limb. Lately, CAD/CAM and 3D printing processes have improved the accuracy and reduced the production time. Ordinarily, digital cameras are used to capture the features of the limb from different views to create a 3D model. We propose a system to model the limb using the Microsoft Kinect2 sensor. The Kinect can capture RGB and depth frames simultaneously at up to 30 fps with sufficient accuracy. The region of interest is captured from three views, each shifted by 90 degrees. The RGB and depth data are fused into a single RGB-D frame. The resolution of the RGB frame is 1920 px x 1080 px, while the resolution of the depth frame is 512 px x 424 px. As the resolutions of the frames are not equal, RGB pixels are mapped onto the depth pixels to make sure data is not lost even though the resolution is lower. The resulting RGB-D frames are collected, and using the depth coordinates, a three-dimensional point cloud is generated for each view of the Kinect sensor. A common reference system was developed to merge the individual point clouds from the Kinect sensors. The reference system consisted of 8 coloured cubes connected by rods to form a skeleton cube with the coloured cubes at the corners. For each Kinect, the region of interest is the square formed by the centres of the four cubes facing the Kinect. The point clouds are merged by considering one of the cubes as the origin of a reference system. Depending on the relative distance from each cube, the three-dimensional coordinate points from each point cloud are aligned to the reference frame to give a complete point cloud. The RGB data is used to correct for any errors in the depth data for the point cloud. A triangular mesh is generated from the point cloud by applying Delaunay triangulation, which generates the rough surface of the limb; this technique forms an approximation of the surface of the limb. The mesh is smoothened to obtain a smooth outer layer and give an accurate model of the limb. The model of the limb is used as a base for designing the custom orthotic brace or cast. It is transferred to a CAD/CAM design file to design the brace above the surface of the limb. The proposed system would be more cost-effective than current systems that use MRI or CT scans for generating 3D models and would be quicker than traditional plaster of Paris cast modelling; the overall setup time is also low. Preliminary results indicate that the accuracy of the Kinect2 is satisfactory for performing modelling.

Keywords: 3d scanning, mesh generation, Microsoft kinect, orthotics, registration

Procedia PDF Downloads 171
4265 The Response of the Central Bank to the Exchange Rate Movement: A Dynamic Stochastic General Equilibrium-Vector Autoregressive Approach for Tunisian Economy

Authors: Abdelli Soulaima, Belhadj Besma

Abstract:

The paper examines the choice of the central bank toward movements of the nominal exchange rate and evaluates its effects on the volatility of output growth and inflation. A novel hybrid method, the dynamic stochastic general equilibrium vector autoregression (DSGE-VAR), is proposed for analyzing this policy experiment in a small-scale open economy, in particular Tunisia. A contribution is made to the empirical literature by applying this model, which is rarely used in this context, to Tunisian data. Note additionally that the issue of the degree of response of the central bank to the exchange rate in Tunisia is a special one. To improve the estimation, the Bayesian technique is applied to the sample 1980:Q1 to 2011:Q4. Our results reveal that the central bank should not react, or should react only softly, to the exchange rate. The variance decomposition shows that overall inflation volatility is more pronounced under the fixed exchange rate regime for most of the shocks, except for the productivity and interest rate shocks. Output volatility is also higher under this regime for the majority of the shocks, excepting the foreign interest rate and interest rate shocks.

Keywords: DSGE-VAR modeling, exchange rate, monetary policy, Bayesian estimation

Procedia PDF Downloads 282
4264 Predictive Analytics in Traffic Flow Management: Integrating Temporal Dynamics and Traffic Characteristics to Estimate Travel Time

Authors: Maria Ezziani, Rabie Zine, Amine Amar, Ilhame Kissani

Abstract:

This paper introduces a predictive model for urban transportation engineering, which is vital for efficient traffic management. Utilizing comprehensive datasets and advanced statistical techniques, the model accurately forecasts travel times by considering temporal variations and traffic dynamics. Machine learning algorithms, including regression trees and neural networks, are employed to capture sequential dependencies. Results indicate significant improvements in predictive accuracy, particularly during peak hours and holidays, with the incorporation of traffic flow and speed variables. Future enhancements may integrate weather conditions and traffic incidents. The model's applications range from adaptive traffic management systems to route optimization algorithms, facilitating congestion reduction and enhancing journey reliability. Overall, this research extends beyond travel time estimation, offering insights into broader transportation planning and policy-making realms, empowering stakeholders to optimize infrastructure utilization and improve network efficiency.
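
As a small illustration of the modelling step, the sketch below trains a gradient-boosted regression tree on synthetic trip records with the kinds of features named above (hour of day, weekday, holiday flag, flow, speed) and reports the mean absolute error on a held-out split. The data generator and hyperparameters are placeholders, not the study's dataset or tuned model.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(12)

# Illustrative trip records: hour of day, weekday, holiday flag, flow and speed
n = 5000
df = pd.DataFrame({
    "hour": rng.integers(0, 24, n),
    "weekday": rng.integers(0, 7, n),
    "holiday": rng.integers(0, 2, n),
    "flow_veh_h": rng.uniform(200, 2000, n),
    "speed_kmh": rng.uniform(15, 90, n),
})
peak = ((df.hour.between(7, 9)) | (df.hour.between(16, 19))).astype(float)
df["travel_min"] = (12 + 8 * peak + 0.004 * df.flow_veh_h
                    - 0.08 * df.speed_kmh + 3 * df.holiday
                    + rng.normal(0, 1.5, n))

X, y = df.drop(columns="travel_min"), df["travel_min"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("MAE (min):", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```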

Keywords: predictive analytics, traffic flow, travel time estimation, urban transportation, machine learning, traffic management

Procedia PDF Downloads 59
4263 Dual-Channel Multi-Band Spectral Subtraction Algorithm Dedicated to a Bilateral Cochlear Implant

Authors: Fathi Kallel, Ahmed Ben Hamida, Christian Berger-Vachon

Abstract:

In this paper, a speech enhancement algorithm based on the Multi-Band Spectral Subtraction (MBSS) principle is evaluated for Bilateral Cochlear Implant (BCI) users. Specifically, a dual-channel noise power spectral estimation algorithm using the Power Spectral Densities (PSD) and Cross Power Spectral Densities (CPSD) of the observed signals is studied. The enhanced speech signal is obtained using the Dual-Channel Multi-Band Spectral Subtraction 'DC-MBSS' algorithm. For performance evaluation, an objective speech assessment test relying on the Perceptual Evaluation of Speech Quality (PESQ) score is performed to fix the optimal number of frequency bands needed in the DC-MBSS algorithm. In order to evaluate speech intelligibility, subjective listening tests were conducted with 3 deafened BCI patients. Experimental results obtained using the French Lafon database corrupted by additive babble noise at different Signal-to-Noise Ratios (SNR) showed that the DC-MBSS algorithm improves speech understanding for single and multiple interfering noise sources.
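
A simplified single-file sketch of the DC-MBSS idea is given below: the speech PSD common to both channels is approximated by the magnitude of the cross-PSD, the per-channel noise PSD follows by subtraction, and each frequency band is then over-subtracted according to its own SNR before resynthesis with the noisy phase. The band count, over-subtraction rule, and test signals are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np
from scipy.signal import stft, istft

def dc_mbss(x1, x2, fs, n_bands=4, floor=0.02):
    """Dual-channel multi-band spectral subtraction (simplified sketch).

    Assumes the speech component is common to both channels while the noise is
    weakly correlated, so the magnitude of the cross-PSD approximates the speech PSD.
    """
    f, t, X1 = stft(x1, fs, nperseg=512)
    _, _, X2 = stft(x2, fs, nperseg=512)

    psd1 = np.abs(X1) ** 2
    cpsd = X1 * np.conj(X2)
    noise_psd = np.maximum(psd1 - np.abs(cpsd), 1e-12)      # dual-channel noise estimate
    noise_psd = noise_psd.mean(axis=1, keepdims=True)        # smooth over time

    # Multi-band subtraction: a larger over-subtraction factor in noisier bands
    clean_psd = np.empty_like(psd1)
    edges = np.linspace(0, len(f), n_bands + 1, dtype=int)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band_snr = 10 * np.log10(psd1[lo:hi].mean() / noise_psd[lo:hi].mean())
        alpha = np.clip(4.0 - 0.15 * band_snr, 1.0, 5.0)     # over-subtraction factor
        clean_psd[lo:hi] = np.maximum(psd1[lo:hi] - alpha * noise_psd[lo:hi],
                                      floor * psd1[lo:hi])

    X_clean = np.sqrt(clean_psd) * np.exp(1j * np.angle(X1)) # keep the noisy phase
    _, x_clean = istft(X_clean, fs, nperseg=512)
    return x_clean

# Example with synthetic signals standing in for the two CI microphone channels
fs = 16000
t = np.arange(0, 2.0, 1 / fs)
speech = 0.6 * np.sin(2 * np.pi * 220 * t) * (np.sin(2 * np.pi * 3 * t) > 0)
rng = np.random.default_rng(13)
x1 = speech + 0.3 * rng.standard_normal(t.size)
x2 = speech + 0.3 * rng.standard_normal(t.size)
enhanced = dc_mbss(x1, x2, fs)
```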

Keywords: speech enhancement, spectral subtraction, noise estimation, cochlear implant

Procedia PDF Downloads 533
4262 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited to realistic situations (the contact tracing probability is small, or the probability of detection of index cases is small). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that, in particular, a power-law and a negative binomial degree distribution fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependency on the reproduction number.

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 64