Search results for: Kalman Filtering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 440

140 A Development of Portable Intrinsically Safe Explosion-Proof Type of Dual Gas Detector

Authors: Sangguk Ahn, Youngyu Kim, Jaheon Gu, Gyoutae Park

Abstract:

In this paper, we developed a dual gas leak detection instrument for Hydrocarbon (HC) and Carbon Monoxide (CO) gases. Because two kinds of gases are detected, a compact structure is necessary for the sensors. It is also important to design the sensing circuits for measuring, amplifying, and filtering, and the software should be well programmed with robust, systematic, and modular coding methods. Among these concerns, improving accuracy and initial response time is of vital importance. To manufacture a distinguished gas leak detector, we applied an intrinsically safe explosion-proof structure to the lithium-ion battery, main circuits, motor-driven pump, color LCD interface, and sensing circuits. On the software side, we used numerical analysis such as Lagrange and Neville interpolation to enhance measuring accuracy. Performance testing was conducted using standard methane at seven different concentrations and compared with three other products. We aim to raise risk prevention and the efficiency of gas safety management by distributing the instrument to the field of gas safety. Acknowledgment: This study was supported by the Small and Medium Business Administration under the research theme of ‘Commercialized Development of a portable intrinsically safe explosion-proof type dual gas leak detector’ (task number S2456036).
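
As a rough illustration of the interpolation step mentioned in the abstract, the sketch below evaluates Neville's algorithm on a handful of calibration points. The idea of mapping raw sensor readings to methane concentrations, and the point values themselves, are assumptions for illustration only, not the authors' calibration data.

```python
# A minimal sketch of Neville's interpolation over hypothetical calibration points
# that map a raw sensor reading to a gas concentration (ppm).
def neville(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x."""
    p = list(ys)
    n = len(xs)
    for k in range(1, n):
        for i in range(n - k):
            p[i] = ((x - xs[i]) * p[i + 1] - (x - xs[i + k]) * p[i]) / (xs[i + k] - xs[i])
    return p[0]

readings = [0.12, 0.35, 0.61, 0.88]             # raw sensor outputs (hypothetical)
concentrations = [0.0, 200.0, 500.0, 1000.0]    # reference methane levels in ppm (hypothetical)
print(neville(readings, concentrations, 0.50))  # interpolated concentration for a new reading
```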

Keywords: gas leak, dual gas detector, intrinsically safe, explosion proof

Procedia PDF Downloads 228
139 Sepiolite as a Processing Aid in Fibre Reinforced Cement Produced in Hatschek Machine

Authors: R. Pérez Castells, J. M. Carbajo

Abstract:

Sepiolite has been used as a processing aid in the manufacture of fibre cement since the start of the replacement of asbestos in the 1980s. Sepiolite increases the inter-laminar bond between cement layers and improves the homogeneity of the slurries. A new type of processed sepiolite product, Wollatrop TF/C, has been evaluated as a retention agent for fine particles in the production of fibre cement in a Hatschek machine. The effect of Wollatrop TF/C on filtering and fine particle losses was studied, as well as its interaction with anionic polyacrylamide (APAM) and microsilica. The design of the experiments was factorial, and the VDT equipment used for measuring retention and drainage was a modified Rapid Köthen laboratory sheet former. Wollatrop TF/C increased fine particle retention, improving the economy of the process and reducing the accumulation of solids in recycled process water. At the same time, drainage time increased sharply at high concentration; however, drainage time can be improved by adjusting the APAM concentration. Wollatrop TF/C and microsilica show very little interaction with each other. Microsilica does not control fine particle losses, while Wollatrop TF/C does so efficiently. Further research on APAM type (molecular weight and anionic character) is advisable to improve drainage.

Keywords: drainage, fibre-reinforced cement, fine particle losses, flocculation, microsilica, sepiolite

Procedia PDF Downloads 326
138 Comparisons of Co-Seismic Gravity Changes between GRACE Observations and the Predictions from the Finite-Fault Models for the 2012 Mw = 8.6 Indian Ocean Earthquake Off-Sumatra

Authors: Armin Rahimi

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) has been a very successful project in determining mass redistribution within the Earth system. Large deformations caused by earthquakes lie in the high-frequency band. Unfortunately, GRACE is only capable of providing reliable estimates of gravitational changes in the low-to-medium frequency band. In this study, we computed the gravity changes after the 2012 Mw 8.6 Indian Ocean earthquake off Sumatra using the GRACE Level-2 monthly spherical harmonic (SH) solutions released by the University of Texas Center for Space Research (UTCSR). Moreover, we calculated gravity changes using different fault models derived from teleseismic data. The model predictions showed non-negligible discrepancies in gravity changes. However, after removing high-frequency signals using Gaussian filtering with a 350 km radius, commensurate with the GRACE spatial resolution, the discrepancies vanished: the spatial patterns of total gravity changes predicted from all slip models became similar at the spatial resolution attainable by GRACE observations, and the predicted gravity changes were consistent with the GRACE-detected gravity changes. Nevertheless, fault models that give different slip amplitudes lead to proportionally different amplitudes in the predicted gravity changes.
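
To make the filtering step concrete, the sketch below applies a Gaussian smoother with a 350 km half-width to a gridded gravity-change map. The 1° grid spacing, the treatment of the 350 km radius as a half-width at half-maximum, and the synthetic field are assumptions for illustration; the authors' actual processing operates on spherical harmonic coefficients.

```python
# A minimal sketch: Gaussian smoothing of a gridded gravity-change map so that
# model predictions are compared at a GRACE-like spatial resolution.
import numpy as np
from scipy.ndimage import gaussian_filter

grid_spacing_km = 111.0                          # ~1 degree grid cells (assumed)
half_width_km = 350.0                            # radius where the kernel drops to half its peak
sigma_km = half_width_km / np.sqrt(2.0 * np.log(2.0))
sigma_cells = sigma_km / grid_spacing_km

dg_model = np.random.randn(180, 360)             # placeholder model-predicted gravity change
dg_smoothed = gaussian_filter(dg_model, sigma=sigma_cells, mode="wrap")
```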

Keywords: undersea earthquake, GRACE observation, gravity change, dislocation model, slip distribution

Procedia PDF Downloads 355
137 CVOIP-FRU: Comprehensive VoIP Forensics Report Utility

Authors: Alejandro Villegas, Cihan Varol

Abstract:

Voice over Internet Protocol (VoIP) products are an emerging technology that can contain forensically important information about criminal activity. Even without user names and passwords, this forensically important information can still be gathered by investigators. Although a few VoIP forensic investigative applications are available in the literature, most of them are designed particularly to collect evidence from the Skype product. Therefore, in order to assist law enforcement in collecting forensically important information from a variety of Betamax VoIP tools, the CVOIP-FRU framework was developed. CVOIP-FRU provides a data gathering solution that retrieves usernames, contact lists, as well as call and SMS logs from Betamax VoIP products. It is a scripting utility that searches for data within the registry, logs, and the user roaming profiles on Windows and Mac OS X operating systems. Subsequently, it parses the output into readable text and HTML formats. One advantage of CVOIP-FRU over the other applications is that, due to its intelligent data filtering capabilities and cross-platform scripting back end, it is expandable to include other VoIP solutions as well. Overall, this paper reveals the exploratory analysis performed in order to find the key data paths and locations, the development stages of the framework, and the empirical testing and quality assurance of CVOIP-FRU.

Keywords: betamax, digital forensics, report utility, VoIP, VoIPBuster, VoIPWise

Procedia PDF Downloads 297
136 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter continuously stream zero-suppressed data to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV).

Keywords: trigger, daq, mu2e, Fermilab

Procedia PDF Downloads 155
135 Improving Research by the Integration of a Collaborative Dimension in an Information Retrieval (IR) System

Authors: Amel Hannech, Mehdi Adda, Hamid Mcheick

Abstract:

In computer science, finding useful information is still one of the most active and important research topics. The most popular applications of information retrieval (IR) are search engines; they meet users' specific needs and aim to locate effective information on the web. However, these search engines have some limitations related to the relevancy of the results and the ease of exploring those results. In this context, we proposed in previous works a Multi-Space Search Engine model that is based on a multidimensional interpretation universe. In the present paper, we integrate an additional dimension that offers users new search experiences. The added component is based on creating user profiles and calculating the similarity between them, which then allows the use of collaborative filtering in retrieving search results. To evaluate the effectiveness of the proposed model, a prototype was developed. The experiments showed that the additional dimension improved the relevancy of results by predicting the items of interest to users based on their experiences and the experiences of other similar users. The offered personalization service allows users to approve the pertinent items, which enriches their profiles and further improves retrieval.
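
A minimal sketch of the collaborative-filtering idea described above follows: user profiles are compared with cosine similarity, and an unseen item is scored from the most similar users. The interaction matrix and neighbourhood size are hypothetical; the multi-space model and the profile-construction step are not reproduced here.

```python
import numpy as np

# rows = users, columns = items; entries are interaction scores derived from
# user profiles (hypothetical values for illustration)
ratings = np.array([[5.0, 3.0, 0.0, 1.0],
                    [4.0, 0.0, 0.0, 1.0],
                    [1.0, 1.0, 0.0, 5.0],
                    [0.0, 1.0, 5.0, 4.0]])

def cosine_similarity(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return (a @ b / denom) if denom else 0.0

def predict(user, item, k=2):
    """Score (user, item) from the k most similar users who have rated the item."""
    sims = np.array([cosine_similarity(ratings[user], ratings[v])
                     if v != user and ratings[v, item] > 0 else 0.0
                     for v in range(ratings.shape[0])])
    top = sims.argsort()[::-1][:k]
    if sims[top].sum() == 0:
        return 0.0
    return float(sims[top] @ ratings[top, item] / sims[top].sum())

print(predict(user=1, item=1))
```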

Keywords: information retrieval, v-facets, user behavior analysis, user profiles, topical ontology, association rules, data personalization

Procedia PDF Downloads 262
134 Lab Bench for Synthetic Aperture Radar Imaging System

Authors: Karthiyayini Nagarajan, P. V. Ramakrishna

Abstract:

Radar imaging techniques provide extensive applications in the field of remote sensing, most notably Synthetic Aperture Radar (SAR), which provides high-resolution target images. This work puts forward effective and realizable signal generation and processing for SAR images. The major units in the system include a camera, a signal generation unit, a signal processing unit, and a display screen. The real radio channel is replaced by its mathematical model, based on an optical image, to calculate a reflected signal model in real time. The signal generation unit realizes the algorithm and forms the radar reflection model. The signal processing unit provides range and azimuth resolution through matched filtering and spectrum analysis procedures to form the radar image on the display screen. The restored image has the same quality as that of the optical image. This SAR imaging system has been designed and implemented using MATLAB and Quartus II tools on a Stratix III device as a system (lab bench) that works in real time to study and investigate radar imaging rudiments and signal processing schemes for educational and research purposes.
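
As an illustration of the matched filtering (range compression) step mentioned above, a minimal sketch follows: a linear FM chirp echo is compressed by correlating it with the transmitted pulse. The pulse parameters, sampling rate, and target delay are assumptions for illustration, not the lab bench's actual signal chain.

```python
import numpy as np

fs, T, B = 100e6, 10e-6, 20e6                # sample rate, pulse length, bandwidth (assumed)
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)  # transmitted linear FM pulse

delay = 300                                   # echo delay of a point target, in samples (hypothetical)
echo = np.zeros(2048, dtype=complex)
echo[delay:delay + chirp.size] = chirp

corr = np.correlate(echo, chirp, mode="full")          # matched filtering = cross-correlation
lag = int(np.argmax(np.abs(corr))) - (chirp.size - 1)  # compressed peak location
print(lag)                                              # ~= delay
```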

Keywords: synthetic aperture radar, radio reflection model, lab bench, imaging engineering

Procedia PDF Downloads 497
133 Closed-Loop Audit of the Degree of the Management of Thrombocytosis in Accordance with NICE Guidance at Roseneath General Practice

Authors: Georgia Mills, Rachel Parsonage

Abstract:

Thrombocytosis is a platelet count above the upper limit of the normal range. An urgent referral is advised for counts over 1000 × 10⁹/L, and for counts between 600 and 1000 × 10⁹/L in the presence of certain conditions or at certain ages. A non-urgent referral is warranted when the level is above 450 × 10⁹/L (for more than 3 months), or over 600 × 10⁹/L on at least two occasions (4–6 weeks apart), or within the range 450–600 × 10⁹/L with other haematological abnormalities. The aim of this audit is to assess how well Roseneath General Practice has adhered to the National Institute for Health and Care Excellence (NICE) guidelines for the investigation and management of high platelet counts. Through the filtering tool in Vision, all blood results in the surgery were filtered to show only those with a platelet count above 450 × 10⁹/L. These patients were then analyzed individually to see where they fall on the current NICE guidance pathway for management. The investigation and management of thrombocytosis were generally poor: 60% of those who needed an urgent referral did not have one; 30% of those who needed a follow-up blood test did not have one; and 60% of those needing a routine referral after complete investigations did not have one. To improve knowledge of the NICE guidelines within the practice, a teaching session was delivered. Percentages then reached 100% in the second audit. There is a lack of awareness of the guidelines and of education on thrombocytosis in primary care. Teaching sessions will benefit outcomes greatly.
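
To make the quoted thresholds easier to follow, a rough triage sketch is given below; the function name, arguments, and the boolean standing in for the age/condition criteria are simplified illustrations, not the NICE pathway itself.

```python
def platelet_referral(count, months_elevated=0, prior_high_counts=0,
                      other_haem_abnormality=False, urgent_risk_features=False):
    """Illustrative triage of a platelet count (in 10^9/L) against the thresholds
    quoted in the abstract; 'urgent_risk_features' stands in for the specific
    conditions/age criteria attached to counts of 600-1000."""
    if count > 1000 or (600 <= count <= 1000 and urgent_risk_features):
        return "urgent referral"
    if (count > 450 and months_elevated >= 3) \
            or (count > 600 and prior_high_counts >= 2) \
            or (450 <= count <= 600 and other_haem_abnormality):
        return "non-urgent referral"
    return "repeat full blood count / monitor"

print(platelet_referral(count=680, prior_high_counts=2))  # -> non-urgent referral
```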

Keywords: platelets, thrombocytosis, management, referral

Procedia PDF Downloads 63
132 Design and Implementation of a Lab Bench for Synthetic Aperture Radar Imaging System

Authors: Karthiyayini Nagarajan, P. V. RamaKrishna

Abstract:

Radar imaging techniques provide extensive applications in the field of remote sensing, most notably Synthetic Aperture Radar (SAR), which provides high-resolution target images. This work puts forward effective and realizable signal generation and processing for SAR images. The major units in the system include a camera, a signal generation unit, a signal processing unit, and a display screen. The real radio channel is replaced by its mathematical model, based on an optical image, to calculate a reflected signal model in real time. The signal generation unit realizes the algorithm and forms the radar reflection model. The signal processing unit provides range and azimuth resolution through matched filtering and spectrum analysis procedures to form the radar image on the display screen. The restored image has the same quality as that of the optical image. This SAR imaging system has been designed and implemented using MATLAB and Quartus II tools on a Stratix III device as a system (lab bench) that works in real time to study and investigate radar imaging rudiments and signal processing schemes for educational and research purposes.

Keywords: synthetic aperture radar, radio reflection model, lab bench

Procedia PDF Downloads 468
131 Measuring and Evaluating the Effectiveness of Mobile High Efficiency Particulate Air Filtering on Particulate Matter within the Road Traffic Network of a Sample of Non-Sparse and Sparse Urban Environments in the UK

Authors: Richard Maguire

Abstract:

This research evaluates the efficiency of using mobile HEPA filters to reduce localized Particulate Matter (PM), Total Volatile Organic Compound (TVOC), and Formaldehyde (HCHO) air pollution. The research is performed using a standard HEPA filter that is tube-fitted and attached to a motor vehicle. The velocity of the vehicle is used to generate the pressure difference that allows the filter to remove PM, VOC, and HCHO pollution from the localized atmosphere of a road transport traffic route. The testing has been performed on a sample of traffic routes in non-sparse and sparse urban environments within the UK. Pre- and post-filter measurement of the PM2.5 air quality has been carried out, along with the characteristics of the climate environment, including live filming of the traffic conditions. This provides a baseline for future national and international research. The effectiveness measurement is generated by evaluating the difference in PM2.5 air quality measured pre- and post- the mobile filter test equipment. A series of further research opportunities and future exploitation options are presented based on the results of the research.

Keywords: high efficiency particulate air, HEPA filter, particulate matter, traffic pollution

Procedia PDF Downloads 123
130 Contactless Heart Rate Measurement System based on FMCW Radar and LSTM for Automotive Applications

Authors: Asma Omri, Iheb Sifaoui, Sofiane Sayahi, Hichem Besbes

Abstract:

Future vehicle systems demand advanced capabilities, notably in-cabin life detection and driver monitoring systems, with a particular emphasis on drowsiness detection. To meet these requirements, several techniques employ artificial intelligence methods based on real-time vital sign measurements. In parallel, Frequency-Modulated Continuous-Wave (FMCW) radar technology has garnered considerable attention in the domains of healthcare and biomedical engineering for non-invasive vital sign monitoring. FMCW radar offers a multitude of advantages, including its non-intrusive nature, continuous monitoring capacity, and its ability to penetrate through clothing. In this paper, we propose a system utilizing the AWR6843AOP radar from Texas Instruments (TI) to extract precise vital sign information. The radar allows us to estimate Ballistocardiogram (BCG) signals, which capture the mechanical movements of the body, particularly the ballistic forces generated by heartbeats and respiration. These signals are rich sources of information about the cardiac cycle, rendering them suitable for heart rate estimation. The process begins with real-time subject positioning, followed by clutter removal, computation of Doppler phase differences, and the use of various filtering methods to accurately capture subtle physiological movements. To address the challenges associated with FMCW radar-based vital sign monitoring, including motion artifacts due to subjects' movement or radar micro-vibrations, Long Short-Term Memory (LSTM) networks are implemented. LSTM's adaptability to different heart rate patterns and ability to handle real-time data make it suitable for continuous monitoring applications. Several crucial steps were taken, including feature extraction (involving amplitude, time intervals, and signal morphology), sequence modeling, heart rate estimation through the analysis of detected cardiac cycles and their temporal relationships, and performance evaluation using metrics such as Root Mean Square Error (RMSE) and correlation with reference heart rate measurements. For dataset construction and LSTM training, a comprehensive data collection system was established, integrating the AWR6843AOP radar, a heart rate belt, and a smartwatch for ground truth measurements. Rigorous synchronization of these devices ensured data accuracy. Twenty participants engaged in various scenarios, encompassing indoor and real-world conditions within a moving vehicle equipped with the radar system. Static and dynamic subject conditions were considered. The heart rate estimation through LSTM outperforms traditional signal processing techniques that rely on filtering, Fast Fourier Transform (FFT), and thresholding. It delivers an average accuracy of approximately 91% with an RMSE of 1.01 beats per minute (bpm). In conclusion, this paper underscores the promising potential of FMCW radar technology integrated with artificial intelligence algorithms in the context of automotive applications. This innovation not only enhances road safety but also paves the way for its integration into the automotive ecosystem to improve driver well-being and overall vehicular safety.
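
A minimal sketch of the LSTM regression stage is shown below: fixed-length windows of radar-derived BCG features are mapped to a heart-rate value, with RMSE tracked during training. The window length, feature count, network size, and random data are placeholders, not the authors' architecture or dataset.

```python
import numpy as np
import tensorflow as tf

window_len, n_features = 200, 3                       # assumed window of BCG-derived features
x_train = np.random.randn(512, window_len, n_features).astype("float32")
y_train = np.random.uniform(55, 110, size=(512, 1)).astype("float32")  # heart rate in bpm

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(window_len, n_features)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),                         # regressed heart rate (bpm)
])
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError()])
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
```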

Keywords: ballistocardiogram, FMCW Radar, vital sign monitoring, LSTM

Procedia PDF Downloads 72
129 Modeling of a UAV Longitudinal Dynamics through System Identification Technique

Authors: Asadullah I. Qazi, Mansoor Ahsan, Zahir Ashraf, Uzair Ahmad

Abstract:

System identification of an Unmanned Aerial Vehicle (UAV), to acquire its mathematical model, is a significant step in the process of aircraft flight automation. The need for a reliable mathematical model is an established requirement for autopilot design, flight simulator development, aircraft performance appraisal, analysis of aircraft modifications, preflight testing of prototype aircraft, and investigation of fatigue life and stress distribution, etc. This research is aimed at system identification of a fixed-wing UAV by means of specifically designed flight experiments. The purposely designed flight maneuvers were performed on the UAV, and the aircraft states were recorded during these flights. The acquired data were preprocessed for noise filtering and bias removal, followed by parameter estimation of longitudinal dynamics transfer functions using the MATLAB System Identification Toolbox. Black-box transfer function models, in response to elevator and throttle inputs, were estimated using the least-squares error technique. The identification results show a high confidence level and goodness of fit between the estimated model and the actual aircraft response.
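
A minimal black-box sketch of the least-squares step follows: a discrete ARX model of a pitch response to an elevator input is fitted with ordinary least squares. The synthetic signals and model orders are placeholders and do not reproduce the authors' MATLAB toolbox workflow.

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(500)                      # elevator excitation (hypothetical)
y = np.zeros(500)
for k in range(2, 500):                           # "true" system used only to generate data
    y[k] = 1.5 * y[k - 1] - 0.7 * y[k - 2] + 0.2 * u[k - 1] + 0.1 * u[k - 2]

# Regressors for y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print(theta)                                      # ~[1.5, -0.7, 0.2, 0.1]
```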

Keywords: fixed wing UAV, system identification, black box modeling, longitudinal dynamics, least square error

Procedia PDF Downloads 325
128 Video Text Information Detection and Localization in Lecture Videos Using Moments

Authors: Belkacem Soundes, Guezouli Larbi

Abstract:

This paper presents a robust and accurate method for text detection and localization in lecture videos. Frame regions are classified into text or background based on visual feature analysis. However, lecture videos show significant degradation, mainly related to acquisition conditions, camera motion, and environmental changes, resulting in low-quality videos and thus affecting feature extraction and description efficiency. Moreover, traditional text detection methods cannot be applied directly to lecture videos. Therefore, robust feature extraction methods dedicated to this specific video genre are required for robust and accurate text detection and extraction. The method consists of a three-step process: slide region detection and segmentation; feature extraction; and non-text filtering. For robust and effective feature extraction, moment functions are used. Two distinct types of moments are used: orthogonal and non-orthogonal. For the orthogonal type, both Zernike and pseudo-Zernike moments are used, whereas for the non-orthogonal type, Hu moments are used. Expressivity and description efficiency are given and discussed. The proposed approach shows that, in general, orthogonal moments show high accuracy in comparison to the non-orthogonal ones. Pseudo-Zernike moments are more effective than Zernike moments, with better computation time.
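
A minimal sketch of computing the non-orthogonal (Hu) moments of a candidate frame region with OpenCV is given below; the synthetic region and the log-scaling are illustrative assumptions, and the Zernike/pseudo-Zernike computation is not shown.

```python
import cv2
import numpy as np

region = np.zeros((64, 64), dtype=np.uint8)        # hypothetical binarized text region
cv2.putText(region, "A", (10, 50), cv2.FONT_HERSHEY_SIMPLEX, 1.8, 255, 3)

m = cv2.moments(region, binaryImage=True)
hu = cv2.HuMoments(m).flatten()
hu_log = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # log scale for comparable magnitudes
print(hu_log)
```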

Keywords: text detection, text localization, lecture videos, pseudo zernike moments

Procedia PDF Downloads 152
127 Robust Medical Image Watermarking based on Contourlet and Extraction Using ICA

Authors: S. Saju, G. Thirugnanam

Abstract:

In this paper, a medical image watermarking algorithm based on the contourlet transform is proposed. Medical image watermarking is a special subcategory of image watermarking in the sense that the images have special requirements: watermarked medical images should not differ perceptually from their original counterparts, because the clinical reading of the images must not be affected. Watermarking techniques based on the wavelet transform are reported in many studies, but robustness and security are better with the contourlet transform than with the wavelet transform. The main challenge in exploring geometry in images comes from the discrete nature of the data. In this paper, the original image is decomposed to two levels using the contourlet transform, and the watermark is embedded in the resultant sub-bands. Sub-band selection is based on the value of the Peak Signal-to-Noise Ratio (PSNR) calculated between the watermarked and original images. To extract the watermark, kernel ICA is used; a novel characteristic of this method is that it does not require the transformation process to extract the watermark. Simulation results show that the proposed scheme is robust against attacks such as salt-and-pepper noise, median filtering, and rotation. Performance measures such as PSNR and similarity measure are evaluated and compared with the Discrete Wavelet Transform (DWT) to prove the robustness of the scheme. Simulations are carried out using MATLAB software.
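
Since both the sub-band selection and the robustness comparison rely on PSNR, a minimal sketch of the metric is given below; the random test images are placeholders for the original and watermarked images.

```python
import numpy as np

def psnr(original, distorted, peak=255.0):
    """Peak Signal-to-Noise Ratio (dB) between an original and a processed image."""
    mse = np.mean((original.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

img = np.random.randint(0, 256, (256, 256))
marked = np.clip(img + np.random.normal(0, 2, img.shape), 0, 255)
print(psnr(img, marked))
```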

Keywords: digital watermarking, independent component analysis, wavelet transform, contourlet

Procedia PDF Downloads 528
126 Semiautomatic Calculation of Ejection Fraction Using Echocardiographic Image Processing

Authors: Diana Pombo, Maria Loaiza, Mauricio Quijano, Alberto Cadena, Juan Pablo Tello

Abstract:

In this paper, we present a semi-automatic tool for calculating the ejection fraction from an echocardiographic video signal derived from a database in DICOM format from Clínica de la Costa, Barranquilla. Each of the steps and methods used to obtain the calculation is described: acquisition and formation of the test samples, processing, and finally the calculation of the parameters needed to obtain the ejection fraction. Two image segmentation methods were compared following a methodological framework that is similar only in the initial stages of processing (filtering and image enhancement) and differs at the end, when the segmentation algorithms are applied (active contour and region growing). The results were compared with the measurements obtained by two different medical specialists in cardiology, who calculated the ejection fraction of the study samples using the traditional method, which consists of drawing the region of interest directly on the computer using the echocardiography equipment and a simple equation to calculate the desired value. The results showed that if the quality of the video samples is good (i.e., after the pre-processing there is evidence of an improvement in contrast), the values provided by the tool are substantially close to those reported by the physicians; the correlation between physicians also does not vary significantly.
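
For reference, the 'simple equation' used by the specialists combines the end-diastolic and end-systolic volumes listed in the keywords (EDV, ESV); a minimal sketch follows with hypothetical volumes, and the derivation of those volumes from the segmented contours is not shown.

```python
def ejection_fraction(edv_ml, esv_ml):
    """EF (%) from end-diastolic and end-systolic volumes."""
    return (edv_ml - esv_ml) / edv_ml * 100.0

print(ejection_fraction(edv_ml=120.0, esv_ml=50.0))  # ~58.3 %
```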

Keywords: echocardiography, DICOM, processing, segmentation, EDV, ESV, ejection fraction

Procedia PDF Downloads 426
125 A Combination of Filtration and Coagulation Processes for Tannery Effluent Treatment

Authors: M. G. Mostafa, Manjushree Chowdhury, Tapan Kumar Biswas, Ananda Kumar Saha

Abstract:

This study focused on effluent characterization and a treatment process to reduce the toxicity of tannery effluents. The tanning industry is one of the oldest industries in the world. It is typically characterized as a pollutant-generating industry that produces a wide variety of high-strength toxic chemicals. The study was conducted during 2008–2009, and the tannery effluents were collected three times a year from the outlets of selected leather industries located in the Hazaribagh industrial zone, Dhaka, Bangladesh. The analysis results for the raw effluents reveal that the effluents were yellowish-brown in color, with basic pH, very high values of BOD₅, COD, TDS, TSS, and TS, and high concentrations of Cr, Na, SO₄²⁻, Cl⁻, and other organic and inorganic constituents. The tannery effluents were treated with various doses of FeCl3 after settling and a subsequent filtration through sand-stone. The study observed that a coagulant (FeCl3) dose of 150 mg/L at around neutral pH showed the best removal efficiency for the major physico-chemical parameters. The analysis results illustrate that most of the physical and chemical parameters were well below the prescribed permissible limits for effluent discharge. The study suggests that tannery effluents could be treated by a combined process consisting of settling, filtering, and coagulating with FeCl3.

Keywords: characterization, effluent, tannery, treatment

Procedia PDF Downloads 450
124 Frequency Selective Filters for Estimating the Equivalent Circuit Parameters of Li-Ion Battery

Authors: Arpita Mondal, Aurobinda Routray, Sreeraj Puravankara, Rajashree Biswas

Abstract:

The most difficult part of designing a battery management system (BMS) is battery modeling. A good battery model can capture the dynamics, which helps in energy management through accurate model-based state estimation algorithms. So far, the most suitable and fruitful model is the equivalent circuit model (ECM). However, in real-time applications, the model parameters are time-varying, changing with current, temperature, state of charge (SOC), and aging of the battery, and this has a great impact on the performance of the model. Therefore, to increase the equivalent circuit model performance, the parameter estimation has been carried out in the frequency domain. The battery is a very complex system, associated with various chemical reactions and heat generation, so it is very difficult to select the optimal model structure. In general, if the model order is increased, the model accuracy improves. However, a higher-order model tends towards over-parameterization and unfavorable prediction capability, while the model complexity increases enormously. In the time domain, it becomes difficult to solve higher-order differential equations as the model order increases. This problem can be resolved by frequency-domain analysis, where the overall computational problems due to ill-conditioning are reduced. In the frequency domain, several dominating frequencies can be found in the input as well as the output data. The selective frequency-domain estimation has been carried out, first by estimating the frequencies of the input and output by subspace decomposition, and then by choosing specific bands from the most dominating to the least, while carrying out least-squares, recursive least-squares, and Kalman filter based parameter estimation. In this paper, a second-order battery model consisting of three resistors, two capacitors, and one SOC-controlled voltage source has been chosen. For model identification and validation, hybrid pulse power characterization (HPPC) tests have been carried out on a 2.6 Ah LiFePO₄ battery.
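
A minimal sketch of one of the estimators named above, recursive least squares with a forgetting factor, is given below for a generic linear-in-parameters regression; the regressor construction for the actual second-order ECM and the frequency-band selection are not shown, and the demo system is hypothetical.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.995):
    """One recursive-least-squares step with forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    K = P @ phi / float(lam + phi.T @ P @ phi)   # gain vector
    err = y - float(phi.T @ theta)               # prediction error
    theta = theta + K.ravel() * err
    P = (P - K @ phi.T @ P) / lam
    return theta, P

rng = np.random.default_rng(1)
true_theta = np.array([0.8, 0.05])               # e.g. [a1, b0] of a first-order model
theta, P = np.zeros(2), np.eye(2) * 1e3
y_prev, u = 0.0, rng.standard_normal(300)
for k in range(1, 300):
    y = true_theta[0] * y_prev + true_theta[1] * u[k]
    theta, P = rls_update(theta, P, np.array([y_prev, u[k]]), y)
    y_prev = y
print(theta)                                      # converges towards [0.8, 0.05]
```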

Keywords: equivalent circuit model, frequency estimation, parameter estimation, subspace decomposition

Procedia PDF Downloads 150
123 Investigation of the Litho-Structure of Ilesa Using High Resolution Aeromagnetic Data

Authors: Oladejo Olagoke Peter, Adagunodo T. A., Ogunkoya C. O.

Abstract:

The research investigated the arrangement of some geological features under Ilesa using aeromagnetic data. The obtained data were subjected to various data filtering and processing techniques, namely the Total Horizontal Derivative (THD), depth continuation, and the Analytical Signal Amplitude, using Geosoft Oasis Montaj 6.4.2 software. The Reduced-to-the-Equator Total Magnetic Intensity (TRE-TMI) outcomes reveal significant magnetic anomalies, with high magnitudes (55.1 to 155 nT) predominantly in the northwestern half of the area. Intermediate magnetic susceptibility, ranging between 6.0 and 55.1 nT, dominates the eastern part, separated by depressions and uplifts. The southern part of the area exhibits a magnetic field of low intensity, ranging from -76.6 to 6.0 nT. The lineaments exhibit varying lengths ranging from 2.5 to 16.0 km. Analysis of the rose diagram and the analytical signal amplitude indicates structural styles mainly of E-W and NE-SW orientations, particularly evident in the western, SW, and NE regions, with an amplitude of 0.0318 nT/m. The identified faults in the area demonstrate orientations of NNW-SSE, NNE-SSW, and WNW-ESE, situated at depths ranging from 500 to 750 m. Considering the divergence in magnetic susceptibility, the structural style or orientation of the lineaments, and the identified faults and their depths, these lithological features could serve as a valuable foundation for assessing ground motion, particularly in the presence of sufficient seismic energy.
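
A minimal sketch of two of the enhancement filters named above, the total horizontal derivative and the analytic signal amplitude, applied to a gridded magnetic anomaly is shown below; the grid spacing, the synthetic anomaly, and the FFT-based vertical derivative are assumptions for illustration, not the Oasis Montaj workflow.

```python
import numpy as np

dx = dy = 100.0                                   # grid spacing in metres (assumed)
x, y = np.meshgrid(np.arange(0, 10000, dx), np.arange(0, 10000, dy))
T = 50.0 * np.exp(-((x - 5000)**2 + (y - 4000)**2) / (2 * 1500.0**2))  # synthetic anomaly (nT)

dT_dy, dT_dx = np.gradient(T, dy, dx)
thd = np.hypot(dT_dx, dT_dy)                      # total horizontal derivative (nT/m)

# vertical derivative via the wavenumber-domain relation F[dT/dz] = |k| * F[T]
ky = 2 * np.pi * np.fft.fftfreq(T.shape[0], d=dy)
kx = 2 * np.pi * np.fft.fftfreq(T.shape[1], d=dx)
K = np.hypot(*np.meshgrid(kx, ky))
dT_dz = np.real(np.fft.ifft2(np.fft.fft2(T) * K))

asa = np.sqrt(dT_dx**2 + dT_dy**2 + dT_dz**2)     # analytic signal amplitude
print(float(asa.max()))
```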

Keywords: lineament, aeromagnetic, anomaly, fault, magnetic

Procedia PDF Downloads 75
122 An Enhanced SAR-Based Tsunami Detection System

Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah

Abstract:

Tsunami early detection and warning systems have proved to be of the utmost importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any risk of a tsunami and of the degree of its danger, in order to make the right decision and notify the public of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the tsunami warning system is simulated using the data acquisition toolbox of MATLAB and measurements acquired from specified internet pages (owing to the lack of the required real-life seismic and hydrologic sensors), and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region that are masked by speckle noise. This enables us to conduct a post-tsunami damage extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems through a migration to IP-based networks and fiber optic links.
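
A minimal despeckling sketch matching the 'wiener filter' keyword is shown below: an adaptive Wiener filter is applied to a synthetic SAR amplitude image corrupted by multiplicative speckle. The image, the number of looks, and the window size are assumptions for illustration.

```python
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(0)
clean = np.full((256, 256), 100.0)
clean[100:160, 80:200] = 180.0                                         # bright region (hypothetical)
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # ~4-look multiplicative speckle

despeckled = wiener(speckled, mysize=(5, 5))                           # adaptive Wiener filtering
```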

Keywords: detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, wiener filter

Procedia PDF Downloads 392
121 Location Uncertainty – A Probabilistic Solution for Automatic Train Control

Authors: Monish Sengupta, Benjamin Heydecker, Daniel Woodland

Abstract:

New train control systems rely mainly on Automatic Train Protection (ATP) and Automatic Train Operation (ATO) to dynamically control the speed and hence the performance. The ATP and the ATO form the vital elements within the CBTC (Communication Based Train Control) and the ERTMS (European Rail Traffic Management System) system architectures. Reliable and accurate measurement of train location, speed, and acceleration is vital to the operation of train control systems. In the past, all CBTC and ERTMS systems have deployed a balise or equivalent to correct the uncertainty element of the train location. Typically, a CBTC train is allowed to miss only one balise on the track, after which the Automatic Train Protection (ATP) system applies the emergency brake to halt the service. This is because the location uncertainty, which grows within the train control system, cannot tolerate missing more than one balise. Balises contribute a significant amount towards wayside maintenance, and studies have shown that balises on the track also form a constraint on future track layout changes and changes in the speed profile. This paper investigates the causes of the location uncertainty that is currently experienced and considers whether it is possible to identify an effective filter to ascertain, in conjunction with appropriate sensors, more accurate speed, distance, and location for a CBTC-driven train without the need for any external balises. An appropriate sensor fusion algorithm and an intelligent sensor selection methodology will be deployed to ascertain the railway location and speed measurement at the highest precision. Similar techniques are already in use in aviation, satellite, submarine, and other navigation systems. Developing a model for the speed control and the use of the Kalman filter is a key element in this research. This paper summarizes the research undertaken and its significant findings, highlighting the potential for introducing alternative approaches to train positioning that would enable removal of all trackside location correction balises, leading to a huge reduction in maintenance and more flexibility in future track design.
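
As a minimal illustration of the kind of filter under consideration, the sketch below runs a constant-velocity Kalman filter that fuses noisy speed measurements into position and velocity estimates; with speed-only measurements the position covariance keeps growing, which is exactly the location uncertainty discussed above. All matrices and noise levels are illustrative assumptions, not the paper's sensor-fusion design.

```python
import numpy as np

dt = 0.1                                          # update interval in seconds (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])             # state transition for [position, velocity]
H = np.array([[0.0, 1.0]])                        # measurement: speed only (e.g. tachometer)
Q = np.diag([0.05, 0.5])                          # process noise (tuning assumption)
R = np.array([[1.0]])                             # speed measurement noise variance

def kf_step(x, P, z):
    x = F @ x                                     # predict
    P = F @ P @ F.T + Q
    y = z - H @ x                                 # update with speed measurement z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(2)
x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
for _ in range(200):
    z = np.array([20.0 + rng.normal(0, 1.0)])     # noisy speed readings around 20 m/s
    x, P = kf_step(x, P, z)
print(x, P[0, 0])                                 # velocity ~20 m/s; position variance keeps growing
```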

Keywords: ERTMS, CBTC, ATP, ATO

Procedia PDF Downloads 410
120 TerraEnhance: High-Resolution Digital Elevation Model Generation using GANs

Authors: Siddharth Sarma, Ayush Majumdar, Nidhi Sabu, Mufaddal Jiruwaala, Shilpa Paygude

Abstract:

Digital Elevation Models (DEMs) are digital representations of the Earth’s topography, which include information about the elevation, slope, aspect, and other terrain attributes. DEMs play a crucial role in various applications, including terrain analysis, urban planning, and environmental modeling. In this paper, TerraEnhance is proposed, a distinct approach for high-resolution DEM generation using Generative Adversarial Networks (GANs) combined with Real-ESRGANs. By learning from a dataset of low-resolution DEMs, the GANs are trained to upscale the data by 10 times, resulting in significantly enhanced DEMs with improved resolution and finer details. The integration of Real-ESRGANs further enhances visual quality, leading to more accurate representations of the terrain. A post-processing layer is introduced, employing high-pass filtering to refine the generated DEMs, preserving important details while reducing noise and artifacts. The results demonstrate that TerraEnhance outperforms existing methods, producing high-fidelity DEMs with intricate terrain features and exceptional accuracy. These advancements make TerraEnhance suitable for various applications, such as terrain analysis and precise environmental modeling.

Keywords: DEM, ESRGAN, image upscaling, super resolution, computer vision

Procedia PDF Downloads 8
119 Intelligent Chatbot Generating Dynamic Responses Through Natural Language Processing

Authors: Aarnav Singh, Jatin Moolchandani

Abstract:

The proposed research work aims to build a query-based AI chatbot that can answer any question related to any topic. A chatbot is software that converses with users via text messages. In the proposed system, we aim to build a chatbot that generates a response based on the user’s query. For this, we use natural language processing to analyze the query and a set of texts to form a concise answer. The texts are obtained through web scraping and by filtering the credible sources from a web search. The objective of this project is to provide a chatbot that is able to provide simple and accurate answers without the user having to read through a large number of articles and websites. Creating an AI chatbot that can answer a variety of user questions on a variety of topics is the goal of the proposed research project. This chatbot uses natural language processing to comprehend user inquiries and provides succinct responses by examining a collection of writings that were scraped from the internet. The texts are carefully selected from reliable websites that are found via internet searches. This project aims to provide users with a chatbot that provides clear and precise responses, removing the need to go through several articles and web pages in great detail. In addition to exploring the reasons for their broad acceptance and their usefulness across many industries, this article offers an overview of the interest in chatbots throughout the world.

Keywords: chatbot, artificial intelligence, natural language processing, web scraping

Procedia PDF Downloads 66
118 Offline Parameter Identification and State-of-Charge Estimation for Healthy and Aged Electric Vehicle Batteries Based on the Combined Model

Authors: Xiaowei Zhang, Min Xu, Saeid Habibi, Fengjun Yan, Ryan Ahmed

Abstract:

Recently, Electric Vehicles (EVs) have received extensive consideration since they offer a more sustainable and greener transportation alternative compared to fossil-fuel propelled vehicles. Lithium-Ion (Li-ion) batteries are increasingly being deployed in EVs because of their high energy density, high cell-level voltage, and low rate of self-discharge. Since Li-ion batteries represent the most expensive component in the EV powertrain, accurate monitoring and control strategies must be executed to ensure their prolonged lifespan. The Battery Management System (BMS) has to accurately estimate parameters such as the battery State-of-Charge (SOC), State-of-Health (SOH), and Remaining Useful Life (RUL). In order for the BMS to estimate these parameters, an accurate and control-oriented battery model has to work collaboratively with a robust state and parameter estimation strategy. Since battery physical parameters, such as the internal resistance and diffusion coefficient, change depending on the battery state-of-life (SOL), the BMS has to be adaptive to accommodate this change. In this paper, an extensive battery aging study has been conducted over a 12-month period on 5.4 Ah, 3.7 V lithium polymer cells. Instead of using fixed charging/discharging aging cycles at a fixed C-rate, a set of real-world driving scenarios has been used to age the cells. The test was interrupted at every 5% of capacity degradation by a set of reference performance tests to assess the battery degradation and track model parameters. As the battery ages, the combined model parameters are optimized and tracked in an offline mode over the entire battery lifespan. Based on the optimized model, a state and parameter estimation strategy based on the Extended Kalman Filter (EKF) and the relatively new Smooth Variable Structure Filter (SVSF) has been applied to estimate the SOC at various states of life.

Keywords: lithium-ion batteries, genetic algorithm optimization, battery aging test, parameter identification

Procedia PDF Downloads 267
117 Case Study of High-Resolution Marine Seismic Survey in Shallow Water, Arabian Gulf, Saudi Arabia

Authors: Almalki M., Alajmi M., Qadrouh Y., Alzahrani E., Sulaiman A., Aleid M., Albaiji A., Alfaifi H., Alhadadi A., Almotairy H., Alrasheed R., Alhafedh Y.

Abstract:

A high-resolution marine seismic survey is a well-established technique that is commonly used to characterize near-surface sediments and geological structures in shallow water. We conducted a single-channel seismic survey to provide high-quality seismic images of near-surface sediments up to 100 m depth in the Jubal coastal area, Arabian Gulf. An eight-hydrophone streamer was used to collect stacked seismic traces along a 5 km seismic line. To reach the required depth, we used a spark system that discharges energies above 5000 J, with an expected frequency output spanning the range from 200 to 2000 Hz. A suitable processing flow was implemented to enhance the signal-to-noise ratio of the seismic profile. We found that the shallow sedimentary layers at the study site have a complex pattern of reflectivity, which decays significantly due to the amount of source energy used as well as the multiples associated with the seafloor. In fact, the results reveal that single-channel marine seismic surveying in shallow water is a cost-effective technique that can be easily repeated to observe any possible changes in the physical wave properties of the near-surface layers.
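
A minimal sketch of the frequency filtering listed in the keywords follows: a zero-phase Butterworth band-pass over the source's expected 200-2000 Hz band is applied to a synthetic trace. The sampling rate, filter order, and trace are placeholders for the recorded single-channel data.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 16000.0                                       # sampling rate in Hz (assumed)
b, a = butter(4, [200.0, 2000.0], btype="bandpass", fs=fs)

t = np.arange(0, 0.5, 1 / fs)
trace = (np.sin(2 * np.pi * 800 * t)               # in-band reflection energy
         + 0.8 * np.sin(2 * np.pi * 60 * t)        # low-frequency swell noise
         + 0.2 * np.random.randn(t.size))
filtered = filtfilt(b, a, trace)                   # zero-phase band-pass filtering
```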

Keywords: shallow marine single-channel data, high resolution, frequency filtering, shallow water

Procedia PDF Downloads 72
116 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images

Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir

Abstract:

The landing phase of a UAV is very critical, as there are many uncertainties in this phase which can easily entail a hard landing or even a crash. In this paper, the estimation of the relative distance and velocity to the ground, as one of the most important processes during the landing phase, is studied. Using accurate measurement sensors as an alternative approach can be very expensive for sensors like LIDAR, or come with a limited operational range for sensors like ultrasonic sensors. Additionally, absolute positioning systems like GPS or an IMU cannot provide the distance to the ground independently. The focus of this paper is to determine whether we can measure the relative distance and velocity of the UAV and the ground in the landing phase using just low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable features in a series of images taken during the UAV landing. Two different approaches based on Extended Kalman Filters (EKF) have been proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process and the calculated optical flow as the measurement; the second approach uses the feature's projection on the camera plane (pixel position) as the measurement, while employing both the kinematics of the UAV and the dynamics of variation of the projected point as the process, to estimate both the relative distance and the relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed has been used to compare the performance of the proposed algorithms. The case studies show that the low quality of the images results in considerable noise, which reduces the performance of the first approach. On the other hand, using the projected feature position is much less sensitive to the noise and estimates the distance and velocity with relatively high accuracy. This approach can also be used to predict the future projected feature position, which can drastically decrease the computational workload, an important criterion for real-time applications.
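
A minimal sketch of the Lucas-Kanade step described above follows: features detected in one low-resolution frame are tracked into the next, and the resulting pixel displacements are the kind of optical-flow measurement fed to the filters. The synthetic frames and the simulated shift are assumptions for illustration.

```python
import cv2
import numpy as np

prev = np.zeros((120, 160), dtype=np.uint8)
cv2.rectangle(prev, (60, 40), (100, 80), 255, -1)           # hypothetical ground pattern
nxt = np.roll(prev, shift=(3, 2), axis=(0, 1))              # simulated camera motion (3 px down, 2 px right)

pts = cv2.goodFeaturesToTrack(prev, maxCorners=50, qualityLevel=0.01, minDistance=5)
new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, nxt, pts, None,
                                              winSize=(15, 15), maxLevel=2)

flow = (new_pts - pts)[status.ravel() == 1]
print(flow.reshape(-1, 2).mean(axis=0))                     # ~[2, 3] pixel displacement (x, y)
```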

Keywords: altitude estimation, drone, image processing, trajectory planning

Procedia PDF Downloads 113
115 A Posteriori Trading-Inspired Model-Free Time Series Segmentation

Authors: Plessen Mogens Graf

Abstract:

Within the context of multivariate time series segmentation, this paper proposes a method inspired by a posteriori optimal trading. After a normalization step, time series are treated channelwise as surrogate stock prices that can be traded optimally a posteriori in a virtual portfolio holding either stock or cash. Linear transaction costs are interpreted as hyperparameters for noise filtering. Trading signals, as well as trading signals obtained on the reversed time series, are used for unsupervised channelwise labeling before a consensus over all channels is reached that determines the final segmentation time instants. The method is model-free, such that no model prescriptions for segments are made. Benefits of the proposed approach include simplicity, computational efficiency, and adaptability to a wide range of different shapes of time series. Performance is demonstrated on synthetic and real-world data, including a large-scale dataset comprising a multivariate time series of dimension 1000 and length 2709. The proposed method is compared to a popular model-based bottom-up approach fitting piecewise affine models and to a recent model-based top-down approach fitting Gaussian models, and is found to be consistently faster while producing more intuitive results in the sense of segmenting time series at peaks and valleys.

Keywords: time series segmentation, model-free, trading-inspired, multivariate data

Procedia PDF Downloads 136
114 Magnetic Survey for the Delineation of Concrete Pillars in Geotechnical Investigation for Site Characterization

Authors: Nuraddeen Usman, Khiruddin Abdullah, Mohd Nawawi, Amin Khalil Ismail

Abstract:

A magnetic survey was carried out in order to locate the remains of construction items, specifically concrete pillars. The conventional Euler deconvolution technique can perform the task, but it requires the use of a fixed structural index (SI), and the construction items are made of materials with different shapes, which require different (unknown) SIs. A Euler deconvolution technique that estimates the background, the horizontal coordinates (x0 and y0), the depth, and the structural index (SI) simultaneously was prepared and used for this task. The synthetic model study carried out indicated that the new methodology can give a good estimate of location and does not depend on the magnetic latitude. For the field data, both the total magnetic field and gradiometer readings were collected simultaneously. The computed vertical derivatives and gradiometer readings were compared, and they showed good correlation, signifying the effectiveness of the method. The filtering was carried out using an automated procedure, the analytic signal, and other traditional techniques. The clustered depth solutions coincided with the high amplitudes of the analytic signal, and these are the possible positions of the concrete pillars being sought. The targets under investigation are interpreted to be located at depths between 2.8 and 9.4 meters. More follow-up surveys are recommended, as this marks the preliminary stage of the work.

Keywords: concrete pillar, magnetic survey, geotechnical investigation, Euler Deconvolution

Procedia PDF Downloads 258
113 Bayesian Inference for High Dimensional Dynamic Spatio-Temporal Models

Authors: Sofia M. Karadimitriou, Kostas Triantafyllopoulos, Timothy Heaton

Abstract:

Reduced dimension Dynamic Spatio-Temporal Models (DSTMs) jointly describe the spatial and temporal evolution of a function observed subject to noise. A basic state space model is adopted for the discrete temporal variation, while a continuous autoregressive structure describes the continuous spatial evolution. Application of such a DSTM relies upon the pre-selection of a suitable reduced set of basis functions, and this can present a challenge in practice. In this talk, we propose an online estimation method for high-dimensional spatio-temporal data based upon DSTMs, and we attempt to resolve this issue by allowing the basis to adapt to the observed data. Specifically, we present a wavelet decomposition in order to obtain a parsimonious approximation of the spatial continuous process. This parsimony can be achieved by placing a Laplace prior distribution on the wavelet coefficients. The aim of using the Laplace prior is to filter out wavelet coefficients with low contribution, and thus achieve dimension reduction with significant computational savings. We then propose a Hierarchical Bayesian State Space model, for the estimation of which we offer an appropriate particle filter algorithm. The proposed methodology is illustrated using real environmental data.

Keywords: multidimensional Laplace prior, particle filtering, spatio-temporal modelling, wavelets

Procedia PDF Downloads 427
112 A Palmprint Identification System Based on Multi-Layer Perceptron

Authors: David P. Tantua, Abdulkader Helwan

Abstract:

Biometrics has recently been used for human identification systems, using biological traits such as fingerprints and iris scans. Identification systems based on biometrics show great efficiency and accuracy in such human identification applications. However, these types of systems are so far based on some image processing techniques only, which may decrease the efficiency of such applications. Thus, this paper aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn using a backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and removal of unwanted components of the image. The second phase is to feed the processed images into a neural network classifier, which will adaptively learn and create a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images. Therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system achieves high accuracy (100%) and can be implemented in real-life applications.

Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator

Procedia PDF Downloads 371
111 Carbon Skimming: Towards an Application to Summarise and Compare Embodied Carbon to Aid Early-Stage Decision Making

Authors: Rivindu Nethmin Bandara Menik Hitihamy Mudiyanselage, Matthias Hank Haeusler, Ben Doherty

Abstract:

Investors and clients in the Architectural, Engineering and Construction industry find it difficult to understand complex datasets and reports with little to no graphic representation. The stakeholders examined in this paper include designers, design clients and end-users. Communicating embodied carbon information graphically and concisely can aid with decision support early in a building's life cycle. It is essential to create a common visualisation approach as the level of knowledge about embodied carbon varies between stakeholders. The tool, designed in conjunction with Bates Smart, condenses Tally Life Cycle Assessment data to a carbon hot-spotting visualisation, highlighting the sections with the highest amounts of embodied carbon. This allows stakeholders at every stage of a given project to have a better understanding of the carbon implications with minimal effort. It further allows stakeholders to differentiate building elements by their carbon values, which enables the evaluation of the cost-effectiveness of the selected materials at an early stage. To examine and build a decision-support tool, an action-design research methodology of cycles of iterations was used, along with precedents of embodied carbon visualising tools. Accordingly, the importance of visualisation and Building Information Modelling is also explored to understand the best format for relaying these results.

Keywords: embodied carbon, visualisation, summarisation, data filtering, early-stage decision-making, materiality

Procedia PDF Downloads 82