Search results for: Bayesian filtering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 627

177 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour violating neutrino-less conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e’s trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. Otsdaq is an online DAQ software suite with a focus on flexibility and scalability while providing a multi-user, web-based interface accessible through the Chrome or Firefox web browsers. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus by a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV).

Keywords: trigger, daq, mu2e, Fermilab

Procedia PDF Downloads 125
176 Assessing Level of Pregnancy Rate and Milk Yield in Indian Murrah Buffaloes

Authors: V. Jamuna, A. K. Chakravarty, C. S. Patil, Vijay Kumar, M. A. Mir, Rakesh Kumar

Abstract:

Intense selection of buffaloes for milk production at organized herds of the country, without giving due attention to fertility traits such as pregnancy rate, has led to deterioration in their performance. The aim of this study is to develop an optimum model for predicting pregnancy rate and to assess the level of pregnancy rate with respect to milk production in Murrah buffaloes. Data pertaining to 1224 lactation records of Murrah buffaloes spread over a period of 21 years were analyzed, and it was observed that pregnancy rate depicted a negative phenotypic association with lactation milk yield (-0.08 ± 0.04). For developing the optimum model for pregnancy rate in Murrah buffaloes, seven simple and multiple regression models were developed. Among the seven models, model II, having only service period as an independent reproduction variable, was found to be the best prediction model, based on four statistical criteria: high coefficient of determination (R²), low mean sum of squares due to error (MSSe), conceptual predictive (CP) value, and Bayesian information criterion (BIC). For standardizing the level of fertility with milk production, pregnancy rate was classified into seven classes with increments of 10% in all parities and lifetime, with their corresponding average pregnancy rate in relation to the average lactation milk yield (MY). It was observed that to achieve around 2000 kg MY, which can be considered optimum for Indian Murrah buffaloes, the level of pregnancy rate should be between 30% and 50%.
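
As an illustration of the model-screening step (candidate regressions compared by R², BIC and error variance), here is a minimal Python/statsmodels sketch; the predictors and data below are synthetic placeholders, not the herd records:

```python
# Sketch of the model-selection step: compare candidate regressions for
# pregnancy rate by R^2, BIC and residual MSE. Data are synthetic stand-ins.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
service_period = rng.normal(120, 30, n)        # days (hypothetical predictor)
age_at_calving = rng.normal(1400, 200, n)      # days (hypothetical predictor)
pregnancy_rate = 55 - 0.15 * service_period + rng.normal(0, 5, n)

candidates = {
    "I: age only": sm.add_constant(np.column_stack([age_at_calving])),
    "II: service period only": sm.add_constant(np.column_stack([service_period])),
    "III: both": sm.add_constant(np.column_stack([age_at_calving, service_period])),
}
for name, X in candidates.items():
    fit = sm.OLS(pregnancy_rate, X).fit()
    print(f"{name}: R2={fit.rsquared:.3f}  BIC={fit.bic:.1f}  MSE={fit.mse_resid:.2f}")
# The model with high R2 and low BIC/MSE is preferred (model II in the paper).
```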

Keywords: life time, pregnancy rate, production, service period, standardization

Procedia PDF Downloads 604
175 Improving Research by the Integration of a Collaborative Dimension in an Information Retrieval (IR) System

Authors: Amel Hannech, Mehdi Adda, Hamid Mcheick

Abstract:

In computer science, finding useful information is still one of the most active and important research topics. The most popular applications of information retrieval (IR) are search engines, which meet users' specific needs and aim to locate effective information on the web. However, these search engines have some limitations related to the relevancy of the results and the ease of exploring those results. In this context, we proposed in previous works a Multi-Space Search Engine model that is based on a multidimensional interpretation universe. In the present paper, we integrate an additional dimension that offers users new research experiences. The added component is based on creating user profiles and calculating the similarity between them, which then allows the use of collaborative filtering in retrieving search results. To evaluate the effectiveness of the proposed model, a prototype was developed. The experiments showed that the additional dimension has improved the relevancy of results by predicting the interesting items of users based on their experiences and the experiences of other similar users. The offered personalization service allows users to approve the pertinent items, which enriches their profiles and further improves retrieval.
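
A minimal sketch of the collaborative-filtering idea described (profile similarity, then similarity-weighted prediction); the profile matrix is invented for illustration:

```python
# User-based collaborative filtering sketch: cosine similarity between user
# profiles, then a similarity-weighted score for an unseen item.
import numpy as np

# Rows = users, columns = items; entries are interest scores (0 = unseen).
profiles = np.array([[5, 4, 0, 1],
                     [4, 5, 1, 0],
                     [1, 0, 5, 4]], dtype=float)

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

def predict(user, item):
    # Weight other users' scores for `item` by their similarity to `user`.
    sims = np.array([cosine(profiles[user], p) for p in profiles])
    sims[user] = 0.0                       # exclude the user themself
    rated = profiles[:, item] > 0
    if not np.any(rated):
        return 0.0
    return sims[rated] @ profiles[rated, item] / (np.abs(sims[rated]).sum() + 1e-12)

print(predict(user=0, item=2))  # predicted interest of user 0 in item 2
```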

Keywords: information retrieval, v-facets, user behavior analysis, user profiles, topical ontology, association rules, data personalization

Procedia PDF Downloads 235
174 Lab Bench for Synthetic Aperture Radar Imaging System

Authors: Karthiyayini Nagarajan, P. V. Ramakrishna

Abstract:

Radar imaging techniques have extensive applications in the field of remote sensing, most notably Synthetic Aperture Radar (SAR), which provides high-resolution target images. This work puts forward an effective and realizable signal generation and processing chain for SAR images. The major units in the system include a camera, a signal generation unit, a signal processing unit and a display screen. The real radio channel is replaced by its mathematical model, based on an optical image, to calculate a reflected signal model in real time. The signal generation unit realizes the algorithm and forms the radar reflection model. The signal processing unit provides range and azimuth resolution through matched filtering and spectrum analysis procedures to form the radar image on the display screen. The restored image has the same quality as that of the optical image. This SAR imaging system has been designed and implemented using MATLAB and Quartus II tools on a Stratix III device as a lab bench that works in real time, to study and investigate radar imaging rudiments and signal processing schemes for educational and research purposes.
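
The matched-filtering step at the heart of the range processing can be sketched as follows; the chirp parameters and target delay are invented, not the lab bench's actual settings:

```python
# Range compression by matched filtering: correlate a received echo with the
# transmitted linear-FM chirp; the correlation peak marks the target delay.
import numpy as np

fs, T, B = 10e6, 20e-6, 2e6                 # sample rate, pulse length, bandwidth
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # LFM reference pulse (200 samples)

delay = 150                                  # echo delay in samples (one target)
rx = np.zeros(1024, dtype=complex)
rx[delay:delay + len(chirp)] = 0.5 * chirp   # attenuated echo
rx += 0.05 * (np.random.randn(1024) + 1j * np.random.randn(1024))  # noise

mf = np.abs(np.correlate(rx, chirp, mode="valid"))  # matched-filter output
print("target at sample:", np.argmax(mf))           # ~= delay
```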

Keywords: synthetic aperture radar, radio reflection model, lab bench, imaging engineering

Procedia PDF Downloads 461
173 Closed-Loop Audit of the Degree of Management of Thrombocytosis in Accordance with NICE Guidance at Roseneath General Practice

Authors: Georgia Mills, Rachel Parsonage

Abstract:

Thrombocytosis is a platelet count above the upper limit of the normal range. An urgent referral is advised for counts over 1000 × 10⁹/L, and for counts between 600-1000 × 10⁹/L with certain conditions or ages. A non-urgent referral is warranted when the level is above 450 × 10⁹/L (for more than 3 months), over 600 × 10⁹/L on at least two occasions (4-6 weeks apart), or within the range 450-600 × 10⁹/L with other haematological abnormalities. The aim of this audit is to assess how well Roseneath General Practice has adhered to the National Institute for Health and Care Excellence (NICE) guidelines for the investigation and management of high platelet counts. Through the filtering tool in Vision, all blood results in the surgery were filtered to show only those with a platelet count above 450 × 10⁹/L. These patients were then analyzed individually to see where they fall on the current NICE guidance pathway for management. The investigation and management of thrombocytosis were generally poor. 60% of those who needed an urgent referral did not have it done. 30% of those who needed a follow-up blood test did not have it done. 60% of those needing a routine referral after complete investigations did not have it done. To improve knowledge of the NICE guidelines within the practice, a teaching session was delivered. Percentages then reached 100% in the second audit. There is a lack of awareness of guidelines and education on thrombocytosis in primary care. Teaching sessions will benefit outcomes greatly.
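
A sketch of the referral triage encoded by the thresholds quoted above (counts in × 10⁹/L); the rule set is simplified for illustration and is not a clinical tool:

```python
# Simplified encoding of the referral thresholds described in the audit.
def thrombocytosis_referral(count, months_elevated=0, prior_high_readings=1,
                            other_haem_abnormality=False, qualifying_condition=False):
    if count > 1000:
        return "urgent referral"
    if 600 <= count <= 1000 and qualifying_condition:
        return "urgent referral"
    if count > 450 and months_elevated > 3:
        return "routine referral"
    if count > 600 and prior_high_readings >= 2:
        return "routine referral"
    if 450 <= count <= 600 and other_haem_abnormality:
        return "routine referral"
    return "repeat full blood count in 4-6 weeks"

print(thrombocytosis_referral(700, prior_high_readings=2))  # routine referral
```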

Keywords: platelets, thrombocytosis, management, referral

Procedia PDF Downloads 36
172 Design and Implementation of a Lab Bench for Synthetic Aperture Radar Imaging System

Authors: Karthiyayini Nagarajan, P. V. RamaKrishna

Abstract:

Radar imaging techniques have extensive applications in the field of remote sensing, most notably Synthetic Aperture Radar (SAR), which provides high-resolution target images. This work puts forward an effective and realizable signal generation and processing chain for SAR images. The major units in the system include a camera, a signal generation unit, a signal processing unit and a display screen. The real radio channel is replaced by its mathematical model, based on an optical image, to calculate a reflected signal model in real time. The signal generation unit realizes the algorithm and forms the radar reflection model. The signal processing unit provides range and azimuth resolution through matched filtering and spectrum analysis procedures to form the radar image on the display screen. The restored image has the same quality as that of the optical image. This SAR imaging system has been designed and implemented using MATLAB and Quartus II tools on a Stratix III device as a lab bench that works in real time, to study and investigate radar imaging rudiments and signal processing schemes for educational and research purposes.

Keywords: synthetic aperture radar, radio reflection model, lab bench

Procedia PDF Downloads 440
171 Measuring and Evaluating the Effectiveness of Mobile High Efficiency Particulate Air Filtering on Particulate Matter within the Road Traffic Network of a Sample of Non-Sparse and Sparse Urban Environments in the UK

Authors: Richard Maguire

Abstract:

This research evaluates the efficiency of using mobile HEPA filters to reduce localized Particulate Matter (PM), Total Volatile Organic Chemical (TVOC) and Formaldehyde (HCHO) air pollution. The research is being performed using a standard HEPA filter that is tube-fitted and attached to a motor vehicle. The velocity of the vehicle is used to generate the pressure difference that allows the filter to remove PM, TVOC and HCHO pollution from the localized atmosphere of a road transport traffic route. The testing has been performed on a sample of traffic routes in non-sparse and sparse urban environments within the UK. Pre- and post-filter measurements of PM2.5 air quality have been carried out, along with demographics of the climate environment, including live filming of the traffic conditions. This provides a baseline for future national and international research. The effectiveness measurement is generated by evaluating the difference in PM2.5 air quality measured before and after the mobile filter test equipment. A series of further research opportunities and future exploitation options are outlined based on the results of the research.
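
The effectiveness figure is a simple relative difference across the filter; a tiny sketch with made-up readings:

```python
# Removal efficiency as the relative drop in PM2.5 across the mobile HEPA
# filter, from paired pre-/post-filter readings (values are illustrative).
pre_pm25 = [38.0, 41.5, 35.2]    # ug/m3 upstream of the filter
post_pm25 = [12.1, 13.0, 11.4]   # ug/m3 downstream

for pre, post in zip(pre_pm25, post_pm25):
    print(f"removal efficiency: {100 * (pre - post) / pre:.1f} %")
```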

Keywords: high efficiency particulate air, HEPA filter, particulate matter, traffic pollution

Procedia PDF Downloads 98
170 Performance Assessment of Multi-Level Ensemble for Multi-Class Problems

Authors: Rodolfo Lorbieski, Silvia Modesto Nassar

Abstract:

Many supervised machine learning tasks require decision making across numerous different classes. Multi-class classification has several applications, such as face recognition, text recognition and medical diagnostics. The objective of this article is to analyze an adapted method of Stacking in multi-class problems, which combines ensembles within the ensemble itself. For this purpose, a training scheme similar to Stacking was used, but with three levels, where the final decision-maker (level 2) performs its training by combining outputs from the pair of tree-based and Bayesian-family meta-classifiers (level 1), which are in turn trained by pairs of base classifiers (level 0) of the same family. This strategy seeks to promote diversity among the ensembles forming the level-2 meta-classifier. Three performance measures were used: (1) accuracy, (2) area under the ROC curve, and (3) time, for three factors: (a) datasets, (b) experiments and (c) levels. To compare the factors, a three-way ANOVA test was executed for each performance measure, considering 5 datasets by 25 experiments by 3 levels. A triple interaction between factors was observed only in time. Accuracy and area under the ROC curve presented similar results, showing a double interaction between level and experiment, as well as with the dataset factor. It was concluded that level 2 had an average performance above the other levels and that the proposed method is especially efficient for multi-class problems when compared to binary problems.
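
A rough scikit-learn sketch of the three-level idea (level-0 pairs of same-family classifiers feeding level-1 meta-classifiers, combined by a level-2 decision-maker); the specific estimators and dataset are stand-ins, not the paper's setup:

```python
# Nested stacking: level 0 -> level 1 within each family, level 2 on top.
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

level1_tree = StackingClassifier(            # tree-family pair -> tree meta
    estimators=[("dt1", DecisionTreeClassifier(max_depth=3)),
                ("dt2", DecisionTreeClassifier(max_depth=None))],
    final_estimator=DecisionTreeClassifier())
level1_bayes = StackingClassifier(           # Bayesian-family pair -> NB meta
    estimators=[("nb1", GaussianNB()),
                ("nb2", GaussianNB(var_smoothing=1e-7))],
    final_estimator=GaussianNB())
level2 = StackingClassifier(                 # level-2 final decision-maker
    estimators=[("tree_meta", level1_tree), ("bayes_meta", level1_bayes)],
    final_estimator=LogisticRegression(max_iter=1000))

X, y = load_iris(return_X_y=True)
print(cross_val_score(level2, X, y, cv=5).mean())
```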

Keywords: stacking, multi-layers, ensemble, multi-class

Procedia PDF Downloads 241
169 Contactless Heart Rate Measurement System Based on FMCW Radar and LSTM for Automotive Applications

Authors: Asma Omri, Iheb Sifaoui, Sofiane Sayahi, Hichem Besbes

Abstract:

Future vehicle systems demand advanced capabilities, notably in-cabin life detection and driver monitoring systems, with a particular emphasis on drowsiness detection. To meet these requirements, several techniques employ artificial intelligence methods based on real-time vital sign measurements. In parallel, Frequency-Modulated Continuous-Wave (FMCW) radar technology has garnered considerable attention in the domains of healthcare and biomedical engineering for non-invasive vital sign monitoring. FMCW radar offers a multitude of advantages, including its non-intrusive nature, continuous monitoring capacity, and its ability to penetrate through clothing. In this paper, we propose a system utilizing the AWR6843AOP radar from Texas Instruments (TI) to extract precise vital sign information. The radar allows us to estimate Ballistocardiogram (BCG) signals, which capture the mechanical movements of the body, particularly the ballistic forces generated by heartbeats and respiration. These signals are rich sources of information about the cardiac cycle, rendering them suitable for heart rate estimation. The process begins with real-time subject positioning, followed by clutter removal, computation of Doppler phase differences, and the use of various filtering methods to accurately capture subtle physiological movements. To address the challenges associated with FMCW radar-based vital sign monitoring, including motion artifacts due to subjects' movement or radar micro-vibrations, Long Short-Term Memory (LSTM) networks are implemented. LSTM's adaptability to different heart rate patterns and ability to handle real-time data make it suitable for continuous monitoring applications. Several crucial steps were taken, including feature extraction (involving amplitude, time intervals, and signal morphology), sequence modeling, heart rate estimation through the analysis of detected cardiac cycles and their temporal relationships, and performance evaluation using metrics such as Root Mean Square Error (RMSE) and correlation with reference heart rate measurements. For dataset construction and LSTM training, a comprehensive data collection system was established, integrating the AWR6843AOP radar, a heart rate belt, and a smartwatch for ground truth measurements. Rigorous synchronization of these devices ensured data accuracy. Twenty participants engaged in various scenarios, encompassing indoor and real-world conditions within a moving vehicle equipped with the radar system. Static and dynamic subject conditions were considered. The heart rate estimation through LSTM outperforms traditional signal processing techniques that rely on filtering, Fast Fourier Transform (FFT), and thresholding. It delivers an average accuracy of approximately 91% with an RMSE of 1.01 beats per minute (bpm). In conclusion, this paper underscores the promising potential of FMCW radar technology integrated with artificial intelligence algorithms in the context of automotive applications. This innovation not only enhances road safety but also paves the way for its integration into the automotive ecosystem to improve driver well-being and overall vehicular safety.
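
A minimal sketch of the sequence-regression core (windows of radar-derived features mapped to heart rate); shapes, data and architecture below are placeholders, assuming a TensorFlow/Keras stack:

```python
# LSTM regressor sketch: feature windows -> heart rate (bpm).
import numpy as np
import tensorflow as tf

T, F = 128, 4                       # time steps per window, features per step
X = np.random.randn(512, T, F).astype("float32")     # stand-in radar features
y = 60 + 30 * np.random.rand(512).astype("float32")  # stand-in heart rates

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(T, F)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),       # predicted bpm
])
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.RootMeanSquaredError()])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```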

Keywords: ballistocardiogram, FMCW Radar, vital sign monitoring, LSTM

Procedia PDF Downloads 46
168 Modeling of UAV Longitudinal Dynamics through System Identification Technique

Authors: Asadullah I. Qazi, Mansoor Ahsan, Zahir Ashraf, Uzair Ahmad

Abstract:

System identification of an Unmanned Aerial Vehicle (UAV), to acquire its mathematical model, is a significant step in the process of aircraft flight automation. The need for a reliable mathematical model is an established requirement for autopilot design, flight simulator development, aircraft performance appraisal, analysis of aircraft modifications, preflight testing of prototype aircraft, and investigation of fatigue life and stress distribution. This research is aimed at system identification of a fixed-wing UAV by means of a specifically designed flight experiment. The purposely designed flight maneuvers were performed on the UAV and aircraft states were recorded during these flights. The acquired data were preprocessed for noise filtering and bias removal, followed by parameter estimation of longitudinal dynamics transfer functions using the MATLAB system identification toolbox. Black-box transfer function models, in response to elevator and throttle inputs, were estimated using the least-squares error technique. The identification results show a high confidence level and goodness of fit between the estimated model and the actual aircraft response.
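
A bare-bones version of black-box least-squares estimation on an assumed ARX structure; the demo system and model orders are invented, not the UAV's identified dynamics:

```python
# ARX estimation by least squares: regress pitch rate q[k] on its own past
# and past elevator inputs u[k].
import numpy as np

rng = np.random.default_rng(1)
N = 500
u = rng.standard_normal(N)                    # elevator input
q = np.zeros(N)
for k in range(2, N):                         # "true" system, for demo only
    q[k] = 1.5 * q[k-1] - 0.7 * q[k-2] + 0.5 * u[k-1] + 0.01 * rng.standard_normal()

# Regressor matrix for the model q[k] = a1 q[k-1] + a2 q[k-2] + b1 u[k-1]
Phi = np.column_stack([q[1:-1], q[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, q[2:], rcond=None)
print("estimated [a1, a2, b1]:", np.round(theta, 3))
```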

Keywords: fixed wing UAV, system identification, black box modeling, longitudinal dynamics, least square error

Procedia PDF Downloads 295
167 Video Text Information Detection and Localization in Lecture Videos Using Moments

Authors: Belkacem Soundes, Guezouli Larbi

Abstract:

This paper presents a robust and accurate method for text detection and localization in lecture videos. Frame regions are classified into text or background based on visual feature analysis. However, lecture videos show significant degradation, mainly related to acquisition conditions, camera motion and environmental changes, resulting in low-quality videos and affecting feature extraction and description efficiency. Moreover, traditional text detection methods cannot be directly applied to lecture videos. Therefore, robust feature extraction methods dedicated to this specific video genre are required for robust and accurate text detection and extraction. The method consists of a three-step process: slide region detection and segmentation, feature extraction, and non-text filtering. For robust and effective feature extraction, moment functions are used. Two distinct types of moments are used: orthogonal and non-orthogonal. For the orthogonal case, both Zernike and pseudo-Zernike moments are used, whereas for the non-orthogonal case Hu moments are used. Expressivity and description efficiency are given and discussed. The proposed approach shows that, in general, orthogonal moments achieve higher accuracy in comparison to non-orthogonal ones. Pseudo-Zernike moments are more effective than Zernike moments, with better computation time.
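
For the non-orthogonal descriptors, a short OpenCV sketch computing Hu moments of a candidate region (the test patch is synthetic):

```python
# Hu moments of a candidate text region; log-scaling is a common
# normalization since the raw invariants span many orders of magnitude.
import cv2
import numpy as np

patch = np.zeros((64, 64), dtype=np.uint8)
cv2.putText(patch, "A", (12, 48), cv2.FONT_HERSHEY_SIMPLEX, 1.5, 255, 2)

hu = cv2.HuMoments(cv2.moments(patch)).flatten()
log_hu = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # scale-friendly form
print(log_hu)  # 7 translation/scale/rotation-invariant features
```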

Keywords: text detection, text localization, lecture videos, pseudo zernike moments

Procedia PDF Downloads 125
166 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups. After that, it tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores from each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all published systems to date when evaluated against the exact same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
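
A sketch of the feature-selection idea, searching one hypothetical feature group for the subset maximizing cross-validated weighted F-score; the dataset is a stand-in for SEER:

```python
# Exhaustive subset search within one feature group, scored by weighted F.
from itertools import combinations
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
group = [0, 1, 2, 3, 4]                      # indices of one feature group

best_score, best_subset = 0.0, None
for r in range(1, len(group) + 1):
    for subset in combinations(group, r):
        score = cross_val_score(GaussianNB(), X[:, list(subset)], y,
                                cv=5, scoring="f1_weighted").mean()
        if score > best_score:
            best_score, best_subset = score, subset
print(best_subset, round(best_score, 4))
```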

Keywords: classifier ensemble, breast cancer survivability, data mining, SEER

Procedia PDF Downloads 297
165 Robust Medical Image Watermarking Based on Contourlet and Extraction Using ICA

Authors: S. Saju, G. Thirugnanam

Abstract:

In this paper, a medical image watermarking algorithm based on the contourlet transform is proposed. Medical image watermarking is a special subcategory of image watermarking in the sense that the images have special requirements: watermarked medical images should not differ perceptually from their original counterparts, because the clinical reading of images must not be affected. Watermarking techniques based on the wavelet transform are widely reported in the literature, but robustness and security are better with the contourlet transform than with the wavelet transform. The main challenge in exploring geometry in images comes from the discrete nature of the data. In this paper, the original image is decomposed to two levels using the contourlet transform, and the watermark is embedded in the resultant sub-bands. Sub-band selection is based on the value of the Peak Signal to Noise Ratio (PSNR) calculated between the watermarked and original images. To extract the watermark, kernel ICA is used; it has the novel characteristic of not requiring the transformation process to extract the watermark. Simulation results show that the proposed scheme is robust against attacks such as salt-and-pepper noise, median filtering and rotation. The performance measures PSNR and similarity measure are evaluated and compared with the Discrete Wavelet Transform (DWT) to prove the robustness of the scheme. Simulations are carried out using MATLAB software.
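
The PSNR criterion used for sub-band selection, as a short sketch (8-bit images assumed, synthetic data):

```python
# PSNR between an original and a watermarked image.
import numpy as np

def psnr(original, watermarked, peak=255.0):
    mse = np.mean((original.astype(float) - watermarked.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)

img = np.random.randint(0, 256, (256, 256))
marked = np.clip(img + np.random.randint(-2, 3, img.shape), 0, 255)
print(f"PSNR = {psnr(img, marked):.2f} dB")
```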

Keywords: digital watermarking, independent component analysis, wavelet transform, contourlet

Procedia PDF Downloads 506
164 Information Communication Technology Based Road Traffic Accidents’ Identification, and Related Smart Solution Utilizing Big Data

Authors: Ghulam Haider Haidaree, Nsenda Lukumwena

Abstract:

Today the world of research enjoys abundant data, available in virtually any field: technology, science, business, politics, etc. This is commonly referred to as big data. It offers a great deal of precision and accuracy, supportive of an in-depth look at any decision-making process. When and if well used, big data affords its users the opportunity to produce substantially well-supported and good results. This paper leans extensively on big data to investigate possible smart solutions to urban mobility and related issues, namely road traffic accidents and their casualties and fatalities, based on multiple factors, including age, gender, locations of accident occurrences, etc. Multiple technologies were used in combination to produce an Information Communication Technology (ICT) based solution with embedded technology. Those technologies principally include Geographic Information Systems (GIS), the Orange data mining software, and Bayesian statistics, to name a few. The study uses the 2016 Leeds accident dataset to illustrate the thinking process and extracts from it a model that can be tested, evaluated, and replicated. The authors optimistically believe that the proposed model will significantly and smartly help to flatten the curve of road traffic accidents in fast-growing population densities, which considerably increase motor-based mobility.

Keywords: accident factors, geographic information system, information communication technology, mobility

Procedia PDF Downloads 173
163 Semiautomatic Calculation of Ejection Fraction Using Echocardiographic Image Processing

Authors: Diana Pombo, Maria Loaiza, Mauricio Quijano, Alberto Cadena, Juan Pablo Tello

Abstract:

In this paper, we present a semi-automatic tool for calculating the ejection fraction from an echocardiographic video signal derived from a database in DICOM format of Clinica de la Costa, Barranquilla. Described in this paper are the steps and methods used to obtain the respective calculation, including acquisition and formation of the test samples, processing, and finally the calculation of the parameters to obtain the ejection fraction. Two image segmentation methods were compared following a methodological framework that is similar only in the initial stages of processing (filtering and image enhancement) and differs at the end, where the algorithms are implemented (active contour and region growing algorithms). The results were compared with the measurements obtained by two different medical specialists in cardiology, who calculated the ejection fraction of the study samples using the traditional method, which consists of drawing the region of interest directly on the computer using the echocardiography equipment and a simple equation to calculate the desired value. The results showed that if the quality of the video samples is good (i.e., after the pre-processing there is evidence of an improvement in the contrast), the values provided by the tool are substantially close to those reported by the physicians; also, the correlation between physicians does not vary significantly.
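
The "simple equation" referred to is the standard ejection fraction formula, sketched below with illustrative volumes:

```python
# EF = (EDV - ESV) / EDV * 100, with end-diastolic and end-systolic
# volumes (EDV, ESV) in millilitres, as obtained from the segmentation.
def ejection_fraction(edv_ml, esv_ml):
    return 100.0 * (edv_ml - esv_ml) / edv_ml

print(f"EF = {ejection_fraction(120.0, 50.0):.1f} %")  # ~58%, a normal value
```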

Keywords: echocardiography, DICOM, processing, segmentation, EDV, ESV, ejection fraction

Procedia PDF Downloads 404
162 A Combination of Filtration and Coagulation Processes for Tannery Effluent Treatment

Authors: M. G. Mostafa, Manjushree Chowdhury, Tapan Kumar Biswas, Ananda Kumar Saha

Abstract:

This study focused on effluent characterization and a treatment process to reduce the toxicity of tannery effluents. The tanning industry is one of the oldest industries in the world. It is typically characterized as a pollutant-generating industry producing a wide variety of high-strength toxic chemicals. The study was conducted during the years 2008 to 2009, and the tannery effluents were collected three times a year from the outlets of selected leather industries located in the Hazaribagh industrial zone, Dhaka, Bangladesh. The analysis results of the raw effluents reveal that the effluents were yellowish-brown in color, having basic pH and very high values of BOD₅, COD, TDS, TSS and TS, with high concentrations of Cr, Na, SO₄²⁻, Cl⁻ and other organic and inorganic constituents. The tannery effluents were treated with various doses of FeCl₃ after settling and a subsequent filtration through sand-stone. The study observed that a coagulant (FeCl₃) dose of 150 mg/L around neutral pH showed the best removal efficiency for the major physico-chemical parameters. The analysis results illustrate that most of the physical and chemical parameters were found well below the prescribed permissible limits for effluent discharge. The study suggests that tannery effluents could be treated by a combined process consisting of settling, filtering and coagulating with FeCl₃.

Keywords: characterization, effluent, tannery, treatment

Procedia PDF Downloads 425
161 Investigation of the Litho-Structure of Ilesa Using High Resolution Aeromagnetic Data

Authors: Oladejo Olagoke Peter, Adagunodo T. A., Ogunkoya C. O.

Abstract:

The research investigated the arrangement of geological features beneath Ilesa employing aeromagnetic data. The obtained data were subjected to various filtering and processing techniques, namely Total Horizontal Derivative (THD), depth continuation and analytical signal amplitude, using the Geosoft Oasis Montaj 6.4.2 software. The Reduced-to-the-Equator Total Magnetic Intensity (RTE-TMI) outcomes reveal significant magnetic anomalies, with high magnitudes (55.1 to 155 nT) predominantly in the northwest half of the area. Intermediate magnetic susceptibility, ranging between 6.0 and 55.1 nT, dominates the eastern part, separated by depressions and uplifts. The southern part of the area exhibits a magnetic field of low intensity, ranging from -76.6 to 6.0 nT. The lineaments exhibit varying lengths ranging from 2.5 to 16.0 km. Analysis of the rose diagram and the analytical signal amplitude indicates structural styles mainly of E-W and NE-SW orientations, particularly evident in the western, SW and NE regions, with an amplitude of 0.0318 nT/m. The identified faults in the area demonstrate orientations of NNW-SSE, NNE-SSW and WNW-ESE, situated at depths ranging from 500 to 750 m. Considering the divergent magnetic susceptibility, the structural style and orientation of the lineaments, and the identified faults and their depths, these lithological features could serve as a valuable foundation for assessing ground motion, particularly in the presence of sufficient seismic energy.
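
The Total Horizontal Derivative named above can be sketched on a toy anomaly grid as follows (all values invented):

```python
# THD of a gridded magnetic anomaly T(x, y):
# THD = sqrt((dT/dx)^2 + (dT/dy)^2); maxima trace lineament edges.
import numpy as np

x = y = np.linspace(0, 10_000, 200)          # grid coordinates in metres
X, Y = np.meshgrid(x, y)
T = 100 * np.exp(-((X - 5000)**2 + (Y - 4000)**2) / 2e6)  # toy anomaly, nT

dT_dy, dT_dx = np.gradient(T, y, x)          # gradients along axis 0, axis 1
thd = np.hypot(dT_dx, dT_dy)                 # nT/m
print("peak THD:", thd.max())
```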

Keywords: lineament, aeromagnetic, anomaly, fault, magnetic

Procedia PDF Downloads 43
160 Analyzing the Impact of Migration on HIV and AIDS Incidence Cases in Malaysia

Authors: Ofosuhene O. Apenteng, Noor Azina Ismail

Abstract:

The human immunodeficiency virus (HIV) that causes acquired immune deficiency syndrome (AIDS) remains a global cause of morbidity and mortality and has caused panic since its emergence. The relationships between migration and HIV/AIDS have become complex. In the absence of prospectively designed studies, dynamic mathematical models that take migration movement into account can give very useful information. We have explored the utility of mathematical models in understanding the transmission dynamics of HIV and AIDS and in assessing the magnitude of the impact migration has on the disease. The model was calibrated to HIV and AIDS incidence data from the Malaysian Ministry of Health for the period 1986 to 2011, using Bayesian analysis combined with a Markov chain Monte Carlo (MCMC) approach to estimate the model parameters. From the estimated parameters, the estimated basic reproduction number was 22.5812. The rate at which susceptible individuals move to the HIV compartment has the highest sensitivity value, which is more significant compared with the remaining parameters; thus, the disease becomes unstable. This is a big concern and not a good indicator from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium. These results suggest that the government, as a policy maker, should make further efforts to curb illegal activities performed by migrants. It is shown that our models reflect considerably the dynamic behavior of the HIV/AIDS epidemic in Malaysia and eventually could be used strategically for other countries.
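
A bare-bones Metropolis-Hastings sketch of the MCMC calibration idea; the toy incidence model and data below are placeholders, not the paper's compartmental model:

```python
# Metropolis-Hastings sampling of one rate parameter against yearly
# incidence counts, under a Poisson likelihood.
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1986, 2012)
observed = rng.poisson(50 + 8 * (years - 1986))        # placeholder incidence

def log_likelihood(beta):
    if beta <= 0:
        return -np.inf
    expected = 50 + beta * (years - 1986)              # toy growth model
    return np.sum(observed * np.log(expected) - expected)  # Poisson log-lik

beta, samples = 5.0, []
for _ in range(20_000):
    prop = beta + 0.2 * rng.standard_normal()          # random-walk proposal
    if np.log(rng.random()) < log_likelihood(prop) - log_likelihood(beta):
        beta = prop                                    # accept
    samples.append(beta)
print("posterior mean beta:", np.mean(samples[5000:]))  # discard burn-in
```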

Keywords: epidemic model, reproduction number, HIV, MCMC, parameter estimation

Procedia PDF Downloads 340
159 Prediction of PM₂.₅ Concentration in Ulaanbaatar with Deep Learning Models

Authors: Suriya

Abstract:

Rapid socio-economic development and urbanization have led to an increasingly serious air pollution problem in Ulaanbaatar (UB), the capital of Mongolia. PM₂.₅ pollution has become the most pressing aspect of UB air pollution. Therefore, monitoring and predicting PM₂.₅ concentration in UB is of great significance for the health of the local people and for environmental management. As of yet, very few studies have used models to predict PM₂.₅ concentrations in UB. Using data from 0:00 on June 1, 2018, to 23:00 on April 30, 2020, we proposed two deep learning models based on Bayesian-optimized LSTM (Bayes-LSTM) and CNN-LSTM. We utilized hourly observed data, including Himawari-8 (H8) aerosol optical depth (AOD), meteorology, and PM₂.₅ concentration, as input for the prediction of PM₂.₅ concentrations. The correlation strengths between meteorology, AOD, and PM₂.₅ were analyzed using the gray correlation analysis method; the performance improvement from including the AOD input was tested, and the performance of these models was evaluated using mean absolute error (MAE) and root mean square error (RMSE). The prediction accuracies of the Bayes-LSTM and CNN-LSTM deep learning models were both improved when AOD was included as an input parameter. The improvement in the prediction accuracy of the CNN-LSTM model was particularly enhanced in the non-heating season; in the heating season, the prediction accuracy of the Bayes-LSTM model slightly improved, while that of the CNN-LSTM model slightly decreased. We propose two novel deep learning models for PM₂.₅ concentration prediction in UB, the Bayes-LSTM and CNN-LSTM models, pioneering the use of AOD data from H8 and demonstrating that including AOD input data improves the performance of the two proposed models.
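
The two reported error metrics, computed directly; values are illustrative:

```python
# MAE and RMSE between observed and predicted hourly PM2.5 concentrations.
import numpy as np

def mae(y_true, y_pred):
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

obs, pred = [35.0, 80.0, 120.0], [33.0, 85.0, 110.0]
print(f"MAE = {mae(obs, pred):.2f}  RMSE = {rmse(obs, pred):.2f}")  # ug/m3
```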

Keywords: deep learning, AOD, PM2.5, prediction, Ulaanbaatar

Procedia PDF Downloads 21
158 An Enhanced SAR-Based Tsunami Detection System

Authors: Jean-Pierre Dubois, Jihad S. Daba, H. Karam, J. Abdallah

Abstract:

Tsunami early detection and warning systems have proved to be of ultimate importance, especially after the destructive tsunami that hit Japan in March 2011. Such systems are crucial to inform the authorities of any risk of a tsunami and of the degree of its danger, in order to make the right decisions and notify the public of the actions they need to take to save their lives. The purpose of this research is to enhance existing tsunami detection and warning systems. We first propose an automated and miniaturized model of an early tsunami detection and warning system. The operation of the tsunami warning system is simulated using the data acquisition toolbox of MATLAB and measurements acquired from specified internet pages, due to the lack of the required real-life sensors, both seismic and hydrologic, and a graphical user interface is built for the system. In the second phase of this work, we implement various satellite image filtering schemes to enhance the acquired synthetic aperture radar images of the tsunami-affected region that are masked by speckle noise. This enables us to conduct a post-tsunami damage-extent study and calculate the percentage damage. We conclude by proposing improvements to the telecommunication infrastructure of existing tsunami warning systems using a migration to IP-based networks and fiber-optic links.
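
One of the speckle-filtering schemes that can be applied is the local-statistics Wiener filter; a sketch on a synthetic amplitude image:

```python
# Adaptive Wiener filtering to suppress multiplicative speckle in a SAR-like
# amplitude image; the image and noise model here are synthetic.
import numpy as np
from scipy.signal import wiener

clean = np.ones((128, 128))
clean[40:90, 30:100] = 3.0                      # a bright "damage" region
speckled = clean * np.random.gamma(shape=4, scale=0.25, size=clean.shape)

despeckled = wiener(speckled, mysize=(5, 5))    # local-statistics Wiener filter
print("noise std before/after:",
      round(speckled[clean == 1].std(), 3), round(despeckled[clean == 1].std(), 3))
```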

Keywords: detection, GIS, GSN, GTS, GPS, speckle noise, synthetic aperture radar, tsunami, wiener filter

Procedia PDF Downloads 358
157 Prediction of Distillation Curve and Reid Vapor Pressure of Dual-Alcohol Gasoline Blends Using Artificial Neural Network for the Determination of Fuel Performance

Authors: Leonard D. Agana, Wendell Ace Dela Cruz, Arjan C. Lingaya, Bonifacio T. Doma Jr.

Abstract:

The purpose of this paper is to predict the fuel performance parameters, which include the drivability index (DI), vapor lock index (VLI), and vapor lock potential, using the distillation curve and Reid vapor pressure (RVP) of dual alcohol-gasoline fuel blends. The distillation curve and Reid vapor pressure were predicted using artificial neural networks (ANN) with macroscopic properties such as boiling points, RVP, and molecular weights as the input layers. The ANN consists of 5 hidden layers and was trained using Bayesian regularization. The training mean square error (MSE) and R-value for the ANN of RVP are 91.4113 and 0.9151, respectively, while the training MSE and R-value for the distillation curve are 33.4867 and 0.9927. Fuel performance analysis of the dual alcohol-gasoline blends indicated that highly volatile gasoline blended with dual alcohols results in fuel blends non-compliant with the D4814 standard. Mixtures of low-volatility gasoline and 10% methanol or 10% ethanol can still be blended with up to 10% C3 and C4 alcohols. Intermediate-volatility gasoline containing 10% methanol or 10% ethanol can still be blended with C3 and C4 alcohols that have low RVPs, such as 1-propanol, 1-butanol, 2-butanol, and i-butanol.
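
A rough stand-in for the described network using scikit-learn (which offers L2 rather than Bayesian regularization, so this is an analogy, not a replication); inputs and target are synthetic:

```python
# 5-hidden-layer regressor mapping macroscopic blend properties to RVP.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(6)
X = rng.normal(size=(300, 3))          # boiling point, RVP, molecular weight
y = 50 + 10 * X[:, 0] - 5 * X[:, 2] + rng.normal(0, 1, 300)   # toy target

net = MLPRegressor(hidden_layer_sizes=(32, 32, 32, 32, 32),
                   alpha=1e-3,         # L2 penalty in lieu of Bayesian reg.
                   max_iter=2000, random_state=0)
net.fit(X, y)
print("train R^2:", round(net.score(X, y), 3))
```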

Keywords: dual alcohol-gasoline blends, distillation curve, machine learning, reid vapor pressure

Procedia PDF Downloads 70
156 An Integrated Approach for Risk Management of Transportation of HAZMAT: Use of Quality Function Deployment and Risk Assessment

Authors: Guldana Zhigerbayeva, Ming Yang

Abstract:

Transportation of hazardous materials (HAZMAT) is inevitable in the process industries. Statistics show that a significant number of accidents have occurred during the transportation of HAZMAT. This makes risk management of HAZMAT transportation an important topic. Tree-based methods, including fault trees, event trees, cause-consequence analysis, and Bayesian networks, have been applied to risk management of HAZMAT transportation. However, there is limited work on the development of a systematic approach. The existing approaches fail to build up the linkages between regulatory requirements and the development of safety measures. Analysis of historical data from past accident report databases would limit our focus to specific incidents and their specific causes. Thus, we may overlook some essential elements in risk management, including regulatory compliance, field expert opinions, and suggestions. A systematic approach is needed to translate the regulatory requirements of HAZMAT transportation into specified safety measures (both technical and administrative) to support the risk management process. This study first adapts the House of Quality (HoQ) into a House of Safety (HoS) and proposes a new approach: Safety Function Deployment (SFD). The results of SFD will be used in a multi-criteria decision-support system to find an optimal route for HAZMAT transportation. The proposed approach will be demonstrated through a hypothetical transportation case in Kazakhstan.

Keywords: hazardous materials, risk assessment, risk management, quality function deployment

Procedia PDF Downloads 114
155 An Automatic Bayesian Classification System for File Format Selection

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach to the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends to an expert the file format for his institution. The proposed methods facilitate the selection of file formats and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
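
A compact sketch of the naive Bayes idea on unstructured descriptions (bag-of-words into MultinomialNB); the descriptions and labels are invented:

```python
# Naive Bayes over unstructured format descriptions: text -> format label.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

descriptions = [
    "lossless raster image wide support open specification",
    "compressed raster image lossy photographic widely adopted",
    "page layout fixed rendering print archival standard",
    "plain text markup human readable open standard",
]
formats = ["PNG", "JPEG", "PDF", "XML"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(descriptions, formats)

query = "archival fixed layout print document"
print(clf.predict([query]), clf.predict_proba([query]).round(3))
```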

Keywords: data mining, digital libraries, digital preservation, file format

Procedia PDF Downloads 473
154 Ambivalence as Ethical Practice: Methodologies to Address Noise, Bias in Care, and Contact Evaluations

Authors: Anthony Townsend, Robyn Fasser

Abstract:

While complete objectivity is a desirable scientific position from which to conduct a care and contact evaluation (CCE), it is precisely the recognition that we are inherently incapable of operating objectively that is the foundation of ethical practice and skilled assessment. Drawing upon recent research from Daniel Kahneman (2021) on the differences between noise and bias, as well as the inherent biases collectively termed 'The Elephant in the Brain' by Kevin Simler and Robin Hanson (2019, Oxford University Press), this presentation addresses the various ways in which our judgments, perceptions and even procedures can be distorted and contaminated while conducting a CCE, and also considers the value of second-order cybernetics and the psychodynamic concept of 'ambivalence' as a conceptual basis to inform our assessment methodologies, limiting such errors or at least better identifying them. Both a conceptual framework for ambivalence (our higher-order capacity to allow for the convergence and consideration of multiple emotional experiences and cognitive perceptions to inform our reasoning) and a practical methodology for assessment relying on data triangulation, Bayesian inference and hypothesis testing are presented as a means of promoting ethical practice for healthcare professionals conducting CCEs. An emphasis on widening awareness and perspective, limiting 'splitting', is demonstrated both in how this form of emotional processing plays out in alienating dynamics in families and in the assessment thereof. In addressing this concept, this presentation aims to illuminate the value of ambivalence as foundational to ethical practice for assessors.

Keywords: ambivalence, forensic, psychology, noise, bias, ethics

Procedia PDF Downloads 66
153 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records

Authors: Sara ElElimy, Samir Moustafa

Abstract:

Mobile network operators are starting to face many challenges in the digital era, especially with the high demands of customers. Since mobile network operators are considered a source of big data, traditional techniques are not effective in the new era of big data, the Internet of Things (IoT) and 5G; as a result, handling different big datasets effectively becomes a vital task for operators with the continuous growth of data and the move from Long Term Evolution (LTE) to 5G. So, there is an urgent need for effective big data analytics to predict future demands, traffic, and network performance to fulfill the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA), Bayesian-based curve fitting, and a recurrent neural network (RNN) are employed for a data-driven application for mobile network operators. The main framework includes, for each model, the identification of parameters, estimation, prediction, and a final data-driven application of this prediction to business and network performance applications. These models are applied to the Telecom Italia Big Data Challenge call detail records (CDRs) datasets. The performance of these models, assessed using specific well-known evaluation criteria, shows that ARIMA (the machine learning-based model) is more accurate as a predictive model on such a dataset than the RNN (deep learning model).
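
A fitting-and-forecasting skeleton for the ARIMA part, assuming statsmodels; the series and order (2, 1, 2) are illustrative, not the challenge data or the paper's tuning:

```python
# ARIMA skeleton for an hourly CDR-derived traffic series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
hours = pd.date_range("2013-11-01", periods=24 * 14, freq="h")
traffic = pd.Series(100 + 20 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)
                    + rng.normal(0, 3, len(hours)), index=hours)

fit = ARIMA(traffic, order=(2, 1, 2)).fit()   # identification/estimation
forecast = fit.forecast(steps=24)             # next day's hourly demand
print(forecast.head())
```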

Keywords: big data analytics, machine learning, CDRs, 5G

Procedia PDF Downloads 113
152 Intelligent Chatbot Generating Dynamic Responses Through Natural Language Processing

Authors: Aarnav Singh, Jatin Moolchandani

Abstract:

The proposed research work aims to build a query-based AI chatbot that can answer questions on any topic. A chatbot is software that converses with users via text messages. In the proposed system, the chatbot generates a response based on the user's query, using natural language processing to analyze the query and a set of texts from which a concise answer is formed. The texts are obtained through web scraping, filtering for credible sources from a web search. The objective of this project is to provide a chatbot that gives simple and accurate answers without the user having to read through a large number of articles and websites. In addition to exploring the reasons for chatbots' broad acceptance and their usefulness across many industries, this article offers an overview of the interest in chatbots throughout the world.
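
One plausible core for matching a query against scraped texts is TF-IDF retrieval; a sketch with invented passages (not the system's actual pipeline):

```python
# Rank scraped passages by TF-IDF cosine similarity and return the best one.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

passages = [                      # stand-ins for filtered, scraped text
    "Bayesian filtering updates a belief state as new evidence arrives.",
    "Web scraping collects page content for later text analysis.",
    "A chatbot converses with users through text messages.",
]

def answer(query):
    vec = TfidfVectorizer().fit(passages + [query])
    P, q = vec.transform(passages), vec.transform([query])
    scores = cosine_similarity(q, P).ravel()
    return passages[scores.argmax()]

print(answer("how does a chatbot talk to users?"))
```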

Keywords: chatbot, artificial intelligence, natural language processing, web scraping

Procedia PDF Downloads 35
151 Determining the Performance of Data Mining Algorithms in Identifying the Influential Factors and Predicting Ischemic Stroke: A Comparative Study in the Southeast of Iran

Authors: Y. Mehdipour, S. Ebrahimi, A. Jahanpour, F. Seyedzaei, B. Sabayan, A. Karimi, H. Amirifard

Abstract:

Ischemic stroke is one of the common causes of disability and mortality; it is the fourth leading cause of death in the world, and the third according to some sources. Only 1/3 of patients with ischemic stroke fully recover, 1/3 are left with permanent disability, and 1/3 die. Thus, the use of predictive models to predict stroke has a vital role in reducing the complications and costs related to this disease. The aim of this study was to specify the effective factors and predict ischemic stroke with the help of DM methods. The present study was a descriptive-analytic study. The population comprised 213 cases from among patients referred to Ali ibn Abi Talib (AS) Hospital in Zahedan. The data collection tool was a checklist with confirmed validity and reliability. This study used the decision tree DM algorithm for modeling. Data analysis was performed using SPSS-19 and SPSS Modeler 14.2. The comparison of algorithms showed that the CHAID algorithm, with 95.7% accuracy, has the best performance. Moreover, based on the model created, factors such as anemia, diabetes mellitus, hyperlipidemia, transient ischemic attacks, coronary artery disease, and atherosclerosis are the most effective factors in stroke. Decision tree algorithms, especially the CHAID algorithm, have acceptable precision and predictive ability to determine the factors affecting ischemic stroke. Thus, creating predictive models through this algorithm will play a significant role in decreasing the mortality and disability caused by ischemic stroke.
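
A decision-tree baseline sketch; note scikit-learn implements CART rather than CHAID, so this is an analogous illustration on synthetic checklist data:

```python
# CART decision tree on a binary risk-factor checklist (synthetic labels).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
n = 213                                         # matches the cohort size
# Columns: anemia, diabetes, hyperlipidemia, TIA, CAD, atherosclerosis
X = rng.integers(0, 2, size=(n, 6))
y = (X[:, [1, 3, 4]].sum(axis=1) + rng.random(n) > 2).astype(int)  # toy labels

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
print("CV accuracy:", cross_val_score(tree, X, y, cv=5).mean().round(3))
```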

Keywords: data mining, ischemic stroke, decision tree, Bayesian network

Procedia PDF Downloads 150
150 Case Study of High-Resolution Marine Seismic Survey in Shallow Water, Arabian Gulf, Saudi Arabia

Authors: Almalki M., Alajmi M., Qadrouh Y., Alzahrani E., Sulaiman A., Aleid M., Albaiji A., Alfaifi H., Alhadadi A., Almotairy H., Alrasheed R., Alhafedh Y.

Abstract:

High-resolution marine seismic surveying is a well-established technique commonly used to characterize near-surface sediments and geological structures in shallow water. We conducted a single-channel seismic survey to provide high-quality seismic images of near-surface sediments up to 100 m depth in the Jubail coastal area, Arabian Gulf. An eight-hydrophone streamer was used to collect stacked seismic traces along a 5 km seismic line. To reach the required depth, we used a sparker system that discharges energies above 5000 J, with an expected frequency output spanning the range from 200 to 2000 Hz. A suitable processing flow was implemented to enhance the signal-to-noise ratio of the seismic profile. We found that the shallow sedimentary layers at the study site have a complex pattern of reflectivity, which decays significantly due to the amount of source energy used as well as the multiples associated with the seafloor. In fact, the results reveal that single-channel marine seismic surveying in shallow water is a cost-effective technique that can easily be repeated to observe any possible changes in the physical properties of the near-surface layers.
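
The band limits quoted above suggest a band-pass over roughly 200-2000 Hz; a zero-phase Butterworth sketch on a synthetic trace (the sampling rate is an assumption):

```python
# Zero-phase Butterworth band-pass over the sparker band, trace by trace.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 8000.0                                    # sampling rate, Hz (assumed)
b, a = butter(4, [200, 2000], btype="bandpass", fs=fs)

t = np.arange(0, 0.5, 1 / fs)
trace = (np.sin(2 * np.pi * 600 * t)           # in-band reflection energy
         + 0.8 * np.sin(2 * np.pi * 40 * t)    # swell / low-frequency noise
         + 0.1 * np.random.randn(t.size))
filtered = filtfilt(b, a, trace)               # zero-phase: no event shift
print(filtered[:5])
```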

Keywords: shallow marine single-channel data, high resolution, frequency filtering, shallow water

Procedia PDF Downloads 50
149 Price Effect Estimation of Tobacco on Low-Wage Male Smokers: A Causal Mediation Analysis

Authors: Kawsar Ahmed, Hong Wang

Abstract:

The study's goal was to estimate the causal mediation effect of tobacco tax before and after price hikes among low-income male smokers, with particular emphasis on the pathway framework for estimating effects for continuous and dichotomous variables. From July to December 2021, cross-sectional observational data (n=739) were collected from Bangladeshi low-wage smokers. The quasi-Bayesian technique, a binomial probit model, and sensitivity analysis using simulation in the R 'mediation' package were used to estimate the effects. After a price rise for tobacco products, the average number of cigarette or bidi sticks consumed decreased from 6.7 to 4.56. Rising tobacco product prices have a direct effect on low-income people's decisions to quit or lessen their daily smoking habits: average causal mediation effect (ACME) [effect=2.31, 95% confidence interval (C.I.) = (4.71-0.00), p<0.01], average direct effect (ADE) [effect=8.6, 95% C.I. = (6.8-0.11), p<0.001], and overall significant effects (p<0.001). The choice to smoke tobacco is described by the mediated proportion of the income effect, which is 26.1% lower following the price rise. In the sensitivity analysis, the curves of ACME and ADE are based on observational estimates of the coefficients of determination that assess the hypothesized model as the substantial consequence of price rises. To reduce smoking behavior, price increases through taxation have a positive causal mediation with income that affects the decision to limit tobacco use and promotes healthcare policy for low-income men.
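
The ACME/ADE decomposition can be approximated by the product-of-coefficients method; a sketch with synthetic variables and far simpler models than the paper's:

```python
# Product-of-coefficients mediation sketch: price -> mediator -> sticks/day.
# ACME ~ a*b (indirect path), ADE ~ direct coefficient of price.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 739
price = rng.normal(10, 2, n)                      # treatment
mediator = 0.8 * price + rng.normal(0, 1, n)      # a: price -> mediator
sticks = 12 - 0.5 * mediator - 0.6 * price + rng.normal(0, 1, n)

a = sm.OLS(mediator, sm.add_constant(price)).fit().params[1]
fit_y = sm.OLS(sticks, sm.add_constant(np.column_stack([price, mediator]))).fit()
direct, b = fit_y.params[1], fit_y.params[2]

print(f"ACME ~ a*b = {a * b:.3f}")                # indirect (mediated) effect
print(f"ADE  ~ direct = {direct:.3f}")            # direct effect of price
```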

Keywords: causal mediation analysis, directed acyclic graphs, tobacco price policy, sensitivity analysis, pathway estimation

Procedia PDF Downloads 88
148 A Posteriori Trading-Inspired Model-Free Time Series Segmentation

Authors: Plessen Mogens Graf

Abstract:

Within the context of multivariate time series segmentation, this paper proposes a method inspired by a posteriori optimal trading. After a normalization step, time series are treated channel-wise as surrogate stock prices that can be traded optimally a posteriori in a virtual portfolio holding either stock or cash. Linear transaction costs are interpreted as hyperparameters for noise filtering. Trading signals, as well as trading signals obtained on the reversed time series, are used for unsupervised channel-wise labeling before a consensus over all channels is reached that determines the final segmentation time instants. The method is model-free in that no model prescriptions for segments are made. Benefits of the proposed approach include simplicity, computational efficiency, and adaptability to a wide range of different time series shapes. Performance is demonstrated on synthetic and real-world data, including a large-scale dataset comprising a multivariate time series of dimension 1000 and length 2709. The proposed method is compared to a popular model-based bottom-up approach fitting piecewise affine models and to a recent model-based top-down approach fitting Gaussian models, and is found to be consistently faster while producing more intuitive results in the sense of segmenting time series at peaks and valleys.
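
The a posteriori optimal trading signal can be computed by dynamic programming over two portfolio states; a simplified one-channel sketch:

```python
# A posteriori optimal trading over two states (cash, stock) with linear
# transaction cost eps; switch points act as candidate segment boundaries.
import numpy as np

def optimal_positions(price, eps=0.01):
    n = len(price)
    cash = np.zeros(n)            # best wealth if in cash at time t
    stock = np.zeros(n)           # best wealth (cash terms) if invested
    choice = np.zeros((n, 2), dtype=int)   # 1 = reached by switching states
    cash[0], stock[0] = 1.0, 1.0 - eps
    for t in range(1, n):
        r = price[t] / price[t - 1]
        sell = stock[t - 1] * r * (1 - eps)          # liquidate this step
        cash[t] = max(cash[t - 1], sell)
        choice[t, 0] = int(sell > cash[t - 1])
        buy = cash[t - 1] * (1 - eps)                # invest this step
        stock[t] = max(stock[t - 1], buy) * r
        choice[t, 1] = int(buy > stock[t - 1])
    # Backtrack the hold/cash sequence from the better terminal state.
    pos, state = [], int(stock[-1] > cash[-1])
    for t in range(n - 1, 0, -1):
        pos.append(state)
        if choice[t, state]:
            state = 1 - state
    pos.append(state)
    return pos[::-1]              # 1 = invested, 0 = cash, per time step

print(optimal_positions([1.0, 1.1, 1.3, 1.2, 1.0, 1.4]))
```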

Keywords: time series segmentation, model-free, trading-inspired, multivariate data

Procedia PDF Downloads 109