Search results for: Gaussian filter

402 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument

Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki

Abstract:

To the best of the authors' knowledge, this work is the first experimental demonstration of a quantified correlation between real-time measured optical features of ambient aerosol and off-line measured toxicity data. Based on these correlations, we present a novel methodology for the real-time characterisation of ambient toxicity using multi-wavelength, aerosol-phase photoacoustic measurements. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in present-day climate science. Beyond its climatic impact, atmospheric soot is also an important air pollutant that harms human health. According to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source, and from a health perspective it is one of the most harmful atmospheric constituents. Despite its importance, no generally accepted standard methodology for the quantitative determination of ambient toxicity is available yet. Ambient toxicology measurements are dominantly based on the posterior analysis of filter-accumulated aerosol, with limited time resolution. Most toxicological studies rely on operational definitions using different measurement protocols; therefore, a comprehensive analysis of the existing data sets is often severely limited. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be significantly masked by the actual ambient conditions. Therefore, improving the time resolution of the existing methodology and developing real-time methods for air-quality monitoring are pressing issues in air pollution research. During the last decades, many experimental studies have verified that there is a relation between the chemical composition of carbonaceous particulate matter and its absorption features, quantified by the Absorption Angström Exponent (AAE). Although the scientific community agrees that PhotoAcoustic Spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol accurately and reliably, multi-wavelength PAS instruments able to selectively characterise the wavelength dependence of absorption have become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. We demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), and a scanning mobility particle sizer with optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also present the results of eco-, cyto- and genotoxicity measurements based on the posterior analysis of filter-accumulated aerosol with 6 h time resolution. We demonstrate a diurnal variation of the toxicities and of the AAE data deduced directly from the multi-wavelength absorption measurements.
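
As background (not part of the authors' text), the Absorption Angström Exponent used here is conventionally obtained from a power-law fit of the absorption coefficient versus wavelength; a minimal two-wavelength form is:

```latex
% Conventional two-wavelength definition of the Absorption Angstrom Exponent (AAE),
% assuming a power-law spectral dependence of the absorption coefficient.
\[
b_{\mathrm{abs}}(\lambda) \propto \lambda^{-\mathrm{AAE}}
\quad\Longrightarrow\quad
\mathrm{AAE} = -\,\frac{\ln\!\big(b_{\mathrm{abs}}(\lambda_1)/b_{\mathrm{abs}}(\lambda_2)\big)}{\ln(\lambda_1/\lambda_2)}
\]
```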

Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames-test

Procedia PDF Downloads 304
401 Local Interpretable Model-agnostic Explanations (LIME) Approach to Email Spam Detection

Authors: Rohini Hariharan, Yazhini R., Blessy Maria Mathew

Abstract:

The task of detecting email spam is a very important one in the era of digital technology and requires effective ways of curbing unwanted messages. This paper presents an approach aimed at making email spam categorization algorithms transparent, reliable and more trustworthy by incorporating Local Interpretable Model-agnostic Explanations (LIME). Our technique provides interpretable explanations for individual email classifications, helping users understand the decision-making process of the model. In this study, we developed a complete pipeline that incorporates LIME into the spam classification framework and allows creating simplified, interpretable models tailored to individual emails. LIME identifies influential terms, pointing out key elements that drive classification results, thus reducing the opacity inherent in conventional machine learning models. Additionally, we suggest a visualization scheme for displaying keywords that improves users' understanding of categorization decisions. We test our method on a diverse email dataset and compare its performance with various baseline models, such as Gaussian Naive Bayes, Multinomial Naive Bayes, Bernoulli Naive Bayes, Support Vector Classifier, K-Nearest Neighbors, Decision Tree, and Logistic Regression. Our testing results show that our model surpasses all other models, achieving an accuracy of 96.59% and a precision of 99.12%.
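
For orientation, a minimal sketch of the kind of pipeline described above (TF-IDF features, a logistic-regression classifier and a LIME text explainer) might look as follows; the toy data, feature settings and number of explained terms are illustrative assumptions, not the authors' exact configuration.

```python
# Hedged sketch: LIME explanations for a spam classifier (not the authors' exact pipeline).
from lime.lime_text import LimeTextExplainer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data (placeholder for a real labelled email corpus).
emails = ["win a free prize now", "meeting agenda for tomorrow",
          "cheap loans click here", "please review the attached report"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = ham

pipeline = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
pipeline.fit(emails, labels)

explainer = LimeTextExplainer(class_names=["ham", "spam"])
explanation = explainer.explain_instance(
    "click here to win a free prize",   # email to explain
    pipeline.predict_proba,             # black-box probability function
    num_features=5)                     # top influential terms
print(explanation.as_list())            # (term, weight) pairs driving the decision
```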

Keywords: text classification, LIME (local interpretable model-agnostic explanations), stemming, tokenization, logistic regression

Procedia PDF Downloads 48
400 Long-Term Exposure Assessments for Cooking Workers Exposed to Polycyclic Aromatic Hydrocarbons and Aldehydes Containing in Cooking Fumes

Authors: Chun-Yu Chen, Kua-Rong Wu, Yu-Cheng Chen, Perng-Jy Tsai

Abstract:

Cooking fumes are known to contain polycyclic aromatic hydrocarbons (PAHs) and aldehydes, some of which have been proven carcinogenic or possibly carcinogenic to humans. Considering their chronic health effects, long-term exposure data are required for assessing cooking workers’ lifetime health risks. Previous exposure assessment studies, due to both time and cost constraints, were mostly based on cross-sectional data. Therefore, establishing long-term exposure data has become an important issue for conducting health risk assessments for cooking workers. An approach is proposed in this study. Here, the generation rates of both PAHs and aldehydes from a cooking process were determined by placing a sampling train directly under the exhaust fan, under the total enclosure condition and under the normal operating condition, respectively. Subtracting the concentration collected under the latter condition (representing the hood-collected concentration) from that collected under the former (representing the total emitted concentration), the fugitive emission concentration was determined. The above data were further converted into generation rates based on the flow rates specified for the exhaust fan. The determinations of the above generation rates were conducted in a testing chamber with a selected cooking process (deep-frying chicken nuggets in 3 L of peanut oil at 200°C). The sampling train installed under the exhaust fan consisted of an IOM inhalable sampler with a glass fiber filter for collecting particle-phase PAHs, followed by an XAD-2 tube for gas-phase PAHs. The same arrangement was also used to sample aldehydes, but with a DNPH-pre-coated filter followed by a 2,4-DNPH cartridge for collecting particle-phase and gas-phase aldehydes, respectively. PAH and aldehyde samples were analyzed by GC/MS-MS (Agilent 7890B) and HPLC-UV (HITACHI L-7100), respectively. The obtained generation rates of both PAHs and aldehydes were applied to a near-field/far-field exposure model to estimate the exposures of cooks (the estimated near-field concentration) and helpers (the estimated far-field concentration). For validation purposes, PAH and aldehyde samples were collected simultaneously, using the same sampling train, at the near-field and far-field sites of the testing chamber. The sampling results, together with the use of a mixed-effect model, were used to calibrate the estimated near-field/far-field exposures. In the present study, the obtained emission rates were further converted to emission factors for both PAHs and aldehydes according to the amount of cooking oil consumed. Applying the long-term cooking oil consumption records, the emission rates for both PAHs and aldehydes were determined, and long-term exposure databanks for cooks (the estimated near-field concentrations) and helpers (the estimated far-field concentrations) were then established. Results show that the proposed approach was adequate for determining the generation rates of both PAHs and aldehydes under various exhaust flow rate conditions. The estimated near-field/far-field exposures, though significantly different from those obtained in the field, can be calibrated using the mixed-effect model. Finally, the established long-term databank could provide a useful basis for conducting long-term exposure assessments for cooking workers exposed to PAHs and aldehydes.
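
For context, the near-field/far-field (two-zone) exposure model referred to above is commonly written with a generation rate G, an inter-zone airflow β and a room ventilation rate Q; a minimal steady-state sketch under these standard assumptions (not necessarily the authors' exact parameterisation) is:

```python
# Hedged sketch: steady-state two-zone (near-field/far-field) exposure model.
# Standard textbook form; parameter values are illustrative assumptions.

def two_zone_steady_state(G, beta, Q):
    """G: contaminant generation rate (mg/min),
    beta: near-field/far-field air exchange rate (m^3/min),
    Q: room ventilation rate (m^3/min).
    Returns steady-state near-field and far-field concentrations (mg/m^3)."""
    c_far = G / Q                 # far-field: dilution by room ventilation only
    c_near = c_far + G / beta     # near-field: additional burden from limited local mixing
    return c_near, c_far

# Example: 2 mg/min source, 5 m^3/min inter-zone flow, 20 m^3/min room ventilation.
c_nf, c_ff = two_zone_steady_state(G=2.0, beta=5.0, Q=20.0)
print(f"near-field ~ {c_nf:.2f} mg/m^3, far-field ~ {c_ff:.2f} mg/m^3")
```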

Keywords: aldehydes, cooking oil fumes, long-term exposure assessment, modeling, polycyclic aromatic hydrocarbons (PAHs)

Procedia PDF Downloads 142
399 Educating Children with the Child-Friendly Smartphone Operation System

Authors: Wildan Maulana Wildan, Siti Annisa Rahmayani Icha

Abstract:

Nowadays, advances in information technology are needed by everyone to ease their work, but it is also worth considering how these advances are introduced into the world of children. Before technology grew so rapidly, children were busy with various traditional games and had a high level of socialization. After its arrival, almost all children spend most of their time playing with gadgets, which can affect their education and change their character and personality. However, children cannot be separated from technology either, because technology broadens their knowledge and insight. Since the world cannot be separated from technological advances, and neither can children, a child-friendly smartphone operating system should be developed. Such an operating system would be able to filter content that is not suitable for children; it would also include reminders of study time, prayer time and play time, as well as interactive content that supports the development of children's education and character. Children need technology, and there are appropriate ways to introduce it to them; we must look at the characteristics of children in different environments. In this way, advances in technology can benefit children and their parents, and educators do not have to worry about them. We should take advantage of technological advances in the best possible way.

Keywords: information technology, smartphone operating system, education, character

Procedia PDF Downloads 515
398 Study of Interaction between Ascorbic Acid and Bovine Hemoglobin by Multispectroscopic Methods

Authors: Krishnamoorthy Shanmugaraj, Malaichamy Ilanchelian

Abstract:

Ascorbic acid is an essential component of the human diet and is also a long-used pharmaceutical agent. In the present contribution, we have carried out a detailed study on the binding interaction of ascorbic acid (AA) with bovine hemoglobin (BHb) using steady-state emission, time-resolved fluorescence, UV-Vis absorption, circular dichroism (CD), Fourier transform infrared (FT-IR) and three-dimensional (3D) emission spectral studies. The results from the emission spectral studies revealed that the quenching of BHb emission by AA is attributed to the formation of a complex in the ground state (static in nature), after correcting for the inner filter effect. The binding parameters calculated from the corrected emission quenching data revealed that BHb exhibits a significant binding affinity towards AA. Moreover, AA-induced tertiary and secondary conformational changes of BHb were monitored by UV-Vis absorption, CD, FT-IR and 3D emission spectral studies. The results presented here will help to further understand the probable mechanism of the BHb-AA system, which is expected to provide insights into conformational and microenvironmental changes of BHb.
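
For reference (an assumption about the standard procedure, not a detail taken from the abstract), fluorescence intensities are commonly corrected for the inner filter effect using the absorbances at the excitation and emission wavelengths:

```latex
% Commonly used inner filter effect correction, where A_ex and A_em are the
% absorbances of the sample at the excitation and emission wavelengths.
\[
F_{\mathrm{corr}} = F_{\mathrm{obs}} \times 10^{\,(A_{\mathrm{ex}} + A_{\mathrm{em}})/2}
\]
```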

Keywords: ascorbic acid, bovine hemoglobin, circular dichroism, three dimensional emission spectral studies

Procedia PDF Downloads 979
397 Fast Robust Switching Control Scheme for PWR-Type Nuclear Power Plants

Authors: Piyush V. Surjagade, Jiamei Deng, Paul Doney, S. R. Shimjith, A. John Arul

Abstract:

In sophisticated and complex systems such as nuclear power plants, maintaining the system's stability in the presence of uncertainties and disturbances and obtaining a fast dynamic response are the most challenging problems. Thus, to ensure the satisfactory and safe operation of nuclear power plants, this work proposes a new fast, robust optimal switching control strategy for pressurized water reactor-type nuclear power plants. The proposed control strategy guarantees a substantial degree of robustness, fast dynamic response over the entire operational envelope, and optimal performance during the nominal operation of the plant. To improve the robustness, obtain a fast dynamic response, and make the system optimal, a bank of controllers is designed. Various controllers, like a baseline proportional-integral-derivative controller, an optimal linear quadratic Gaussian controller, and a robust adaptive L1 controller, are designed to perform distinct tasks in a specific situation. At any instant of time, the most suitable controller from the bank of controllers is selected using the switching logic unit that designates the controller by monitoring the health of the nuclear power plant or transients. The proposed switching control strategy optimizes the overall performance and increases operational safety and efficiency. Simulation studies have been performed considering various uncertainties and disturbances that demonstrate the applicability and effectiveness of the proposed switching control strategy over some conventional control techniques.

Keywords: switching control, robust control, optimal control, nuclear power control

Procedia PDF Downloads 137
396 Spatial Analysis of Flood Vulnerability in Highly Urbanized Area: A Case Study in Taipei City

Authors: Liang Weichien

Abstract:

Without adequate information and mitigation plans for natural disasters, the risk to urban populated areas will increase in the future as populations grow, especially in Taiwan. Taiwan is recognized as one of the world's high-risk areas, with an average of 5.7 floods per year, and should therefore seek to strengthen coherence and consensus on how cities can plan for floods and climate change. This study aims at understanding the vulnerability to flooding in Taipei City, Taiwan, by creating indicators and calculating the vulnerability of each study unit. The indicators were grouped into sensitivity and adaptive capacity, based on the definition of vulnerability of the Intergovernmental Panel on Climate Change, and were weighted using Principal Component Analysis. However, previous research has been based on the assumption that the composition and influence of the indicators are the same in different areas. This disregards spatial correlation, which might result in an inaccurate explanation of local vulnerability. This study therefore used Geographically Weighted Principal Component Analysis, adding a geographic weighting matrix, to capture the different main flood impact characteristics in different areas. The cross-validation method and the Akaike Information Criterion were used to decide the bandwidth, with a Gaussian pattern as the bandwidth weighting scheme. The ultimate outcome can be used to reduce damage potential by integrating the outputs into local mitigation plans and urban planning.

Keywords: flood vulnerability, geographically weighted principal components analysis, GWPCA, highly urbanized area, spatial correlation

Procedia PDF Downloads 286
395 The Use of X-Ray Computed Microtomography in Petroleum Geology: A Case Study of Unconventional Reservoir Rocks in Poland

Authors: Tomasz Wejrzanowski, Łukasz Kaczmarek, Michał Maksimczuk

Abstract:

High-resolution X-ray computed microtomography (µCT) is a non-destructive technique commonly used to determine the internal structure of reservoir rock samples. This study concerns the µCT analysis of Silurian and Ordovician shales and mudstones from a borehole in the Baltic Basin, in the north of Poland. The spatial resolution of the µCT images obtained was 27 µm, which enabled the authors to create accurate 3-D visualizations and to calculate the ratio of pore and fracture volume to the total sample volume. A total of 1024 µCT slices were used to create a 3-D volume of the sample structure geometry. These µCT slices were processed to obtain a clearly visible image and the volume ratio. A copper X-ray source filter was used to reduce image artifacts. Owing to the careful technical settings of the µCT, it was possible to obtain high-resolution 3-D µCT images of samples with low X-ray transparency. The presented results confirm the utility of µCT implementations in geoscience and show that µCT still has promising applications for reservoir exploration and characterization.
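
As an illustration of the volume-ratio calculation mentioned above, a minimal sketch (with an assumed global grey-level threshold; the authors' actual segmentation workflow is not specified) could look like this:

```python
# Hedged sketch: pore/fracture volume fraction from a stack of µCT slices.
# The threshold value and synthetic data are illustrative assumptions.
import numpy as np

def pore_volume_fraction(slices, threshold):
    """slices: stack of 2-D grey-level images forming the 3-D volume.
    Voxels darker than `threshold` are counted as pores or fractures."""
    volume = np.stack(slices, axis=0)          # build the 3-D volume (z, y, x)
    pore_mask = volume < threshold             # simple global segmentation
    return pore_mask.sum() / pore_mask.size    # ratio of pore voxels to all voxels

# Example with synthetic data standing in for 1024 reconstructed slices.
rng = np.random.default_rng(0)
fake_slices = rng.integers(0, 255, size=(1024, 128, 128))
print(f"pore/fracture volume fraction ~ {pore_volume_fraction(fake_slices, 40):.3f}")
```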

Keywords: fractures, material density, pores, structure

Procedia PDF Downloads 257
394 Detection and Tracking for the Protection of the Elderly and Socially Vulnerable People in the Video Surveillance System

Authors: Mobarok Hossain Bhuyain

Abstract:

Video surveillance processing has attracted interest from various security fields, turning it into one of the leading research areas. Today's demand for the detection and tracking of human mobility is very useful for human security, particularly in crowded areas. Accordingly, video surveillance technology has advanced rapidly in recent years, with algorithms automatically analyzing the behavior of people under surveillance. The main motivation of this research is the detection and tracking of elderly and socially vulnerable people in crowded areas. Physical degeneration is a major health concern, especially for elderly people and socially vulnerable people. One major disadvantage of video surveillance is the need for continuous monitoring, especially in crowded areas. To assist security staff in monitoring live surveillance video, image processing and artificial intelligence methods can be used to automatically send warning signals to the monitoring officers about elderly and socially vulnerable people.

Keywords: human detection, target tracking, neural network, particle filter

Procedia PDF Downloads 166
393 Estimation of Endogenous Brain Noise from Brain Response to Flickering Visual Stimulation Magnetoencephalography Visual Perception Speed

Authors: Alexander N. Pisarchik, Parth Chholak

Abstract:

Intrinsic brain noise was estimated via magnetoencephalograms (MEG) recorded during the perception of flickering visual stimuli with frequencies of 6.67 and 8.57 Hz. First, we measured the mean phase difference between the flicker signal and the steady-state event-related field (SSERF) in the occipital area, where the brain response at the flicker frequencies and their harmonics appeared in the power spectrum. Then, we calculated the probability distribution of the phase fluctuations in the regions of frequency locking and computed its kurtosis. Since kurtosis is a measure of the distribution’s sharpness, we suppose that inverse kurtosis is related to intrinsic brain noise. In our experiments, the kurtosis value varied among subjects from K = 3 to K = 5 for 6.67 Hz and from 2.6 to 4 for 8.57 Hz. The majority of subjects demonstrated leptokurtic distributions (K > 3), i.e., the distribution tails approached zero more slowly than a Gaussian. In addition, we found a strong correlation between kurtosis and brain complexity measured as the correlation dimension, so that the MEGs of subjects with higher kurtosis exhibited lower complexity. The obtained results are discussed in the framework of nonlinear dynamics and complex network theories. Specifically, in a network of coupled oscillators, phase synchronization is mainly determined by two antagonistic factors, noise and the coupling strength. While noise worsens phase synchronization, the coupling improves it. If we assume that each neuron and each synapse contribute to brain noise, a larger neuronal network should have stronger noise, and therefore phase synchronization should be worse, which results in a smaller kurtosis. The described method for brain noise estimation can be useful for the diagnostics of some brain pathologies associated with abnormal brain noise.
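
As a sketch of the phase-statistics step described above (the exact preprocessing pipeline is not given in the abstract, so the Hilbert-transform approach and the parameters here are assumptions):

```python
# Hedged sketch: kurtosis of the phase difference between a flicker reference
# and a narrow-band MEG response; preprocessing details are assumed.
import numpy as np
from scipy.signal import hilbert
from scipy.stats import kurtosis

fs, f_stim = 1000.0, 6.67                   # sampling rate (Hz), flicker frequency (Hz)
t = np.arange(0, 30, 1 / fs)                # 30 s of data
stimulus = np.sin(2 * np.pi * f_stim * t)   # flicker reference signal
meg = np.sin(2 * np.pi * f_stim * t + 0.4 * np.random.randn(t.size))  # toy MEG response

phase_stim = np.angle(hilbert(stimulus))
phase_meg = np.angle(hilbert(meg))
dphi = np.angle(np.exp(1j * (phase_meg - phase_stim)))   # wrapped phase difference

# Pearson kurtosis (fisher=False), so a Gaussian distribution gives K = 3.
print("kurtosis K =", kurtosis(dphi, fisher=False))
```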

Keywords: brain, flickering, magnetoencephalography, MEG, visual perception, perception time

Procedia PDF Downloads 150
392 A Hybrid Fuzzy Clustering Approach for Fertile and Unfertile Analysis

Authors: Shima Soltanzadeh, Mohammad Hosain Fazel Zarandi, Mojtaba Barzegar Astanjin

Abstract:

Diagnosing male infertility through laboratory tests is expensive and sometimes intolerable for patients. Filling out a questionnaire and then applying a classification method can be the first step in the decision-making process, so that laboratory tests are used only in cases with a high probability of infertility. In this paper, we evaluated the performance of four classification methods, namely naive Bayesian, neural network, logistic regression and fuzzy c-means clustering used as a classifier, in the diagnosis of male infertility due to environmental factors. Since the data are unbalanced, ROC curves are the most suitable method for the comparison. We also selected the most important features using a filtering method and examined the impact of this feature reduction on the performance of each method; in general, most of the methods performed better after applying the filter. We have shown that using fuzzy c-means clustering as a classifier gives good performance according to the ROC curves, comparable to other classification methods such as logistic regression.

Keywords: classification, fuzzy c-means, logistic regression, Naive Bayesian, neural network, ROC curve

Procedia PDF Downloads 340
391 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structure, grouping similar data objects in the same cluster and dissimilar objects in different clusters. Clustering methods are widely used in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem, in which the minimization of a given cost function is studied. This minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function taking values in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object’s memberships over all clusters equals one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has been applied to the fuzzy c-means clustering technique; it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Our objective is to find an appropriate membership degree for each data object, because appropriate membership degrees lead to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
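
For reference, the standard fuzzy c-means objective that the abstract alludes to can be written as below; the entropy-regularized variant adds a penalty of the general form shown, though the exact regularizer used by the authors is not specified here and is an assumption.

```latex
% Standard FCM objective with fuzzifier m > 1, memberships u_{ik} and centers v_i,
% subject to the usual constraint that memberships of each point sum to one.
\[
J_m(U, V) \;=\; \sum_{k=1}^{N}\sum_{i=1}^{C} u_{ik}^{\,m}\,\lVert x_k - v_i\rVert^2,
\qquad \sum_{i=1}^{C} u_{ik} = 1 .
\]
% A generic relative-entropy regularization adds a term of the form
\[
J(U, V) \;=\; \sum_{k=1}^{N}\sum_{i=1}^{C} u_{ik}\,\lVert x_k - v_i\rVert^2
\;+\; \lambda \sum_{k=1}^{N}\sum_{i=1}^{C} u_{ik}\,\ln\frac{u_{ik}}{\pi_{ik}},
\]
% where \pi_{ik} is a reference membership and \lambda > 0 controls the regularization
% strength (illustrative form, not necessarily the authors' exact model).
```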

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 259
390 Principal Component Regression in Amylose Content on the Malaysian Market Rice Grains Using Near Infrared Reflectance Spectroscopy

Authors: Syahira Ibrahim, Herlina Abdul Rahim

Abstract:

The amylose content is an essential element in determining the texture and taste of rice grains. This paper evaluates the use of VIS-SWNIRS in estimating the amylose content of seven varieties of rice grains available on the Malaysian market. Each variety consists of 30 samples, and all samples are scanned using the spectroscope to obtain spectra in the 680-1000 nm range. The Savitzky-Golay (SG) smoothing filter is applied to each sample’s data before the Principal Component Regression (PCR) technique is used to examine the data and produce a single value for each sample. This value is then compared with reference values obtained from the standard iodine colorimetric test in terms of the coefficient of determination, R2. Results show that this technique produced low R2 values of less than 0.50. In order to improve the results, the scanned range should be extended to include 1100-2500 nm and the number of samples processed should also be increased.
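
A minimal sketch of the Savitzky-Golay smoothing plus principal component regression workflow described above (the window length, polynomial order, number of components and toy data are illustrative assumptions, not the authors' exact settings):

```python
# Hedged sketch: Savitzky-Golay smoothing followed by principal component regression.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
spectra = rng.normal(size=(210, 160))     # 7 varieties x 30 samples, 160 wavelengths (toy)
amylose = rng.uniform(10, 30, size=210)   # reference values from the iodine test (toy)

smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)

pcr = make_pipeline(PCA(n_components=8), LinearRegression())
pcr.fit(smoothed, amylose)
print("R^2 =", r2_score(amylose, pcr.predict(smoothed)))
```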

Keywords: amylose content, diffuse reflectance, Malaysia rice grain, principal component regression (PCR), Visible and Shortwave near-infrared spectroscopy (VIS-SWNIRS)

Procedia PDF Downloads 382
389 Integrated Two Stage Processing of Biomass Conversion to Hydroxymethylfurfural Esters Using Ionic Liquid as Green Solvent and Catalyst: Synthesis of Mono Esters

Authors: Komal Kumar, Sreedevi Upadhyayula

Abstract:

In this study, a two-stage process was established for the synthesis of HMF esters using an ionic liquid acid catalyst. Ionic liquid catalysts with different Bronsted acidity strengths were prepared in the laboratory and characterized using 1H NMR, FT-IR, and 13C NMR spectroscopy. A solid acid catalyst was prepared from the ionic liquid catalyst using the immobilization method. The acidity of the synthesized acid catalysts was measured using the Hammett function and a titration method. Catalytic performance was evaluated for the conversion of biomass to 5-hydroxymethylfurfural (5-HMF) and levulinic acid (LA) in a methyl isobutyl ketone (MIBK)-water biphasic system. Good yields of 5-HMF and LA were found at different MIBK:water compositions. For an MIBK:water ratio of 10:1, a good yield of 5-HMF was observed at a temperature of 150˚C. Upgrading of 5-HMF into monoesters through the reaction of 5-HMF with biomass-derived monoacids was performed. The ionic liquid catalyst bearing an -SO₃H functional group was found to be more efficient than the solid acid catalyst for both the esterification reaction and the biomass conversion. A good yield of 5-HMF esters with high 5-HMF conversion was obtained at 105˚C using the most active catalyst. In this process, process A was the hydrothermal conversion of cellulose and monomer into 5-HMF and LA using the acid catalyst, and process B was the subsequent esterification using a similar acid catalyst. All the 5-HMF monoesters synthesized here can be used in the chemical and pharmaceutical industries and as cross-linkers for adhesives or coatings. A theoretical density functional theory (DFT) study for the optimization of the ionic liquid structure was performed using the Gaussian 09 program to find the minimum energy configuration of the ionic liquid catalyst.

Keywords: biomass conversion, 5-HMF, Ionic liquid, HMF ester

Procedia PDF Downloads 253
388 Programming Language Extension Using Structured Query Language for Database Access

Authors: Chapman Eze Nnadozie

Abstract:

Relational databases constitute a very vital tool for the effective management and administration of both personal and organizational data. Data access ranges from single-user database management software to more complex distributed server systems. This paper appraises the use of a programming language extension such as structured query language (SQL) to establish links to a relational database (Microsoft Access 2013) from the Visual C++ 9 programming environment. The methodology used involves the creation of tables to form a database using Microsoft Access 2013, which is Object Linking and Embedding (OLE) database compliant. SQL commands are used to query the tables in the database for easy extraction of the expected records inside the Visual C++ environment. The findings of this paper reveal that records can easily be accessed and manipulated to filter exactly what the user wants, such as retrieval of records with specified criteria, updating of records, and deletion of part or all of the records in a table.

Keywords: data access, database, database management system, OLE, programming language, records, relational database, software, SQL, table

Procedia PDF Downloads 187
387 Analytical Description of Disordered Structures in Continuum Models of Pattern Formation

Authors: Gyula I. Tóth, Shaho Abdalla

Abstract:

Even though numerical simulations indeed have a significant precursory/supportive role in exploring the disordered phase displaying no long-range order in pattern formation models, studying the stability properties of this phase and determining the order of the ordered-disordered phase transition in these models necessitate an analytical description of the disordered phase. First, we will present the results of a comprehensive statistical analysis of a large number (1,000-10,000) of numerical simulations in the Swift-Hohenberg model, where the bulk disordered (or amorphous) phase is stable. We will show that the average free energy density (over configurations) converges, while the variance of the energy density vanishes with increasing system size in the numerical simulations, which suggests that the disordered phase is a thermodynamic phase (i.e., its properties are independent of the configuration in the macroscopic limit). Furthermore, the structural analysis of this phase in Fourier space suggests that the phase can be modeled by a colored isotropic Gaussian noise, where any instant of the noise describes a possible configuration. Based on these results, we developed a general mathematical framework for finding a pool of solutions to partial differential equations in the sense of a continuous probability measure, which we will present briefly. Applying the general idea to the Swift-Hohenberg model, we show that the amorphous phase can be found and its properties determined analytically. As the general mathematical framework is not restricted to continuum theories, we hope that the proposed methodology will open a new chapter in studying disordered phases.
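
For readers unfamiliar with it, a common form of the Swift-Hohenberg model referred to above (given here as general background; the authors' precise parameterisation is not stated in the abstract) is the gradient flow of a free energy functional:

```latex
% A common form of the Swift-Hohenberg free energy and its gradient-flow dynamics,
% with reduced control parameter r and scalar order parameter psi.
\[
F[\psi] = \int \left[ \tfrac{1}{2}\,\psi \left( -r + \big(1 + \nabla^2\big)^2 \right) \psi
          + \tfrac{1}{4}\,\psi^4 \right] \mathrm{d}\mathbf{x},
\qquad
\frac{\partial \psi}{\partial t} = -\frac{\delta F}{\delta \psi}
 = r\,\psi - \big(1 + \nabla^2\big)^2 \psi - \psi^3 .
\]
```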

Keywords: fundamental theory, mathematical physics, continuum models, analytical description

Procedia PDF Downloads 135
386 Development of an Automatic Sequential Extraction Device for Pu and Am Isotopes in Radioactive Waste Samples

Authors: Myung Ho Lee, Hee Seung Lim, Young Jae Maeng, Chang Hoon Lee

Abstract:

This study presents an automatic sequential extraction device for Pu and Am isotopes in radioactive waste samples from a nuclear power plant, using an anion exchange resin and TRU resin. After the radionuclides were leached from the radioactive waste samples with concentrated HCl and HNO₃, the leaching solution was filtered through a 0.45 micron filter and the sample was evaporated to dryness. The Pu isotopes were separated in HNO₃ medium with the anion exchange resin. From the leaching solution that passed through the anion exchange column, the Am isotopes were then sequentially separated with the TRU resin. An automatic sequential extraction device with built-in software controlling the separation of the Pu and Am isotopes was developed. The purified Pu and Am isotopes were each measured by alpha spectrometry after neodymium micro-precipitation. The Pu and Am isotope data obtained for radioactive waste with the automatic sequential extraction device developed in this study were validated against an ICP-MS system.

Keywords: automatic sequential extraction device, Pu isotopes, Am isotopes, alpha spectrometer, radioactive waste samples, ICP-MS system

Procedia PDF Downloads 77
385 Hybrid Approximate Structural-Semantic Frequent Subgraph Mining

Authors: Montaceur Zaghdoud, Mohamed Moussaoui, Jalel Akaichi

Abstract:

Frequent subgraph mining usually relies on graph matching and is widely used when analyzing big data with large graphs. Many research works have dealt with exact or inexact structural graph matching, but little attention has been paid to semantic matching when graph vertices and/or edges are attributed and typed. It therefore seems very interesting to integrate background knowledge into the analysis, so that the extracted frequent subgraphs become further pruned by applying a new semantic filter instead of using only structural similarity in the graph matching process. Consequently, this paper focuses on developing a new hybrid approximate structural-semantic graph matching approach to discover a set of frequent subgraphs. It simultaneously uses an approximate structural similarity function based on the graph edit distance and a possibilistic vertex similarity function based on an affinity function. The structural and semantic filters contribute together to prune the extracted frequent set. The new hybrid structural-semantic frequent subgraph mining approach is suitable for several applications, such as community detection in social networks.

Keywords: approximate graph matching, hybrid frequent subgraph mining, graph mining, possibility theory

Procedia PDF Downloads 405
384 Evaluation of Fusion Sonar and Stereo Camera System for 3D Reconstruction of Underwater Archaeological Object

Authors: Yadpiroon Onmek, Jean Triboulet, Sebastien Druon, Bruno Jouvencel

Abstract:

The objective of this paper is to develop a 3D underwater reconstruction of archaeological objects based on the fusion of a sonar system and a stereo camera system. The underwater images are obtained from a calibrated camera system. Multiple image pairs are used as input, and we first address the image processing problem by applying well-known filters to improve the quality of the underwater images. The features of interest between image pairs are selected using well-known methods: a FAST detector and FLANN-based descriptor matching. Subsequently, the RANSAC method is applied to reject outlier points. The putative inliers are matched by triangulation to produce local sparse point clouds in 3D space, using a pinhole camera model and Euclidean distance estimation. The SfM technique is used to obtain the global sparse point clouds. Finally, the ICP method is used to fuse the sonar information with the stereo model. The final 3D models were validated by comparing measurements against the real object.
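
A minimal OpenCV sketch of the feature-matching and outlier-rejection stage described above (here using ORB descriptors on FAST-style keypoints, an LSH-based FLANN matcher and synthetic stand-in images; these specific choices are assumptions, since the abstract does not fix the descriptor):

```python
# Hedged sketch: feature matching with FLANN and RANSAC outlier rejection (OpenCV).
# Descriptor choice (ORB), matcher parameters and images are illustrative assumptions.
import cv2
import numpy as np

rng = np.random.default_rng(0)
img1 = (rng.random((480, 640)) * 255).astype(np.uint8)   # stand-in for an underwater frame
img2 = np.roll(img1, 5, axis=1)                          # second view: shifted copy

orb = cv2.ORB_create(nfeatures=2000)                     # FAST keypoints + binary descriptors
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# FLANN matcher configured for binary descriptors (LSH index).
flann = cv2.FlannBasedMatcher(
    dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1), dict(checks=50))
matches = flann.knnMatch(des1, des2, k=2)

good = [m for pair in matches if len(pair) == 2
        for m, n in [pair] if m.distance < 0.7 * n.distance]   # Lowe ratio test

if len(good) >= 8:
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    # RANSAC rejects outliers while estimating the fundamental matrix.
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    print("inliers kept:", int(inlier_mask.sum()), "of", len(good))
```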

Keywords: 3D reconstruction, archaeology, fusion, stereo system, sonar system, underwater

Procedia PDF Downloads 299
383 Long Wavelength Coherent Pulse of Sound Propagating in Granular Media

Authors: Rohit Kumar Shrivastava, Amalia Thomas, Nathalie Vriend, Stefan Luding

Abstract:

A mechanical wave or vibration propagating through granular media exhibits a specific signature in time. A coherent pulse or wavefront arrives first with multiply scattered waves (coda) arriving later. The coherent pulse is micro-structure independent i.e. it depends only on the bulk properties of the disordered granular sample, the sound wave velocity of the granular sample and hence bulk and shear moduli. The coherent wavefront attenuates (decreases in amplitude) and broadens with distance from its source. The pulse attenuation and broadening effects are affected by disorder (polydispersity; contrast in size of the granules) and have often been attributed to dispersion and scattering. To study the effect of disorder and initial amplitude (non-linearity) of the pulse imparted to the system on the coherent wavefront, numerical simulations have been carried out on one-dimensional sets of particles (granular chains). The interaction force between the particles is given by a Hertzian contact model. The sizes of particles have been selected randomly from a Gaussian distribution, where the standard deviation of this distribution is the relevant parameter that quantifies the effect of disorder on the coherent wavefront. Since, the coherent wavefront is system configuration independent, ensemble averaging has been used for improving the signal quality of the coherent pulse and removing the multiply scattered waves. The results concerning the width of the coherent wavefront have been formulated in terms of scaling laws. An experimental set-up of photoelastic particles constituting a granular chain is proposed to validate the numerical results.
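
To make the setup concrete, a minimal sketch of a polydisperse granular chain with Hertzian contacts integrated by velocity Verlet might look like the following; the stiffness, masses, disorder level and time step are illustrative assumptions rather than the simulation parameters actually used.

```python
# Hedged sketch: 1-D granular chain with Hertzian contacts (F ~ overlap^(3/2)),
# Gaussian polydispersity in particle radii, velocity-Verlet integration.
import numpy as np

rng = np.random.default_rng(0)
n, k, dt, steps = 64, 1.0e4, 1.0e-4, 2000           # particles, stiffness, time step, steps
radii = 0.5 * (1 + 0.1 * rng.standard_normal(n))    # 10% polydispersity (illustrative)
mass = radii**3                                     # mass ~ volume at constant density
x = np.cumsum(2 * radii) - radii                    # particles initially just touching
v = np.zeros(n)
v[0] = 1.0                                          # impulse imparted at one end

def forces(x):
    f = np.zeros(n)
    overlap = (radii[:-1] + radii[1:]) - np.diff(x)      # positive when in contact
    fc = k * np.clip(overlap, 0.0, None) ** 1.5          # Hertzian contact law
    f[:-1] -= fc                                         # push left particle backwards
    f[1:] += fc                                          # push right particle forwards
    return f

a = forces(x) / mass
for _ in range(steps):                              # velocity-Verlet loop
    x += v * dt + 0.5 * a * dt**2
    a_new = forces(x) / mass
    v += 0.5 * (a + a_new) * dt
    a = a_new

print("pulse front (particle index of max |v|):", int(np.argmax(np.abs(v))))
```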

Keywords: discrete elements, Hertzian contact, polydispersity, weakly nonlinear, wave propagation

Procedia PDF Downloads 205
382 Image Classification with Localization Using Convolutional Neural Networks

Authors: Bhuyain Mobarok Hossain

Abstract:

Image classification and localization research is currently an important strategy in the field of computer vision. The evolution and advancement of deep learning and convolutional neural networks (CNNs) have greatly improved the capabilities of object detection and image-based classification. Target detection is an important research topic in computer vision, especially for video surveillance systems. To address this problem, we apply a convolutional neural network at multiple scales and multiple locations in the image using a sliding window. Most detection networks work with a bounding box around the area of interest; in contrast to such architectures, we consider the problem to be a classification problem in which each pixel of the image is a separate section. Image classification is the task of predicting a single category label for an image or a group of data points. It is part of the classification problem and covers the labels present throughout the image: an image can be classified as a day or night shot, or, likewise, images of cars and motorbikes will be automatically placed in their respective collections. Deep learning for image classification generally includes convolutional layers, and a network built from them is referred to as a convolutional neural network (CNN).

Keywords: image classification, object detection, localization, particle filter

Procedia PDF Downloads 306
381 Switched System Diagnosis Based on Intelligent State Filtering with Unknown Models

Authors: Nada Slimane, Foued Theljani, Faouzi Bouani

Abstract:

The paper addresses the problem of fault diagnosis for systems operating in several modes (normal or faulty) based on state assessment. For this purpose, we use a methodology consisting of three main processes: 1) sequential data clustering, 2) linear model regression and 3) state filtering. The Kalman filter (KF) is an algorithm that provides estimates of unknown states using a sequence of I/O measurements. Although it is an efficient technique for state estimation, it presents two main weaknesses. First, it merely predicts states without being able to isolate/classify them according to their different operating modes, whether normal or faulty. To deal with this dilemma, the KF is endowed with an extra clustering step based fully on a sequential version of the k-means algorithm. Second, to provide state estimation, the KF requires state-space models, which can be unknown. A linear regularized regression is used to identify the required models. To prove its effectiveness, the proposed approach is assessed on a simulated benchmark.
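
As a reference for the filtering step, a minimal Kalman filter predict/update cycle (generic linear-Gaussian form with placeholder matrices, not the paper's identified models) can be sketched as:

```python
# Hedged sketch: generic linear Kalman filter predict/update step.
# The model matrices below are placeholders, not the paper's identified models.
import numpy as np

def kalman_step(x, P, z, A, C, Q, R):
    """One predict/update cycle.
    x, P: prior state estimate and covariance; z: new measurement;
    A, C: state and observation matrices; Q, R: process and measurement noise."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update
    S = C @ P_pred @ C.T + R                 # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

# Toy 2-state constant-velocity model observed through its first component.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
Q, R = 0.01 * np.eye(2), np.array([[0.1]])
x, P = np.zeros(2), np.eye(2)
for z in [0.9, 2.1, 2.9, 4.2]:
    x, P = kalman_step(x, P, np.array([z]), A, C, Q, R)
print("estimated state:", x)
```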

Keywords: clustering, diagnosis, Kalman Filtering, k-means, regularized regression

Procedia PDF Downloads 184
380 Synthesis and Luminescent Properties of Barium-Europium (III) Silicate Systems

Authors: A. Isahakyan, A. Terzyan, V. Stepanyan, N. Zulumyan, H. Beglaryan

Abstract:

The involvement of silica hydrogel derived from serpentine minerals (Mg(Fe))6[Si4O10](OH)8 as a source of silicon dioxide in the SiO2–NaOH–BaCl2–H2O system results, via one-hour stirring of the boiling suspension, in the precipitation of intermediates that, on heating up to 800 °C, crystallize into a product composed of barium orthosilicate Ba2SiO4 and metasilicate BaSiO3. Based on these positive results, it was decided to adapt this approach to inserting europium (III) ions into the structure of the synthesized compounds. Intermediates previously precipitated in the silica hydrogel–NaOH–BaCl2–Eu(NO3)3 system via one-hour stirring at room temperature underwent one-hour heat treatment at different temperatures (600–1200 °C). Prior to calcination, the suspension produced in the mixer was heated on a boiling-water bath until a powder-like sample was obtained. When the silica hydrogel was metered, the SiO2 content of the silica hydrogel, which is 5.8%, was taken into consideration in order to guarantee molar ratios of both SiO2 to BaO and SiO2 to Na2O equal to 1:2. The BaCl2 and Eu(NO3)3 reagents were weighed so that the formation of the appropriate compositions was guaranteed. Samples including various concentrations of Eu3+ ions (1.25, 2.5, 3.75, 5, 6.35, 8.65, 10, 17.5, 18.75 and 20 mol%) were synthesized by the described method. Luminescence excitation and emission spectra of the products were recorded on an Agilent Cary Eclipse fluorescence spectrophotometer using an Agilent xenon flash lamp (80 Hz) as the excitation source (scanning rate = 30 nm/min, excitation and emission slit width = 5 nm, excitation filter set to auto, emission filter set to auto, and PMT detector voltage = 800 V). Prior to the optical property measurements, each powder sample was placed in the solid sample holder. X-ray powder diffraction (XRPD) measurements were made on a SmartLab SE diffractometer. Emission spectra recorded for all the samples at an excitation wavelength of 394 nm exhibit peaks centered at around 536, 555, 587, 614, 653, 690 and 702.5 nm. The most intensive emission peak is observed at 614 nm, due to the 5D0→7F2 transition of europium (III) ions. The luminescence intensity reaches its maximum for Eu3+ 17.5 mol% and heat treatment at 1200 °C. The XRPD patterns revealed that the diffraction peaks recorded for this sample are identical to the NaBa6Nd(SiO4)4 reflections. As Nd-containing reagents were not involved in the synthesis, the maximum luminescence intensity is most likely conditioned by the formation of NaBa6Eu(SiO4)4, whose reflections are not available in the 2024 ICDD-JCPDS crystallographic database. Up to Eu3+ 2.5 mol%, the samples demonstrate the phases corresponding to the Ba2SiO4 and BaSiO3 standards. A subsequent increase of the europium (III) concentration in the system leads to the formation of NaBa6Eu(SiO4)4 along with Ba2SiO4 and BaSiO3. The NaBa6Eu(SiO4)4 share gradually increases, and from 17.5 mol% onwards only the NaBa6Eu(SiO4)4 phase is registered. Thus, varying the europium (III) concentration in the silica hydrogel–NaOH–BaCl2–Eu(NO3)3 system allows the precipitation method to produce products composed of europium (III)-doped Ba2SiO4 and BaSiO3 and/or NaBa6Eu(SiO4)4, distinguished by different luminescent properties. The work was supported by the Science Committee of RA, in the frames of the research projects № 21T-1D131 and № 21SCG-1D013.

Keywords: europium (III)-doped barium ortho- Ba2SiO4 and metasilicates BaSiO₃, NaBa₆Eu(SiO₄)₄, luminescence, precipitation method

Procedia PDF Downloads 41
379 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis

Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro

Abstract:

If risk management of the assets owned by companies, risk assessment of real estate portfolio, and risk identification of the entire region are to be implemented, it is necessary to consider simultaneous damage to multiple buildings. In this research, the Sagami Trough earthquake tsunami that could have a significant effect on the Japanese capital region is focused on, and a method is proposed for simultaneous damage assessment using copulas that can take into consideration the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at two sites were simulated by using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depths (five cases) for each of 10 sources of the Sagami Trough. For each source, the frequency distributions of the tsunami inundation depth were evaluated by using the response surface method. Then, Monte-Carlo simulation was conducted, and frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough. These are marginal distributions. Kendall’s tau for the tsunami inundation simulation at two sites was 0.83. Based on this value, the Gaussian copula, t-copula, Clayton copula, and Gumbel copula (n = 10,000) were generated. Then, the simultaneous distributions of the damage rate were evaluated using the marginal distributions and the copulas. For the correlation of the tsunami inundation depth at the two sites, the expected value hardly changed compared with the case of no correlation, but the damage rate of the ninety-ninth percentile value was approximately 2%, and the maximum value was approximately 6% when using the Gumbel copula.
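
As an illustration of the copula step, the sketch below samples only a Gaussian copula from the reported Kendall's tau and maps it onto toy empirical marginals; the study also used t, Clayton and Gumbel copulas, which are not reproduced here.

```python
# Hedged sketch: sampling a bivariate Gaussian copula from Kendall's tau
# and mapping it onto empirical marginal distributions of inundation depth.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
tau = 0.83                                   # Kendall's tau reported for the two sites
rho = np.sin(np.pi * tau / 2)                # Gaussian-copula correlation from tau

n = 10_000
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=n)
u = norm.cdf(z)                              # uniform copula samples in [0, 1]^2

# Toy marginal samples standing in for the simulated inundation-depth distributions.
depth_site1 = rng.lognormal(mean=0.0, sigma=0.5, size=5000)
depth_site2 = rng.lognormal(mean=0.2, sigma=0.6, size=5000)

# Inverse empirical CDF (quantile) transform gives correlated depth pairs.
d1 = np.quantile(depth_site1, u[:, 0])
d2 = np.quantile(depth_site2, u[:, 1])
print("99th-percentile simulated depths:",
      np.quantile(d1, 0.99).round(2), np.quantile(d2, 0.99).round(2))
```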

Keywords: copulas, Monte-Carlo simulation, probabilistic risk assessment, tsunamis

Procedia PDF Downloads 144
378 Vehicle Detection and Tracking Using Deep Learning Techniques in Surveillance Image

Authors: Abe D. Desta

Abstract:

This study suggests a deep learning-based method for identifying and following moving objects in surveillance video. The proposed method uses a fast regional convolutional neural network (F-RCNN), trained on a substantial dataset of vehicle images, to first detect vehicles. A Kalman filter and a data association technique based on the Hungarian algorithm are then used to track the detected vehicles over time. In general, F-RCNN algorithms have been shown to achieve high detection accuracy and robustness; in this study, the vehicle detection and tracking system achieved an accuracy of 97.4%. The F-RCNN algorithm was also compared to other popular object detection algorithms and was found to outperform them in terms of both detection accuracy and speed. The presented system, which has application potential in actual surveillance systems, shows the usefulness of deep learning approaches for vehicle detection and tracking.
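
As a reference for the data-association step, a minimal sketch of Hungarian assignment between Kalman-predicted track positions and new detections (cost = Euclidean distance, with an assumed gating threshold) could be:

```python
# Hedged sketch: Hungarian-algorithm data association between predicted tracks
# and new detections; the gating threshold and coordinates are illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment

tracks = np.array([[100.0, 220.0], [300.0, 80.0]])        # Kalman-predicted centroids
detections = np.array([[305.0, 78.0], [102.0, 225.0], [400.0, 400.0]])

# Cost matrix: Euclidean distance between every track and every detection.
cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)

row, col = linear_sum_assignment(cost)                     # optimal 1-to-1 assignment
for t, d in zip(row, col):
    if cost[t, d] < 50.0:                                  # gate: reject implausible pairs
        print(f"track {t} -> detection {d} (distance {cost[t, d]:.1f} px)")
    else:
        print(f"track {t} left unassigned")
```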

Keywords: artificial intelligence, computer vision, deep learning, fast-regional convolutional neural networks, feature extraction, vehicle tracking

Procedia PDF Downloads 129
377 Current Methods for Drug Property Prediction in the Real World

Authors: Jacob Green, Cecilia Cabrera, Maximilian Jakobs, Andrea Dimitracopoulos, Mark van der Wilk, Ryan Greenhalgh

Abstract:

Predicting drug properties is key in drug discovery to enable de-risking of assets before expensive clinical trials and to find highly active compounds faster. Interest from the machine learning community has led to the release of a variety of benchmark datasets and proposed methods. However, it remains unclear for practitioners which method or approach is most suitable, as different papers benchmark on different datasets and methods, leading to varying conclusions that are not easily compared. Our large-scale empirical study links together numerous earlier works on different datasets and methods, thus offering a comprehensive overview of the existing property classes, datasets, and their interactions with different methods. We emphasise the importance of uncertainty quantification and the time, and therefore cost, of applying these methods in the drug development decision-making cycle. To the best of the authors' knowledge, the optimal approach varies depending on the dataset, and engineered features with classical machine learning methods often outperform deep learning. Specifically, QSAR datasets are typically best analysed with classical methods such as Gaussian Processes, while ADMET datasets are sometimes better described by trees or deep learning methods such as Graph Neural Networks or language models. Our work highlights that practitioners do not yet have a straightforward, black-box procedure to rely on and sets a precedent for creating practitioner-relevant benchmarks. Deep learning approaches must be proven on these benchmarks to become the practical method of choice in drug property prediction.
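
To make the "classical methods such as Gaussian Processes" concrete, a minimal sketch of Gaussian Process regression with predictive uncertainty on engineered molecular features (the feature matrix here is a random placeholder, not a real fingerprint set) might be:

```python
# Hedged sketch: Gaussian Process regression with uncertainty estimates on
# engineered molecular features; the data below are random placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 128))                        # e.g. descriptor features (toy)
y = X[:, 0] * 0.8 + rng.normal(scale=0.1, size=200)    # toy activity values

kernel = ConstantKernel() * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X[:150], y[:150])

mean, std = gp.predict(X[150:], return_std=True)       # predictions with uncertainty
print("mean abs error:", np.abs(mean - y[150:]).mean().round(3),
      "| average predictive std:", std.mean().round(3))
```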

Keywords: activity (QSAR), ADMET, classical methods, drug property prediction, empirical study, machine learning

Procedia PDF Downloads 83
376 An Ultrasonic Signal Processing System for Tomographic Imaging of Reinforced Concrete Structures

Authors: Edwin Forero-Garcia, Jaime Vitola, Brayan Cardenas, Johan Casagua

Abstract:

This research article presents an ultrasonic signal processing system, integrating electronic and computer subsystems, that performs signal capture, conditioning and analog-to-digital conversion, followed by processing and visualization. The capture and conditioning of the signal were achieved through the design and implementation of an analog electronic system organized in stages: 1. impedance coupling; 2. analog filtering; 3. signal amplification. After the signal conditioning was carried out, the ultrasonic information was digitized using a microcontroller for subsequent processing. The digital processing of the signals was carried out in MATLAB for the generation of A-Scan, B-Scan and D-Scan ultrasonic images. Advanced processing was then performed using the SAFT technique to improve the resolution of the B-Scan images. The information from the ultrasonic images was displayed in a user interface developed in .NET with Visual Studio. To validate the system, ultrasonic signals were acquired, enabling the non-invasive inspection of reinforced concrete structures and the identification of existing pathologies in them.
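
For orientation, the SAFT step mentioned above is essentially a delay-and-sum reconstruction; a minimal sketch (with assumed sampling rate, sound speed and element layout, since these are not given in the abstract) is:

```python
# Hedged sketch: delay-and-sum SAFT reconstruction from a set of pulse-echo A-scans.
# Geometry, sound speed and sampling rate are illustrative assumptions.
import numpy as np

fs, c = 10e6, 4000.0                          # sampling rate (Hz), sound speed (m/s)
elements = np.linspace(0.0, 0.064, 16)        # 16 transducer positions along x (m)
n_samples = 2048
ascans = np.random.default_rng(0).normal(size=(len(elements), n_samples))  # toy A-scans

x_grid = np.linspace(0.0, 0.064, 64)          # image grid (m)
z_grid = np.linspace(0.005, 0.15, 128)
image = np.zeros((len(z_grid), len(x_grid)))

for iz, z in enumerate(z_grid):
    for ix, x in enumerate(x_grid):
        # Pulse-echo: two-way travel time from each element to the pixel and back.
        dist = np.hypot(elements - x, z)
        idx = np.round(2 * dist / c * fs).astype(int)
        valid = idx < n_samples
        image[iz, ix] = np.abs(ascans[valid, idx[valid]].sum())  # coherent sum

print("SAFT image shape:", image.shape)
```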

Keywords: acquisition, signal processing, ultrasound, SAFT, HMI

Procedia PDF Downloads 107
375 Isolation of Bacterial Species with Potential Capacity for Siloxane Removal in Biogas Upgrading

Authors: Ellana Boada, Eric Santos-Clotas, Alba Cabrera-Codony, Maria Martin, Lluis Baneras, Frederic Gich

Abstract:

Volatile methylsiloxanes (VMS) are a group of manmade silicone compounds widely used in household and industrial applications that end up on the biogas produced through the anaerobic digestion of organic matter in landfills and wastewater treatment plants. The presence of VMS during the biogas energy conversion can cause damage on the engines, reducing the efficiency of this renewable energy source. Non regenerative adsorption onto activated carbon is the most widely used technology to remove siloxanes from biogas, while new trends point out that biotechnology offers a low-cost and environmentally friendly alternative to conventional technologies. The first objective of this research was to enrich, isolate and identify bacterial species able to grow using siloxane molecules as a sole carbon source: anoxic wastewater sludge was used as initial inoculum in liquid anoxic enrichments, adding D4 (as representative siloxane compound) previously adsorbed on activated carbon. After several months of acclimatization, liquid enrichments were plated onto solid media containing D4 and thirty-four bacterial isolates were obtained. 16S rRNA gene sequencing allowed the identification of strains belonging to the following species: Ciceribacter lividus, Alicycliphilus denitrificans, Pseudomonas aeruginosa and Pseudomonas citronellolis which are described to be capable to degrade toxic volatile organic compounds. Kinetic assays with 8 representative strains revealed higher cell growth in the presence of D4 compared to the control. Our second objective was to characterize the community composition and diversity of the microbial community present in the enrichments and to elucidate whether the isolated strains were representative members of the community or not. DNA samples were extracted, the 16S rRNA gene was amplified (515F & 806R primer pair), and the microbiome analyzed from sequences obtained with a MiSeq PE250 platform. Results showed that the retrieved isolates only represented a minor fraction of the microorganisms present in the enrichment samples, which were represented by Alpha, Beta, and Gamma proteobacteria as dominant groups in the category class thus suggesting that other microbial species and/or consortia may be important for D4 biodegradation. These results highlight the need of additional protocols for the isolation of relevant D4 degraders. Currently, we are developing molecular tools targeting key genes involved in siloxane biodegradation to identify and quantify the capacity of the isolates to metabolize D4 in batch cultures supplied with a synthetic gas stream of air containing 60 mg m⁻³ of D4 together with other volatile organic compounds found in the biogas mixture (i.e. toluene, hexane and limonene). The isolates were used as inoculum in a biotrickling filter containing lava rocks and activated carbon to assess their capacity for siloxane removal. Preliminary results of biotrickling filter performance showed 35% of siloxane biodegradation in a contact time of 14 minutes, denoting that biological siloxane removal is a promising technology for biogas upgrading.

Keywords: bacterial cultivation, biogas upgrading, microbiome, siloxanes

Procedia PDF Downloads 259
374 Multi-Vehicle Detection Using Histogram of Oriented Gradients Features and Adaptive Sliding Window Technique

Authors: Saumya Srivastava, Rina Maiti

Abstract:

In order to achieve a better performance of vehicle detection in a complex environment, we present an efficient approach for a multi-vehicle detection system using an adaptive sliding window technique. For a given frame, image segmentation is carried out to establish the region of interest. Gradient computation followed by thresholding, denoising, and morphological operations is performed to extract the binary search image. Near-region field and far-region field are defined to generate hypotheses using the adaptive sliding window technique on the resultant binary search image. For each vehicle candidate, features are extracted using a histogram of oriented gradients, and a pre-trained support vector machine is applied for hypothesis verification. Later, the Kalman filter is used for tracking the vanishing point. The experimental results show that the method is robust and effective on various roads and driving scenarios. The algorithm was tested on highways and urban roads in India.
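
As an illustration of the feature-extraction and verification stage, a minimal sketch follows; the HOG parameters, window size and classifier training data are common defaults assumed here rather than taken from the paper.

```python
# Hedged sketch: HOG feature extraction and SVM-based hypothesis verification.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

def hog_features(patch):
    """Histogram of oriented gradients for a 64x64 candidate window."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

# Toy training set standing in for vehicle / non-vehicle patches.
patches = rng.random((40, 64, 64))
labels = np.repeat([1, 0], 20)            # 1 = vehicle, 0 = background
X = np.array([hog_features(p) for p in patches])

svm = LinearSVC(C=1.0)                    # stands in for the pre-trained SVM
svm.fit(X, labels)

candidate = rng.random((64, 64))          # one window from the sliding-window stage
print("vehicle hypothesis accepted:", bool(svm.predict([hog_features(candidate)])[0]))
```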

Keywords: gradient, vehicle detection, histograms of oriented gradients, support vector machine

Procedia PDF Downloads 124
373 An Autonomous Passive Acoustic System for Detection, Tracking and Classification of Motorboats in Portofino Sea

Authors: A. Casale, J. Alessi, C. N. Bianchi, G. Bozzini, M. Brunoldi, V. Cappanera, P. Corvisiero, G. Fanciulli, D. Grosso, N. Magnoli, A. Mandich, C. Melchiorre, C. Morri, P. Povero, N. Stasi, M. Taiuti, G. Viano, M. Wurtz

Abstract:

This work describes a real-time algorithm for detecting, tracking and classifying single motorboats, developed using the acoustic data recorded by a hydrophone array within the framework of the EU LIFE+ project ARION (LIFE09NAT/IT/000190). The project aims to improve the conservation status of bottlenose dolphins through real-time simultaneous monitoring of their population and of surface ship traffic. A Passive Acoustic Monitoring (PAM) system is installed on two autonomous permanent marine buoys, located close to the boundaries of the Marine Protected Area (MPA) of Portofino (Ligurian Sea, Italy). Detecting surface ships is also a necessity in many other sensitive areas, such as wind farms, oil platforms, and harbours. A PAM system could be an effective alternative to the usual monitoring systems, such as radar or active sonar, for localizing unauthorized ship presence or illegal activities, with the advantage of not revealing its own presence. Each ARION buoy consists of a particular type of structure, named meda elastica (elastic beacon), composed of a main pole, about 30 meters in length and emerging for 7 meters, anchored to a 30-ton mooring at 90 m depth by an anti-twist steel wire. Each buoy is equipped with a floating element and a hydrophone tetrahedron array, whose raw data are sent via a Wi-Fi bridge to a ground station where real-time analysis is performed. The bottlenose dolphin detection algorithm and the ship monitoring algorithm operate in parallel and in real time. Three modules were developed and commissioned for ship monitoring. The first is the detection algorithm, based on Time Difference Of Arrival (TDOA) measurements, i.e., the evaluation of the angular direction of the target with respect to each buoy and triangulation to obtain the target position. The second is the tracking algorithm, based on a Kalman filter, i.e., the estimation of the real course and speed of the target through a predictor filter. Finally, the classification algorithm is based on the DEMON method, i.e., the extraction of the acoustic signature of single vessels. The following results were obtained: the detection algorithm succeeded in evaluating the bearing angle with respect to each buoy and the position of the target, with an uncertainty of 2 degrees and a maximum range of 2.5 km. The tracking algorithm succeeded in reconstructing the real vessel courses and estimating the speed with an accuracy of 20% with respect to the Automatic Identification System (AIS) signals. The classification algorithm succeeded in isolating the acoustic signature of single vessels, demonstrating its temporal stability and the consistency of the results from both buoys. As a reference, the results were compared with the Hilbert transform of single-channel signals. The algorithm for tracking multiple targets is ready to be developed, thanks to the modularity of the single-ship algorithm: the classification module will enumerate and identify all targets present in the study area; for each of them, the detection module and the tracking module will be applied to monitor their course.
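
To illustrate the TDOA step conceptually, a minimal sketch of estimating the inter-hydrophone delay by cross-correlation and converting it to a bearing for a two-element baseline (the real system uses a tetrahedral array; the sampling rate, spacing and sound speed here are assumptions) is:

```python
# Hedged sketch: TDOA estimation by cross-correlation and bearing for a 2-hydrophone
# baseline; the actual system uses a tetrahedral array and different parameters.
import numpy as np
from scipy.signal import correlate

fs, c, d = 96_000, 1500.0, 1.0          # sampling rate (Hz), sound speed (m/s), spacing (m)
rng = np.random.default_rng(0)

# Toy broadband motorboat-like signal arriving at hydrophone 2 with a 12-sample delay.
sig = rng.normal(size=fs // 10)
h1 = sig + 0.05 * rng.normal(size=sig.size)
h2 = np.concatenate([np.zeros(12), sig[:-12]]) + 0.05 * rng.normal(size=sig.size)

xcorr = correlate(h2, h1, mode="full")
lag = np.argmax(xcorr) - (len(h1) - 1)   # lag in samples (positive: h2 delayed)
tdoa = lag / fs                          # time difference of arrival (s)

# Far-field bearing relative to the baseline normal (clipped to a valid range).
bearing = np.degrees(np.arcsin(np.clip(c * tdoa / d, -1.0, 1.0)))
print(f"estimated TDOA = {tdoa * 1e6:.1f} microseconds, bearing ~ {bearing:.1f} degrees")
```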

Keywords: acoustic-noise, bottlenose-dolphin, hydrophone, motorboat

Procedia PDF Downloads 174