Search results for: automatic incident detection
1666 Literature Review on Text Comparison Techniques: Analysis of Text Extraction, Main Comparison and Visual Representation Tools
Authors: Andriana Mkrtchyan, Vahe Khlghatyan
Abstract:
The choice of a profession is one of the most important decisions people make in their lives. With the development of modern science, technology, and all the spheres of the modern world, more and more professions are arising, further complicating the process of choosing. Hence, there is a need for a guiding platform that helps people choose a profession and the right career path based on their interests, skills, and personality. This review analyzes existing methods of comparing PDF documents and proposes a three-stage approach to the comparison: 1. text extraction from PDF documents; 2. comparison of the extracted text via NLP algorithms; 3. representation of the comparison using a special shape and color psychology methodology.
Keywords: color psychology, data acquisition/extraction, data augmentation, disambiguation, natural language processing, outlier detection, semantic similarity, text-mining, user evaluation, visual search
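As a minimal sketch of stage 2 (comparing the extracted text), assuming a simple bag-of-words model rather than the full NLP pipeline the review surveys, cosine similarity between term-frequency vectors can be computed as follows:

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between bag-of-words term-frequency vectors."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va.keys() & vb.keys())
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```

A real pipeline would precede this with PDF text extraction and follow it with the visual-representation stage; this fragment illustrates only the similarity core.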
Procedia PDF Downloads 76
1665 Joubert Syndrome: A Rare Genetic Disorder Reported in Kurdish Family
Authors: Aran Abd Al Rahman
Abstract:
Joubert syndrome is a congenital cerebellar ataxia, usually inherited in an autosomal recessive manner (an X-linked form has also been reported). The disease is diagnosed by brain imaging, which shows the so-called molar tooth sign. Neurological signs are present from the neonatal period and include hypotonia progressing to ataxia, global developmental delay, ocular motor apraxia, and breathing dysregulation. These signs are variably associated with multiorgan involvement, mainly of the retina, kidneys, skeleton, and liver. Thirty causative genes have been identified so far, all of which encode proteins of the primary cilium or its apparatus. The purpose of our project was to detect the mutant gene (INPP5E) which causes Joubert syndrome. Several methods were used for diagnosis, such as MRI and CT scan, together with molecular diagnosis by ARMS-PCR for detection of the mutant gene, which we used in this research project. In the single family reported in this research (two children and their parents), the two children were affected and the parents were carriers.
Keywords: Joubert syndrome, genetic disease, Kurdistan region, Sulaimani
Procedia PDF Downloads 141
1664 Fault Diagnosis of Squirrel-Cage Induction Motor by a Neural Network Multi-Models
Authors: Yahia Kourd, N. Guersi, D. Lefebvre
Abstract:
In this paper we propose to study fault diagnosis in squirrel-cage induction motors using MLP neural networks. We use neural models of the healthy and faulty behavior in order to detect and isolate some faults in the machine. In the first part of this work, we created a neural model of the healthy state using Matlab and a motor located in LGEB, by acquiring input and output data from this machine. Then we detected faults in the machine by residual generation. These residuals alone are not sufficient to isolate the existing faults. For this reason, we propose additional neural networks to represent the faulty behaviors. From the analysis of these residuals and the choice of a threshold, we propose a method capable of performing the detection and diagnosis of some faults in asynchronous machines with a squirrel-cage rotor.
Keywords: faults diagnosis, neural networks, multi-models, squirrel-cage induction motor
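A hedged sketch of the residual-generation step described above, assuming a generic healthy-model predictor and a fixed detection threshold (both illustrative, not the authors' LGEB setup):

```python
def generate_residuals(measured, predicted):
    """Residual = measured machine output minus healthy-model prediction."""
    return [m - p for m, p in zip(measured, predicted)]

def detect_fault(measured, predicted, threshold):
    """Flag samples whose residual magnitude exceeds the threshold."""
    return [abs(r) > threshold for r in generate_residuals(measured, predicted)]
```

In the paper's scheme, isolation then requires the additional faulty-behavior models; thresholded residuals alone only signal that some fault is present.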
Procedia PDF Downloads 636
1663 Neuro-Connectivity Analysis Using Abide Data in Autism Study
Authors: Dulal Bhaumik, Fei Jie, Runa Bhaumik, Bikas Sinha
Abstract:
The human brain is an amazingly complex network. Aberrant activity in this network can lead to various neurological disorders such as multiple sclerosis, Parkinson's disease, Alzheimer's disease, and autism. fMRI has emerged as an important tool to delineate the neural networks affected by such diseases, particularly autism. In this paper, we propose mixed-effects models, together with an appropriate procedure for controlling false discoveries, to detect disrupted connectivities in whole-brain studies. Results are illustrated with a large data set known as the Autism Brain Imaging Data Exchange (ABIDE), which includes 361 subjects from 8 medical centers. We believe that our findings adequately address the small-sample inference problem and are thus more reliable for identifying therapeutic targets for intervention. In addition, our results can be used for early detection of subjects who are at high risk of developing neurological disorders.
Keywords: ABIDE, autism spectrum disorder, fMRI, mixed-effects model
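The abstract does not name its false-discovery procedure; as an illustrative assumption, the widely used Benjamini-Hochberg step-up procedure over the per-connection p-values can be sketched as:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Benjamini-Hochberg step-up: return indices of rejected hypotheses."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])  # ascending p-values
    k = 0  # largest rank whose p-value clears the BH line rank*alpha/m
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])
```

Each index returned would correspond to a brain-region pair whose connectivity difference survives the FDR control.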
Procedia PDF Downloads 289
1662 Detection and Identification of Antibiotic Resistant UPEC Using FTIR-Microscopy and Advanced Multivariate Analysis
Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel
Abstract:
Antimicrobial drugs have played an indispensable role in controlling illness and death associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global healthcare problem. Many antibiotics have lost their effectiveness since the beginning of the antibiotic era because many bacteria have adapted defenses against them. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing require the isolation of the pathogen from a clinical specimen by culturing on the appropriate media (this first culturing stage lasts 24 h). Then, chosen colonies are grown on media containing antibiotic(s), using micro-diffusion discs (the second culturing time is also 24 h), in order to determine bacterial susceptibility. Other methods, such as genotyping, the E-test, and automated methods, have also been developed for testing antimicrobial susceptibility. Most of these methods are expensive and time-consuming. Fourier transform infrared (FTIR) microscopy is a rapid, safe, effective, and low-cost method that has been widely and successfully used for the identification of various biological samples, including bacteria; nonetheless, its true potential in routine clinical diagnosis has not yet been established. Modern infrared (IR) spectrometers with high spectral resolution enable the measurement of unprecedented biochemical information from cells at the molecular level. Moreover, new bioinformatics analyses combined with IR spectroscopy form a powerful technique that enables the detection of structural changes associated with resistance.
The main goal of this study is to evaluate the potential of FTIR microscopy in tandem with machine learning algorithms for rapid and reliable identification of bacterial susceptibility to antibiotics within a time span of a few minutes. The UTI E. coli bacterial samples, which were identified at the species level by MALDI-TOF and examined for their susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories of Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 700 E. coli samples, are promising and show that by using infrared spectroscopy together with multivariate analysis, it is possible to classify the tested bacteria into sensitive and resistant with a success rate higher than 90% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing FTIR microscopy as a rapid and reliable method for identification of antibiotic susceptibility.
Keywords: antibiotics, E. coli, FTIR, multivariate analysis, susceptibility, UTI
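As a hedged illustration of the multivariate-analysis step (the abstract does not name its classifier), a minimal nearest-centroid rule over preprocessed spectra could look like this; the spectra, labels, and class names are synthetic:

```python
def train_centroids(spectra, labels):
    """Mean spectrum per class (e.g., 'sensitive' vs 'resistant')."""
    sums, counts = {}, {}
    for x, y in zip(spectra, labels):
        counts[y] = counts.get(y, 0) + 1
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def classify(spectrum, centroids):
    """Assign the class whose centroid is nearest in squared distance."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: dist(spectrum, centroids[y]))
```

Real FTIR workflows would add baseline correction, normalization, and a stronger classifier; this only shows the shape of the supervised step.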
Procedia PDF Downloads 173
1661 A Computer-Aided System for Detection and Classification of Liver Cirrhosis
Authors: Abdel Hadi N. Ebraheim, Eman Azomi, Nefisa A. Fahmy
Abstract:
This paper designs and implements a computer-aided system (CAS) to help detect and diagnose liver cirrhosis in patients with chronic hepatitis C. Our system reduces the features (tests) the patient is asked to undergo to a minimal, most informative subset, with a diagnostic accuracy above 99%, thereby saving both time and costs. We use the Support Vector Machine (SVM) with cross-validation, a Multilayer Perceptron Neural Network (MLP), and a Generalized Regression Neural Network (GRNN), which employs a base of radial functions for functional approximation, as classifiers. Our system was tested on 199 subjects, 99 of whom had chronic hepatitis C. The subjects were selected from the outpatient clinic of the National Hepatology and Tropical Medicine Research Institute (NHTMRI).
Keywords: liver cirrhosis, artificial neural network, support vector machine, multi-layer perceptron, classification, accuracy
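The reduction to a minimal, most informative subset of tests is not detailed in the abstract; one common approach it could correspond to is greedy forward feature selection against a cross-validated accuracy scorer, sketched here with an abstract `score` callback (an assumption, not the authors' procedure):

```python
def forward_select(features, score):
    """Greedily add the feature that most improves score(subset)."""
    selected, best = [], score([])
    improved = True
    while improved:
        improved, choice = False, None
        for f in features:
            if f in selected:
                continue
            s = score(selected + [f])
            if s > best:
                best, choice, improved = s, f, True
        if improved:
            selected.append(choice)
    return selected, best
```

In practice `score` would wrap cross-validated accuracy of one of the paper's classifiers (SVM, MLP, or GRNN) on the candidate test subset.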
Procedia PDF Downloads 461
1660 Square Wave Anodic Stripping Voltammetry of Copper (II) at the Tetracarbonylmolybdenum(0) MWCNT Paste Electrode
Authors: Illyas Isa, Mohamad Idris Saidin, Mustaffa Ahmad, Norhayati Hashim
Abstract:
A highly selective and sensitive electrode for the determination of trace amounts of Cu(II) using square wave anodic stripping voltammetry (SWASV) is proposed. The electrode was made of a paste of multiwall carbon nanotubes (MWCNT) and 2,6-diacetylpyridine-di-(1R)-(-)-fenchone diazine tetracarbonylmolybdenum(0) at 100:5 (w/w). Under optimal conditions the electrode showed a linear relationship with concentration in the range of 1.0 × 10–10 to 1.0 × 10–6 M Cu(II) and a limit of detection of 8.0 × 10–11 M Cu(II). The relative standard deviation (n = 5) of the response to 1.0 × 10–6 M Cu(II) was 0.036. The interferences of cations such as Ni(II), Mg(II), Cd(II), Co(II), Hg(II), and Zn(II) (at 10- and 100-fold concentrations) are negligible, except for Pb(II). Electrochemical impedance spectroscopy (EIS) showed that the charge transfer at the electrode-solution interface was favorable. Results of the analysis of Cu(II) in several water samples agreed well with those obtained by inductively coupled plasma-optical emission spectrometry (ICP-OES). The proposed electrode is therefore recommended as an alternative to spectroscopic techniques for analyzing Cu(II).
Keywords: chemically modified electrode, Cu(II), square wave anodic stripping voltammetry, tetracarbonylmolybdenum(0)
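A hedged numerical sketch of how such a calibration line and detection limit are typically obtained; the 3·σ(blank)/slope convention and the data below are assumptions for illustration, not the authors' stated procedure or values:

```python
def linear_fit(conc, current):
    """Least-squares line current = slope * conc + intercept."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(current) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, current)) / sxx
    return slope, my - slope * mx

def limit_of_detection(sd_blank, slope):
    """Conventional LOD = 3 * (std. dev. of blank signal) / slope."""
    return 3 * sd_blank / slope
```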
Procedia PDF Downloads 262
1659 The Application of Fuzzy Set Theory to Mobile Internet Advertisement Fraud Detection
Authors: Jinming Ma, Tianbing Xia, Janusz Getta
Abstract:
This paper presents the application of fuzzy set theory to the implementation of mobile advertisement anti-fraud systems. Mobile anti-fraud aims to identify mobile advertisement fraudsters. One of the main problems of mobile anti-fraud is the lack of evidence to prove that a user is a fraudster. In this paper, we implement an application using fuzzy set theory to demonstrate how to detect cheaters. The advantage of our method is that it avoids the hardship of detecting fraudsters in small data samples. We achieve this by giving each user a suspicious degree indicating how likely the user is to be cheating, and deciding whether a group of users (such as all users of a certain app) are collectively fraudulent according to their average suspicious degree. This makes the process more accurate, as the data of a single user is too small to be predictive.
Keywords: mobile internet, advertisement, anti-fraud, fuzzy set theory
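The suspicious-degree idea can be sketched as follows, assuming a simple piecewise-linear fuzzy membership function over some fraud indicator (e.g., an abnormal click rate); the cutoffs and threshold are illustrative, not from the paper:

```python
def membership(x, lo, hi):
    """Fuzzy 'suspicious' degree: 0 below lo, 1 above hi, linear between."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def group_is_fraud(indicator_values, lo, hi, threshold=0.5):
    """Decide for a group (e.g., all users of one app) by average degree."""
    degrees = [membership(v, lo, hi) for v in indicator_values]
    return sum(degrees) / len(degrees) >= threshold
```

Averaging over the group is what sidesteps the small-sample problem the abstract describes: no single user's data decides the outcome.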
Procedia PDF Downloads 181
1658 Novel Algorithm for Restoration of Retina Images
Authors: P. Subbuthai, S. Muruganand
Abstract:
Diabetic retinopathy is a complicated disease caused by changes in the blood vessels of the retina. Retina images captured with a fundus camera sometimes have poor contrast and noise. Because of this noise, detection of blood vessels in the retina is very complicated, so preprocessing is needed. In this paper, a novel algorithm is implemented to remove noisy pixels in the retina image. The proposed algorithm, an Extended Median Filter, is applied to the green channel of the retina image because green-channel vessels are brighter than the background. The proposed Extended Median Filter is compared with the existing standard median filter by performance metrics such as PSNR, MSE, and RMSE. Experimental results show that the proposed Extended Median Filter gives better results than the existing standard median filter in terms of noise suppression and detail preservation.
Keywords: fundus retina image, diabetic retinopathy, median filter, microaneurysms, exudates
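A plain 3x3 median filter and the PSNR metric are sketched below for reference; the paper's Extended Median Filter adds modifications the abstract does not specify, so this shows only the standard baseline it is compared against:

```python
import math

def median_filter(img, k=3):
    """Standard k x k median filter on a 2-D list of gray levels."""
    h, w, r = len(img), len(img[0]), k // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = sorted(
                img[yy][xx]
                for yy in range(max(0, y - r), min(h, y + r + 1))
                for xx in range(max(0, x - r), min(w, x + r + 1))
            )
            out[y][x] = window[len(window) // 2]
    return out

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio between two same-size images."""
    n = len(a) * len(a[0])
    mse = sum((pa - pb) ** 2 for ra, rb in zip(a, b)
              for pa, pb in zip(ra, rb)) / n
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

In the paper's setting `img` would be the green channel of the fundus image.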
Procedia PDF Downloads 342
1657 A Simple and Empirical Refraction Correction Method for UAV-Based Shallow-Water Photogrammetry
Authors: I GD Yudha Partama, A. Kanno, Y. Akamatsu, R. Inui, M. Goto, M. Sekine
Abstract:
The aerial photogrammetry of shallow-water bottoms has the potential to be an efficient high-resolution survey technique for shallow-water topography, thanks to the advent of convenient UAVs and automatic image-processing techniques (Structure-from-Motion (SfM) and Multi-View Stereo (MVS)). However, it suffers from systematic overestimation of the bottom elevation due to light refraction at the air-water interface. In this study, we present an empirical method to correct for the effect of refraction after the usual SfM-MVS processing, using common software. The presented method utilizes the empirical relation between the measured true depth and the estimated apparent depth to generate an empirical correction factor. This correction factor is then used to convert the apparent water depth into a refraction-corrected (real-scale) water depth. To examine its effectiveness, we applied the method to two river sites and compared the RMS errors in the corrected bottom elevations with those obtained by three existing methods. The results show that the presented method is more effective than two of the existing methods: the method that applies no correction factor and the method that uses the refractive index of water (1.34) as the correction factor. In comparison with the remaining existing method, which adds an offset term after calculating the correction factor, the presented method performs better in Site 2 and worse in Site 1. However, we found this linear regression method to be unstable when the training data used for calibration are limited. It also suffers from a large negative bias in the correction factor when the estimated apparent water depth is affected by noise, according to our numerical experiment. Overall, the accuracy of a refraction correction method depends on various factors such as the location, image acquisition, and GPS measurement conditions. The most effective method can be selected by statistical model selection (e.g., leave-one-out cross-validation).
Keywords: bottom elevation, MVS, river, SfM
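A hedged sketch of the empirical correction described above, assuming the correction factor is a through-the-origin least-squares ratio between true and apparent depths (the abstract does not give the exact estimator), plus the leave-one-out RMSE usable for method selection:

```python
import math

def fit_correction_factor(apparent, true):
    """Least-squares factor k with true ≈ k * apparent (fit through origin)."""
    return sum(a * t for a, t in zip(apparent, true)) / sum(a * a for a in apparent)

def loo_rmse(apparent, true):
    """Leave-one-out cross-validated RMSE of the corrected depths."""
    errs = []
    for i in range(len(apparent)):
        a_tr = [a for j, a in enumerate(apparent) if j != i]
        t_tr = [t for j, t in enumerate(true) if j != i]
        k = fit_correction_factor(a_tr, t_tr)  # calibrate without sample i
        errs.append((k * apparent[i] - true[i]) ** 2)
    return math.sqrt(sum(errs) / len(errs))
```

With ideal refractive behaviour the fitted factor approaches 1.34, the refractive index of water mentioned in the abstract.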
Procedia PDF Downloads 299
1656 Using the Weakest Precondition to Achieve Self-Stabilization in Critical Networks
Authors: Antonio Pizzarello, Oris Friesen
Abstract:
Networks, such as the electric power grid, must demonstrate exemplary performance and integrity. Integrity depends on the quality of both the system design model and the deployed software. Integrity of the deployed software is key, for both the original version and the many versions that arise through numerous maintenance activities. Current software engineering technology and practice do not produce adequate integrity. Distributed systems utilize networks where each node is an independent computer system. The connections between them are realized via a network that is normally redundantly connected, to guarantee the presence of a path between two nodes if some branch fails. Furthermore, at each node there is software which may fail. Self-stabilizing protocols are usually present that recognize failure in the network and perform a repair action that brings the node back to a correct state. These protocols, first introduced by E. W. Dijkstra, are currently present in almost all Ethernets. Superstabilizing protocols, capable of reacting to a change in the network topology due to the removal or addition of a branch, are less common but are theoretically defined and available. This paper describes how to use the Software Integrity Assessment (SIA) methodology to analyze self-stabilizing software. SIA is based on the UNITY formalism for parallel and distributed programming, which allows the analysis of code for verifying the progress property p leads-to q, which describes the progress of all computations starting in a state satisfying p to a state satisfying q via the execution of one or more system modules. As opposed to demonstrably inadequate test and evaluation methods, SIA allows the analysis and verification of any network self-stabilizing software, as well as any other software that is designed to recover from failure without external intervention by maintenance personnel.
The model to be analyzed is obtained by automatic translation of the system code into a transition system based on the weakest precondition.
Keywords: network, power grid, self-stabilization, software integrity assessment, UNITY, weakest precondition
Procedia PDF Downloads 223
1655 Binarization and Recognition of Characters from Historical Degraded Documents
Authors: Bency Jacob, S.B. Waykar
Abstract:
Degradations in historical document images appear due to aging of the documents. It is very difficult to understand and retrieve text from badly degraded documents because of the variation between the document foreground and background. Thresholding of such document images results in either broken characters or detection of false text. Numerous algorithms exist that can separate text and background efficiently in the textual regions of a document, but portions of the background are mistaken for text in areas that hardly contain any text. This paper presents a way to overcome these problems with a robust binarization technique that recovers the text from severely degraded document images and thereby increases the accuracy of optical character recognition systems. The proposed document recovery algorithm efficiently removes degradations from document images. Here we use Otsu's method, local thresholding, and global thresholding; after binarization, we train on and recognize the characters in the degraded documents.
Keywords: binarization, denoising, global thresholding, local thresholding, thresholding
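Otsu's global threshold, one of the binarization components named above, can be sketched as follows (the local-thresholding and character-recognition stages are omitted):

```python
def otsu_threshold(gray):
    """Otsu's method: pick the threshold maximizing between-class
    variance over a flat list of 8-bit gray levels."""
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    total, sum_all = len(gray), sum(i * hist[i] for i in range(256))
    w_b, sum_b = 0, 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]            # background (<= t) pixel count
        if w_b == 0:
            continue
        w_f = total - w_b         # foreground pixel count
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b, m_f = sum_b / w_b, (sum_all - sum_b) / w_f
        var = w_b * w_f * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

On a degraded page this global threshold is then refined locally, which is why the paper combines it with local thresholding.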
Procedia PDF Downloads 344
1654 Halal Authentication for Some Product Collected from Jordanian Market Using Real-Time PCR
Authors: Omar S. Sharaf
Abstract:
A pig-specific assay targeting the mitochondrial 12S rRNA (mt-12S rDNA) gene was developed to detect material from pork species in different products collected from the Jordanian market. PCR products of 359 bp and 531 bp were successfully amplified from the pig cyt b gene, and amplification using the mt-12S rDNA gene produced a single band with a molecular size of 456 bp. In the present work, PCR amplification of the mtDNA of cytochrome b has been shown to be a suitable tool for rapid detection of pig DNA. 100 samples from different dairy, gelatin, and chocolate-based products and 50 samples of baby food formula were collected and tested for the presence of any pig derivatives. It was found that 10% of chocolate-based products, 12% of gelatin products, 56% of dairy products, and 5.2% of baby food formula showed a single band from the mt-12S rDNA gene.
Keywords: halal food, baby infant formula, chocolate based products, PCR, Jordan
Procedia PDF Downloads 534
1653 An ANN-Based Predictive Model for Diagnosis and Forecasting of Hypertension
Authors: Obe Olumide Olayinka, Victor Balanica, Eugen Neagoe
Abstract:
The effects of hypertension are often lethal; thus its early detection and prevention are very important for everybody. In this paper, a neural network (NN) model was developed and trained on a dataset of hypertension causative parameters in order to forecast the likelihood of occurrence of hypertension in patients. Our research goal was to analyze the potential of the presented NN to predict, over a period of time, the risk of hypertension or of developing this disease for patients who are or are not currently hypertensive. The results of the analysis for a given patient can support doctors in taking proactive measures to avert the occurrence of hypertension, such as recommendations regarding patient behavior that lower the hypertension risk. Moreover, the paper envisages a set of three example scenarios: to determine the age at which the patient becomes hypertensive, i.e., the threshold hypertensive age; to analyze what happens if the threshold hypertensive age is set to a certain age while the patient's weight is varied; and to set the ideal weight for the patient and analyze what happens to the threshold hypertensive age.
Keywords: neural network, hypertension, data set, training set, supervised learning
Procedia PDF Downloads 392
1652 Machine Learning Application in Shovel Maintenance
Authors: Amir Taghizadeh Vahed, Adithya Thaduri
Abstract:
Shovels are the main components of the mining transportation system. The productivity of a mine depends on the availability of shovels due to their high capital and operating costs. Unplanned failures or shutdowns of a shovel result in higher repair costs, increased downtime, and higher indirect costs (i.e., loss of production and damage to the company's reputation). In order to mitigate these failures, predictive maintenance based on failure prediction can be a useful approach. Modern mining machinery such as shovels automatically collects huge datasets consisting of reliability and maintenance data. However, the gathered datasets are useless until information and knowledge are extracted from them. Machine learning, as well as data mining, which has played a major role in recent studies, has been used for the knowledge discovery process. In this study, data mining and machine learning approaches are implemented to detect not only anomalies but also patterns in a dataset, for further detection of failures.
Keywords: maintenance, machine learning, shovel, conditional based monitoring
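As a minimal illustration of the anomaly-detection step (the abstract does not commit to one algorithm), a z-score rule over a shovel sensor series might look like this; the threshold and readings are illustrative:

```python
import statistics

def zscore_anomalies(values, k=3.0):
    """Indices of readings more than k standard deviations from the mean."""
    mu = statistics.mean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return []  # constant signal: nothing stands out
    return [i for i, v in enumerate(values) if abs(v - mu) / sd > k]
```

A production system would use multivariate methods over the reliability and maintenance records, but the flagging logic has this shape.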
Procedia PDF Downloads 219
1651 Predictive Spectral Lithological Mapping, Geomorphology and Geospatial Correlation of Structural Lineaments in Bornu Basin, Northeast Nigeria
Authors: Aminu Abdullahi Isyaku
Abstract:
The semi-arid Bornu basin in northeast Nigeria is characterised by flat topography, thick cover sediments, and a lack of continuous bedrock outcrops discernible for field geology. This paper presents a methodology for characterising neotectonic surface structures and surface lithology in the north-eastern Bornu basin as an alternative to field geological mapping, using free multispectral Landsat 7 ETM+, SRTM DEM, and ASAR Earth Observation datasets. The spectral lithological mapping developed herein utilised spectral discrimination of the surface features identified on Landsat 7 ETM+ images to infer the lithology, using four steps: computation of band-combination images; computation of band-ratio images; supervised image classification; and inference of the lithological compositions. Two complementary approaches to lineament mapping are carried out in this study, involving manual digitization and automatic lineament extraction, to validate the structural lineaments extracted from the Landsat 7 ETM+ image mosaic covering the study area. A comparison between the mapped surface lineaments and lineament zones shows good geospatial correlation and identifies the predominant NE-SW and NW-SE structural trends in the basin. Topographic profiles across different parts of the Bama Beach Ridge palaeoshorelines in the basin appear to show different elevations across the feature. It is determined that most of the drainage systems in the northeastern Bornu basin are structurally controlled, with drainage lines terminating against the paleo-lake border and emptying into Lake Chad, mainly arising from the extensive topographic high-stand Bama Beach Ridge palaeoshoreline.
Keywords: Bornu Basin, lineaments, spectral lithology, tectonics
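The band-ratio step of the workflow is simply an element-wise division of two co-registered bands; a hedged sketch with synthetic reflectance values (which Landsat 7 ETM+ band pair to ratio depends on the target lithology and is not specified here):

```python
def band_ratio(band_a, band_b, eps=1e-12):
    """Element-wise ratio of two co-registered raster bands (2-D lists).
    eps guards against division by zero in no-data pixels."""
    return [[a / (b + eps) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(band_a, band_b)]
```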
Procedia PDF Downloads 139
1650 Jordan Curves in the Digital Plane with Respect to the Connectednesses given by Certain Adjacency Graphs
Authors: Josef Slapal
Abstract:
Digital images are approximations of real ones and, therefore, to be able to study them, we need the digital plane Z2 to be equipped with a convenient structure that behaves analogously to the Euclidean topology on the real plane. In particular, it is required that such a structure allow for a digital analogue of the Jordan curve theorem. We introduce certain adjacency graphs on the digital plane and prove a digital Jordan curve theorem for them, thus showing that the graphs provide convenient structures on Z2 for the study and processing of digital images. Further convenient structures, including the well-known Khalimsky and Marcus-Wyse adjacency graphs, may be obtained as quotients of the graphs introduced. Since digital Jordan curves represent borders of objects in digital images, the adjacency graphs discussed may be used as background structures on the digital plane for solving problems of digital image processing that are closely related to borders, such as border detection, contour filling, pattern recognition, thinning, etc.
Keywords: digital plane, adjacency graph, Jordan curve, quotient adjacency
Procedia PDF Downloads 379
1649 Single-Camera Basketball Tracker through Pose and Semantic Feature Fusion
Authors: Adrià Arbués-Sangüesa, Coloma Ballester, Gloria Haro
Abstract:
Tracking sports players is a widely challenging scenario, especially in single-feed videos recorded in tight courts, where cluttering and occlusions cannot be avoided. This paper presents an analysis of several geometric and semantic visual features to detect and track basketball players. An ablation study is carried out and then used to show that a robust tracker can be built with deep learning features, without the need to extract contextual ones, such as proximity or color similarity, or to apply camera stabilization techniques. The presented tracker consists of: (1) a detection step, which uses a pretrained deep learning model to estimate the players' pose, followed by (2) a tracking step, which leverages pose and semantic information from the output of a convolutional layer in a VGG network. Its performance is analyzed in terms of MOTA over a basketball dataset with more than 10k instances.
Keywords: basketball, deep learning, feature extraction, single-camera, tracking
Procedia PDF Downloads 138
1648 The Effect of Elapsed Time on the Cardiac Troponin-T Degradation and Its Utility as a Time Since Death Marker in Cases of Death Due to Burn
Authors: Sachil Kumar, Anoop K.Verma, Uma Shankar Singh
Abstract:
It is extremely important to study the postmortem interval (PMI) in different causes of death, since it often greatly assists in forming an opinion on the exact cause of death following such an incident. With sound knowledge of the interval, an expert can judge whether the cause of death has been feigned; hence there is a great need, when evaluating such a death, to have examined the crime scene before performing an autopsy on the body. The approach described here is based on analyzing the degradation, or proteolysis, of a cardiac protein in cases of death due to burning, as a marker of time since death. Cardiac tissue samples were collected from (n = 6) medico-legal autopsies (Department of Forensic Medicine and Toxicology, King George's Medical University, Lucknow, India), after informed consent from the relatives, and postmortem degradation was studied by incubation of the cardiac tissue at room temperature (20 ± 2 °C) for different time periods (~7.30, 18.20, 30.30, 41.20, 41.40, 54.30, 65.20, and 88.40 hours). The cases included were subjects of burning without any prior history of disease who died in the hospital and whose exact time of death was known. The analysis involved extraction of the protein, separation by denaturing gel electrophoresis (SDS-PAGE), and visualization by Western blot using cTnT-specific monoclonal antibodies. The area of the bands within a lane was quantified by scanning and digitizing the image using a Gel Doc system. As postmortem time progresses, the intact cTnT band degrades to fragments that are easily detected by the monoclonal antibodies. A decreasing trend in the level of cTnT (% of intact) was found as the PM hours increased. A significant difference was observed between <15 h and other PM hours (p<0.01). A significant difference in cTnT level (% of intact) was also observed between 16-25 h and 56-65 h and >75 h (p<0.01).
Western blot data clearly showed the intact protein at 42 kDa, three major fragments (28 kDa, 30 kDa, 10 kDa), three additional minor fragments (12 kDa, 14 kDa, and 15 kDa), and the formation of low-molecular-weight fragments. Overall, both PMI and the burned cardiac tissue had statistically significant effects; the greatest amount of protein breakdown was observed within the first 41.40 hours, after which the intact protein slowly disappears. If the percent intact cTnT is calculated from the total area integrated within a Western blot lane, then the percent intact cTnT shows a pseudo-first-order relationship when plotted against the time postmortem. A strong, significant positive correlation was found between cTnT and PM hours (r=0.87, p=0.0001). The regression analysis explained a good proportion of the variability (R²=0.768). The postmortem troponin-T fragmentation observed in this study reveals a sequential, time-dependent process with the potential for use as a predictor of PMI in cases of burning.
Keywords: burn, degradation, postmortem interval, troponin-T
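The pseudo-first-order relationship reported above can be sketched as a log-linear fit; the rate constant and data points here are synthetic, not the study's measured values:

```python
import math

def fit_first_order(times, percents):
    """Fit ln(p) = ln(p0) - k*t by least squares; returns (p0, k)."""
    ys = [math.log(p) for p in percents]
    n = len(times)
    mt, my = sum(times) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
             / sum((t - mt) ** 2 for t in times))
    return math.exp(my - slope * mt), -slope

def predict_pmi(percent_intact, p0, k):
    """Invert the decay law to estimate the postmortem interval."""
    return math.log(p0 / percent_intact) / k
```

Inverting the fitted decay law is what would let percent-intact cTnT serve as the PMI predictor the abstract proposes.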
Procedia PDF Downloads 451
1647 Preparation, Characterization and Photocatalytic Activity of a New Noble Metal Modified TiO2@SrTiO3 and SrTiO3 Photocatalysts
Authors: Ewelina Grabowska, Martyna Marchelek
Abstract:
Among the various semiconductors, nanosized TiO2 has been widely studied due to its high photosensitivity, low cost, low toxicity, and good chemical and thermal stability. However, there are two main drawbacks to the practical application of pure TiO2 films. One is that TiO2 can be excited only by ultraviolet (UV) light due to its intrinsic wide bandgap (3.2 eV for anatase and 3.0 eV for rutile), which limits its practical efficiency for solar energy utilization, since UV light makes up only 4-5% of the solar spectrum. The other is that a high electron-hole recombination rate reduces the photoelectric conversion efficiency of TiO2. In order to overcome these drawbacks and modify the electronic structure of TiO2, some semiconductors (e.g., CdS, ZnO, PbS, Cu2O, Bi2S3, and CdSe) have been used to prepare coupled TiO2 composites, improving their charge separation efficiency and extending the photoresponse into the visible region. It has been shown that the fabrication of p-n heterostructures by combining n-type TiO2 with p-type semiconductors is an effective way to improve the photoelectric conversion efficiency of TiO2. SrTiO3 is a good candidate for coupling with TiO2 and improving the photocatalytic performance of the photocatalyst, because its conduction band edge is more negative than that of TiO2. Due to the potential differences between the band edges of these two semiconductors, the photogenerated electrons transfer from the conduction band of SrTiO3 to that of TiO2, while the photogenerated holes transfer in the opposite direction. The photogenerated charge carriers can thus be efficiently separated by these processes, resulting in the enhancement of the photocatalytic property of the photocatalyst. Additionally, one of the methods for improving photocatalyst performance is the addition of nanoparticles of one or two noble metals (Pt, Au, Ag, and Pd) deposited on the semiconductor surface.
The proposed mechanisms are: (1) the surface plasmon resonance of noble metal particles is excited by visible light, facilitating the excitation of surface electrons and interfacial electron transfer; (2) energy levels can be produced in the band gap of TiO2 by the dispersion of noble metal nanoparticles in the TiO2 matrix; (3) noble metal nanoparticles deposited on TiO2 act as electron traps, enhancing electron-hole separation. In view of this, we recently obtained a series of TiO2@SrTiO3 and SrTiO3 photocatalysts loaded with noble metal NPs using the photodeposition method. The M-TiO2@SrTiO3 and M-SrTiO3 photocatalysts (M = Rh, Rt, Pt) were studied for the photodegradation of phenol in the aqueous phase under UV-Vis and visible irradiation. Moreover, in the second part of our research, hydroxyl radical formation was investigated. Fluorescence of an irradiated coumarin solution was used as the method of ˙OH radical detection. Coumarin readily reacts with generated hydroxyl radicals, forming hydroxycoumarins. Although the major hydroxylation product is 5-hydroxycoumarin, only the 7-hydroxy product of coumarin hydroxylation emits fluorescent light. Thus, this method was used only for hydroxyl radical detection, not for determining the concentration of hydroxyl radicals. Keywords: composites TiO2, SrTiO3, photocatalysis, phenol degradation
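The bandgap figures quoted above map directly to optical absorption edges via λ(nm) ≈ 1239.84 / Eg(eV), which is why pure TiO2 absorbs essentially only UV light. A quick sketch of that arithmetic (illustrative, not from the paper):

```python
# Convert a semiconductor band gap (eV) to its optical absorption edge (nm).
# Uses lambda = h*c / E, with h*c ~= 1239.84 eV*nm.
HC_EV_NM = 1239.84

def absorption_edge_nm(band_gap_ev):
    return HC_EV_NM / band_gap_ev

# Band gaps quoted in the abstract: anatase 3.2 eV, rutile 3.0 eV.
for phase, eg in [("anatase", 3.2), ("rutile", 3.0)]:
    print(f"{phase}: absorption edge ~ {absorption_edge_nm(eg):.0f} nm")
```

Both edges sit at or below the violet end of the spectrum, consistent with the abstract's point that only the small UV fraction of sunlight can drive unmodified TiO2.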
Procedia PDF Downloads 222
1646 Using Probabilistic Neural Network (PNN) for Extracting Acoustic Microwaves (Bulk Acoustic Waves) in Piezoelectric Material
Authors: Hafdaoui Hichem, Mehadjebia Cherifa, Benatia Djamel
Abstract:
In this paper, we propose a new method for detecting bulk acoustic microwave signals during the propagation of acoustic microwaves in a piezoelectric substrate (lithium niobate, LiNbO3). We used classification by probabilistic neural network (PNN) as a means of numerical analysis, in which we classify all the values of the real and imaginary parts of the attenuation coefficient together with the acoustic velocity in order to build a model from which the bulk waves can be identified easily. These singularities inform us of the presence of bulk waves in piezoelectric materials, from which we obtain accurate values of the attenuation coefficient and acoustic velocity for bulk waves. This study will be very useful for the modeling and realization of acoustic microwave (ultrasound) devices based on the propagation of acoustic microwaves. Keywords: piezoelectric material, probabilistic neural network (PNN), classification, acoustic microwaves, bulk waves, attenuation coefficient
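A PNN is essentially a Parzen-window classifier with a Gaussian kernel: each class's score is the kernel average over its training patterns, and the largest score wins. A minimal sketch of that idea (the feature values and class names below are hypothetical, not the authors' data):

```python
import math

def pnn_classify(train, x, sigma=1.0):
    """Probabilistic neural network: Gaussian (Parzen-window) class score
    averaged over each class's training patterns; predict the argmax."""
    scores = {}
    for label, patterns in train.items():
        s = 0.0
        for p in patterns:
            d2 = sum((a - b) ** 2 for a, b in zip(p, x))
            s += math.exp(-d2 / (2 * sigma ** 2))
        scores[label] = s / len(patterns)
    return max(scores, key=scores.get)

# Hypothetical 2-D features, e.g. (Re(attenuation), scaled acoustic velocity).
train = {
    "bulk":    [(0.0, 0.0), (0.1, 0.1), (-0.1, 0.0)],
    "surface": [(5.0, 5.0), (5.1, 4.9), (4.9, 5.1)],
}
print(pnn_classify(train, (0.05, 0.02)))  # -> bulk
```

The smoothing parameter `sigma` controls how far each training pattern's influence reaches; in practice it is tuned on held-out data.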
Procedia PDF Downloads 432
1645 Piezo-Extracted Model Based Chloride/Carbonation Induced Corrosion Assessment in Reinforced Concrete Structures
Authors: Ashok Gupta, V. Talakokula, S. Bhalla
Abstract:
Rebar corrosion is one of the main causes of damage and premature failure of reinforced concrete (RC) structures worldwide, causing enormous costs for inspection, maintenance, restoration, and replacement. Therefore, early detection of corrosion and timely remedial action on the affected portion can facilitate optimum utilization of the structure, imparting longevity to it. The recent advent of the electro-mechanical impedance (EMI) technique using piezo sensors (PZT) for structural health monitoring (SHM) has provided a new paradigm for maintenance engineers to diagnose the onset of damage at the incipient stage itself. This paper presents a model-based approach for corrosion assessment based on the equivalent parameters extracted from the impedance spectrum of the concrete-rebar system using the EMI technique via PZT sensors. Keywords: impedance, electro-mechanical, stiffness, mass, damping, equivalent parameters
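The "equivalent parameters" in the keywords (stiffness, mass, damping) are commonly interpreted through a spring-mass-damper idealization of the monitored structure, whose mechanical impedance is Z(ω) = c + j(ωm − k/ω). A minimal sketch of that model, with illustrative values rather than the paper's extracted ones:

```python
import math

def mechanical_impedance(omega, k, m, c):
    """Mechanical impedance of a spring-mass-damper (SDOF) idealization:
    Z(w) = c + j*(w*m - k/w). Corrosion-induced loss of rebar stiffness or
    mass would show up as a drift of these equivalent parameters over time."""
    return complex(c, omega * m - k / omega)

k, m, c = 4.0e6, 1.0, 50.0      # illustrative stiffness, mass, damping
w0 = math.sqrt(k / m)           # resonance: the reactive (imaginary) part vanishes
print(mechanical_impedance(w0, k, m, c))  # (50+0j) at resonance
```

Tracking how the extracted k, m, and c drift between inspections is what lets the EMI technique flag incipient corrosion before visible damage appears.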
Procedia PDF Downloads 543
1644 Bipolar Impulse Noise Removal and Edge Preservation in Color Images and Video Using Improved Kuwahara Filter
Authors: Reji Thankachan, Varsha PS
Abstract:
Both image-capturing devices and human visual systems are nonlinear; hence, nonlinear filtering methods outperform their linear counterparts in many applications. Linear methods are unable to remove impulsive noise from images while preserving their edges and fine details. In addition, linear algorithms are unable to remove signal-dependent or multiplicative noise from images. This paper presents an approach to denoise and smoothen images and videos corrupted by bipolar impulse noise using an improved Kuwahara filter. It involves a two-stage algorithm: noise detection followed by filtering. Numerous simulations demonstrate that the proposed method outperforms the existing method by eliminating the painting-like flattening effect along the local feature direction while preserving edges, with improvements in PSNR and MSE. Keywords: bipolar impulse noise, Kuwahara, PSNR, MSE, PDF
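The classic Kuwahara filter (the baseline the paper improves on) replaces each pixel with the mean of whichever of four overlapping windows around it has the smallest variance; because a uniform window on one side of an edge always wins, the edge is never blurred across. A pure-Python sketch of that baseline on a grayscale step edge, assuming a simple interior-only implementation:

```python
def kuwahara(img, r=1):
    """Classic Kuwahara filter: for each interior pixel, examine four
    overlapping (r+1)x(r+1) quadrants that share the pixel, and output
    the mean of the quadrant with the least variance."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]          # borders left unchanged
    for y in range(r, h - r):
        for x in range(r, w - r):
            best = None                    # (variance, mean) of best quadrant
            for dy, dx in [(-r, -r), (-r, 0), (0, -r), (0, 0)]:
                vals = [img[y + dy + i][x + dx + j]
                        for i in range(r + 1) for j in range(r + 1)]
                m = sum(vals) / len(vals)
                v = sum((p - m) ** 2 for p in vals) / len(vals)
                if best is None or v < best[0]:
                    best = (v, m)
            out[y][x] = best[1]
    return out

# A vertical step edge: the least-variant quadrant is always uniform,
# so interior outputs stay exactly 0 or 1 -- the edge survives filtering.
img = [[0, 0, 1, 1]] * 4
print(kuwahara([row[:] for row in img]))
```

A mean or Gaussian filter on the same input would produce intermediate values along the edge; this selection-by-minimum-variance step is what the abstract's "edge preservation" refers to.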
Procedia PDF Downloads 498
1643 Efficient Fake News Detection Using Machine Learning and Deep Learning Approaches
Authors: Chaima Babi, Said Gadri
Abstract:
The volume of fake news continues to grow at a very fast rate; this requires implementing efficient techniques for testing the reliability of online content. To that end, the current research strives to illuminate the fake news problem using deep learning (DL) and machine learning (ML) approaches. We developed the traditional LSTM (long short-term memory) model and the bidirectional BiLSTM model. The process is to train on most of the samples of the dataset, validate the model on a subset called the test set to provide an unbiased evaluation of the final model fit on the training dataset, and then compute the classification accuracy and compare the results. For the programming stage, we used the TensorFlow and Keras libraries in Python, with support for graphical processing units (GPUs), which are widely used for developing deep learning applications. Keywords: machine learning, deep learning, natural language, fake news, Bi-LSTM, LSTM, multiclass classification
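The bidirectional idea is simply to run one LSTM left-to-right and a second right-to-left over the same sequence, so every position sees both past and future context. A toy, scalar-sized sketch of that mechanics in plain Python (the paper's actual models were built with Keras/TensorFlow; the weights below are arbitrary):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_run(seq, p):
    """Unroll a one-unit LSTM over a sequence; return the final hidden state."""
    h = c = 0.0
    for x in seq:
        i = sigmoid(p["wi"] * x + p["ui"] * h + p["bi"])    # input gate
        f = sigmoid(p["wf"] * x + p["uf"] * h + p["bf"])    # forget gate
        o = sigmoid(p["wo"] * x + p["uo"] * h + p["bo"])    # output gate
        g = math.tanh(p["wg"] * x + p["ug"] * h + p["bg"])  # candidate cell
        c = f * c + i * g
        h = o * math.tanh(c)
    return h

def bilstm(seq, p_fwd, p_bwd):
    """Bidirectional wrapper: one pass left-to-right, one right-to-left;
    a downstream classifier would consume the concatenated pair."""
    return (lstm_run(seq, p_fwd), lstm_run(list(reversed(seq)), p_bwd))

p = dict(wi=0.5, ui=0.1, bi=0.0, wf=0.5, uf=0.1, bf=1.0,
         wo=0.5, uo=0.1, bo=0.0, wg=0.5, ug=0.1, bg=0.0)
print(bilstm([1.0, -1.0, 2.0], p, p))
```

In a real fake-news model the scalar inputs would be word embeddings and the two final states (or full state sequences) would feed a dense softmax layer for classification.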
Procedia PDF Downloads 95
1642 Application All Digits Number Benford Law in Financial Statement
Authors: Teguh Sugiarto
Abstract:
Background: This research aims to explore whether there is fraud in a financial statement, using Benford's law, which states that the distribution of digits follows a predictable pattern in which lower digits occur more frequently. Research methods: This research uses the all-digits analysis of Benford's law. After obtaining the results of the all-digits analysis, the author distinguishes occurrences above and below a 5% difference threshold. Numbers whose frequencies differ from the expected distribution by more than 5% can be followed up to detect the onset of fraud in the financial statements. The findings: From the research that has been done, it can be concluded that, comparing the rates of occurrence of the digits appearing in the financial statements with the characteristics of Benford's law, no errors or fraud occurred in the financial statements of PT Medco Energy Tbk. Conclusions: The study concludes that Benford's law can serve as an indicator tool in detecting possible fraud in financial statements, as applied in the case study of PT Medco Energy Tbk for fiscal years 2000-2010. Keywords: Benford law, first digits, all digits number Benford law, financial statement
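For first digits, Benford's law predicts P(d) = log10(1 + 1/d), so "1" should lead about 30.1% of the time and "9" only about 4.6%. A sketch of that expectation plus the 5% screening rule described in the abstract (a simplified first-digit version; the paper's all-digits test extends the same idea):

```python
import math

def benford_p(d):
    """Expected first-digit probability under Benford's law."""
    return math.log10(1 + 1 / d)

def first_digit_freq(values):
    """Observed first-digit frequencies (plain decimal notation assumed)."""
    digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v]
    return {d: digits.count(d) / len(digits) for d in range(1, 10)}

def flag_digits(observed_freq, threshold=0.05):
    """Flag digits whose observed frequency deviates from Benford's
    expectation by more than `threshold` (the abstract's 5% cut-off)."""
    return [d for d in range(1, 10)
            if abs(observed_freq.get(d, 0.0) - benford_p(d)) > threshold]

# Leading digits of powers of 2 are a classic Benford-conforming sequence.
sample = [2 ** n for n in range(1, 101)]
print(flag_digits(first_digit_freq(sample)))  # -> [] (all within 5 pp)
```

Flagged digits do not prove fraud; as the abstract notes, they only mark line items that warrant follow-up.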
Procedia PDF Downloads 239
1641 An Early Attempt of Artificial Intelligence-Assisted Language Oral Practice and Assessment
Authors: Paul Lam, Kevin Wong, Chi Him Chan
Abstract:
Constant practicing and accurate, immediate feedback are the keys to improving students’ speaking skills. However, traditional oral examination often fails to provide such opportunities to students. The traditional, face-to-face oral assessment is often time-consuming: attending to the oral needs of one student often leads to the negligence of others. Hence, teachers can only provide limited opportunities and feedback to students. Moreover, students’ incentive to practice is also reduced by their anxiety and shyness in speaking the new language. A mobile app was developed that uses artificial intelligence (AI) to provide immediate feedback on students’ speaking performance, as an attempt to solve the above-mentioned problems. Firstly, it was thought that online exercises would greatly increase students’ learning opportunities, as they can now practice more without the need for a teacher’s presence. Secondly, the automatic feedback provided by the AI would enhance students’ motivation to practice, as there is an instant evaluation of their performance. Lastly, students should feel less anxious and shy compared to practicing orally in front of teachers. Technically, the program makes use of speech-to-text functions to generate feedback for students. To be specific, the software analyzes students’ oral input through a speech-to-text AI engine and then cleans up the results to the point that they can be compared with the targeted text. English teachers were invited to pilot the mobile app and asked for their feedback. Preliminary trials indicated that the approach has limitations: many of the users’ pronunciation errors were automatically corrected by the speech recognition function, as intelligent guessing is already integrated into many such systems. Nevertheless, teachers have confidence that the app can be further improved for accuracy. It has the potential to significantly improve oral drilling by giving students more chances to practice.
Moreover, they believe that the success of this mobile app confirms the potential to extend AI-assisted assessment to other language skills, such as writing, reading, and listening. Keywords: artificial intelligence, mobile learning, oral assessment, oral practice, speech-to-text function
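The final comparison step the abstract describes, scoring the cleaned speech-to-text output against the target sentence, can be sketched with the standard library's difflib (a simple stand-in for whatever scoring the app actually uses):

```python
import difflib
import string

def clean(text):
    """Normalize case and strip punctuation before comparison."""
    table = str.maketrans("", "", string.punctuation)
    return text.lower().translate(table).split()

def oral_score(recognized, target):
    """Similarity in [0, 1] between the speech-to-text result and the
    sentence the student was asked to read, compared word by word."""
    return difflib.SequenceMatcher(None, clean(recognized), clean(target)).ratio()

print(oral_score("The cat sat on the mat.", "the cat sat on the mat"))  # 1.0
print(oral_score("The cat sat on a hat.", "the cat sat on the mat"))    # partial match
```

Note how this also illustrates the limitation the pilot found: if the recognizer "helpfully" guesses the intended word, the score stays high even when the pronunciation was wrong.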
Procedia PDF Downloads 103
1640 Heavy Metal Concentrations in Sediments of Sta. Maria River, Laguna
Authors: Francis Angelo A. Sta. Ana
Abstract:
Heavy metal pollutants are a major environmental concern in built-up areas in the Philippines, as they cause negative effects on aquatic organisms and human health. Concentrations of the heavy metals chromium, mercury, lead, copper, arsenic, zinc, cadmium, and nickel were investigated in the Sta. Maria River in Laguna. A total of 16 sediment samples were collected from the river at four stations. Atomic absorption spectroscopy (AAS) was used for element detection. Statistical analysis using principal component analysis (PCA) found that copper is associated with chromium. Comparison against the Sediment Quality Guidelines (SQG) revealed that chromium has high toxicity, with values higher than the SQG probable effect level (PEL). Copper, nickel, and lead fall in the average toxicity range, while the other metals are below the PEL and the effects range low (ERL). Keywords: heavy metals, pollutants, sediment quality guidelines, atomic absorption spectroscopy
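The PCA association the abstract reports can be illustrated with a small sketch: build the covariance matrix of the element concentrations and extract the first principal component by power iteration; elements with large same-sign loadings co-vary. The station data below are made up for illustration, not the study's measurements:

```python
def mean(xs):
    return sum(xs) / len(xs)

def covariance_matrix(cols):
    """Sample covariance matrix of variables given as columns."""
    n = len(cols[0])
    centered = [[v - mean(c) for v in c] for c in cols]
    return [[sum(a * b for a, b in zip(ci, cj)) / (n - 1) for cj in centered]
            for ci in centered]

def first_pc(cov, iters=200):
    """Leading eigenvector by power iteration. Variables with large
    same-sign loadings (e.g. Cu with Cr) vary together across stations."""
    v = [1.0] * len(cov)
    for _ in range(iters):
        w = [sum(row[j] * v[j] for j in range(len(v))) for row in cov]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical concentrations (mg/kg) at four stations: Cr, Cu, Pb.
cr = [10.0, 20.0, 30.0, 40.0]
cu = [21.0, 39.0, 62.0, 78.0]   # tracks Cr closely
pb = [5.0, 5.2, 4.9, 5.1]       # roughly constant
pc1 = first_pc(covariance_matrix([cr, cu, pb]))
print(pc1)  # Cr and Cu load together; Pb contributes almost nothing
```

Shared loadings like this are often read as a shared source (e.g. a common industrial discharge), which is why PCA is a standard screening step in sediment studies.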
Procedia PDF Downloads 147
1639 Fast and Robust Long-term Tracking with Effective Searching Model
Authors: Thang V. Kieu, Long P. Nguyen
Abstract:
Kernelized correlation filter (KCF) based trackers have gained a lot of attention recently because of their accuracy and fast calculation speed. However, the algorithm is not robust in cases where the object is lost through a sudden change of direction, occlusion, or leaving the field of view. In order to improve KCF performance in long-term tracking, this paper proposes an anomaly detection method that warns of target loss by analyzing the response map of each frame, and a classification algorithm using random ferns as a reliable target re-locating mechanism. Tested on the Visual Tracker Benchmark and Visual Object Tracking datasets, the proposed algorithm achieved precision and success rates 2.92 and 2.61 times higher, respectively, than those of the original KCF algorithm. Moreover, the proposed tracker handles occlusion better than many state-of-the-art long-term tracking methods while running at 60 frames per second. Keywords: correlation filter, long-term tracking, random fern, real-time tracking
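One common health check on a correlation-filter response map, and a plausible basis for the target-loss warning described above, is the peak-to-sidelobe ratio (PSR): a confident detection has a sharp peak well above the surrounding response, while occlusion or target loss flattens the map. A hedged sketch (this is the generic statistic, not necessarily the paper's exact criterion):

```python
import math

def peak_to_sidelobe_ratio(resp, exclude=1):
    """PSR = (peak - mean(sidelobe)) / std(sidelobe), where the sidelobe is
    the response map minus a small window around the peak. A low PSR on a
    frame suggests the tracker's detection is unreliable (target lost)."""
    h, w = len(resp), len(resp[0])
    py, px = max(((y, x) for y in range(h) for x in range(w)),
                 key=lambda p: resp[p[0]][p[1]])
    side = [resp[y][x] for y in range(h) for x in range(w)
            if abs(y - py) > exclude or abs(x - px) > exclude]
    mu = sum(side) / len(side)
    sd = math.sqrt(sum((v - mu) ** 2 for v in side) / len(side)) or 1e-9
    return (resp[py][px] - mu) / sd

flat = [[0.1, 0.2, 0.1, 0.2, 0.1] for _ in range(5)]   # no clear peak
sharp = [row[:] for row in flat]
sharp[2][2] = 1.0                                      # confident detection
print(peak_to_sidelobe_ratio(sharp), ">", peak_to_sidelobe_ratio(flat))
```

A tracker can compute this per frame for almost no cost and trigger the re-detection stage (here, the random-fern classifier) whenever the ratio drops below a threshold.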
Procedia PDF Downloads 139
1638 Image Instance Segmentation Using Modified Mask R-CNN
Authors: Avatharam Ganivada, Krishna Shah
Abstract:
Mask R-CNN was recently introduced by the Facebook AI Research (FAIR) team and is mainly concerned with instance segmentation in images. Mask R-CNN is based on ResNet and a feature pyramid network (FPN), where a single dropout method is employed. This paper provides a modified Mask R-CNN that adds multiple dropout methods to the network. The proposed model also utilizes the concepts of ResNet and FPN to extract stage-wise network feature maps, wherein a top-down network path with lateral connections is used to obtain semantically strong features. The proposed model produces three outputs for each object in the image: a class label, bounding box coordinates, and an object mask. The performance of the proposed network is evaluated on the segmentation of every instance in images from the COCO and Cityscapes datasets. The proposed model achieves better performance than the state-of-the-art networks on these datasets. Keywords: instance segmentation, object detection, convolutional neural networks, deep learning, computer vision
Procedia PDF Downloads 73
1637 Synthesis of Liposomal Vesicles by a Novel Supercritical Fluid Process
Authors: Wen-Chyan Tsai, Syed S. H. Rizvi
Abstract:
Organic solvent residues are always associated with liposomes produced by traditional techniques such as the thin-film hydration and reverse-phase evaporation methods, which limits the applications of these vesicles in the pharmaceutical, food, and cosmetic industries. Our objective was to develop a novel and benign process of liposomal microencapsulation using supercritical carbon dioxide (SC-CO2) as the sole phospholipid-dissolving medium and a green substitute for organic solvents. This process consists of supercritical fluid extraction followed by rapid expansion through a nozzle and automatic cargo suction. Lecithin and cholesterol mixed in a 10:1 mass ratio were dissolved in SC-CO2 at 20 ± 0.5 MPa and 60 °C. After at least two hours of equilibration, the lecithin/cholesterol-laden SC-CO2 was passed through a 1000-micron nozzle and immediately mixed with the cargo solution to form liposomes. Liposomal microencapsulation was conducted at three pressures (8.27, 12.41, and 16.55 MPa), three temperatures (75, 83, and 90 °C), and two flow rates (0.25 ml/sec and 0.5 ml/sec). Liposome size, zeta potential, and encapsulation efficiency were characterized as functions of the operating parameters. The average liposome size varied from 400-500 nm to 1000-1200 nm as the pressure was increased from 8.27 to 16.55 MPa. At 12.41 MPa, 90 °C, and a 0.2 M glucose cargo loading rate of 0.25 ml per second, the highest encapsulation efficiency of 31.65% was achieved. Under a confocal laser scanning microscope, large unilamellar vesicles and multivesicular vesicles were observed to make up the majority of the liposomal emulsion. This new approach is a rapid and continuous process for the bulk production of liposomes using a green solvent.
Based on the results to date, it is feasible to apply this technique to encapsulate hydrophilic compounds inside the aqueous core as well as lipophilic compounds in the phospholipid bilayers of the liposomes for controlled release, solubility improvement and targeted therapy of bioactive compounds.Keywords: liposome, micro encapsulation, supercritical carbon dioxide, non-toxic process
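The encapsulation-efficiency figure quoted above is conventionally defined as the encapsulated fraction of the total cargo supplied; the abstract does not spell out its exact formula, so the standard definition is assumed here:

```python
def encapsulation_efficiency(encapsulated, total):
    """EE% = encapsulated cargo / total cargo supplied * 100
    (standard definition, assumed; the abstract does not state its formula)."""
    if total <= 0:
        raise ValueError("total cargo must be positive")
    return 100.0 * encapsulated / total

# Illustrative numbers only: the reported optimum of 31.65 % would correspond
# to ~0.0633 units of cargo encapsulated out of 0.2 units supplied.
print(encapsulation_efficiency(0.0633, 0.2))  # ~31.65
```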
Procedia PDF Downloads 431