Search results for: biomarker detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3662

2552 Scientific Recommender Systems Based on Neural Topic Model

Authors: Smail Boussaadi, Hassina Aliane

Abstract:

With the rapid growth of scientific literature, it is becoming increasingly challenging for researchers to keep up with the latest findings in their fields. Academic and professional networks play an essential role in connecting researchers and disseminating knowledge. To improve the user experience within these networks, we need effective article recommendation systems that provide personalized content. Current recommendation systems often rely on collaborative filtering or content-based techniques. However, these methods have limitations, such as the cold start problem and difficulty in capturing semantic relationships between articles. To overcome these challenges, we propose a new approach that combines BERTopic, a state-of-the-art topic modeling technique built on BERT (Bidirectional Encoder Representations from Transformers), with community detection algorithms in an academic and professional network. Experiments confirm our performance expectations by showing good relevance and objectivity in the results.
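The core of any content-based recommender of this kind is ranking articles by the similarity of their latent topic vectors to a user profile. The sketch below is a minimal, hypothetical illustration (the titles and topic distributions are invented, and a real BERTopic pipeline would produce the vectors), not the authors' implementation:

```python
import math

def cosine(u, v):
    """Cosine similarity between two topic-distribution vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user_profile, articles, k=2):
    """Rank articles by similarity of their topic vectors to a user profile."""
    ranked = sorted(articles.items(),
                    key=lambda item: cosine(user_profile, item[1]),
                    reverse=True)
    return [title for title, _ in ranked[:k]]

# Toy topic distributions over 3 latent topics (hypothetical values).
articles = {
    "A: deep learning survey": [0.8, 0.1, 0.1],
    "B: graph communities":    [0.1, 0.8, 0.1],
    "C: neural topic models":  [0.6, 0.3, 0.1],
}
user = [0.7, 0.2, 0.1]  # interests inferred from reading history
top = recommend(user, articles)
```

In the paper's setting, the user profile would additionally be constrained by the community the researcher belongs to, which this sketch omits.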

Keywords: scientific articles, community detection, academic social network, recommender systems, neural topic model

Procedia PDF Downloads 100
2551 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring

Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti

Abstract:

Autonomous structural health monitoring (SHM) of many structures and bridges has become a topic of paramount importance for maintenance purposes and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and detection of anomalies in a bridge from vibrational data, and compares different feature extraction schemes to increase accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by density-based time-domain filtering (tracking). The fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (intelligent multiplexer), which tries to estimate the most reliable frequencies based on the evaluation of some statistical features (i.e., mean value, variance, kurtosis), and feature extraction (auto-associative neural network (ANN)), which combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classifier (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with both normal and anomalous ones. In particular, a new anomaly detection strategy is proposed, namely one-class classifier neural network two (OCCNN2), which exploits the classification capability of standard classifiers in an anomaly detection problem, finding the standard class (the boundary of the feature space in normal operating conditions) through a two-step approach: coarse and fine boundary estimation. The coarse estimation uses classic OCC techniques, while the fine estimation is performed through a trained feedforward neural network (NN) that exploits the boundaries estimated in the coarse step. The detection algorithms are then compared with known methods based on principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN). In many cases, the proposed solution improves on the standard OCC algorithms in terms of F1 score and accuracy. In particular, by evaluating the correct features, the anomaly can be detected with an accuracy and an F1 score greater than 96% with the proposed method.
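The F1 score and accuracy used above to compare detectors are standard quantities computed from the confusion counts of the binary decision (anomaly vs. normal). A small self-contained helper, with invented labels purely for illustration:

```python
def confusion_counts(y_true, y_pred):
    """Count TP/FP/FN/TN for binary labels (1 = anomaly, 0 = normal)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def f1_and_accuracy(y_true, y_pred):
    """F1 = harmonic mean of precision and recall; accuracy = correct / total."""
    tp, fp, fn, tn = confusion_counts(y_true, y_pred)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = (tp + tn) / len(y_true)
    return f1, accuracy

# Hypothetical detector output on 8 test points.
y_true = [0, 0, 0, 0, 1, 1, 1, 1]
y_pred = [0, 0, 0, 1, 1, 1, 1, 0]
f1, acc = f1_and_accuracy(y_true, y_pred)
```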

Keywords: anomaly detection, frequencies selection, modal analysis, neural network, sensor network, structural health monitoring, vibration measurement

Procedia PDF Downloads 124
2550 Exploring the Role of Building Information Modeling for Delivering Successful Construction Projects

Authors: Muhammad Abu Bakar Tariq

Abstract:

The construction industry plays a crucial role in the progress of societies and economies. Furthermore, construction projects have social as well as economic implications; thus, their success or failure has wider impacts. However, the industry is lagging behind in terms of efficiency and productivity. Building Information Modeling (BIM) is recognized as a revolutionary development in the Architecture, Engineering and Construction (AEC) industry. Numerous interest groups around the world provide definitions of BIM, with proponents describing its advantages and opponents identifying challenges and barriers to its adoption. This research aims to determine what BIM actually is, along with its potential role in delivering successful construction projects. The methodology is a critical analysis of secondary data sources, i.e., information in the public domain, including peer-reviewed journal articles, industry and government reports, conference papers, books, and case studies. It is found that clash detection and visualization are two major advantages of BIM. Clash detection identifies clashes among structural, architectural and MEP designs before construction actually commences, which saves time as well as cost and ensures quality during the execution phase of a project. Visualization is a powerful tool that facilitates rapid decision-making in addition to communication and coordination among stakeholders throughout a project's life cycle. By eliminating inconsistencies that consume time and cost during actual construction, and by improving collaboration among stakeholders throughout the project's life cycle, BIM can play a positive role in achieving the efficiency and productivity that deliver successful construction projects.

Keywords: building information modeling, clash detection, construction project success, visualization

Procedia PDF Downloads 261
2549 Concept Drifts Detection and Localisation in Process Mining

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, state-of-the-art techniques in process mining assume that the operational process is a static (stationary) entity. This is often not the case due to the possibility of a phenomenon called concept drift. During execution, the process can experience concept drift and can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider when addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process, using features extracted by processing the traces in the process log. Our experimental results are promising in the direction of efficiently detecting and localizing concept drift in the process mining research discipline.
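The paper's method compares features extracted from windows of traces; the precise features and test are its contribution. As a generic stand-in, a sudden drift in a single numeric trace feature can be flagged by comparing the means of two adjacent sliding windows, a minimal heuristic sketch (window size, threshold, and the synthetic stream are all assumptions):

```python
import statistics

def detect_sudden_drift(series, window=50, threshold=3.0):
    """Flag positions where the mean of a trace feature shifts abruptly
    between two adjacent windows (a minimal sudden-drift heuristic)."""
    drifts = []
    for i in range(window, len(series) - window):
        left = series[i - window:i]
        right = series[i:i + window]
        pooled_sd = statistics.pstdev(left + right) or 1e-9
        z = abs(statistics.mean(right) - statistics.mean(left)) / pooled_sd
        if z > threshold:
            drifts.append(i)
    return drifts

# Synthetic feature stream: the process changes abruptly at index 100.
stream = [1.0] * 100 + [5.0] * 100
hits = detect_sudden_drift(stream, window=50, threshold=1.5)
```

Localization falls out for free: the flagged indices cluster around the change point.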

Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining

Procedia PDF Downloads 348
2548 Novel p22-Monoclonal Antibody Based Blocking ELISA for the Detection of African Swine Fever Virus Antibodies in Serum

Authors: Ghebremedhin Tsegay, Weldu Tesfagaber, Yuanmao Zhu, Xijun He, Wan Wang, Zhenjiang Zhang, Encheng Sun, Jinya Zhang, Yuntao Guan, Fang Li, Renqiang Liu, Zhigao Bu, Dongming Zhao*

Abstract:

African swine fever (ASF) is a highly infectious viral disease of pigs, resulting in significant economic loss worldwide. As there are no approved vaccines or treatments, the control of ASF entirely depends on early diagnosis and culling of infected pigs. Thus, highly specific and sensitive diagnostic assays are required for accurate and early diagnosis of ASF virus (ASFV). Currently, only a few recombinant proteins have been tested and validated for use as reagents in ASF diagnostic assays. The most promising ones for ASFV antibody detection are p72, p30, p54, and pp62. So far, three ELISA kits based on these recombinant proteins have been commercialized. Due to the complex nature of the virus and the various forms of the disease, robust serodiagnostic assays are still required. The ASFV p22 protein, encoded by the KP177R gene, is located in the inner membrane of the viral particle and appears transiently in the plasma membrane early after virus infection. The p22 protein interacts with numerous cellular proteins involved in the processes of phagocytosis and endocytosis through different cellular pathways. However, p22 does not seem to be involved in virus replication or swine pathogenicity. In this study, an E. coli-expressed recombinant p22 protein was used to generate a monoclonal antibody (mAb), and its potential use for the development of a blocking ELISA (bELISA) was evaluated. A total of 806 pig serum samples were tested to evaluate the bELISA. According to the ROC (receiver operating characteristic) analysis, a sensitivity of 100% and a specificity of 98.10% were recorded when the PI cut-off value was set at 47%. The novel assay was able to detect the antibodies as early as 9 days post infection. Finally, a highly sensitive, specific and rapid novel p22-mAb-based bELISA assay was developed and optimized for detection of antibodies against genotype I and II ASFVs. It is a promising candidate for early and accurate detection of the antibodies and is highly expected to have a valuable role in the containment and prevention of ASF.
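The sensitivity/specificity pair reported above is what a ROC analysis evaluates at each candidate percent-inhibition (PI) cutoff. A minimal sketch of that computation, using invented PI readings rather than the paper's 806-sample data set:

```python
def sensitivity_specificity(pi_values, labels, cutoff):
    """Sensitivity and specificity of a blocking ELISA at a given
    percent-inhibition (PI) cutoff; a sample is called positive when
    its PI is at or above the cutoff. labels: 1 = infected, 0 = negative."""
    tp = sum(1 for pi, y in zip(pi_values, labels) if y == 1 and pi >= cutoff)
    fn = sum(1 for pi, y in zip(pi_values, labels) if y == 1 and pi < cutoff)
    tn = sum(1 for pi, y in zip(pi_values, labels) if y == 0 and pi < cutoff)
    fp = sum(1 for pi, y in zip(pi_values, labels) if y == 0 and pi >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical PI readings for truly infected (1) and negative (0) sera.
pi = [80, 65, 55, 50, 40, 30, 20, 10]
y  = [ 1,  1,  1,  1,  0,  0,  0,  0]
sens, spec = sensitivity_specificity(pi, y, cutoff=47)
```

Sweeping the cutoff and picking the point that best balances the two rates is how the 47% value in the abstract would be selected.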

Keywords: ASFV, blocking ELISA, diagnosis, monoclonal antibodies, sensitivity, specificity

Procedia PDF Downloads 77
2547 Electrochemical Biosensor for Rutin Detection with Multiwall Carbon Nanotubes and Cerium Dioxide Nanoparticles

Authors: Stephen Rathinaraj Benjamin, Flavio Colmati Junior, Maria Izabel Florindo Guedes, Rosa Amalia Fireman Dutra

Abstract:

A new enzymatic electrochemical biosensor based on multiwall carbon nanotubes and cerium oxide nanoparticles for the detection of rutin has been developed. The cerium oxide nanoparticle/HRP/multiwall carbon nanotube/carbon paste electrode (HRP/CeO2/MWCNTs/CPE) was prepared by successive addition of MWCNTs and HRP on the CPE, followed by mixing with cerium oxide nanoparticles. The surface physical characteristics of the modified electrode and the electrochemical properties of the composite were investigated by scanning electron microscopy (SEM), transmission electron microscopy (TEM), cyclic voltammetry (CV), differential pulse voltammetry (DPV) and square wave voltammetry (SWV). The HRP/CeO2/MWCNTs/CPE showed good selectivity, stability and reproducibility, and was further applied to detect rutin in tablet and capsule samples with satisfactory results.

Keywords: cerium dioxide nanoparticles, horseradish peroxidase, multiwall carbon nanotubes, rutin

Procedia PDF Downloads 395
2546 Graphene-Based Nanocomposites for Glucose and Ethanol Enzymatic Biosensor Fabrication

Authors: Tesfaye Alamirew, Delele Worku, Solomon W. Fanta, Nigus Gabbiye

Abstract:

Recently, graphene-based nanocomposites have become an emerging research area for the fabrication of enzymatic biosensors due to their large surface area, conductivity and biocompatibility. This review summarizes recent research reports on graphene-based nanocomposites for the fabrication of glucose and ethanol enzymatic biosensors. The newly fabricated enzyme-free, microwave-treated nitrogen-doped graphene (MN-d-GR) provided the highest sensitivity towards glucose, and the GCE/rGO/AuNPs/ADH composite provided by far the highest sensitivity towards ethanol, compared to other reported graphene-based nanocomposites. The MWCNT/GO/GOx and GCE/ErGO/PTH/ADH nanocomposites also exhibited wide linear ranges for glucose and ethanol detection, respectively. Generally, graphene-based nanocomposite enzymatic biosensors have fast direct electron transfer rates, high sensitivity and wide linear detection ranges for glucose and ethanol sensing.

Keywords: glucose, ethanol, enzymatic biosensor, graphene, nanocomposite

Procedia PDF Downloads 126
2545 Automatic Censoring in K-Distribution for Multiple Targets Situations

Authors: Naime Boudemagh, Zoheir Hammoudi

Abstract:

The estimation of the K-distribution parameters is an essential part of radar detection. In fact, the presence of interfering targets in reference cells causes a decrease in detection performance; in such situations, the estimates of the shape and scale parameters are far from the actual values. In order to avoid interfering targets, we propose an Automatic Censoring (AC) algorithm for radar interfering targets in K-distributed clutter. The censoring technique used in this work offers good discrimination between homogeneous and non-homogeneous environments. The homogeneous population is then used to estimate the unknown parameters by the classical Method of Moments (MOM). The AC algorithm needs no prior information about the clutter parameters, nor does it require the number or the positions of the interfering targets. The accuracy of the parameter estimates obtained by this algorithm is validated against various actual values of the shape parameter using Monte Carlo simulations, which show that the probability of censoring in multiple-target situations is in good agreement.
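To make the MOM step concrete: modeling K-distributed intensity as a gamma texture multiplied by exponential speckle gives the moment relation E[I²]/(2·E[I]²) = 1 + 1/ν, which can be inverted for the shape parameter ν. The sketch below uses this standard relation on simulated homogeneous clutter; it is an illustration of the estimator, not the paper's censoring algorithm:

```python
import random

def mom_shape_estimate(intensity):
    """Method-of-moments estimate of the K-distribution shape parameter
    from intensity samples, using E[I^2] / (2 E[I]^2) = 1 + 1/nu."""
    n = len(intensity)
    m1 = sum(intensity) / n
    m2 = sum(x * x for x in intensity) / n
    ratio = m2 / (2.0 * m1 * m1) - 1.0
    return 1.0 / ratio if ratio > 0 else float('inf')

# Simulate K-distributed intensity: gamma texture times exponential speckle.
random.seed(0)
nu, mean_power = 2.0, 1.0
samples = [random.gammavariate(nu, mean_power / nu) * random.expovariate(1.0)
           for _ in range(100000)]
nu_hat = mom_shape_estimate(samples)
```

In the AC algorithm, this estimator would only be applied after the censoring step has removed the cells suspected of containing interfering targets.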

Keywords: parameters estimation, method of moments, automatic censoring, K distribution

Procedia PDF Downloads 373
2544 Detecting Heartbeat Architectural Tactic in Source Code Using Program Analysis

Authors: Ananta Kumar Das, Sujit Kumar Chakrabarti

Abstract:

Architectural tactics such as heartbeat, ping-echo, encapsulate, and encrypt data are techniques used to achieve quality attributes of a system. Detecting architectural tactics has several benefits: it can aid system comprehension (e.g., of legacy systems) and the estimation of quality attributes such as safety, security, and maintainability. Architectural tactics are typically spread over the source code and are implicit; for large codebases, manual detection is often not feasible. Therefore, there is a need for automated methods of detecting architectural tactics. This paper presents a formalization of the heartbeat architectural tactic and a program analysis approach to detect this tactic in source code. The proposed method is evaluated on a set of Java applications. The outcome of the experiment strongly suggests that the method compares well with a manual approach in terms of sensitivity and specificity, and far surpasses a manual exercise in terms of scalability.
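The paper analyzes Java with AST and alias analysis; for a self-contained illustration, the same idea can be sketched over Python source using the stdlib `ast` module: a heartbeat typically shows up as a loop that both pauses and emits a liveness message. The emitter names below are hypothetical, and a real detector would need alias analysis rather than name matching:

```python
import ast

HEARTBEAT_CALLS = {"send_heartbeat", "ping"}   # hypothetical emitter names

def has_heartbeat(source):
    """Heuristic heartbeat-tactic detector: look for a loop whose body
    both pauses (a sleep call) and emits a liveness message."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, (ast.While, ast.For)):
            names = set()
            for call in ast.walk(node):
                if isinstance(call, ast.Call):
                    f = call.func
                    if isinstance(f, ast.Attribute):
                        names.add(f.attr)
                    elif isinstance(f, ast.Name):
                        names.add(f.id)
            if "sleep" in names and names & HEARTBEAT_CALLS:
                return True
    return False

sample = """
import time
def monitor(ch):
    while True:
        ch.send_heartbeat()
        time.sleep(5)
"""
found = has_heartbeat(sample)
```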

Keywords: software architecture, architectural tactics, detecting architectural tactics, program analysis, AST, alias analysis

Procedia PDF Downloads 161
2543 Pharmacokinetic Monitoring of Glimepiride and Ilaprazole in Rat Plasma by High Performance Liquid Chromatography with Diode Array Detection

Authors: Anil P. Dewani, Alok S. Tripathi, Anil V. Chandewar

Abstract:

The present manuscript reports the development and validation of a quantitative high performance liquid chromatography method for the pharmacokinetic evaluation of Glimepiride (GLM) and Ilaprazole (ILA) in rat plasma. The plasma samples were processed by solid phase extraction (SPE). The analytes were resolved on a Phenomenex C18 column (4.6 mm × 250 mm; 5 µm particle size) using an isocratic elution mode comprising methanol:water (80:20 % v/v), with the pH of the water adjusted to 3 using formic acid; the total run time was 10 min at 225 nm as the common wavelength, and the flow rate throughout was 1 mL/min. The method was validated over the concentration range from 10 to 600 ng/mL for GLM and ILA in rat plasma, with metformin (MET) used as the internal standard. Validation data demonstrated the method to be selective, sensitive, accurate and precise. The limits of detection were 1.54 and 4.08, and the limits of quantification were 5.15 and 13.62, for GLM and ILA respectively; the method demonstrated excellent linearity, with correlation coefficients (r²) of 0.999. The intra- and inter-day precision (RSD%) values were < 2.0% for both ILA and GLM. The method was successfully applied in pharmacokinetic studies following oral administration in rats.
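Limits of detection and quantification of this kind are commonly derived from the calibration curve via the ICH formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, with S the slope and σ the residual standard deviation of the regression. A minimal sketch with invented peak areas (not the paper's data):

```python
def calibration_lod_loq(conc, response):
    """LOD and LOQ from a linear calibration curve using
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S (ICH approach)."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(response) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    slope = sxy / sxx
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept) for x, y in zip(conc, response)]
    sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5  # residual SD
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical peak areas over the 10-600 ng/mL calibration range.
conc = [10, 50, 100, 200, 400, 600]
area = [21, 99, 202, 405, 797, 1201]
lod, loq = calibration_lod_loq(conc, area)
```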

Keywords: pharmacokinetics, glimepiride, ilaprazole, HPLC, SPE

Procedia PDF Downloads 369
2542 Visual Detection of Escherichia coli (E. coli) through Formation of Beads Aggregation in Capillary Tube by Rolling Circle Amplification

Authors: Bo Ram Choi, Ji Su Kim, Juyeon Cho, Hyukjin Lee

Abstract:

Food contaminated by bacteria such as E. coli causes food poisoning, which affects many patients worldwide annually. We have introduced an application of rolling circle amplification (RCA) as a versatile biosensor and developed a diagnostic platform composed of a capillary tube and microbeads for rapid and easy detection of Escherichia coli (E. coli). When specific mRNA of E. coli is extracted from the cell lysate, rolling circle amplification (RCA) of a DNA template can be achieved and visualized by bead aggregation in the capillary tube. In contrast, if there is no bacterial pathogen in the sample, no bead aggregation can be seen. This assay makes it possible to detect the target gene visually without specialized equipment. It may lead to the development of a genetic kit for point-of-care testing (POCT) that can detect a target gene using microbeads.

Keywords: rolling circle amplification (RCA), Escherichia coli (E. coli), point of care testing (POCT), beads aggregation, capillary tube

Procedia PDF Downloads 367
2541 Unsupervised Detection of Burned Area from Remote Sensing Images Using Spatial Correlation and Fuzzy Clustering

Authors: Tauqir A. Moughal, Fusheng Yu, Abeer Mazher

Abstract:

Land-cover and land-use change information is important because of its practical uses in various applications, including deforestation, damage assessment, disaster monitoring, urban expansion, planning, and land management. Therefore, developing change detection methods for remote sensing images is an important ongoing research agenda. However, detection of change through optical remote sensing images is not a trivial task due to many factors, including the vagueness of the boundaries between changed and unchanged regions and the spatial dependence of pixels on their neighborhood. In this paper, we propose a binary change detection technique for bi-temporal optical remote sensing images. As in most optical remote sensing images, the transition between the two clusters (change and no change) is overlapping, and the existing methods are incapable of providing accurate cluster boundaries. In this regard, a methodology is proposed which uses fuzzy c-means clustering to tackle the vagueness between the changed and unchanged classes by formulating soft boundaries between them. Furthermore, in order to exploit the neighborhood information of the pixels, input patterns are generated for each pixel from the bi-temporal images using 3×3, 5×5 and 7×7 windows. The between-image and within-image spatial dependence of pixels on their neighborhood is quantified using the Pearson product-moment correlation and Moran's I statistic, respectively. The proposed technique consists of two phases. First, between-image and within-image spatial correlation is calculated to utilize the information that pixels at different locations may not be independent. Second, the fuzzy c-means technique is used to produce two clusters from the input features, not only taking care of the vagueness between the changed and unchanged classes but also exploiting the spatial correlation of the pixels. To show the effectiveness of the proposed technique, experiments are conducted on multispectral, bi-temporal remote sensing images. A subset (2100×1212 pixels) of a pan-sharpened, bi-temporal Landsat 5 Thematic Mapper optical image of Los Angeles, California, is used in this study; it covers a long forest fire that continued from July until October 2009. The early and later forest fire optical remote sensing images were acquired on July 5, 2009 and October 25, 2009, respectively. The proposed technique is used to detect the fire (which causes change on the earth's surface) and compared with the existing K-means clustering technique. Experimental results show that the proposed technique performs better than the existing one. The proposed technique is easily extendable to optical hyperspectral images and is suitable for many practical applications.
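The within-image dependence statistic used above, Moran's I, measures whether neighboring pixel values are more similar (I > 0) or more dissimilar (I < 0) than expected by chance. A minimal implementation on a 2-D grid with rook (4-neighbour) contiguity, using toy grids rather than the Landsat data:

```python
def morans_i(grid):
    """Moran's I spatial autocorrelation on a 2-D grid with rook
    (4-neighbour) contiguity and binary weights."""
    rows, cols = len(grid), len(grid[0])
    vals = [v for row in grid for v in row]
    n = len(vals)
    mean = sum(vals) / n
    num, w_sum = 0.0, 0
    for i in range(rows):
        for j in range(cols):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    num += (grid[i][j] - mean) * (grid[ni][nj] - mean)
                    w_sum += 1
    den = sum((v - mean) ** 2 for v in vals)
    return (n / w_sum) * (num / den)

# A checkerboard is maximally dissimilar to its neighbours: I < 0.
checker = [[(i + j) % 2 for j in range(4)] for i in range(4)]
# Two homogeneous halves are strongly clustered: I > 0.
halves = [[0, 0, 1, 1] for _ in range(4)]
```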

Keywords: burned area, change detection, correlation, fuzzy clustering, optical remote sensing

Procedia PDF Downloads 170
2540 Literature Review: Adversarial Machine Learning Defense in Malware Detection

Authors: Leidy M. Aldana, Jorge E. Camargo

Abstract:

Adversarial machine learning has gained importance in recent years along with cybersecurity, especially in malware detection, which has affected many entities and people. This paper presents a literature review of defense methods created to prevent adversarial machine learning attacks. First, it introduces the context and describes some terms; in the results section, some of the attacks are described, with a focus on detecting adversarial examples before they reach the machine learning algorithm, and the other existing categories of defense are shown. A five-step method is proposed in the method section to structure the literature review; in addition, this paper summarizes the contributions in this research field over the last seven years in order to identify research directions in this area. Regarding the findings, the defense category with the fewest remaining challenges is the detection of adversarial examples, making it a viable research route together with an adaptive approach to attack and defense.

Keywords: malware, adversarial, machine learning, defense, attack

Procedia PDF Downloads 72
2539 Relation between Electrical Properties and Application of Chitosan Nanocomposites

Authors: Evgen Prokhorov, Gabriel Luna-Barcenas

Abstract:

The polysaccharide chitosan (CS) is an attractive biopolymer for the stabilization of several nanoparticles in acidic aqueous media, due in part to the presence of abundant primary NH2 and OH groups, which may lead to steric or chemical stabilization. Applications of most CS nanocomposites are based upon the interaction of high-surface-area nanoparticles (NPs) with different substances; therefore, agglomeration of NPs decreases the effective surface area, which may decrease the efficiency of the nanocomposites. The aim of this work is to measure electrical conductivity phenomena in the nanocomposites, which allows one to formulate optimal concentrations of conductive NPs in CS-based nanocomposites. Additionally, by comparing the efficiency of such nanocomposites, one can guide applications in the biomedical field (antibacterial properties and tissue regeneration) and the sensor field (detection of copper and nitrate ions in aqueous solutions). It was shown that the best antibacterial (CS-AgNPs, CS-AgNPs-carbon nanotubes) and wound healing (CS-AuNPs) properties are observed in nanocomposites with concentrations of NPs near the percolation threshold. Likewise, the best detection limits in potentiometric and impedimetric sensors for the detection of copper ions (using a CS-AuNPs membrane) and nitrate ions (using a CS-clay membrane) in aqueous solutions have been observed for membranes with concentrations of NPs near the percolation threshold. It is well known that at the percolation concentration of NPs, an abrupt increase in conductivity is observed due to the presence of physical contacts between NPs; above this concentration, agglomeration of NPs takes place, such that a decrease in the effective surface area and performance of the nanocomposite appears. The relationship obtained between the electrical percolation threshold and the performance of polymer nanocomposites with conductive NPs is important for the design and optimization of polymer-based nanocomposites for different applications.
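The abrupt conductivity onset described above is the classic percolation transition. As an idealized stand-in for NP networks in a polymer matrix (not a model of the chitosan system itself), a 2-D site-percolation simulation shows how the probability of a spanning conductive path jumps sharply with filler concentration:

```python
import random
from collections import deque

def percolates(grid):
    """True if a top-to-bottom path of occupied sites exists
    (4-connectivity) in a 2-D site-percolation grid."""
    n = len(grid)
    seen = set()
    q = deque((0, j) for j in range(n) if grid[0][j])
    seen.update(q)
    while q:
        i, j = q.popleft()
        if i == n - 1:
            return True
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                q.append((ni, nj))
    return False

def percolation_probability(p, n=20, trials=100, seed=1):
    """Fraction of random grids (site occupied with probability p)
    that contain a spanning cluster."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < p for _ in range(n)] for _ in range(n)]
        hits += percolates(grid)
    return hits / trials

# Spanning probability rises sharply near the 2-D site threshold (~0.593).
low, high = percolation_probability(0.35), percolation_probability(0.80)
```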

Keywords: chitosan, conductivity nanoparticles, percolation threshold, polymer nanocomposites

Procedia PDF Downloads 212
2538 Detection of Curvilinear Structure via Recursive Anisotropic Diffusion

Authors: Sardorbek Numonov, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Dongeun Choi, Byung-Woo Hong

Abstract:

The detection of curvilinear structures often plays an important role in the analysis of images. In particular, it is considered a crucial step in the diagnosis of chronic respiratory diseases to localize the fissures in chest CT imagery, where the lung is divided into five lobes by fissures that are characterized by linear features in appearance. However, the characteristic linear features of the fissures are often subtle due to the high intensity variability, pathological deformation or image noise involved in the imaging procedure, which leads to uncertainty in the quantification of anatomical or functional properties of the lung. Thus, it is desirable to enhance the linear features present in chest CT images so that the distinctiveness in the delineation of the lobes is improved. We propose a recursive diffusion process that prefers coherent features based on the analysis of the structure tensor in an anisotropic manner. The local image features associated with certain scales and directions can be characterized by the eigenanalysis of the structure tensor, which is often regularized via isotropic diffusion filters. However, the isotropic diffusion filters involved in the computation of the structure tensor generally blur geometrically significant structure of the features, degrading their characteristic power in the feature space. Thus, it is necessary to take into consideration the local structure of the features in scale and direction when computing the structure tensor. We apply an anisotropic diffusion, in consideration of the scale and direction of the features, in the computation of the structure tensor; the eigenanalysis of this tensor provides the geometrical structure of the features and determines the shape of the anisotropic diffusion kernel. The recursive application of the anisotropic diffusion, with a kernel whose shape is derived from the structure tensor, leads to an anisotropic scale-space in which the geometrical features are preserved via the eigenanalysis of the structure tensor computed from the diffused image. The recursive interaction between the anisotropic diffusion based on the geometry-driven kernels and the computation of the structure tensor that determines the shape of the diffusion kernels yields a scale-space where the geometrical properties of the image structure are effectively characterized. We apply our recursive anisotropic diffusion algorithm to the detection of curvilinear structures in chest CT imagery, where the fissures present curvilinear features and define the boundaries of the lobes. It is shown that our algorithm yields precise detection of the fissures while overcoming the subtlety of the characteristic linear features. The quantitative evaluation demonstrates the robustness and effectiveness of the proposed algorithm for the detection of fissures in chest CT in terms of false positive and true positive measures. The receiver operating characteristic curves indicate the potential of our algorithm as a segmentation tool in the clinical environment. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
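The building block of the method is the eigenanalysis of the 2-D structure tensor J = [[Ix², IxIy], [IxIy, Iy²]]: the eigenvalue gap measures how coherent (line-like) the local structure is, and the eigenvectors give its orientation. A minimal single-pixel sketch with central differences, omitting the smoothing/diffusion step that the paper makes anisotropic:

```python
import math

def structure_tensor_orientation(img, i, j):
    """Eigenanalysis of the (unsmoothed) 2-D structure tensor at pixel
    (i, j): returns (coherence, dominant-gradient angle in radians)."""
    ix = (img[i][j + 1] - img[i][j - 1]) / 2.0   # central difference in x
    iy = (img[i + 1][j] - img[i - 1][j]) / 2.0   # central difference in y
    jxx, jxy, jyy = ix * ix, ix * iy, iy * iy
    # Eigenvalues of the symmetric 2x2 matrix [[jxx, jxy], [jxy, jyy]].
    tr, det = jxx + jyy, jxx * jyy - jxy * jxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc
    coherence = (lam1 - lam2) / (lam1 + lam2) if lam1 + lam2 else 0.0
    angle = 0.5 * math.atan2(2 * jxy, jxx - jyy)
    return coherence, angle

# A vertical edge: intensity varies only along x, so the dominant
# eigenvector of the tensor points along x (angle ~ 0).
img = [[float(j >= 2) for j in range(5)] for _ in range(5)]
coh, ang = structure_tensor_orientation(img, 2, 2)
```

In the paper, the tensor entries are additionally averaged over a neighbourhood by the geometry-driven diffusion before the eigenanalysis, which is what makes the coherence estimate meaningful for noisy CT data.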

Keywords: anisotropic diffusion, chest CT imagery, chronic respiratory disease, curvilinear structure, fissure detection, structure tensor

Procedia PDF Downloads 233
2537 Detection of Extrusion Blow Molding Defects by Airflow Analysis

Authors: Eva Savy, Anthony Ruiz

Abstract:

In extrusion blow molding, there is great variability in product quality due to the sensitivity of the machine settings. These variations lead to unnecessary rejects and loss of time, yet production control is a major challenge for companies in this sector seeking to remain competitive within their market. Current quality control methods apply only to finished products (vision control, leak test, etc.). It has been shown that material melt temperature, blowing pressure, and ambient temperature have a significant impact on the variability of product quality. Since blowing is a key step in the process, we study this parameter in this paper. The objective is to determine whether airflow analysis allows the identification of quality problems before the full completion of the manufacturing process. We conducted tests to determine if it was possible to identify a leakage defect and an obstruction defect, two common defects in these products. The results showed that it was possible to identify a leakage defect by airflow analysis.

Keywords: extrusion blow molding, signal, sensor, defects, detection

Procedia PDF Downloads 154
2536 Spectral Anomaly Detection and Clustering in Radiological Search

Authors: Thomas L. McCullough, John D. Hague, Marylesa M. Howard, Matthew K. Kiser, Michael A. Mazur, Lance K. McLean, Johanna L. Turk

Abstract:

Radiological search and mapping depends on the successful recognition of anomalies in large data sets which contain varied and dynamic backgrounds. We present a new algorithmic approach for real-time anomaly detection which is resistant to common detector imperfections, avoids the limitations of a source template library and provides immediate, and easily interpretable, user feedback. This algorithm is based on a continuous wavelet transform for variance reduction and evaluates the deviation between a foreground measurement and a local background expectation using methods from linear algebra. We also present a technique for recognizing and visualizing spectrally similar clusters of data. This technique uses Laplacian Eigenmap Manifold Learning to perform dimensional reduction which preserves the geometric "closeness" of the data while maintaining sensitivity to outlying data. We illustrate the utility of both techniques on real-world data sets.
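The core decision in the algorithm above is scoring the deviation of a foreground measurement from a local background expectation. As a simplified stand-in for the paper's wavelet-plus-linear-algebra formulation, the sketch below scores a spectrum channel-wise in units of the background's standard deviation, with invented count data:

```python
import statistics

def spectral_deviation(foreground, background_history):
    """Maximum channel-wise deviation of a foreground spectrum from a
    local background expectation, in units of the background's
    standard deviation (a simplified anomaly score)."""
    scores = []
    for ch, fg in enumerate(foreground):
        hist = [spec[ch] for spec in background_history]
        mu = statistics.mean(hist)
        sd = statistics.pstdev(hist) or 1.0
        scores.append((fg - mu) / sd)
    return max(scores)

# Hypothetical 4-channel count spectra: quiet background, then a spike.
background = [[100, 80, 60, 40], [102, 79, 61, 41], [98, 81, 59, 39]]
quiet = spectral_deviation([101, 80, 60, 40], background)
spiked = spectral_deviation([100, 80, 140, 40], background)
```

A real detector would first variance-stabilize the counts (the paper uses a continuous wavelet transform) before applying any such deviation test.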

Keywords: radiological search, radiological mapping, radioactivity, radiation protection

Procedia PDF Downloads 696
2535 Development of Folding Based Aptasensor for Ochratoxin a Using Different Pulse Voltammetry

Authors: Rupesh K. Mishra, Gaëlle Catanante, Akhtar Hayat, Jean-Louis Marty

Abstract:

Ochratoxins are secondary metabolites present in a wide variety of foodstuffs. They are dangerous by-products mainly produced by several species of storage fungi, including the Aspergillus and Penicillium genera. Ochratoxin A (OTA) is known to have nephrotoxic, immunotoxic, teratogenic and carcinogenic effects, and thus demands a highly sensitive and selective detection system that can quantify these organic toxins in various matrices, such as cocoa beans. This work presents a folding-based aptasensor employing an aptamer conjugated to a redox probe (methylene blue) specifically designed for OTA. The aptamers were covalently attached to screen-printed carbon electrodes using diazonium grafting. Upon sensing OTA, the target binds to the immobilized aptamer on the electrode surface, which induces a conformational change of the aptamer and consequently an increase in the signal. This conformational change of the aptamer before and after biosensing of the target OTA produces a distinguishable electrochemical signal. The obtained limit of detection was 0.01 ng/mL for OTA samples, with recovery of up to 88% in contaminated cocoa samples.

Keywords: ochratoxin A, cocoa, DNA aptamer, labelled probe

Procedia PDF Downloads 286
2534 Real-Time Monitoring of Drinking Water Quality Using Advanced Devices

Authors: Amani Abdallah, Isam Shahrour

Abstract:

The quality of drinking water is a major public health concern. Control of this quality is generally performed in the laboratory, which requires a long time. This type of control is not adapted to accidental pollution from sudden events, which can have serious consequences for population health. It is therefore of major interest to develop real-time, innovative solutions for the detection of accidental contamination in drinking water systems. This paper presents research conducted within the SunRise Demonstrator for 'Smart and Sustainable Cities', with a particular focus on supervision of water quality. This work aims at (i) implementing a smart water system in a large water network (the campus of the University Lille 1), including innovative equipment for real-time detection of abnormal events such as those related to the contamination of drinking water, and (ii) developing a numerical model of contamination diffusion in the water distribution system. The first step included verification of the water quality sensors and their effectiveness on a network prototype of 50 m length. This part included evaluation of the efficiency of these sensors in detecting both bacterial and chemical contamination events in drinking water distribution systems. An on-line optical sensor integrated with a laboratory-scale distribution system (LDS) was shown to respond rapidly to changes in refractive index induced by injected loads of chemical (cadmium, mercury) and biological (Escherichia coli) contaminants. All injected substances were detected by the sensor; the magnitude of the response depends on the type of contaminant introduced and is proportional to the injected substance's concentration.
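The reported proportionality between sensor response and injected concentration suggests a simple linear calibration that can be inverted for real-time alarming. The sketch below is hypothetical: the calibration points, alarm threshold, and units are illustrative assumptions, not values from the SunRise demonstrator.

```python
import numpy as np

# Hypothetical calibration: sensor refractive-index response (arbitrary units)
# recorded for known injected contaminant concentrations (mg/L).
concentrations = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
responses = np.array([0.02, 0.26, 0.51, 1.01, 2.03])

# Least-squares line: response ~= slope * concentration + intercept,
# reflecting the reported proportionality between signal and concentration.
slope, intercept = np.polyfit(concentrations, responses, 1)

def estimate_concentration(response):
    """Invert the calibration to estimate contaminant load from a reading."""
    return (response - intercept) / slope

alarm_threshold = 0.5  # mg/L, hypothetical action level
reading = 0.77         # a new on-line sensor reading
est = estimate_concentration(reading)
print(est, est > alarm_threshold)
```

In a deployed system the estimated concentration would be compared against per-contaminant action levels to trigger the abnormal-event detection described above.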

Keywords: distribution system, drinking water, refraction index, sensor, real-time

Procedia PDF Downloads 357
2533 Virulence Factors and Drug Resistance of Enterococci Species Isolated from the Intensive Care Units of Assiut University Hospitals, Egypt

Authors: Nahla Elsherbiny, Ahmed Ahmed, Hamada Mohammed, Mohamed Ali

Abstract:

Background: The enterococci may be considered opportunistic agents, particularly in immunocompromised patients. They are among the top three pathogens causing healthcare-associated infections (HAIs). Resistance to several commonly used antimicrobial agents is a remarkable characteristic of most species, which may carry various genes contributing to virulence. Objectives: To determine the prevalence of enterococci species causing healthcare-associated infections in different intensive care units (ICUs), as well as intestinal carriage and environmental contamination; to study the antimicrobial susceptibility pattern of the isolates, with special reference to vancomycin resistance; and to perform phenotypic and genotypic detection of gelatinase, cytolysin, and biofilm formation among isolates. Patients and Methods: This study was carried out in the infection control laboratory at Assiut University Hospitals over a period of one year. Clinical samples were collected from 285 patients with various HAIs acquired after admission to different ICUs. Rectal swabs were taken from 14 cases for detection of enterococci carriage. In addition, 1377 environmental samples were collected from the surroundings of the patients. Identification was done by conventional bacteriological methods and confirmed by analytical profile index (API). Antimicrobial sensitivity testing was performed by the Kirby-Bauer disc diffusion method, and detection of vancomycin resistance was done by the agar screen method. Phenotypic detection of cytolysin and gelatinase production was performed for the isolates, and biofilm formation was detected by the tube method, the Congo red method, and the microtiter plate method. Polymerase chain reaction (PCR) was performed for detection of some virulence genes (gelE, cylA, vanA, vanB, and esp). Results: Enterococci caused 10.5% of the HAIs. Respiratory tract infection was the predominant type (86.7%). The commonest species were E. gallinarum (36.7%), E. casseliflavus (30%), E. faecalis (30%), and E. durans (3.4%). Vancomycin resistance was detected in 40% (12/30) of those isolates. The risk factors associated with acquiring vancomycin-resistant enterococci (VRE) were immune suppression (P = 0.031) and artificial feeding (P = 0.008). For the rectal swabs, enterococci species were detected in 71.4% of samples, with a predominance of E. casseliflavus (50%). Most of these isolates were vancomycin resistant (70%). Out of the 1377 environmental samples, 577 (42%) were contaminated with different microorganisms. Enterococci were detected in 1.7% (10/577) of the contaminated samples, 50% of which were vancomycin resistant. All isolates were resistant to penicillin, ampicillin, oxacillin, ciprofloxacin, amikacin, erythromycin, clindamycin, and trimethoprim-sulfamethoxazole. For the remaining antibiotics, variable percentages of resistance were reported. Cytolysin and gelatinase were detected phenotypically in 16% and 48% of the isolates, respectively. The microtiter plate method showed the highest percentage of biofilm detection among all isolated species (100%). The studied virulence genes gelE, esp, vanA, and vanB were detected in 62%, 12%, 2%, and 12% of isolates, respectively, while the cylA gene was not detected in any isolate. Conclusions: A significant percentage of enterococci was isolated from patients and environments in the ICUs. Many virulence factors were detected phenotypically and genotypically among the isolates. The high percentage of resistance, coupled with the risk of cross-transmission to other patients, makes enterococci infections a significant infection control issue in hospitals.

Keywords: antimicrobial resistance, enterococci, ICUs, virulence factors

Procedia PDF Downloads 286
2532 Alphabet Recognition Using Pixel Probability Distribution

Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay

Abstract:

Our project topic is "Alphabet Recognition Using Pixel Probability Distribution". The project uses techniques of image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten, or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is employed in banks and other high-security buildings. One popular mobile application reads a visiting card and stores it directly to the contacts. OCRs are also used in radar systems for reading the license plates of speeding vehicles, among many other applications. Our implementation uses Visual Studio and OpenCV (Open Source Computer Vision), and the algorithm is based on neural networks (machine learning). The project was implemented in three modules. (1) Training: This module aims at database generation, which was done using two methods: (a) run-time generation, in which the database is generated at compilation time using the built-in fonts of the OpenCV library; human intervention is not necessary for generating this database; and (b) contour detection, in which a JPEG template containing different fonts of an alphabet is converted to a weighted matrix using specialized OpenCV functions (contour detection and blob detection). The main advantage of this type of database generation is that the algorithm becomes self-learning and the final database requires little memory to store (119 KB precisely). (2) Preprocessing: The input image is pre-processed using image processing operations such as adaptive thresholding, binarizing, and dilating, and is made ready for segmentation. Segmentation includes extraction of lines, words, and letters from the processed text image. (3) Testing and prediction: The extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes an alphabet based on certain mathematical parameters calculated using the database and the weight matrix of the segmented image.
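The preprocessing and segmentation steps can be illustrated compactly. This sketch uses NumPy and SciPy as stand-ins for the OpenCV calls named above (the actual pipeline uses adaptive thresholding and contour/blob detection); the synthetic image and the global threshold are assumptions.

```python
import numpy as np
from scipy import ndimage

# Tiny synthetic "scanned text" image: white background (255) with two dark
# letter-like blobs; real input would come from the camera/scanner.
img = np.full((20, 40), 255, dtype=np.uint8)
img[5:15, 4:10] = 10      # first "letter"
img[5:15, 24:32] = 10     # second "letter"

# Binarize: dark ink becomes foreground (1). The paper uses adaptive
# thresholding via OpenCV; a global threshold suffices for this illustration.
binary = (img < 128).astype(np.uint8)

# Dilate slightly to close gaps inside strokes.
binary = ndimage.binary_dilation(binary, iterations=1).astype(np.uint8)

# "Segmentation": label connected components and take each one's bounding box
# as an extracted letter, analogous to the contour/blob detection step.
labels, n_letters = ndimage.label(binary)
boxes = ndimage.find_objects(labels)
print(n_letters, [(s[0].start, s[1].start) for s in boxes])
```

Each bounding box would then be normalized and passed to the neural network for classification.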

Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix

Procedia PDF Downloads 390
2531 UWB Open Spectrum Access for a Smart Software Radio

Authors: Hemalatha Rallapalli, K. Lal Kishore

Abstract:

In comparison to systems that are typically designed to provide capabilities over a narrow frequency range through hardware elements, the next generation of cognitive radios is intended to implement a broader range of capabilities through efficient spectrum exploitation. This offers the user the promise of greater flexibility and seamless roaming across different networks, countries, frequencies, etc. It requires a true paradigm shift, i.e., liberalization over a wide band of spectrum as well as a growth path to more and greater capability. This work contributes toward the design and implementation of an open spectrum access (OSA) feature for unlicensed users, offering a frequency-agile radio platform capable of performing spectrum sensing over a wide band. An ultra-wideband (UWB) radio that has the intelligence of spectrum sensing only, unlike a cognitive radio with complete intelligence, is here named a Smart Software Radio (SSR). The spectrum sensing mechanism is implemented based on energy detection. Simulation results show the accuracy and validity of this method.
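The energy-detection mechanism can be sketched in a few lines. This is a minimal simulation under assumed parameters (noise variance, tone frequency, threshold margin), not the SSR implementation itself:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, noise_var = 1000, 1.0

def energy_detect(signal, threshold):
    """Decide 'occupied' when average energy over the window exceeds threshold."""
    return np.mean(np.abs(signal) ** 2) > threshold

noise = rng.normal(0, np.sqrt(noise_var), n_samples)            # H0: noise only
tone = np.sqrt(2.0) * np.cos(2 * np.pi * 0.1 * np.arange(n_samples))
occupied = tone + rng.normal(0, np.sqrt(noise_var), n_samples)  # H1: signal + noise

# Threshold set a little above the expected noise energy (noise_var);
# in practice it would be derived from a target false-alarm probability.
threshold = 1.3 * noise_var
print(energy_detect(noise, threshold), energy_detect(occupied, threshold))
```

The appeal of energy detection for a wideband SSR is exactly this simplicity: no knowledge of the primary signal's structure is required, only an estimate of the noise floor.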

Keywords: cognitive radio, energy detection, software radio, spectrum sensing

Procedia PDF Downloads 429
2530 Tracing Back the Bot Master

Authors: Sneha Leslie

Abstract:

The current situation in the cyber world is that crimes performed by botnets are increasing, while the masterminds (botmasters) are not easily detectable. The botmaster compromises legitimate host machines in the network and makes them bots, or zombies, to initiate cyber-attacks. This paper focuses on live detection of the botmaster in the network using the Metasploit framework when a distributed denial of service (DDoS) attack is performed by the botnet. The affected victim machine continuously monitors its incoming packets. Once the victim machine detects an excessive count of packets from a particular IP, that IP is noted and details of the corresponding system are gathered. Using the vulnerabilities present in the zombie machines (already compromised by the botmaster), the victim machine compromises them in turn. By gaining access to the compromised systems, applications are run remotely, and by analyzing the incoming packets of the zombies, the victim learns the address of the botmaster. This is an effective and simple system in which no specific features of the communication protocol are considered.
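The per-IP packet-count monitoring step can be sketched as follows; the IP addresses and flood threshold are illustrative assumptions:

```python
from collections import Counter

def flag_suspect_ips(packet_sources, threshold):
    """Return source IPs whose packet count exceeds the flood threshold."""
    counts = Counter(packet_sources)
    return {ip for ip, n in counts.items() if n > threshold}

# Simulated incoming-packet log: two flooding zombies, one normal host.
packets = ["10.0.0.5"] * 500 + ["10.0.0.9"] * 450 + ["10.0.0.2"] * 3
print(sorted(flag_suspect_ips(packets, threshold=100)))
# → ['10.0.0.5', '10.0.0.9']
```

The flagged addresses are the candidate zombies whose vulnerabilities the victim then probes in the next stage of the approach.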

Keywords: botnet, DDoS attack, network security, detection system, Metasploit framework

Procedia PDF Downloads 254
2529 Ontology-Based Fault Detection and Diagnosis System: Querying and Reasoning Examples

Authors: Marko Batic, Nikola Tomasevic, Sanja Vranes

Abstract:

One of the strongholds in the ubiquitous efforts related to energy conservation and energy efficiency improvement is the retrofit of high energy consumers in buildings. HVAC systems generally represent the highest energy consumers in buildings; however, they often suffer from mal-operation and/or malfunction, causing even higher energy consumption than necessary. Various fault detection and diagnosis (FDD) systems can be successfully employed for this purpose, especially at the single device/unit level. In the case of more complex systems, where multiple devices operate in the context of the same building, significant energy efficiency improvements can only be achieved through comprehensive FDD systems relying on additional higher-level knowledge, such as the devices' geographical location, served area, and intra- and inter-system dependencies. This paper presents a comprehensive FDD system that relies on a common knowledge repository storing all critical information. The discussed system is deployed as a test-bed platform at the Fiumicino and Malpensa airports in Italy. The paper presents the advantages of implementing the knowledge base through an ontology and demonstrates the improved functionality of such a system through examples of typical queries and reasoning that enable the derivation of high-level energy conservation measures (ECMs). Key SPARQL queries and SWRL rules, based on the two instantiated airport ontologies, are elaborated. The detection of high-level irregularities in the operation of airport heating/cooling plants is discussed, and an estimation of energy savings is reported.
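The kind of ontology query elaborated in SPARQL can be mimicked with a toy in-memory triple store; the entity and property names below are invented for illustration and do not come from the actual airport ontologies:

```python
# A tiny in-memory triple store standing in for the airport ontology;
# entity and property names are illustrative assumptions.
triples = {
    ("AHU_12", "servesZone", "TerminalB_Gate4"),
    ("AHU_12", "hasFault", "StuckCoolingValve"),
    ("Chiller_3", "feeds", "AHU_12"),
    ("AHU_7", "servesZone", "TerminalA_Gate1"),
}

def query(subject=None, predicate=None, obj=None):
    """Match triple patterns, with None acting like a SPARQL variable."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# "Which zones are served by a faulty unit?" is a join of two patterns,
# analogous to a two-triple SPARQL basic graph pattern.
faulty_units = {s for s, _, _ in query(predicate="hasFault")}
affected = [o for s, _, o in query(predicate="servesZone") if s in faulty_units]
print(affected)  # → ['TerminalB_Gate4']
```

It is exactly this kind of cross-device join (fault, location, served area) that a flat, per-device FDD system cannot express, and that motivates the ontology-backed knowledge repository.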

Keywords: airport ontology, knowledge management, ontology modeling, reasoning

Procedia PDF Downloads 540
2528 Post-Earthquake Damage Detection Using System Identification with a Pair of Seismic Recordings

Authors: Lotfi O. Gargab, Ruichong R. Zhang

Abstract:

A wave-based framework is presented for modeling seismic motion in multistory buildings and using the measured response for system identification, which can be utilized to extract important information regarding structural integrity. With one pair of building responses at two locations, a generalized model response is formulated based on wave propagation features and expressed as frequency- and time-response functions, denoted respectively as GFRF and GIRF. In particular, the GIRF is fundamental in tracking arrival times of impulsive wave motion initiated at the response level, which depend on local model properties. Matching model and measured-structure responses can help identify model parameters and infer building properties. To show the effectiveness of this approach, the Millikan Library in Pasadena, California is identified with recordings of the Yorba Linda earthquake of September 3, 2002.
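The GFRF/GIRF construction can be sketched numerically. In this minimal example the two "recordings" are synthetic (one is a pure 5-sample circular delay of the other), so the impulse-response peak recovers the assumed wave travel time exactly; real records would of course be far noisier:

```python
import numpy as np

rng = np.random.default_rng(2)
n, dt = 1024, 0.01

# Simulated responses at two floors: the "roof" record is the "base" record
# delayed by a wave travel time of 5 samples (names and values are assumptions).
base = rng.normal(size=n)
delay = 5
roof = np.roll(base, delay)

# Generalized frequency response function between the two records, and its
# inverse FFT, the impulse response, whose peak tracks the wave arrival time.
Base, Roof = np.fft.rfft(base), np.fft.rfft(roof)
gfrf = Roof * np.conj(Base) / (np.abs(Base) ** 2 + 1e-12)
girf = np.fft.irfft(gfrf, n=n)

arrival = np.argmax(np.abs(girf))
print(arrival * dt)  # → 0.05, the 5-sample travel time in seconds
```

A shift in this arrival time between pre- and post-event recordings is the kind of wave-propagation feature the framework uses to flag changes in structural integrity.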

Keywords: system identification, continuous-discrete mass modeling, damage detection, post-earthquake

Procedia PDF Downloads 370
2527 Bayesian System and Copula for Event Detection and Summarization of Soccer Videos

Authors: Dhanuja S. Patil, Sanjay B. Waykar

Abstract:

Event detection is one of the most essential components for various application domains of video data systems. Recently, it has gained extensive interest from practitioners and academics in different areas. While video event detection has been the subject of broad research efforts, considerably fewer existing methods have considered multi-modal data and efficiency-related issues. During soccer matches, various doubtful situations arise that cannot easily be judged by the refereeing committee. A framework that objectively checks image sequences would prevent incorrect interpretations due to errors or the high velocity of events. Bayesian networks provide a structure for dealing with this uncertainty using a graphical structure together with probability calculus. We propose an efficient framework for the analysis and summarization of soccer videos utilizing object-based features. The proposed work utilizes the t-cherry junction tree, a very recent advancement in probabilistic graphical models, to create a compact representation and a good approximation of an otherwise intractable model. There are several advantages to this approach: first, the t-cherry tree gives the best approximation within the class of junction trees; second, the construction of a t-cherry junction tree can be largely parallelized; and finally, inference can be performed using distributed computation. Experimental results demonstrate the effectiveness, adequacy, and robustness of the proposed work over a comprehensive data set comprising many soccer videos captured at different places.
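The Bayesian treatment of uncertain event cues can be illustrated with a toy model; the variables and probabilities below are invented for illustration and are unrelated to the paper's actual network or its t-cherry junction tree machinery:

```python
# Minimal Bayesian-network-style event scorer for soccer video cues.
# A hidden event "Goal" influences two observable cues: crowd noise and replay.
p_goal = 0.05
p_cue = {  # P(cue present | goal?) for each observable feature (assumed values)
    "crowd_noise": {True: 0.95, False: 0.30},
    "replay_shown": {True: 0.90, False: 0.10},
}

def posterior_goal(observed):
    """P(goal | cues) by direct enumeration (Bayes' rule, independent cues)."""
    like_g, like_ng = p_goal, 1 - p_goal
    for cue, present in observed.items():
        pg, png = p_cue[cue][True], p_cue[cue][False]
        like_g *= pg if present else 1 - pg
        like_ng *= png if present else 1 - png
    return like_g / (like_g + like_ng)

post = posterior_goal({"crowd_noise": True, "replay_shown": True})
print(round(post, 3))  # → 0.6
```

Junction-tree methods such as the t-cherry tree exist precisely because this brute-force enumeration becomes intractable as the number of variables grows.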

Keywords: summarization, detection, Bayesian network, t-cherry tree

Procedia PDF Downloads 327
2526 Neural Network based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children

Authors: Budhvin T. Withana, Sulochana Rupasinghe

Abstract:

The educational system faces a significant concern with regard to dyslexia and dysgraphia, learning disabilities that impact reading and writing abilities. This is particularly challenging for children who speak the Sinhala language, due to its complexity and uniqueness. Commonly used methods to detect the risk of dyslexia and dysgraphia rely on subjective assessments, leading to limited coverage and time-consuming processes; consequently, diagnoses can be delayed and opportunities for early intervention missed. To address this issue, the project developed a hybrid model that incorporates various deep learning techniques to detect the risk of dyslexia and dysgraphia. Specifically, ResNet50, VGG16, and YOLOv8 models were integrated to identify handwriting issues. The outputs of these models were then combined with other input data and fed into an MLP model. Hyperparameters of the MLP model were fine-tuned using Grid Search CV, enabling the identification of optimal values for the model. This approach proved highly effective in accurately predicting the risk of dyslexia and dysgraphia, providing a valuable tool for early detection and intervention. The ResNet50 model exhibited a training accuracy of 0.9804 and a validation accuracy of 0.9653. The VGG16 model achieved a training accuracy of 0.9991 and a validation accuracy of 0.9891. The MLP model demonstrated impressive results, with a training accuracy of 0.99918, a testing accuracy of 0.99223, and a loss of 0.01371. These outcomes showcase the high accuracy achieved by the proposed hybrid model in predicting the risk of dyslexia and dysgraphia.
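The Grid Search CV tuning step can be sketched with scikit-learn; here random features stand in for the CNN outputs, and the parameter grid and labels are assumptions, not the values used in the study:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

# Stand-in features: in the study these would be CNN outputs (ResNet50/VGG16/
# YOLOv8 handwriting scores) concatenated with the other assessment inputs.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic "risk" label

# Grid Search CV over MLP hyperparameters, mirroring the tuning step.
grid = GridSearchCV(
    MLPClassifier(max_iter=2000, random_state=0),
    param_grid={"hidden_layer_sizes": [(8,), (16,)], "alpha": [1e-4, 1e-2]},
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

`best_params_` holds the winning hyperparameter combination; in the study this fitted MLP would then be evaluated on a held-out test set.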

Keywords: neural networks, risk detection system, dyslexia, dysgraphia, deep learning, learning disabilities, data science

Procedia PDF Downloads 66
2525 Clinical and Analytical Performance of Glial Fibrillary Acidic Protein and Ubiquitin C-Terminal Hydrolase L1 Biomarkers for Traumatic Brain Injury in the Alinity Traumatic Brain Injury Test

Authors: Raj Chandran, Saul Datwyler, Jaime Marino, Daniel West, Karla Grasso, Adam Buss, Hina Syed, Zina Al Sahouri, Jennifer Yen, Krista Caudle, Beth McQuiston

Abstract:

The Alinity i TBI test is registered with the Therapeutic Goods Administration (TGA) and is a panel of in vitro diagnostic chemiluminescent microparticle immunoassays for the measurement of glial fibrillary acidic protein (GFAP) and ubiquitin C-terminal hydrolase L1 (UCH-L1) in plasma and serum. The performance of Alinity i TBI was evaluated in a multi-center pivotal study to demonstrate its capability to assist in determining the need for a CT scan of the head in adult subjects (age 18+) presenting with suspected mild traumatic brain injury (TBI) and a Glasgow Coma Scale score of 13 to 15. TBI has been recognized as an important cause of death and disability and is a growing public health problem; an estimated 69 million people globally experience a TBI annually. Blood-based biomarkers such as GFAP and UCH-L1 have shown utility in predicting acute traumatic intracranial injury on head CT scans after TBI. A pivotal study using prospectively collected, archived (frozen) plasma specimens was conducted to establish the clinical performance of the TBI test on the Alinity i system. The specimens were originally collected in a prospective, multi-center clinical study, and testing was performed at three clinical sites in the United States. Performance characteristics such as detection limits, imprecision, linearity, measuring interval, expected values, and interferences were established following Clinical and Laboratory Standards Institute (CLSI) guidance. Of the 1899 mild TBI subjects, 120 had positive head CT scan results; 116 of these 120 specimens had a positive TBI interpretation (sensitivity 96.7%; 95% CI: 91.7%, 98.7%). Of the 1779 subjects with negative CT scan results, 713 had a negative TBI interpretation (specificity 40.1%; 95% CI: 37.8%, 42.4%). The negative predictive value (NPV) of the test was 99.4% (713/717; 95% CI: 98.6%, 99.8%). The analytical measuring interval (AMI) extends from the lower limit of quantitation (LoQ) to the upper LoQ and is determined by the range that demonstrates acceptable performance for linearity, imprecision, and bias. The AMI is 6.1 to 42,000 pg/mL for GFAP and 26.3 to 25,000 pg/mL for UCH-L1. Overall within-laboratory imprecision (20-day) ranged from 3.7% to 5.9% CV for GFAP and 3.0% to 6.0% CV for UCH-L1 when including lot and instrument variances. The clinical performance results demonstrated high sensitivity and high NPV, supporting the test's utility in determining the need for a head CT scan in subjects presenting to the emergency department with suspected mild TBI. The GFAP and UCH-L1 assays show robust analytical performance across a broad concentration range and may serve as a valuable tool in evaluating TBI patients across the spectrum of mild to severe injury.
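The reported sensitivity, specificity, and NPV can be reproduced directly from the counts given above:

```python
# Recompute the reported diagnostic metrics from the pivotal-study counts.
tp, fn = 116, 4       # CT-positive subjects: 116 of 120 flagged by the test
tn, fp = 713, 1066    # CT-negative subjects: 713 of 1779 negative on the test

sensitivity = tp / (tp + fn)   # positive test among CT-positives
specificity = tn / (tn + fp)   # negative test among CT-negatives
npv = tn / (tn + fn)           # CT-negative among test-negatives (713/717)

print(f"{sensitivity:.1%} {specificity:.1%} {npv:.1%}")
# → 96.7% 40.1% 99.4%
```

The high NPV is the clinically decisive figure here: a negative biomarker result makes a positive head CT very unlikely, which is what supports ruling out the scan.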

Keywords: biomarker, diagnostic, neurology, TBI

Procedia PDF Downloads 68
2524 Low-Cost Image Processing System for Evaluating Pavement Surface Distress

Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa

Abstract:

Most asphalt pavement condition evaluations use rating frameworks in which pavement distress is estimated by type, extent, and severity. Rating is carried out via the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks. The paper explores the application of image processing tools for the detection of potholes and cracks. Longitudinal cracking and potholes are detected using fuzzy c-means (FCM) clustering followed by a spectral clustering algorithm. The framework comprises three phases: image acquisition, processing, and feature extraction. A digital camera (GoPro) with a holder mounted on a moving vehicle is used to capture pavement distress images. The FCM classifier and spectral clustering algorithms are used to compute features and classify longitudinal cracking and potholes. The MATLAB R2016a image processing toolkit is used for performance analysis to assess the viability of the method on selected urban stretches of Bengaluru city, India. The outcomes of image evaluation with the semi-automated image processing framework captured the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images are validated against the actual dimensions, and the dimensional variability is about 0.46. The linear regression model y = 1.171x - 0.155 is obtained from the existing and experimental (image-processed) areas. The R² value obtained from the best-fit line is 0.807, which indicates a large positive linear association.
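The fuzzy c-means step can be sketched with a minimal NumPy implementation; the one-dimensional toy "pixel" intensities are assumptions standing in for real image features:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(n_iter):
        w = u ** m                             # fuzzified memberships
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / (d ** (2 / (m - 1)))         # update memberships from distances
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Toy "pixel features": dark crack/pothole pixels vs. bright pavement.
pixels = np.array([[0.1], [0.12], [0.09], [0.8], [0.85], [0.83]])
centers, u = fuzzy_c_means(pixels)
labels = u.argmax(axis=1)
print(np.sort(centers.ravel()))  # one dark and one bright cluster center
```

Unlike hard k-means, each pixel keeps a graded membership in both clusters, which is useful for the ambiguous boundary pixels around cracks.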

Keywords: crack detection, pothole detection, spectral clustering, fuzzy-c-means

Procedia PDF Downloads 182
2523 Land Use Change Detection Using Remote Sensing and GIS

Authors: Naser Ahmadi Sani, Karim Solaimani, Lida Razaghnia, Jalal Zandi

Abstract:

In recent decades, rapid and improper changes in land use have been associated with consequences such as natural resource degradation and environmental pollution. Detecting changes in land use is one of the tools for natural resource management and for the assessment of changes in ecosystems. The aim of this research is to study land-use changes in the Haraz basin, with an area of 677,000 hectares, over a 15-year period (1996 to 2011) using LANDSAT data. The quality of the images was first evaluated, and various enhancement methods for creating synthetic bands were used in the analysis. Separate training sites were selected for each image. The images of each period were then classified into nine classes using the supervised classification method with the maximum likelihood algorithm. Finally, the changes were extracted in a GIS environment. The results showed that these changes are an alarm for the future status of the Haraz basin: 27% of the area has changed, reflecting the conversion of rangelands to bare land and dry farming, and of dense forest to sparse forest, horticulture, farmland, and residential areas.
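The maximum likelihood classification named above can be sketched as a per-class Gaussian classifier; the band values and class names are synthetic stand-ins for the LANDSAT training sites:

```python
import numpy as np

# Per-class Gaussian maximum likelihood classifier, the supervised method
# named in the abstract; two spectral bands and two classes for illustration.
rng = np.random.default_rng(3)
forest = rng.normal([30, 80], 5, size=(100, 2))     # assumed band means
bare = rng.normal([120, 60], 5, size=(100, 2))
training = {"forest": forest, "bare_land": bare}

# Fit: per-class mean vector and covariance matrix from the training sites.
stats = {cls: (s.mean(axis=0), np.cov(s.T)) for cls, s in training.items()}

def classify(pixel):
    """Assign the class with the highest Gaussian log-likelihood."""
    def loglike(mu, cov):
        diff = pixel - mu
        return (-0.5 * diff @ np.linalg.inv(cov) @ diff
                - 0.5 * np.log(np.linalg.det(cov)))
    return max(stats, key=lambda c: loglike(*stats[c]))

print(classify(np.array([32.0, 78.0])), classify(np.array([118.0, 62.0])))
# → forest bare_land
```

Applying this decision rule per pixel for each epoch, then differencing the two classified maps in GIS, yields the change statistics reported above.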

Keywords: Haraz basin, change detection, land-use, satellite data

Procedia PDF Downloads 415