Search results for: cocaine detection
2154 A Survey of Feature-Based Steganalysis for JPEG Images
Authors: Syeda Mainaaz Unnisa, Deepa Suresh
Abstract:
Due to the increasing use of public channels, such as the internet, and of communication technology, there is concern about the protection of intellectual property and about security threats. This interest has led to growth in research on, and implementation of, techniques for information hiding. Steganography is the art and science of hiding information in a private manner such that its existence cannot be recognized. Communication using steganographic techniques renders not only the secret message but also the very presence of hidden communication invisible. Steganalysis is the art of detecting the presence of this hidden communication. In parallel with steganography, steganalysis is also gaining prominence, since the detection of hidden messages can prevent catastrophic security incidents. Steganalysis is also helpful in identifying and revealing weaknesses in current steganographic techniques that make them vulnerable to attacks. Through the formulation of new, effective steganalysis methods, further research to improve the resistance of the tested steganography techniques can be developed. The feature-based steganalysis method for JPEG images calculates the features of an image as the L1 norm of the difference between a stego image and the calibrated version of that image. This calibration helps retrieve some of the parameters of the cover image, revealing the variations between the cover and stego images and enabling more accurate detection. The method was applied to various steganographic schemes, and the experimental results were compared and evaluated to derive conclusions and principles for more secure JPEG steganography.
Keywords: cover image, feature-based steganalysis, information hiding, steganalysis, steganography
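As a rough illustration of the calibration idea in this abstract, the sketch below crops and re-compresses a JPEG to approximate the cover image and then takes the element-wise absolute difference (whose sum is an L1 norm) between simple DCT-histogram features of the two versions. The 4-pixel crop, the re-compression quality, and the histogram feature are illustrative assumptions, not the authors' exact feature set.

```python
# Minimal sketch of calibration-based feature extraction for JPEG steganalysis.
# Crop offset, quality factor and DCT-histogram feature are assumptions.
import io
import numpy as np
from PIL import Image

def dct2(block):
    """8x8 DCT-II via the orthonormal DCT matrix."""
    n = 8
    k = np.arange(n)
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C @ block @ C.T

def dct_histogram(gray, bins=np.arange(-20.5, 21.5)):
    """Per-block-normalized histogram of rounded 8x8 block-DCT coefficients."""
    h, w = (gray.shape[0] // 8) * 8, (gray.shape[1] // 8) * 8
    coeffs = []
    for i in range(0, h, 8):
        for j in range(0, w, 8):
            coeffs.append(np.round(dct2(gray[i:i + 8, j:j + 8] - 128.0)))
    hist = np.histogram(np.concatenate([c.ravel() for c in coeffs]), bins=bins)[0]
    return hist / max(len(coeffs), 1)

def calibrated_features(jpeg_path, quality=75, crop=4):
    """Feature vector = |F(stego) - F(calibrated)| per histogram bin."""
    img = Image.open(jpeg_path).convert("L")
    stego = np.asarray(img, dtype=np.float64)
    # Calibration: crop a few pixels to break the JPEG block grid, then re-compress.
    cal_img = img.crop((crop, crop, img.width, img.height))
    buf = io.BytesIO()
    cal_img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    cal = np.asarray(Image.open(buf).convert("L"), dtype=np.float64)
    return np.abs(dct_histogram(stego) - dct_histogram(cal))
```

Feeding such feature vectors for known cover and stego images into an ordinary classifier would reproduce the general workflow the abstract describes.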
Procedia PDF Downloads 216
2153 Performing Diagnosis in Building with Partially Valid Heterogeneous Tests
Authors: Houda Najeh, Mahendra Pratap Singh, Stéphane Ploix, Antoine Caucheteux, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and human misbehavior. Energy efficiency and user comfort are directly affected by abnormalities in building operation. The available fault diagnosis tools and methodologies rely mainly on rules or on purely model-based approaches, and it is assumed that a model- or rule-based test can be applied to any situation without taking the actual testing context into account. Contextual tests with a validity domain could greatly simplify the design of detection tests. The main objective of this paper is to take test validity into account when applying the test models, considering non-modeled events such as occupancy, weather conditions, door and window openings, and the integration of expert knowledge about the state of the system. The concept of heterogeneous tests is combined with test validity to generate fault diagnoses. A combination of rule-, range- and model-based tests, known as heterogeneous tests, is proposed to reduce the modeling complexity. The calculation of logical diagnoses, borrowed from artificial intelligence, provides a global explanation consistent with the test results. An application example, an office setting at the Grenoble Institute of Technology, shows the efficiency of the proposed technique.
Keywords: heterogeneous tests, validity, building system, sensor grids, sensor fault, diagnosis, fault detection and isolation
Procedia PDF Downloads 291
2152 The Application of Video Segmentation Methods for the Purpose of Action Detection in Videos
Authors: Nassima Noufail, Sara Bouhali
Abstract:
In this work, we develop a semi-supervised solution for the purpose of action detection in videos and propose an efficient algorithm for video segmentation. The approach is divided into video segmentation, feature extraction, and classification. In the first part, a video is segmented into clips, and we used the K-means algorithm for this segmentation; our goal is to find groups based on similarity in the video. Applying K-means clustering to all the frames is time-consuming; therefore, we started by identifying transition frames, where the scene in the video changes significantly, and then applied K-means clustering to these transition frames. We used two image filters, the Gaussian filter and the Laplacian of Gaussian. Each filter extracts a set of features from the frames. The Gaussian filter blurs the image and omits the higher frequencies, and the Laplacian of Gaussian detects regions of rapid intensity change; we then used this vector of filter responses as the input to our K-means algorithm. The output is a set of cluster centers. Each video frame pixel is then mapped to the nearest cluster center and painted with a corresponding color to form a visual map. In the resulting visual map, similar pixels are grouped together. We then computed a cluster score indicating how close the clusters are to each other and plotted a signal representing frame number vs. clustering score. Our hypothesis was that the evolution of the signal would not change if semantically related events were happening in the scene. We marked the breakpoints at which the root mean square level of the signal changes significantly, and each breakpoint is an indication of the beginning of a new video segment. In the second part, for each segment from part 1, we randomly selected a 16-frame clip and then extracted spatiotemporal features for every 16 frames using the convolutional 3D network C3D with a pre-trained model. The final C3D output is a 512-dimensional feature vector; hence, we used principal component analysis (PCA) for dimensionality reduction. The final part is the classification. The C3D feature vectors are used as input to train a multi-class linear support vector machine (SVM), and this classifier is used to detect the action. We evaluated our experiment on the UCF101 dataset, which consists of 101 human action categories, and we achieved an accuracy that outperforms the state of the art by 1.2%.
Keywords: video segmentation, action detection, classification, K-means, C3D
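A minimal sketch of the per-frame clustering step described above is given below: Gaussian and Laplacian-of-Gaussian responses are stacked per pixel and grouped with K-means. The sigma values and the number of clusters are illustrative assumptions, not the authors' settings.

```python
# Sketch of per-frame clustering on filter responses (assumed parameters).
import numpy as np
from scipy.ndimage import gaussian_filter, gaussian_laplace
from sklearn.cluster import KMeans

def cluster_frame(gray_frame, n_clusters=5, sigmas=(1.0, 2.0)):
    """Return (visual map of cluster labels, cluster centers) for one transition frame."""
    gray = gray_frame.astype(np.float64)
    responses = []
    for s in sigmas:
        responses.append(gaussian_filter(gray, sigma=s))   # blurred, low-frequency content
        responses.append(gaussian_laplace(gray, sigma=s))  # regions of rapid intensity change
    feats = np.stack([r.ravel() for r in responses], axis=1)  # shape: (pixels, filters)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(feats)
    return km.labels_.reshape(gray.shape), km.cluster_centers_

def cluster_score(centers):
    """A simple 'how close are the clusters' score: mean pairwise distance between centers."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    return d[np.triu_indices_from(d, k=1)].mean()
```

Plotting cluster_score over the transition frames and marking large changes in the root-mean-square level of that signal would then give the segment boundaries described in the abstract.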
Procedia PDF Downloads 77
2151 Evaluation of the Microscopic-Observation Drug-Susceptibility Assay Drugs Concentration for Detection of Multidrug-Resistant Tuberculosis
Authors: Anita, Sari Septiani Tangke, Rusdina Bte Ladju, Nasrum Massi
Abstract:
New diagnostic tools are urgently needed to interrupt the transmission of tuberculosis and multidrug-resistant tuberculosis. The microscopic-observation drug-susceptibility (MODS) assay is a rapid, accurate and simple liquid culture method to detect multidrug-resistant tuberculosis (MDR-TB). MODS was evaluated to determine whether a lower, identical concentration of isoniazid and rifampin can be used for the detection of MDR-TB. Direct drug-susceptibility testing was performed with the use of the MODS assay. Drug-sensitive control strains were tested daily. The concentrations used for both isoniazid and rifampin were identical: 0.16, 0.08 and 0.04 μg per milliliter. We tested 56 M. tuberculosis clinical isolates and the control strain M. tuberculosis H37Rv. All concentrations showed the same result. Of 53 M. tuberculosis clinical isolates, 14 were MDR-TB, 38 were susceptible to isoniazid and rifampin, and 1 was resistant to isoniazid only. Drug-susceptibility testing was performed with the proportion method using the Mycobacteria Growth Indicator Tube (MGIT) system as the reference. The results of the MODS assay using the lower concentrations were in significant agreement (P<0.001) with the reference method. A lower, identical concentration of isoniazid and rifampin can therefore be used to detect MDR-TB. Operational costs can be reduced and application can be easier in resource-limited environments. However, additional studies evaluating MODS using a lower, identical concentration of isoniazid and rifampin must be conducted with a larger number of clinical isolates.
Keywords: isoniazid, MODS assay, MDR-TB, rifampin
Procedia PDF Downloads 319
2150 Avidity and IgE versus IgG and IgM in Diagnosis of Maternal Toxoplasmosis
Authors: Ghada A. Gamea, Nabila A. Yaseen, Ahmed A. Othman, Ahmed S. Tawfik
Abstract:
Infection with Toxoplasma gondii can cause serious complications in pregnant women, leading to abortion, stillbirth, and congenital anomalies in the fetus. Definitive diagnosis of T. gondii acute infection is therefore critical for the clinical management of a mother and her fetus. This study was conducted on 250 pregnant females in the first trimester who were inpatients or outpatients at Obstetrics and Gynaecology Department at Tanta University Hospital. Screening of the selected females was done for the detection of immunoglobulin (IgG and IgM), and all subjects were submitted to history taking through a questionnaire including personal data, risk factors for Toxoplasma, complaint and history of the present illness. Thirty-eight samples, including 18 IgM +ve and 20 IgM-ve cases were further investigated by the avidity and IgE ELISA tests. The seroprevalence of toxoplasmosis in pregnant women was (42.8%) based on the presence of IgG antibodies in their sera. Contact with cats and consumption of raw or undercooked meat are important risk factors that were associated with toxoplasmosis in pregnant women. By serology, it could be observed that in the IgM +ve group, only one case (5.6%) showed an acute pattern by using the avidity test, though 10 (55.6%) cases were found to be acute by the IgE assay. On the other hand, in the IgM –ve group, 3 (15%) showed low avidity, but none of them was positive by using the IgE assay. In conclusion, there is no single serological test that can be used to confirm whether T. gondii infection is recent or was acquired in the distant past. A panel of tests for detection of toxoplasmosis will certainly have higher discriminatory power than any test alone.Keywords: diagnosis, serology, seroprevalence, toxoplasmosis
Procedia PDF Downloads 151
2149 Surface Plasmon Resonance Imaging-Based Epigenetic Assay for Blood DNA Post-Traumatic Stress Disorder Biomarkers
Authors: Judy M. Obliosca, Olivia Vest, Sandra Poulos, Kelsi Smith, Tammy Ferguson, Abigail Powers Lott, Alicia K. Smith, Yang Xu, Christopher K. Tison
Abstract:
Post-Traumatic Stress Disorder (PTSD) is a mental health problem that people may develop after experiencing traumatic events such as combat, natural disasters, and major emotional challenges. Tragically, the number of military personnel with PTSD correlates directly with the number of veterans who attempt suicide, with the highest rate in the Army. Research has shown epigenetic risks in those who are prone to several psychiatric dysfunctions, particularly PTSD. Once initiated in response to trauma, epigenetic alterations in particular, the DNA methylation in the form of 5-methylcytosine (5mC) alters chromatin structure and represses gene expression. Current methods to detect DNA methylation, such as bisulfite-based genomic sequencing techniques, are laborious and have massive analysis workflow while still having high error rates. A faster and simpler detection method of high sensitivity and precision would be useful in a clinical setting to confirm potential PTSD etiologies, prevent other psychiatric disorders, and improve military health. A nano-enhanced Surface Plasmon Resonance imaging (SPRi)-based assay that simultaneously detects site-specific 5mC base (termed as PTSD base) in methylated genes related to PTSD is being developed. The arrays on a sensing chip were first constructed for parallel detection of PTSD bases using synthetic and genomic DNA (gDNA) samples. For the gDNA sample extracted from the whole blood of a PTSD patient, the sample was first digested using specific restriction enzymes, and fragments were denatured to obtain single-stranded methylated target genes (ssDNA). The resulting mixture of ssDNA was then injected into the assay platform, where targets were captured by specific DNA aptamer probes previously immobilized on the surface of a sensing chip. The PTSD bases in targets were detected by anti-5-methylcytosine antibody (anti-5mC), and the resulting signals were then enhanced by the universal nanoenhancer. Preliminary results showed successful detection of a PTSD base in a gDNA sample. Brighter spot images and higher delta values (control-subtracted reflectivity signal) relative to those of the control were observed. We also implemented the in-house surface activation system for detection and developed SPRi disposable chips. Multiplexed PTSD base detection of target methylated genes in blood DNA from PTSD patients of severity conditions (asymptomatic and severe) was conducted. This diagnostic capability being developed is a platform technology, and upon successful implementation for PTSD, it could be reconfigured for the study of a wide variety of neurological disorders such as traumatic brain injury, Alzheimer’s disease, schizophrenia, and Huntington's disease and can be extended to the analyses of other sample matrices such as urine and saliva.Keywords: epigenetic assay, DNA methylation, PTSD, whole blood, multiplexing
Procedia PDF Downloads 121
2148 A Spatio-Temporal Analysis and Change Detection of Wetlands in Diamond Harbour, West Bengal, India Using Normalized Difference Water Index
Authors: Lopita Pal, Suresh V. Madha
Abstract:
Wetlands are areas of marsh, fen, peat land or water, whether natural or artificial, permanent or temporary, with water that is static or flowing, fresh, brackish or salt, including areas of marine water the depth of which at low tide does not exceed six metres. The rapidly expanding human population, large-scale changes in land use/land cover, burgeoning development projects and improper use of watersheds have all caused a substantial decline of wetland resources in the world. Major degradation has resulted from agricultural, industrial and urban development, leading to various types of pollution and hydrological perturbation. Regular fishing activities and unsustainable grazing of animals are degrading the wetlands at a slow pace. The paper focuses on the spatio-temporal change detection of the area of the water body and the main cause of this depletion. The total area under study (22°19’87’’ N, 88°20’23’’ E) is a wetland region in West Bengal of 213 sq. km. The procedure uses the Normalized Difference Water Index (NDWI), computed from multi-spectral Landsat imagery, to detect the presence of surface water, and the datasets for the years 2016, 2006 and 1996 have been compared. The result shows a sharp decline in the area of the water body due to a rapid increase in agricultural practices and growing urbanization.
Keywords: spatio-temporal change, NDWI, urbanization, wetland
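For reference, the McFeeters form of the NDWI commonly computed from the Landsat green and near-infrared bands is shown below; the abstract does not state which NDWI variant or threshold was used, so this is a standard formulation rather than the authors' exact choice.

```latex
\mathrm{NDWI} = \frac{\rho_{\text{Green}} - \rho_{\text{NIR}}}{\rho_{\text{Green}} + \rho_{\text{NIR}}},
\qquad \text{water if } \mathrm{NDWI} > \tau \ (\tau \approx 0)
```

Differencing the thresholded water masks of two acquisition dates then gives the change in open-water area between those dates.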
Procedia PDF Downloads 281
2147 A Practical and Theoretical Study on the Electromotor Bearing Defect Detection in a Wet Mill Using the Vibration Analysis Method and Defect Length Calculation in the Bearing
Authors: Mostafa Firoozabadi, Alireza Foroughi Nematollahi
Abstract:
Wet mills are among the most important pieces of equipment in the mining industries; any defect occurring in them can stop the production line and cause irrecoverable damage to the system. Electromotors are significant parts of a mill, and monitoring them is necessary to prevent unwanted defects. The purpose of this study is to investigate electromotor bearing defects, theoretically and practically, using the vibration analysis method. When a defect happens in a bearing, it can be transferred to the other parts of the bearing, such as the inner ring, outer ring, balls, and the bearing cage. The source of electromotor defects can be electrical or mechanical. Sometimes, the electrical and mechanical defect frequencies are modulated, and bearing defect detection becomes difficult. In this paper, to detect the electromotor bearing defects, the electrical and mechanical defect frequencies are first extracted. Then, by calculating the bearing defect frequencies and analyzing the spectrum and time signals, the bearing defects are detected. In addition, the obtained frequency identifies the bearing element in which the defect has occurred, and by comparing the corresponding vibration level to the standards, the remaining bearing lifetime is estimated. Finally, the defect length is calculated by theoretical equations to demonstrate that there is no need to replace the bearing. The results of the proposed method, which has been implemented on the wet mills of the Golgohar mining and industrial company in Iran, show that this method is capable of detecting electromotor bearing defects accurately and on time.
Keywords: bearing defect length, defect frequency, electromotor defects, vibration analysis
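The characteristic bearing defect frequencies referred to above are conventionally computed from the bearing geometry and the shaft speed; the sketch below uses the standard kinematic formulas, with illustrative geometry values that are not taken from the paper.

```python
# Standard kinematic formulas for bearing characteristic defect frequencies.
# The geometry and shaft speed in the example are illustrative assumptions.
import math

def bearing_defect_frequencies(shaft_hz, n_balls, ball_d, pitch_d, contact_angle_deg=0.0):
    """Return FTF, BPFO, BPFI and BSF in Hz for a rolling-element bearing."""
    r = (ball_d / pitch_d) * math.cos(math.radians(contact_angle_deg))
    ftf = 0.5 * shaft_hz * (1.0 - r)                              # cage (fundamental train)
    bpfo = 0.5 * n_balls * shaft_hz * (1.0 - r)                   # outer-race defect
    bpfi = 0.5 * n_balls * shaft_hz * (1.0 + r)                   # inner-race defect
    bsf = (pitch_d / (2.0 * ball_d)) * shaft_hz * (1.0 - r ** 2)  # ball spin
    return {"FTF": ftf, "BPFO": bpfo, "BPFI": bpfi, "BSF": bsf}

# Example: a 1480-rpm electromotor with a 9-ball bearing (illustrative numbers).
print(bearing_defect_frequencies(shaft_hz=1480 / 60, n_balls=9, ball_d=7.9, pitch_d=34.5))
```

Peaks in the vibration spectrum at these frequencies, or at their harmonics and sidebands, indicate which bearing element is defective.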
Procedia PDF Downloads 500
2146 Information Retrieval from Internet Using Hand Gestures
Authors: Aniket S. Joshi, Aditya R. Mane, Arjun Tukaram
Abstract:
In the 21st century, in the era of the e-world, people continuously keep up with daily information such as weather conditions, news, stock exchange updates, new projects, cricket updates and other sports results. When busy, they want this information quickly and with little use of the keyboard. Today, in order to get such information, users have to repeat the same mouse and keyboard actions, which costs time and causes inconvenience. In India, owing to their rural background, many people are also not very familiar with the use of computers and the internet. Moreover, in small clinics, small offices, hotels and airports, there should be a system that retrieves daily information with minimal use of keyboard and mouse actions. We plan to design an application-based project that can easily retrieve information with minimal keyboard and mouse use and make this task more convenient and easier. This is possible with an image processing application that takes real-time hand gestures, which are matched by the system to retrieve information. Once the functions are selected with hand gestures, the system reports the corresponding information to the user. In this project, we use real-time hand gesture movements to select the required option, which is shown on the screen in the form of RSS feeds. The gesture selects the required option, and the corresponding information pops up for the user. Real-time hand gestures make the application handier and easier to use.
Keywords: hand detection, hand tracking, hand gesture recognition, HSV color model, blob detection
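The keywords point to an HSV color model with blob detection for locating the hand; a minimal sketch of that pipeline is given below. The HSV skin-color bounds and the minimum blob area are illustrative assumptions, not values from the paper.

```python
# Sketch of HSV thresholding + blob detection for hand localization.
# The skin-color bounds and minimum area are assumptions, not the paper's values.
import cv2
import numpy as np

LOWER_SKIN = np.array([0, 40, 60], dtype=np.uint8)     # assumed HSV lower bound
UPPER_SKIN = np.array([25, 255, 255], dtype=np.uint8)  # assumed HSV upper bound

def detect_hand(frame_bgr, min_area=2000):
    """Return the bounding box (x, y, w, h) of the largest skin-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < min_area:
        return None
    return cv2.boundingRect(hand)
```

Tracking the box center across frames gives the gesture trajectory that would be matched against the on-screen RSS-feed options.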
Procedia PDF Downloads 287
2145 Development of Advanced Virtual Radiation Detection and Measurement Laboratory (AVR-DML) for Nuclear Science and Engineering Students
Authors: Lily Ranjbar, Haori Yang
Abstract:
Online education has been around for several decades, but the importance of online education became evident after the COVID-19 pandemic. Even though the online delivery approach works well for knowledge building through delivering content and oversight processes, it has limitations in developing hands-on laboratory skills, especially in the STEM field. During the pandemic, many educational institutions faced numerous challenges in delivering lab-based courses, especially in the STEM field. Also, many students worldwide were unable to practice working with lab equipment due to social distancing or the significant cost of highly specialized equipment. The laboratory plays a crucial role in nuclear science and engineering education. It can engage students and improve their learning outcomes. In addition, online education and virtual labs have gained substantial popularity in engineering and science education. Therefore, developing virtual labs is vital for institutions to deliver high-class education to their students, including their online students. The School of Nuclear Science and Engineering (NSE) at Oregon State University, in partnership with the company SpectralLabs, has developed an Advanced Virtual Radiation Detection and Measurement Lab (AVR-DML) to offer a fully online Master of Health Physics program. It was essential for us to use a system that could simulate nuclear modules that accurately replicate the underlying physics, the nature of radiation and radiation transport, and the mechanics of the instrumentation used in a real radiation detection lab. This was all accomplished using a Realistic, Adaptive, Interactive Learning System (RAILS). RAILS is a comprehensive software simulation-based learning system for use in training. It comprises a web-based learning management system located on a central server, as well as a 3D simulation package that is downloaded locally to user machines. Users will find that the graphics, animations, and sounds in RAILS create a realistic, immersive environment to practice detecting different radiation sources. These features allow students to coexist, interact and engage with a real STEM lab in all its dimensions. They enable them to feel as if they are in a real lab environment and to see the same systems they would see in a lab. Unique interactive interfaces were designed and developed by integrating all the tools and equipment needed to run each lab. These interfaces provide students with full functionality for data collection, changes to the experimental setup, and live data collection with real-time updates for each experiment. Students can manually do all experimental setups and parameter changes in this lab. Experimental results can then be tracked and analyzed in an oscilloscope, a multi-channel analyzer, or a single-channel analyzer (SCA). The advanced virtual radiation detection and measurement laboratory developed in this study enabled the NSE school to offer a fully online MHP program. This flexibility of course modality helped us to attract more non-traditional students, including international students. It is a valuable educational tool, as students can walk around the virtual lab, make mistakes, and learn from them. They have an unlimited amount of time to repeat and engage in experiments. This lab will also help us speed up training in nuclear science and engineering.
Keywords: advanced radiation detection and measurement, virtual laboratory, realistic adaptive interactive learning system (RAILS), online education in STEM fields, student engagement, STEM online education, STEM laboratory, online engineering education
Procedia PDF Downloads 88
2144 Advanced Magnetic Field Mapping Utilizing Vertically Integrated Deployment Platforms
Authors: John E. Foley, Martin Miele, Raul Fonda, Jon Jacobson
Abstract:
This paper presents the development and implementation of new and innovative data collection and analysis methodologies based on the deployment of total-field magnetometer arrays. Our research has focused on the development of a vertically integrated suite of platforms, all utilizing common data acquisition, data processing and analysis tools. These survey platforms include low-altitude helicopters and ground-based vehicles, including robots, for terrestrial mapping applications. For marine settings, the sensor arrays are deployed either from a hydrodynamic bottom-following wing towed from a surface vessel or from a towed floating platform for shallow-water settings. Additionally, sensor arrays are deployed from tethered remotely operated vehicles (ROVs) for underwater settings where high maneuverability is required. While the primary application of these systems is the detection and mapping of unexploded ordnance (UXO), these systems are also used for various infrastructure mapping and geologic investigations. For each application, success is driven by the integration of magnetometer arrays, accurate geo-positioning, system noise mitigation, and stable deployment of the system in appropriate proximity to expected targets or features. Each of the systems collects geo-registered data compatible with a web-enabled data management system, providing immediate access to data and metadata for remote processing, analysis and delivery of results. This approach allows highly sophisticated magnetic processing methods, including classification based on dipole modeling and remanent magnetization, to be efficiently applied to many projects. This paper also briefly describes the initial development of magnetometer-based detection systems deployed from low-altitude helicopter platforms and the subsequent successful transition of this technology to the marine environment. Additionally, we present examples from a range of terrestrial and marine settings as well as ongoing research efforts related to sensor miniaturization for unmanned aerial vehicle (UAV) magnetic field mapping applications.
Keywords: dipole modeling, magnetometer mapping systems, sub-surface infrastructure mapping, unexploded ordnance detection
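For context, the dipole modeling mentioned above conventionally fits the measured total-field anomaly to the field of a point magnetic dipole; a standard form of that field (not taken from the paper) is:

```latex
\mathbf{B}(\mathbf{r}) \;=\; \frac{\mu_0}{4\pi}\,
\frac{3\,(\mathbf{m}\cdot\hat{\mathbf{r}})\,\hat{\mathbf{r}} - \mathbf{m}}{\lVert\mathbf{r}\rVert^{3}},
\qquad \hat{\mathbf{r}} = \frac{\mathbf{r}}{\lVert\mathbf{r}\rVert}
```

Here m is the dipole moment (induced plus remanent) and r is the vector from the source to the sensor; inverting a grid of total-field measurements for m and the source position is what supports classification of compact targets such as UXO.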
Procedia PDF Downloads 463
2143 External Noise Distillation in Quantum Holography with Undetected Light
Authors: Sebastian Töpfer, Jorge Fuenzalida, Marta Gilaberte Basset, Juan P. Torres, Markus Gräfe
Abstract:
This work presents an experimental and theoretical study of the noise resilience of quantum holography with undetected photons. Quantum imaging has become an important research topic in recent years, after its first publication in 2014. Following this research, advances towards different spectral ranges in detection and different optical geometries have been made. In particular, interest in near-infrared to mid-infrared measurements has developed because of the unique characteristic that allows a sample to be probed with photons of a different wavelength than the photons arriving at the detector. This promising effect can be used for medical applications, to measure in the so-called molecular fingerprint region, while using broadly available detectors for the visible spectral range. Further advances in quantum imaging methods have been made through new measurement and detection schemes, one of which is quantum holography with undetected light. It combines digital phase-shifting holography with quantum imaging to extend the obtainable sample information by measuring not only the object transmission but also its influence on the phase shift experienced by the transmitted light. This work presents extended research on the quantum holography with undetected light scheme regarding the influence of external noise. It is shown experimentally and theoretically that the sample information can still be retrieved at noise levels 250 times higher than the signal level, because the information is carried by the interferometric pattern. A detailed theoretical explanation is also provided.
Keywords: distillation, quantum holography, quantum imaging, quantum metrology
Procedia PDF Downloads 73
2142 Detection and Dissemination of Putative Virulence Genes from Brucella Species Isolated from Livestock in Eastern Cape Province of South Africa
Authors: Rudzani Manafe, Ezekiel Green
Abstract:
Brucella, the causative agent of brucellosis, has many different virulence factors; depending on the environment and other factors, some may play a greater role than others during infection and, as a result, in pathogenesis. Brucella melitensis and Brucella abortus are considered to be pathogenic to humans. The occurrence of nine putative virulence genes in two Brucella species from Eastern Cape livestock has been examined. A hundred and twenty isolates obtained from the Molecular Pathogenesis and Molecular Epidemiology Research Group (MPMERG) were used for this study. All isolates were grown on Brucella agar medium. Nine primer pairs were used for the detection of the virB2, virB5, vceC, btpA, btpB, prpA, betB, bpe275, and bspB virulence factors using the polymerase chain reaction (PCR). Approximately 100% prevalence was observed for the genes BecC and BetB in B. abortus, while the least frequently observed gene in B. abortus was PrpA, at 4.6%. In B. melitensis, BetB was detected in 34.7% of isolates, while virB2 and prpA (0%) were not detected. The results from this research suggest that most isolates of Brucella have virulence-related genes associated with disease pathogenesis. Finally, our findings showed that Brucella strains in the Eastern Cape Province are highly virulent, as virulence characteristics exist in most of the strains investigated.
Keywords: putative virulence genes, brucella, polymerase chain reaction, milk
Procedia PDF Downloads 137
2141 Colloid-Based Biodetection at Aqueous Electrical Interfaces Using Fluidic Dielectrophoresis
Authors: Francesca Crivellari, Nicholas Mavrogiannis, Zachary Gagnon
Abstract:
Portable diagnostic methods have become increasingly important for a number of different purposes: point-of-care screening in developing nations, environmental contamination studies, bio/chemical warfare agent detection, and end-user use for commercial health monitoring. The cheapest and most portable methods currently available are paper-based – lateral flow and dipstick methods are widely available in drug stores for use in pregnancy detection and blood glucose monitoring. These tests are successful because they are cheap to produce, easy to use, and require minimally invasive sampling. While adequate for their intended uses, in the realm of blood-borne pathogens and numerous cancers, these paper-based methods become unreliable, as they lack the nM/pM sensitivity currently achieved by clinical diagnostic methods. Clinical diagnostics, however, utilize techniques involving surface plasmon resonance (SPR) and enzyme-linked immunosorbent assays (ELISAs), which are expensive and unfeasible in terms of portability. To develop a better, competitive biosensor, we must reduce the cost of one, or increase the sensitivity of the other. Electric fields are commonly utilized in microfluidic devices to manipulate particles, biomolecules, and cells. Applications in this area, however, are primarily limited to interfaces formed between immiscible interfaces. Miscible, liquid-liquid interfaces are common in microfluidic devices, and are easily reproduced with simple geometries. Here, we demonstrate the use of electrical fields at liquid-liquid electrical interfaces, known as fluidic dielectrophoresis, (fDEP) for biodetection in a microfluidic device. In this work, we apply an AC electric field across concurrent laminar streams with differing conductivities and permittivities to polarize the interface and induce a discernible, near-immediate, frequency-dependent interfacial tilt. We design this aqueous electrical interface, which becomes the biosensing “substrate,” to be intelligent – it “moves” only when a target of interest is present. This motion requires neither labels nor expensive electrical equipment, so the biosensor is inexpensive and portable, yet still capable of sensitive detection. Nanoparticles, due to their high surface-area-to-volume ratio, are often incorporated to enhance detection capabilities of schemes like SPR and fluorimetric assays. Most studies currently investigate binding at an immobilized solid-liquid or solid-gas interface, where particles are adsorbed onto a planar surface, functionalized with a receptor to create a reactive substrate, and subsequently flushed with a fluid or gas with the relevant analyte. These typically involve many preparation and rinsing steps, and are susceptible to surface fouling. Our microfluidic device is continuously flowing and renewing the “substrate,” and is thus not subject to fouling. In this work, we demonstrate the ability to electrokinetically detect biomolecules binding to functionalized nanoparticles at liquid-liquid interfaces using fDEP. In biotin-streptavidin experiments, we report binding detection limits on the order of 1-10 pM, without amplifying signals or concentrating samples. We also demonstrate the ability to detect this interfacial motion, and thus the presence of binding, using impedance spectroscopy, allowing this scheme to become non-optical, in addition to being label-free.Keywords: biodetection, dielectrophoresis, microfluidics, nanoparticles
Procedia PDF Downloads 387
2140 Deep Learning-Based Object Detection on Low Quality Images: A Case Study of Real-Time Traffic Monitoring
Authors: Jean-Francois Rajotte, Martin Sotir, Frank Gouineau
Abstract:
The installation and management of traffic monitoring devices can be costly from both a financial and resource point of view. It is therefore important to take advantage of in-place infrastructures to extract the most information. Here we show how low-quality urban road traffic images from cameras already available in many cities (such as Montreal, Vancouver, and Toronto) can be used to estimate traffic flow. To this end, we use a pre-trained neural network, developed for object detection, to count vehicles within images. We then compare the results with human annotations gathered through crowdsourcing campaigns. We use this comparison to assess performance and calibrate the neural network annotations. As a use case, we consider six months of continuous monitoring over hundreds of cameras installed in the city of Montreal. We compare the results with city-provided manual traffic counting performed in similar conditions at the same location. The good performance of our system allows us to consider applications which can monitor the traffic conditions in near real-time, making the counting usable for traffic-related services. Furthermore, the resulting annotations pave the way for building a historical vehicle counting dataset to be used for analysing the impact of road traffic on many city-related issues, such as urban planning, security, and pollution.Keywords: traffic monitoring, deep learning, image annotation, vehicles, roads, artificial intelligence, real-time systems
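As an illustration of the counting step described above, the sketch below runs a pre-trained, off-the-shelf detector over a single camera frame and counts the detections whose COCO class is a vehicle. The abstract does not name the network used, so the torchvision Faster R-CNN model, the score threshold and the class list here are stand-in assumptions.

```python
# Sketch of counting vehicles in one traffic-camera frame with a pre-trained
# detector; model choice, threshold and class ids are stand-in assumptions.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

VEHICLE_CLASSES = {3, 4, 6, 8}  # COCO ids: car, motorcycle, bus, truck

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def count_vehicles(image_path, score_thresh=0.5):
    """Return the number of detected vehicles in one low-quality camera frame."""
    img = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        out = model([img])[0]
    keep = out["scores"] >= score_thresh
    labels = out["labels"][keep].tolist()
    return sum(1 for lbl in labels if lbl in VEHICLE_CLASSES)
```

Comparing such counts against the crowdsourced human annotations on the same frames is what would produce the calibration described in the abstract.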
Procedia PDF Downloads 198
2139 Investigation of Ascochyta Blight Resistance in Registered Turkish Chickpea (Cicer arietinum L.) Varieties by Using Molecular Techniques
Authors: Ibrahim Ilker Ozyigit, Fatih Tabanli, Sezin Adinir
Abstract:
In this study, Ascochyta blight resistance was investigated in 34 registered chickpea varieties that are widely planted in different regions of Turkey. To this end, molecular marker techniques such as STMS, RAPD and ISSR were used. The Ta2, Ta146 and Ts54 primers were used for STMS, the UBC733 and UBC681 primers for RAPD, and the UBC836 and UBC858 primers for ISSR. The Ta2, Ta146 and Ts54 (STMS) and UBC733 (RAPD) primers were able to discriminate for Ascochyta blight resistance. The Ta2, Ts54 and Ta146 primers yielded quite effective results in detecting resistant and sensitive varieties. The UBC733 primer, however, distinguished only the standards and did not give reliable results for the other varieties, since it identified them all as resistant. In addition, monomorphic bands were obtained from the UBC681 (RAPD), UBC836 and UBC858 (ISSR) primers, which therefore did not give reliable results for the detection of resistance against Ascochyta blight disease. The results obtained provide information on both disease resistance and genetic diversity in registered Turkish chickpea varieties. This project was funded through the Scientific Research Projects of Marmara University under Grant Number FEN-C-YLP-070617-0365 and The Scientific and Technological Research Council of Turkey (TUBITAK) under Grant Number 113O070.
Keywords: plant genetics, ISSR, RAPD, STMS
Procedia PDF Downloads 196
2138 A Spatial Approach to Model Mortality Rates
Authors: Yin-Yee Leong, Jack C. Yue, Hsin-Chung Wang
Abstract:
Human longevity has been experiencing its largest increase since the end of World War II, and modeling the mortality rates is therefore often the focus of many studies. Among all mortality models, the Lee–Carter model is the most popular approach since it is fairly easy to use and has good accuracy in predicting mortality rates (e.g., for Japan and the USA). However, empirical studies from several countries have shown that the age parameters of the Lee–Carter model are not constant in time. Many modifications of the Lee–Carter model have been proposed to deal with this problem, including adding an extra cohort effect and adding another period effect. In this study, we propose a spatial modification and use clusters to explain why the age parameters of the Lee–Carter model are not constant. In spatial analysis, clusters are areas with unusually high or low mortality rates than their neighbors, where the “location” of mortality rates is measured by age and time, that is, a 2-dimensional coordinate. We use a popular cluster detection method—Spatial scan statistics, a local statistical test based on the likelihood ratio test to evaluate where there are locations with mortality rates that cannot be described well by the Lee–Carter model. We first use computer simulation to demonstrate that the cluster effect is a possible source causing the problem of the age parameters not being constant. Next, we show that adding the cluster effect can solve the non-constant problem. We also apply the proposed approach to mortality data from Japan, France, the USA, and Taiwan. The empirical results show that our approach has better-fitting results and smaller mean absolute percentage errors than the Lee–Carter model.Keywords: mortality improvement, Lee–Carter model, spatial statistics, cluster detection
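For reference, the Lee–Carter model discussed above is usually written as on the left below; the right-hand form is one way to sketch the proposed spatial modification, with an indicator term for each detected cluster of (age, time) cells. The exact functional form used by the authors is not given in the abstract.

```latex
\ln m_{x,t} = a_x + b_x\,k_t + \varepsilon_{x,t}
\qquad\longrightarrow\qquad
\ln m_{x,t} = a_x + b_x\,k_t + \sum_{j}\delta_j\,\mathbf{1}\{(x,t)\in C_j\} + \varepsilon_{x,t}
```

Here m_{x,t} is the central death rate at age x in year t, a_x the average age pattern, k_t the period index with age-specific sensitivities b_x, and C_j the clusters of (age, time) cells flagged by the spatial scan statistic, each shifted by δ_j.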
Procedia PDF Downloads 170
2137 Urban Growth Analysis Using Multi-Temporal Satellite Images, Non-stationary Decomposition Methods and Stochastic Modeling
Authors: Ali Ben Abbes, ImedRiadh Farah, Vincent Barra
Abstract:
Remotely sensed data are a significant source for monitoring and updating land use/land cover databases. Nowadays, change detection in urban areas is a subject of intensive research. Timely and accurate data on the spatio-temporal changes of urban areas are therefore required. The data extracted from multi-temporal satellite images are usually non-stationary; in fact, the changes evolve in time and space. This paper is an attempt to propose a methodology for change detection in urban areas by combining a non-stationary decomposition method and stochastic modeling. We consider as input to our methodology a sequence of satellite images I1, I2, … In at different periods (t = 1, 2, ..., n). Firstly, preprocessing of the multi-temporal satellite images (e.g., radiometric, atmospheric and geometric corrections) is applied. The systematic study of global urban expansion in our methodology can be approached in two ways: the first considers the urban area as a single object as opposed to non-urban areas (e.g., vegetation, bare soil and water), the objective being to extract the urban mask; the second aims to obtain deeper knowledge of the urban area by distinguishing different types of tissue within it. In order to validate our approach, we used a database of Tres Cantos, Madrid, in Spain, derived from Landsat over the period from January 2004 to July 2013 by collecting two frames per year at a spatial resolution of 25 meters. The obtained results show the effectiveness of our method.
Keywords: multi-temporal satellite image, urban growth, non-stationary, stochastic model
Procedia PDF Downloads 426
2136 Sensing Study through Resonance Energy and Electron Transfer between Föster Resonance Energy Transfer Pair of Fluorescent Copolymers and Nitro-Compounds
Authors: Vishal Kumar, Soumitra Satapathi
Abstract:
Förster Resonance Energy Transfer (FRET) is a powerful technique used to probe close-range molecular interactions. Physically, the FRET phenomenon manifests as a dipole–dipole interaction between closely juxtaposed fluorescent molecules (10–100 Å). Our effort is to employ this FRET technique to make a prototype device for the highly sensitive detection of environmental pollutants. Among the most common environmental pollutants, nitroaromatic compounds (NACs) are of particular interest because of their durability and toxicity. Sensitive and selective detection of small amounts of nitroaromatic explosives, in particular 2,4,6-trinitrophenol (TNP), 2,4-dinitrotoluene (DNT) and 2,4,6-trinitrotoluene (TNT), has therefore been a critical challenge due to the increasing threat of explosive-based terrorism and the need for environmental monitoring of drinking and waste water. In addition, the excessive utilization of TNP in several other areas, such as burn ointments, pesticides, and the glass and leather industries, has resulted in environmental accumulation and is eventually contaminating soil and aquatic systems. To date, a large number of elegant methods, including fluorimetry, gas chromatography, and mass, ion-mobility and Raman spectrometry, have been successfully applied to explosive detection. Among these efforts, fluorescence-quenching methods based on the mechanism of FRET show good assembly flexibility, high selectivity and sensitivity. Here, we report a FRET-based sensor system for the highly selective detection of NACs, such as TNP, DNT and TNT. The sensor system is composed of the copolymer Poly[(N,N-dimethylacrylamide)-co-(Boc-Trp-EMA)] (RP), bearing a tryptophan derivative in the side chain, as donor and the dansyl-tagged copolymer P(MMA-co-Dansyl-Ala-HEMA) (DCP) as acceptor. Initially, the inherent fluorescence of the RP copolymer is quenched by non-radiative energy transfer to DCP, which only happens once the two molecules are within the Förster critical distance (R0). The excellent spectral overlap (Jλ = 6.08×10¹⁴ nm⁴M⁻¹cm⁻¹) between the donor's (RP) emission profile and the acceptor's (DCP) absorption profile makes them an exciting and efficient FRET pair, which is further confirmed by the high rate of energy transfer from RP to DCP (0.87 ns⁻¹) and by lifetime measurements with time-correlated single photon counting (TCSPC) that validate the 64% FRET efficiency. This FRET pair exhibited a specific fluorescence response to NACs such as DNT, TNT and TNP with LODs of 5.4, 2.3 and 0.4 µM, respectively. The detection of NACs occurs with high sensitivity by photoluminescence quenching of the FRET signal, induced by photo-induced electron transfer (PET) from the electron-rich FRET pair to the electron-deficient NAC molecules. The estimated Stern–Volmer constant (KSV) values for DNT, TNT and TNP are 6.9 × 10³, 7.0 × 10³ and 1.6 × 10⁴ M⁻¹, respectively. The mechanistic details of the molecular interactions, established by time-resolved fluorescence, steady-state fluorescence and absorption spectroscopy, confirm that the sensing process is of mixed type, i.e., both dynamic and static quenching, as the lifetime of the FRET system (0.73 ns) is reduced to 0.55, 0.57 and 0.61 ns by DNT, TNT and TNP, respectively. In summary, the simplicity and sensitivity of this novel FRET sensor open up the possibility of designing optical sensors for various NACs on one single platform and of developing a multimodal sensor for environmental monitoring and future field-based studies.
Keywords: FRET, nitroaromatic, Stern–Volmer constant, tryptophan and dansyl tagged copolymer
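For reference, the FRET efficiency and the Stern–Volmer quenching relation quoted above are conventionally defined as follows (standard textbook relations, not derived from this paper):

```latex
E = 1 - \frac{\tau_{DA}}{\tau_{D}},
\qquad
\frac{F_0}{F} = 1 + K_{SV}\,[Q]
```

Here τ_D and τ_DA are the donor lifetimes in the absence and presence of the acceptor, F_0 and F are the fluorescence intensities without and with the quencher, and [Q] is the quencher (NAC) concentration; the slope of F_0/F versus [Q] gives the reported K_SV values.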
Procedia PDF Downloads 133
2135 Functionalized Carbon-Base Fluorescent Nanoparticles for Emerging Contaminants Targeted Analysis
Authors: Alexander Rodríguez-Hernández, Arnulfo Rojas-Perez, Liz Diaz-Vazquez
Abstract:
The rise in consumerism over the past century has resulted in the creation of higher amounts of plasticizers, personal care products and other chemical substances, which enter and accumulate in water systems. Other sources of pollutants in Neotropical regions experience large inputs of nutrients with these pollutants resulting in eutrophication of water which consume large quantities of oxygen, resulting in high fish mortality. This dilemma has created a need for the development of targeted detection in complex matrices and remediation of emerging contaminants. We have synthesized carbon nanoparticles from macro algae (Ulva fasciata) by oxidizing the graphitic carbon network under extreme acidic conditions. The resulting material was characterized by STEM, yielding a spherical 12 nm average diameter nanoparticles, which can be fixed into a polysaccharide aerogel synthesized from the same macro algae. Spectrophotometer analyses show a pH dependent fluorescent behavior varying from 450-620 nm in aqueous media. Heavily oxidized edges provide for easy functionalization with enzymes for a more targeted analysis and remediation technique. Given the optical properties of the carbon base nanoparticles and the numerous possibilities of functionalization, we have developed a selective and robust targeted bio-detection and bioremediation technique for the treatment of emerging contaminants in complex matrices like estuarine embayment.Keywords: aerogels, carbon nanoparticles, fluorescent, targeted analysis
Procedia PDF Downloads 240
2134 Fusion Neutron Generator Dosimetry and Applications for Medical, Security, and Industry
Authors: Kaouther Bergaui, Nafaa Reguigui, Charles Gary
Abstract:
Characterization and the applications of deuterium-deuterium (DD) neutron generator developed by Adelphie technology and acquired by the National Centre of Nuclear Science and Technology (NCNST) were presented in this work. We study the performance of the neutron generator in terms of neutron yield, production efficiency, and the ionic current as a function of the acceleration voltage at various RF powers. We provide the design and optimization of the PGNAA chamber and thus give insight into the capabilities of the planned PGNAA facility. Additional non-destructive techniques were studied employing the DD neutron generator, such as PGNAA and neutron radiography: The PGNAA is used for determining the concentration of 10B in Si and SiO2 matrices by using a germanium detector HPGe and the results obtained are compared with PGNAA system using a Sodium Iodide detector (NaI (Tl)); Neutron radiography facility was tested and simulated, using a camera device CCD and simulated by the Monte Carlo code; and the explosive detection system (EDS) also simulated using the Monte Carlo code. The study allows us to show that the new models of DD neutron generators are feasible and that superior-quality neutron beams could be produced and used for various applications. The feasibility of Boron neutron capture therapy (BNCT) for cancer treatment using a neutron generator was assessed by optimizing Beam Shaping Assembly (BSA) on a phantom using Monte-Carlo (MCNP6) simulations.Keywords: neutron generator deuterium-deuterium, Monte Carlo method, radiation, neutron flux, neutron activation analysis, born, neutron radiography, explosive detection, BNCT
Procedia PDF Downloads 191
2133 Using Geospatial Analysis to Reconstruct the Thunderstorm Climatology for the Washington DC Metropolitan Region
Authors: Mace Bentley, Zhuojun Duan, Tobias Gerken, Dudley Bonsal, Henry Way, Endre Szakal, Mia Pham, Hunter Donaldson, Chelsea Lang, Hayden Abbott, Leah Wilcynzski
Abstract:
Air pollution has the potential to modify the lifespan and intensity of thunderstorms and the properties of lightning. Using data mining and geovisualization, we investigate how background climate and weather conditions shape variability in urban air pollution and how this, in turn, shapes thunderstorms as measured by the intensity, distribution, and frequency of cloud-to-ground lightning. A spatiotemporal analysis was conducted in order to identify thunderstorms using high-resolution lightning detection network data. Over seven million lightning flashes were used to identify more than 196,000 thunderstorms that occurred between 2006 - 2020 in the Washington, DC Metropolitan Region. Each lightning flash in the dataset was grouped into thunderstorm events by means of a temporal and spatial clustering algorithm. Once the thunderstorm event database was constructed, hourly wind direction, wind speed, and atmospheric thermodynamic data were added to the initiation and dissipation times and locations for the 196,000 identified thunderstorms. Hourly aerosol and air quality data for the thunderstorm initiation times and locations were also incorporated into the dataset. Developing thunderstorm climatologies using a lightning tracking algorithm and lightning detection network data was found to be useful for visualizing the spatial and temporal distribution of urban augmented thunderstorms in the region.Keywords: lightning, urbanization, thunderstorms, climatology
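The flash-to-storm grouping described above can be sketched as a density-based spatiotemporal clustering step; DBSCAN and the distance and time scales below are stand-in assumptions, since the abstract does not name the specific algorithm used.

```python
# Sketch of grouping lightning flashes into thunderstorm events by clustering
# in space and time. DBSCAN and the eps/time scaling are stand-in assumptions.
import numpy as np
from sklearn.cluster import DBSCAN

def group_flashes(x_km, y_km, t_min, space_eps_km=15.0, time_eps_min=30.0, min_flashes=10):
    """Return an array of storm labels (-1 = noise) for each lightning flash."""
    # Scale time so that time_eps_min "costs" the same as space_eps_km; a single
    # Euclidean eps then covers both the spatial and temporal dimensions.
    pts = np.column_stack([x_km, y_km, np.asarray(t_min) * (space_eps_km / time_eps_min)])
    return DBSCAN(eps=space_eps_km, min_samples=min_flashes).fit_predict(pts)
```

Each non-noise label is one thunderstorm event; its first and last flashes give the initiation and dissipation times and locations to which the hourly weather, aerosol and air-quality records are joined.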
Procedia PDF Downloads 73
2132 Molecular Epidemiology of Egyptian Biomphalaria Snail: The Identification of Species, Diagnostic of the Parasite in Snails and Host Parasite Relationship
Authors: Hanaa M. Abu El Einin, Ahmed T. Sharaf El- Din
Abstract:
Biomphalaria snails play an integral role in the transmission of Schistosoma mansoni, the causative agent of human schistosomiasis. Two species of Biomphalaria have been reported from Egypt, Biomphalaria alexandrina and Biomphalaria glabrata, and later on a hybrid of B. alexandrina and B. glabrata was reported in streams of the Nile Delta. All were known to be excellent hosts of S. mansoni. The host–parasite relationship can be viewed in terms of snail susceptibility and parasite infectivity. The objective of this study is to highlight the progress that has been made in using molecular approaches for the correct identification of the snail species participating in the transmission of schistosomiasis and the rapid diagnosis of infection in snails, in addition to characterizing the susceptible and resistant phenotypes. Snails were identified using molecular methods involving Randomly Amplified Polymorphic DNA (RAPD) PCR, Polymerase Chain Reaction–Restriction Fragment Length Polymorphism (PCR-RFLP) and species-specific PCR. The molecular approaches used to diagnose the parasite in snails from Egypt were a nested PCR assay and the small subunit (SSU) rRNA gene, and RAPD-PCR was used to study the susceptible and resistant phenotypes. The results of the RAPD-PCR, PCR-RFLP and species-specific PCR techniques confirmed that there is no evidence for the presence of B. glabrata in Egypt: all Biomphalaria snails collected were identified as B. alexandrina, i.e., B. alexandrina is the common species, and there is no evidence of hybridization with B. glabrata. The adopted specific nested PCR assay showed much higher sensitivity, enabling the detection of S. mansoni-infected snails as early as 3 days post infection. The nested PCR method for the detection of infected snails uses S. mansoni fructose-1,6-bisphosphate aldolase (SMALDO) primers; these primers are specific to S. mansoni and are not cross-reactive with other schistosome or molluscan aldolases, and nested PCR for this gene is sensitive enough to detect a single cercaria. Analysis of genetic variation between B. alexandrina strains that are susceptible and resistant to Schistosoma infection using RAPD-PCR showed that 39.8% of the examined snails collected from the field were resistant, while 60.2% of these snails showed high infection rates. In conclusion, the genetics of the intermediate host plays an important role in the epidemiological control of schistosomiasis.
Keywords: biomphalaria, molecular differentiation, parasite detection, schistosomiasis
Procedia PDF Downloads 197
2131 Row Detection and Graph-Based Localization in Tree Nurseries Using a 3D LiDAR
Authors: Ionut Vintu, Stefan Laible, Ruth Schulz
Abstract:
Agricultural robotics has been developing steadily over recent years, with the goal of reducing and even eliminating pesticides used in crops and to increase productivity by taking over human labor. The majority of crops are arranged in rows. The first step towards autonomous robots, capable of driving in fields and performing crop-handling tasks, is for robots to robustly detect the rows of plants. Recent work done towards autonomous driving between plant rows offers big robotic platforms equipped with various expensive sensors as a solution to this problem. These platforms need to be driven over the rows of plants. This approach lacks flexibility and scalability when it comes to the height of plants or distance between rows. This paper proposes instead an algorithm that makes use of cheaper sensors and has a higher variability. The main application is in tree nurseries. Here, plant height can range from a few centimeters to a few meters. Moreover, trees are often removed, leading to gaps within the plant rows. The core idea is to combine row detection algorithms with graph-based localization methods as they are used in SLAM. Nodes in the graph represent the estimated pose of the robot, and the edges embed constraints between these poses or between the robot and certain landmarks. This setup aims to improve individual plant detection and deal with exception handling, like row gaps, which are falsely detected as an end of rows. Four methods were developed for detecting row structures in the fields, all using a point cloud acquired with a 3D LiDAR as an input. Comparing the field coverage and number of damaged plants, the method that uses a local map around the robot proved to perform the best, with 68% covered rows and 25% damaged plants. This method is further used and combined with a graph-based localization algorithm, which uses the local map features to estimate the robot’s position inside the greater field. Testing the upgraded algorithm in a variety of simulated fields shows that the additional information obtained from localization provides a boost in performance over methods that rely purely on perception to navigate. The final algorithm achieved a row coverage of 80% and an accuracy of 27% damaged plants. Future work would focus on achieving a perfect score of 100% covered rows and 0% damaged plants. The main challenges that the algorithm needs to overcome are fields where the height of the plants is too small for the plants to be detected and fields where it is hard to distinguish between individual plants when they are overlapping. The method was also tested on a real robot in a small field with artificial plants. The tests were performed using a small robot platform equipped with wheel encoders, an IMU and an FX10 3D LiDAR. Over ten runs, the system achieved 100% coverage and 0% damaged plants. The framework built within the scope of this work can be further used to integrate data from additional sensors, with the goal of achieving even better results.Keywords: 3D LiDAR, agricultural robots, graph-based localization, row detection
Procedia PDF Downloads 139
2130 Web Proxy Detection via Bipartite Graphs and One-Mode Projections
Authors: Zhipeng Chen, Peng Zhang, Qingyun Liu, Li Guo
Abstract:
With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked using web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, it has become an increasingly challenging task to detect proxy services due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavior similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of the bipartite graphs for discovering the social-behavior similarity of web proxy users. Based on the similarity matrices of end-users from the derived one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. The web proxy URLs may vary from time to time, but the users' inherent interests do not. Based on this intuition, and using our private tools implemented with WebDriver, we examine whether the top URLs visited by the web proxy users are web proxies. Our experimental results based on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown ones.
Keywords: bipartite graph, one-mode projection, clustering, web proxy detection
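A compact sketch of the bipartite-graph and one-mode-projection step is shown below: hosts and visited destinations form a bipartite graph, the host-side weighted projection serves as a behavior-similarity matrix, and spectral clustering groups hosts with similar browsing behavior. The toy flow records and the number of clusters are illustrative assumptions.

```python
# Sketch of bipartite modeling, one-mode projection and spectral clustering.
# The toy traffic records and cluster count are illustrative assumptions.
import networkx as nx
from networkx.algorithms import bipartite
from sklearn.cluster import SpectralClustering

flows = [("10.0.0.1", "proxy-a.example"), ("10.0.0.1", "news.example"),
         ("10.0.0.2", "proxy-a.example"), ("10.0.0.2", "news.example"),
         ("10.0.0.3", "shop.example"),    ("10.0.0.4", "shop.example")]

B = nx.Graph()
hosts = sorted({src for src, _ in flows})
B.add_nodes_from(hosts, bipartite=0)
B.add_nodes_from({dst for _, dst in flows}, bipartite=1)
B.add_edges_from(flows)

# One-mode projection onto hosts: edge weight = number of shared destinations.
P = bipartite.weighted_projected_graph(B, hosts)
affinity = nx.to_numpy_array(P, nodelist=hosts, weight="weight")

labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(affinity)
print(dict(zip(hosts, labels)))  # hosts sharing destinations fall into one cluster
```

The top destinations of each resulting cluster are then the candidate URLs that the WebDriver-based checker would probe to decide whether they are web proxies.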
Procedia PDF Downloads 243
2129 Durian Marker Kit for Durian (Durio zibethinus Murr.) Identity
Authors: Emma K. Sales
Abstract:
Durian is the flagship fruit of Mindanao, and there is an abundance of cultivars with many confusing identities/names. The project was conducted to develop a procedure for the reliable and rapid detection and sorting of durian planting materials. Moreover, it also aimed to establish specific genetic or DNA markers for routine testing and authentication of the durian cultivars in question. The project developed molecular procedures for routine testing. SSR primers were also screened and identified for their utility in discriminating the durian cultivars collected. Results of the study showed the following accomplishments: 1. twenty-nine (29) SSR primers were selected and identified based on their ability to discriminate durian cultivars; 2. a standard procedure for the identification and authentication of durian cultivars was optimized and established; 3. a genetic profile of durian is now available at the Biotech Unit. Our results demonstrate the relevance of using molecular techniques in evaluating and identifying durian clones. The most polymorphic primers tested in this study could be useful tools for detecting variation even at an early stage of the plant, especially for commercial purposes. The process developed combines the efficiency of the microsatellite development process with an optimized non-radioactive detection process, resulting in a user-friendly protocol that can be performed in two (2) weeks and easily incorporated into laboratories about to start microsatellite development projects. This can be of great importance for extending microsatellite analyses to other crop species for which minimal genetic information is currently available. With this, the University can now serve as a service laboratory for routine testing and authentication of durian clones.
Keywords: DNA, SSR analysis, genotype, genetic diversity, cultivars
Procedia PDF Downloads 4522128 Clinical Impact of Ultra-Deep Versus Sanger Sequencing Detection of Minority Mutations on the HIV-1 Drug Resistance Genotype Interpretations after Virological Failure
Authors: S. Mohamed, D. Gonzalez, C. Sayada, P. Halfon
Abstract:
Drug resistance mutations are routinely detected using standard Sanger sequencing, which does not detect minor variants with a frequency below 20%. The impact of minor variants detected by ultra-deep sequencing (UDS) on HIV drug-resistance (DR) interpretations has not yet been studied. Fifty HIV-1 patients who experienced virological failure were included in this retrospective study. The HIV-1 UDS protocol allowed the detection and quantification of HIV-1 protease and reverse transcriptase variants related to genotypes A, B, C, E, F, and G. The DeepChek®-HIV simplified DR interpretation software was used to compare Sanger sequencing and UDS. The total time required for the UDS protocol was approximately three times longer than that for Sanger sequencing, with equivalent reagent costs. UDS detected all of the mutations found by population sequencing and identified additional resistance variants in all patients. An analysis of DR revealed a total of 643 and 224 clinically relevant mutations by UDS and Sanger sequencing, respectively. Three resistance mutations with a prevalence above 20% were detected solely by UDS: A98S (23%), E138A (21%), and V179I (25%). A significant difference in the DR interpretations for 19 antiretroviral drugs was observed between the UDS and Sanger sequencing methods. Y181C and T215Y were the mutations most frequently associated with interpretation differences. Combining UDS with the DeepChek® software for the interpretation of DR results would help clinicians provide suitable treatments. A cut-off of 1% allowed a better characterisation of the viral population by identifying additional resistance mutations and improving the DR interpretation.Keywords: HIV-1, ultra-deep sequencing, Sanger sequencing, drug resistance
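The sketch below illustrates, with hypothetical variant frequencies, how a 1% UDS cut-off recovers minor resistance variants that a 20% Sanger-like threshold misses; the mutation list and frequencies are placeholders, not the study's data.

```python
# Hypothetical UDS variant calls: (mutation, frequency in the viral population, %).
# The mutations and frequencies are placeholders, not the study's data.
uds_variants = [
    ("M184V", 62.0), ("K103N", 35.0), ("Y181C", 8.5),
    ("T215Y", 3.2), ("V179I", 1.4), ("L90M", 0.6),
]

def detected(variants, cutoff_percent):
    """Mutations whose frequency meets or exceeds the detection cut-off."""
    return {name for name, freq in variants if freq >= cutoff_percent}

sanger_like = detected(uds_variants, 20.0)  # minor variants below 20% are missed
uds_1pct = detected(uds_variants, 1.0)      # UDS with a 1% cut-off

print("Detected at a 20% cut-off (Sanger-like):", sorted(sanger_like))
print("Detected at a 1% cut-off (UDS):         ", sorted(uds_1pct))
print("Minor variants gained by UDS:           ", sorted(uds_1pct - sanger_like))
```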
Procedia PDF Downloads 3332127 A Gradient Orientation Based Efficient Linear Interpolation Method
Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar
Abstract:
This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-resolution video frame or image into a high-resolution one. The objective of a good interpolation method is to upscale an image with strong edge preservation at very low computational cost, so that real-time processing of video frames remains possible. However, low-complexity methods tend to achieve real-time interpolation at the cost of blurring, jagged edges, and other artifacts caused by errors in slope calculation. Non-linear methods, on the other hand, preserve edges better but at the cost of high complexity, which puts them far from real-time operation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions from edges. Simple line averaging is applied to unknown pixels in uniform regions, whereas unknown edge pixels are interpolated after calculating slopes from the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to the interpolated edge regions in order to enhance the interpolated edges.Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing
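A simplified sketch of the pre-processing stages described above follows: Prewitt gradients yield an edge/uniform segmentation and per-pixel gradient orientations, a plain linear 2x upscale stands in for the full directional interpolation, and a bilateral filter is applied only to the interpolated edge regions. The kernel values, edge threshold, and filter parameters are illustrative choices rather than the paper's settings.

```python
import cv2
import numpy as np

def gradient_orientation_upscale(img, edge_thresh=40.0):
    """2x upscaling sketch: Prewitt edge/uniform split, linear fill, edge-only bilateral filter."""
    f = img.astype(np.float32)

    # Prewitt gradients on the low-resolution image.
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], np.float32)
    gx = cv2.filter2D(f, cv2.CV_32F, kx)
    gy = cv2.filter2D(f, cv2.CV_32F, kx.T)
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx)     # slope information a full method would trace
    edge_mask = magnitude > edge_thresh  # separates edge pixels from uniform regions

    # Placeholder for the directional interpolation step: plain linear 2x upscaling.
    h, w = f.shape
    up = cv2.resize(f, (2 * w, 2 * h), interpolation=cv2.INTER_LINEAR)

    # Post-processing: bilateral filter restricted to the upscaled edge regions.
    up_edges = cv2.resize(edge_mask.astype(np.uint8), (2 * w, 2 * h),
                          interpolation=cv2.INTER_NEAREST).astype(bool)
    smoothed = cv2.bilateralFilter(up, d=5, sigmaColor=25, sigmaSpace=5)
    up[up_edges] = smoothed[up_edges]
    return np.clip(up, 0, 255).astype(np.uint8), orientation

# Usage (hypothetical file name):
# hi_res, _ = gradient_orientation_upscale(cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE))
```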
Procedia PDF Downloads 2582126 Anomaly Detection in Financial Markets Using Tucker Decomposition
Authors: Salma Krafessi
Abstract:
Financial markets form a multifaceted, intricate environment that produces enormous volumes of data every day. Accurate anomaly detection in this data is essential for finding investment opportunities, possible fraudulent activity, and market oddities. Conventional anomaly detection methods frequently fail to capture the complex organization of financial data. To improve the identification of abnormalities in financial time series, this study presents Tucker decomposition as a reliable multi-way analysis approach. We start by gathering closing prices of the S&P 500 index across several decades. The data are converted into a three-dimensional tensor that arranges internal characteristics and temporal sequences in a sliding-window structure. The tensor is then factorized by Tucker decomposition into a core tensor and matching factor matrices, capturing latent patterns and relationships in the data. The reconstruction error of the Tucker decomposition serves as a possible indicator of abnormalities: by setting a statistical threshold, we can identify large deviations that signal unusual behavior. A thorough comparison of the Tucker-based method with traditional anomaly detection approaches validates our methodology. The results demonstrate the superiority of Tucker decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for further research into multi-way data analysis across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.Keywords: tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models
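A minimal sketch of the tensor pipeline is shown below on synthetic prices (a geometric random walk stands in for the S&P 500 series): sliding windows of returns and their magnitudes form a three-way tensor, Tucker decomposition (via the tensorly library) yields a low-rank reconstruction, and windows whose reconstruction error exceeds a three-sigma threshold are flagged. The window length, Tucker ranks, feature set, and threshold are illustrative choices.

```python
# Synthetic stand-in for the price series; the shock injected around index 1000
# plays the role of an irregularity. Window length, ranks, and threshold are illustrative.
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)
prices = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 1500)))
prices[1000:1005] *= 1.25              # injected irregularity

window = 20
returns = np.diff(np.log(prices))
n_windows = len(returns) - window + 1

# Three-way tensor: windows x time steps x features (return, absolute return).
tensor = tl.tensor(np.stack([
    np.stack([returns[i:i + window], np.abs(returns[i:i + window])], axis=-1)
    for i in range(n_windows)
]))

# Tucker decomposition, reconstruction, and per-window reconstruction error.
core, factors = tucker(tensor, rank=[8, 5, 2])
recon = tl.tucker_to_tensor((core, factors))
residual = tl.to_numpy(tensor) - tl.to_numpy(recon)
errors = np.linalg.norm(residual.reshape(n_windows, -1), axis=1)

# Flag windows whose error exceeds a 3-sigma threshold.
threshold = errors.mean() + 3.0 * errors.std()
print("Flagged windows:", np.where(errors > threshold)[0])
```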
Procedia PDF Downloads 682125 Comparison of Serological and Molecular Diagnosis of Cerebral Toxoplasmosis in Blood and Cerebrospinal Fluid in HIV Infected Patients
Authors: Berredjem Hajira, Benlaifa Meriem, Becheker Imene, Bardi Rafika, Djebar Med Reda
Abstract:
Recently acquired or reactivated T. gondii infection is a serious complication in HIV patients. Classical serological diagnosis relies on the detection of anti-Toxoplasma immunoglobulins; however, serology may be unreliable in immunodeficient HIV patients who fail to produce significant titers of specific antibodies. PCR assays allow a rapid diagnosis of Toxoplasma infection. In this study, we compared the value of PCR for diagnosing active toxoplasmosis in cerebrospinal fluid and blood samples from HIV patients. Anti-Toxoplasma IgG and IgM antibody titers were determined by ELISA. In parallel, nested PCR targeting the B1 gene and conventional PCR-ELISA targeting the P30 gene were used to detect T. gondii DNA in 25 blood samples and 12 cerebrospinal fluid samples from patients in whom toxoplasmic encephalitis was confirmed by clinical investigations. A total of 15 negative controls were used. Serology did not help confirm toxoplasmic infection, as IgG and IgM titers decreased early. PCR yielded a positive result in only 8 of the 25 blood samples and 5 of the 12 cerebrospinal fluid samples. Five patients with confirmed toxoplasmosis had positive PCR results in either blood or cerebrospinal fluid samples. The conventional nested B1 PCR gave better results than the P30 assay for the detection of T. gondii DNA in both sample types. All samples from control patients were negative. This study demonstrates the lack of usefulness of serological tests and the high sensitivity and specificity of PCR for the diagnosis of toxoplasmic encephalitis in HIV patients.Keywords: cerebrospinal fluid, HIV, Toxoplasmosis, PCR
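For readers unfamiliar with how diagnostic sensitivity and specificity are derived, the small helper below computes both from a two-by-two table of test results against the clinical reference; the counts used in the example are hypothetical and are not the study's figures.

```python
# Sensitivity and specificity from a 2x2 table (test vs. clinical reference).
# The counts below are hypothetical and not taken from the study.
def sensitivity_specificity(tp, fn, tn, fp):
    """tp/fn: confirmed cases testing positive/negative; tn/fp: controls testing negative/positive."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=11, fn=1, tn=15, fp=0)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```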
Procedia PDF Downloads 374