Search results for: edge detection algorithm
Paper Count: 7072

6142 Deep Convolutional Neural Network for Detection of Microaneurysms in Retinal Fundus Images at Early Stage

Authors: Goutam Kumar Ghorai, Sandip Sadhukhan, Arpita Sarkar, Debprasad Sinha, G. Sarkar, Ashis K. Dhara

Abstract:

Diabetes mellitus is one of the most common chronic diseases in all countries, and the number of cases continues to increase significantly. Diabetic retinopathy (DR) is damage to the retina that occurs with long-term diabetes. DR is a major cause of blindness in the Indian population. Therefore, its early diagnosis is of utmost importance for preventing progression toward imminent, irreversible loss of vision, particularly among the huge population across rural India. The barriers to eye examination of all diabetic patients are socioeconomic factors, lack of referrals, poor access to the healthcare system, lack of knowledge, an insufficient number of ophthalmologists, and lack of networking between physicians, diabetologists, and ophthalmologists. Diabetic patients often visit a healthcare facility only for a general checkup, and their eye condition remains largely undetected until the patient becomes symptomatic. This work focuses on the design and development of a fully automated intelligent decision system for screening retinal fundus images to detect the pathophysiology caused by microaneurysms in the early stage of the disease. Automated detection of microaneurysms is a challenging problem due to the variation in color and the variation introduced by the field of view, inhomogeneous illumination, and pathological abnormalities. We have developed a convolutional neural network for efficient detection of microaneurysms. A loss function is also developed to handle the severe class imbalance caused by the very small size of microaneurysms relative to the background. The network is able to locate the salient region containing microaneurysms in noisy images captured by non-mydriatic cameras. The ground truth for microaneurysms was created by expert ophthalmologists for the MESSIDOR database as well as for a private database collected from Indian patients. The network is trained from scratch using the fundus images of the MESSIDOR database. The proposed method is evaluated on DIARETDB1 and the private database. The method successfully detects microaneurysms in both dilated and non-dilated fundus images acquired from different medical centres. The proposed algorithm could be used to develop an affordable and accessible AI-based system providing service at grassroots-level primary healthcare units spread across the country, to cater to the needs of rural people unaware of the severe impact of DR.
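
The loss function itself is not given in the abstract. As an illustration only, here is a minimal sketch of one common way to handle such severe pixel-level imbalance, a positively weighted binary cross-entropy (the weight value and all names are assumptions, not the paper's design):

```python
import numpy as np

def weighted_bce(pred, target, pos_weight=100.0, eps=1e-7):
    """Binary cross-entropy with a heavier weight on the rare
    positive (microaneurysm) pixels; pos_weight is a tunable
    assumption, not a value from the paper."""
    pred = np.clip(pred, eps, 1.0 - eps)
    loss = -(pos_weight * target * np.log(pred)
             + (1.0 - target) * np.log(1.0 - pred))
    return loss.mean()

# Toy example: 1 positive pixel among 10,000 background pixels.
target = np.zeros(10_000); target[0] = 1.0
pred = np.full(10_000, 0.01)  # network barely fires anywhere
print(weighted_bce(pred, target))
```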

Keywords: retinal fundus image, deep convolutional neural network, early detection of microaneurysms, screening of diabetic retinopathy

Procedia PDF Downloads 129
6141 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery

Authors: Evans Belly, Imdad Rizvi, M. M. Kadam

Abstract:

Satellite imagery is an emerging technology that is extensively utilized in various applications such as detection/extraction of man-made structures, monitoring of sensitive areas, creating graphic maps, etc. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building, and non-building regions (roads, vegetation, etc.) are investigated, with the main focus on building extraction. Once all landscape regions are collected, a trimming process is carried out to eliminate regions that arise from non-building objects. Finally, the label method is used to extract the building regions; the label method may be altered for more efficient building extraction. The images used for the analysis are acquired from sensors with a resolution finer than 1 meter (VHR). This method provides an efficient way to produce good results: the additional overhead of intermediate processing is eliminated without compromising output quality, which simplifies the processing steps and reduces the time consumed.
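
The label method is not specified beyond its name. A minimal sketch of connected-component labeling with an area-based trimming step (scipy-based; the area threshold is a hypothetical stand-in for the paper's trimming process):

```python
import numpy as np
from scipy import ndimage

def extract_buildings(mask, min_area=50):
    """Label connected regions of a binary building mask and trim
    small components (a stand-in for the paper's trimming step)."""
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = {i + 1 for i, a in enumerate(areas) if a >= min_area}
    return np.where(np.isin(labels, list(keep)), labels, 0)

# Toy mask: one large block (kept) and one stray pixel (trimmed).
mask = np.zeros((100, 100), dtype=np.uint8)
mask[10:40, 10:40] = 1
mask[80, 80] = 1
print(np.unique(extract_buildings(mask)))  # [0 1]
```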

Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery

Procedia PDF Downloads 304
6140 Graph Codes - 2D Projections of Multimedia Feature Graphs for Fast and Effective Retrieval

Authors: Stefan Wagenpfeil, Felix Engel, Paul McKevitt, Matthias Hemmje

Abstract:

Multimedia indexing and retrieval is generally designed and implemented by employing feature graphs. These graphs typically contain a significant number of nodes and edges to reflect the level of detail in feature detection. A higher level of detail increases the effectiveness of the results but also leads to more complex graph structures. However, graph-traversal-based algorithms for similarity are quite inefficient and computationally intensive, especially for large data structures. To deliver fast and effective retrieval, an efficient similarity algorithm, particularly for large graphs, is mandatory. Hence, in this paper, we define a graph projection into a 2D space (Graph Code) as well as the corresponding algorithms for indexing and retrieval. We show that calculations in this space can be performed more efficiently than graph traversals due to a simpler processing model and a high level of parallelization. As a consequence, we prove that the effectiveness of retrieval also increases substantially, as Graph Codes facilitate more levels of detail in feature fusion. Thus, Graph Codes provide a significant increase in efficiency and effectiveness (especially for multimedia indexing and retrieval) and can be applied to images, videos, audio, and text information.
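
The exact Graph Code encoding is not given in the abstract. A minimal sketch of the general idea (projecting a feature graph onto a 2D matrix over a fixed feature vocabulary, then comparing matrices element-wise instead of traversing graphs), with a toy vocabulary and assumed names:

```python
import numpy as np

VOCAB = ["person", "car", "tree", "road"]  # hypothetical feature vocabulary
IDX = {t: i for i, t in enumerate(VOCAB)}

def graph_code(nodes, edges):
    """Project a feature graph onto a 2D matrix: diagonal entries mark
    detected features, off-diagonal entries mark relationships."""
    m = np.zeros((len(VOCAB), len(VOCAB)), dtype=np.uint8)
    for n in nodes:
        m[IDX[n], IDX[n]] = 1
    for a, b in edges:
        m[IDX[a], IDX[b]] = 1
    return m

def similarity(m1, m2):
    """Element-wise overlap: no graph traversal, trivially parallel."""
    return (m1 & m2).sum() / max((m1 | m2).sum(), 1)

g1 = graph_code(["person", "car"], [("person", "car")])
g2 = graph_code(["person", "road"], [("person", "road")])
print(similarity(g1, g2))  # 0.2: one shared entry out of five
```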

Keywords: indexing, retrieval, multimedia, graph algorithm, graph code

Procedia PDF Downloads 146
6139 One-Dimension Model for Positive Displacement Pump with Cavitation Algorithm

Authors: Francesco Rizzuto, Matthew Stickland, Stephan Hannot

Abstract:

The simulation of a positive displacement pump system with commercial software for Computational Fluid Dynamics (CFD) results in an enormous computational effort due to the complexity of the pump system. This drawback restricts its use to a specific part of the pump in any one simulation. This research focuses on developing an algorithm that provides results in suitable agreement with experimental data without that computational effort. The compressible equations are solved with an explicit algorithm. A comparison is presented between the finite volume (FV) method, with the Monotonic Upwind Scheme for Conservation Laws (MUSCL) and a slope limiter, and experimental results. The source terms for cavitation and friction are introduced into the algorithm with a splitting strategy and solved with a 4th-order Runge-Kutta scheme (RK4). Different pumps are modeled and analyzed to evaluate the flexibility of the code. The simulation required minimal computation time and resources without compromising the accuracy of the simulation results. Therefore, this algorithm highlights the feasibility of pressure pulsation simulation as a design tool for industrial purposes.
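
The numerical ingredients are named but not written out. A minimal sketch of two of them, a minmod-limited MUSCL reconstruction and a classical RK4 step applied to a toy relaxation source term standing in for cavitation (all values assumed):

```python
import numpy as np

def minmod(a, b):
    """Minmod slope limiter used in MUSCL reconstruction."""
    return np.where(a * b <= 0.0, 0.0,
                    np.where(np.abs(a) < np.abs(b), a, b))

def muscl_faces(u):
    """Limited left/right states at the faces of interior cells."""
    du = minmod(u[1:-1] - u[:-2], u[2:] - u[1:-1])  # limited slopes
    left = u[1:-1] + 0.5 * du    # state at the right face of each cell
    right = u[1:-1] - 0.5 * du   # state at the left face of each cell
    return left, right

def rk4_step(f, y, t, dt):
    """Classical 4th-order Runge-Kutta step for the split source term."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

u = np.array([0.0, 0.0, 1.0, 2.0, 2.0])
print(muscl_faces(u))
# Toy source: relax pressure toward vapor pressure (stand-in for cavitation).
p_vap = 2.3e3
print(rk4_step(lambda t, p: -(p - p_vap), 1.0e5, 0.0, 1e-3))
```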

Keywords: cavitation, diaphragm, DVCM, finite volume, MUSCL, positive displacement pump

Procedia PDF Downloads 144
6138 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm

Authors: Ping Bo, Meng Yunshan

Abstract:

Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, a disadvantage of SST data is the high percentage of missing values, mainly caused by cloud coverage. The Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing missing data and has been widely used in the oceanographic field. Reconstruction of SST images within a long time series using DINEOF can produce large discontinuities, and one solution to this problem is to filter the temporal covariance matrix to reduce the spurious variability. Building on previous research, an algorithm is presented in this paper to improve the temporal correlations in the EOF expansion. As in previous work, a filter, such as a Laplacian filter, is applied to the temporal covariance matrix, but the presented algorithm also accounts for the temporal relationship between the two consecutive images used in the filter: for example, two images from the same season are more likely to be correlated than two from different seasons, so the latter pair is weighted less in the filter. The presented approach is tested on the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning 1989 to 2006. The results obtained from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking the temporal relationship into account.
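
A minimal sketch of the weighting idea, assuming a cosine-shaped seasonal weight and a single Laplacian-style smoothing pass (both are assumptions; the paper's actual filter and weights are not given in the abstract):

```python
import numpy as np

def seasonal_weight(m1, m2):
    """Weight two time steps by their closeness in the seasonal
    cycle (months 0-11): same season ~1, opposite season ~0."""
    d = abs(m1 - m2) % 12
    return np.cos(np.pi * min(d, 12 - d) / 12.0) ** 2

def filter_cov(cov, months, alpha=0.1):
    """One Laplacian-style smoothing pass over the temporal
    covariance matrix, down-weighting cross-season neighbors."""
    out = cov.copy()
    n = cov.shape[0]
    for i in range(1, n - 1):
        wl = seasonal_weight(months[i], months[i - 1])
        wr = seasonal_weight(months[i], months[i + 1])
        out[i] = cov[i] + alpha * (wl * (cov[i - 1] - cov[i])
                                   + wr * (cov[i + 1] - cov[i]))
    return out

months = np.arange(36) % 12           # three years of monthly SST
cov = np.eye(36) + 0.1                # toy temporal covariance
print(filter_cov(cov, months).shape)
```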

Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter

Procedia PDF Downloads 319
6137 Securing Mobile Ad-Hoc Network Utilizing OPNET Simulator

Authors: Tariq A. El Shheibia, Halima Mohamed Belhamad

Abstract:

This paper considers securing data based on the multi-path protocol (SDMP) in a mobile ad hoc network, using the OPNET simulator (Modeler 14.5) and the AODV routing protocol at the network layer as the underlying multi-path algorithm for message security in MANETs. The main idea of this work is to present a way to detect the attacker inside the MANET. Detection of this attacker is performed by adding some effective parameters to the network.

Keywords: MANET, AODV, malicious node, OPNET

Procedia PDF Downloads 279
6136 Adaptive Dehazing Using Fusion Strategy

Authors: M. Ramesh Kanthan, S. Naga Nandini Sujatha

Abstract:

The goal of haze removal algorithms is to enhance and recover the details of a scene from a foggy image. For enhancement, the proposed method focuses on two main components: (i) image enhancement based on adaptive contrast histogram equalization, and (ii) edge strengthening based on a gradient model. Accurate haze removal algorithms are needed in many circumstances. The de-fog feature works through a complex algorithm which first determines the fog density of the scene and then analyses the obscured image before applying contrast and sharpness adjustments in real time. The fusion strategy is driven by the intrinsic properties of the original image and is highly dependent on the choice of the inputs and the weights. The haze-free output image is then reconstructed using the fusion methodology. To increase accuracy, an interpolation method is used in the output reconstruction. A promising retrieval performance is achieved, especially in particular examples.
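
A minimal sketch of the per-pixel fusion step, assuming two derived inputs (contrast-enhanced and edge-strengthened) and gradient-magnitude weight maps; the actual inputs and weights in the paper may differ:

```python
import numpy as np

def fuse(inputs, weights, eps=1e-6):
    """Per-pixel fusion: normalize the weight maps so they sum to 1
    at every pixel, then blend the derived inputs."""
    w = np.stack(weights).astype(float)
    w /= w.sum(axis=0) + eps
    return (np.stack(inputs) * w).sum(axis=0)

rng = np.random.default_rng(0)
hazy = rng.random((64, 64))
# Two toy derived inputs: contrast-stretched and edge-strengthened.
contrast_in = np.clip((hazy - hazy.mean()) * 1.5 + hazy.mean(), 0, 1)
gy, gx = np.gradient(hazy)
edge_in = np.clip(hazy + 0.5 * np.hypot(gx, gy), 0, 1)
# Weight each input by its local gradient magnitude (one common choice).
weights = [np.hypot(*np.gradient(im)) for im in (contrast_in, edge_in)]
dehazed = fuse([contrast_in, edge_in], weights)
print(dehazed.shape, dehazed.min() >= 0)
```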

Keywords: single image, fusion, dehazing, multi-scale fusion, per-pixel, weight map

Procedia PDF Downloads 458
6135 Performance Assessment of Carbon Nano Tube Based Cutting Fluid in Machining Process

Authors: Alluru Gopala Krishna, Thella Babu Rao

Abstract:

In machining, there is always a problem with the heat and friction generated during the process, as they consequently affect tool wear and surface finish. An instant heat transfer mechanism could protect the cutting tool edge and enhance the tool life by cooling the cutting edge of the tool. In the present work, a carbon nanotube (CNT) based nano-cutting fluid is proposed for machining a hard-to-cut material. Tool wear and surface roughness are considered for the evaluation of the nano-cutting fluid in the turning process. The performance of the nanocoolant is assessed against conventional coolant and dry machining conditions, and it is observed that the proposed nanocoolant produces better performance than the conventional coolant.

Keywords: CNT based nano cutting fluid, tool wear, turning, surface roughness

Procedia PDF Downloads 255
6134 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is the key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on system performance, task scheduling is often chosen for assigning requests to resources in an efficient way based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed scheduling approach prioritizes the tasks of cloud applications according to limits set by six sigma control charts based on dynamic threshold values. Further, the proposed algorithm has been validated through the CloudSim toolkit. The experimental results demonstrate that the proposed algorithm is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
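
A minimal sketch of prioritization by instruction length against control-chart limits (the mean-plus/minus-sigma bands and their mapping to priorities are assumptions, not the paper's exact limits):

```python
import numpy as np

def assign_priorities(lengths):
    """Split tasks into priority bands using control-chart limits
    derived from the current batch (dynamic thresholds)."""
    mu, sigma = np.mean(lengths), np.std(lengths)
    priorities = []
    for ml in lengths:
        if ml > mu + sigma:        # long tasks: assumed high priority
            priorities.append("high")
        elif ml < mu - sigma:      # short tasks: assumed low priority
            priorities.append("low")
        else:
            priorities.append("medium")
    return priorities

lengths = np.array([12000, 800, 5000, 300, 25000, 4500])  # instructions
print(list(zip(lengths, assign_priorities(lengths))))
```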

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 507
6133 An Investigation into Fraud Detection in Financial Reporting Using Sugeno Fuzzy Classification

Authors: Mohammad Sarchami, Mohsen Zeinalkhani

Abstract:

Financial reporting systems have always faced problems in winning the public's trust. The increase in the number of cases of fraud and misrepresentation, often combined with the bankruptcy of large companies, has raised concerns about the quality of financial statements. Consequently, investors, legislators, managers, and auditors have focused on detecting or preventing significant fraud in financial statements. This article investigates Sugeno fuzzy classification for fraud detection in the financial reporting of firms listed on the Tehran Stock Exchange. The hypothesis is that Sugeno fuzzy classification can detect fraud in financial reporting from financial ratios. The hypothesis was tested using MATLAB. The average accuracy of the Sugeno fuzzy classification was 81.80%, so the hypothesis was confirmed.

Keywords: fraud, financial reporting, Sugeno fuzzy classification, firm

Procedia PDF Downloads 237
6132 Difference Expansion Based Reversible Data Hiding Scheme Using Edge Directions

Authors: Toshanlal Meenpal, Ankita Meenpal

Abstract:

Difference expansion is a very important technique in the reversible data hiding field. Owing to the reversibility feature, both the secret message and the cover image can be completely recovered without any distortion after the data extraction process. In general, in any difference expansion scheme, embedding is performed via an integer transform on the difference image acquired by grouping pairs of neighboring pixel values. This paper proposes an improved reversible difference expansion embedding scheme. We mainly consider edge direction for embedding, by modifying the difference of two neighboring pixel values. In general, a larger difference tends to degrade the stego image quality more than a smaller one. The experimental results show that the proposed scheme achieves an average image-quality gain in the range of 0.5 to 3.7 dB. Payload-wise, it achieves almost the same capacity as the previous method.
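
The integer transform underlying difference expansion can be shown concretely. The sketch below is the standard pairwise transform (Tian-style), not the paper's edge-direction variant:

```python
def de_embed(x, y, bit):
    """Embed one bit by expanding the difference of a pixel pair."""
    l = (x + y) // 2          # integer average (preserved)
    h = x - y                 # difference
    h2 = 2 * h + bit          # expanded difference carries the bit
    return l + (h2 + 1) // 2, l - h2 // 2

def de_extract(x2, y2):
    """Recover the bit and the original pair exactly (reversible)."""
    l = (x2 + y2) // 2
    h2 = x2 - y2
    bit, h = h2 & 1, h2 // 2
    return (l + (h + 1) // 2, l - h // 2), bit

pair = (102, 100)
stego = de_embed(*pair, bit=1)
print(stego, de_extract(*stego))  # (104, 99) ((102, 100), 1)
```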

Keywords: information hiding, edge direction, difference expansion, integer transform

Procedia PDF Downloads 472
6131 A Coordinate-Based Heuristic Route Search Algorithm for Delivery Truck Routing Problem

Authors: Ahmed Tarek, Ahmed Alveed

Abstract:

The vehicle routing problem is a well-known research avenue in computing. Modern vehicle routing is more focused on GPS-based coordinate systems, as state-of-the-art vehicle and trucking systems are equipped with digital navigation. In this paper, a new two-dimensional coordinate-based algorithm for addressing the vehicle routing problem for a supply chain network is proposed and explored, and the algorithm is compared with other available and recently devised heuristics. For the algorithms discussed, including the proposed coordinate-based search heuristic, the associated advantages and disadvantages are explored. The proposed algorithm is studied from the standpoint of a small supermarket chain delivery network that supplies its stores in four different states around the East Coast area and is trying to optimize its trucking delivery cost. Minimizing the delivery cost for the supply network of a supermarket chain is important to ensure its business success.
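
The proposed heuristic is not detailed in the abstract. For orientation, the kind of simple coordinate-driven baseline such algorithms are compared against, a greedy nearest-neighbor tour over XY coordinates, looks like:

```python
import math

def nearest_neighbor_route(depot, stores):
    """Greedy tour: always drive to the closest unvisited store.
    A baseline heuristic, not the paper's algorithm."""
    route, pos, todo = [depot], depot, set(stores)
    while todo:
        nxt = min(todo, key=lambda s: math.dist(pos, s))
        route.append(nxt)
        todo.discard(nxt)
        pos = nxt
    route.append(depot)  # return to the depot
    return route

stores = [(2.0, 3.0), (5.0, 1.0), (1.0, 7.0), (6.0, 6.0)]
print(nearest_neighbor_route((0.0, 0.0), stores))
```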

Keywords: coordinate-based optimal routing, Hamiltonian Circuit, heuristic algorithm, traveling salesman problem, vehicle routing problem

Procedia PDF Downloads 138
6130 One Pot Synthesis of Cu–Ni–S/Ni Foam for the Simultaneous Removal and Detection of Norfloxacin

Authors: Xincheng Jiang, Yanyan An, Yaoyao Huang, Wei Ding, Manli Sun, Hong Li, Huaili Zheng

Abstract:

The residual antibiotics in the environment pose a threat to the environment and human health. Thus, efficient removal and rapid detection of norfloxacin (NOR) in wastewater are very important. The main sources of NOR pollution are the agricultural and pharmaceutical industries and hospital wastewater. The total consumption of NOR in China reaches 5440 tons per year. It is found that neither animals nor humans can totally absorb and metabolize NOR, resulting in the excretion of NOR into the environment; consequently, residual NOR has been detected in water bodies. The hazards of NOR in wastewater lie in three aspects: (1) the removal capacity of wastewater treatment plants for NOR is limited (the average removal efficiency of NOR in wastewater treatment plants is reported to be only 68%); (2) NOR entering the environment leads to the emergence of drug-resistant strains; (3) NOR is toxic to many aquatic species. At present, the removal and detection technologies for NOR are applied separately, which leads to a cumbersome operation process. The development of simultaneous adsorption-flocculation removal and FTIR detection of pollutants has three advantages: (1) adsorption-flocculation promotes the detection technology (the enrichment effect on the material surface improves the detection ability); (2) the integration of adsorption-flocculation and detection technology reduces the material cost and makes the operation easier; (3) FTIR detection endows the water treatment agent with the ability of molecular recognition and semi-quantitative detection of pollutants. Thus, it is of great significance to develop a smart water treatment material with high removal capacity and detection ability for pollutants. This study explored the feasibility of combining a NOR removal method with a semi-quantitative detection method. A magnetic Cu-Ni-S/Ni foam was synthesized by in-situ loading of Cu-Ni-S nanostructures on the surface of Ni foam. The novelty of this material is the combination of adsorption-flocculation technology and semi-quantitative detection technology. Batch experiments showed that Cu-Ni-S/Ni foam has a high removal rate of NOR (96.92%), wide pH adaptability (pH = 4.0-10.0), and strong resistance to ion interference (0.1-100 mmol/L). According to the Langmuir fitting model, the removal capacity can reach 417.4 mg/g at 25 °C, which is much higher than that of other water treatment agents reported in most studies. Characterization analysis indicated that the main removal mechanisms are surface complexation, cation bridging, electrostatic attraction, precipitation, and flocculation. Transmission FTIR detection experiments showed that NOR on Cu-Ni-S/Ni foam has easily recognizable FTIR fingerprints, and the intensity of the characteristic peaks roughly reflects the concentration. This semi-quantitative detection method has a wide linear range (5-100 mg/L) and a low limit of detection (4.6 mg/L). These results show that Cu-Ni-S/Ni foam has excellent removal performance and a semi-quantitative detection ability for NOR molecules. This paper provides a new idea for designing and preparing multi-functional water treatment materials to achieve simultaneous removal and semi-quantitative detection of organic pollutants in water.

Keywords: adsorption-flocculation, antibiotics detection, Cu-Ni-S/Ni foam, norfloxacin

Procedia PDF Downloads 67
6129 The Qualitative and Quantitative Detection of Pistachio in Processed Food Products Using Florescence Dye Based PCR

Authors: Ergün Şakalar, Şeyma Özçirak Ergün

Abstract:

Pistachio nuts, the fruits of the pistachio tree (Pistacia vera), are edible tree nuts highly valued for their organoleptic properties. Pistachio nuts are used as ingredients in snack foods, chocolates, baklava, meat products, ice cream, and other gourmet products. Undeclared pistachios may be present in food products as a consequence of fraudulent substitution. Control of food samples is therefore very important for safety and fraud prevention. Mixtures of pistachio, peanut (Arachis hypogaea), and pea (Pisum sativum L.) are used instead of pure pistachio in food products, because pistachio is a considerably expensive nut. To address this problem, a sensitive polymerase chain reaction (PCR) assay has been developed. A real-time PCR assay for the detection of pea, peanut, and pistachio in baklava was designed using the EvaGreen fluorescence dye. Primers were selected from regions well suited to the identification of pea, peanut, and pistachio. DNA from reference samples and industrial products was successfully extracted with the GIDAGEN® Multi-Fast DNA Isolation Kit. Genomes were identified based on their specific melting peaks (Mp), which are 77°C, 85.5°C, and 82.5°C for pea, peanut, and pistachio, respectively. Homogenized mixtures of raw pistachio, pea, and peanut were prepared with pistachio ratios of 0.01%, 0.1%, 1%, 10%, 40%, and 70%. The quantitative detection limit of the assay was 0.1% for pistachio. The real-time PCR technique used in this study also allowed the qualitative detection of as little as 0.001% peanut DNA, 0.000001% pistachio DNA, and 0.000001% pea DNA in the experimental admixtures. This assay represents a potentially valuable diagnostic method for the detection of nut species adulterated with pistachio, as well as for highly specific and relatively rapid detection of small amounts of pistachio in food samples.

Keywords: pea, peanut, pistachio, real-time PCR

Procedia PDF Downloads 257
6128 Performance Analysis of New Types of Reference Targets Based on Spaceborne and Airborne SAR Data

Authors: Y. S. Zhou, C. R. Li, L. L. Tang, C. X. Gao, D. J. Wang, Y. Y. Guo

Abstract:

The triangular trihedral corner reflector (CR) has been widely used as a point target for synthetic aperture radar (SAR) calibration and image quality assessment. The additional “tip” of the triangular plate does not contribute to the reflector’s theoretical RCS, and if it interacts with a perfectly reflecting ground plane, it will yield an increase of RCS at the radar bore-sight and decrease the accuracy of SAR calibration and image quality assessment. To address this problem, two types of CRs were manufactured. One was the hexagonal trihedral CR, a self-illuminating CR with relatively small plate edge length, whereas large edge length usually introduces unexpected edge diffraction error. The other was the triangular trihedral CR with an extended bottom plate, which incorporates the effect of the ‘tip’ into the total RCS. In order to assess the performance of the two new types of CRs, a flight campaign over the National Calibration and Validation Site for High Resolution Remote Sensors was carried out. Six hexagonal trihedral CRs and two bottom-extended trihedral CRs, as well as several traditional triangular trihedral CRs, were deployed. A KOMPSAT-5 X-band SAR image was acquired for the performance analysis of the hexagonal trihedral CRs, and C-band airborne SAR images were acquired for the performance analysis of the bottom-extended trihedral CRs. The analysis results showed that the impulse response functions of both the hexagonal trihedral CRs and the bottom-extended trihedral CRs were much closer to the ideal sinc function than those of the traditional triangular trihedral CRs. The flight campaign results validated the advantages of the new types of CRs, and they might be useful in future SAR calibration missions.

Keywords: synthetic aperture radar, calibration, corner reflector, KOMPSAT-5

Procedia PDF Downloads 262
6127 A Weighted K-Medoids Clustering Algorithm for Effective Stability in Vehicular Ad Hoc Networks

Authors: Rejab Hajlaoui, Tarek Moulahi, Hervé Guyennet

Abstract:

In a highway scenario, vehicle speeds can exceed 120 km/h. Therefore, any vehicle can enter or leave the network within a very short time. This mobility adversely affects network connectivity and decreases the lifetime of all established links. To ensure effective stability in vehicular ad hoc networks with minimal broadcast storms, we have developed a weighted algorithm based on the k-medoids clustering algorithm (WKCA). Indeed, the number of clusters and the initial cluster heads are not selected randomly as usual, but by considering the available transmission range and the environment size. Then, to ensure optimal assignment of nodes to clusters in both k-medoids phases, the combined weight of any node is computed according to additional metrics including direction, relative speed, and proximity. Empirical results prove that, in addition to the convergence speed that characterizes the k-medoids algorithm, our proposed model outperforms both the AODV-Clustering and OLSR-Clustering protocols under different densities and velocities in terms of end-to-end delay, packet delivery ratio, and throughput.
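
A minimal sketch of the weighted assignment phase, assuming a linear combination of proximity, relative speed, and heading difference (the coefficients and the exact combination rule are assumptions):

```python
import numpy as np

def combined_cost(node, head, a=1.0, b=0.5, c=0.5):
    """Cost of assigning a vehicle to a cluster head, mixing
    proximity, relative speed, and heading difference."""
    d = np.linalg.norm(node["pos"] - head["pos"])
    dv = abs(node["speed"] - head["speed"])
    dth = abs(node["heading"] - head["heading"]) % 360
    dth = min(dth, 360 - dth)
    return a * d + b * dv + c * dth

def assign(nodes, heads):
    """k-medoids assignment phase using the combined weight."""
    return [min(range(len(heads)),
                key=lambda k: combined_cost(n, heads[k])) for n in nodes]

mk = lambda p, v, h: {"pos": np.array(p, float), "speed": v, "heading": h}
heads = [mk((0, 0), 100, 0), mk((500, 0), 80, 180)]
nodes = [mk((50, 10), 95, 5), mk((450, 0), 82, 175)]
print(assign(nodes, heads))  # [0, 1]
```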

Keywords: communication, clustering algorithm, k-medoids, sensor, vehicular ad hoc network

Procedia PDF Downloads 227
6126 Sweepline Algorithm for Voronoi Diagram of Polygonal Sites

Authors: Dmitry A. Koptelov, Leonid M. Mestetskiy

Abstract:

The Voronoi diagram (VD) of a finite set of disjoint simple polygons, called sites, is a partition of the plane into loci (one locus per site) – regions consisting of the points that are closer to a given site than to all others. A set of polygons is a universal model for many applications in engineering, geoinformatics, design, computer vision, and graphics. Construction of the VD of polygons is usually done by reduction to the task of constructing the VD of segments, for which there are efficient O(n log n) algorithms for n segments. The reduction also includes preprocessing – constructing segments from the polygons' sides – and postprocessing – constructing each polygon's locus by merging the loci of its sides. This approach does not take into account two specific properties of the resulting segment sites. Firstly, all these segments are connected in pairs at the vertices of the polygons. Secondly, on one side of each segment lies the interior of the polygon, and the polygon is obviously included in its locus. Using these properties in the VD construction algorithm is a resource for reducing computation. This article proposes an algorithm for the direct construction of the VD of polygonal sites. The algorithm is based on the sweepline paradigm, which allows these properties to be taken into account effectively. The solution is again performed via a reduction. Preprocessing constructs a set of sites from the vertices and edges of the polygons; each site is given an orientation such that the interior of the polygon lies to its left. The proposed algorithm then constructs the VD for the set of oriented sites with the sweepline paradigm. Postprocessing selects the edges of this VD formed by the centers of empty circles touching different polygons. The improved efficiency of the proposed sweepline algorithm in comparison with the general Fortune algorithm is achieved through the following fundamental solutions: 1. The algorithm constructs only those VD edges which lie outside the polygons; the concept of oriented sites makes it possible to avoid constructing VD edges located inside the polygons. 2. The event list in the sweepline algorithm has a special property: the majority of events are connected with “medium” polygon vertices, where one incident polygon side lies behind the sweepline and the other in front of it. The proposed algorithm processes such events in constant time rather than in logarithmic time, as in the general Fortune algorithm. The proposed algorithm is fully implemented and tested on a large number of examples. The high reliability and efficiency of the algorithm are also confirmed by computational experiments with complex sets of several thousand polygons. It should be noted that, despite the considerable time that has passed since the publication of Fortune's algorithm in 1986, a full-scale implementation of that algorithm for an arbitrary set of segment sites has not been made. The proposed algorithm fills this gap for an important special case – a set of sites formed by polygons.

Keywords: voronoi diagram, sweepline, polygon sites, Fortune's algorithm, segment sites

Procedia PDF Downloads 168
6125 Chemiluminescent Detection of Microorganisms in Food/Drug Product Using Reducing Agents and Gold Nanoplates

Authors: Minh-Phuong Ngoc Bui, Abdennour Abbas

Abstract:

Microbial spoilage of food/drugs has been a constant nuisance and an unavoidable problem throughout history, affecting food/drug quality and safety in a variety of ways. A simple and rapid test for fungi and bacteria in food/drugs and environmental clinical samples is essential for proper management of contamination. A number of different techniques have been developed for the detection and enumeration of foodborne microorganisms, including plate counting, enzyme-linked immunosorbent assay (ELISA), polymerase chain reaction (PCR), nucleic acid sensors, and electrical and microscopy methods. However, the significant drawbacks of these techniques are the high demand for operator skill and the time and cost involved. In this report, we introduce a rapid method for the detection of bacteria and fungi in food/drug products using a specific interaction between a reducing agent (tris(2-carboxyethyl)phosphine, TCEP) and the microbial surface proteins. The chemical reaction was transferred to a transduction system using gold nanoplate-enhanced chemiluminescence. We have optimized our nanoplate synthesis conditions, characterized the chemiluminescence parameters, and optimized conditions for the microbial assay. The new detection method was applied for rapid detection of bacteria (E. coli sp. and Lactobacillus sp.) and fungi (Mucor sp.), with a limit of detection as low as single-digit cells per mL within 10 min using a portable luminometer. We expect our simple and rapid detection method to be a powerful alternative to the conventional plate counting and immunoassay methods for rapid screening of microorganisms in food/drug products.

Keywords: microorganism testing, gold nanoplates, chemiluminescence, reducing agents, luminol

Procedia PDF Downloads 288
6124 Sensor and Actuator Fault Detection in Connected Vehicles under a Packet Dropping Network

Authors: Z. Abdollahi Biron, P. Pisu

Abstract:

Connected vehicles are one of the promising technologies for future Intelligent Transportation Systems (ITS). A connected vehicle system is essentially a set of vehicles communicating through a network to exchange their information with each other and the infrastructure. Although this interconnection of the vehicles can be potentially beneficial in creating an efficient, sustainable, and green transportation system, a set of safety and reliability challenges come with this technology. The first challenge arises from information loss due to an unreliable communication network, which affects the control/management system of the individual vehicles and the overall system. Such a scenario may lead to degraded or even unsafe operation, which could be potentially catastrophic. Secondly, faulty sensors and actuators can affect the individual vehicle's safe operation and in turn create a potentially unsafe node in the vehicular network. Further, sending that faulty sensor information to other vehicles, as well as failure in actuators, may significantly affect the safe operation of the overall vehicular network. Therefore, it is of utmost importance to take these issues into consideration while designing the control/management algorithms of the individual vehicles as part of a connected vehicle system. In this paper, we consider a connected vehicle system under Co-operative Adaptive Cruise Control (CACC) and propose a fault diagnosis scheme that deals with these aforementioned challenges. Specifically, the conventional CACC algorithm is modified by adding a Kalman filter-based estimation algorithm to suppress the effect of lost information under an unreliable network. Further, a sliding mode observer-based algorithm is used to improve sensor reliability under faults. The effectiveness of the overall diagnostic scheme is verified via simulation studies.
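
A minimal sketch of the packet-drop handling idea, using a scalar Kalman filter that runs its prediction step every cycle but applies the measurement update only when a packet arrives (models and noise values are toy assumptions, not the paper's design):

```python
import numpy as np

def kalman_with_drops(z_seq, received, q=0.01, r=0.5):
    """Scalar Kalman filter: predict every step, update only when
    the packet actually arrived."""
    x, p = 0.0, 1.0            # state estimate and its variance
    out = []
    for z, ok in zip(z_seq, received):
        p += q                 # predict (constant-speed model)
        if ok:                 # measurement update only on arrival
            k = p / (p + r)
            x += k * (z - x)
            p *= (1 - k)
        out.append(x)
    return out

rng = np.random.default_rng(1)
true_speed = 25.0
z = true_speed + rng.normal(0, 0.7, 50)        # noisy V2V speed packets
ok = rng.random(50) > 0.3                      # ~30% packet drop
print(round(kalman_with_drops(z, ok)[-1], 2))  # close to 25.0
```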

Keywords: fault diagnostics, communication network, connected vehicles, packet drop out, platoon

Procedia PDF Downloads 228
6123 Chinese Event Detection Technique Based on Dependency Parsing and Rule Matching

Authors: Weitao Lin

Abstract:

To quickly extract adequate information from large-scale unstructured text data, this paper studies the representation of events in Chinese scenarios and performs a regularized abstraction. It proposes a Chinese event detection technique based on dependency parsing and rule matching. The method first performs dependency parsing on the original utterance, then performs pattern matching at the word or phrase granularity based on the results of the dependency syntactic analysis, filters out utterances with prominent non-event characteristics, and obtains the final results. The experimental results show the effectiveness of the method.
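
A minimal sketch of rule matching over dependency triples; the parse output and the trigger rule below are hand-made stand-ins, since the paper's rule set is not given:

```python
# Each parsed sentence is a list of (head, relation, dependent) triples,
# as a dependency parser would produce; these examples are hand-made.
parse = [("宣布", "nsubj", "公司"),   # "company announces ..."
         ("宣布", "dobj", "收购"),    # "... an acquisition"
         ("收购", "nmod", "股份")]

# A rule fires when an event trigger word governs both a subject and
# an object (a simplified stand-in for the paper's rules).
TRIGGERS = {"宣布", "收购", "发布"}

def detect_event(triples):
    """Return trigger words that have both nsubj and dobj children."""
    rels = {}
    for head, rel, dep in triples:
        rels.setdefault(head, set()).add(rel)
    return [h for h, rs in rels.items()
            if h in TRIGGERS and {"nsubj", "dobj"} <= rs]

print(detect_event(parse))  # ['宣布'] -> an announcement event
```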

Keywords: natural language processing, Chinese event detection, rule matching, dependency parsing

Procedia PDF Downloads 125
6122 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement

Authors: Hadi Ardiny, Amir Mohammad Beigzadeh

Abstract:

Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of which was equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through the correlation between the camera data and the detector data. The computer vision method extracts the movements of all robots in the XY plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding counts of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. The modeling techniques prove to be valuable in designing different scenarios and intelligent systems before initiating any experiments.
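
A minimal sketch of one way the correlation step can work: predict each robot's expected count rate from its camera-tracked distance via an inverse-square model, then select the robot whose prediction correlates best with the measured counts (all numbers synthetic; the paper's actual algorithm may differ):

```python
import numpy as np

def identify_source(tracks, detector_pos, counts):
    """tracks: {robot: (T,2) XY positions}; counts: (T,) gamma counts.
    Return the robot whose 1/r^2 profile best matches the counts."""
    best, best_r = None, -np.inf
    for robot, xy in tracks.items():
        r2 = ((xy - detector_pos) ** 2).sum(axis=1)
        predicted = 1.0 / np.maximum(r2, 1e-6)     # inverse-square law
        r = np.corrcoef(predicted, counts)[0, 1]   # Pearson correlation
        if r > best_r:
            best, best_r = robot, r
    return best, best_r

t = np.linspace(0, 10, 60)
det = np.array([0.0, 0.0])
tracks = {"A": np.c_[2 + t, np.ones_like(t)],       # drives away
          "B": np.c_[12 - t, np.ones_like(t)]}      # drives toward
true_r2 = ((tracks["B"] - det) ** 2).sum(axis=1)
counts = 5000 / true_r2 + np.random.default_rng(2).normal(0, 5, 60)
print(identify_source(tracks, det, counts))  # ('B', ~1.0)
```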

Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems

Procedia PDF Downloads 99
6121 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep-learning-based barcode detection method using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes to generate a large variety of data that is close to actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 151
6120 Capturing the Stress States in Video Conferences by Photoplethysmographic Pulse Detection

Authors: Jarek Krajewski, David Daxberger

Abstract:

We propose a stress detection method based on an RGB camera using heart rate detection, also known as photoplethysmography imaging (PPGI). This technique focuses on the measurement of the small changes in skin colour caused by blood perfusion. A stationary lab setting with simulated video conferences is chosen, using constant light conditions and a sampling rate of 30 fps. The ground-truth measurement of heart rate is conducted with a common PPG system. The proposed pulse peak detection is based on a machine learning approach, applying brute-force feature extraction for the prediction of heart rate pulses. The statistical analysis showed good agreement (correlation r = .79, p < 0.05) between the reference heart rate system and the proposed method. Based on these findings, the proposed method could provide a reliable, low-cost, and contactless way of measuring HR parameters in daily-life environments.
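
A minimal sketch of a PPGI pipeline of this kind (green-channel mean over a skin region, band-pass around plausible heart rates, peak picking); scipy's generic peak finder stands in here for the paper's machine-learning peak detector:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def heart_rate_from_roi(frames, fps=30):
    """frames: (T, H, W, 3) RGB video of a skin region.
    Returns the estimated heart rate in beats per minute."""
    g = frames[..., 1].mean(axis=(1, 2))        # green-channel trace
    b, a = butter(3, [0.7, 4.0], "bandpass", fs=fps)  # 42-240 bpm band
    filtered = filtfilt(b, a, g - g.mean())
    peaks, _ = find_peaks(filtered, distance=fps * 0.35)
    duration_s = len(g) / fps
    return 60.0 * len(peaks) / duration_s

# Synthetic 20 s clip with a 72 bpm pulse modulating skin color.
t = np.arange(20 * 30) / 30
pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)       # 1.2 Hz = 72 bpm
frames = 100 + pulse[:, None, None, None] * np.ones((1, 8, 8, 3))
print(round(heart_rate_from_roi(frames)))        # ~72
```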

Keywords: heart rate, PPGI, machine learning, brute force feature extraction

Procedia PDF Downloads 119
6119 Low Cost Real Time Robust Identification of Impulsive Signals

Authors: R. Biondi, G. Dys, G. Ferone, T. Renard, M. Zysman

Abstract:

This paper describes an automated, implementable system for impulsive signal detection and recognition. The system uses a digital signal processing device for the detection and identification process, analysing the signals in real time in order to produce a particular response if needed. Detection is achieved by normalizing the inputs and comparing the read signals to a dynamic threshold, thus avoiding detections linked to loud or fluctuating environmental noise. Identification is done through neural network algorithms. As a setup, our system can receive signals to “learn” certain patterns; through “learning”, the system can recognize signals faster, providing flexibility for new patterns similar to those known. Sound is captured through a simple jack input and could be exchanged for an enhanced recording device such as a wide-area recorder. Furthermore, a communication module can be added to the apparatus to send alerts to another interface if needed.
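
A minimal sketch of the detection stage: normalization against a slowly adapting background-noise estimate plus a dynamic threshold. The neural-network identification stage is omitted, and all constants are assumptions:

```python
import numpy as np

def detect_impulses(x, k=5.0, alpha=0.01):
    """Flag samples that exceed k times a running noise estimate.
    The estimate adapts slowly, so loud steady noise raises the
    threshold instead of triggering detections."""
    noise = np.abs(x[:100]).mean()   # initial background estimate
    hits = []
    for i, s in enumerate(x):
        if abs(s) > k * noise:
            hits.append(i)           # impulsive event candidate
        else:
            noise = (1 - alpha) * noise + alpha * abs(s)
    return hits

rng = np.random.default_rng(3)
sig = rng.normal(0, 0.1, 2000)
sig[500] = 3.0                       # an impulse (e.g., a bang)
print(detect_impulses(sig))          # -> [500]
```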

Keywords: sound detection, impulsive signal, background noise, neural network

Procedia PDF Downloads 306
6118 Seismic Retrofitting of Structures Using Steel Plate Slit Dampers Based on Genetic Algorithm

Authors: Mohamed Noureldin, Jinkoo Kim

Abstract:

In this study, a genetic algorithm was used to find the optimum locations of slit dampers satisfying a target displacement. A seismic retrofit scheme for a building structure was presented using steel plate slit dampers. A cyclic loading test was used to verify the energy dissipation capacity of the slit damper. The seismic retrofit of the model structure using the slit dampers was compared with a retrofit based on enlarging the shear walls. The capacity spectrum method was used to propose a simple damper distribution scheme proportional to the inter-story drifts. The validity of the simple story-wise damper distribution procedure was verified by comparison with the results of the genetic algorithm. It was observed that the proposed simple damper distribution pattern was in good agreement with the optimum distribution obtained from the genetic algorithm. Acknowledgment: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2017R1D1A1B03032809).
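
A minimal sketch of the GA formulation, assuming a boolean chromosome that marks damper stories and a toy drift model with a penalty for exceeding the target displacement (none of these numbers come from the paper):

```python
import numpy as np

rng = np.random.default_rng(4)
N_STORY, N_DAMPER, POP, GENS = 10, 4, 40, 60
base_drift = np.linspace(2.0, 0.5, N_STORY)   # toy inter-story drifts
TARGET = 1.0                                  # target max drift

def fitness(chrom):
    """Dampers halve a story's drift (toy model); penalize exceeding
    the target displacement and lightly penalize damper count."""
    drift = base_drift * np.where(chrom, 0.5, 1.0)
    over = max(0.0, drift.max() - TARGET)
    return -(drift.max() + 10.0 * over + 0.1 * chrom.sum())

def make_chrom():
    c = np.zeros(N_STORY, dtype=bool)
    c[rng.choice(N_STORY, N_DAMPER, replace=False)] = True
    return c

pop = [make_chrom() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP // 2]                 # elitist selection
    children = []
    for _ in range(POP - len(survivors)):
        a, b = rng.choice(len(survivors), 2, replace=False)
        cut = rng.integers(1, N_STORY)         # one-point crossover
        child = np.concatenate([survivors[a][:cut], survivors[b][cut:]])
        if rng.random() < 0.2:                 # mutation: flip one story
            child[rng.integers(N_STORY)] ^= True
        children.append(child)
    pop = survivors + children
print(np.flatnonzero(max(pop, key=fitness)))   # stories chosen for dampers
```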

Keywords: slit dampers, seismic retrofit, genetic algorithm, optimum design

Procedia PDF Downloads 211
6117 Bundle Block Detection Using Spectral Coherence and Levenberg Marquardt Neural Network

Authors: K. Padmavathi, K. Sri Ramakrishna

Abstract:

This study describes a procedure for the detection of Left and Right Bundle Branch Block (LBBB and RBBB) ECG patterns using the spectral coherence (SC) technique and an LM neural network. The coherence function finds common frequencies between two signals and evaluates their similarity. The QT variations of bundle blocks are observed in lead V1 of the ECG. The spectral coherence technique uses the Welch method for calculating the power spectral density (PSD). For the detection of normal and bundle block beats, the SC output values are given as the input features for the LMNN classifier. The overall accuracy of the LMNN classifier is 99.5 percent. The data was collected from the MIT-BIH Arrhythmia database.
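
The spectral coherence feature can be computed with standard tools. A minimal sketch using scipy's Welch-based coherence, with synthetic traces standing in for lead-V1 beats (the comparison against a normal reference beat is an assumption):

```python
import numpy as np
from scipy.signal import coherence

fs = 360                                       # MIT-BIH sampling rate (Hz)
rng = np.random.default_rng(5)
ref = rng.normal(size=4 * fs)                  # stand-in normal-beat trace
same = ref + 0.3 * rng.normal(size=ref.size)   # similar morphology
diff = rng.normal(size=ref.size)               # altered morphology

def coherence_feature(beat, ref, fs=fs):
    """Welch-based magnitude-squared coherence against a reference
    beat; the mean over 0-40 Hz serves as one classifier feature."""
    f, cxy = coherence(ref, beat, fs=fs, nperseg=256)
    return cxy[f <= 40].mean()

print(coherence_feature(same, ref))   # high (close to 1)
print(coherence_feature(diff, ref))   # low (near the noise floor)
```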

Keywords: bundle block, SC, LMNN classifier, welch method, PSD, MIT-BIH, arrhythmia database

Procedia PDF Downloads 268
6116 Sequential Covering Algorithm for Nondifferentiable Global Optimization Problem and Applications

Authors: Mohamed Rahal, Djaouida Guetta

Abstract:

In this paper, the one-dimensional unconstrained global optimization problem for continuous functions satisfying a Hölder condition is considered. We extend the sequential covering algorithm (SCA) for Lipschitz functions to the larger class of Hölder functions. The convergence of the method is studied, and the algorithm can be applied to systems of nonlinear equations. Finally, some numerical examples are presented to illustrate the efficiency of the present approach.
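
A minimal sketch of the covering idea for a function satisfying |f(x) - f(y)| <= H|x - y|^alpha: each interval gets the Hölder lower bound f(mid) - H((r - l)/2)^alpha, and the interval with the smallest bound is bisected until the bound certifies the minimum (H and alpha assumed known):

```python
import heapq
import math

def sca_minimize(f, a, b, H, alpha, tol=1e-4, max_iter=10_000):
    """Sequential covering: always refine the interval whose Hölder
    lower bound is smallest, until it certifies the global minimum."""
    def item(l, r):
        m = 0.5 * (l + r)
        return (f(m) - H * ((r - l) / 2) ** alpha, l, r, m)

    heap = [item(a, b)]
    best = min(f(a), f(b))
    for _ in range(max_iter):
        bound, l, r, m = heapq.heappop(heap)
        best = min(best, f(m))
        if best - bound < tol:          # bound certifies the minimum
            return m, best
        heapq.heappush(heap, item(l, m))
        heapq.heappush(heap, item(m, r))
    return m, best

# sin is Lipschitz (alpha = 1) with constant H = 1.
print(sca_minimize(math.sin, 0.0, 6.0, H=1.0, alpha=1.0))  # ~(4.712, -1)
```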

Keywords: global optimization, Hölder functions, sequential covering method, systems of nonlinear equations

Procedia PDF Downloads 354
6115 Study of Adaptive Filtering Algorithms and the Equalization of Radio Mobile Channel

Authors: Said Elkassimi, Said Safi, B. Manaut

Abstract:

This paper presents a study of three algorithms: an equalization algorithm that equalizes the transmission channel under the ZF and MMSE criteria, applied to the BRAN A channel, and the adaptive filtering algorithms LMS and RLS, used to estimate the parameters of the equalizer filter, i.e., to track the channel estimate and thereby reflect the temporal variations of the channel and reduce the error in the transmitted signal. We evaluate the performance of the equalizer under the ZF and MMSE criteria in the noiseless case, together with a comparison of the performance of the LMS and RLS algorithms.
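
A minimal sketch of the LMS update used to adapt an equalizer's taps from a known training sequence (the three-tap FIR channel below is a toy stand-in for BRAN A):

```python
import numpy as np

def lms_equalizer(rx, desired, n_taps=11, mu=0.01):
    """Adapt FIR equalizer taps so that rx filtered by them tracks
    the known training symbols (LMS stochastic gradient)."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(rx)):
        x = rx[n - n_taps + 1: n + 1][::-1]  # rx[n], rx[n-1], ...
        e = desired[n] - w @ x               # a-priori error
        w += mu * e * x                      # LMS update
    return w

rng = np.random.default_rng(6)
symbols = rng.choice([-1.0, 1.0], 5000)         # BPSK training sequence
channel = np.array([1.0, 0.4, 0.2])             # toy multipath channel
rx = np.convolve(symbols, channel)[:5000] + 0.01 * rng.normal(size=5000)
w = lms_equalizer(rx, symbols)
eq = np.convolve(rx, w)[:5000]
print(np.mean(np.sign(eq[100:]) == symbols[100:]))  # ~1.0 after convergence
```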

Keywords: adaptive filtering, equalizer, LMS, RLS, BRAN A, Proakis (B), MMSE, ZF

Procedia PDF Downloads 305
6114 Sync Consensus Algorithm: Trying to Reach an Agreement at Full Speed

Authors: Yuri Zinchenko

Abstract:

Recently, distributed storage systems have been used more and more in various aspects of everyday life. They provide such necessary properties as scalability, fault tolerance, durability, and others. At the same time, not only reliable but also fast data storage remains one of the most pressing issues in this area. That brings us to the consensus algorithm as one of the most important components, with a great impact on the functionality of a distributed system. This paper is the result of an analysis of several well-known consensus algorithms, such as Paxos and Raft. The algorithm it offers, called Sync, promotes, but does not insist on, simultaneous writes to the nodes (which positively affects the overall write speed) and tries to minimize the system's inactive time. This allows nodes to reach agreement on the system state in a shorter period, which is a critical factor for distributed systems. When developing Sync, a lot of attention was also paid to such criteria as simplicity and intuitiveness, the importance of which is difficult to overestimate.

Keywords: sync, consensus algorithm, distributed system, leader-based, synchronization

Procedia PDF Downloads 49
6113 Kinoform Optimisation Using Gerchberg-Saxton Iterative Algorithm

Authors: M. Al-Shamery, R. Young, P. Birch, C. Chatwin

Abstract:

Computer Generated Holography (CGH) is employed to create digitally defined coherent wavefronts. A CGH can be created using different techniques, such as the detour-phase technique or direct phase modulation to create a kinoform. The detour-phase technique was one of the first techniques used to generate holograms digitally. The disadvantage of this technique is that the reconstructed image often has poor quality due to the limited dynamic range it is possible to record using a medium with reasonable spatial resolution. The kinoform (phase-only hologram) is an alternative technique. In this method, the phase of the original wavefront is recorded but the amplitude is constrained to be constant. The original object does not need to exist physically, so the kinoform can be used to reconstruct an almost arbitrary wavefront. However, the image reconstructed by this technique contains high levels of noise and is not identical to the reference image. To improve the reconstruction quality of the kinoform, iterative techniques such as the Gerchberg-Saxton (GS) algorithm are employed. In this paper, the GS algorithm is described for the optimisation of a kinoform used for the reconstruction of a complex wavefront. Iterations of the GS algorithm are applied to determine the phase at a plane (with a known amplitude distribution, often taken as uniform) that satisfies given phase and amplitude constraints in a corresponding Fourier plane. The GS algorithm can be used in this way to enhance the reconstruction quality of the kinoform. Different images are employed as the reference object, and their kinoforms are synthesised using the GS algorithm. The quality of the reconstructed images is quantified to demonstrate the enhanced reconstruction quality achieved by this method.
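
A minimal sketch of the GS iteration for a kinoform: the hologram plane is constrained to phase-only (uniform amplitude) and the Fourier plane to the target image amplitude:

```python
import numpy as np

def gerchberg_saxton(target_amp, n_iter=50, seed=0):
    """Iteratively find a phase-only hologram whose far field
    (Fourier transform) reproduces target_amp."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    for _ in range(n_iter):
        field = np.exp(1j * phase)                     # phase-only constraint
        far = np.fft.fft2(field)                       # propagate to far field
        far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
        phase = np.angle(np.fft.ifft2(far))            # back-propagate, keep phase
    return phase

# Target: a bright square on a dark background.
target = np.zeros((64, 64)); target[24:40, 24:40] = 1.0
kinoform = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * kinoform)))
err = np.linalg.norm(recon / recon.max() - target) / np.linalg.norm(target)
print(round(err, 3))  # reconstruction error shrinks with more iterations
```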

Keywords: computer generated holography, digital holography, Gerchberg-Saxton algorithm, kinoform

Procedia PDF Downloads 517