Search results for: intrusion detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3483

1773 Revealing Thermal Degradation Characteristics of Distinctive Oligo- and Polysaccharides of Prebiotic Relevance

Authors: Attila Kiss, Erzsébet Némedi, Zoltán Naár

Abstract:

As natural prebiotic (non-digestible) carbohydrates stimulate the growth of colon microflora and contribute to maintaining the health of the host, analytical studies aimed at revealing the chemical behavior of these beneficial food components have come to the forefront of interest. Food processing (especially baking) may lead to a significant conversion of the parent compounds; hence it is of utmost importance to characterize the transformation patterns and the plausible decomposition products formed by thermal degradation. The relevance of this work is confirmed by the widespread use of these carbohydrates (fructo-oligosaccharides, cyclodextrins, raffinose, and resistant starch) in the food industry. More and more functional foodstuffs are being developed based on prebiotics as bioactive components. Twelve different types of oligosaccharides were investigated in order to reveal their thermal degradation characteristics. Different carbohydrate derivatives (D-fructose and D-glucose oligomers and polymers) were exposed to elevated temperatures (150 °C, 170 °C, 190 °C, 210 °C, and 220 °C) for 10 min. An advanced HPLC method was developed and used to identify the decomposition products of carbohydrates formed as a consequence of thermal treatment. Gradient elution was applied with a binary solvent system (acetonitrile, water) through an amine-based carbohydrate column. Evaporative light scattering (ELS) proved suitable for the reliable detection of the UV/VIS-inactive carbohydrate degradation products. These experimental conditions and advanced techniques made it possible to survey all of the intermediates formed. A change in oligomer distribution was established for all studied prebiotics throughout the thermal treatments. The obtained results indicate an increased extent of chain degradation of the carbohydrate moiety at elevated temperatures. Prevalence of oligomers with shorter chain length, and even the formation of monomer sugars (D-glucose and D-fructose), could be observed at higher temperatures. Unique oligomer distributions, which have not been described previously, are revealed for each studied carbohydrate, which might result in various prebiotic activities. Resistant starches exhibited high stability when thermally treated. The degradation process has been modeled by a plausible reaction mechanism in which proton-catalyzed degradation and chain cleavage take place.

Keywords: prebiotics, thermal degradation, fructo-oligosaccharide, HPLC, ELS detection

Procedia PDF Downloads 294
1772 System Detecting Border Gateway Protocol Anomalies Using Local and Remote Data

Authors: Alicja Starczewska, Aleksander Nawrat, Krzysztof Daniec, Jarosław Homa, Kacper Hołda

Abstract:

Border Gateway Protocol (BGP) is the main routing protocol that establishes routing between all autonomous systems, the basic administrative units of the internet. Because BGP itself is poorly protected, it is important to use additional BGP security systems. Many solutions to this problem have been proposed over the years, but none of them has been implemented on a global scale. This article describes a system capable of building an image of the real-time BGP network topology in order to detect BGP anomalies. Our proposal performs a detailed analysis of BGP messages arriving at local network interfaces, supplemented by information collected by remote collectors in different locations.
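One anomaly signal such a system can look for is illustrated by the toy sketch below. This is not the authors' system: it is a minimal, hypothetical multiple-origin-AS (MOAS) check that flags a prefix whose origin AS changes between announcements, one classic indicator of a possible BGP hijack. Prefixes and AS numbers are made up.

```python
def detect_origin_changes(announcements):
    """Flag prefixes whose origin AS changes across a stream of
    (prefix, origin_as) announcements -- a crude MOAS conflict check,
    one classic signal of a possible BGP hijack."""
    last_origin = {}
    alerts = []
    for prefix, origin_as in announcements:
        if prefix in last_origin and last_origin[prefix] != origin_as:
            alerts.append((prefix, last_origin[prefix], origin_as))
        last_origin[prefix] = origin_as
    return alerts

# Hypothetical announcement stream: 203.0.113.0/24 suddenly changes origin.
stream = [
    ("203.0.113.0/24", 64500),
    ("198.51.100.0/24", 64501),
    ("203.0.113.0/24", 64500),
    ("203.0.113.0/24", 64666),  # suspicious origin change
]
alerts = detect_origin_changes(stream)
print(alerts)  # [('203.0.113.0/24', 64500, 64666)]
```

A real detector would of course also weigh AS-path changes, route leaks, and data from remote collectors before raising an alert.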

Keywords: BGP, BGP hijacking, cybersecurity, detection

Procedia PDF Downloads 62
1771 Optimized Deep Learning-Based Facial Emotion Recognition System

Authors: Erick C. Valverde, Wansu Lim

Abstract:

Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health illnesses (e.g., depression and stress) and would allow better human social interaction with smart technologies. An FER system involves two steps: 1) a face detection task and 2) a facial emotion recognition task. It classifies the human expression into categories such as angry, disgust, fear, happy, sad, surprise, and neutral. Such a system requires intensive research to address issues with human diversity, unique human expressions, and the variety of human facial features due to age differences. These issues generally limit the ability of an FER system to detect human emotions with high accuracy. Early FER systems used simple supervised classification algorithms like K-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy because they cannot extract the significant features of several human emotions. To increase the accuracy of FER systems, deep learning (DL)-based methods, such as convolutional neural networks (CNN), have been proposed. These methods can find more complex features in the human face by means of the deeper connections within their architectures. However, the inference speed and computational cost of a DL-based FER system are often disregarded in exchange for higher accuracy. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods. Specifically, network pruning and quantization are used to lower computational costs and reduce memory usage, respectively. To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to apply advanced optimization techniques to the Xception model to improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the objectives of the network optimization methods used. As a result, the proposed approach can be used to create an efficient, real-time FER system.
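The two optimization steps named above, pruning and quantization, can be illustrated in isolation. The NumPy sketch below is not the authors' pipeline: it applies magnitude-based weight pruning and symmetric per-tensor int8 quantization to a hypothetical weight matrix standing in for one layer of the network.

```python
import numpy as np

def magnitude_prune(w, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    k = int(sparsity * w.size)
    pruned = w.copy()
    if k > 0:
        thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
        pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

def quantize_int8(w):
    """Symmetric per-tensor quantization to int8; returns (q, scale)."""
    scale = float(np.abs(w).max()) / 127.0
    if scale == 0.0:
        scale = 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))          # stand-in for one layer's weights
w_pruned = magnitude_prune(w, 0.5)     # 50% of weights set to zero
q, scale = quantize_int8(w_pruned)     # 1 byte per weight instead of 8
print((w_pruned == 0).mean(), q.dtype)
```

In practice these steps are applied with a framework's model-optimization tooling and followed by fine-tuning to recover accuracy; the sketch only shows the arithmetic involved.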

Keywords: deep learning, face detection, facial emotion recognition, network optimization methods

Procedia PDF Downloads 109
1770 Characterization of β-Lactamase Resistance among Acinetobacter baumannii Isolated from Clinical Samples, Egypt

Authors: Amal Saafan, Kareem Al Sofy, Sameh AbdelGhani, Magdy Amin

Abstract:

Background: Acinetobacter spp. resistance towards β-lactam antibiotics is mediated mainly by the production of different classes of β-lactamases; detection of some genes responsible for β-lactamase production is the objective of this study. Methods: One hundred fifty bacterial isolates were recovered from blood, sputum, and urine specimens from different hospitals in Egypt. Sixty-nine isolates were identified as Acinetobacter baumannii using traditional biochemical tests, CHROMagar, MicroScan, and PCR amplification of the blaOXA-51-like gene. Acinetobacter baumannii isolates were grouped into a carbapenem-resistant group (GP1), a cefotaxime-, ceftazidime-, and cefoxitin-resistant group (GP2), and a carbapenem- and cephalosporin-non-resistant group (GP3). Carbapenemase activity was screened using the modified Hodge test (MHT) for GP1. Metallo-β-lactamase screening was performed for MHT-positive isolates using the double disk synergy test (DDST) and the combined disk test (CDT). AmpC activity was screened using the AmpC disk test with Tris-EDTA, DDST, and CDT for GP2. Finally, PCR amplification of blaOXA-51-like, blaOXA-23-like, blaIMP-like, blaVIM-like, and blaADC-like genes was performed for isolates that showed at least two positive results out of three in both AmpC and carbapenemase phenotypic screening tests (obvious activity), in addition to GP3 (for comparison). Detection of blaOXA-51-like and blaADC-like genes preceded by ISAba1 was also performed. Results: The antibiogram of 69 pure Acinetobacter baumannii isolates placed 57, 64, and 2 isolates into GP1, GP2, and GP3, respectively. Carbapenemase activity was shown by 49 (85.9%) isolates using MHT. Metallo-β-lactamase screening was positive in 32 (65.3%) and 35 (71.4%) isolates using DDST and CDT, respectively. AmpC activity was shown by 43 (67.2%) and 50 (78.1%) isolates using the AmpC disk test with Tris-EDTA, and both DDST and CDT, respectively. Twenty-seven isolates showed obvious activity; all of them (100%) harbored blaOXA-51-like and blaADC-like genes, while blaOXA-23-like, blaIMP-like, and blaVIM-like genes were harbored by 23 (85.2%), 9 (33.3%), and no isolates, respectively. Only 12 (44.4%) isolates harbored blaOXA-51-like and blaADC-like genes preceded by ISAba1. GP3 isolates were positive only for blaOXA-51-like and blaADC-like genes. Conclusion: Resistance cannot be correlated with the mere presence of blaOXA-51-like and blaADC-like genes unless ISAba1 is present immediately upstream as a transcriptional promoter. The blaOXA-23-like gene played an important role in carbapenem resistance compared with the blaIMP-like and blaVIM-like genes.

Keywords: acinetobacter, beta-lactams, resistance, antimicrobial agents

Procedia PDF Downloads 331
1769 A Fundamental Study for Real-Time Safety Evaluation System of Landing Pier Using FBG Sensor

Authors: Heungsu Lee, Youngseok Kim, Jonghwa Yi, Chul Park

Abstract:

A landing pier is subject to safety assessment by visual inspection and design data, but it is difficult to check for damage in real time. In this study, real-time damage detection and safety evaluation methods were studied. As a result of structural analysis of an arbitrary landing pier structure, the inflection points of deformation and moment occurred at 10%, 50%, and 90% of the pile length. The critical value of the Fiber Bragg Grating (FBG) sensor was set according to the safety factor, and an FBG sensor application method for real-time safety evaluation was derived.

Keywords: FBG sensor, harbor structure, maintenance, safety evaluation system

Procedia PDF Downloads 199
1768 Microfluidic Lab on Chip Platform for the Detection of Arthritis Markers from Synovial Organ on Chip by Miniaturizing Enzyme-Linked ImmunoSorbent Assay Protocols

Authors: Laura Boschis, Elena D. Ozzello, Enzo Mastromatteo

Abstract:

Point-of-care diagnostics find growing interest in medicine and agri-food because of faster intervention and prevention. EliChip is a microfluidic platform for performing point-of-care immunoenzymatic assays, based on ready-to-use kits and a portable instrument that manages the fluidics and reads reliable quantitative results. Thanks to miniaturization, analyses are faster and more sensitive than conventional ELISA. EliChip is one of the crucial assets of the European-funded Flamingo project for in-line measurement of inflammatory markers.

Keywords: lab on chip, point of care, immunoenzymatic analysis, synovial arthritis

Procedia PDF Downloads 170
1767 Limiting Freedom of Expression to Fight Radicalization: The 'Silencing' of Terrorists Does Not Always Allow Rights to 'Speak Loudly'

Authors: Arianna Vedaschi

Abstract:

This paper addresses the relationship between freedom of expression, national security, and radicalization. Is it still possible to talk about a balance between the first two elements? Or, due to the intrusion of the third, is it more appropriate to consider freedom of expression as “permanently disfigured” by securitarian concerns? In this study, both the legislative and the judicial levels are taken into account, and the comparative method is employed in order to provide the reader with a complete framework of relevant issues and a workable set of solutions. The analysis moves from the finding that the tension between free speech and national security has become a major issue in democratic countries, whose very essence is continuously endangered by the ever-changing and multi-faceted threat of international terrorism. In particular, a change in terrorist groups’ recruiting patterns, attracting more and more people by way of a cutting-edge communicative strategy that often employs sophisticated technology as a radicalization tool, has called on law-makers to modify their approach to dangerous speech. While traditional constitutional and criminal law used to punish speech only if it explicitly and directly incited the commission of a criminal action (the “cause-effect” model), so-called glorification offences – punishing mere ideological support for terrorism, often on the web – are becoming commonplace in the comparative scenario. Although this is a direct, and even somewhat understandable, consequence of the impending terrorist menace, this research shows many problematic issues connected to such a preventive approach. First, from a predominantly theoretical point of view, this trend negatively impacts the already blurred line between permissible and prohibited speech. Second, from a pragmatic point of view, such legislative tools are not always suitable to keep up with the ongoing developments of both terrorist groups and their use of technology. In other words, there is a risk that such measures become outdated even before their application. Indeed, it seems hard to still talk about a proper balance: what was previously clearly perceived as a balancing of values (freedom of speech v. public security) has turned, in many cases, into a hierarchy with security at its apex. In light of these findings, this paper concludes that such a complex issue would perhaps be better dealt with through a combination of policies: not only criminalizing ‘terrorist speech,’ which should be relegated to a last-resort tool, but also acting at an even earlier stage, i.e., trying to prevent dangerous speech itself. This might be done by promoting social cohesion and the inclusion of minorities, so as to reduce the probability of people considering terrorist groups a “viable option” to deal with the lack of identification within their social contexts.

Keywords: radicalization, free speech, international terrorism, national security

Procedia PDF Downloads 183
1766 The Idea of Building a Reservoir Under the Ground in the Mekong Delta in Vietnam

Authors: Huu Hue Van

Abstract:

The Mekong Delta is the region in southwestern Vietnam where the Mekong River approaches and flows into the sea through a network of distributaries. The Climate Change Research Institute at the University of Can Tho, in studying the possible consequences of climate change, has predicted that many provinces in the Mekong Delta will be flooded by the year 2030. The Mekong Delta lacks fresh water in the dry season. To serve daily life, industry, and agriculture in the dry season, water is mainly taken from the water-bearing soil layers under the ground (aquifers), which are being depleted; the water level in the aquifers has decreased. The Mekong Delta faces two bad scenarios in the future: 1) The Mekong Delta will be submerged into the sea again, due to subsidence of the ground (over-exploitation of groundwater); subsidence of constructions because of the low groundwater level (ten years ago, some constructions were built on foundations of Melaleuca poles planted in the Mekong Delta; Melaleuca poles have to remain fully within the saturated soil layer, otherwise they decay easily, so when the tops of the poles stand above the groundwater level they decay and cause subsidence); erosion of the river banks (the hydroelectric dams upstream on the Mekong River block the flow and reduce the concentration of suspended sediment in it, causing erosion of the river banks); and flooding of the delta because of sea level rise (climate change). 2) The Mekong Delta will be deserted: people will migrate elsewhere to make a living because cultivation becomes impossible due to capillary rise of alum (in the Mekong Delta there is a layer of alum soil under the ground; when the groundwater level falls below this layer, alum rises by capillarity into the arable soil layer), and because there is no fresh water for cultivation and daily life (owing to saline intrusion and depletion of the aquifers below). The Mekong Delta currently has about seven aquifers, with a total depth of about 500 m. Water has mainly been exploited from the middle-upper Pleistocene aquifer (qp2-3). The major cause of both bad scenarios is over-exploitation of water in the aquifers. Therefore, studying and building water reservoirs in the seven aquifers would solve many pressing problems: preventing subsidence, providing water for the whole delta (especially the coastal provinces), favoring nature, saving land (a surface reservoir on the delta would require a great deal of land), and limiting pollution (building hydraulic structures to prevent salt intrusion and store water in a surface lake pollutes the lake). It is therefore necessary to build a reservoir under the ground in the aquifers of the Mekong Delta. This super-sized reservoir would contribute to the existence and development of the Mekong Delta.

Keywords: aquifers, aquifers storage, groundwater, land subsidence, underground reservoir

Procedia PDF Downloads 72
1765 Predicting Student Performance Based on Coding Behavior in STEAMplug

Authors: Giovanni Gonzalez Araujo, Michael Kyrilov, Angelo Kyrilov

Abstract:

STEAMplug is an innovative web-based educational platform that makes teaching easier and learning more effective. It requires no setup, eliminating barriers to entry and allowing students to focus on their learning through real-world development environments. The student-centric tools enable easy collaboration between peers and teachers. Analyzing user interactions with the system enables us to predict student performance and identify at-risk students, allowing early instructor intervention.

Keywords: plagiarism detection, identifying at-risk students, education technology, e-learning system, collaborative development, learning and teaching with technology

Procedia PDF Downloads 134
1764 Prediction of Sepsis Illness from Patients' Vital Signs Using Long Short-Term Memory Network and Dynamic Analysis

Authors: Marcio Freire Cruz, Naoaki Ono, Shigehiko Kanaya, Carlos Arthur Mattos Teixeira Cavalcante

Abstract:

The systems that record patient care information, known as Electronic Medical Records (EMR), and those that monitor patients' vital signs, such as heart rate, body temperature, and blood pressure, have been extremely valuable for the effectiveness of patient treatment. Several studies have used data from EMRs and patients' vital signs to predict illnesses. Among them, we highlight those that intend to predict, classify, or at least identify patterns of sepsis in patients under vital-sign monitoring. Sepsis is an organ dysfunction caused by a dysregulated patient response to an infection, and it affects millions of people worldwide. Early detection of sepsis is expected to provide a significant improvement in its treatment. Previous works usually combined medical, statistical, mathematical, and computational models to develop early-prediction methods, achieving higher accuracies with the smallest possible number of variables. Among other techniques, there are studies using survival analysis, expert systems, machine learning, and deep learning that reached great results. In our research, patients are modeled as points moving each hour in an n-dimensional space, where n is the number of vital signs (variables). These points can reach a sepsis target point after some time. For now, the sepsis target point is calculated using the median of all patients' variables at sepsis onset. From these points, we calculate for each hour the position vector, the first derivative (velocity vector), and the second derivative (acceleration vector) of the variables to evaluate their behavior, and we construct a prediction model based on a Long Short-Term Memory (LSTM) network that includes these derivatives as explanatory variables. The accuracy of prediction 6 hours before the time of sepsis reached 83.24% considering only the vital signs, and 94.96% when including the position, velocity, and acceleration vectors. The data are collected from the Medical Information Mart for Intensive Care (MIMIC) database, a public database that contains vital signs, laboratory test results, observations, notes, and so on, from more than 60,000 patients.
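The position/velocity/acceleration construction described above amounts to taking finite differences over the hourly samples. The sketch below is an illustrative reading of the feature-building step, not the authors' code; padding the first rows with zeros so every hour has all three vectors is an assumption.

```python
import numpy as np

def kinematic_features(vitals):
    """Given an (hours, n_signals) array of hourly vital signs, return
    position, velocity (1st difference), and acceleration (2nd difference)
    stacked as an (hours, 3 * n_signals) feature matrix for an LSTM."""
    n_signals = vitals.shape[1]
    pos = vitals
    vel = np.vstack([np.zeros((1, n_signals)), np.diff(vitals, axis=0)])
    acc = np.vstack([np.zeros((1, n_signals)), np.diff(vel, axis=0)])
    return np.hstack([pos, vel, acc])

hr = np.array([[80.0], [84.0], [90.0], [98.0]])  # toy hourly heart-rate series
X = kinematic_features(hr)
print(X.shape)  # (4, 3): position, velocity, acceleration per hour
```

Each row of X would then be one time step of the LSTM input sequence for that patient.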

Keywords: dynamic analysis, long short-term memory, prediction, sepsis

Procedia PDF Downloads 108
1763 A Middleware Management System with Supporting Holonic Modules for Reconfigurable Management System

Authors: Roscoe McLean, Jared Padayachee, Glen Bright

Abstract:

There is currently a gap in the technology covering the rapid establishment of control after a reconfiguration in a Reconfigurable Manufacturing System. This gap involves the detection of the factory floor state and the communication link between the factory floor and the high-level software. In this paper, a thin, hardware-supported Middleware Management System (MMS) is proposed and its design and implementation are discussed. The research found that a cost-effective localization technique can be combined with intelligent software to speed up the ramp-up of a reconfigured system. The MMS makes the process more intelligent, more efficient and less time-consuming, thus supporting the industrial implementation of the RMS paradigm.

Keywords: intelligent systems, middleware, reconfigurable manufacturing, management system

Procedia PDF Downloads 662
1762 Assessment of Morphodynamic Changes at Kaluganga River Outlet, Sri Lanka Due to Poorly Planned Flood Controlling Measures

Authors: G. P. Gunasinghe, Lilani Ruhunage, N. P. Ratnayake, G. V. I. Samaradivakara, H. M. R. Premasiri, A. S. Ratnayake, Nimila Dushantha, W. A. P. Weerakoon, K. B. A. Silva

Abstract:

Sri Lanka is affected by different natural disasters such as tsunamis, landslides, lightning, and riverine floods. Among these, riverine floods are a major disaster in the country. Different strategies are applied to control the impacts of flood hazards, and the expansion of river mouths is considered one of the main activities for flood mitigation and disaster reduction. However, during this expansion process, natural sand barriers, including sand spits, barrier islands, and tidal planes, are destroyed or subjected to change. This, in turn, can change the hydrodynamics and sediment dynamics of the area, leading to further damage to natural coastal features. The removal of a considerable portion of the naturally formed sand barrier at the Kaluganga River outlet (Calido Beach), Sri Lanka, to control the flooding of the Kaluthara urban area in May 2017, has become a serious issue: it caused the complete collapse of the river-mouth barrier spit system, leading to rapid coastal erosion of the Kaluganga river outlet area and saltwater intrusion into the Kaluganga River. The present investigation focuses on assessing the effects of the removal of this sand barrier at the Kaluganga river mouth. For this study, beach profiles, bathymetric surveys, and Google Earth historical satellite images from before and after the flood event were collected and analyzed. Furthermore, a beach boundary survey was carried out in October 2018 to support the satellite image data. The Google Earth satellite images and beach boundary survey data show a chronological breakdown of the sand barrier at the river outlet. Comparisons of the pre- and post-disaster bathymetric maps and beach profile analyses revealed a noticeable deepening of the seabed in the nearshore zone as well. Such deepening in the nearshore zone can cause sea waves to break very near the coastline. This might also generate new diffraction patterns, resulting in differential coastal accretion and erosion scenarios. Unless immediate mitigatory measures are taken, the impacts may cause severe problems to the sensitive Kaluganga river mouth system.

Keywords: bathymetry, beach profiles, coastal features, river outlet, sand barrier, Sri Lanka

Procedia PDF Downloads 124
1761 Automating and Optimizing Monitoring Prognostics for Rolling Bearings

Authors: H. Hotait, X. Chiementin, L. Rasolofondraibe

Abstract:

This paper presents continuing work to detect abnormal states in rolling bearings by studying the vibration signature and calculating the remaining useful life. To achieve these aims, two methods are used. The first is classification to detect the degradation state by the AOM-OPTICS (Acousto-Optic Modulator) method. The second is prediction of the degradation state using least-squares support vector regression, which is then compared with a linear degradation model. An experimental investigation on ball bearings was conducted to assess the effectiveness of the method by applying the acquired vibration signals. The proposed model for predicting the state of the bearing gives accurate results with respect to the experimental and numerical data.
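The linear degradation baseline mentioned above can be sketched as a least-squares line fitted to a health indicator and extrapolated to a failure threshold. This is a generic illustration of such a model, not the authors' implementation; the indicator series and threshold value are assumptions.

```python
import numpy as np

def linear_rul(times, indicator, threshold):
    """Fit a linear degradation model to a health indicator (e.g., an RMS
    vibration level) and estimate remaining useful life as the time until
    the fitted line crosses `threshold`."""
    slope, intercept = np.polyfit(times, indicator, 1)
    if slope <= 0:  # no degradation trend: RUL effectively unbounded
        return float("inf")
    t_fail = (threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)

t = np.arange(10.0)                 # hours of monitoring
health = 0.05 + 0.1 * t             # toy, steadily growing indicator
print(linear_rul(t, health, 2.05))  # line crosses 2.05 at t = 20 -> RUL = 11
```

A learned regressor such as least-squares SVR plays the same role as the line fit here, but can follow nonlinear degradation trajectories.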

Keywords: bearings, automatization, optimization, prognosis, classification, defect detection

Procedia PDF Downloads 106
1760 Simulation and Characterization of Stretching and Folding in Microchannel Electrokinetic Flows

Authors: Justo Rodriguez, Daming Chen, Amador M. Guzman

Abstract:

The detection, treatment, and control of rapidly propagating, deadly viruses such as COVID-19 require the development of inexpensive, fast, and accurate devices to address the urgent needs of the population. Microfluidics-based sensors are among the easiest detection methods and techniques to use. A micro-analyzer is defined as a microfluidics-based sensor composed of a network of microchannels with varying functions. Given their size, portability, and accuracy, they are proving to be more effective and convenient than other solutions. A micro-analyzer based on the "Lab on a Chip" concept presents advantages over non-micro devices due to its smaller size and its better ratio between useful area and volume. The integration of multiple processes in a single microdevice reduces both the number of necessary samples and the analysis time, leading to the next generation of analyzers for the health sciences. In some applications, the flow of solution within the microchannels is driven by a pressure gradient, which can produce adverse effects on biological samples. A more efficient and less harmful way of controlling the flow in a microchannel-based analyzer is to apply an electric field to induce the fluid motion and either enhance or suppress the mixing process. Electrokinetic flows are characterized by no fewer than two non-dimensional parameters: the electric Rayleigh number and the geometrical aspect ratio. In this research, stable and unstable flows have been studied numerically (and, when possible, experimentally) in a T-shaped microchannel. Additionally, unstable electrokinetic flows at Rayleigh numbers higher than the critical value have been characterized. The flow mixing enhancement was quantified in relation to the stretching and folding that fluid particles undergo when they are subjected to supercritical electrokinetic flows. Computational simulations were carried out using a finite element-based program, working with the flow mixing concepts developed by Gollub and collaborators. Hundreds of seeded massless particles were tracked along the microchannel from entrance to exit for both stable and unstable flows. After post-processing, their trajectories and the folding and stretching values for the different flows were found. Numerical results show that for supercritical electrokinetic flows, the effects of the folding and stretching processes become more apparent. Consequently, there is an improvement in the mixing process, ultimately leading to a more homogeneous mixture.
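The stretching diagnostic used above can be illustrated independently of the flow solver: for an initially close pair of tracked particles, stretching is the ratio of current to initial separation. The sketch below applies this to a toy linear map with a known expanding direction, standing in for two neighboring seeded particles; it illustrates the diagnostic only, not the electrokinetic simulation itself.

```python
import numpy as np

def stretching_history(traj_a, traj_b):
    """Stretching of a material line element marked by two tracked particles:
    ratio of current to initial separation at every stored time step."""
    sep = np.linalg.norm(traj_a - traj_b, axis=1)
    return sep / sep[0]

# Toy dynamics: iterate a linear map with an expanding eigendirection.
A = np.array([[2.0, 1.0], [1.0, 1.0]])
a, b = np.array([0.3, 0.7]), np.array([0.3 + 1e-6, 0.7])
traj_a, traj_b = [a], [b]
for _ in range(10):
    a, b = A @ a, A @ b
    traj_a.append(a)
    traj_b.append(b)
s = stretching_history(np.array(traj_a), np.array(traj_b))
print(s[-1] > 1e3)  # separation grows exponentially -> strong stretching
```

In the microchannel study, the same ratio would be evaluated along the simulated particle trajectories, with large values marking regions of vigorous mixing.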

Keywords: microchannel, stretching and folding, electrokinetic flow mixing, micro-analyzer

Procedia PDF Downloads 110
1759 Integration of Magnetoresistance Sensor in Microfluidic Chip for Magnetic Particles Detection

Authors: Chao-Ming Su, Pei-Sheng Wu, Yu-Chi Kuo, Yin-Chou Huang, Tan-Yueh Chen, Jefunnie Matahum, Tzong-Rong Ger

Abstract:

Magnetic particles (MPs) have been applied in the biomedical field for many years. This mediator offers many advantages, including high biocompatibility and diverse bio-applications. However, current techniques for quantifying magnetically labeled samples in assays are rare. In this paper, a Wheatstone-bridge giant magnetoresistance (GMR) sensor integrated with a homemade detection system was fabricated and used to quantify the concentration of MPs. The homemade detection system showed a high detection sensitivity of 10 μg/μl of MPs with optimized parameters: a vertical magnetic field of 100 G, a horizontal magnetic field of 2 G, and a flow rate of 0.4 ml/min.

Keywords: magnetic particles, magnetoresistive sensors, microfluidics, biosensor

Procedia PDF Downloads 388
1758 Urdu Text Extraction Method from Images

Authors: Samabia Tehsin, Sumaira Kausar

Abstract:

Due to the vast increase in multimedia data in recent years, efficient and robust retrieval techniques are needed to retrieve and index images and videos. Text embedded in images can serve as a strong retrieval tool for them. This is the reason that text extraction is an area of research with increasing attention. English text extraction is the focus of many researchers, but much less work has been done on other languages like Urdu. This paper focuses on Urdu text extraction from video frames. It presents a text detection feature set that can deal with most of the problems connected with the text extraction process. To test the validity of the method, it is evaluated on an Urdu news dataset, which gives promising results.

Keywords: caption text, content-based image retrieval, document analysis, text extraction

Procedia PDF Downloads 495
1757 Imaging of Underground Targets with an Improved Back-Projection Algorithm

Authors: Alireza Akbari, Gelareh Babaee Khou

Abstract:

Ground Penetrating Radar (GPR) is an important nondestructive remote sensing tool that has been used in both military and civilian fields. Recently, GPR imaging has attracted lots of attention in detection of subsurface shallow small targets such as landmines and unexploded ordnance and also imaging behind the wall for security applications. For the monostatic arrangement in the space-time GPR image, a single point target appears as a hyperbolic curve because of the different trip times of the EM wave when the radar moves along a synthetic aperture and collects reflectivity of the subsurface targets. With this hyperbolic curve, the resolution along the synthetic aperture direction shows undesired low resolution features owing to the tails of hyperbola. However, highly accurate information about the size, electromagnetic (EM) reflectivity, and depth of the buried objects is essential in most GPR applications. Therefore hyperbolic curve behavior in the space-time GPR image is often willing to be transformed to a focused pattern showing the object's true location and size together with its EM scattering. The common goal in a typical GPR image is to display the information of the spatial location and the reflectivity of an underground object. Therefore, the main challenge of GPR imaging technique is to devise an image reconstruction algorithm that provides high resolution and good suppression of strong artifacts and noise. In this paper, at first, the standard back-projection (BP) algorithm that was adapted to GPR imaging applications used for the image reconstruction. The standard BP algorithm was limited with against strong noise and a lot of artifacts, which have adverse effects on the following work like detection targets. Thus, an improved BP is based on cross-correlation between the receiving signals proposed for decreasing noises and suppression artifacts. 
To improve the quality of the results of the proposed BP imaging algorithm, a weight factor was designed for each point in the imaging region. Compared with the standard BP scheme, the improved algorithm produces images of higher quality and resolution. The improved BP algorithm was applied to both simulated and real GPR data, and the results showed superior artifact suppression and images of high quality and resolution. To quantitatively describe the effect of artifact suppression on the imaging results, a focusing parameter was evaluated.
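The delay-and-sum back-projection described above can be sketched as follows. This is a minimal illustration under an assumed monostatic geometry with invented array names; the paper's exact per-pixel weight factor is not specified, so a coherence-factor weight (one common cross-correlation-type choice) stands in for it.

```python
import numpy as np

def gpr_backproject(data, xs, ts, grid_x, grid_z, v=1.0e8, weighted=False):
    """Delay-and-sum back-projection of a monostatic GPR B-scan.

    data     : (n_traces, n_samples) array of A-scans
    xs       : antenna positions along the aperture (m)
    ts       : fast-time axis (s)
    v        : assumed wave speed in the subsurface (m/s)
    weighted : if True, multiply each pixel by a coherence factor
               (a cross-correlation-type weight) that suppresses the
               hyperbola tails and incoherent artifacts."""
    n = len(xs)
    image = np.zeros((len(grid_z), len(grid_x)))
    for j, z in enumerate(grid_z):
        for i, x in enumerate(grid_x):
            # two-way travel time from every antenna position to pixel (x, z)
            delays = 2.0 * np.hypot(xs - x, z) / v
            idx = np.searchsorted(ts, delays)
            ok = idx < data.shape[1]
            s = data[np.flatnonzero(ok), idx[ok]]
            das = s.sum()                       # standard BP: coherent sum
            if weighted and s.size:
                denom = n * np.sum(s * s)
                cf = (das * das) / denom if denom > 0 else 0.0  # in [0, 1]
                image[j, i] = cf * das
            else:
                image[j, i] = das
    return image
```

On a synthetic B-scan of a single point scatterer, the peak of the reconstructed image falls at the scatterer's true position, and the weighted variant darkens the hyperbola tails relative to the plain sum.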

Keywords: algorithm, back-projection, GPR, remote sensing

Procedia PDF Downloads 436
1756 Detection of Chaos in General Parametric Model of Infectious Disease

Authors: Javad Khaligh, Aghileh Heydari, Ali Akbar Heydari

Abstract:

Mathematical epidemiological models of the spread of disease through a population are used to predict the prevalence of a disease or to study the impact of treatment or prevention measures. Initial conditions for these models are estimated from statistical data collected from a population. Since these initial conditions can never be exact, the presence of chaos in mathematical models has serious implications for the accuracy of the models as well as for how epidemiologists interpret their findings. This paper confirms the chaotic behavior of a model for dengue fever and an SI model by investigating sensitive dependence on initial conditions, bifurcation, and the 0-1 test under a variety of initial conditions.
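The 0-1 test mentioned above (the Gottwald-Melbourne test) can be sketched for a scalar time series. This is a simplified single-frequency version for illustration (in practice K is averaged over many random frequencies c), not the authors' implementation:

```python
import numpy as np

def zero_one_test(phi, c=1.7, n_cut=None):
    """Gottwald-Melbourne 0-1 test for chaos on a scalar time series phi.

    Returns K in roughly [0, 1]: K near 1 suggests chaos, K near 0
    suggests regular dynamics. c is an (essentially arbitrary) angular
    frequency; a single value is used here for brevity."""
    phi = np.asarray(phi, dtype=float)
    N = len(phi)
    n_cut = n_cut or N // 10            # M(n) is only reliable for n << N
    j = np.arange(1, N + 1)
    # translation variables: a driven rotation by angle c per step
    p = np.cumsum(phi * np.cos(j * c))
    q = np.cumsum(phi * np.sin(j * c))
    ns = np.arange(1, n_cut + 1)
    # mean-square displacement of (p, q); grows linearly iff chaotic
    M = np.array([np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
                  for n in ns])
    # K: correlation of M(n) with n (growth-rate form of the test)
    return float(np.corrcoef(ns, M)[0, 1])
```

Applied to the fully chaotic logistic map (r = 4) the test returns K close to 1, while a purely periodic signal yields K close to 0.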

Keywords: epidemiological models, SEIR disease model, bifurcation, chaotic behavior, 0-1 test

Procedia PDF Downloads 311
1755 Nanoimprinted-Block Copolymer-Based Porous Nanocone Substrate for SERS Enhancement

Authors: Yunha Ryu, Kyoungsik Kim

Abstract:

Raman spectroscopy is one of the most powerful techniques for chemical detection, but its low sensitivity, originating from the extremely small cross-section of Raman scattering, limits its practical use. To overcome this problem, Surface Enhanced Raman Scattering (SERS) has been intensively studied for several decades. Because the SERS effect is mainly induced by the strong electromagnetic near-field enhancement resulting from localized surface plasmon resonance of metallic nanostructures, it is important to design plasmonic structures with a high density of electromagnetic hot spots for the SERS substrate. One useful fabrication method is to use a porous nanomaterial as a template for the metallic structure. Internal pores on a scale of tens of nanometers can act as strong EM hotspots by confining the incident light. Also, porous structures can capture more target molecules than non-porous structures in the same detection spot thanks to their large surface area. Herein we report a facile fabrication method for a porous SERS substrate that integrates solvent-assisted nanoimprint lithography and selective etching of a block copolymer. We obtained nanostructures with high porosity via simple selective etching of one microdomain of the diblock copolymer. Furthermore, we imprinted nanocone patterns into the spin-coated flat block copolymer film to make a three-dimensional SERS substrate with a high density of SERS hot spots as well as a large surface area. We used solvent-assisted nanoimprint lithography (SAIL) to reduce the time and cost of patterning the BCP film by taking advantage of a solvent that dissolves both the polystyrene and poly(methyl methacrylate) domains of the block copolymer; thus, the block copolymer film was molded at low temperature and atmospheric pressure in a short time. After Ag deposition, we measured the Raman intensity of dye molecules adsorbed on the fabricated structure.
Compared to the Raman signals of the Ag-coated solid nanocone, the porous nanocone showed 10 times higher Raman intensity at the 1510 cm⁻¹ band. In conclusion, we fabricated porous metallic nanocone arrays with a high density of electromagnetic hotspots by templating nanoimprinted diblock copolymer with selective etching, and demonstrated their capability as an effective SERS substrate.

Keywords: block copolymer, porous nanostructure, solvent-assisted nanoimprint, surface-enhanced Raman spectroscopy

Procedia PDF Downloads 610
1754 Quality and Shelf life of UHT Milk Produced in Tripoli, Libya

Authors: Faozia A. S. Abuhtana, Yahia S. Abujnah, Said O. Gnann

Abstract:

Ultra High Temperature (UHT) processed milk is widely distributed and preferred in numerous countries all over the world due to its relatively high quality and long shelf life. Because of the notably high consumption rate of UHT milk in Libya, in addition to the negligible number of studies of such products at the local level, this study was designed to assess the shelf life of locally produced as well as imported reconstituted sterilized whole milk samples marketed in Tripoli, Libya. Four locally produced and three imported brands were used in this study. All samples were stored at room temperature (25 ± 2 °C) for an 8-month period and subjected to physical, chemical, microbiological, and sensory tests. These tests included measurement of pH, specific gravity, and percent acidity, and determination of fat, protein, and melamine content. Microbiological tests included total aerobic count, total psychrotrophic bacteria, total spore-forming bacteria, and total coliform counts. Results indicated no detectable microbial growth of any type during the study period, and no melamine was detected in any sample. On the other hand, a gradual decline in pH accompanied by a gradual increase in percent acidity was observed in both locally produced and imported samples. These changes in pH and percent acidity reached their lowest and highest values, respectively, during the 24th week of storage. For instance, pH values were (6.40, 6.55, 6.55, 6.15) and (6.30, 6.50, 6.20) for local and imported brands, respectively, while percent acidity reached (0.185, 0.181, 0.170, 0.183) and (0.180, 0.180, 0.171) at the 24th week for local and imported brands, respectively. A similar pattern of decline was also observed in the specific gravity, fat, and protein content of some local and imported samples, especially at the later stages of the study. In both cases, some of the recorded pH, percent acidity, specific gravity, and fat content values violated the accepted limits set by Libyan standard no. 356 for sterilized milk. These changes in pH, percent acidity, and other UHT sterilized milk constituents during storage coincided with a gradual decrease in the degree of acceptance of the stored milk samples of both types, as shown by the sensory scores recorded by the panelists. In either case, the degree of acceptance was significantly low at the late stages of storage, and most milk samples became relatively unacceptable after the 18th and 20th week for untrained and trained panelists, respectively.

Keywords: UHT milk, shelf life, quality, gravity, bacteria

Procedia PDF Downloads 321
1753 Acrylic Microspheres-Based Microbial Bio-Optode for Nitrite Ion Detection

Authors: Siti Nur Syazni Mohd Zuki, Tan Ling Ling, Nina Suhaity Azmi, Chong Kwok Feng, Lee Yook Heng

Abstract:

Nitrite (NO2-) ion is used prevalently as a preservative in processed meat. Elevated levels of nitrite are also found in edible bird's nests (EBNs). Consumption of NO2- ion at levels above the health-based risk threshold may cause cancer in humans. The spectrophotometric Griess test is the simplest established standard method for NO2- ion detection; however, it requires careful control of the pH of each reaction step and is susceptible to interference from strong oxidants and dyes. Other traditional methods rely on laboratory-scale instruments such as GC-MS, HPLC, and ion chromatography, which cannot give a real-time response. There is therefore a significant need for devices capable of measuring nitrite concentration in situ and rapidly, without reagents, sample pretreatment, or extraction steps. Herein, we constructed a microspheres-based microbial optode for visual quantitation of NO2- ion. Raoultella planticola, a bacterium expressing NAD(P)H nitrite reductase (NiR), was successfully isolated by microbial techniques from EBN collected from a local birdhouse. The whole cells and the lipophilic Nile Blue chromoionophore were physically adsorbed on photocurable poly(n-butyl acrylate-N-acryloxysuccinimide) [poly(nBA-NAS)] microspheres, whilst the reduced coenzyme NAD(P)H was covalently immobilized on the succinimide-functionalized acrylic microspheres to produce a reagentless biosensing system. As the NiR enzyme catalyzes the oxidation of NAD(P)H to NAD(P)+, NO2- ion is reduced to ammonium hydroxide, and a colour change of the immobilized Nile Blue chromoionophore from blue to pink is observed as a result of the deprotonation reaction increasing the local pH in the microspheres membrane. The microspheres-based optosensor was optimized with a reflectance spectrophotometer at 639 nm and pH 8. The resulting microbial bio-optode membrane could quantify NO2- ion at 0.1 ppm and had a linear response up to 400 ppm.
Because of the large surface-area-to-mass ratio of the acrylic microspheres, the membrane allows efficient solid-state diffusional mass transfer of the substrate to the bio-recognition phase and achieves a steady-state response in as little as 5 min. The proposed optical microbial biosensor requires no sample pretreatment step and possesses high stability, as the whole-cell biocatalyst protects the enzymes from interfering substances; hence it is suitable for measurements in contaminated samples.

Keywords: acrylic microspheres, microbial bio-optode, nitrite ion, reflectometric

Procedia PDF Downloads 424
1752 Study and Construction on Signalling System during Reverse Motion Due to Obstacle

Authors: S. M. Yasir Arafat

Abstract:

Driving models are needed by many researchers to improve traffic safety and to advance autonomous vehicle design. To be most useful, a driving model must state specifically what information is needed and how it is processed. We therefore developed an "Obstacle Avoidance and Detection Autonomous Car" based on sensor applications. The ever-increasing technological demands of today call for very complex systems, which in turn require highly sophisticated controllers to ensure that high performance can be achieved and maintained under adverse conditions. Based on a developed model of brake operation, a controller for the braking system was designed. Its task is to control braking system operation more accurately than is currently the case.
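As an illustration of the sensor-based braking logic the abstract describes, a toy distance-to-brake rule is sketched below; the thresholds and the stopping-distance model are assumptions for illustration, not the authors' controller:

```python
def braking_command(distance_m, speed_mps, reaction_time_s=0.5, decel_mps2=6.0):
    """Map a forward obstacle-sensor reading to a brake level in [0, 1].

    Stopping distance = reaction distance + v^2 / (2a), a standard
    kinematic estimate. All parameter values are illustrative."""
    stopping = speed_mps * reaction_time_s + speed_mps ** 2 / (2.0 * decel_mps2)
    if distance_m <= stopping:
        return 1.0                                       # full braking
    if distance_m <= 2.0 * stopping:
        return (2.0 * stopping - distance_m) / stopping  # proportional braking
    return 0.0                                           # obstacle far away
```

At 10 m/s the assumed stopping distance is about 13.3 m, so an obstacle at 10 m triggers full braking, one at 20 m triggers partial braking, and one at 40 m triggers none.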

Keywords: automobile, obstacle, safety, sensing

Procedia PDF Downloads 354
1751 Characterization of the Dispersion Phenomenon in an Optical Biosensor

Authors: An-Shik Yang, Chin-Ting Kuo, Yung-Chun Yang, Wen-Hsin Hsieh, Chiang-Ho Cheng

Abstract:

Optical biosensors have become a powerful detection and analysis tool for wide-ranging applications in biomedical research, pharmaceuticals, and environmental monitoring. This study carried out computational fluid dynamics (CFD) simulations to explore the dispersion phenomenon in the microchannel of an optical biosensor. The predicted time sequences of concentration contours were used to better understand how dispersion develops in microchannels of different geometric shapes. The simulation results show the surface concentration over time at the sensing probe (a grating coupler offering the best performance), allowing the dispersion effect to be appraised and the design configurations resulting in minimum dispersion to be identified.

Keywords: CFD simulations, dispersion, microfluidic, optical waveguide sensors

Procedia PDF Downloads 533
1750 Hate Speech Detection Using Deep Learning and Machine Learning Models

Authors: Nabil Shawkat, Jamil Saquer

Abstract:

Social media has accelerated our ability to engage with others and eliminated many communication barriers. On the other hand, the widespread use of social media resulted in an increase in online hate speech. This has drastic impacts on vulnerable individuals and societies. Therefore, it is critical to detect hate speech to prevent innocent users and vulnerable communities from becoming victims of hate speech. We investigate the performance of different deep learning and machine learning algorithms on three different datasets. Our results show that the BERT model gives the best performance among all the models by achieving an F1-score of 90.6% on one of the datasets and F1-scores of 89.7% and 88.2% on the other two datasets.
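The F1-scores quoted above combine precision and recall; as a reminder of how such a score is computed from a model's predictions, a minimal binary F1 implementation is sketched below (the labels in the example are illustrative, not from the paper's datasets):

```python
def f1_score(y_true, y_pred, positive=1):
    """Binary F1: harmonic mean of precision and recall for one class.

    y_true, y_pred: equal-length sequences of class labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2.0 * precision * recall / (precision + recall)
```

With y_true = [1, 1, 1, 0, 0] and y_pred = [1, 1, 0, 1, 0], precision and recall are both 2/3, so F1 is 2/3; a score of 90.6% as reported for BERT implies both quantities are high simultaneously.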

Keywords: hate speech, machine learning, deep learning, abusive words, social media, text classification

Procedia PDF Downloads 118
1749 PCR Based DNA Analysis in Detecting P53 Mutation in Human Breast Cancer (MDA-468)

Authors: Debbarma Asis, Guha Chandan

Abstract:

Tumor Protein 53 (P53) is a tumor suppressor protein. P53 regulates the cell cycle and thus conserves genome stability by preventing mutation. It is so named because it runs as a 53-kilodalton (kDa) protein on polyacrylamide gel electrophoresis, although its actual mass is 43.7 kDa. Experimental evidence indicates that P53 cancer mutants lose tumor suppression activity and subsequently gain oncogenic activities that promote tumourigenesis. Tumor-specific DNA has recently been detected in the plasma of breast cancer patients. Detection of tumor-specific genetic material in cancer patients may provide a unique and valuable tumor marker for diagnosis and prognosis. The commercially available MDA-468 breast cancer cell line was used for the proposed study.

Keywords: tumor protein (P53), cancer mutants, MDA-468, tumor suppressor gene

Procedia PDF Downloads 465
1748 Delivering Safer Clinical Trials; Using Electronic Healthcare Records (EHR) to Monitor, Detect and Report Adverse Events in Clinical Trials

Authors: Claire Williams

Abstract:

Randomised controlled trials (RCTs) of efficacy are still perceived as the gold standard for the generation of evidence, and whilst advances in data collection methods are well developed, this progress has not been matched for the reporting of adverse events (AEs). Assessment and reporting of AEs in clinical trials are fraught with human error and inefficiency and are extremely time- and resource-intensive. Recent research into the quality of AE reporting during clinical trials concluded that it is substandard and inconsistent. Investigators commonly send sponsors reports that are incorrectly categorised and lacking in critical information, which can complicate the detection of valid safety signals. In our presentation, we will describe an electronic data capture system designed to support clinical trial processes by reducing the resource burden on investigators, improving overall trial efficiencies, and making trials safer for patients. This proprietary technology was developed using expertise proven in the delivery of the world's first prospective, phase 3b real-world trial, 'The Salford Lung Study', which enabled robust safety monitoring and reporting processes to be accomplished by remote monitoring of patients' EHRs. The technology detects safety alerts pre-defined by the protocol from data extracted directly from the patient's EHR. Based on study-specific criteria, created from the standard definition of a serious adverse event (SAE) and the safety profile of the medicinal product, the system notifies the investigator or study team of the safety alert. Each safety alert requires a clinical review by the investigator or delegate; examples of the types of alerts include hospital admission, death, hepatotoxicity, neutropenia, and acute renal failure.
This is achieved in near real-time; safety alerts can be reviewed along with any additional information available to determine whether they meet the protocol-defined criteria for reporting or withdrawal. This active surveillance technology helps reduce the resource burden of the more traditional methods of AE detection for investigators and study teams and can help eliminate reporting bias. Integration of multiple healthcare data sources enables much more complete and accurate safety data to be collected as part of a trial and can also provide an opportunity to evaluate a drug's safety profile long-term, in post-trial follow-up. By utilising this robust and proven method for safety monitoring and reporting, much higher-risk patient cohorts can be enrolled in trials, thus promoting inclusivity and diversity. Broadening eligibility criteria and adopting more inclusive recruitment practices in the later stages of drug development will increase the ability to understand the medicinal product's risk-benefit profile across the patient population that is likely to use the product in clinical practice. Furthermore, this ground-breaking approach to AE detection not only provides sponsors with better-quality safety data for their products but also reduces the resource burden on the investigator and study teams. With the data taken directly from the source, trial costs are reduced, minimal data validation is required, and near real-time reporting enables safety concerns and signals to be detected more quickly than in a traditional RCT.
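A rule-based screen of EHR events of the kind described above might be sketched as follows. The event codes, lab thresholds, and record layout are invented for illustration; in a real system the criteria come from the trial protocol and the product's safety profile:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class EhrEvent:
    """One extracted EHR record (hypothetical schema)."""
    patient_id: str
    code: str                       # e.g. "ADMIT", "DEATH", "LAB_ALT"
    timestamp: datetime
    value: Optional[float] = None   # lab result, where applicable

# Illustrative protocol-defined alert rules (thresholds are assumptions).
ALERT_RULES = {
    "hospital_admission":  lambda e: e.code == "ADMIT",
    "death":               lambda e: e.code == "DEATH",
    "hepatotoxicity":      lambda e: e.code == "LAB_ALT"
                                     and e.value is not None and e.value > 120.0,
    "neutropenia":         lambda e: e.code == "LAB_ANC"
                                     and e.value is not None and e.value < 1.0,
    "acute_renal_failure": lambda e: e.code == "LAB_CREAT"
                                     and e.value is not None and e.value > 2.0,
}

def screen_events(events):
    """Return (patient, alert, timestamp) tuples for clinical review."""
    alerts = []
    for e in events:
        for name, rule in ALERT_RULES.items():
            if rule(e):
                alerts.append((e.patient_id, name, e.timestamp))
    return alerts
```

Each returned tuple would then be routed to the investigator or delegate for the clinical review step described above.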

Keywords: more comprehensive and accurate safety data, near real-time safety alerts, reduced resource burden, safer trials

Procedia PDF Downloads 71
1747 Predictive Maintenance Based on Oil Analysis Applicable to Transportation Fleets

Authors: Israel Ibarra Solis, Juan Carlos Rodriguez Sierra, Ma. del Carmen Salazar Hernandez, Isis Rodriguez Sanchez, David Perez Guerrero

Abstract:

In this paper, we explain the oil analysis techniques used during a maintenance period of a city bus (Mercedes-Benz Boxer 40) serving the 'R-24 route' of the Coecillo Centro SA de CV line in León, Guanajuato, in order to estimate the optimal time for the oil change. Using devices such as a rotational viscometer and an atomic absorption spectrometer, it is possible to detect incipiently when the oil loses its lubricating properties and can therefore no longer protect the mechanical components of diesel engines such as those of these buses. Timely detection of lost properties in the oil allows us to establish a preventive maintenance plan for the fleet.

Keywords: atomic absorption spectrometry, maintenance, predictive velocity rate, lubricating oils

Procedia PDF Downloads 551
1746 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets

Authors: Ece Cigdem Mutlu, Burak Alakent

Abstract:

Maintaining the quality of manufactured products at a desired level depends on the stability of the process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor product quality and control the process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location, respectively, under the assumption of independent and normally distributed data. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of the traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, estimators that are robust against the contaminations that may exist in Phase I are required. In the current study, we present a simple approach to constructing robust Xbar control charts using the average distance to the median, the Qn estimator of scale, and the M-estimator of scale with logistic psi-function to estimate the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and the M-estimator of location with Huber and logistic psi-functions to estimate the process location parameter.
Phase I efficiency of proposed estimators and Phase II performance of Xbar charts constructed from these estimators are compared with the conventional mean and standard deviation statistics both under normality and against diffuse-localized and symmetric-asymmetric contaminations using 50,000 Monte Carlo simulations on MATLAB. Consequently, it is found that robust estimators yield parameter estimates with higher efficiency against all types of contaminations, and Xbar charts constructed using robust estimators have higher power in detecting disturbances, compared to conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combination of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
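Two of the robust estimators named above, the Hodges-Lehmann location estimator and the average distance to the median, can be sketched in a simple Xbar-limit construction. The median pooling across subgroups and the 3-sigma limits below are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np
from itertools import combinations

def hodges_lehmann(x):
    """Hodges-Lehmann location: median of all pairwise (Walsh) averages,
    including each observation paired with itself."""
    walsh = [(a + b) / 2.0 for a, b in combinations(x, 2)]
    return float(np.median(list(x) + walsh))

def adm_sigma(x):
    """Average distance to the median, rescaled to estimate sigma under
    normality (E|X - mu| = sigma * sqrt(2/pi))."""
    x = np.asarray(x, dtype=float)
    return float(np.mean(np.abs(x - np.median(x))) / np.sqrt(2.0 / np.pi))

def robust_xbar_limits(phase1, n):
    """Build Xbar chart limits from Phase I rational subgroups of size n.

    Robust per-subgroup estimates are pooled by their median so that a
    few contaminated subgroups cannot shift the limits."""
    centers = [hodges_lehmann(g) for g in phase1]
    sigmas = [adm_sigma(g) for g in phase1]
    cl = float(np.median(centers))
    sigma = float(np.median(sigmas))
    half = 3.0 * sigma / np.sqrt(n)     # conventional 3-sigma half-width
    return cl - half, cl, cl + half
```

Even with a few gross outliers planted in the Phase I subgroups, the center line stays near the true process mean, which is exactly the behavior that motivates robust estimation here.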

Keywords: average run length, M-estimators, quality control, robust estimators

Procedia PDF Downloads 177
1745 Software Cloning and Agile Environment

Authors: Ravi Kumar, Dhrubajit Barman, Nomi Baruah

Abstract:

Software Cloning has grown into an active area in the software engineering research community, yielding numerous techniques, tools, and other methods for clone detection and removal. Copying and modifying a block of code is identified as cloning, as it is the most basic means of software reuse. Agile Software Development is an approach currently being used in various software projects; it helps respond to the unpredictability of building software through incremental, iterative work cadences. Software Cloning has been introduced into the Agile Environment, and many Agile Software Development approaches use the concept of Software Cloning. This paper discusses the various Agile Software Development approaches and the degree to which the Software Cloning concept is being introduced into them.
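A minimal flavor of textual (Type-1) clone detection, one of the most basic techniques in this area, can be sketched as a sliding window over normalized source lines; this is an illustration of the idea, not any specific tool:

```python
import hashlib
import re

def normalize(line):
    """Type-1 normalization: drop comments, collapse whitespace."""
    line = re.sub(r"#.*", "", line)
    return re.sub(r"\s+", " ", line).strip()

def find_clones(source, window=3):
    """Return (first_start, second_start) line-index pairs whose
    `window`-line normalized chunks are identical."""
    lines = [normalize(l) for l in source.splitlines()]
    seen, clones = {}, []
    for i in range(len(lines) - window + 1):
        chunk = "\n".join(lines[i:i + window])
        if not chunk.strip():
            continue                      # skip all-blank windows
        h = hashlib.md5(chunk.encode()).hexdigest()
        if h in seen:
            clones.append((seen[h], i))   # duplicate of an earlier chunk
        else:
            seen[h] = i
    return clones
```

Real clone detectors additionally handle renamed identifiers (Type-2) and inserted or deleted statements (Type-3), but the hash-and-compare core is the same.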

Keywords: agile environment, refactoring, reuse, software cloning

Procedia PDF Downloads 516
1744 Automatic Segmentation of Lung Pleura Based On Curvature Analysis

Authors: Sasidhar B., Bhaskar Rao N., Ramesh Babu D. R., Ravi Shankar M.

Abstract:

Segmentation of the lung pleura is a preprocessing step in Computer-Aided Diagnosis (CAD) that helps reduce false positives in lung cancer detection. Existing methods fail to extract lung regions when nodules lie at the pleura of the lungs. In this paper, a new method is proposed that segments lung regions with nodules at the pleura based on curvature analysis and morphological operators. The proposed algorithm was tested on a six-patient dataset consisting of 60 images from the Lung Image Database Consortium (LIDC), and the results are satisfactory, with a 98.3% average overlap measure (AΩ).
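The average overlap measure (AΩ) used for evaluation above is the ratio of intersection to union between the segmented and reference regions; a minimal sketch, assuming binary masks:

```python
import numpy as np

def overlap_measure(seg, ref):
    """AΩ (Jaccard) overlap between a binary segmentation and a
    reference mask: |A ∩ B| / |A ∪ B|, in [0, 1]."""
    seg = np.asarray(seg).astype(bool)
    ref = np.asarray(ref).astype(bool)
    inter = np.logical_and(seg, ref).sum()
    union = np.logical_or(seg, ref).sum()
    return float(inter) / float(union) if union else 1.0
```

For example, two 10x10 blocks offset by two columns overlap in 80 of 120 union pixels, giving AΩ ≈ 0.667; the 98.3% average reported above therefore indicates near-perfect agreement with the reference segmentations.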

Keywords: curvature analysis, image segmentation, morphological operators, thresholding

Procedia PDF Downloads 583