Search results for: real time anomaly detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22690


21340 Conditions on Expressing a Matrix as a Sum of α-Involutions

Authors: Ric Joseph R. Murillo, Edna N. Gueco, Dennis I. Merino

Abstract:

Let F be C or R, where C and R are the sets of complex and real numbers, respectively, and let n be a natural number. An n-by-n matrix A over the field F is called an α-involutory matrix, or an α-involution, if there exists an α in the field such that the square of the matrix is equal to αI, where I is the n-by-n identity matrix. If α is a complex number or a nonnegative real number, then an n-by-n matrix A over the field F can be written as a sum of n-by-n α-involutory matrices over the field F if and only if the trace of that matrix is an integral multiple of the square root of α. Meanwhile, if α is a negative real number, then a 2n-by-2n matrix A over R can be written as a sum of 2n-by-2n α-involutory matrices over R if and only if the trace of the matrix is zero. Some other properties of α-involutory matrices are also determined.
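As a small illustrative instance of the definition (our own example, not taken from the paper), take α = 4 over R: the matrix below squares to αI, and its trace 0 = 0·√α is an integral multiple of √α, consistent with the stated condition.

```latex
% worked example: a 2-by-2 alpha-involution with alpha = 4
\[
A = \begin{pmatrix} 2 & 1 \\ 0 & -2 \end{pmatrix}, \qquad
A^{2} = \begin{pmatrix} 4 & 0 \\ 0 & 4 \end{pmatrix} = 4I = \alpha I,
\qquad \operatorname{tr}(A) = 0 = 0\cdot\sqrt{\alpha}.
\]
```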

Keywords: α-involutory matrices, sum of α-involutory matrices, trace, matrix theory

Procedia PDF Downloads 188
21339 Fast and Accurate Model to Detect Ictal Waveforms in Electroencephalogram Signals

Authors: Piyush Swami, Bijaya Ketan Panigrahi, Sneh Anand, Manvir Bhatia, Tapan Gandhi

Abstract:

Visual inspection of electroencephalogram (EEG) signals to detect epileptic activity is a very challenging and time-consuming task, even for an expert neurophysiologist. The problem is most acute in under-developed and developing countries due to a shortage of skilled neurophysiologists. In the past, notable research efforts have gone into automating the seizure detection process; however, high false-alarm rates and the complexity of the models developed so far have greatly limited their practical implementation. In this paper, we present a novel scheme for epileptic seizure detection using the empirical mode decomposition technique. The standard deviations of the resulting intrinsic mode functions were used as features, followed by a probability-density-based classifier to discriminate between non-ictal and ictal patterns in EEG signals. The model presented here demonstrated very high classification rates ( > 97%) without compromising statistical performance. The computation time for each testing phase was also very low ( < 0.029 s), which makes this model suitable for practical applications.
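A minimal sketch of the decomposition-and-features step described above, assuming the PyEMD package and a Gaussian (naive Bayes) density model as a stand-in for the probability-density-based classifier:

```python
# Sketch only: EMD of an EEG segment, standard deviation of each IMF as a feature,
# and a simple density-based classifier. PyEMD and scikit-learn are assumed available.
import numpy as np
from PyEMD import EMD
from sklearn.naive_bayes import GaussianNB

def imf_std_features(segment, n_imfs=5):
    """Decompose one EEG segment and return the standard deviations of its IMFs."""
    imfs = EMD().emd(np.asarray(segment, dtype=float))
    stds = imfs.std(axis=1)[:n_imfs]
    return np.pad(stds, (0, n_imfs - len(stds)))   # fixed-length feature vector

# X_segments: list of 1-D EEG segments, y: 0 = non-ictal, 1 = ictal (hypothetical data)
# features = np.array([imf_std_features(s) for s in X_segments])
# clf = GaussianNB().fit(features, y)
```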

Keywords: electroencephalogram (EEG), epilepsy, ictal patterns, empirical mode decomposition

Procedia PDF Downloads 400
21338 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models

Authors: V. Mantey, N. Findlay, I. Maddox

Abstract:

The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to examine manually. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery, with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in densely populated regions. The primary challenge with detecting small buildings in densely populated regions is both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials will be difficult to separate due to a similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that training models until they overfit the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires less input data. The model developed for this study has also been fine-tuned using existing, open-source building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span. Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher-quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality-check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
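A minimal sketch of the fine-tune-until-overfit idea using torchvision's Mask R-CNN; the two-class head, the dataset loader, and the training length are illustrative assumptions rather than the authors' production pipeline:

```python
# Sketch only: fine-tune a pre-trained Mask R-CNN on a small building dataset and
# deliberately keep training past the usual early-stopping point (overfitting).
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

num_classes = 2  # background + building
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
in_feats = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, num_classes)
in_feats_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_feats_mask, 256, num_classes)

# no weight decay and no validation-based early stopping: the model is allowed to overfit
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9, weight_decay=0.0)

# building_loader is a hypothetical DataLoader yielding (images, targets) with boxes and
# masks derived from open-source building vectors.
# for epoch in range(many_epochs):
#     for images, targets in building_loader:
#         loss = sum(model(images, targets).values())
#         optimizer.zero_grad(); loss.backward(); optimizer.step()
```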

Keywords: building detection, disaster relief, Mask R-CNN, satellite mapping

Procedia PDF Downloads 166
21337 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method

Authors: Arwa Alzughaibi

Abstract:

Human motion detection is a challenging task due to a number of factors, including variable appearance, posture, and a wide range of illumination conditions and backgrounds. The first requirement for such a model is therefore a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence, even under difficult conditions. With richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that work under varying illumination levels and backgrounds. Different feature sets are tried and tested, including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Local Decorrelated Channel Features (LDCF) and Aggregate Channel Features (ACF). We propose an efficient and reliable human motion detection approach that combines Histogram of Oriented Gradients (HOG) and local phase quantization (LPQ) as the feature set, and implements a search-pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show that combining the local phase quantization descriptor with the histogram of oriented gradients performs well over a large range of illumination conditions and backgrounds compared with state-of-the-art human detectors. The area under the ROC curve (AUC) of the proposed method reached 0.781 on the UCF dataset and 0.826 on the CDW dataset, indicating that it compares favourably with the HOG, DPM, LDCF and ACF methods.
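A minimal sketch of the combined HOG + LPQ feature extraction, assuming scikit-image for HOG and a simplified hand-rolled LPQ (a short-time Fourier transform at four low frequencies, sign-quantised to 8-bit codes); the window size and parameters are assumptions:

```python
# Sketch only: concatenate a HOG descriptor with a simplified LPQ histogram for one
# grayscale detection window. scikit-image and SciPy are assumed available.
import numpy as np
from scipy.signal import convolve2d
from skimage.feature import hog

def lpq_histogram(window, win=7):
    """Simplified LPQ: local Fourier coefficients at 4 low frequencies, quantised by sign."""
    window = np.asarray(window, dtype=float)
    r = (win - 1) // 2
    ys, xs = np.meshgrid(np.arange(-r, r + 1), np.arange(-r, r + 1), indexing="ij")
    a = 1.0 / win
    freqs = [(a, 0.0), (0.0, a), (a, a), (a, -a)]
    codes = np.zeros(window.shape, dtype=int)
    for j, (u, v) in enumerate(freqs):
        kern = np.exp(-2j * np.pi * (u * xs + v * ys))      # STFT basis over the window
        F = convolve2d(window, kern, mode="same")
        codes += (F.real > 0).astype(int) * (1 << (2 * j))
        codes += (F.imag > 0).astype(int) * (1 << (2 * j + 1))
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()

def hog_lpq_features(window):
    h = hog(window, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    return np.concatenate([h, lpq_histogram(window)])
```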

Keywords: human motion detection, histograms of oriented gradients, local phase quantization

Procedia PDF Downloads 251
21336 Inversion of Gravity Data for Density Reconstruction

Authors: Arka Roy, Chandra Prakash Dubey

Abstract:

Inverse problems are generally used for recovering hidden information from externally available data. Here, the vertical component of the gravity field is used to calculate the underlying density structure. Ill-posedness is the main obstacle in any inverse problem. Linear regularization using the Tikhonov formulation is applied, with an appropriate choice of SVD and GSVD components. For handling real data, noise levels must be low enough relative to the signal to obtain a reliable solution. In our study, 2D and 3D synthetic models on rectangular grids are used for gravity field calculation and the corresponding inversion for density reconstruction. A fine grid is also considered in order to capture irregular structures. Keeping the algebraic ambiguity factor in mind, the number of observation points should exceed the number of data points. A Picard plot is presented for choosing the appropriate, or main controlling, eigenvalues for a regularized solution. Another important tool is the depth resolution plot (DRP), which is generally used to study how the inversion is influenced by regularization and discretization. We further apply our method to real gravity data from the Vredefort Dome, South Africa. The resulting density structure is in good agreement with the known formations in that region, which provides additional support for our method.
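A minimal sketch of the Tikhonov-regularised SVD solution together with the quantities plotted in a Picard plot; the kernel matrix G, data vector d, and regularisation parameter are assumed inputs, not the paper's actual setup:

```python
# Sketch only: Tikhonov regularisation via the SVD, plus the Picard-plot quantities
# (singular values s_i versus the coefficients |u_i^T d|).
import numpy as np

def tikhonov_svd(G, d, lam):
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    beta = U.T @ d                       # coefficients compared against s in the Picard plot
    f = s**2 / (s**2 + lam**2)           # Tikhonov filter factors
    m = Vt.T @ (f * beta / s)            # regularised density model
    return m, s, np.abs(beta)

# m, s, abs_beta = tikhonov_svd(G, d, lam=1e-2)
# a discrete Picard plot compares s and abs_beta on a log scale to pick the
# singular values that actually control the regularised solution
```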

Keywords: depth resolution plot, gravity inversion, Picard plot, SVD, Tikhonov formulation

Procedia PDF Downloads 203
21335 The Impact of Biodiversity and Urban Ecosystem Services in Real Estate

Authors: Carmen Cantuarias-Villessuzanne, Jeffrey Blain, Radmila Pineau

Abstract:

Our research project aims at analyzing the sensitivity of French households to urban biodiversity and urban ecosystem services (UES). Opinion surveys show that the French population is sensitive to biodiversity and ecosystem services loss, but the value given to these issues within the urban fabric and the real estate market lacks evidence. Using GIS data and economic evaluation by hedonic price methods, we assess the isolated contribution of the explanatory variables of biodiversity and UES to the price of residential real estate. We analyze the variation of the value of three urban ecosystem services - flood control, proximity to green spaces, and refreshment (cooling) - on the price of real estate when a property changes ownership. Our modeling and mapping focus on the price at the IRIS scale (statistical information unit) from 2014 to 2019. The main variables are internal characteristics of housing (area, kind of housing, heating), external characteristics (accessibility and infrastructure; the economic, social, and physical environment such as air pollution and noise), and biodiversity indicators and urban ecosystem services for the Ile-de-France region. Moreover, we compare environmental values on the enhancement of green spaces and their impact on residential choices. These studies are very useful for real estate developers, because they enable them to promote green spaces, and for municipalities that want to become more attractive.
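A minimal sketch of a hedonic price regression of the kind described, with illustrative variable names that are assumptions rather than the study's actual dataset:

```python
# Sketch only: log-price hedonic regression; each coefficient is read as the implicit
# (marginal) price of a housing attribute or urban ecosystem service.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per transaction at the IRIS scale (hypothetical file and columns)
df = pd.read_csv("transactions_iris.csv")
model = smf.ols(
    "np.log(price) ~ area + C(housing_type) + heating + dist_green_space"
    " + flood_risk + cooling_index + air_pollution + noise",
    data=df,
).fit()
print(model.summary())
```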

Keywords: urban ecosystem services, sustainable real estate, urban biodiversity perception, hedonic price, environmental values

Procedia PDF Downloads 128
21334 Stacking Ensemble Approach for Combining Different Methods in Real Estate Prediction

Authors: Sol Girouard, Zona Kostic

Abstract:

A home is often the largest and most expensive purchase a person makes. Whether the decision leads to a successful outcome is determined by a combination of critical factors. In this paper, we propose a method that efficiently handles all the factors in residential real estate and performs predictions over a feature space with high dimensionality while controlling for overfitting. The proposed method is built on gradient descent and boosting algorithms and uses a mixed optimizing technique to improve prediction power. A single model usually cannot handle all the cases; thus, our approach builds multiple models based on different subsets of the predictors. The algorithm was tested on 3 million homes across the U.S., and the experimental results demonstrate the efficiency of this approach by outperforming techniques currently used in forecasting prices. With everyday changes in the real estate market, our proposed algorithm capitalizes on new events, allowing more efficient predictions.
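A minimal sketch of a stacked ensemble combining boosting and gradient-descent-style learners over different predictor subsets; the estimator choices and column groups are assumptions, not the authors' exact configuration:

```python
# Sketch only: stacking several base regressors (each seeing its own feature subset)
# under a linear meta-learner, using scikit-learn.
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor, StackingRegressor
from sklearn.linear_model import RidgeCV, SGDRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

structural = ["sqft", "beds", "baths", "year_built"]            # hypothetical column groups
locational = ["zip_median_income", "school_score", "crime_rate"]

def subset(cols):
    return ColumnTransformer([("keep", "passthrough", cols)])

stack = StackingRegressor(
    estimators=[
        ("gbr", make_pipeline(subset(structural), GradientBoostingRegressor(n_estimators=500))),
        ("rf",  make_pipeline(subset(locational), RandomForestRegressor(n_estimators=300))),
        ("sgd", make_pipeline(StandardScaler(), SGDRegressor(penalty="l2"))),
    ],
    final_estimator=RidgeCV(),
    cv=5,
)
# stack.fit(X_train, y_train); price_hat = stack.predict(X_test)
```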

Keywords: real estate prediction, gradient descent, boosting, ensemble methods, active learning, training

Procedia PDF Downloads 271
21333 Real Estate Rigidities: The Effect of Cash Transactions and the Impact of Demonetisation on Them

Authors: Dishant Shahi, Aradhya Shandilya, Nand Kumar

Abstract:

We study the impact of the black-money component, referred to as the X component in the text, on real estate transactions. The X component not only acts as a friction in transactions but also leads to dysfunction in the real estate capital market. The effect of the component is presented using a model economy that resembles that of India with respect to property deals. The rigidities which hinder smooth transactions in property or land deals are depicted, and their impact on the economy as a whole is modelled. The effect of the subprime crisis (2007) on the Indian housing capital market, and the role which the X component played during it, is also included in one of the sections. Throughout the text, we utilise four-quadrant (4Q) graphs to study the supply and demand causalities involved in commercial real estate. At the end, we include the impact of demonetisation as a move to counter the problem of overvaluation in property assets arising due to the X component. The case of demonetisation, the latest move by the Indian Government to control the huge amount of black money in circulation, is included along with its impact on housing and rents as well as the capital market.

Keywords: X-component, 4Q graph, real estate, capital markets, demonetisation, consumer sentiments

Procedia PDF Downloads 359
21332 Modified Poly (Pyrrole) Film-Based Biosensors for Phenol Detection

Authors: S. Korkut, M. S. Kilic, E. Erhan

Abstract:

In order to detect and quantify the phenolic content of wastewater with biosensors, two working electrodes based on modified poly(pyrrole) films were fabricated. The enzyme horseradish peroxidase was used as the biomolecule of the prepared electrodes. Various phenolics were tested with the biosensors. Phenol detection was realized by electrochemical reduction of the quinones produced by enzymatic activity. Analytical parameters were calculated and the results were compared with each other.

Keywords: carbon nanotube, phenol biosensor, polypyrrole, poly (glutaraldehyde)

Procedia PDF Downloads 411
21331 Eye Tracking: Biometric Evaluations of Instructional Materials for Improved Learning

Authors: Janet Holland

Abstract:

Eye tracking is a powerful way to triangulate multiple data sources for deeper, more complete knowledge of how instructional materials are really being used and of the emotional connections made. Using sensor-based biometrics provides detailed local analysis in real time, expanding our ability to collect science-based data for a more comprehensive understanding of teaching and learning than was previously possible. The knowledge gained will be used to make future improvements to instructional materials, tools, and interactions. The literature was examined and a preliminary pilot test was implemented to develop a methodology for research in Instructional Design and Technology. Eye tracking now offers objective metrics, combined with other biometric data collection and analysis, for a fresh perspective.

Keywords: area of interest, eye tracking, biometrics, fixation, fixation count, fixation sequence, fixation time, gaze points, heat map, saccades, time to first fixation

Procedia PDF Downloads 126
21330 Edge Detection Using Multi-Agent System: Evaluation on Synthetic and Medical MR Images

Authors: A. Nachour, L. Ouzizi, Y. Aoura

Abstract:

Recent developments in multi-agent systems have opened a new research field in image processing. Several algorithms are used simultaneously and improved in different applications, while new methods are investigated. This paper presents a new automatic method for edge detection using several agents and many different actions. The proposed multi-agent system is based on parallel agents that locally perceive their environment, that is to say, pixels and additional environmental information. This environment is built using Vector Field Convolution, which attracts free agents to the edges. Problems of partial or hidden edges and of edge linking are solved through cooperation between agents. The presented method was implemented and evaluated on several synthetic and medical images. The obtained experimental results confirm the efficiency and accuracy of the detected edges.

Keywords: edge detection, medical MR images, multi-agent systems, vector field convolution

Procedia PDF Downloads 385
21329 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (i.e., the major class) heavily exceeds the number of observations of the other class (i.e., the minor class). An overlapped dataset is the case where many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is challenging because this situation degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts - non-overlapping, lightly overlapping, and severely overlapping - and applying a classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
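A minimal sketch of the split-then-classify idea using an SVM margin to label regions of the data space; the thresholds and the use of a plain RBF SVM (rather than the authors' modified SVM and Hausdorff-distance rule) are assumptions:

```python
# Sketch only: partition samples into non-overlapping, lightly overlapping and severely
# overlapping regions by their distance-like score to an SVM decision boundary, then
# fit a separate classifier (or resampling strategy) inside each region.
import numpy as np
from sklearn.svm import SVC

def split_by_overlap(X, y, light=1.0, severe=0.3):
    svm = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
    margin = np.abs(svm.decision_function(X))
    region = np.where(margin >= light, "non-overlap",
                      np.where(margin >= severe, "light", "severe"))
    return svm, region

# svm, region = split_by_overlap(X_train, y_train)
# for name in ("non-overlap", "light", "severe"):
#     idx = region == name
#     # train a region-specific classifier on X_train[idx], y_train[idx]
```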

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 304
21328 Edge Detection and Morphological Image for Estimating Gestational Age Based on Fetus Length Automatically

Authors: Retno Supriyanti, Ahmad Chuzaeri, Yogi Ramadhani, A. Haris Budi Widodo

Abstract:

The use of ultrasonography in the medical world has become very popular, including for the diagnosis of pregnancy. In pregnancy, ultrasonography has many roles, such as checking the position of the fetus, detecting an abnormal pregnancy, estimating fetal age, and others. Unfortunately, all these tasks still require an obstetrician to analyze the images produced by ultrasonography. One of the most striking is the determination of gestational age, which is usually done by obstetricians measuring the length of the fetus manually. In this study, we developed a computer-aided diagnosis system for the determination of gestational age by measuring the length of the fetus automatically using an edge detection method and image morphology. Results showed that the system is sufficiently accurate in determining gestational age based on image processing.
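A minimal sketch of the automatic measurement step with OpenCV; the Canny thresholds, kernel size, largest-contour assumption, and pixel-to-millimetre conversion are all illustrative assumptions:

```python
# Sketch only: edge detection + morphological closing, then measure the longest side of
# the minimum-area rectangle around the largest contour as the fetal length in pixels.
import cv2
import numpy as np

def fetus_length_px(ultrasound_gray):
    edges = cv2.Canny(ultrasound_gray, 50, 150)
    kernel = np.ones((5, 5), np.uint8)
    closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)   # bridge broken edges
    contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    fetus = max(contours, key=cv2.contourArea)                  # assume the largest blob is the fetus
    (_, _), (w, h), _ = cv2.minAreaRect(fetus)
    return max(w, h)

# length_mm = fetus_length_px(img) * mm_per_pixel   # then map the length to gestational age
```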

Keywords: computer aided diagnosis, gestational age, diameter of uterus, length of fetus, edge detection method, image morphology

Procedia PDF Downloads 290
21327 Liver Regeneration of Small in situ Injury

Authors: Ziwei Song, Junjun Fan, Jeremy Teo, Yang Yu, Yukun Ma, Jie Yan, Shupei Mo, Lisa Tucker-Kellogg, Peter So, Hanry Yu

Abstract:

Liver is the center of detoxification and is exposed to toxic metabolites all the time. It is highly regenerative after injury, with the ability to restore itself even after 70% partial hepatectomy. Most previous studies used hepatectomy as the injury model for studying liver regeneration. There is limited understanding of small-scale liver injury, which can be caused by either low-dose drug consumption or routine hepatocyte metabolism. Although these small in situ injuries do not cause immediate symptoms, repeated injuries will lead to aberrant wound healing in the liver. Therefore, the cellular dynamics during liver regeneration are critical for our understanding of the liver regeneration mechanism. We aim to study liver regeneration after small-scale in situ liver injury in transgenic mice with labeled actin (Lifeact-GFP). Previous studies have used liver sections and biopsies, which lack real-time information. In order to trace every individual hepatocyte during the regeneration process, we have developed and optimized an intravital imaging system that allows in vivo imaging of mouse liver for 5 consecutive days, allowing real-time cellular tracking and quantification of hepatocytes. We used femtosecond-laser ablation to make a controlled and repeatable liver injury model, which mimics real-life small in situ liver injury. This injury model is the first of its kind for in vivo study of the liver. We found that small-scale in situ liver injury is repaired by the coordination of hypertrophy and migration of hepatocytes. Hypertrophy is only transient in the initial phase, while migration is the main driving force that completes the regeneration process. From the cellular aspect, the Akt/mTOR pathway is activated immediately after injury, which leads to transient hepatocyte hypertrophy. From the mechano-sensing aspect, the actin cable formed at the apical surface of wound-proximal hepatocytes provides mechanical tension for hepatocyte migration. This study provides important information on both the chemical and mechanical signals that promote liver regeneration after small in situ injury. We conclude that hypertrophy and migration play dominant roles at different stages of liver regeneration.

Keywords: hepatocyte, hypertrophy, intravital imaging, liver regeneration, migration

Procedia PDF Downloads 202
21326 Data Quality Enhancement with String Length Distribution

Authors: Qi Xiu, Hiromu Hota, Yohsuke Ishii, Takuya Oda

Abstract:

Recently, the amount of collectable manufacturing data has been rapidly increasing. At the same time, mega recalls are becoming a serious social problem. Under such circumstances, there is an increasing need to prevent mega recalls through defect analysis, such as root cause analysis and anomaly detection utilizing manufacturing data. However, the time needed to classify strings in manufacturing data by traditional methods is too long to meet the requirement of quick defect analysis. Therefore, we present the String Length Distribution Classification (SLDC) method to correctly classify strings in a short time. This method learns character features, especially the string length distribution, from Product IDs and Machine IDs in BOMs and asset lists. By applying the proposal to strings in actual manufacturing data, we verified that the classification time of strings can be reduced by 80%. As a result, it can be expected that the requirement of quick defect analysis can be fulfilled.
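A minimal sketch of classifying a field by its string length distribution; the reference classes and distance measure are illustrative assumptions:

```python
# Sketch only: build a normalised length histogram for a list of strings and assign the
# field to the reference class (e.g. Product ID, Machine ID) with the closest histogram.
import numpy as np

def length_histogram(strings, max_len=32):
    hist = np.zeros(max_len + 1)
    for s in strings:
        hist[min(len(s), max_len)] += 1
    return hist / hist.sum()

def classify_field(values, references):
    """references: dict mapping class name -> reference length histogram."""
    h = length_histogram(values)
    return min(references, key=lambda name: np.abs(h - references[name]).sum())

# references = {"product_id": length_histogram(known_product_ids),
#               "machine_id": length_histogram(known_machine_ids)}
# classify_field(unlabelled_column, references)
```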

Keywords: string classification, data quality, feature selection, probability distribution, string length

Procedia PDF Downloads 312
21325 Detecting Characters as Objects Towards Character Recognition on Licence Plates

Authors: Alden Boby, Dane Brown, James Connan

Abstract:

Character recognition is a well-researched topic across disciplines. Regardless, creating a solution that can cater to multiple situations is still challenging. Vehicle licence plates lack an international standard, meaning that different countries and regions have their own licence plate formats. A problem that arises from this is that the typefaces and designs from different regions make it difficult to create a solution that can cater to a wide range of licence plates. The main issue concerning detection is the character recognition stage. This paper aims to create an object-detection-based character recognition model trained on a custom dataset that consists of typefaces of licence plates from various regions. Given that characters have features that are consistently maintained across an array of fonts, YOLO can be trained to recognise characters based on these features, which may provide better performance than OCR methods such as Tesseract OCR.
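A minimal sketch of treating characters as detection targets with a YOLO model; the ultralytics package, the dataset YAML, the weights file, and the left-to-right reading rule are assumptions, not the paper's exact setup:

```python
# Sketch only: train a YOLO detector on character classes (A-Z, 0-9) annotated on
# multi-region licence plates, then read a plate by sorting detections left to right.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")                        # generic pre-trained weights
model.train(data="plate_chars.yaml", epochs=100)  # hypothetical dataset with 36 classes

result = model("plate.jpg")[0]
boxes = sorted(zip(result.boxes.xyxy.tolist(), result.boxes.cls.tolist()),
               key=lambda b: b[0][0])             # order detections by x-coordinate
plate_text = "".join(result.names[int(cls)] for _, cls in boxes)
print(plate_text)
```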

Keywords: computer vision, character recognition, licence plate recognition, object detection

Procedia PDF Downloads 115
21324 Real, Ideal, or False Self-Presentation among Young Adult and Middle Adult Facebook Users

Authors: Maria Joan Grafil, Hannah Wendam, Christine Joyce Yu

Abstract:

The use of social networking sites has become a big part of most people's lives. One of the most popular of these is Facebook, whose users range from young adults to older adults. While it is more popular among emerging and young adults, this social networking site gives people opportunities to express the self. Via Facebook, people have the opportunity to think about what they prefer to show others. This study identified which among the multiple facets of the self (real self, false self or ideal self) is dominantly presented by young adults and middle adults when using the social networking site Facebook. South Metro Manila was the locale of this study: the 100 young adult participants (aged 18-25) were students from nearby universities, and the 100 middle adult participants (aged 35-45) were working residents within the area. Participants comprised 53% females and 47% males. The data were gathered using a self-report questionnaire to determine which online self-presentation (real, false, or ideal self-presentation) the participants engage in to a greater extent on Facebook. Using a comparison of means, results showed that both young adults and middle adults engaged primarily in real self-presentation.

Keywords: false self, ideal self, middle adult, real self, self presentation, young adult

Procedia PDF Downloads 283
21323 A Comprehensive Survey on Machine Learning Techniques and User Authentication Approaches for Credit Card Fraud Detection

Authors: Niloofar Yousefi, Marie Alaghband, Ivan Garibay

Abstract:

With the increase in credit card usage, the volume of credit card misuse has also significantly increased, which may cause appreciable financial losses for both credit card holders and the financial organizations issuing credit cards. As a result, financial organizations are working hard on developing and deploying credit card fraud detection methods, in order to adapt to ever-evolving, increasingly sophisticated defrauding strategies and to identify illicit transactions as quickly as possible to protect themselves and their customers. Compounding the complex nature of such adverse strategies, credit card fraudulent activities are rare events compared to the number of legitimate transactions. Hence, the challenge of developing fraud detection methods that are accurate and efficient is substantially intensified and, as a consequence, credit card fraud detection has lately become a very active area of research. In this work, we provide a survey of current techniques most relevant to the problem of credit card fraud detection. We carry out our survey in two main parts. In the first part, we focus on studies utilizing classical machine learning models, which mostly employ traditional transactional features to make fraud predictions. These models typically rely on some static physical characteristics, such as what the user knows (knowledge-based methods) or what he/she has access to (object-based methods). In the second part of our survey, we review more advanced techniques of user authentication, which use behavioral biometrics to identify an individual based on his/her unique behavior while he/she is interacting with his/her electronic devices. These approaches rely on how people behave (instead of what they do), which cannot be easily forged. By providing an overview of current approaches and the results reported in the literature, this survey aims to drive the future research agenda for the community in order to develop more accurate, reliable and scalable models of credit card fraud detection.

Keywords: Credit Card Fraud Detection, User Authentication, Behavioral Biometrics, Machine Learning, Literature Survey

Procedia PDF Downloads 109
21322 Project Time and Quality Management during Construction

Authors: Nahed Al-Hajeri

Abstract:

Time and cost are integral parts of every construction plan and can affect each party's contractual obligations. The performance of both time and cost is usually important to the client and contractor during the project. Almost all construction projects experience time overruns, and these overruns are always costly to both client and contractor. Construction of any project inside the gathering centers involves complex management skills related to the workforce, materials, plant, machinery, new technologies, etc. It also involves many interdependent agencies, such as the vendors and the structural and functional designers, including various types of specialized engineers, as well as the support of contractors and specialized contractors. This paper mainly highlights the types of construction delays due to which projects suffer time and cost overruns. It also discusses the delay causes and factors that contribute to construction sequence delays in oil and gas projects. Construction delay is one of the recurring problems in construction projects, and it has an adverse effect on project success in terms of time, cost and quality. Some effective methods are identified to minimize delays in construction projects, such as: 1. site management and supervision, 2. effective strategic planning, 3. clear information and communication channels. Our research paper studies the types of delay with real examples and statistical results and suggests solutions to overcome this problem.

Keywords: non-compensable delay, delays caused by force majeure, compensable delay, delays caused by the owner or the owner’s representative, non-excusable delay, delay caused by the contractor or the contractor’s representative, concurrent delay, delays resulting from two separate causes at the same time

Procedia PDF Downloads 238
21321 A Double Acceptance Sampling Plan for Truncated Life Test Having Exponentiated Transmuted Weibull Distribution

Authors: A. D. Abdellatif, A. N. Ahmed, M. E. Abdelaziz

Abstract:

The main purpose of this paper is to design a double acceptance sampling plan under a time-truncated life test when the product lifetime follows an exponentiated transmuted Weibull distribution. The motive is to meet both the consumer's risk and the producer's risk simultaneously at the specified quality levels, while the termination time is specified. A comparison between the results of the double and single acceptance sampling plans is conducted. We demonstrate the applicability of our results to real data sets.
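A minimal numerical sketch of the acceptance probability for a double plan under a truncated life test; a plain two-parameter Weibull is used here as a stand-in for the exponentiated transmuted Weibull (an assumption), and the plan parameters are illustrative:

```python
# Sketch only: probability of acceptance under a double sampling plan (n1, c1, n2, c2)
# when each item fails before the truncation time t0 with probability p.
import numpy as np
from scipy.stats import binom

def p_fail(t0, shape, scale):
    """Failure probability before t0 for a two-parameter Weibull lifetime (stand-in CDF)."""
    return 1.0 - np.exp(-(t0 / scale) ** shape)

def accept_prob(p, n1, c1, n2, c2):
    """Accept if d1 <= c1; reject if d1 > c2; otherwise draw n2 more and accept if d1 + d2 <= c2."""
    pa = binom.cdf(c1, n1, p)
    for d1 in range(c1 + 1, c2 + 1):
        pa += binom.pmf(d1, n1, p) * binom.cdf(c2 - d1, n2, p)
    return pa

# accept_prob(p_fail(t0=1.0, shape=2.0, scale=3.0), n1=20, c1=1, n2=20, c2=3)
```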

Keywords: double sampling plan, single sampling plan, producer’s risk, consumer’s risk, exponentiated transmuted Weibull distribution, time truncated experiment, Marshall-Olkin

Procedia PDF Downloads 482
21320 Recognition of Early Enterococcus Faecalis through Image Treatment by Using Octave

Authors: Laura Victoria Vigoya Morales, David Rolando Suarez Mora

Abstract:

The problem of detecting Enterococcus faecalis is receiving considerable attention with the new cases of beachgoers infected with the bacterium, which can be found in fecal matter. The detection process for this kind of bacteria takes a long time, which wastes time and money as a result of closing recreational places like beaches or pools. Hence, new methods for automating the detection and recognition of this bacterium have become a challenge. This article describes a novel approach to detect Enterococcus faecalis in water by using an Octave algorithm, which embodies a neural network. This document shows results on the performance, quality and integrity of the algorithm.

Keywords: Enterococcus faecalis, image treatment, Octave, neural network

Procedia PDF Downloads 218
21319 Electrochemical Study of Interaction of Thiol Containing Proteins with As (III)

Authors: Sunil Mittal, Sukhpreet Singh, Hardeep Kaur

Abstract:

The affinity of the thiol group for heavy metals is a well-established phenomenon. The present investigation focuses on the electrochemical response of cysteine and thioredoxin against arsenite (As III) on indium tin oxide (ITO) electrodes. It was observed that both compounds produce distinct responses in free and immobilised forms at the electrode. SEM, FTIR, and impedance studies of the modified electrode were conducted for characterization. Various parameters were optimized to characterize the effect of As (III) on the reduction potential of the compounds. Cyclic voltammetry and linear sweep voltammetry were employed as the analysis techniques. The optimum response was observed at neutral pH in both cases, at optimum concentrations of 2 mM and 4.27 µM for cysteine and thioredoxin, respectively. It was observed that the presence of As (III) increases the reduction current of both moieties. The linear range of detection for As (III) with cysteine was from 1 to 10 mg L⁻¹, with a detection limit of 0.8 mg L⁻¹. Thioredoxin was found to be more sensitive to As (III) and displayed a linear range from 0.1 to 1 mg L⁻¹, with a detection limit of 10 µg L⁻¹.

Keywords: arsenite, cyclic voltammetry, cysteine, thioredoxin

Procedia PDF Downloads 206
21318 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. In this paper, a probability-based damage detection (PBDD) procedure based on model updating is presented, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multi-agent algorithm based on the movement of ideal gas molecules, which disperse rapidly in different directions and cover all the available space; this behavior stems from the high speed of the molecules and the collisions between them and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, the initial population of gas molecules is randomly generated, and the governing equations related to the velocity of the gas molecules and the collisions between them are utilized. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is applied to two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in PBDD of structures.

Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification

Procedia PDF Downloads 274
21317 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies

Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan

Abstract:

The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, especially if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor collecting spatial, descriptive, and multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges in developing countries for tourists trying to make the best decision about the time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist guide in which tourists can search for places of interest based on their requested time of travel. To design the service, a three-tier architecture, comprising data, logical processing, and presentation tiers, has been utilized. For implementing the service, open-source software programs, client- and server-side programming technologies (such as OpenLayers2, AJAX, and PHP), Geoserver as a map server, and the Web Feature Service (WFS) standard have been used. The result is two distinct browser-based services: one for submitting spatial, descriptive, and multimedia volunteer data, and another for tourists and local officials. Local officials confirm the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine has been designed to enable tourists to find a tourist place based on province, city, and location at a specific time of interest. Implementing the tourist-guide service with this methodology has the following effects: current tourists participate in a free data collection and sharing process for future tourists; data are shared and accessed in real time by all; blind selection of a travel destination is avoided; and, significantly, the cost of providing such services decreases.

Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping

Procedia PDF Downloads 89
21316 R Software for Parameter Estimation of Spatio-Temporal Model

Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

In this paper, we propose an application package to estimate the parameters of a spatiotemporal model based on multivariate time series analysis, using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary per location. We use the method of Ordinary Least Squares (OLS) and the Mean Absolute Percentage Error (MAPE) to fit the model to real spatiotemporal phenomena. As case studies, we use oil production data from the volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. The R software is very user-friendly; it makes calculations easier and data processing faster and more accurate. A limitation is that the R script built for estimating the parameters of the spatiotemporal GSTAR model is still restricted to stationary time series models. Therefore, the R program under Windows can be developed further for both theoretical studies and applications.
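A minimal sketch of OLS estimation for a GSTAR(1;1) model with location-varying parameters, plus MAPE; it is written in Python (numpy) for consistency with the other examples here rather than in the authors' R package, and the weight matrix W and series Z are assumed inputs:

```python
# Sketch only: GSTAR(1;1): z_t = Phi0 z_{t-1} + Phi1 (W z_{t-1}) + e_t, where Phi0 and
# Phi1 are diagonal, so each location i reduces to a small OLS problem with two regressors.
import numpy as np

def gstar11_ols(Z, W):
    """Z: T x N matrix of observations, W: N x N row-standardised spatial weights."""
    T, N = Z.shape
    phi0, phi1 = np.zeros(N), np.zeros(N)
    V = Z[:-1] @ W.T                                  # spatially lagged series
    for i in range(N):
        X = np.column_stack([Z[:-1, i], V[:, i]])
        phi0[i], phi1[i] = np.linalg.lstsq(X, Z[1:, i], rcond=None)[0]
    return phi0, phi1

def mape(actual, pred):
    return 100.0 * np.mean(np.abs((actual - pred) / actual))

# phi0, phi1 = gstar11_ols(Z, W)
# fitted = Z[:-1] * phi0 + (Z[:-1] @ W.T) * phi1      # one-step-ahead fit
# print(mape(Z[1:], fitted))
```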

Keywords: GSTAR Model, MAPE, OLS method, oil production, R software

Procedia PDF Downloads 235
21315 Role of ABC-Type Efflux Transporters in Antifungal Resistance of Candida auris

Authors: Mohamed Mahdi Alshahni, Takashi Tamura, Koichi Makimura

Abstract:

Objective: The objective of this study is to evaluate the roles of ABC-type efflux transporters in the resistance of Candida auris against common antifungal agents. Material and Methods: A wild-type C. auris strain and its antifungal-resistant derivative strain, generated through induction by antifungal agents, were used in this study. The strains were cultured on media containing beauvericin alone or in combination with azole agents. Moreover, the expression levels of four ABC-type transporter homologs in those strains were analyzed by real-time PCR with or without antifungal stress by fluconazole or voriconazole. Results: The addition of beauvericin helped to partially restore the susceptibility of the resistant strain to fluconazole, suggesting the participation of ABC-type transporters in the resistance mechanism. Real-time PCR results showed that the mRNA levels of three out of the four analyzed transporters in the resistant strain were more than 2-fold higher than their counterparts in the wild-type strain under both negative-control and antifungal-agent-containing conditions. Conclusion: C. auris is an emerging multidrug-resistant pathogen causing human mortality worldwide. Providing effective treatment has been hampered by resistance to antifungal drugs, demanding an understanding of the resistance mechanism in order to devise new therapeutic strategies. Our data suggest a partial contribution of ABC-type transporters to the resistance of this pathogen.

Keywords: resistance, C. auris, transporters, antifungals

Procedia PDF Downloads 161
21314 Power Grid Line Ampacity Forecasting Based on a Long-Short-Term Memory Neural Network

Authors: Xiang-Yao Zheng, Jen-Cheng Wang, Joe-Air Jiang

Abstract:

Improving line ampacity while using existing power grids is an important issue that electricity dispatchers are now facing. Using the information provided by the dynamic thermal rating (DTR) of transmission lines, an overhead power grid can operate safely. However, dispatchers usually lack real-time DTR information. Thus, this study proposes a method based on a long-short-term memory (LSTM) neural network. The LSTM-based method predicts the DTR of lines using weather data provided by the Central Weather Bureau (CWB) of Taiwan. The possible thermal bottlenecks at different locations along the line and the margin of line ampacity can be determined in real time by the proposed LSTM-based prediction method. A case study targeting the 345 kV power grid of Taipower in Taiwan is used to examine the performance of the proposed method. The simulation results show that the proposed method is useful for providing information for future smart grid applications.
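A minimal sketch of an LSTM forecaster of this kind in Keras; the window length, the set of weather features, and the network size are assumptions:

```python
# Sketch only: map a window of recent weather observations to the next line ampacity (DTR).
import tensorflow as tf

WINDOW, N_FEATURES = 24, 5   # e.g. 24 hourly steps of temperature, wind speed/direction, irradiance, humidity

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),        # predicted ampacity (A)
])
model.compile(optimizer="adam", loss="mae")

# X: (samples, WINDOW, N_FEATURES) weather windows, y: (samples,) rated/measured ampacity
# model.fit(X_train, y_train, validation_split=0.2, epochs=50)
# next_ampacity = model.predict(X_latest)
```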

Keywords: electricity dispatch, line ampacity prediction, dynamic thermal rating, long-short-term memory neural network, smart grid

Procedia PDF Downloads 278
21313 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection

Authors: Hamidullah Binol, Abdullah Bal

Abstract:

Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of the sugar distribution of melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. This technique yields exceptional detection capability, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, which is the nonlinear version of the FKT, is one of the most effective techniques for solving problems with a two-pattern (two-class) nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The proposed approach aims to segment the fat content of the ground meat by regarding the fat as the target class, which is to be separated from the remaining classes (treated as clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
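A minimal sketch of the linear Fukunaga-Koontz transform for a two-class (target vs. clutter) problem; the paper's kernel version would first map the pixel spectra through a kernel, which is omitted here as a simplification:

```python
# Sketch only: linear FKT. After whitening R1 + R2, both class correlation matrices share
# eigenvectors, and eigenvalues near 1 (resp. 0) best represent the target (resp. clutter).
import numpy as np

def fkt(X_target, X_clutter, eps=1e-10):
    """Rows of X_target / X_clutter are pixel spectra of the fat and non-fat classes."""
    R1 = X_target.T @ X_target / len(X_target)
    R2 = X_clutter.T @ X_clutter / len(X_clutter)
    lam, Phi = np.linalg.eigh(R1 + R2)
    P = Phi / np.sqrt(lam + eps)            # whitening operator: P.T @ (R1 + R2) @ P = I
    w, V = np.linalg.eigh(P.T @ R1 @ P)     # shared eigenvectors; clutter eigenvalues are 1 - w
    return P @ V, w

# basis, w = fkt(fat_pixels, clutter_pixels)
# scores = np.abs(hsi_pixels @ basis[:, w > 0.9])   # project onto target-dominant directions
```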

Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods

Procedia PDF Downloads 427
21312 Ordered Mesoporous WO₃-TiO₂ Nanocomposites for Enhanced Xylene Gas Detection

Authors: Vijay K. Tomer, Ritu Malik, Satya P. Nehra, Anshu Sharma

Abstract:

Highly ordered mesoporous WO₃-TiO₂ nanohybrids with a large intrinsic surface area and highly ordered pore channels were synthesized by a nanocasting strategy using mesoporous silica (KIT-6) as a hard template. The nanohybrid samples were characterized by a variety of physico-chemical techniques, including X-ray diffraction, nitrogen adsorption-desorption isotherms, and high-resolution transmission electron microscopy. The nanohybrids were tested for the detection of important indoor volatile organic compounds (VOCs), including acetone, ethanol, n-butanol, toluene, and xylene. The sensing results illustrate that the nanocomposite sensor was highly responsive towards xylene gas at a relatively low operating temperature. Rapid response and recovery times, a highly linear response, and excellent stability were observed for xylene gas in the concentration range from 1 to 100 ppm. It is believed that the promising results of this study can be utilized in the synthesis of ordered mesoporous nanostructures, whose configuration can be extended to the development of new-age e-nose-type sensors with enhanced gas-sensing performance.

Keywords: nanohybrids, response, sensor, VOCs, xylene

Procedia PDF Downloads 319
21311 Failure Analysis Using RTDS for a Power System Equipped with Thyristor-Controlled Series Capacitor in Korea

Authors: Chur Hee Lee, Jae in Lee, Minh Chau Diah, Jong Su Yoon, Seung Wan Kim

Abstract:

This paper presents a Real Time Digital Simulator (RTDS) analysis of the effects of transmission line failures in a power system equipped with a Thyristor-Controlled Series Capacitor (TCSC) in Korea. The TCSC is applied for the first time in Korea to compensate real power in case of 765 kV line faults. Therefore, it is important to analyze the system with a TCSC replica using the RTDS. In this test, all systems in Korea, other than those near the TCSC, were reduced to a Thevenin equivalent. The replica was tested in the cases of a line failure near the TCSC, a generator failure, and a 765 kV line failure. The effects of a conventionally operated STATCOM, SVC and TCSC were also analyzed. The test results will be used for the operational impact analysis of the actual TCSC.

Keywords: failure analysis, power system, RTDS, TCSC

Procedia PDF Downloads 115