Search results for: fault detection and recovery
4846 Modern Spectrum Sensing Techniques for Cognitive Radio Networks: Practical Implementation and Performance Evaluation
Authors: Antoni Ivanov, Nikolay Dandanov, Nicole Christoff, Vladimir Poulkov
Abstract:
Spectrum underutilization has made cognitive radio a promising technology for both current and future telecommunications. This is due to its ability to exploit unused spectrum in bands dedicated to other wireless communication systems and thus increase their occupancy. The essential function that allows a cognitive radio device to perceive the occupancy of the spectrum is spectrum sensing. In this paper, the performance of modern adaptations of the four most widely used spectrum sensing techniques, namely energy detection (ED), cyclostationary feature detection (CSFD), matched filter (MF) and eigenvalue-based detection (EBD), is compared. The implementation has been accomplished through the PlutoSDR hardware platform and the GNU Radio software package in very low Signal-to-Noise Ratio (SNR) conditions. The optimal detection performance of the examined methods in a realistic, implementation-oriented model is found for the common relevant parameters (number of observed samples, sensing time and required probability of false alarm).
Keywords: cognitive radio, dynamic spectrum access, GNU Radio, spectrum sensing
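The energy detection (ED) branch of such a comparison can be illustrated with a minimal sketch: average the energy of N observed samples and compare it against a threshold calibrated on noise-only statistics to meet a required probability of false alarm. The function names, the empirical calibration, and all parameter values below are illustrative assumptions, not the paper's actual implementation.

```python
import random

def detect(samples, threshold):
    """Energy detector: declare the channel occupied when the
    average per-sample energy exceeds the threshold."""
    energy = sum(x * x for x in samples) / len(samples)
    return energy > threshold

def calibrate_threshold(noise_std, n_samples, p_fa, trials=2000, seed=1):
    """Set the decision threshold empirically as the (1 - p_fa)
    quantile of noise-only test statistics, so that noise alone
    triggers a detection with probability roughly p_fa."""
    rng = random.Random(seed)
    stats = []
    for _ in range(trials):
        noise = [rng.gauss(0.0, noise_std) for _ in range(n_samples)]
        stats.append(sum(x * x for x in noise) / n_samples)
    stats.sort()
    return stats[int((1.0 - p_fa) * trials) - 1]

thr = calibrate_threshold(noise_std=1.0, n_samples=500, p_fa=0.1)
rng = random.Random(0)
# Signal-plus-noise at 0 dB SNR: per-sample energy is roughly doubled,
# so the detector should flag the channel as occupied.
occupied = [rng.gauss(0.0, 1.0) + rng.gauss(0.0, 1.0) for _ in range(500)]
print(detect(occupied, thr))  # → True
```

Increasing the number of observed samples tightens the noise-energy distribution and lets the detector work at lower SNR for the same false-alarm target, which is the trade-off the abstract's parameters (sample count, sensing time, required P_fa) capture.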
Procedia PDF Downloads 251
4845 Cracks Detection and Measurement Using VLP-16 LiDAR and Intel Depth Camera D435 in Real-Time
Authors: Xinwen Zhu, Xingguang Li, Sun Yi
Abstract:
Cracks are among the most common forms of damage in buildings, bridges, roads and other structures, and they may pose safety hazards. Cracks occur in structures of various materials, and traditional methods of manual detection and measurement, known to be subjective, time-consuming and labor-intensive, are increasingly unable to meet the needs of modern development. In addition, crack detection and measurement must be performed safely given space limitations and hazards, so intelligent crack detection has become a necessary research topic. In this paper, an efficient method for crack detection and quantification using a 3D sensor, a LiDAR, and a depth camera is proposed. The method works even in a dark environment, which is common in real-world applications. The LiDAR spins rapidly to scan the surrounding environment, firing lasers thousands of times per second and providing a rich 3D point cloud in real time. The LiDAR gives fairly accurate depth information: the distance of each point can be determined to within about ±3 cm, and the top-range models can see farther than 100 m. However, this accuracy is still insufficient for high-precision structures, so a depth camera is used to measure crack depth more accurately; the cracks are scanned by the depth camera at the same time. Finally, all data from the LiDAR and the depth camera are analyzed, and the size of the cracks can be quantified successfully. The comparison shows that the minimum and mean absolute percentage errors between measured and calculated width are about 2.22% and 6.27%, respectively. The experiments and results are presented in this paper.
Keywords: LiDAR, depth camera, real-time, detection and measurement
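The reported minimum and mean absolute percentage errors can, in principle, be computed with a small helper like the one below; the crack widths used here are hypothetical placeholders, not the paper's data.

```python
def mape(measured, calculated):
    """Per-sample absolute percentage errors between measured and
    calculated widths; returns the minimum and the mean."""
    errs = [abs(m - c) / m * 100.0 for m, c in zip(measured, calculated)]
    return min(errs), sum(errs) / len(errs)

# Hypothetical crack widths in millimetres (not the paper's data).
measured = [2.0, 3.0, 4.0]
calculated = [1.9, 3.1, 3.8]
mn, mean = mape(measured, calculated)
print(round(mn, 2), round(mean, 2))  # → 3.33 4.44
```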
Procedia PDF Downloads 235
4844 RGB Color Based Real Time Traffic Sign Detection and Feature Extraction System
Authors: Kay Thinzar Phu, Lwin Lwin Oo
Abstract:
In intelligent transport systems and advanced driver assistance systems, the development of a real-time traffic sign detection and recognition (TSDR) system plays an important part in current research. Developing a real-time TSDR system presents many challenges due to motion artifacts, variable lighting and weather conditions, and the varied situations in which traffic signs appear. Researchers have already proposed various methods to mitigate these challenges. The aim of the proposed research is to develop an efficient and effective TSDR system in real time. The system uses an adaptive thresholding method based on RGB color for traffic sign detection and new features for traffic sign recognition. RGB color thresholding is used to detect the regions of blue and yellow traffic signs, and shape identification is then performed to decide whether a candidate region is a traffic sign or not. Lastly, new features such as termination points, bifurcation points and 90° angles are extracted from the validated image. The system uses the Myanmar Traffic Sign dataset.
Keywords: adaptive thresholding based on RGB color, blue color detection, feature extraction, yellow color detection
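The color-thresholding step can be sketched as a simple per-pixel rule: mark a pixel as a blue-sign candidate when the blue channel is bright and clearly dominates red and green. The dominance ratio and brightness floor below are illustrative constants, not the thresholds from the paper (which are adaptive).

```python
def is_blue_sign_pixel(r, g, b, dominance=1.5, min_value=60):
    """Heuristic fixed-threshold RGB rule (illustrative, not the
    paper's adaptive method): blue must be bright and dominate
    the other two channels by a ratio."""
    return b >= min_value and b >= dominance * r and b >= dominance * g

def candidate_mask(image):
    """Binary mask of candidate blue-sign pixels for a row-major
    list of (r, g, b) tuples."""
    return [[1 if is_blue_sign_pixel(*px) else 0 for px in row]
            for row in image]

img = [[(200, 30, 40), (20, 30, 160)],
       [(15, 25, 150), (90, 90, 95)]]
print(candidate_mask(img))  # → [[0, 1], [1, 0]]
```

An analogous rule with red and green jointly high would flag yellow-sign candidates; the shape-identification stage then filters the surviving regions.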
Procedia PDF Downloads 316
4843 Innocence Compensation: Motions to Strike and Dismiss to Forestall Financial Recovery
Authors: Myles Frederick McLellan
Abstract:
When errors in the criminal justice process lead to wrongful convictions and miscarriages of justice, it falls upon the State to make reparation for the egregious harms brought to innocent individuals. Of all the remedies available to seek compensation, private and public law litigation against the police and prosecution services is the most widely used. Unfortunately, all levels of court, including the Supreme Court of Canada, have explicitly endorsed the prospect of striking out or dismissing these claims at the outset on an expedited basis. The burden on agents of the State, as defendants, to succeed on motions for such relief is so low that very few actions will survive to give an innocent accused his or her day in court. This paper presents a quantitative and qualitative analysis of the occurrence and success of motions to strike and dismiss to forestall financial recovery for the damage caused when a criminal investigation and prosecution goes wrong. The paper also includes a comparative component on private law systems at common law (e.g., the USA, UK, Australia and New Zealand) with respect to the availability of a similar process to pre-emptively terminate litigation for the recovery of compensation by an innocent individual.
Keywords: compensation, innocence, miscarriages of justice, wrongful convictions
Procedia PDF Downloads 144
4842 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Early detection of incipient abnormal operations is quite necessary for plant-wide process management in order to improve product quality and process safety, and generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Unfortunately, such big data include measurement noise and unwanted variations unrelated to true process behavior, so the elimination of these unnecessary patterns is executed in a data processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance were improved significantly, irrespective of the size and location of abnormal events.
Keywords: detection, monitoring, process data, noise
Procedia PDF Downloads 255
4841 Traffic Light Detection Using Image Segmentation
Authors: Vaishnavi Shivde, Shrishti Sinha, Trapti Mishra
Abstract:
Traffic light detection from a moving vehicle is an important technology both for driver safety assistance functions and for autonomous driving in the city. This paper proposes a deep-learning-based traffic light recognition method that consists of a pixel-wise image segmentation technique and a fully convolutional network, the UNET architecture. A method for detecting the position and recognizing the state of traffic lights in video sequences is presented and evaluated using the Traffic Light Dataset, which contains masked traffic light image data. The first stage is detection, accomplished through image processing (image segmentation) techniques such as image cropping, color transformation and segmentation of possible traffic lights. The second stage is recognition, i.e., identifying the color (state) of the traffic light, which is achieved using a convolutional neural network (the UNET architecture).
Keywords: traffic light detection, image segmentation, machine learning, classification, convolutional neural networks
Procedia PDF Downloads 182
4840 CE Method for Development of Japan's Stochastic Earthquake Catalogue
Authors: Babak Kamrani, Nozar Kishi
Abstract:
A stochastic catalog represents the events module of an earthquake loss estimation model. It includes a series of events with different magnitudes and corresponding frequencies/probabilities. To develop a stochastic catalog, random or uniform sampling methods are used to sample events from the seismicity model; covering the full Magnitude Frequency Distribution (MFD) with these methods requires generating a huge number of events. The Characteristic Event (CE) method instead chooses events according to the interests of the insurance industry. We divide the MFD of each source into bins chosen according to the probabilities of interest to the insurance industry. First, we collected information on the available seismic sources, divided into fault sources, subduction sources, and events without a specific fault source. We developed the MFD for each individual and areal source based on the seismicity of the sources. Afterward, we calculated the CE magnitudes based on the desired probability. To develop the stochastic catalog, we also introduced uncertainty in the locations of the events.
Keywords: stochastic catalogue, earthquake loss, uncertainty, characteristic event
Procedia PDF Downloads 302
4839 Feature Based Unsupervised Intrusion Detection
Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein
Abstract:
The goal of a network-based intrusion detection system is to classify network traffic activities into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS) using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps improve the efficiency, performance and prediction rate of the proposed approach. This paper applies the unsupervised K-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we used the new NSL-KDD dataset, a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (Normal, Attack) was implemented. The Weka framework, a Java-based open-source collection of machine learning algorithms for data mining tasks, was used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time in comparison with using the full feature set of the dataset with the same algorithm.
Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka
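The information-gain ranking used here for feature selection reduces to a short computation: IG(F) = H(labels) − Σ_v p(F=v)·H(labels | F=v), and features with the highest gain are kept. The sketch below is a minimal, generic illustration of that formula, not the Weka implementation used in the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(F) = H(labels) - sum_v p(F=v) * H(labels | F=v)."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [lab for f, lab in zip(feature_values, labels) if f == v]
        cond += (len(subset) / n) * entropy(subset)
    return entropy(labels) - cond

labels = ["normal", "normal", "attack", "attack"]
informative = ["low", "low", "high", "high"]   # perfectly predicts the label
noisy = ["a", "b", "a", "b"]                   # independent of the label
print(information_gain(informative, labels))   # → 1.0
print(information_gain(noisy, labels))         # → 0.0
```

Ranking every feature by this score and keeping only the top few is what shrinks the NSL-KDD feature set before clustering, which is where the reported speed-up comes from.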
Procedia PDF Downloads 299
4838 Anomaly Detection Based on System Log Data
Authors: M. Kamel, A. Hoayek, M. Batton-Hubert
Abstract:
With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information for network performance. We introduce an algorithm used as a pipeline to help with the pretreatment of such data, group it into patterns, and dynamically label each pattern as an anomaly or not. Such tools will provide users and experts with continuous, real-time log monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
Keywords: logs, anomaly detection, ML, scoring, NLP
Procedia PDF Downloads 100
4837 Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition
Authors: A. Shoiynbek, K. Kozhakhmet, P. Menezes, D. Kuanyshbay, D. Bayazitov
Abstract:
Speech emotion recognition has received increasing research interest in recent years. Most research work has used emotional speech collected under controlled conditions, recorded by actors imitating and artificially producing emotions in front of a microphone. There are four issues with that approach: (1) the emotions are not natural, which means machines learn to recognize fake emotions; (2) the emotions are very limited in quantity and poor in their variety of speaking; (3) speech emotion recognition (SER) is language-dependent; and (4) consequently, each time researchers want to start work on SER, they need to find a good emotional database for their language. In this paper, we propose an approach to creating an automatic tool for speech emotion extraction based on facial emotion recognition, and we describe the sequence of actions of the proposed approach. One of the first objectives in this sequence is speech detection. The paper gives a detailed description of a speech detection model based on a fully connected deep neural network for the Kazakh and Russian languages. Despite the high speech detection results for Kazakh and Russian, the described process is suitable for any language. To illustrate the working capacity of the developed model, we have performed an analysis of speech detection and extraction on real tasks.
Keywords: deep neural networks, speech detection, speech emotion recognition, Mel-frequency cepstrum coefficients, collecting speech emotion corpus, collecting speech emotion dataset, Kazakh speech dataset
Procedia PDF Downloads 105
4836 Microwave Tomography: The Analytical Treatment for Detecting Malignant Tumor Inside Human Body
Authors: Muhammad Hassan Khalil, Xu Jiadong
Abstract:
Early detection through screening is the best tool, short of a perfect treatment, against malignant tumors inside the breast. By detecting cancer in its early stages, it can be recognized and treated before it has the opportunity to spread and become dangerous. Microwave tomography is a new imaging method based on contrast in the dielectric properties of materials; its mathematical theory involves solving an inverse problem for Maxwell's equations. In this paper, we present an antenna designed for breast cancer detection, to be used in a microwave tomography configuration.
Keywords: microwave imaging, inverse scattering, breast cancer, malignant tumor detection
Procedia PDF Downloads 375
4835 Comparing Nonverbal Deception Detection of Police Officers and Human Resources Students in the Czech Republic
Authors: Lenka Mynaříková, Hedvika Boukalová
Abstract:
The study looks at the ability to detect nonverbal deception among police officers and management students in the Czech Republic. Respondents from police departments (n=197) and university students of human resources (n=161) completed a deception detection task and evaluated the veracity of suspects' statements in 21 video clips from real crime investigations. Their evaluations were based on nonverbal behavior: the voices in the video clips were modified so that words were not recognizable, yet paraverbal voice characteristics were preserved. Results suggest that respondents exhibit a lie bias that depends on their profession. In the evaluation of the video clips, stereotypes also played a significant role: the statements of suspects of a different ethnicity, younger age or specific visual features were considered deceitful more often. The research may be beneficial for training in professions that require deception detection techniques.
Keywords: deception detection, police officers, human resources, forensic psychology, forensic studies, organizational psychology
Procedia PDF Downloads 435
4834 Comparing Community Detection Algorithms in Bipartite Networks
Authors: Ehsan Khademi, Mahdi Jalili
Abstract:
Despite their special features, bipartite networks are common in many systems. Real-world bipartite networks may show community structure, similar to what one finds in one-mode networks; however, the interpretation of community structure in bipartite networks differs from that in one-mode networks. In this manuscript, we compare a number of available methods that are frequently used to discover the community structure of bipartite networks. These methods fall into two broad classes: methods that first transform the network into a one-mode network and then apply community detection algorithms, and algorithms developed specifically for bipartite networks. The algorithms are applied to a model network with prescribed community structure.
Keywords: community detection, bipartite networks, co-clustering, modularity, network projection, complex networks
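The first class of methods relies on one-mode projection: two nodes of the same type are connected whenever they share a neighbour on the other side, and community detection then runs on the projected graph. A minimal sketch of a weighted projection, with hypothetical user/item names for illustration:

```python
from collections import defaultdict
from itertools import combinations

def project(bipartite_edges):
    """One-mode projection of a bipartite edge list: connect two
    top-nodes when they share at least one bottom-node neighbour,
    weighted by the number of shared neighbours."""
    neighbours = defaultdict(set)
    for top, bottom in bipartite_edges:
        neighbours[bottom].add(top)
    weights = defaultdict(int)
    for tops in neighbours.values():
        for a, b in combinations(sorted(tops), 2):
            weights[(a, b)] += 1
    return dict(weights)

# Hypothetical example: users u1..u3 connected to items i1, i2.
edges = [("u1", "i1"), ("u2", "i1"), ("u2", "i2"), ("u3", "i2")]
print(project(edges))  # → {('u1', 'u2'): 1, ('u2', 'u3'): 1}
```

The projection loses information (an item shared by many users generates a dense clique), which is exactly why the second class of algorithms, designed to work on the bipartite structure directly, warrants the comparison the manuscript performs.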
Procedia PDF Downloads 630
4833 Rapid, Label-Free, Direct Detection and Quantification of Escherichia coli Bacteria Using Nonlinear Acoustic Aptasensor
Authors: Shilpa Khobragade, Carlos Da Silva Granja, Niklas Sandström, Igor Efimov, Victor P. Ostanin, Wouter van der Wijngaart, David Klenerman, Sourav K. Ghosh
Abstract:
Rapid, label-free and direct detection of pathogenic bacteria is critical for the prevention of disease outbreaks. This paper, for the first time, probes the nonlinear acoustic response of a quartz crystal resonator (QCR) functionalized with specific DNA aptamers for the direct detection and quantification of viable E. coli KCTC 2571 bacteria. DNA aptamers were immobilized, through biotin-streptavidin conjugation, onto the gold surface of the QCR to capture the target bacteria, and detection was accomplished by the shift in amplitude of the peak 3f signal (3 times the drive frequency) upon binding, when driven near the fundamental resonance frequency. The developed nonlinear acoustic aptasensor system demonstrated better reliability than the conventional resonance frequency shift and energy dissipation monitoring that were recorded simultaneously. The sensing system could directly detect 10⁵ cells/mL of the target bacteria within 30 min or less and had high specificity towards E. coli KCTC 2571 as compared to the same concentration of S. typhi bacteria. The aptasensor response was observed for bacterial suspensions ranging from 10⁵ to 10⁸ cells/mL. Conclusively, this nonlinear acoustic aptasensor is simple to use, gives real-time output, is cost-effective, and has the potential for rapid, specific, label-free direct detection of bacteria.
Keywords: acoustic, aptasensor, detection, nonlinear
Procedia PDF Downloads 571
4832 Analysis of Collision Avoidance System
Authors: N. Gayathri Devi, K. Batri
Abstract:
The advent of technology has increased traffic hazards, and road accidents take place. A collision detection system in an automobile aims at reducing or mitigating the severity of an accident. This project aims at avoiding vehicle head-on collisions by means of a collision detection algorithm. The algorithm predicts the collision, and avoidance or minimization has to be carried out within a few seconds of confirmation. In critical situations, collision minimization is made possible by turning the vehicle to the desired turn radius so that the collision impact can be reduced. In order to avoid the collision completely, the turning of the vehicle should be achieved at reduced speed in order to maintain stability.
Keywords: collision avoidance system, time to collision, time to turn, turn radius
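The "time to collision" quantity named in the keywords is, in its simplest head-on form, the separation divided by the closing speed. The sketch below is that textbook formulation, not the project's exact algorithm, and the numbers are illustrative.

```python
def time_to_collision(gap_m, v_follow_mps, v_lead_mps):
    """Simplest head-on TTC: separation divided by closing speed.
    Returns infinity when the vehicles are not closing."""
    closing = v_follow_mps - v_lead_mps
    return float("inf") if closing <= 0 else gap_m / closing

# 50 m gap, follower at 20 m/s, lead vehicle at 10 m/s: 5 s to impact,
# which bounds the window in which avoidance or minimization must act.
print(time_to_collision(50.0, 20.0, 10.0))  # → 5.0
```

Comparing this TTC against the time needed to steer to the desired turn radius (the "time to turn" keyword) decides whether full avoidance is possible or only impact minimization.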
Procedia PDF Downloads 551
4831 Dual Mode “Turn On-Off-On” Photoluminescence Detection of EDTA and Lead Using Moringa Oleifera Gum-Derived Carbon Dots
Authors: Anisha Mandal, Swambabu Varanasi
Abstract:
Lead is one of the most prevalent toxic heavy metal ions, and its pollution poses a significant threat to the environment and human health. Ethylenediaminetetraacetic acid (EDTA), on the other hand, is a widely used metal chelating agent that, due to its poor biodegradability, is a persistent environmental pollutant. For the first time, a green, simple and cost-effective approach is used to hydrothermally synthesize photoluminescent carbon dots from Moringa Oleifera gum in a single step. A photoluminescent "ON-OFF-ON" mechanism using these Moringa Oleifera gum-derived carbon dots (MOG-CDs) is then proposed for the dual-mode detection of trace Pb2+ and EDTA. The MOG-CDs detect Pb2+ selectively and sensitively via a photoluminescence quenching mechanism, with a limit of detection (LOD) of 0.000472 ppm (1.24 nM). The quenched photoluminescence can be restored by adding EDTA to the MOG-CD+Pb2+ system; this strategy is used to quantify EDTA with a limit of detection of 0.0026 ppm (8.9 nM). The quantification of Pb2+ and EDTA in real samples demonstrates the applicability and reliability of the proposed photoluminescent probe.
Keywords: carbon dots, photoluminescence, sensor, moringa oleifera gum
Procedia PDF Downloads 121
4830 A Comprehensive Study of Camouflaged Object Detection Using Deep Learning
Authors: Khalak Bin Khair, Saqib Jahir, Mohammed Ibrahim, Fahad Bin, Debajyoti Karmaker
Abstract:
Object detection is a computer technology that deals with searching through digital images and videos for occurrences of semantic elements of a particular class; it is associated with image processing and computer vision. On top of object detection, we detect camouflaged objects within an image using deep learning techniques, a subset of machine learning built on multi-layer neural networks. Over 6,500 images that possess camouflage properties were gathered from various internet sources and divided into 4 categories for comparison of results. The images were labeled and then trained and tested using the VGG16 architecture in a Jupyter notebook on the TensorFlow platform. The architecture is further customized using transfer learning, which comprises methods for transferring information from one or more source tasks to improve learning on a related target task. The purpose of these transfer learning methodologies is to aid in the evolution of machine learning to the point where it is as efficient as human learning.
Keywords: deep learning, transfer learning, TensorFlow, camouflage, object detection, architecture, accuracy, model, VGG16
Procedia PDF Downloads 158
4829 Development of a Direct Immunoassay for Human Ferritin Using Diffraction-Based Sensing Method
Authors: Joel Ballesteros, Harriet Jane Caleja, Florian Del Mundo, Cherrie Pascual
Abstract:
Diffraction-based sensing was utilized in the quantification of human ferritin in blood serum to provide an alternative to the label-based immunoassays currently used in clinical diagnostics and research. The diffraction intensity was measured by the diffractive optics technology (dotLab™) system. Two methods were evaluated in this study: a direct immunoassay and a direct sandwich immunoassay. In the direct immunoassay, human ferritin was captured by human ferritin antibodies immobilized on an avidin-coated sensor, while the direct sandwich immunoassay had an additional step for the binding of a detector human ferritin antibody to the analyte complex. Both methods were repeatable, with coefficient of variation values below 15%. The direct sandwich immunoassay had a linear response from 10 to 500 ng/mL, which is wider than the 100-500 ng/mL of the direct immunoassay. The direct sandwich immunoassay also had a higher calibration sensitivity, with a value of 0.002 Diffractive Intensity (ng mL-1)-1, compared to 0.004 Diffractive Intensity (ng mL-1)-1 for the direct immunoassay. The limit of detection and limit of quantification of the direct immunoassay were found to be 29 ng/mL and 98 ng/mL, respectively, while the direct sandwich immunoassay had a limit of detection (LOD) of 2.5 ng/mL and a limit of quantification (LOQ) of 8.2 ng/mL. In terms of accuracy, the direct immunoassay had a percent recovery of 88.8-93.0% in PBS, while the direct sandwich immunoassay had 94.1-97.2%. Based on these results, the direct sandwich immunoassay is the better diffraction-based immunoassay in terms of accuracy, LOD, LOQ, linear range and sensitivity. The direct sandwich immunoassay was utilized in the determination of human ferritin in blood serum, and the results were validated by Chemiluminescent Magnetic Immunoassay (CMIA). The calculated Pearson correlation coefficient was 0.995 and the p-values of the paired-sample t-test were less than 0.5, which show that the results of the direct sandwich immunoassay are comparable to those of CMIA, so it could be utilized as an alternative analytical method.
Keywords: biosensor, diffraction, ferritin, immunoassay
Procedia PDF Downloads 356
4828 Conflation Methodology Applied to Flood Recovery
Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong
Abstract:
Current flood risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being from nuisance flooding, and its long-term effects on communities, are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation; this approach is especially useful when the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance, or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by each individual distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. Combining severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
Keywords: community resilience, conflation, flood risk, nuisance flooding
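The key property of conflation, that the product of densities yields a result weighted toward the input with the smaller variance, is easiest to see in the normal case, where the conflated mean is the precision-weighted mean. The sketch below uses normal densities for illustration (the abstract works with exponential distributions), and the recovery-time figures are hypothetical.

```python
import math

def conflate_normals(mu1, s1, mu2, s2):
    """Conflation of two normal densities (their product, renormalized)
    is again normal, with precision-weighted mean and combined variance
    1/(1/s1^2 + 1/s2^2): the tighter input gets the larger weight."""
    w1, w2 = 1.0 / s1**2, 1.0 / s2**2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    sigma = math.sqrt(1.0 / (w1 + w2))
    return mu, sigma

# Hypothetical inputs: severe-event recovery estimate 10 days ± 4,
# nuisance-event estimate 6 days ± 2. The conflated estimate sits
# between the two means, pulled toward the lower-variance input.
mu, sigma = conflate_normals(10.0, 4.0, 6.0, 2.0)
print(round(mu, 2), round(sigma, 2))  # → 6.8 1.79
```

Note that a plain average of the means would give 8.0 regardless of the variances; the conflated mean of 6.8 reflects the extra information carried by the tighter nuisance-flooding distribution, which is the advantage the abstract claims over traditional averaging.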
Procedia PDF Downloads 108
4827 Grain Boundary Detection Based on Superpixel Merges
Authors: Gaokai Liu
Abstract:
The distribution of grain sizes in a material reflects its strength, fracture, corrosion and other properties, and grain size can be acquired via the grain boundaries. In recent years, automatic grain boundary detection has been widely required in place of complex experimental operations. In this paper, an effective solution is applied to acquire the grain boundaries of material images. First, an initial superpixel segmentation result is obtained via a superpixel approach. Then, a region merging method is employed to merge adjacent regions based on certain similarity criteria. The experimental results show that the merging strategy improves the superpixel segmentation result on material datasets.
Keywords: grain boundary detection, image segmentation, material images, region merging
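The region-merging step can be sketched with a union-find pass over the adjacency graph of superpixels: two adjacent regions are merged when a similarity criterion holds. The criterion below (mean-intensity difference under a tolerance) and all values are illustrative assumptions, since the abstract does not specify the paper's criteria.

```python
def merge_regions(means, adjacency, tol=10.0):
    """Greedy superpixel merge via union-find: union two adjacent
    regions when their mean intensities differ by at most `tol`
    (an illustrative similarity criterion)."""
    parent = list(range(len(means)))

    def find(i):
        # Path-halving find.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for a, b in adjacency:
        if abs(means[a] - means[b]) <= tol and find(a) != find(b):
            parent[find(a)] = find(b)
    # Final label for each superpixel; boundaries remain only
    # between superpixels with different labels.
    return [find(i) for i in range(len(means))]

# Four superpixels: 0 and 1 belong to one grain, 2 and 3 to another.
means = [100.0, 104.0, 200.0, 207.0]
adjacency = [(0, 1), (1, 2), (2, 3)]
print(merge_regions(means, adjacency))  # → [1, 1, 3, 3]
```

After merging, the grain boundary is exactly the set of edges between superpixels whose final labels differ (here, between regions 1 and 2).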
Procedia PDF Downloads 174
4826 Anatomical Survey for Text Pattern Detection
Abstract:
The ultimate aim of machine intelligence is to explore and materialize human capabilities, one of which is the ability to detect text objects within one or more images displayed on any canvas, including prints, videos or electronic displays. Multimedia data has increased rapidly in recent years, and textual information present in multimedia carries important information about the image/video content. However, the commonly exercised human ability to detect and differentiate text within an image still needs to be realized technologically for computers. Hence, in this paper a feature set based on an anatomical study of the human text detection system is proposed. Subsequent examination bears testimony to the fact that the extracted features proved instrumental for text detection.
Keywords: biologically inspired vision, content based retrieval, document analysis, text extraction
Procedia PDF Downloads 452
4825 Trend Detection Using Community Rank and Hawkes Process
Authors: Shashank Bhatnagar, W. Wilfred Godfrey
Abstract:
In this paper, we develop an approach to find trendy topics that considers not only user-topic interaction but also the community to which each user belongs. The method modifies the previous user-topic interaction approach to user-community-topic interaction, with a speed-up in the range of 1.1x to 3x. We assume that trend detection in a social network depends on two things: first, the broadcast of messages in the social network, governed by a self-exciting point process called the Hawkes process; and second, the Community Rank. Influencer nodes link to others in a community and determine its Community Rank based on their PageRank and the number of users linking to that community. The Community Rank decides the influence of one community over another. Hence, the Hawkes process with a user-community-topic kernel determines the trendy topics disseminated in the social network.
Keywords: community detection, community rank, Hawkes process, influencer node, PageRank, trend detection
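The self-exciting behaviour of a Hawkes process can be sketched with its conditional intensity under an exponential kernel: λ(t) = μ + α Σ_{tᵢ<t} exp(−β(t − tᵢ)), so each past broadcast raises the instantaneous rate of new ones, decaying over time. The parameter values below are illustrative, not the paper's fitted kernel.

```python
import math

def hawkes_intensity(t, event_times, mu=0.1, alpha=0.5, beta=1.0):
    """Conditional intensity of a Hawkes process with exponential
    kernel: lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta*(t - t_i)).
    Each past event excites further events, with decay rate beta."""
    return mu + alpha * sum(math.exp(-beta * (t - ti))
                            for ti in event_times if ti < t)

# A burst of three messages about one topic shortly before t = 2
# drives the intensity well above the baseline mu = 0.1 ...
events = [1.0, 1.5, 1.8]
print(round(hawkes_intensity(2.0, events), 3))  # → 0.997
# ... and by t = 8 the burst has decayed back toward the baseline.
print(round(hawkes_intensity(8.0, events), 3))  # → 0.102
```

In the trend-detection setting, a topic whose message stream sustains a high intensity, amplified here by the Community Rank of the communities broadcasting it, is flagged as trending.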
Procedia PDF Downloads 388
4824 Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery
Authors: Evans Belly, Imdad Rizvi, M. M. Kadam
Abstract:
Satellite imagery is one of the emerging technologies extensively utilized in various applications, such as the detection/extraction of man-made structures, monitoring of sensitive areas, and creating graphic maps. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building and non-building regions (roads, vegetation, etc.) are investigated, with building extraction as the main focus. Once the landscape is collected, a trimming process is applied to eliminate landscape elements that arise from non-building objects. Finally, the label method is used to extract the building regions; the label method may be altered for more efficient building extraction. The images used for the analysis are extracted from sensors with a resolution finer than 1 meter (VHR). The method provides an efficient way to produce good results: the additional overhead of intermediate processing is eliminated without compromising output quality, easing the processing steps required and the time consumed.
Keywords: building detection, shadow detection, landscape generation, label, partitioning, very high resolution (VHR) satellite imagery
Procedia PDF Downloads 319
4823 Heavy Oil Recovery with Chemical Viscosity-Reduction: An Innovative Low-Carbon and Low-Cost Technology
Authors: Lin Meng, Xi Lu, Haibo Wang, Yong Song, Lili Cao, Wenfang Song, Yong Hu
Abstract:
China has abundant heavy oil resources, and thermal recovery is the main recovery method for heavy oil reservoirs. However, high energy consumption, high carbon emissions, and high production costs make heavy oil thermal recovery unsustainable, so it is urgent to explore a replacement technology. A low-carbon, low-cost heavy oil recovery technology, chemical viscosity-reduction in layer (CVRL), has been developed by the Petroleum Exploration and Development Research Institute of Sinopec by investigating mechanisms, synthesizing products, and improving oil production technologies, as follows: (1) A cascade viscous mechanism of heavy oil was proposed. Asphaltene and resin grow from free molecules to associative structures and further to bulk aggregations through π-π stacking and hydrogen bonding, which causes the high viscosity of heavy oil. (2) To break the π-π stacking and hydrogen bonds of heavy oil, a copolymer of N-(3,4-dihydroxyphenethyl) acrylamide and 2-acrylamido-2-methylpropane sulfonic acid was synthesized as a viscosity reducer. It achieves a viscosity reduction rate of >80% without shearing for heavy oil (viscosity < 50000 mPa·s), whose fluidity in the layer is evidently improved. (3) A hydroxymethyl acrylamide-maleic acid-decanol ternary copolymer self-assembling plugging agent was synthesized. Its particle size is adjustable from 0.1 μm to 2 mm, and its volume is controllable over 10-500 times, enabling efficient transportation of the viscosity reducer to oil-enriched areas. CVRL has been applied in 400 wells to date, increasing oil production by 470,000 tons, saving 81,000 tons of standard coal, reducing CO2 emissions by 174,000 tons, and reducing production costs by 60%. It promotes the transformation of heavy oil recovery towards low energy consumption, low carbon emissions, and low cost.
Keywords: heavy oil, chemical viscosity-reduction, low carbon, viscosity reducer, plugging agent
Procedia PDF Downloads 81
4822 The IVAIRE Study: Relative Performance of Energy and Heat Recovery Ventilators in Cold Climates
Authors: D. Aubin, D. Won, H. Schleibinger, P. Lajoie, D. Gauvin, J.-M. Leclerc
Abstract:
This paper describes the results of a two-year randomized intervention field study investigating the impact of ventilation rates on indoor air quality (IAQ) and the respiratory health of asthmatic children in Québec City, Canada. The focus of this article is the comparative effectiveness of heat recovery ventilators (HRVs) and energy recovery ventilators (ERVs) at increasing ventilation rates, improving IAQ, and maintaining an acceptable indoor relative humidity (RH). In 14% of the homes, the RH was found to be too low in winter. Providing more cold, dry outside air to under-ventilated homes in winter further reduces indoor RH. Thus, low-RH homes in the intervention group were chosen to receive ERVs (instead of HRVs) to increase the ventilation rate. The installation of HRVs or ERVs led to a near doubling of the ventilation rates in the intervention group homes, which in turn led to a significant reduction in the concentration of several key pollutants. The ERVs were also effective in maintaining an acceptable indoor RH: by recovering moisture from the exhaust airstream through the enthalpy core, they avoided the excessive dehumidification otherwise associated with increased cold supply air rates.
Keywords: asthma, field study, indoor air quality, ventilation
Procedia PDF Downloads 276
4821 An Investigation into Fraud Detection in Financial Reporting Using Sugeno Fuzzy Classification
Authors: Mohammad Sarchami, Mohsen Zeinalkhani
Abstract:
Financial reporting systems have always faced difficulty in winning the public's trust. The increase in the number of fraud and misrepresentation cases, often combined with the bankruptcy of large companies, has raised concerns about the quality of financial statements. Consequently, investors, legislators, managers, and auditors have focused on detecting or preventing significant fraud in financial statements. This article investigates Sugeno fuzzy classification for fraud detection in the financial reporting of firms listed on the Tehran Stock Exchange. The hypothesis is that Sugeno fuzzy classification can detect fraud in financial reporting from financial ratios. The hypothesis was tested using Matlab software. The average accuracy of the Sugeno fuzzy classification was 81.80%, so the hypothesis was confirmed.
Keywords: fraud, financial reporting, Sugeno fuzzy classification, firm
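A zero-order Sugeno (Takagi-Sugeno) inference over financial ratios can be sketched as below. The membership shapes, the two rules, and the ratio names (debt ratio, ROA) are illustrative assumptions, not the classifier actually trained in the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def sugeno_fraud_score(debt_ratio, roa):
    """Zero-order Sugeno inference: rule firing strengths (min-AND of
    antecedent memberships) weight crisp consequents, then a weighted
    average yields a fraud score in [0, 1]. Rules are hypothetical:
      Rule 1: debt ratio HIGH AND ROA LOW  -> fraud-prone (consequent 1.0)
      Rule 2: debt ratio LOW  AND ROA HIGH -> clean       (consequent 0.0)"""
    w1 = min(tri(debt_ratio, 0.5, 1.0, 1.5), tri(roa, -0.5, 0.0, 0.2))
    w2 = min(tri(debt_ratio, 0.0, 0.3, 0.7), tri(roa, 0.05, 0.3, 0.6))
    if w1 + w2 == 0:
        return 0.5  # no rule fires: undecided
    return (w1 * 1.0 + w2 * 0.0) / (w1 + w2)
```

Thresholding the score (e.g., at 0.5) then turns the inference into a binary fraud/non-fraud classification, which is the form of decision evaluated for accuracy in the abstract.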
Procedia PDF Downloads 253
4820 One Pot Synthesis of Cu–Ni–S/Ni Foam for the Simultaneous Removal and Detection of Norfloxacin
Authors: Xincheng Jiang, Yanyan An, Yaoyao Huang, Wei Ding, Manli Sun, Hong Li, Huaili Zheng
Abstract:
Residual antibiotics in the environment pose a threat to the environment and human health. Thus, efficient removal and rapid detection of norfloxacin (NOR) in wastewater are very important. The main sources of NOR pollution are agricultural, pharmaceutical-industry, and hospital wastewater. The total consumption of NOR in China reaches 5440 tons per year. Neither animals nor humans can fully absorb and metabolize NOR, resulting in its excretion into the environment; accordingly, residual NOR has been detected in water bodies. The hazards of NOR in wastewater lie in three aspects: (1) the removal capacity of wastewater treatment plants for NOR is limited (the average removal efficiency of NOR in wastewater treatment plants is reportedly only 68%); (2) NOR entering the environment leads to the emergence of drug-resistant strains; (3) NOR is toxic to many aquatic species. At present, the removal and detection technologies for NOR are applied separately, which makes the operation cumbersome. Developing simultaneous adsorption-flocculation removal and FTIR detection of pollutants has three advantages: (1) adsorption-flocculation promotes detection, because the enrichment effect on the material surface improves the detection ability; (2) integrating adsorption-flocculation with detection reduces the material cost and simplifies the operation; (3) FTIR detection endows the water treatment agent with molecular recognition and semi-quantitative detection of pollutants. It is therefore of great significance to develop a smart water treatment material with both high removal capacity and detection ability for pollutants. This study explored the feasibility of combining the NOR removal method with a semi-quantitative detection method.
A magnetic Cu-Ni-S/Ni foam was synthesized by in-situ loading of Cu-Ni-S nanostructures on the surface of Ni foam. The novelty of this material is the combination of adsorption-flocculation technology with semi-quantitative detection technology. Batch experiments showed that Cu-Ni-S/Ni foam has a high removal rate of NOR (96.92%), wide pH adaptability (pH = 4.0-10.0), and strong resistance to ion interference (0.1-100 mmol/L). According to the Langmuir fitting model, the removal capacity can reach 417.4 mg/g at 25 °C, much higher than that of other water treatment agents reported in most studies. Characterization analysis indicated that the main removal mechanisms are surface complexation, cation bridging, electrostatic attraction, precipitation, and flocculation. Transmission FTIR detection experiments showed that NOR on Cu-Ni-S/Ni foam has easily recognizable FTIR fingerprints, and the intensity of the characteristic peaks roughly reflects the concentration. This semi-quantitative detection method has a wide linear range (5-100 mg/L) and a low limit of detection (4.6 mg/L). These results show that Cu-Ni-S/Ni foam has excellent removal performance and semi-quantitative detection ability for NOR molecules. This paper provides a new idea for designing and preparing multi-functional water treatment materials to achieve simultaneous removal and semi-quantitative detection of organic pollutants in water.
Keywords: adsorption-flocculation, antibiotics detection, Cu-Ni-S/Ni foam, norfloxacin
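The Langmuir fitting mentioned above can be sketched via the linearized form Ce/qe = Ce/qmax + 1/(KL*qmax), solved by ordinary least squares. The synthetic data and the linearization choice below are illustrative assumptions, not the authors' fitting procedure.

```python
def langmuir_fit(ce, qe):
    """Fit the Langmuir isotherm qe = qmax*KL*Ce / (1 + KL*Ce) using its
    linearized form Ce/qe = Ce/qmax + 1/(KL*qmax): regress y = Ce/qe on
    x = Ce, then recover qmax = 1/slope and KL = slope/intercept."""
    xs = ce
    ys = [c / q for c, q in zip(ce, qe)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    qmax = 1.0 / slope          # maximum adsorption capacity (mg/g)
    kl = slope / intercept      # Langmuir affinity constant (L/mg)
    return qmax, kl
```

With equilibrium concentrations Ce (mg/L) and measured uptakes qe (mg/g) from batch experiments, `qmax` is the fitted capacity analogous to the 417.4 mg/g reported above.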
Procedia PDF Downloads 79
4819 The Qualitative and Quantitative Detection of Pistachio in Processed Food Products Using Fluorescence Dye Based PCR
Authors: Ergün Şakalar, Şeyma Özçirak Ergün
Abstract:
Pistachio nuts, the fruits of the pistachio tree (Pistacia vera), are edible tree nuts highly valued for their organoleptic properties. Pistachio nuts are used as ingredients in snack foods, chocolates, baklava, meat products, ice cream, and other gourmet products. Undeclared pistachios may be present in food products as a consequence of fraudulent substitution. Control of food samples is very important for safety and fraud prevention. Because pistachio is a considerably expensive nut, mixtures of pistachio, peanut (Arachis hypogaea), and pea (Pisum sativum L.) are used in its place in food products. To address this problem, a sensitive polymerase chain reaction (PCR) assay has been developed. A real-time PCR assay for the detection of pea, peanut, and pistachio in baklava was designed using the EvaGreen fluorescence dye. Primers were selected from regions powerful for the identification of pea, peanut, and pistachio. DNA from reference samples and industrial products was successfully extracted with the GIDAGEN® Multi-Fast DNA Isolation Kit. Genomes were identified based on their specific melting peaks (Mp), which are 77 °C, 85.5 °C, and 82.5 °C for pea, peanut, and pistachio, respectively. Homogenized mixtures of raw pistachio, pea, and peanut were prepared with pistachio ratios of 0.01%, 0.1%, 1%, 10%, 40%, and 70%. The quantitative detection limit of the assay was 0.1% for pistachio. The real-time PCR technique used in this study also allowed the qualitative detection of as little as 0.001% peanut DNA, 0.000001% pistachio DNA, and 0.000001% pea DNA in the experimental admixtures. This assay represents a potentially valuable diagnostic method for detecting nut species adulterated with pistachio, as well as for highly specific and relatively rapid detection of small amounts of pistachio in food samples.
Keywords: pea, peanut, pistachio, real-time PCR
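The melting-peak identification reported above (77 °C, 85.5 °C, and 82.5 °C for pea, peanut, and pistachio) can be sketched as a nearest-peak lookup; the ±1 °C tolerance window is an illustrative assumption, not a value from the assay.

```python
# Reference melting peaks (°C) taken from the abstract.
REFERENCE_PEAKS = {"pea": 77.0, "peanut": 85.5, "pistachio": 82.5}

def classify_melting_peak(mp_celsius, tolerance=1.0):
    """Assign an observed melting peak to the nearest reference species,
    or return None if no reference peak lies within the tolerance window."""
    species, ref = min(REFERENCE_PEAKS.items(),
                       key=lambda kv: abs(kv[1] - mp_celsius))
    return species if abs(ref - mp_celsius) <= tolerance else None
```

A melting-curve analysis pipeline would call this on each detected peak to flag which of the three genomes is present in a sample.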
Procedia PDF Downloads 268
4818 Enhanced Solar-Driven Evaporation Process via f-MWCNTs/PVDF Photothermal Membrane for Forward Osmosis Draw Solution Recovery
Authors: Ayat N. El-Shazly, Dina Magdy Abdo, Hamdy Maamoun Abdel-Ghafar, Xiangju Song, Heqing Jiang
Abstract:
Product water recovery and draw solution (DS) reuse is the most energy-intensive stage in forward osmosis (FO) technology. Sucrose solution is the most suitable DS for FO applications in food and beverages. However, sucrose DS recovery by conventional pressure-driven or thermal-driven concentration techniques consumes considerable energy. Herein, we developed a spontaneous and sustainable solar-driven evaporation process based on a photothermal membrane for the concentration and recovery of sucrose solution. The photothermal membrane is composed of a multi-walled carbon nanotube (f-MWCNTs) photothermal layer on a hydrophilic polyvinylidene fluoride (PVDF) substrate. The f-MWCNTs photothermal layer, with its rough surface and interconnected network structures, not only improves the light-harvesting and light-to-heat conversion performance but also facilitates the transport of water molecules. The hydrophilic PVDF substrate promotes rapid water transport, ensuring an adequate water supply to the photothermal layer. As a result, the optimized f-MWCNTs/PVDF photothermal membrane exhibits excellent light absorption of 95% and a high surface temperature of 74 °C at 1 kW m−2. Furthermore, it achieves an evaporation rate of 1.17 kg m−2 h−1 for a 5% (w/v) sucrose solution, about 5 times higher than that of natural evaporation. The designed photothermal evaporation process is capable of concentrating sucrose solution efficiently from 5% to 75% (w/v), which has great potential in the FO process and juice concentration.
Keywords: solar, photothermal, membrane, MWCNT
Procedia PDF Downloads 102
4817 Rupture Termination of the 1950 C. E. Earthquake and Recurrent Interval of Great Earthquake in North Eastern Himalaya, India
Authors: Rao Singh Priyanka, Jayangondaperumal R.
Abstract:
The Himalayan active fault has the potential to generate great earthquakes in the future, posing a major existential threat to humans in the Himalayan and adjacent regions. Quantitative evaluation of accumulated and released interseismic strain is crucial to assess the magnitude and spatio-temporal variability of future great earthquakes along the Himalayan arc. To mitigate the destruction and hazards associated with such earthquakes, it is important to understand their recurrence cycle. The eastern Himalayan and Indo-Burman plate boundary systems present oblique convergence across two orthogonal plate boundaries, resulting in a zone of distributed deformation both within and away from the plate boundary and in the clockwise rotation of fault-bounded blocks. This seismically active region has a poorly documented historical archive of past large earthquakes. Paleoseismological studies confirm the surface rupture evidence of great continental earthquakes (Mw ≥ 8) along the Himalayan Frontal Thrust (HFT), which, together with geodetic studies, provides crucial information for understanding and assessing the seismic potential. These investigations reveal that three-quarters of the HFT has ruptured during great events since medieval time, but the timing of the events remains debated owing to unclear evidence, neglect of transverse segment boundaries, and a lack of detailed studies. Recent paleoseismological investigations in the eastern Himalaya and Mishmi ranges confirm the primary surface ruptures of the 1950 C.E. great earthquake (M > 8). However, a seismic gap exists between the 1714 C.E. and 1950 C.E. Assam earthquakes that has not slipped since the 1697 C.E. event. Unlike the latest large blind 2015 Gorkha earthquake (Mw 7.8), the 1950 C.E. event was not triggered by the large 1947 C.E. event that occurred near the western edge of the great upper Assam event.
Moreover, the western segment of the eastern Himalaya has not witnessed any surface-breaking earthquake along the HFT over the past 300 years. Frontal fault excavations reveal that during the 1950 earthquake, a ~3.1-m-high scarp along the HFT was formed by a co-seismic slip of 5.5 ± 0.7 m at Pasighat in the Eastern Himalaya, while a 10-m-high scarp at Kamlang Nagar along the Mishmi Thrust in the Eastern Himalayan Syntaxis is the outcome of a dip-slip displacement of 24.6 ± 4.6 m along a 25 ± 5°E-dipping fault. This event ruptured along the two orthogonal fault systems with an oblique thrust fault mechanism. About 130 km west of the Pasighat site, the Himebasti village has witnessed two earthquakes, the historical 1697 Sadiya earthquake and the 1950 event, with a cumulative dip-slip displacement of 15.32 ± 4.69 m. At the Niglok site, Arunachal Pradesh, a cumulative slip of ~12.82 m during at least three events since pre 19585 B.P. has produced a ~6.2-m-high scarp, while the youngest scarp, ~2.4 m high, was produced in 1697 C.E. The site preserves two deformational events along the eastern HFT, suggesting serial ruptures at an interval of ~850 years, whereas the Mishmi Range lacks successive surface-rupturing earthquakes from which to estimate the recurrence cycle.
Keywords: paleoseismology, surface rupture, recurrence interval, Eastern Himalaya
Procedia PDF Downloads 86