Search results for: natural hazard detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9186

9006 Vibration Based Damage Detection and Stiffness Reduction of Bridges: Experimental Study on a Small Scale Concrete Bridge

Authors: Mirco Tarozzi, Giacomo Pignagnoli, Andrea Benedetti

Abstract:

Structural systems are often subjected to degradation processes due to different kinds of phenomena, such as unexpected loadings, ageing of the materials and fatigue cycles. This is especially true for bridges, whose safety evaluation is crucial for planning maintenance. This paper discusses the experimental evaluation of the stiffness reduction from frequency changes due to a uniform damage scenario. For this purpose, a 1:4 scaled bridge has been built in the laboratory of the University of Bologna. It is made of concrete and its cross section is composed of a slab linked to four beams. This concrete deck is 6 m long and 3 m wide, and its natural frequencies have been identified dynamically by exciting it with an impact hammer, a dropping weight, or by walking on it randomly. After that, a set of loading cycles has been applied to the bridge in order to produce a uniformly distributed crack pattern. During the loading phase, both the cracking moment and the yielding moment have been reached. In order to define the relationship between frequency variation and loss of stiffness, the natural frequencies of the bridge have been identified before and after the occurrence of the damage corresponding to each load step. The behavior of breathing cracks and its effect on the natural frequencies has been taken into account in the analytical calculations. By using an exponential function derived from a large number of experimental tests in the literature, it has been possible to predict the stiffness reduction from the measured frequency variations. During the load tests, crack opening and mid-span vertical displacement have also been monitored.
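
As a rough illustration of how a measured frequency shift maps to a stiffness loss (not the calibrated exponential law used by the authors, which is not given in the abstract), the classical relation f ∝ √(k/m) for a linear system gives k_d/k_0 = (f_d/f_0)²; a minimal sketch, assuming illustrative frequency values:

```python
import numpy as np

# First natural frequency (Hz) before damage and after each load step.
# Values are illustrative placeholders, not data from the paper.
f_undamaged = 12.8
f_damaged = np.array([12.5, 11.9, 11.2, 10.6])

# For a linear elastic system f is proportional to sqrt(k/m), so with an
# unchanged mass the residual stiffness ratio follows from the frequency ratio.
stiffness_ratio = (f_damaged / f_undamaged) ** 2
stiffness_loss_percent = 100.0 * (1.0 - stiffness_ratio)

for step, loss in enumerate(stiffness_loss_percent, start=1):
    print(f"Load step {step}: estimated stiffness reduction = {loss:.1f}%")
```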

Keywords: concrete bridge, damage detection, dynamic test, frequency shifts, operational modal analysis

Procedia PDF Downloads 159
9005 Formulation of a Rapid Earthquake Risk Ranking Criteria for National Bridges in the National Capital Region Affected by the West Valley Fault Using GIS Data Integration

Authors: George Mariano Soriano

Abstract:

In this study, a Rapid Earthquake Risk Ranking Criteria was formulated by integrating various existing maps and databases from the Department of Public Works and Highways (DPWH) and the Philippine Institute of Volcanology and Seismology (PHIVOLCS). Utilizing Geographic Information System (GIS) software, the above-mentioned maps and databases were used to extract seismic hazard parameters and bridge vulnerability characteristics in order to rank the seismic damage risk of bridges in the National Capital Region.

Keywords: bridge, earthquake, GIS, hazard, risk, vulnerability

Procedia PDF Downloads 375
9004 Real-Time Pedestrian Detection Method Based on Improved YOLOv3

Authors: Jingting Luo, Yong Wang, Ying Wang

Abstract:

Pedestrian detection in image or video data is an important and challenging task in security surveillance. The difficulty of this task is to accurately locate and detect pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. Firstly, a deep residual network is added for feature extraction. Then, six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps of the residual network to form the final feature pyramid used for the pedestrian detection task. This method can better characterize pedestrians. In order to further improve the accuracy and generalization ability of the model, a hybrid pedestrian data set training method is used that extracts pedestrian data from the VOC data set and trains together with the INRIA pedestrian data set. Experiments show that the proposed RT-YOLOv3 method achieves a mean average precision (mAP) of 93.57% at 46.52 frames per second. In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. The method reduces the missed detection rate and the false detection rate, improves the localization accuracy, and meets the requirements of real-time detection of pedestrians.
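
For context on how detection accuracy figures such as mAP are built up (the authors' evaluation code is not given in the abstract), a minimal sketch that matches predicted boxes to ground truth at an IoU threshold of 0.5 and counts true/false positives; all boxes below are hypothetical:

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def match_detections(pred_boxes, gt_boxes, iou_thr=0.5):
    """Greedy matching: each ground-truth box may be claimed by one prediction."""
    matched = set()
    tp = 0
    for pb in pred_boxes:                      # assume already sorted by confidence
        best_j, best_iou = -1, iou_thr
        for j, gb in enumerate(gt_boxes):
            if j in matched:
                continue
            score = iou(pb, gb)
            if score >= best_iou:
                best_j, best_iou = j, score
        if best_j >= 0:
            matched.add(best_j)
            tp += 1
    fp = len(pred_boxes) - tp
    fn = len(gt_boxes) - tp
    return tp, fp, fn

# Illustrative boxes only (not data from the paper).
preds = [(10, 10, 50, 120), (200, 30, 240, 150)]
gts = [(12, 8, 52, 118), (300, 40, 340, 160)]
print(match_detections(preds, gts))   # -> (1, 1, 1)
```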

Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3

Procedia PDF Downloads 113
9003 Comparison of Vessel Detection in Standard vs Ultra-WideField Retinal Images

Authors: Maher un Nisa, Ahsan Khawaja

Abstract:

Retinal imaging with Ultra-WideField (UWF) view technology has opened up new avenues in the field of retinal pathology detection. Recent developments in retinal imaging, such as the Optos California imaging device, help in acquiring high resolution images of the retina to assist ophthalmologists in diagnosing and analyzing eye related pathologies more accurately. This paper investigates the acquired retinal details by comparing vessel detection in standard 45° color fundus images with state of the art 200° UWF retinal images.

Keywords: color fundus, retinal images, ultra-widefield, vessel detection

Procedia PDF Downloads 424
9002 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

The logistic regression model and the Cox regression model (proportional hazards model) are currently employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and a theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure used for modeling time-to-event data in which censored cases exist, whereas the logistic regression model is applicable in cases where the independent variables may be numerical as well as nominal and the outcome variable is binary (dichotomous). Arguments and findings of many researchers have focused on the overview of the Cox and logistic regression models and their applications in different areas. In this work, the analysis is performed on secondary data, namely the SPSS exercise data set on breast cancer with a sample size of 1121 women, where the main objective is to show the difference in application between the Cox regression model and the logistic regression model based on factors that cause women to die of breast cancer. Some of the analysis, i.e. on lymph node status, was done manually, and SPSS software was used to analyse the rest of the data. This study found that there is a difference in application between the Cox and logistic regression models: the Cox regression model is used if one wishes to analyse data that also include the follow-up time, whereas the logistic regression model analyses data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both are applicable to the prediction of the outcome of a categorical variable, i.e. a variable that can accommodate only a restricted number of categories. In conclusion, the Cox regression model differs from the logistic regression model by assessing a rate instead of a proportion. Both models are suitable methods for analysing data and can be applied in many other studies, but the Cox regression model is the more recommended of the two.
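
A minimal sketch of the application difference, assuming the lifelines and scikit-learn libraries and a synthetic data frame in place of the SPSS breast-cancer data used in the paper: the Cox model consumes the follow-up time and reports hazard ratios, while the logistic model ignores it and reports odds ratios.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# Illustrative data frame (not the breast-cancer data used in the paper):
# 'time' = follow-up time, 'event' = 1 if death observed, plus two covariates.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "time": rng.exponential(60, n),
    "event": rng.integers(0, 2, n),
    "positive_nodes": rng.integers(0, 10, n),
    "age": rng.integers(30, 80, n),
})

# Cox model: uses the follow-up time; association measured by hazard ratios.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print("Hazard ratios:\n", np.exp(cph.params_))

# Logistic model: ignores the follow-up time; association measured by odds ratios.
X = df[["positive_nodes", "age"]]
y = df["event"]
logit = LogisticRegression(max_iter=1000).fit(X, y)
print("Odds ratios:", np.exp(logit.coef_).ravel())
```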

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 423
9001 Detection of Clipped Fragments in Speech Signals

Authors: Sergei Aleinik, Yuri Matveev

Abstract:

In this paper, a novel method for the detection of clipping in speech signals is described. It is shown that the new method has better performance than known clipping detection methods, is easy to implement, and is robust to changes in signal amplitude, data size, etc. Statistical simulation results are presented.
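
The abstract does not disclose the authors' statistic, but a crude baseline illustrates what clipping detection has to pick up: clipped speech piles samples onto a flattened peak amplitude. A minimal sketch:

```python
import numpy as np

def clipping_ratio(signal, tol=0.999):
    """Fraction of samples whose magnitude sits at (or within `tol` of) the
    observed peak amplitude. A crude indicator of clipping, not the statistic
    proposed by the authors."""
    peak = np.max(np.abs(signal))
    if peak == 0:
        return 0.0
    near_peak = np.abs(signal) >= tol * peak
    return near_peak.mean()

# Synthetic example: a sine wave hard-limited at 70% of its amplitude.
t = np.linspace(0, 1, 16000, endpoint=False)
clean = np.sin(2 * np.pi * 220 * t)
clipped = np.clip(clean, -0.7, 0.7)

print(f"clean:   {clipping_ratio(clean):.4f}")    # very small fraction
print(f"clipped: {clipping_ratio(clipped):.4f}")  # large fraction of flattened samples
```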

Keywords: clipping, clipped signal, speech signal processing, digital signal processing

Procedia PDF Downloads 362
9000 Evaluating Performance of an Anomaly Detection Module with Artificial Neural Network Implementation

Authors: Edward Guillén, Jhordany Rodriguez, Rafael Páez

Abstract:

Anomaly detection techniques focus on two main components: the extraction and selection of data, and the analysis performed over the obtained data. The goal of this paper is to analyze the influence that each of these components has on system performance by evaluating detection over network scenarios with different setups. The independent variables are as follows: the number of system inputs, the way the inputs are encoded, and the complexity of the analysis techniques. For the analysis, several artificial neural network approaches with different numbers of layers are implemented. The obtained results show the influence that each of these variables has on system performance.
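
A minimal sketch of the kind of experiment described, assuming scikit-learn and synthetic connection features in place of the paper's network scenarios: the analysis complexity is varied by changing the number of hidden layers.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic "network connection" features and normal/anomalous labels
# (placeholders; the paper evaluates real network scenarios).
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=2000) > 1.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Compare analysis complexity by varying the number of hidden layers.
for hidden in [(16,), (32, 16), (64, 32, 16)]:
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    acc = clf.score(X_te, y_te)
    print(f"hidden layers {hidden}: accuracy = {acc:.3f}")
```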

Keywords: network intrusion detection, machine learning, artificial neural network, anomaly detection module

Procedia PDF Downloads 305
8999 Automatic Change Detection for High-Resolution Satellite Images of Urban and Suburban Areas

Authors: Antigoni Panagiotopoulou, Lemonia Ragia

Abstract:

High-resolution satellite images can provide detailed information about change detection on the earth. In the present work, QuickBird images with a spatial resolution of 60 cm/pixel and WorldView images with a resolution of 30 cm/pixel are utilized to perform automatic change detection in urban and suburban areas of Crete, Greece. There is a relative time difference of 13 years between the satellite images. Multiindex scene representation is applied to the images to classify the scene into buildings, vegetation, water and ground. Then, automatic change detection is made possible by pixel-by-pixel comparison of the classified multi-temporal images. The vegetation index and the water index that have been developed in this study prove effective. Furthermore, the proposed change detection approach not only indicates whether changes have taken place but also provides specific information about the types of changes. Experimentation with other scenes in the future could help optimize the proposed spectral indices as well as the entire change detection methodology.
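
A minimal sketch of index-based classification followed by pixel-by-pixel comparison, using the common NDVI/NDWI forms rather than the study's own vegetation and water indices (which are not given in the abstract), and random arrays in place of the QuickBird/WorldView bands:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised difference vegetation index (a common form; the study's own
    vegetation index is not specified in the abstract)."""
    return (nir - red) / (nir + red + 1e-9)

def classify(nir, red, green):
    """Very coarse 3-class scene map: ground-or-buildings / water / vegetation."""
    veg = ndvi(nir, red) > 0.3
    water = (green - nir) / (green + nir + 1e-9) > 0.2   # NDWI-style water index
    labels = np.zeros(nir.shape, dtype=np.uint8)          # 0 = ground/buildings
    labels[water] = 1
    labels[veg] = 2
    return labels

# Two co-registered acquisitions, 13 years apart (random placeholders here).
rng = np.random.default_rng(2)
bands_t0 = {b: rng.random((100, 100)) for b in ("nir", "red", "green")}
bands_t1 = {b: rng.random((100, 100)) for b in ("nir", "red", "green")}

labels_t0 = classify(bands_t0["nir"], bands_t0["red"], bands_t0["green"])
labels_t1 = classify(bands_t1["nir"], bands_t1["red"], bands_t1["green"])

# Pixel-by-pixel comparison: the pair (old class, new class) tells what changed.
changed = labels_t0 != labels_t1
print("changed pixels:", int(changed.sum()), "of", changed.size)
```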

Keywords: change detection, multiindex scene representation, spectral index, QuickBird, WorldView

Procedia PDF Downloads 110
8998 A Hazard Rate Function for the Time of Ruin

Authors: Sule Sahin, Basak Bulut Karageyik

Abstract:

This paper introduces a hazard rate function for the time of ruin to calculate the conditional probability of ruin over very small intervals. We call this function the force of ruin (FoR). We obtain the expected time of ruin and the conditional expected time of ruin from the exact finite time ruin probability with exponential claim amounts. Then we introduce the FoR, which gives the conditional probability of ruin given that ruin has not occurred by time t. We analyse the behavior of the FoR function for different initial surpluses over a specific time interval. We also obtain the FoR under an excess of loss reinsurance arrangement and examine the effect of reinsurance on the FoR.
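
A plausible way to write such a hazard rate (the authors' exact definition is not given in the abstract) is, with ψ(u, t) the finite-time ruin probability for initial surplus u,

```latex
\[
  \mathrm{FoR}(u,t) \;=\; \frac{\dfrac{\partial}{\partial t}\,\psi(u,t)}{1-\psi(u,t)},
\]
```

so that FoR(u, t)·dt approximates the probability that ruin occurs in (t, t + dt] given that it has not occurred by time t.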

Keywords: conditional time of ruin, finite time ruin probability, force of ruin, reinsurance

Procedia PDF Downloads 359
8997 The Laser Line Detection for Autonomous Mapping Based on Color Segmentation

Authors: Pavel Chmelar, Martin Dobrovolny

Abstract:

Laser projection, or laser footprint detection, is today widely used in many fields of robotics, measurement and electronics. The system accuracy strictly depends on precise laser footprint detection on target objects. This article deals with laser line detection based on RGB segmentation and component labeling. The developed optical rangefinder was used as the measurement device. The optical rangefinder is equipped with vertical sweeping of the laser beam and a high-quality camera. This system was developed mainly for automatic exploration and mapping of unknown spaces. The first section presents the new detection algorithm; the second section presents the measurement results. The measurements were performed under variable light conditions in interiors. The last part of the article presents the achieved results and the differences between day and night measurements.
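
A minimal sketch of the two steps named in the abstract, RGB segmentation followed by component labelling, assuming SciPy and a red laser line; the thresholds and the synthetic frame are illustrative, not the authors' values:

```python
import numpy as np
from scipy import ndimage

def detect_laser_line(rgb, red_min=180, dominance=60, min_pixels=20):
    """Segment bright, strongly red pixels and keep the largest connected
    component as the laser footprint. Threshold values are illustrative
    guesses, not those used by the authors."""
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    mask = (r > red_min) & (r - g > dominance) & (r - b > dominance)

    labels, n = ndimage.label(mask)          # component labelling
    if n == 0:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    best = int(np.argmax(sizes)) + 1
    if sizes[best - 1] < min_pixels:
        return None
    return labels == best                    # boolean mask of the laser line

# Synthetic 100x100 frame with a vertical red stripe.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:, 48:52, 0] = 250
line_mask = detect_laser_line(frame)
print("laser pixels found:", int(line_mask.sum()))
```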

Keywords: color segmentation, component labelling, laser line detection, automatic mapping, distance measurement, vector map

Procedia PDF Downloads 396
8996 A Background Subtraction Based Moving Object Detection Around the Host Vehicle

Authors: Hyojin Lim, Cuong Nguyen Khac, Ho-Youl Jung

Abstract:

In this paper, we propose a moving object detection method that helps the driver safely take his/her car out of a parking lot. When moving objects such as motorbikes, pedestrians, other cars and obstacles are detected at the rear side of the host vehicle, the proposed algorithm can provide a warning to the driver. We assume that the host vehicle is just about to depart. Gaussian Mixture Model (GMM) based background subtraction is applied as the basis of the method. Pre-processing such as smoothing and post-processing such as morphological filtering are added. We examine which color space has better performance for the detection of moving objects: three color spaces, RGB, YCbCr, and Y, are applied and compared in terms of detection rate. Through simulation, we show that the RGB space is more suitable for moving object detection based on background subtraction.
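
A minimal sketch of the processing chain described (smoothing, GMM background subtraction, morphological filtering), assuming OpenCV's MOG2 implementation and a hypothetical rear-camera video file; parameter values are guesses, not those of the paper:

```python
import cv2

cap = cv2.VideoCapture("rear_camera.mp4")        # hypothetical input file
fgbg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25,
                                          detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.GaussianBlur(frame, (5, 5), 0)   # pre-processing: smoothing
    mask = fgbg.apply(frame)                     # GMM foreground mask (RGB frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes
    moving_pixels = cv2.countNonZero(mask)
    if moving_pixels > 500:                      # crude warning threshold
        print("warning: moving object behind the vehicle")

cap.release()
```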

Keywords: gaussian mixture model, background subtraction, moving object detection, color space, morphological filtering

Procedia PDF Downloads 583
8995 Efficiency on the Enteric Viral Removal in Four Potable Water Treatment Plants in Northeastern Colombia

Authors: Raquel Amanda Villamizar Gallardo, Oscar Orlando Ortíz Rodríguez

Abstract:

Enteric viruses are cosmopolitan agents present in several environments, including water. These viruses can cause different diseases, including gastroenteritis, hepatitis, conjunctivitis and respiratory problems, among others. Although in Colombia there are no regulations concerning routine viral analysis of drinking water, an enhanced understanding of viral pollution and resistance to treatments is desirable in order to assure pure water to the population. Viral detection is often complex due to the need for specialized and time-consuming procedures. In addition, viruses are highly diluted in water, which is a drawback from the analytical point of view. To this end, a fast and selective method for detecting enteric viruses (i.e., hepatitis A virus and rotavirus) was applied. Micro-magnetic particles were functionalized with monoclonal antibodies against hepatitis A and rotavirus and used to capture, concentrate and separate whole viral particles in raw water and drinking water samples from four treatment plants identified as CAR-01, MON-02, POR-03 and TON-04, located in northeastern Colombia. Viruses were detected molecularly using one-step RT-PCR with SuperScript III. Each plant was analyzed at the entry and exit points in order to determine the initial presence and eventual reduction of hepatitis A and rotavirus after disinfection. The results revealed the presence of both enteric viruses in 100% of the raw water samples analyzed in all plants. This represents a potential health hazard, especially for those people who use this water for agricultural purposes. However, in the drinking water analysis, only CAR-01 was positive for enteric viruses, with rotavirus detected. In conclusion, the results confirm rotavirus as the best indicator to evaluate the efficacy of potable water treatment plants in eliminating viruses. The CAR potable water plant should improve its disinfection process in order to remove enteric viruses efficiently.

Keywords: drinking water, hepatitis A, rotavirus, virus removal

Procedia PDF Downloads 199
8994 The Comparison of Limits of Detection of Lateral Flow Immunochromatographic Strips of Different Types of Mycotoxins

Authors: Xinyi Zhao, Furong Tian

Abstract:

Mycotoxins are secondary metabolic products of fungi. They are poisonous, carcinogenic and mutagenic in nature and pose a serious health threat to both humans and animals, causing severe illnesses and even deaths. Rapid, simple and cheap detection methods for mycotoxins are of immense importance and in great demand in the food and beverage industry as well as in agriculture and environmental monitoring. Lateral flow immunochromatographic strips (ICSTs) have been widely used in food safety and environmental monitoring. Forty-six papers, dated 2001-2021, were identified on Google Scholar and Scopus and reviewed for their limit of detection and the nanomaterial used in lateral flow immunochromatographic strips for different types of mycotoxins. Twenty-five papers were compared to identify the lowest limit of detection among different mycotoxins (aflatoxin B1: 10, zearalenone: 5, fumonisin B1: 5, trichothecene-A: 5). Most of these highly sensitive strips are competitive; sandwich structures are usually used in large-scale detection. In conclusion, the mycotoxin that receives the most research is aflatoxin B1, and its limit of detection is the lowest. Gold-nanoparticle-based immunochromatographic test strips have the lowest limit of detection. Five papers involve smartphone detection, and they all detect aflatoxin B1 with gold nanoparticles. In these papers, quantitative concentration results can be obtained when the user uploads a photograph of the test lines using the smartphone application.

Keywords: aflatoxin B1, limit of detection, gold nanoparticle, lateral flow immunochromatographic strips, mycotoxins

Procedia PDF Downloads 165
8993 Paper-Based Detection Using Synthetic Gene Circuits

Authors: Vanessa Funk, Steven Blum, Stephanie Cole, Jorge Maciel, Matthew Lux

Abstract:

Paper-based synthetic gene circuits offer a new paradigm for programmable, fieldable biodetection. We demonstrate that by freeze-drying gene circuits with in vitro expression machinery, we can use complementary RNA sequences to trigger colorimetric changes upon rehydration. We have successfully utilized both green fluorescent protein and luciferase-based reporters for easy visualization in solution. Through several efforts, we aim to use this new platform technology to address a variety of needs in portable detection by demonstrating several more expression and reporter systems for detection functions on paper. In addition to RNA-based biodetection, we are exploring the use of various mechanisms that cells use to respond to environmental conditions in order to move towards all-hazards detection. Examples include explosives, heavy metals for water quality, and toxic chemicals.

Keywords: cell-free lysates, detection, gene circuits, in vitro

Procedia PDF Downloads 365
8992 A Highly Sensitive Dip Strip for Detection of Phosphate in Water

Authors: Hojat Heidari-Bafroui, Amer Charbaji, Constantine Anagnostopoulos, Mohammad Faghri

Abstract:

Phosphorus is an essential nutrient for plant life and is most frequently found as phosphate in water. Once phosphate is found in abundance in surface water, a series of adverse effects on an ecosystem can be initiated. Therefore, a portable and reliable method is needed to monitor phosphate concentrations in the field. In this paper, an inexpensive dip strip device with the ascorbic acid/antimony reagent dried on blotting paper, along with wet chemistry, is developed for the detection of low concentrations of phosphate in water. Ammonium molybdate and sulfuric acid are stored separately in liquid form so as to significantly improve the lifetime of the device and enhance the reproducibility of its performance. The limits of detection and quantification for the optimized device are 0.134 ppm and 0.472 ppm of phosphate in water, respectively. The device's shelf life, storage conditions, and limit of detection are superior to what has previously been reported for paper-based phosphate detection devices.
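
The abstract does not state how the limits were computed, but the common calibration-curve definitions (LOD = 3.3σ/slope, LOQ = 10σ/slope) illustrate the idea; a minimal sketch with placeholder calibration data:

```python
import numpy as np

# Calibration data: phosphate standards (ppm) vs. measured colour intensity.
# Values are illustrative placeholders, not the authors' calibration data.
conc = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0])
signal = np.array([0.02, 0.11, 0.20, 0.41, 0.79, 1.58])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)            # standard error of the regression

# Common calibration-curve definitions of detection and quantification limits.
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.3f} ppm, LOQ = {loq:.3f} ppm")
```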

Keywords: phosphate detection, paper-based device, molybdenum blue method, colorimetric assay

Procedia PDF Downloads 140
8991 Adaptive Nonparametric Approach for Guaranteed Real-Time Detection of Targeted Signals in Multichannel Monitoring Systems

Authors: Andrey V. Timofeev

Abstract:

An adaptive nonparametric method is proposed for stable real-time detection of seismoacoustic sources in multichannel C-OTDR systems with a significant number of channels. This method guarantees given upper bounds on the probabilities of Type I and Type II errors. The properties of the proposed method are rigorously proved. The results of practical applications of the proposed method in a real C-OTDR system are presented in this report.

Keywords: guaranteed detection, multichannel monitoring systems, change point, interval estimation, adaptive detection

Procedia PDF Downloads 420
8990 Intrusion Detection Using Dual Artificial Techniques

Authors: Rana I. Abdulghani, Amera I. Melhum

Abstract:

The rapid growth in the use of computers over networks, together with the view shared by most computer security experts that the goal of building a fully secure system is never effectively achieved, has led to the design of intrusion detection systems (IDS). This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of swarm intelligence; here, the algorithm is enhanced to minimise the error rate by amending the cluster centres whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second technique uses a back-propagation neural network (BP NN). Finally, the results of the two methods are compared on the NSL-KDD data sets used for the construction and evaluation of intrusion detection systems. This research is only interested in clustering the connection records into two categories (Normal and Abnormal). Practical experiments yield an intrusion detection rate of 99.183818% for the enhanced PSO and 69.446416% for the BP neural network.
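
A minimal sketch of PSO-driven adjustment of cluster centres of the kind described, using NumPy and toy two-dimensional records instead of NSL-KDD features; the swarm parameters are generic choices, not the enhanced algorithm of the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 2-D connection-record features for two classes (placeholders, not NSL-KDD).
data = np.vstack([rng.normal(0, 1, (150, 2)), rng.normal(5, 1, (150, 2))])

def fitness(centres):
    """Clustering error: mean distance of each record to its nearest centre."""
    d = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
    return d.min(axis=1).mean()

n_particles, n_clusters, dim = 20, 2, 2
pos = rng.uniform(data.min(), data.max(), (n_particles, n_clusters, dim))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration coefficients
for _ in range(100):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    fit = np.array([fitness(p) for p in pos])
    improved = fit < pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmin()].copy()

print("best cluster centres:\n", gbest)
print("final clustering error:", round(fitness(gbest), 4))
```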

Keywords: IDS, SI, BP, NSL_KDD, PSO

Procedia PDF Downloads 357
8989 Fingerprint on Ballistic after Shooting

Authors: Narong Kulnides

Abstract:

This research involved fingerprints on ballistics after shooting. The two objectives of the research were as follows: (1) to study the duration of the existence of latent fingerprints on .38, .45, 9 mm and .223 cartridge cases after shooting, and (2) to compare the effectiveness of the detection of latent fingerprints by Black Powder, Super Glue, Perma Blue and Gun Bluing. The appearance of latent fingerprints was studied on .38, .45, 9 mm and .223 cartridge cases before and after shooting with Black Powder, Super Glue, Perma Blue and Gun Bluing. The detection times were 3 minutes and 6, 12, 18, 24, 30, 36, 42, 48, 54, 60, 66, 72, 78 and 84 hours, respectively. As a result of the study, it can be concluded that: (1) before shooting, the detection of latent fingerprints on .38, .45, 9 mm and .223 cartridge cases with Black Powder, Super Glue, Perma Blue and Gun Bluing revealed fingerprints at all detection times; (2) after shooting, the detection of latent fingerprints on .38, .45, 9 mm and .223 cartridge cases with Black Powder and Super Glue did not reveal any fingerprints, whereas detection with Perma Blue and Gun Bluing was successful 100% of the time on .38, .45 and 9 mm cartridge cases and 40% and 46.67% of the time, respectively, on .223 cartridge cases.

Keywords: ballistic, fingerprint, shooting, detection times

Procedia PDF Downloads 397
8988 The Application of Hellomac Rockfall Alert System in Rockfall Barriers: An Explainer

Authors: Kinjal Parmar, Matteo Lelli

Abstract:

The use of IoT technology as a rockfall alert system is relatively new. This paper explains the potential of such an alert system, called HelloMac, from Maccaferri, which provides transportation infrastructure asset owners with a way to effectively utilize their resources in the detection of boulder impacts on rockfall barriers. This ensures a faster assessment of the impacted barrier and subsequently facilitates the implementation of remedial works in an effective and timely manner. In addition, HelloMac can also be integrated with other warning systems to alert vehicle users of unseen dangers ahead. HelloMac is designed to work even in remote areas where cell coverage is not available. Users are notified via mobile app, SMS and email when a rockfall event occurs. Used effectively, such alert systems can reduce the risk of rockfall hazards.

Keywords: rockfall, barrier, HelloMac, rockfall alert system

Procedia PDF Downloads 19
8987 Using Information Theory to Observe Natural Intelligence and Artificial Intelligence

Authors: Lipeng Zhang, Limei Li, Yanming Pearl Zhang

Abstract:

This paper takes a philosophical view as its axiom and examines the relationship between information theory and Natural Intelligence and Artificial Intelligence under real-world conditions. It also derives the relationship between natural intelligence and nature. According to the communication principle of information theory, Natural Intelligence can be divided into a real part and a virtual part. Based on the information-theoretic principle that information does not increase, a restriction mechanism for the creativity of Natural Intelligence is derived. This restriction mechanism reveals the limits of natural intelligence and artificial intelligence. The paper provides a new angle from which to observe natural intelligence and artificial intelligence.

Keywords: natural intelligence, artificial intelligence, creativity, information theory, restriction of creativity

Procedia PDF Downloads 345
8986 Post-Earthquake Road Damage Detection by SVM Classification from Quickbird Satellite Images

Authors: Moein Izadi, Ali Mohammadzadeh

Abstract:

Detection of damaged parts of roads after an earthquake is essential for coordinating rescuers. In this study, an approach is presented for the semi-automatic detection of damaged roads in a city using pre-event vector maps and both pre- and post-earthquake QuickBird satellite images. Damage is defined in this study as the debris of damaged buildings adjacent to the roads. Spectral and texture features are considered in the SVM classification step to detect damage. Finally, the proposed method is tested on QuickBird pan-sharpened images from the Bam city earthquake, and the results show that an overall accuracy of 81% and a kappa coefficient of 0.71 are achieved for the damage detection. The obtained results indicate the efficiency and accuracy of the proposed approach.
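
A minimal sketch of the SVM classification step, assuming scikit-learn and synthetic spectral/texture feature vectors in place of the QuickBird-derived features, and reporting the same overall accuracy and kappa metrics used in the paper:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Feature vectors per road segment: e.g. mean spectral bands plus texture
# statistics (placeholders; the paper's exact features are not listed).
rng = np.random.default_rng(4)
X_intact = rng.normal(loc=0.3, scale=0.10, size=(200, 6))
X_damaged = rng.normal(loc=0.5, scale=0.15, size=(200, 6))
X = np.vstack([X_intact, X_damaged])
y = np.array([0] * 200 + [1] * 200)           # 0 = intact, 1 = debris/damaged

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("overall accuracy:", round(accuracy_score(y_te, pred), 3))
print("kappa coefficient:", round(cohen_kappa_score(y_te, pred), 3))
```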

Keywords: SVM classifier, disaster management, road damage detection, quickBird images

Procedia PDF Downloads 593
8985 The Effects of Extreme Precipitation Events on Ecosystem Services

Authors: Szu-Hua Wang, Yi-Wen Chen

Abstract:

Urban ecosystems are complex coupled human-environment systems. They contain abundant natural resources for producing natural assets and attract urban assets that consume natural resources for urban development. Urban ecosystems provide several ecosystem services, including provisioning services, regulating services, cultural services, and supporting services. Rapid global climate change exposes urban ecosystems and their ecosystem services to various natural disasters. Many natural disasters have occurred around the world in the past two decades amid constant changes in the frequency and intensity of extreme weather events. In Taiwan, hydrological disasters have received more attention due to the potentially high sensitivity of Taiwan's cities to climate change and its impacts. However, climate change not only causes extreme weather events directly but also indirectly affects the interactions among humans, ecosystem services and their dynamic feedback processes. Therefore, this study adopts a systematic method, solar energy synthesis, based on the concept of eco-energy analysis. The Taipei area, the most densely populated area in Taiwan, is selected as the study area. The ecosystem services in 2015 and during Typhoon Soudelor have been compared in order to investigate the impacts of extreme precipitation events on ecosystem services. The results show that forest areas are generally the largest contributors of energy to ecosystem services in the Taipei area. Different soil textures in different subsystems have different upper limits of water content or substances. The major contribution to the ecosystem services of the study area is the natural hazard regulation provided by the surface water resource areas. During Typhoon Soudelor, the freshwater supply in the forest areas became the main contribution. Erosion control was the ecosystem service most affected by Typhoon Soudelor, followed by hydrologic regulation and food supply. Due to the interactions among ecosystem services, freshwater supply, water purification, and waste treatment were also severely affected.

Keywords: ecosystem, extreme precipitation events, ecosystem services, solar energy synthesis

Procedia PDF Downloads 118
8984 Anthraquinone Labelled DNA for Direct Detection and Discrimination of Closely Related DNA Targets

Authors: Sarah A. Goodchild, Rachel Gao, Philip N. Bartlett

Abstract:

A novel detection approach using immobilized DNA probes labeled with anthraquinone (AQ) as an electrochemically active reporter moiety has been successfully developed as a new, simple, reliable method for the detection of DNA. This method represents a step forward in DNA detection as it can discriminate between multiple nucleotide polymorphisms within target DNA strands without the need for any additional reagents, reporters or processes such as melting of the DNA strands. The detection approach utilizes single-stranded DNA probes immobilized on gold surfaces and labeled at the distal terminus with AQ. The effectiveness of the immobilization has been monitored using techniques such as AC impedance and Raman spectroscopy. Simple voltammetry techniques (differential pulse voltammetry, cyclic voltammetry) are then used to monitor the reduction potential of the AQ before and after the addition of the complementary strand of target DNA. A reliable relationship between the shift in reduction potential and the number of base-pair mismatches has been established and can be used to discriminate between DNA from highly related pathogenic organisms of clinical importance. This indicates that the approach may have great potential to be exploited in biosensor kits for the detection and diagnosis of pathogenic organisms in point-of-care devices.

Keywords: Anthraquinone, discrimination, DNA detection, electrochemical biosensor

Procedia PDF Downloads 369
8983 Comparison of Methodologies to Compute the Probabilistic Seismic Hazard Involving Faults and Associated Uncertainties

Authors: Aude Gounelle, Gloria Senfaute, Ludivine Saint-Mard, Thomas Chartier

Abstract:

The long-term deformation rates of faults are not fully captured by Probabilistic Seismic Hazard Assessment (PSHA). PSHA that uses catalogues to develop area or smoothed-seismicity sources is limited by the data available to constrain future earthquake activity rates. The integration of faults in PSHA can at least partially address the long-term deformation. However, careful treatment of fault sources is required, particularly in low strain rate regions, where estimated seismic hazard levels are highly sensitive to assumptions concerning fault geometry, segmentation and slip rate. When integrating faults in PSHA, various constraints on earthquake rates from geologic and seismologic data have to be satisfied; for low strain rate regions where such data are scarce, this is especially challenging. Integrating faults in PSHA requires conversion of the geologic and seismologic data into fault geometries and slip rates, and then into earthquake activity rates. Several approaches exist for translating slip rates into earthquake activity rates. In the most frequently used approach, the background earthquakes are handled with a truncation: earthquakes with a magnitude lower than or equal to a threshold magnitude (Mw) occur in the background zone, with a rate defined by the earthquake catalogue, whereas magnitudes higher than the threshold are located on the fault, with a rate defined using the average slip rate of the fault. As highlighted by several studies, seismic events with magnitudes stronger than the selected threshold may potentially occur in the background and not only on the fault, especially in regions of slow tectonic deformation. It is also known that several sections of a fault, or several faults, could rupture during a single fault-to-fault rupture. It is therefore essential to apply a consistent modelling procedure that allows a large set of possible fault-to-fault ruptures to occur randomly in the hazard model while reflecting the individual slip rate of each section of the fault. In 2019, a tool named SHERIFS (Seismic Hazard and Earthquake Rates in Fault Systems) was published. The tool uses a methodology to calculate the earthquake rates in a fault system in which the slip-rate budget of each fault is converted into rupture rates for all possible single-fault and fault-to-fault ruptures. The objective of this paper is to compare the SHERIFS method with another frequently used model to analyse the impact on the seismic hazard and, through sensitivity studies, to better understand the influence of key parameters and assumptions. For this application, a simplified but realistic case study was selected, located in an area of moderate to high seismicity (south-east of France) where the fault is assumed to have a low strain rate.
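
As a rough illustration of the slip-rate-to-activity-rate conversion that such fault-based methods perform (this is a generic moment-balancing sketch, not the SHERIFS algorithm), with hypothetical fault parameters:

```python
import numpy as np

# Hypothetical fault parameters (illustrative, not the case-study values).
mu = 3.0e10            # crustal shear modulus, Pa
length_m = 40e3        # fault length, m
width_m = 15e3         # fault width (down-dip), m
slip_rate = 1.0e-3     # long-term slip rate, m/yr (1 mm/yr: a low strain region)
m_char = 6.5           # characteristic magnitude assigned to the fault

# Annual seismic moment budget accumulated by the fault (N*m per year).
moment_rate = mu * length_m * width_m * slip_rate

# Seismic moment released by one characteristic event (Hanks-Kanamori relation).
m0_char = 10 ** (1.5 * m_char + 9.05)

# If the whole budget is spent on characteristic events, their annual rate is:
annual_rate = moment_rate / m0_char
print(f"annual rate of M{m_char} events: {annual_rate:.2e} "
      f"(return period ~ {1 / annual_rate:,.0f} years)")
```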

Keywords: deformation rates, faults, probabilistic seismic hazard, PSHA

Procedia PDF Downloads 25
8982 GIS and Remote Sensing Approach in Earthquake Hazard Assessment and Monitoring: A Case Study in the Momase Region of Papua New Guinea

Authors: Tingneyuc Sekac, Sujoy Kumar Jana, Indrajit Pal, Dilip Kumar Pal

Abstract:

Tectonism-induced tsunami, landslides, ground shaking leading to liquefaction, infrastructure collapse and conflagration are common earthquake hazards experienced worldwide. Apart from human casualties, damage to built infrastructure such as roads, bridges, buildings and other property is a collateral consequence. Appropriate planning, based on proper evaluation and assessment of the potential level of earthquake hazard at a site, must come first in order to safeguard people's welfare, infrastructure and other property. The resulting information can assist in minimizing earthquake risk and can foster appropriate construction design and the formulation of building codes at a particular site. Different disciplines adopt different approaches to assessing and monitoring earthquake hazard throughout the world. For the present study, the potential of GIS and remote sensing was utilized to evaluate and assess the earthquake hazard of the study region. Subsurface geology and geomorphology were assessed and integrated within a GIS environment, coupled with seismicity data layers such as peak ground acceleration (PGA), historical earthquake magnitude and earthquake depth, to prepare liquefaction potential zones (LPZ) culminating in an earthquake hazard zonation of the study sites. Liquefaction can occur in the aftermath of severe ground shaking where the site soil conditions, geology and geomorphology are conducive. These site conditions, the wave propagation media, were assessed to identify the potential zones. The precept is that during any earthquake event a seismic wave is generated and propagates from the earthquake focus to the surface; as it propagates, it passes through geological, geomorphological and soil features which, according to their strength, stiffness and moisture content, amplify or attenuate the wave. Accordingly, the resulting intensity of shaking may or may not culminate in the collapse of built infrastructure. For the earthquake hazard zonation, the overall assessment was carried out by integrating the seismicity data layers with the LPZ. Multi-criteria evaluation (MCE) with Saaty's Analytical Hierarchy Process (AHP) was adopted for this study. This GIS technique involves the integration of several factors (thematic layers) that can potentially contribute to earthquake-triggered liquefaction. The factors are weighted and ranked in order of their contribution to earthquake-induced liquefaction, and the weights and rankings assigned to each factor are normalized with the AHP technique. The spatial analysis tools of ArcGIS 10 (raster calculator, reclassify, overlay analysis) were mainly employed in the study. The final LPZ and earthquake hazard zone outputs were reclassified into 'Very High', 'High', 'Moderate', 'Low' and 'Very Low' to indicate the level of hazard within the study region.
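
A minimal sketch of the AHP weighting step, computing normalized weights from a hypothetical Saaty pairwise-comparison matrix for four thematic layers and checking consistency; the judgements are illustrative, not those of the study:

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for four thematic layers
# (geology, geomorphology, PGA, earthquake depth).
A = np.array([
    [1.0, 3.0, 2.0, 5.0],
    [1/3, 1.0, 1/2, 3.0],
    [1/2, 2.0, 1.0, 4.0],
    [1/5, 1/3, 1/4, 1.0],
])

# Priority weights = normalised principal eigenvector of the matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check (Saaty): a consistency ratio below 0.1 is usually acceptable.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
ri = 0.90                                  # Saaty's random index for n = 4
print("weights:", np.round(weights, 3))
print("consistency ratio:", round(ci / ri, 3))
```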

Keywords: hazard micro-zonation, liquefaction, multi criteria evaluation, tectonism

Procedia PDF Downloads 236
8981 Natural Gas Production Forecasts Using Diffusion Models

Authors: Md. Abud Darda

Abstract:

Different options for natural gas production in wide geographic areas may be described through diffusion-of-innovation models. This type of modeling approach provides an indirect estimate of the ultimately recoverable resource (URR), captures the quantitative effects of observed strategic interventions, and allows ex-ante assessments of future scenarios over time. In order to ensure a sustainable energy policy, it is important to forecast the availability of this natural resource. Considering a finite life cycle, in this paper we investigate the natural gas production of Myanmar and Algeria, two important natural gas providers in the world energy market. A number of homogeneous and heterogeneous diffusion models, with convenient extensions, have been used. Model validation has also been performed in terms of prediction capability.
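
A minimal sketch of fitting a simple (logistic) diffusion curve to cumulative production, assuming SciPy and synthetic data in place of the Myanmar and Algeria series; the fitted asymptote plays the role of the URR:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_cumulative(t, urr, k, t0):
    """Logistic diffusion curve for cumulative production; `urr` is the
    asymptote, i.e. the ultimately recoverable resource."""
    return urr / (1.0 + np.exp(-k * (t - t0)))

# Illustrative yearly cumulative production series (arbitrary units),
# not actual Myanmar or Algeria data.
years = np.arange(1990, 2020)
true = logistic_cumulative(years, urr=500.0, k=0.25, t0=2008)
observed = true + np.random.default_rng(5).normal(scale=5.0, size=years.size)

params, _ = curve_fit(logistic_cumulative, years, observed,
                      p0=(600.0, 0.1, 2005.0))
urr_hat, k_hat, t0_hat = params
print(f"estimated URR = {urr_hat:.0f}, growth rate = {k_hat:.3f}, "
      f"inflection year = {t0_hat:.1f}")
```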

Keywords: diffusion models, energy forecast, natural gas, nonlinear production

Procedia PDF Downloads 201
8980 A CFD Analysis of Flow through a High-Pressure Natural Gas Pipeline with an Undeformed and Deformed Orifice Plate

Authors: R. Kiš, M. Malcho, M. Janovcová

Abstract:

This work presents a numerical analysis, using CFD methods, of natural gas flowing through a high-pressure pipeline and an orifice plate. The paper contains CFD calculations for the flow of natural gas in a pipe with two different orifice plate geometries: one with the standard geometry and an undeformed shape, and the other deformed by the action of the pressure differential. It shows the behaviour of the natural gas in the pipeline through the velocity profiles and pressure fields of the gas in both models and their differences. The research is motivated by the need to eliminate inaccuracies that can appear in natural gas flow measurements in high-pressure pipelines of the gas industry and that are not covered by the relevant standard.
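
For reference, the standard orifice-meter relation that such flow measurements rely on can be sketched as follows; the discharge coefficient, expansibility factor and gas properties below are generic assumptions, not outputs of the paper's CFD model:

```python
import math

def orifice_mass_flow(dp, rho, d_orifice, d_pipe, c=0.61, eps=1.0):
    """Mass flow (kg/s) through an orifice plate from the standard
    orifice-meter relation:
        qm = C / sqrt(1 - beta^4) * eps * (pi/4) * d^2 * sqrt(2 * dp * rho)
    Coefficient values are generic assumptions, not results from the paper."""
    beta = d_orifice / d_pipe
    return (c / math.sqrt(1.0 - beta ** 4)) * eps * (math.pi / 4.0) \
        * d_orifice ** 2 * math.sqrt(2.0 * dp * rho)

# Example: 50 kPa differential, natural gas density ~45 kg/m3 at high pressure,
# 100 mm orifice in a 200 mm pipe (all values illustrative).
print(f"{orifice_mass_flow(dp=50e3, rho=45.0, d_orifice=0.10, d_pipe=0.20):.2f} kg/s")
```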

Keywords: orifice plate, high-pressure pipeline, natural gas, CFD analysis

Procedia PDF Downloads 357
8979 Detection of New Attacks on Ubiquitous Services in Cloud Computing and Countermeasures

Authors: L. Sellami, D. Idoughi, P. F. Tiako

Abstract:

Cloud computing provides infrastructure to the enterprise through the Internet, allowing access to cloud services at any time and anywhere. This pervasive aspect of the services, the distributed nature of the data and the wide use of information make cloud computing vulnerable to intrusions that violate the security of the cloud. This requires the use of security mechanisms, such as intrusion detection systems (IDS), to detect malicious behavior in network communications and hosts. In this article, we focus on the detection of intrusions into the cloud using IDSs. Our approach is based on client authentication in the computing cloud. This technique makes it possible to detect abnormal use of ubiquitous services and prevents intrusion into the cloud. It is an approach based on client authentication data. Our IDS provides intrusion detection both inside and outside the cloud computing network. It is a double protection approach: security at the user node and global security of the cloud.

Keywords: cloud computing, intrusion detection system, privacy, trust

Procedia PDF Downloads 282
8978 Seismotectonics of Southern Haiti: A Faulting Model for the 12 January 2010 M7 Earthquake

Authors: Newdeskarl Saint Fleur, Nathalie Feuillet, Raphaël Grandin, Éric Jacques, Jennifer Weil-Accardo, Yann Klinger

Abstract:

The prevailing consensus is that the 2010 Mw7.0 Haiti earthquake left the Enriquillo–Plantain Garden strike-slip Fault (EPGF) unruptured but broke unmapped blind north-dipping thrusts. Using high-resolution topography, aerial images, bathymetry and geology, we identified previously unrecognized south-dipping NW-SE-striking active thrusts in southern Haiti. One of them, the Lamentin thrust (LT), cuts across the crowded city of Carrefour, extends offshore into Port-au-Prince Bay and connects at depth with the EPGF. We propose that both faults broke in 2010. The rupture likely initiated on the thrust and propagated further along the EPGF due to unclamping. This scenario is consistent with geodetic, seismological and field data. The 2010 earthquake increased the stress toward failure on the unruptured segments of the EPGF and on neighboring thrusts, significantly increasing the seismic hazard in the Port-au-Prince urban area. The numerous active thrusts recognized in this area must be considered in future evaluations of the seismic hazard.

Keywords: active faulting, enriquillo-plantain garden fault, Haiti earthquake, seismic hazard

Procedia PDF Downloads 1206
8977 Process Safety Evaluation of a Nuclear Power Plant through Virtual Process Hazard Analysis (PHA) using the What-If Technique

Authors: Lormaine Anne Branzuela, Elysa Largo, Julie Marisol Pagalilauan, Neil Concibido, Monet Concepcion Detras

Abstract:

Energy is a necessity both for the people and for the country. The demand for energy is continually increasing, but the supply is not keeping pace. The reopening of the Bataan Nuclear Power Plant (BNPP) in the Philippines has been circulating in the media recently. The general public has been hesitant to accept the inclusion of nuclear energy in the Philippine energy mix due to the perceived unsafe conditions of the plant. This study evaluated the possible operations of a nuclear power plant of the same type as the BNPP, considering the safety of the workers, the public, and the environment, using a Process Hazard Analysis (PHA) method. The What-If technique was utilized to identify the hazards and consequences of the plant's operations, together with the level of risk they entail. Through the brainstorming sessions of the PHA team, it was found that the most critical system in the plant is the primary system. Possible leakage from pipes and equipment due to weakened seals and welds, and blockage of the coolant path due to fouling, were the most common scenarios identified; these lead to the most critical scenarios of radioactive leakage through sump contamination, nuclear meltdown, and equipment damage and explosion, which could result in multiple injuries and fatalities as well as environmental impacts.

Keywords: process safety management, process hazard analysis, what-If technique, nuclear power plant

Procedia PDF Downloads 178