Search results for: limit of detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4795

3715 A Simple Approach to Reliability Assessment of Structures via Anomaly Detection

Authors: Rims Janeliukstis, Deniss Mironovs, Andrejs Kovalovs

Abstract:

Operational Modal Analysis (OMA) is widely applied as a method for Structural Health Monitoring, identifying and assessing structural damage by tracking changes in the identified modal parameters over time. Unfortunately, modal parameters also depend on external factors such as temperature and loads. Any structural condition assessment using modal parameters should take these external factors into consideration; otherwise, there is a high chance of false positives. A method of structural reliability assessment based on an anomaly detection technique called Mahalanobis Squared Distance (MSD) is proposed. It requires a set of reference conditions to learn the healthy state of a structure, against which all future parameters are compared. In this study, structural modal parameters (natural frequency and mode shape), as well as ambient temperature and loads acting on the structure, are used as features. Numerical tests were performed on a finite element model of a carbon fibre reinforced polymer composite beam with delamination damage at various locations and of various severities. The advantages of the demonstrated approach include relatively few computational steps, the ability to distinguish between healthy and damaged conditions, and the ability to discriminate between different damage severities. It is anticipated to be promising for reliability assessment of mass-produced structural parts.
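The MSD computation itself is compact; the following is a minimal sketch, assuming feature vectors that stack natural frequencies, mode-shape metrics, temperature and load (function names and the threshold rule are illustrative, not from the paper):

```python
import numpy as np

def fit_reference(X_ref):
    """Learn the healthy state from reference observations (rows =
    observations, columns = features such as natural frequencies,
    mode-shape metrics, temperature, load)."""
    mu = X_ref.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X_ref, rowvar=False))
    return mu, cov_inv

def msd(x, mu, cov_inv):
    """Mahalanobis squared distance of a new feature vector."""
    d = x - mu
    return float(d @ cov_inv @ d)

# Flag an observation as anomalous (possibly damaged) when its MSD
# exceeds a threshold learned from the reference set, e.g. its
# 99th percentile.
```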

Keywords: operational modal analysis, reliability assessment, anomaly detection, damage, mahalanobis squared distance

Procedia PDF Downloads 114
3714 Self-Directed-Car on GT Road: Grand Trunk Road

Authors: Rameez Ahmad, Aqib Mehmood, Imran Khan

Abstract:

A self-directed car (SDC) is a car that can drive itself from one point to another without support from a driver. Many believe that self-directed cars have the potential to transform the transportation industry while essentially eliminating accidents and cleaning up the environment. This study examines the effects that SDC (also called self-driving, driverless, or robotic) vehicles are likely to have on travel demand and ride schemes. The approach combines a vision-based hardware and software architecture with cost-benefit considerations of the vehicle technologies, GOLD (Generic Obstacle and Lane Detection), and a knowledge-based system that predicts obstacles by considering their shape, color, or balance, operating in an organized environment with colored lane patterns and lane position estimation. The aim is to discover the problematic consequences of the SDC on the GT (Grand Trunk) road and to make the car more effective.

Keywords: SDC, gold, GT, knowledge-based system

Procedia PDF Downloads 370
3713 Combining Shallow and Deep Unsupervised Machine Learning Techniques to Detect Bad Actors in Complex Datasets

Authors: Jun Ming Moey, Zhiyaun Chen, David Nicholson

Abstract:

Bad actors are often hard to detect in data that imprints their behaviour patterns because they are comparatively rare events embedded in non-bad actor data. An unsupervised machine learning framework is applied here to detect bad actors in financial crime datasets that record millions of transactions undertaken by hundreds of actors (<0.01% bad). Specifically, the framework combines ‘shallow’ (PCA, Isolation Forest) and ‘deep’ (Autoencoder) methods to detect outlier patterns. Detection performance analysis for both the individual methods and their combination is reported.
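As a rough illustration of the 'shallow' half of such a framework (the paper's exact features, autoencoder architecture and score-fusion rule are not given in the abstract, so the rank-averaging below is an assumption), PCA reconstruction error and Isolation Forest scores can be combined as follows:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.preprocessing import StandardScaler

def shallow_outlier_scores(X):
    """Combine PCA reconstruction error and Isolation Forest scores
    by rank averaging; higher score = more anomalous actor."""
    X = StandardScaler().fit_transform(X)
    # PCA reconstruction error as an outlier score
    pca = PCA(n_components=0.95).fit(X)
    recon = pca.inverse_transform(pca.transform(X))
    pca_score = np.linalg.norm(X - recon, axis=1)
    # Isolation Forest anomaly score (negated so higher = more anomalous)
    iso = IsolationForest(random_state=0).fit(X)
    iso_score = -iso.score_samples(X)
    # Rank-average so the two scores are on comparable scales
    ranks = pca_score.argsort().argsort() + iso_score.argsort().argsort()
    return ranks / ranks.max()
```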

Keywords: detection, machine learning, deep learning, unsupervised, outlier analysis, data science, fraud, financial crime

Procedia PDF Downloads 94
3712 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection, and feature extraction, with general appliance modeling and identification at the final stage. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the area of operation of each residential appliance based on its power demand, and then detecting the times at which each selected appliance changes its state. In order to fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical software package that uses behaviour simulation of the people inside a house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the operation of the selected appliance falls, along with a time vector for the values delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical and statistical features. Afterwards, these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
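For reference, a minimal DTW distance of the kind used in the identification step might look as follows (the wattage values in the example are illustrative; the actual signatures are those described above):

```python
import numpy as np

def dtw(a, b):
    """Dynamic Time Warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Example: compare a measured power profile to a model signature
# (values in watts are illustrative).
print(dtw([0, 120, 118, 0], [0, 115, 117, 119, 0]))
```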

Keywords: general appliance model, non intrusive load monitoring, events detection, unsupervised techniques

Procedia PDF Downloads 82
3711 Invasion of Pectinatella magnifica in Freshwater Resources of the Czech Republic

Authors: J. Pazourek, K. Šmejkal, P. Kollár, J. Rajchard, J. Šinko, Z. Balounová, E. Vlková, H. Salmonová

Abstract:

Pectinatella magnifica (Leidy, 1851) is an invasive freshwater animal that lives in colonies. A colony of Pectinatella magnifica (a gelatinous blob) can be up to several feet in diameter, and under favorable conditions it exhibits an extreme growth rate. Recently, European countries along the Elbe, Oder, Danube, Rhine and Vltava rivers have confirmed the invasion of Pectinatella magnifica, including freshwater reservoirs in South Bohemia (Czech Republic). Our project (Czech Science Foundation, GAČR P503/12/0337) focuses on the biology and chemistry of Pectinatella magnifica. We have monitored the organism's occurrence in selected South Bohemian ponds and sandpits over the last years, collecting information about the physical properties of the surrounding water and sampling the colonies for various analyses (classification, maps of secondary metabolites, toxicity tests). Because the gelatinous matrix also hosts algae, bacteria and cyanobacteria (co-habitants) during the colony's lifetime, in this contribution we also applied a high performance liquid chromatography (HPLC) method for the determination of potentially present cyanobacterial toxins (microcystin-LR, microcystin-RR, nodularin). Results from the last three years of monitoring show that these toxins are below the limit of detection (LOD), so they do not yet represent a danger. The final goal of our study is to assess the toxicity risks related to freshwater resources invaded by Pectinatella magnifica and to understand the process of invasion, which may enable its control.

Keywords: cyanobacteria, fresh water resources, Pectinatella magnifica invasion, toxicity monitoring

Procedia PDF Downloads 239
3710 Residual Evaluation by Thresholding and Neuro-Fuzzy System: Application to Actuator

Authors: Y. Kourd, D. Lefebvre, N. Guersi

Abstract:

The monitoring of industrial processes is required to ensure the operating conditions of industrial systems through automatic detection and isolation of faults. In this paper we propose a fault diagnosis method based on a neuro-fuzzy technique and the choice of a threshold, validated on the DAMADICS electro-pneumatic actuator benchmark test bed. In the first phase of the method, we construct a model that represents the normal state of the system for fault detection. Residuals are generated and analysed, and thresholds are chosen to build a signature table; these signatures reveal groups of non-detectable faults. In the second phase, we build faulty models to locate the faults in the system that were not identified in the first phase.
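A minimal sketch of the residual evaluation step, assuming residuals computed as the difference between measured and model outputs (the numeric values and thresholds are illustrative, not benchmark data):

```python
import numpy as np

def evaluate_residuals(y_measured, y_model, thresholds):
    """Build a binary fault signature: 1 where a residual exceeds
    its threshold, 0 otherwise."""
    residuals = np.abs(np.asarray(y_measured) - np.asarray(y_model))
    return (residuals > np.asarray(thresholds)).astype(int)

# Example: three residuals against thresholds chosen from the
# fault-free behaviour of the normal-state model.
signature = evaluate_residuals([0.02, 0.31, 0.05], [0.0, 0.0, 0.0],
                               [0.1, 0.1, 0.1])
print(signature)  # -> [0 1 0]
```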

Keywords: residuals analysis, threshold, neuro-fuzzy system, residual evaluation

Procedia PDF Downloads 446
3709 Refractometric Optical Sensing by Using Photonics Mach–Zehnder Interferometer

Authors: Gong Zhang, Hong Cai, Bin Dong, Jifang Tao, Aiqun Liu, Dim-Lee Kwong, Yuandong Gu

Abstract:

An on-chip refractive index sensor with high sensitivity and a large measurement range is demonstrated in this paper. The sensing structure is based on a Mach-Zehnder interferometer configuration, built on an SOI substrate. The wavelength sensitivity of the sensor is estimated to be 3129 nm/RIU. Meanwhile, based on changes in the interference pattern period, the measured period sensitivities are 2.9 nm/RIU (TE mode) and 4.21 nm/RIU (TM mode), respectively. As such, the wavelength shift and the period shift can be used for fine index change detection and larger index change detection, respectively. The sensor design therefore provides an approach for measuring large index changes with high sensitivity.
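As a back-of-the-envelope check of what the reported wavelength sensitivity implies, assuming a 10 pm interrogator wavelength resolution (a figure not given in the abstract):

```python
# Smallest detectable index change = wavelength resolution / sensitivity
sensitivity_nm_per_riu = 3129.0   # reported wavelength sensitivity
resolution_nm = 0.01              # assumed 10 pm interrogator resolution
delta_n = resolution_nm / sensitivity_nm_per_riu
print(f"detection limit ~ {delta_n:.1e} RIU")  # ~3.2e-06 RIU
```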

Keywords: Mach-Zehnder interferometer, nanotechnology, refractive index sensing, sensors

Procedia PDF Downloads 445
3708 Application of Remote Sensing and GIS in Assessing Land Cover Changes within Granite Quarries around Brits Area, South Africa

Authors: Refilwe Moeletsi

Abstract:

Dimension stone quarrying around the Brits and Belfast areas started in the early 1930s and has been growing rapidly since then. The environmental impacts associated with these quarries have not been documented, hence this study aims at detecting any change in the environment that might have been caused by these activities. Landsat images were used to assess land use/land cover changes in the Brits quarries from 1998 to 2015. A supervised classification using the maximum likelihood classifier was applied to classify each image into different land use/land cover types. Classification accuracy was assessed using Google Earth™ as a source of reference data. A post-classification change detection method was used to determine changes. The results revealed a significant increase in granite quarries and a corresponding decrease in vegetation cover within the study region.

Keywords: remote sensing, GIS, change detection, granite quarries

Procedia PDF Downloads 314
3707 Scour Damage Detection of Bridge Piers Using Vibration Analysis - Numerical Study of a Bridge

Authors: Solaine Hachem, Frédéric Bourquin, Dominique Siegert

Abstract:

The brutal collapse of bridges is mainly due to scour. Indeed, soil erosion in the riverbed around a pier modifies the embedding conditions of the structure, reduces its overall stiffness and threatens its stability. Hence, finding an efficient technique that allows early scour detection becomes mandatory. Vibration analysis is an indirect method for scour detection that relies on real-time monitoring of the bridge. It tends to indicate the presence of scour based on its consequences for the stability of the structure and its dynamic response. Most of the research in this field has focused on the dynamic behavior of a single pile and has examined the depth of the scour. In this paper, a bridge is fully modeled with all piles and spans, and the scour is represented by a reduction in the foundation stiffnesses. This work aims to identify the vibration modes sensitive to the loss of rigidity in the foundations so that their variations can be used as a scour indicator: the decrease in soil-structure interaction rigidity leads to a decrease in the natural frequencies. Using the first-order perturbation method, an expression of the sensitivity, which depends only on the selected vibration modes, is established to determine the deficiency of the foundation stiffnesses. The solutions are obtained using the singular value decomposition method for the regularization of the inverse problem. The propagation of uncertainties is also calculated to verify the efficiency of the inverse problem method. Numerical simulations describing different scour scenarios are investigated on a simplified model of a real composite steel-concrete bridge located in France. The results of the modal analysis show that the modes corresponding to in-plane and out-of-plane pier vibrations are sensitive to the loss of foundation stiffness, while the deck bending modes are not affected by this damage.
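The regularized inversion step can be sketched as follows: given a sensitivity matrix S (from the first-order perturbation of each selected mode) mapping foundation stiffness deficiencies to natural frequency shifts, a truncated SVD solve recovers the deficiencies. This is a generic sketch, not the bridge model itself:

```python
import numpy as np

def tsvd_solve(S, df, k):
    """Solve S @ dk = df for stiffness deficiencies dk, keeping only
    the k largest singular values to regularize the inverse problem."""
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    s_inv = np.where(np.arange(len(s)) < k, 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ df))

# df: measured shifts of the scour-sensitive pier modes;
# each row of S comes from the modal sensitivity expression.
```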

Keywords: bridge’s piers, inverse problems, modal sensitivity, scour detection, vibration analysis

Procedia PDF Downloads 104
3706 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks

Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang

Abstract:

Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models like Convolutional Neural Networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative Adversarial Networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have their limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. In order to overcome these challenges and advance the detection and classification of normal and abnormal CXR images, this study introduces a technique known as DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic chest X-ray (CXR) images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved an accuracy of 0.961, precision of 0.955, recall of 0.970, and F1-measure of 0.963. These results indicate the promising potential of this approach for the early detection of diseases in CXR images.

Keywords: CNN, classification, deep learning, GAN, Resnet50

Procedia PDF Downloads 88
3705 High-Performance Liquid Chromatographic Method with Diode Array Detection (HPLC-DAD) Analysis of Naproxen and Omeprazole Active Isomers

Authors: Marwa Ragab, Eman El-Kimary

Abstract:

Chiral separation and analysis of omeprazole and naproxen enantiomers in tablets were achieved using a high-performance liquid chromatographic method with diode array detection (HPLC-DAD). A Kromasil CelluCoat chiral column was used as the stationary phase, and the eluting solvent consisted of hexane, isopropanol and trifluoroacetic acid in a ratio of 90:9.9:0.1, respectively. The chromatographic system was suitable for the enantiomeric separation and analysis of the active isomers of the drugs. Resolution values of 2.17 and 3.84 were obtained after optimization of the chromatographic conditions for the omeprazole and naproxen isomers, respectively. The determination of the S-isomer of each drug in its dosage form was fully validated.

Keywords: chiral analysis, esomeprazole, S-Naproxen, HPLC-DAD

Procedia PDF Downloads 301
3704 A Combination of Anisotropic Diffusion and Sobel Operator to Enhance the Performance of the Morphological Component Analysis for Automatic Crack Detection

Authors: Ankur Dixit, Hiroaki Wagatsuma

Abstract:

Crack detection on concrete bridges is an important and constant task in civil engineering. Conventionally, humans inspect the bridge for cracks to maintain its quality and reliability, but this process is long and costly. To overcome these limitations, we have used a drone with a digital camera, which took images of the bridge deck; these images are processed by morphological component analysis (MCA). The MCA technique is a powerful application of sparse coding that explores the possibility of separating image components. In this paper, MCA has been used to decompose the image into coarse and fine components with the effectiveness of two dictionaries, namely anisotropic diffusion and the wavelet transform. Anisotropic diffusion is an adaptive smoothing process that adjusts the diffusion coefficient using gray level and gradient as features. The cracks in the image are enhanced by subtracting the diffused coarse image from the original image, and the results are treated by a Sobel edge detector and binary filtering to exhibit the cracks clearly. Our results demonstrate that the proposed MCA framework, using anisotropic diffusion followed by the Sobel operator and binary filtering, may contribute to the automation of crack detection even in severe open-field conditions such as bridge decks.
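A condensed sketch of this pipeline, using a Perona-Malik-style anisotropic diffusion followed by Sobel edge detection and binary thresholding (the iteration count, kappa, gamma and threshold values are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from scipy import ndimage

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, gamma=0.2):
    """Perona-Malik smoothing: diffuse weakly across strong gradients."""
    img = img.astype(float)
    for _ in range(n_iter):
        # differences to the 4 nearest neighbours
        grads = [np.roll(img, 1, ax) - img for ax in (0, 1)]
        grads += [np.roll(img, -1, ax) - img for ax in (0, 1)]
        img += gamma * sum(np.exp(-(g / kappa) ** 2) * g for g in grads)
    return img

def crack_map(img, thresh=50.0):
    coarse = anisotropic_diffusion(img)
    fine = img - coarse                      # enhance fine cracks
    sx = ndimage.sobel(fine, axis=0)
    sy = ndimage.sobel(fine, axis=1)
    return np.hypot(sx, sy) > thresh         # binary crack map
```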

Keywords: anisotropic diffusion, coarse component, fine component, MCA, Sobel edge detector and wavelet transform

Procedia PDF Downloads 173
3703 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks

Authors: Naghmeh Moradpoor Sheykhkanloo

Abstract:

Structured Query Language Injection (SQLI) attack is a code injection technique in which malicious SQL statements are inserted into a given SQL database by simply using a web browser. Losing data, disclosing confidential information or even changing the value of data are the severe damages that an SQLI attack can cause on a given database. The SQLI attack has also been rated as the number-one attack among the top ten web application threats by the Open Web Application Security Project (OWASP), an open community dedicated to enabling organisations to conceive, develop, acquire, operate, and maintain applications that can be trusted. In this paper, we propose an effective pattern recognition neural network model for the detection and classification of SQLI attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, to generate thousands of malicious and benign URLs; a URL classifier, to 1) classify each generated URL as either benign or malicious and 2) classify the malicious URLs into different SQLI attack categories; and an NN model, to 1) detect whether a given URL is malicious or benign and 2) identify the type of SQLI attack for each malicious URL. The model is first trained and then evaluated by employing thousands of benign and malicious URLs. The results of the experiments are presented in order to demonstrate the effectiveness of the proposed approach.
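A toy version of the URL classification element, assuming character n-gram features and a small multilayer perceptron (the paper's URL generator, feature set and exact architecture are not reproduced here; the sample URLs are hypothetical):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical training data: URLs labelled benign (0) or SQLI (1).
urls = ["/item.php?id=42", "/item.php?id=42 OR 1=1--",
        "/login?user=bob", "/login?user=' UNION SELECT passwd--"]
labels = [0, 1, 0, 1]

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
model.fit(urls, labels)
print(model.predict(["/item.php?id=7 OR 1=1--"]))  # should flag as SQLI
```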

Keywords: neural networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection

Procedia PDF Downloads 469
3702 Principal Component Analysis on Colon Cancer Detection

Authors: N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Rita Magdalena, R. D. Atmaja, Sofia Saidah, Ocky Tiaramukti

Abstract:

Colon cancer or colorectal cancer is a type of cancer that attacks the last part of the human digestive system. Lymphoma and carcinoma are types of cancer that attack the human colon. Colon cancer causes about half a million deaths every year. In Indonesia, colon cancer is the third most common cancer in women and the second in men. Unhealthy lifestyles, such as minimal consumption of fiber, rarely exercising and a lack of awareness of early detection, are factors behind the high number of colon cancer cases. The aim of this project is to produce a system that can detect and classify images into the colon cancer types lymphoma and carcinoma, or normal. The designed system used 198 colon tissue pathology images, consisting of 66 images of lymphoma, 66 images of carcinoma and 66 of normal/healthy colon tissue. The system classifies colon cancer starting from image preprocessing, through feature extraction using Principal Component Analysis (PCA), to classification using the K-Nearest Neighbor (K-NN) method. The preprocessing stages are resizing, RGB-to-grayscale conversion, edge detection and, finally, histogram equalization. Tests were done by trying several K-NN input parameter settings. The result of this project is an image processing system that can detect and classify the type of colon cancer with high accuracy and low computation time.
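The core of such a pipeline maps naturally onto a few library calls; a minimal sketch, assuming the images are already preprocessed into flattened grayscale vectors X with labels y (the component count and k values are illustrative, not the study's settings):

```python
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def build_classifier(n_components=20, k=3):
    """PCA feature extraction followed by K-NN classification."""
    return make_pipeline(PCA(n_components=n_components),
                         KNeighborsClassifier(n_neighbors=k))

# X: flattened preprocessed images, y: labels
# {0: normal, 1: lymphoma, 2: carcinoma}
# Trying several K-NN parameter settings, as in the study:
# for k in (1, 3, 5, 7):
#     print(k, cross_val_score(build_classifier(k=k), X, y, cv=5).mean())
```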

Keywords: carcinoma, colorectal cancer, k-nearest neighbor, lymphoma, principal component analysis

Procedia PDF Downloads 205
3701 Machine Learning Methods for Network Intrusion Detection

Authors: Mouhammad Alkasassbeh, Mohammad Almseidin

Abstract:

Network security engineers work to keep services available all the time by handling intruder attacks. An Intrusion Detection System (IDS) is one of the available mechanisms used to sense and classify any abnormal actions. Therefore, the IDS must always be up to date with the latest intruder attack signatures to preserve the confidentiality, integrity, and availability of the services. The speed of the IDS is a very important issue, as is its ability to learn new attacks. This research work illustrates how the Knowledge Discovery in Databases (KDD) dataset is very handy for testing and evaluating different machine learning techniques. It mainly focuses on the KDD preprocessing part in order to prepare a decent and fair experimental data set. The J48, MLP, and Bayes Network classifiers have been chosen for this study. It has been shown that the J48 classifier achieves the highest accuracy rate for detecting and classifying all KDD dataset attacks, which are of type DOS, R2L, U2R, and PROBE.
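In scikit-learn terms, as stand-ins for the WEKA classifiers named above (J48 is a C4.5-style decision tree; the Bayes Network is approximated here by naive Bayes, which is an assumption), the comparison might be sketched as:

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

classifiers = {
    "J48-like tree": DecisionTreeClassifier(),
    "MLP": MLPClassifier(max_iter=500),
    "NB (Bayes stand-in)": GaussianNB(),
}
# X, y: preprocessed KDD features and attack labels
# (DOS, R2L, U2R, PROBE, normal)
# for name, clf in classifiers.items():
#     print(name, cross_val_score(clf, X, y, cv=5).mean())
```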

Keywords: IDS, DDoS, MLP, KDD

Procedia PDF Downloads 234
3700 Application of Reliability Methods for the Analysis of the Stability Limit States of Large Concrete Dams

Authors: Mustapha Kamel Mihoubi, Essadik Kerkar, Abdelhamid Hebbouche

Abstract:

Given the randomness of most of the factors affecting the stability of a gravity dam, probability theory is generally used to assess the risk of failure. Since there is no sharp transition from the stable state to the failed state, stability failure is treated as a probabilistic event. Controlling the risk of failure is of capital importance; it rests on a cross analysis of the severity of the consequences and the probability of occurrence of identified major accidents, which can pose a significant risk to concrete dam structures. Probabilistic risk analysis models are used to provide a better understanding of the reliability and structural failure of such works, including when calculating the stability of large structures exposed to a major risk in the event of an accident or breakdown. This work studies the probability of failure of concrete dams through the application of reliability analysis methods, including the methods used in engineering. In our case, level II methods are used via a limit state study, and the probability of failure is estimated by analytical methods of the FORM (First Order Reliability Method) and SORM (Second Order Reliability Method) type. By way of comparison, a level III method was also used, which performs a full analysis of the problem by integrating the probability density function of the random variables over the safe domain using Monte Carlo simulations. Taking into account the change in stress under the normal, exceptional and extreme load combinations acting on the dam, the calculated results provided acceptable failure probability values that largely corroborate the theory; in fact, the probability of failure tends to increase with increasing load intensities, causing a significant decrease in strength, especially in the presence of combinations of exceptional and extreme loads. Shear forces then induce sliding that threatens the reliability of the structure through intolerable failure probability values, especially in the case of increased uplift under a hypothetical failure of the drainage system.
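The level III estimate reduces to simple sampling once the limit state function g is defined. Below is a generic sliding limit state with illustrative distributions, not the dam's actual variables:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

# Illustrative random variables for a sliding limit state:
# resistance R (shear strength) and load effect S (driving force).
R = rng.normal(10.0, 1.5, N)
S = rng.normal(6.0, 2.0, N)

g = R - S                      # limit state: failure when g <= 0
pf = np.mean(g <= 0)           # Monte Carlo failure probability
print(f"Pf ~ {pf:.2e}")
```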

Keywords: dam, failure, limit state, monte-carlo, reliability, probability, sliding, Taylor

Procedia PDF Downloads 318
3699 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform

Authors: David Jurado, Carlos Ávila

Abstract:

Detection of early signs of breast cancer development is crucial to quickly diagnose the disease and to define an adequate treatment to increase the patient's survival probability. Computer Aided Detection systems (CADs), along with modern data techniques such as Machine Learning (ML) and Neural Networks (NNs), have shown an overall improvement in digital mammography cancer diagnosis, reducing the false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications, which are highly correlated with breast cancer development in the early stages. Automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design, which extracts statistical features of the regions of interest. The results shown in this study prove the potential of this tool for further diagnostics and classification of mammographic images, due to its low sensitivity to noisy images and low-contrast mammograms.
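A compressed version of the segmentation chain (edge extraction followed by a circle Hough transform), sketched with scikit-image; the radius range, sigma and peak count are illustrative assumptions:

```python
import numpy as np
from skimage import feature, transform

def find_microcalcifications(img, radii=np.arange(3, 15)):
    """Detect small, roughly circular high-contrast objects."""
    edges = feature.canny(img, sigma=2)           # edge extraction
    hough = transform.hough_circle(edges, radii)  # circle Hough transform
    accums, cx, cy, r = transform.hough_circle_peaks(
        hough, radii, total_num_peaks=20)
    return list(zip(cx, cy, r))  # candidate regions of interest for masking
```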

Keywords: breast cancer, segmentation, X-ray imaging, hough transform, image analysis

Procedia PDF Downloads 83
3698 Detection of Cryptosporidium Oocysts by Acid-Fast Staining Method and PCR in Surface Water from Tehran, Iran

Authors: Mohamad Mohsen Homayouni, Niloofar Taghipour, Ahmad Reza Memar, Niloofar Khalaji, Hamed Kiani, Seyyed Javad Seyyed Tabaei

Abstract:

Background and Objective: Cryptosporidium is a coccidian protozoan parasite; its oocysts in surface water are a global health problem. Due to the low number of parasites in water resources and the lack of laboratory culture, a rapid and sensitive method for detection of the organism in water resources is required. We applied modified acid-fast staining and PCR for the detection of Cryptosporidium spp. and analysed the genotypes in 55 samples collected from surface water. Methods: Over a period of nine months, 55 surface water samples were collected from five rivers in Tehran, Iran. The samples were filtered using cellulose acetate membrane filters. Initial identification of Cryptosporidium oocysts in the surface water samples was carried out by the acid-fast method. A nested PCR assay was then designed for specific amplification and genotype analysis. Results: The modified Ziehl-Neelsen method revealed 5–20 Cryptosporidium oocysts per 10 liters. Five of the 55 (9.09%) surface water samples were found positive for Cryptosporidium spp. by the Ziehl-Neelsen test and seven (12.7%) were found positive by nested PCR. The staining results were consistent with the PCR. Seven Cryptosporidium PCR products were successfully sequenced and five gp60 subtypes were detected. Analysis of the gp60 gene revealed that all of the positive isolates were Cryptosporidium parvum and belonged to subtype families IIa and IId. Conclusion: Our investigation showed that the collected water samples were contaminated by Cryptosporidium, a potential hazard and a significant health problem. This study provides the first report on the detection and genotyping of Cryptosporidium species from surface water samples in Iran, and its results confirm the low clinical incidence of this parasite in the community.

Keywords: Cryptosporidium spp., membrane filtration, subtype, surface water, Iran

Procedia PDF Downloads 416
3697 Efficient Passenger Counting in Public Transport Based on Machine Learning

Authors: Chonlakorn Wiboonsiriruk, Ekachai Phaisangittisagul, Chadchai Srisurangkul, Itsuo Kumazawa

Abstract:

Public transportation is a crucial aspect of passenger transportation, with buses playing a vital role in the transportation service. Passenger counting is an essential tool for organizing and managing transportation services. However, manual counting is a tedious and time-consuming task, which is why computer vision algorithms are being utilized to make the process more efficient. In this study, different object detection algorithms combined with passenger tracking are investigated to compare passenger counting performance. The system employs the EfficientDet algorithm, which has demonstrated superior performance in terms of speed and accuracy. Our results show that the proposed system can accurately count passengers in varying conditions with an accuracy of 94%.

Keywords: computer vision, object detection, passenger counting, public transportation

Procedia PDF Downloads 154
3696 The Design of Multiple Detection Parallel Combined Spread Spectrum Communication System

Authors: Lixin Tian, Wei Xue

Abstract:

Much work in society takes place underground, such as mining, tunnel construction and subways, which are vital to the development of society. When accidents occur in these places, the interruption of traditional wired communication is not conducive to rescue work. In order to realize the positioning, early warning and command functions for underground personnel and improve rescue efficiency, it is necessary to develop and design an emergency ground communication system. Conventional underground communication is easily subjected to narrowband interference; spread spectrum communication can be used to address this problem. However, general spread spectrum methods such as direct-sequence spread spectrum are inefficient, so parallel combined spread spectrum (PCSS) communication is proposed to improve efficiency. PCSS communication not only has the anti-interference ability and good concealment of traditional spread spectrum systems, but also a relatively high frequency band utilization rate and a strong information transmission capability, so this technology has been widely used in practice. This paper presents a PCSS communication model: the multiple detection parallel combined spread spectrum (MDPCSS) communication system. The principle of the MDPCSS communication system is described: the sequence at the transmitting end is processed in blocks and cyclically shifted to facilitate multiple detection at the receiving end. The block diagrams of the transmitter and receiver of the MDPCSS communication system are introduced. The calculation formula for the system bit error rate (BER) is presented, and simulation and analysis of the system BER are completed. Comparison with common parallel PCSS communication shows that the proposed system can indeed reduce the BER and improve system performance. Furthermore, the influence of different pseudo-code lengths on the system BER is simulated and analysed, with the conclusion that the larger the pseudo-code length, the smaller the error rate.
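The multiple-detection idea, transmitting a cyclically shifted PN block and letting the receiver correlate against every shift, can be demonstrated in a few lines. This is a noiseless baseband sketch with a random ±1 sequence standing in for a proper PN code; the real system's modulation and BER analysis are in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
L = 63                                  # pseudo-code length
pn = rng.choice([-1, 1], L)             # illustrative PN sequence

tx_shift = 17                           # cyclic shift encodes the data symbol
tx = np.roll(pn, tx_shift)

# Receiver: correlate against all cyclic shifts, pick the maximum.
correlations = [np.dot(tx, np.roll(pn, s)) for s in range(L)]
detected = int(np.argmax(correlations))
print(detected)  # -> 17
```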

Keywords: cyclic shift, multiple detection, parallel combined spread spectrum, PN code

Procedia PDF Downloads 137
3695 Enhancement of Pulsed Eddy Current Response Based on Power Spectral Density after Continuous Wavelet Transform Decomposition

Authors: A. Benyahia, M. Zergoug, M. Amir, M. Fodil

Abstract:

The main objective of this work is to enhance the Pulsed Eddy Current (PEC) response from aluminum structures using signal processing. Cracks and metal loss in different structures cause changes in the PEC response measurements. In this paper, time-frequency analysis is used to represent the PEC response, which generates a large quantity of data, and to reduce the noise due to measurement. Power Spectral Density (PSD) after Wavelet Decomposition (PSD-WD) is proposed for defect detection. The experimental results demonstrate that surface cracks can be extracted satisfactorily by the proposed method. The validity of the proposed method is discussed.
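One way to realize the PSD-WD idea with standard tools, a continuous wavelet transform with a Mexican hat mother wavelet followed by a Welch PSD of a selected scale, is sketched below. The signal, sampling rate and scale choice are illustrative assumptions, not the experimental setup:

```python
import numpy as np
import pywt
from scipy.signal import welch

fs = 100_000                       # assumed sampling rate of the PEC response
t = np.arange(0, 0.01, 1 / fs)
signal = np.exp(-t / 2e-3) + 0.05 * np.random.randn(t.size)  # toy PEC decay

scales = np.arange(1, 64)
coeffs, _ = pywt.cwt(signal, scales, "mexh")   # CWT, Mexican hat mother

# PSD of one decomposition level; defect signatures show up as
# changes in the spectral content of selected scales.
f, psd = welch(coeffs[10], fs=fs, nperseg=256)
```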

Keywords: DT, pulsed eddy current, continuous wavelet transform, Mexican hat wavelet mother, defect detection, power spectral density.

Procedia PDF Downloads 236
3694 Change Point Analysis in Average Ozone Layer Temperature Using Exponential Lomax Distribution

Authors: Amjad Abdullah, Amjad Yahya, Bushra Aljohani, Amani Alghamdi

Abstract:

Change point detection is an important part of data analysis. The presence of a change point refers to a significant change in the behavior of a time series. In this article, we examine the detection of multiple change points in the parameters of the exponential Lomax distribution, which is broad and flexible compared with other distributions when fitting data. We used the Schwarz information criterion and binary segmentation to detect multiple change points in publicly available data on the average temperature in the ozone layer. The change points were successfully located.
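A simplified sketch of binary segmentation driven by the Schwarz (Bayesian) information criterion, here for a change in the mean of a Gaussian series rather than the exponential Lomax parameters used in the paper:

```python
import numpy as np

def sic(x):
    """Schwarz information criterion for a Gaussian fit to segment x."""
    n = len(x)
    return n * np.log(np.var(x) + 1e-12) + 2 * np.log(n)

def binary_segmentation(x, min_len=10):
    """Recursively split wherever a change point lowers the total SIC."""
    n = len(x)
    if n < 2 * min_len:
        return []
    k = min(range(min_len, n - min_len),
            key=lambda s: sic(x[:s]) + sic(x[s:]))
    if sic(x[:k]) + sic(x[k:]) >= sic(x):
        return []                      # no beneficial change point
    left = binary_segmentation(x[:k], min_len)
    right = [k + c for c in binary_segmentation(x[k:], min_len)]
    return left + [k] + right

# Example: a mean shift at index 100 should be recovered
# binary_segmentation(np.r_[np.random.randn(100), 3 + np.random.randn(100)])
```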

Keywords: binary segmentation, change point, exponential Lomax distribution, information criterion

Procedia PDF Downloads 174
3693 Truthful or Untruthful Social Media Posts: Applying Statement Analysis to Decode Online Deception

Authors: Christa L. Arnold, Margaret C. Stewart

Abstract:

This research shares the results of an exploratory study examining Statement Analysis (SA) to detect deception in truthful and untruthful social media posts. Applying SA, a law enforcement methodology used in criminal interview statements, this research analyzes what is stated to assist in evaluating written deceptive information. Preliminary findings reveal qualitative and quantitative nuances for SA in online deception detection and uncover insights regarding digital deceptive behavior. Thus far, findings reveal that truthful statements tend to differ from untruthful statements in both content and quality.

Keywords: deception detection, online deception, social media content, statement analysis

Procedia PDF Downloads 64
3692 The Impact of Reducing Road Traffic Speed in London on Noise Levels: A Comparative Study of Field Measurement and Theoretical Calculation

Authors: Jessica Cecchinelli, Amer Ali

Abstract:

The continuing growth in road traffic and its impact on pollution levels and safety, especially in urban areas, has led local and national authorities to reduce traffic speed and flow in major towns and cities. Various boroughs of London have recently reduced the in-city speed limit from 30mph to 20mph, mainly to calm traffic, improve safety and reduce noise and vibration. This paper reports detailed field measurements, using a noise sensor and analyser, and the corresponding theoretical calculations and analysis of the noise levels on a number of roads in the central London Borough of Camden, where the speed limit was reduced from 30mph to 20mph on all roads except the major routes of Transport for London (TfL). The measurements, which included the key noise levels and scales on residential streets and main roads, were conducted during normal and rush hours on weekdays and weekends. The theoretical calculations were done according to the UK procedure 'Calculation of Road Traffic Noise 1988', with conversion to the European L-day, L-evening, L-night, L-den and other important levels. The current study also includes comparable data and analysis from previously measured noise in the Borough of Camden and other boroughs of central London. Classified traffic flow and speed on the roads concerned were observed and used in the calculation part of the study. Relevant data and a description of the weather conditions are reported. The paper also reports a field survey, in the form of face-to-face interview questionnaires, carried out in parallel with the field measurement of noise in order to ascertain the opinions and views of local residents and workers in the reduced-speed 20mph zones. The main findings are that the reduction in speed reduced the noise pollution in the studied zones and that the measured and calculated noise levels for each speed zone closely match. The field survey showed that local residents and workers supported the scheme and felt that it had improved the quality of life in their areas, giving a sense of calmness and safety, particularly for families with children and the elderly, and encouraging pedestrians and cyclists. The key conclusions are that lowering the speed limit in built-up areas would not just reduce the number of serious accidents but would also reduce noise pollution and promote clean modes of transport, particularly walking and cycling. The details of the site observations and the corresponding calculations, together with critical comparative analysis and relevant conclusions, are reported in the full version of the paper.

Keywords: noise calculation, noise field measurement, road traffic noise, speed limit in london, survey of people satisfaction

Procedia PDF Downloads 424
3691 Off-Policy Q-learning Technique for Intrusion Response in Network Security

Authors: Zheni S. Stefanova, Kandethody M. Ramachandran

Abstract:

With the increasing dependency on our computer devices, we face the necessity of adequate, efficient and effective mechanisms for protecting our network. There are two main problems that Intrusion Detection Systems (IDS) attempt to solve: 1) to detect an attack by analyzing the incoming traffic and inspecting the network (intrusion detection), and 2) to produce a prompt response when the attack occurs (intrusion prevention). It is critical to create an intrusion detection model that detects a breach in the system in time, and it is also challenging to make it provide an automatic response with acceptable delay at every single stage of the monitoring process. We cannot afford to adopt security measures with high computational demands, and we cannot accept a mechanism that reacts with a delay. In this paper, we propose an intrusion response mechanism that is based on artificial intelligence and, more precisely, reinforcement learning techniques (RLT). The RLT helps us to create a decision agent that controls the process of interacting with the undetermined environment. The goal is to find an optimal policy, which represents the intrusion response; the reinforcement learning problem is therefore solved using a Q-learning approach. Our agent produces an optimal immediate response in the process of evaluating the network traffic. This Q-learning approach establishes the balance between exploration and exploitation and provides a unique, self-learning and strategic artificial intelligence response mechanism for IDS.
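The core of such a Q-learning response agent, the tabular update with epsilon-greedy action selection balancing exploration and exploitation, can be sketched as follows (the state/action encodings for traffic patterns and response actions are illustrative):

```python
import numpy as np

n_states, n_actions = 16, 4      # e.g. traffic patterns x response actions
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1
rng = np.random.default_rng(0)

def choose_action(state):
    """Epsilon-greedy: balance exploration and exploitation."""
    if rng.random() < eps:
        return int(rng.integers(n_actions))   # explore
    return int(np.argmax(Q[state]))           # exploit

def update(state, action, reward, next_state):
    """Off-policy Q-learning update toward the greedy target."""
    target = reward + gamma * np.max(Q[next_state])
    Q[state, action] += alpha * (target - Q[state, action])
```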

Keywords: cyber security, intrusion prevention, optimal policy, Q-learning

Procedia PDF Downloads 236
3690 The EAO2 in Essouabaa, Tebessa, Algeria: An Example of a Facies with Organic Matter

Authors: Sihem Salmi Laouar, Khoudair Chabane, Rabah Laouar, Adrian J. Boyce and Anthony E. Fallick

Abstract:

The Essouabaa massif belongs paleogeographically to the Tethyan domain and to the area of the Mellègue Mountains. This area was not spared by the Oceanic Anoxic Event 2 (EAO-2), which occurred, over one short period, around the Cenomanian-Turonian boundary. In the Essouabaa massif, the dominant sediments of this period are generally fine, dark, laminated and sometimes rolled deposits. They contain a rather rich planktonic microfauna, pyrite, and phosphate grains, indicating a rather deep, reducing environment. To precisely locate the Cenomanian-Turonian (C-T) passage in the Essouabaa massif, lithological and biostratigraphic studies were combined with isotopic analyses of carbon and oxygen as well as CaCO3 contents. The results indicate that this passage is marked by a biological event, reflected in the appearance of 'filaments', as well as by a positive excursion of δ13C and δ18O. The Cenomanian-Turonian passage in the Essouabaa massif represents a good example where, during the oceanic anoxic event, a facies rich in organic matter developed, with total organic carbon (COT) contents reaching 1.36%. This massif presents biostratigraphic and isotopic similarities with those obtained both in neighboring areas (e.g., Tunisia and Morocco) and throughout the world.

Keywords: Cenomanian-Turonian boundary (C-T), COT, filaments, anoxic event 2 (EAO-2), stable isotopes, Mellègue Mountains, Algeria

Procedia PDF Downloads 515
3689 Electrospray Deposition Technique of Dye Molecules in the Vacuum

Authors: Nouf Alharbi

Abstract:

The electrospray deposition technique has become an important method that enables fragile, nonvolatile molecules to be deposited in situ in high vacuum environments. Furthermore, it is considered one of the ways to close the gap between basic surface science and molecular engineering, which represents a gradual change in the scope of scientific research. This paper discusses one of the most important techniques developed to help further develop and characterize the electrospray, providing data collected using an image charge detection instrument. Image charge detection mass spectrometry (CDMS) is used to measure the speed and charge distributions of the molecular ions. In addition, some data obtained using SIMION simulation are included, simulating the energies and masses of the molecular ions through the system in order to refine the mass-selection process.

Keywords: charge, deposition, electrospray, image, ions, molecules, SIMION

Procedia PDF Downloads 133
3688 Threshold Sand Detection Limits for Acoustic Monitors in Multiphase Flow

Authors: Vinod Ponnagandla, Brenton McLaury, Siamack Shirazi

Abstract:

Sand production can lead to deposition of particles or to erosion. Low production rates resulting in deposition can partially clog systems and cause under-deposit corrosion. Commercially available nonintrusive acoustic sand detectors are attractive, as they claim to detect sand production. Acoustic sand detectors are used during oil and gas production; however, operators often do not know the threshold detection limits of these devices. It is imperative to know the detection limits to appropriately plan for the cleaning of separation equipment or to examine the risk of erosion. These monitors are based on detecting the acoustic signature of sand as the particles impact the pipe walls. The objective of this work is to determine the threshold detection limits of commercially available acoustic sand monitors. The minimum threshold sand concentration that can be detected in a pipe is determined as a function of flowing gas and liquid velocities. A large-scale flow loop with a 4-inch test section is utilized. Commercially available sand monitors (ClampOn and Roxar) are evaluated for different flow regimes, sand sizes and pipe orientations (vertical and horizontal). The manufacturers recommend that the monitors be placed on a bend to maximize the number of particle impacts, so results are shown for monitors placed at the 45 and 90 degree positions of a bend. Acoustic sand monitors that clamp to the outside of the pipe are passive and listen for solid particle impact noise. The threshold sand rate is calculated by eliminating the background noise created by the flow of gas and liquid in the pipe for the various flow regimes generated in the horizontal and vertical test sections. The average sand sizes examined are 150 and 300 microns. For stratified and bubbly flows, the threshold sand rates are much higher than for the other flow regimes investigated, such as slug and annular flow. However, the background noise generated by the slug flow regime is very high and causes high uncertainty in the detection limits. The threshold sand rates for annular flow and dry gas conditions are the lowest because of the high gas velocities. The effects of monitor placement around elbows in vertical and horizontal pipes are also examined for the 150 micron sand. The results show that the threshold sand rates detected in the vertical orientation are generally lower for all flow regimes investigated.

Keywords: acoustic monitor, sand, multiphase flow, threshold

Procedia PDF Downloads 407
3687 Electronic Equipment Failure due to Corrosion

Authors: Yousaf Tariq

Abstract:

There are many factors involved in electronic equipment failure, e.g., temperature, humidity, dust and smoke. Corrosive gases are also among the factors that may be involved in equipment failure. The sensitivity of electronic equipment increased when 'lead-free' regulations were enforced on manufacturers. In data centers, equipment such as hard disks, servers and printed circuit boards has been exposed to gaseous contamination due to this increase in sensitivity. There is a worldwide standard to protect industrial electronic equipment from corrosive gases, well known as 'ANSI/ISA S71.04 – 1985 - Environmental Conditions for Control Systems: Airborne Contaminants'. ASHRAE Technical Committee (TC) 9.9 members also recommended the ISA standard in their whitepaper on gaseous and particulate contamination guidelines for data centers. TC 9.9 members represent some of the major IT equipment manufacturers, e.g., IBM, HP and Cisco. As per standard practice, the first step is to monitor the air quality in the data center. If the contamination level exceeds severity level G1, gas-phase air filtration is required in addition to dust/smoke air filtration. It is important that outside fresh air entering the data center go through a pressurization/recirculation process in order to absorb corrosive gases and maintain the level within the specified limit. It is also important that air quality monitoring be conducted once a year. Temperature and humidity should also be monitored as per standard practice to maintain levels within the specified limits.

Keywords: corrosive gases, corrosion, electronic equipment failure, ASHRAE, hard disk

Procedia PDF Downloads 329
3686 Artificially Intelligent Context Aware Personal Computer Assistant (ACPCA)

Authors: Abdul Mannan Akhtar

Abstract:

In this paper, a novel concept of a self-learning, smart, personalized computer assistant (ACPCA) is established, which is a context-aware system. Based on user habits, moods, and other routine/situational reactions, the system will manage various services and suggestions at appropriate times, including what schedule to follow, what to watch, what software to use, what should be deleted, etc. The system will utilize a hybrid fuzzy-neural model to predict what the user will do next and support his actions. This will be done by establishing fuzzy sets of user activities, choices, preferences, etc., and utilizing their combinations to predict his moods and immediate preferences. Various applications of context-aware systems exist separately, e.g., on certain websites for music or multimedia suggestions, but a personalized autonomous system that can adapt to the user's personality does not exist at present. Due to the novelty and breadth of this concept, this paper primarily focuses on the problem establishment, product features and functionality; however, a small mini case is also implemented in MATLAB to demonstrate some aspects of ACPCA. The mini case involves the prediction of user mood, activity, routine and food preference using a hybrid fuzzy-neural soft computing technique.

Keywords: context aware systems, ACPCA, soft computing techniques, artificial intelligence, fuzzy logic, neural network, mood detection, face detection, activity detection

Procedia PDF Downloads 464