Search results for: differential detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4936

3796 Assay for SARS-CoV-2 on Chicken Meat

Authors: R. Mehta, M. Ghogomu, B. Schoel

Abstract:

Reports appeared in 2020 about China detecting SARS-CoV-2 (COVID-19) on frozen meat, shrimp, and food packaging material. In this study, we examined the use of swabs for the detection of COVID-19 on meat samples, with chicken breast (CB) used as a model. Methods: Heat-inactivated SARS-CoV-2 virus (IV) from Microbiologics was loaded onto the CB, the surface was swabbed, and the recovered inactivated virus was subjected to the Macherey-Nagel NucleoSpin RNA Virus kit for RNA isolation according to the manufacturer's instructions. For RT-PCR, the IDT 2019-nCoV RUO COVID-19 test kit was used with the TaqMan Fast Virus 1-Step master mix. The limit of detection (LOD) of viral load recovered from the CB was determined under various conditions: first on frozen CB where the IV was introduced on a defined area, then on frozen CB with the IV spread out, and finally on thawed CB. Results: The lowest amount of IV that could be reliably detected on frozen CB was a load of 1,000 - 2,000 IV copies when the IV was loaded on one spot of about 1 square inch. Next, the IV was spread out over a whole frozen CB of about 16 square inches; the IV could then be recovered at a lowest load of 4,000 to 8,000 copies. Furthermore, the effects of temperature change on viral load recovery were investigated, i.e., whether raw unfrozen meat that became contaminated and remained for 1 hour at 4°C, or was refrozen, still allowed recovery. The amount of IV recovered from CB kept at 4°C and from refrozen CB was similar to the recovery obtained by loading the IV directly onto frozen CB. In conclusion, a swab-based assay was successfully established for the detection of SARS-CoV-2 on frozen or raw (unfrozen) CB with a minimal detectable load of up to 8,000 copies spread over 16 square inches.

Keywords: assay, COVID-19, meat, SARS-CoV-2

Procedia PDF Downloads 200
3795 Clinical Efficacy of Indigenous Software for Automatic Detection of Stages of Retinopathy of Prematurity (ROP)

Authors: Joshi Manisha, Shivaram, Anand Vinekar, Tanya Susan Mathews, Yeshaswini Nagaraj

Abstract:

Retinopathy of prematurity (ROP) is abnormal blood vessel development in the retina of the eye of a premature infant. The principal object of the invention is to provide a technique for detecting the demarcation line and ridge in a given ROP image, facilitating early detection of ROP in Stage 1 and Stage 2. The demarcation line is an indicator of Stage 1 ROP, and the ridge is the hallmark of typical Stage 2 ROP. Thirty RetCam images of Asian Indian infants obtained during routine ROP screening were used for the analysis. A graphical user interface was developed to detect the demarcation line/ridge and to extract ground truth. The novel algorithm uses multilevel vessel enhancement to enhance tubular structures in the digital ROP images. It has been observed that the orientation of the demarcation line/ridge is normal to the direction of the blood vessels, which is used for the identification of the ridge/demarcation line. Quantitative analysis is presented based on gold-standard images marked by an expert ophthalmologist. Image-based evaluation considers the length and position of the detected ridge; in that evaluation, the average sensitivity and positive predictive value were found to be 92.30% and 85.71%, respectively. In pixel-based evaluation, the average sensitivity, specificity, positive predictive value and negative predictive value achieved were 60.38%, 99.66%, 52.77% and 99.75%, respectively.
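
Since the described software is not public, the following is a minimal Python sketch of the multilevel (multiscale) vessel-enhancement step, with scikit-image's Frangi filter standing in for the paper's tubular-structure enhancement; the synthetic image and thresholds are assumptions.

```python
import numpy as np
from skimage.filters import frangi

# Synthetic stand-in for a RetCam fundus frame: one dark curved "vessel".
yy, xx = np.mgrid[0:128, 0:128]
image = np.ones((128, 128))
image[np.abs(yy - (40 + 0.02 * (xx - 64) ** 2)) < 2] = 0.3

# Multilevel enhancement: vesselness accumulated over several scales.
vesselness = frangi(image, sigmas=range(1, 8, 2))

# The demarcation line/ridge is reported to run normal to the local vessel
# direction; a gradient-based orientation field is one way to encode that.
gy, gx = np.gradient(vesselness)
orientation = np.arctan2(gy, gx)
candidates = vesselness > vesselness.mean() + 2 * vesselness.std()
print(f"{candidates.sum()} candidate ridge/demarcation pixels")
```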

Keywords: ROP, ridge, multilevel vessel enhancement, biomedical

Procedia PDF Downloads 395
3794 Multiobjective Optimization of a Pharmaceutical Formulation Using Regression Method

Authors: J. Satya Eswari, Ch. Venkateswarlu

Abstract:

The formulation of a commercial pharmaceutical product involves several composition factors and response characteristics. When the formulation must satisfy multiple conflicting response characteristics, an optimal solution calls for an efficient multiobjective optimization technique. In this work, a regression model is combined with a non-dominated sorting differential evolution (NSDE) involving Naïve & Slow and ε-constraint techniques to derive different multiobjective optimization strategies, which are then evaluated by means of a trapidil pharmaceutical formulation. The analysis of the results shows the effectiveness of the strategy that combines the regression model and NSDE with the integration of both the Naïve & Slow and ε-constraint techniques for Pareto optimization of the trapidil formulation. With this strategy, the optimal formulation at pH = 6.8 is obtained with the decision variables of microcrystalline cellulose, hydroxypropyl methylcellulose and compression pressure. The corresponding response characteristics of rate constant and release order are also reported. The comparison of these results with the experimental data, and with those of other multiple-regression-model-based multiobjective evolutionary optimization strategies, signifies the better performance of this strategy for optimal trapidil formulation.
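
As an illustration of the NSDE core described above, here is a minimal Python sketch of differential-evolution variation combined with Pareto-dominance selection and a non-dominated filter; the objective functions are hypothetical stand-ins for the trapidil regression models, and the Naïve & Slow and ε-constraint components are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

def evaluate(x):
    # Hypothetical conflicting objectives (e.g., rate constant vs. release order).
    return np.array([np.sum(x ** 2), np.sum((x - 1.0) ** 2)])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def non_dominated(pop, objs):
    keep = [i for i, oi in enumerate(objs)
            if not any(dominates(oj, oi) for j, oj in enumerate(objs) if j != i)]
    return [pop[i] for i in keep]

dim, n_pop, F, CR = 3, 30, 0.7, 0.9        # 3 decision variables (e.g. MCC,
pop = rng.uniform(0, 1, (n_pop, dim))      # HPMC, compression pressure,
objs = [evaluate(x) for x in pop]          # scaled to [0, 1])

for gen in range(100):
    for i in range(n_pop):
        a, b, c = pop[rng.choice(n_pop, 3, replace=False)]
        mutant = np.clip(a + F * (b - c), 0, 1)        # DE/rand/1 mutation
        cross = rng.random(dim) < CR
        trial = np.where(cross, mutant, pop[i])        # binomial crossover
        t_obj = evaluate(trial)
        if dominates(t_obj, objs[i]):                  # Pareto-based selection
            pop[i], objs[i] = trial, t_obj

print(f"Pareto front size: {len(non_dominated(list(pop), objs))}")
```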

Keywords: pharmaceutical formulation, multiple regression model, response surface method, radial basis function network, differential evolution, multiobjective optimization

Procedia PDF Downloads 401
3793 Analysis of an IncResU-Net Model for R-Peak Detection in ECG Signals

Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar

Abstract:

Cardiovascular diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias, or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology: a non-invasive, pain-free procedure that measures the heart’s electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations in the ECG waveform, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged, which further complicates visual diagnosis and can greatly delay disease detection. In this context, deep learning methods have risen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large sets of data and can provide early and precise diagnoses. Therefore, cardiology is one of the fields that can most benefit from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is to detect R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model’s ability to generalize. Performance of the model for R-peak detection in both clean and noisy ECGs is presented. The model is able to detect R-peaks in the presence of various types of noise and when presented with data it has not been trained on. It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences in the normal cardiac activity of their patients.
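
R-peak detectors are commonly scored by matching predictions to expert annotations within a small tolerance window; below is a minimal sketch of that bookkeeping, with the sample indices and the 75 ms tolerance chosen for illustration (not taken from the paper).

```python
import numpy as np

def score_rpeaks(predicted, reference, fs=360, tol_s=0.075):
    """Match each reference R-peak to at most one prediction within tol_s."""
    tol = int(tol_s * fs)
    used = set()
    tp = 0
    for r in reference:
        hits = [p for p in predicted if abs(p - r) <= tol and p not in used]
        if hits:
            used.add(min(hits, key=lambda p: abs(p - r)))
            tp += 1
    fn = len(reference) - tp          # annotated beats that were missed
    fp = len(predicted) - tp          # spurious detections
    return tp / (tp + fn), tp / (tp + fp)   # sensitivity, PPV

ref = [100, 460, 820, 1180]           # annotated R-peak sample indices
pred = [102, 458, 1185, 1400]         # model output: one miss, one extra
print(score_rpeaks(pred, ref))        # -> (0.75, 0.75)
```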

Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks

Procedia PDF Downloads 175
3792 EQMamba - Method Suggestion for Earthquake Detection and Phase Picking

Authors: Noga Bregman

Abstract:

Accurate and efficient earthquake detection and phase picking are crucial for seismic hazard assessment and emergency response. This study introduces EQMamba, a deep-learning method that combines the strengths of the Earthquake Transformer and the Mamba model for simultaneous earthquake detection and phase picking. EQMamba leverages the computational efficiency of Mamba layers to process longer seismic sequences while maintaining a manageable model size. The proposed architecture integrates convolutional neural networks (CNNs), bidirectional long short-term memory (BiLSTM) networks, and Mamba blocks. The model employs an encoder composed of convolutional layers and max pooling operations, followed by residual CNN blocks for feature extraction. Mamba blocks are applied to the outputs of BiLSTM blocks, efficiently capturing long-range dependencies in seismic data. Separate decoders are used for earthquake detection, P-wave picking, and S-wave picking. We trained and evaluated EQMamba using a subset of the STEAD dataset, a comprehensive collection of labeled seismic waveforms. The model was trained using a weighted combination of binary cross-entropy loss functions for each task, with the Adam optimizer and a scheduled learning rate. Data augmentation techniques were employed to enhance the model's robustness. Performance comparisons were conducted between EQMamba and the EQTransformer over 20 epochs on this modest-sized STEAD subset. Results demonstrate that EQMamba achieves superior performance, with higher F1 scores and faster convergence compared to EQTransformer. EQMamba reached F1 scores of 0.8 by epoch 5 and maintained higher scores throughout training. The model also exhibited more stable validation performance, indicating good generalization capabilities. While both models showed lower accuracy in phase-picking tasks compared to detection, EQMamba's overall performance suggests significant potential for improving seismic data analysis. The rapid convergence and superior F1 scores of EQMamba, even on a modest-sized dataset, indicate promising scalability for larger datasets. This study contributes to the field of earthquake engineering by presenting a computationally efficient and accurate method for simultaneous earthquake detection and phase picking. Future work will focus on incorporating Mamba layers into the P and S pickers and further optimizing the architecture for seismic data specifics. The EQMamba method holds the potential for enhancing real-time earthquake monitoring systems and improving our understanding of seismic events.
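
As a sketch of the training objective named above (a weighted combination of binary cross-entropy losses for the detection, P-pick and S-pick decoders, with Adam and a scheduled learning rate), here is a minimal PyTorch rendering; the task weights, the placeholder network and the scheduler choice are assumptions, not the paper's exact settings.

```python
import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()
weights = {"det": 0.4, "p": 0.3, "s": 0.3}     # assumed task weights

def multitask_loss(outputs, targets):
    """Weighted sum of per-task binary cross-entropy losses."""
    return sum(weights[k] * bce(outputs[k], targets[k]) for k in weights)

model = nn.Linear(64, 3)   # placeholder for the CNN/BiLSTM/Mamba network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, patience=3)

# One schematic training step on dummy features:
x = torch.randn(8, 64)
logits = model(x)
outputs = {"det": logits[:, 0], "p": logits[:, 1], "s": logits[:, 2]}
targets = {k: torch.randint(0, 2, (8,)).float() for k in outputs}

optimizer.zero_grad()
loss = multitask_loss(outputs, targets)
loss.backward()
optimizer.step()
scheduler.step(loss.item())
print(f"weighted multitask loss: {loss.item():.3f}")
```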

Keywords: earthquake, detection, phase picking, S waves, P waves, transformer, deep learning, seismic waves

Procedia PDF Downloads 28
3791 New Two-Dimensional Hardy-Type Inequalities on Time Scales via Steklov Operator

Authors: Wedad Albalawi

Abstract:

Mathematical inequalities are at the core of mathematical analysis and are used in almost all branches of mathematics, as well as in various areas of science and engineering. The inequalities of Hardy, Littlewood and Pólya were the first significant synthesis of the subject; that work presented fundamental ideas, results and techniques and has had much influence on research in various branches of analysis. Since 1934, a wide variety of inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated in terms of operators: in 1989, weighted Hardy inequalities were obtained for integration operators, and weighted estimates were later derived for Steklov operators, which are used in solving the Cauchy problem for the wave equation. These were improved upon in 2011 to include the boundedness of integral operators from a weighted Sobolev space to a weighted Lebesgue space. Several inequalities have been demonstrated and improved using the Hardy-Steklov operator, and recently many integral inequalities have been improved via differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations, and dynamic inequalities of Hardy and Copson type have since been extended and improved by various integral operators. These inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Versions involving Copson and Hardy inequalities on time scales have appeared, yielding new special cases. A time scale is an arbitrary nonempty closed subset of the real numbers; dynamic inequalities on time scales have received a lot of attention in the literature and have become a major field in pure and applied mathematics, with many applications of dynamic equations on time scales in quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on Hardy and Copson inequalities, using the Steklov operator on time scales in double integrals to obtain special higher-dimensional time-scale cases of the Hardy and Copson inequalities. The advantage of this study is that it uses the one-dimensional classical Hardy inequality to obtain higher-dimensional time-scale versions that can be applied to the solution of the Cauchy problem for the wave equation. In addition, the obtained inequalities have various applications involving discontinuous domains, such as bug populations, phytoremediation of metals, wound healing, and maximization problems. The proofs proceed by introducing restrictions on the operator in several cases and use the concepts of time-scale calculus, which unifies and extends many problems from the theories of differential and difference equations, together with the chain rule, properties of multiple integrals on time scales, Fubini's theorem and Hölder's inequality.
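
For reference, the classical one-dimensional Hardy inequality on which the higher-dimensional time-scale versions are built reads:

```latex
% Classical one-dimensional Hardy inequality (p > 1, f nonnegative):
\[
\int_{0}^{\infty} \left( \frac{1}{x}\int_{0}^{x} f(t)\,dt \right)^{p} dx
\;\le\; \left( \frac{p}{p-1} \right)^{p} \int_{0}^{\infty} f^{p}(x)\,dx,
\]
% where the constant (p/(p-1))^p is known to be sharp.
```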

Keywords: time scales, Hardy inequality, Copson inequality, Steklov operator

Procedia PDF Downloads 87
3790 A Simple Approach to Reliability Assessment of Structures via Anomaly Detection

Authors: Rims Janeliukstis, Deniss Mironovs, Andrejs Kovalovs

Abstract:

Operational modal analysis (OMA) is widely applied as a method for structural health monitoring, identifying and assessing structural damage by tracking changes in the identified modal parameters over time. Unfortunately, modal parameters also depend on external factors such as temperature and loads. Any structural condition assessment using modal parameters should take those external factors into consideration; otherwise there is a high chance of false positives. A method of structural reliability assessment based on an anomaly detection technique called the Mahalanobis squared distance (MSD) is proposed. It requires a set of reference conditions to learn the healthy state of a structure, to which all future parameters are compared. In this study, structural modal parameters (natural frequency and mode shape), as well as ambient temperature and loads acting on the structure, are used as features. Numerical tests were performed on a finite element model of a carbon-fibre-reinforced polymer composite beam with delamination damage at various locations and of various severities. The advantages of the demonstrated approach include relatively few computational steps, the ability to distinguish between healthy and damaged conditions, and discrimination between different damage severities. It is anticipated to be promising for the reliability assessment of mass-produced structural parts.
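
A minimal sketch of the MSD check described above, assuming roughly Gaussian healthy-state features so that a chi-square quantile can serve as the alarm threshold (the feature set and threshold level are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
healthy = rng.normal(size=(200, 4))     # reference-condition feature vectors
                                        # (frequencies, mode-shape coords,
                                        # temperature, load)
mu = healthy.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(healthy, rowvar=False))

def msd(x):
    """Squared Mahalanobis distance to the healthy reference cloud."""
    d = x - mu
    return d @ cov_inv @ d

threshold = stats.chi2.ppf(0.99, df=healthy.shape[1])  # ~1% false alarms
test = mu + np.array([4.0, 0.0, 0.0, 0.0])             # shifted observation
print(msd(test) > threshold)                           # -> True (anomalous)
```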

Keywords: operational modal analysis, reliability assessment, anomaly detection, damage, Mahalanobis squared distance

Procedia PDF Downloads 106
3789 Self-Directed Car on GT Road: Grand Trunk Road

Authors: Rameez Ahmad, Aqib Mehmood, Imran Khan

Abstract:

A self-directed car (SDC) is a car that can drive itself from one point to another without support from a driver. Many believe that self-directed cars have the potential to transform the transportation industry while essentially eliminating accidents and cleaning up the environment. This study examines the effects that SDC (also called self-driving, driverless or robotic) vehicles are likely to have on travel demand and ride schemes, along with the associated vehicle technologies and cost benefits. The system combines an audio/vision-based hardware and software architecture, GOLD (Generic Obstacle and Lane Detection), with a knowledge-based system that considers shape, color and balance to detect obstacles and lane position in a structured environment with painted lane markings. The study explores the practical consequences of deploying SDCs on the GT (Grand Trunk) Road and how to make the car more effective.

Keywords: SDC, GOLD, GT Road, knowledge-based system

Procedia PDF Downloads 362
3788 Combining Shallow and Deep Unsupervised Machine Learning Techniques to Detect Bad Actors in Complex Datasets

Authors: Jun Ming Moey, Zhiyaun Chen, David Nicholson

Abstract:

Bad actors are often hard to detect in data that imprints their behaviour patterns because they are comparatively rare events embedded in non-bad-actor data. An unsupervised machine learning framework is applied here to detect bad actors in financial crime datasets that record millions of transactions undertaken by hundreds of actors (<0.01% bad). Specifically, the framework combines ‘shallow’ (PCA, Isolation Forest) and ‘deep’ (autoencoder) methods to detect outlier patterns. Detection performance analysis for both the individual methods and their combination is reported.
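
A minimal scikit-learn sketch of the shallow/deep combination, with a small MLP trained to reconstruct its input standing in for the autoencoder, and rank-averaging as one plausible score fusion (the authors' exact fusion rule is not specified here):

```python
import numpy as np
from scipy.stats import rankdata
from sklearn.decomposition import PCA
from sklearn.ensemble import IsolationForest
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 10))
X[:5] += 6.0                                   # a few "bad actor" rows

# Shallow method 1: PCA reconstruction error.
pca = PCA(n_components=3).fit(X)
pca_err = ((X - pca.inverse_transform(pca.transform(X))) ** 2).sum(axis=1)

# Shallow method 2: Isolation Forest (higher score = more anomalous).
iso = IsolationForest(random_state=0).fit(X)
iso_score = -iso.score_samples(X)

# Deep stand-in: a bottlenecked MLP trained to reconstruct its input.
ae = MLPRegressor(hidden_layer_sizes=(8, 3, 8), max_iter=500,
                  random_state=0).fit(X, X)
ae_err = ((X - ae.predict(X)) ** 2).sum(axis=1)

combined = (rankdata(pca_err) + rankdata(iso_score) + rankdata(ae_err)) / 3
print("top-5 suspects:", np.argsort(combined)[-5:])
```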

Keywords: detection, machine learning, deep learning, unsupervised, outlier analysis, data science, fraud, financial crime

Procedia PDF Downloads 90
3787 The Importance of Teachers' Self-Efficacy in the Field of Education of Socially Disadvantaged Students

Authors: Anna Petr Safrankova, Karla Hrbackova

Abstract:

The education of socially disadvantaged students has long been in the spotlight of pedagogical research in both the Czech and international environments. Among other things, this research investigates the topic from the point of view of individual compensatory measures that attempt to overcome or remove the social disadvantage. The focus of this study is to highlight the important role of teachers in the education of this specific group of students, among other things in terms of teachers' pre-graduate training. The aim of the study is to point out the importance of teachers' self-efficacy. The study is based on the assumption that a teacher's self-efficacy may significantly affect the teacher's perception of a particular group of students and thereby affect the education of those students. The survey involved 245 teachers from two regions of the Czech Republic. The research used the TES questionnaire (with the dimensions personal teaching efficacy, PTE, and general teaching efficacy, GTE) by Gibson and Dembo, and a semantic differential (containing 12 scales with bipolar adjectives) that investigated the components of teachers' attitudes toward socially disadvantaged students. It was found that teachers' self-efficacy significantly affects their perception of the group of socially disadvantaged students. Based on this finding, we believe that it is necessary to work with this concept (preparing teachers to educate this specific group of students) already during higher education and especially during pre-graduate teacher training.

Keywords: teachers, socially disadvantaged students, semantic differential, teachers' self-efficacy

Procedia PDF Downloads 418
3786 Empirical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-intrusive load monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, event detection and feature extraction, with general appliance modeling and identification at the final stage. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, and appliance features are required for accurate identification of household devices. In this research work, we aim at developing a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps, and we use unsupervised algorithms such as dynamic time warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on its power demand, and then detecting the times at which each selected appliance changes state. In order to fit the capabilities of existing smart meters, we work on low-frequency data sampled at 1/60 Hz (one reading per minute). The data are simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature; LPG is a numerical tool that simulates the behaviour of the people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling. In addition, the identification process includes unsupervised techniques such as DTW; to the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison with the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector of the values delimiting the appliance's state transitions. Appliance signatures are then formed from the extracted power, geometrical and statistical features, and those signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (variance sliding window and cumulative sum).
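
A minimal sketch of the dynamic time warping step used in the unsupervised identification stage: aligning an extracted appliance power signature against a general appliance model despite time distortions. The 1/60 Hz signatures below are illustrative.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical signatures in watts, one sample per minute (1/60 Hz):
fridge_model = np.array([0, 120, 118, 121, 119, 0], dtype=float)
extracted    = np.array([0, 0, 119, 121, 120, 118, 0], dtype=float)
print(f"DTW distance: {dtw_distance(extracted, fridge_model):.1f}")
```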

Keywords: general appliance model, non-intrusive load monitoring, events detection, unsupervised techniques

Procedia PDF Downloads 75
3785 Stress Analysis of Tubular Bonded Joints under Torsion and Hygrothermal Effects Using DQM

Authors: Mansour Mohieddin Ghomshei, Reza Shahi

Abstract:

Laminated composite tubes with adhesively bonded joints are widely used in the aerospace and automotive industries as well as in the oil and gas industries. In this research, adhesively bonded tubular single-lap joints subjected to torsional and hygrothermal loadings are studied using the differential quadrature method (DQM). The analysis is based on classical shell theory. First, an approximate closed-form solution is developed by omitting the lateral deflections in the connecting tubes. Using this analytical model, the circumferential displacements in the tubes and the shear stresses in the interfacing adhesive layer are determined. Then, a numerical formulation is presented using DQM in which the lateral deflections are taken into account. Using the DQM formulation, the circumferential and radial displacements in the tubes, as well as the shear and peel stresses in the adhesive layer, are calculated. Results obtained from the proposed DQM solutions compare well with those of the approximate analytical model and with those of some published references. Finally, using the DQM model, parametric studies are carried out to investigate the influence of various parameters such as adhesive layer thickness, torsional loading, overlap length, tube radii, relative humidity, and temperature.
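
A minimal numpy sketch of the DQM machinery underlying such a formulation: first-order weighting coefficients from the standard Lagrange-polynomial formula on a Chebyshev-Gauss-Lobatto grid, checked on a smooth test function (the shell equations themselves are beyond a sketch).

```python
import numpy as np

def dq_weights(x):
    """First-derivative DQ matrix A, with A @ f(x) approximating f'(x)."""
    n = len(x)
    P = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = P[i] / ((x[i] - x[j]) * P[j])
        A[i, i] = -A[i].sum()            # each row of A sums to zero
    return A

n = 15
x = 0.5 * (1 - np.cos(np.pi * np.arange(n) / (n - 1)))   # CGL grid on [0, 1]
A = dq_weights(x)
err = np.max(np.abs(A @ np.sin(x) - np.cos(x)))          # d/dx sin = cos
print(f"max derivative error: {err:.2e}")                # spectrally small
```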

Keywords: adhesively bonded joint, differential quadrature method (DQM), hygrothermal, laminated composite tube

Procedia PDF Downloads 296
3784 Residual Evaluation by Thresholding and Neuro-Fuzzy System: Application to Actuator

Authors: Y. Kourd, D. Lefebvre, N. Guersi

Abstract:

The monitoring of industrial processes is required to ensure the operating conditions of industrial systems through automatic detection and isolation of faults. In this paper we propose a fault diagnosis method based on a neuro-fuzzy technique and the choice of a threshold, validated on the DAMADICS electro-pneumatic actuator benchmark test bed. In the first phase of the method, we construct a model that represents the normal state of the system for fault detection; residuals are generated and analysed, and thresholds are chosen to build a signature table. These signatures leave us with groups of faults that are not detectable in this phase. In the second phase, we build faulty models to capture the faults in the system that were not located in the first phase.
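
A minimal sketch of the first-phase residual evaluation: thresholds learned from fault-free residuals produce a boolean signature for each observation. The three-sigma rule used here is an assumption, not the paper's exact threshold choice.

```python
import numpy as np

rng = np.random.default_rng(3)
residuals_healthy = rng.normal(0.0, 0.05, size=(500, 3))   # fault-free runs
thresholds = 3 * residuals_healthy.std(axis=0)             # per-residual

def signature(residual):
    """Boolean fault signature: which residuals exceed their thresholds."""
    return np.abs(residual) > thresholds

print(signature(np.array([0.01, 0.30, 0.02])))   # -> [False  True False]
```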

Keywords: residuals analysis, threshold, neuro-fuzzy system, residual evaluation

Procedia PDF Downloads 442
3783 Refractometric Optical Sensing by Using a Photonic Mach-Zehnder Interferometer

Authors: Gong Zhang, Hong Cai, Bin Dong, Jifang Tao, Aiqun Liu, Dim-Lee Kwong, Yuandong Gu

Abstract:

An on-chip refractive index sensor with high sensitivity and a large measurement range is demonstrated in this paper. The sensing structure is based on a Mach-Zehnder interferometer configuration built on an SOI substrate. The wavelength sensitivity of the sensor is estimated to be 3129 nm/RIU. Meanwhile, from the changes in the interference-pattern period, the measured period sensitivities are 2.9 nm/RIU (TE mode) and 4.21 nm/RIU (TM mode), respectively. As such, the wavelength shift and the period shift can be used for fine index-change detection and larger index-change detection, respectively. The sensor design therefore provides an approach for measuring large index changes with high sensitivity.
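
A quick back-of-the-envelope consequence of the reported wavelength sensitivity (the 0.01 nm resolvable shift is an assumption about the read-out, not a figure from the paper):

```python
sensitivity = 3129.0   # nm per refractive-index unit, as reported
shift = 0.01           # nm, assumed resolvable wavelength shift
print(f"detectable index change: {shift / sensitivity:.1e} RIU")  # ~3.2e-6
```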

Keywords: Mach-Zehnder interferometer, nanotechnology, refractive index sensing, sensors

Procedia PDF Downloads 441
3782 Application of Remote Sensing and GIS in Assessing Land Cover Changes within Granite Quarries around Brits Area, South Africa

Authors: Refilwe Moeletsi

Abstract:

Dimension stone quarrying around the Brits and Belfast areas started in the early 1930s and has been growing rapidly since then. Environmental impacts associated with these quarries have not been documented, and hence this study aims at detecting any change in the environment that might have been caused by these activities. Landsat images were used to assess land use/land cover changes around the Brits quarries from 1998 to 2015. A supervised classification using the maximum likelihood classifier was applied to classify each image into different land use/land cover types. Classification accuracy was assessed using Google Earth™ as a source of reference data. A post-classification change detection method was used to determine changes. The results revealed a significant increase in granite quarries and a corresponding decrease in vegetation cover within the study region.
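
A minimal sketch of post-classification change detection: the two classified maps are compared pixel-wise and cross-tabulated into a from-to transition matrix. The maps and class codes below are synthetic (0 = vegetation, 1 = quarry, 2 = other).

```python
import numpy as np

rng = np.random.default_rng(4)
lc_1998 = rng.integers(0, 3, size=(100, 100))
lc_2015 = lc_1998.copy()
lc_2015[20:40, 20:40] = 1                       # vegetation lost to quarry

changed = lc_1998 != lc_2015
transition = np.zeros((3, 3), dtype=int)        # from-to change matrix
np.add.at(transition, (lc_1998.ravel(), lc_2015.ravel()), 1)

print(f"changed pixels: {changed.mean():.1%}")
print("vegetation -> quarry:", transition[0, 1])
```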

Keywords: remote sensing, GIS, change detection, granite quarries

Procedia PDF Downloads 306
3781 Scour Damage Detection of Bridge Piers Using Vibration Analysis - Numerical Study of a Bridge

Authors: Solaine Hachem, Frédéric Bourquin, Dominique Siegert

Abstract:

The brutal collapse of bridges is mainly due to scour. Indeed, soil erosion in the riverbed around a pier modifies the embedding conditions of the structure, reduces its overall stiffness and threatens its stability. Hence, finding an efficient technique that allows early scour detection becomes mandatory. Vibration analysis is an indirect method for scour detection that relies on real-time monitoring of the bridge. It tends to indicate the presence of scour based on its consequences for the stability of the structure and its dynamic response. Most research in this field has focused on the dynamic behavior of a single pile and has examined the depth of the scour. In this paper, a bridge is fully modeled with all piles and spans, and the scour is represented by a reduction in the foundation stiffnesses. This work aims to identify the vibration modes sensitive to the loss of rigidity in the foundations so that their variations can be used as a scour indicator: a decrease in soil-structure interaction rigidity leads to a decrease in the natural frequencies. Using the first-order perturbation method, an expression of the sensitivity, which depends only on the selected vibration modes, is established to determine the deficiency in foundation stiffnesses. The solutions are obtained by using the singular value decomposition method for the regularization of the inverse problem. The propagation of uncertainties is also calculated to verify the efficiency of the inverse-problem method. Numerical simulations describing different scour scenarios are investigated on a simplified model of a real composite steel-concrete bridge located in France. The results of the modal analysis show that the modes corresponding to in-plane and out-of-plane pier vibrations are sensitive to the loss of foundation stiffness, while the deck bending modes are not affected by this damage.
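
A minimal sketch of the inverse step: a first-order sensitivity model links foundation-stiffness changes to natural-frequency changes, and a truncated-SVD pseudo-inverse regularizes the estimate. The sensitivity matrix below is synthetic, not the bridge model's, and the truncation threshold is an assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
S = rng.normal(size=(8, 4))              # 8 tracked modes, 4 pier foundations
dk_true = np.array([-0.3, 0.0, 0.0, 0.0])      # 30% stiffness loss at pier 1
df = S @ dk_true + rng.normal(0, 0.01, 8)      # noisy frequency changes

# Truncated-SVD pseudo-inverse: discard small singular values to
# regularize the ill-posed inversion.
U, s, Vt = np.linalg.svd(S, full_matrices=False)
r = int(np.sum(s > 0.05 * s[0]))
dk_est = Vt[:r].T @ ((U[:, :r].T @ df) / s[:r])

print("estimated stiffness changes:", np.round(dk_est, 2))
```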

Keywords: bridge piers, inverse problems, modal sensitivity, scour detection, vibration analysis

Procedia PDF Downloads 97
3780 Memory and Narrative Rereading before and after One Week

Authors: Abigail M. Csik, Gabriel A. Radvansky

Abstract:

As people read through event-based narratives, they construct an event model that captures information about the characters, goals, location, time, and causality. For many reasons, memory for such narratives is represented at different levels, namely the surface form, textbase, and event model levels. Rereading has been shown to decrease surface form memory while, at the same time, increasing textbase and event model memories. More generally, distributed practice has consistently shown memory benefits over massed practice for different types of materials, including texts. However, little research has investigated distributed practice of narratives at different inter-study intervals and its effects on these three levels of memory. Recent work in our lab has indicated that there may be dramatic changes in patterns of forgetting around one week, which may affect the three levels of memory. The present experiment aimed to determine the effects of rereading on the three levels of memory as a function of whether the texts were reread before versus after one week. Participants (N = 42) read a set of stories, re-read them either before or after one week (with an inter-study interval of three days, seven days, or fourteen days), and then took a recognition test, from which the three levels of representation were derived. Signal detection results from this study reveal differential patterns at the three levels as a function of whether the narratives were re-read prior to one week or after one week. In particular, an ANOVA revealed that surface form memory was lower (p = .08), while textbase (p = .02) and event model memory (p = .04) were greater, when narratives were re-read 14 days later compared to 3 days later. These results have implications for which types of memory benefit from distributed practice at various inter-study intervals.

Keywords: memory, event cognition, distributed practice, consolidation

Procedia PDF Downloads 216
3779 Improving Chest X-Ray Disease Detection with Enhanced Data Augmentation Using a Novel Approach of Diverse Conditional Wasserstein Generative Adversarial Networks

Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Daniyal Haider, Xiaodong Yang

Abstract:

Chest X-rays are instrumental in the detection and monitoring of a wide array of diseases, including viral infections such as COVID-19, tuberculosis, pneumonia, lung cancer, and various cardiac and pulmonary conditions. To enhance the accuracy of diagnosis, artificial intelligence (AI) algorithms, particularly deep learning models like convolutional neural networks (CNNs), are employed. However, these deep learning models demand a substantial and varied dataset to attain optimal precision. Generative adversarial networks (GANs) can be employed to create new data, thereby supplementing the existing dataset and enhancing the accuracy of deep learning models. Nevertheless, GANs have their limitations, such as issues related to stability, convergence, and the ability to distinguish between authentic and fabricated data. In order to overcome these challenges and advance the detection and classification of normal and abnormal CXR images, this study introduces a distinctive technique known as DCWGAN (Diverse Conditional Wasserstein GAN) for generating synthetic chest X-ray (CXR) images. The study evaluates the effectiveness of the DCWGAN technique using the ResNet50 model and compares its results with those obtained using the traditional GAN approach. The findings reveal that the ResNet50 model trained on the DCWGAN-generated dataset outperformed the model trained on the classic GAN-generated dataset. Specifically, the ResNet50 model utilizing DCWGAN synthetic images achieved impressive performance metrics, with an accuracy of 0.961, precision of 0.955, recall of 0.970, and F1-measure of 0.963. These results indicate promising potential for the early detection of diseases in CXR images using this approach.
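
For concreteness, here is a minimal PyTorch sketch of a conditional Wasserstein critic loss with gradient penalty, the ingredient a DCWGAN-style generator is trained against; the conditioning scheme, constants and toy critic are assumptions, and the "diverse" component of the proposed method is not reproduced.

```python
import torch

def critic_loss(critic, real, fake, labels, lam=10.0):
    """WGAN-GP critic objective: Wasserstein term plus gradient penalty."""
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    d_hat = critic(x_hat, labels)
    grad = torch.autograd.grad(d_hat.sum(), x_hat, create_graph=True)[0]
    penalty = ((grad.flatten(1).norm(2, dim=1) - 1) ** 2).mean()
    wasserstein = critic(fake, labels).mean() - critic(real, labels).mean()
    return wasserstein + lam * penalty

# Usage on toy 64x64 single-channel CXR patches, with a stand-in critic:
critic = lambda x, y: x.flatten(1).sum(dim=1)
real = torch.randn(4, 1, 64, 64)
fake = torch.randn(4, 1, 64, 64)
labels = torch.randint(0, 2, (4,))             # normal / abnormal condition
print(critic_loss(critic, real, fake, labels))
```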

Keywords: CNN, classification, deep learning, GAN, ResNet50

Procedia PDF Downloads 75
3778 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar

Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna

Abstract:

This paper presents an analysis of airborne synthetic aperture radar (SAR) data using the range migration algorithm (RMA) for the spotlight mode of operation. Unlike the polar format algorithm (PFA), RMA mitigates space-variant defocusing and geometric distortion effects, since it does not assume that the illuminating wave-fronts are planar. This facilitates the use of RMA for imaging scenarios involving severe differential range curvature, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz. range deskew and motion compensation to a line, are performed on the raw data before it is fed to the RMA component. The various stages of the RMA, viz. 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the imaging geometry on the resolution of the final image. The ability of RMA to compensate for severe differential range curvature in the two-dimensional spatial frequency domain is also illustrated in this paper.
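
A schematic numpy skeleton of the three RMA stages named above (along-track Fourier transform, 2D matched filtering, Stolt interpolation); all constants are placeholders and the phase history is a stand-in, so this illustrates the data flow rather than a calibrated SAR processor.

```python
import numpy as np

c = 3e8
fc, B, R0 = 10e9, 200e6, 5e3                  # assumed radar constants
n_pulse, n_freq = 128, 256

data = np.ones((n_pulse, n_freq), dtype=complex)      # stand-in phase history
kr = 4 * np.pi * (fc + np.linspace(-B / 2, B / 2, n_freq)) / c

# Stage 1: along-track Fourier transform (pulse axis -> wavenumber ku).
S = np.fft.fftshift(np.fft.fft(data, axis=0), axes=0)
ku = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(n_pulse, d=1.0))

# Stage 2: 2D matched filter referenced to the scene-centre range R0.
kx = np.sqrt(np.maximum(kr[None, :] ** 2 - ku[:, None] ** 2, 0.0))
S *= np.exp(1j * R0 * (kx - kr[None, :]))

# Stage 3: Stolt interpolation -- resample each ku row from kr to uniform kx.
kx_uniform = np.linspace(kx.min(), kx.max(), n_freq)
stolt = np.stack([np.interp(kx_uniform, kx[i], S[i].real)
                  + 1j * np.interp(kx_uniform, kx[i], S[i].imag)
                  for i in range(n_pulse)])

image = np.fft.ifft2(stolt)           # 2D inverse FFT -> focused image
print(image.shape)
```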

Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation

Procedia PDF Downloads 235
3777 High-Performance Liquid Chromatographic Method with Diode Array Detection (HPLC-DAD) Analysis of Naproxen and Omeprazole Active Isomers

Authors: Marwa Ragab, Eman El-Kimary

Abstract:

Chiral separation and analysis of omeprazole and naproxen enantiomers in tablets were achieved using a high-performance liquid chromatographic method with diode array detection (HPLC-DAD). A Kromasil CelluCoat chiral column was used as the stationary phase, and the eluent consisted of hexane, isopropanol and trifluoroacetic acid in a ratio of 90:9.9:0.1, respectively. The chromatographic system was suitable for the enantiomeric separation and analysis of the active isomers of the drugs. Resolution values of 2.17 and 3.84 were obtained after optimization of the chromatographic conditions for the omeprazole and naproxen isomers, respectively. The determination of the S-isomer of each drug in its dosage form was fully validated.
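
For context, the quoted resolution values follow the standard chromatographic definition (a textbook relation, not restated in the paper):

```latex
% Chromatographic resolution between two adjacent peaks with retention
% times t_{R,1} < t_{R,2} and baseline peak widths w_1, w_2:
\[
R_s \;=\; \frac{2\left(t_{R,2} - t_{R,1}\right)}{w_1 + w_2},
\]
% with R_s >= 1.5 conventionally regarded as baseline separation.
```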

Keywords: chiral analysis, esomeprazole, S-Naproxen, HPLC-DAD

Procedia PDF Downloads 297
3776 A Combination of Anisotropic Diffusion and Sobel Operator to Enhance the Performance of the Morphological Component Analysis for Automatic Crack Detection

Authors: Ankur Dixit, Hiroaki Wagatsuma

Abstract:

Crack detection on concrete bridges is an important and constant task in civil engineering. Traditionally, humans inspect bridges for cracks to maintain their quality and reliability, but this process is long and costly. To overcome such limitations, we used a drone with a digital camera to take images of a bridge deck, and these images are processed by morphological component analysis (MCA). MCA is a strong application of sparse coding that explores the possibility of separating image components. In this paper, MCA is used to decompose the image into coarse and fine components using two dictionaries, namely anisotropic diffusion and the wavelet transform. Anisotropic diffusion is an adaptive smoothing process that adjusts the diffusion coefficient based on gray-level and gradient features. Cracks are enhanced by subtracting the diffused coarse image from the original image, and the result is treated with a Sobel edge detector and binary filtering to exhibit the cracks clearly. Our results demonstrate that the proposed MCA framework, using anisotropic diffusion followed by the Sobel operator and binary filtering, may contribute to the automation of crack detection even in severe open-field conditions such as bridge decks.
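
A minimal sketch of the diffusion/edge pair described above: Perona-Malik anisotropic diffusion builds the smooth coarse image, subtraction exposes the fine (crack) component, and Sobel plus binary thresholding localizes cracks. All parameters and the synthetic deck image are assumptions.

```python
import numpy as np
from scipy import ndimage

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, gamma=0.2):
    """Explicit Perona-Malik scheme with exponential conduction."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # small across strong edges
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += gamma * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(6)
deck = rng.normal(128, 5, size=(64, 64))
deck[32, 10:55] -= 40                        # synthetic thin crack

coarse = anisotropic_diffusion(deck)
fine = deck - coarse                         # cracks live in the residual
edges = np.hypot(ndimage.sobel(fine, axis=0), ndimage.sobel(fine, axis=1))
crack_mask = edges > edges.mean() + 3 * edges.std()
print(f"crack pixels found: {crack_mask.sum()}")
```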

Keywords: anisotropic diffusion, coarse component, fine component, MCA, Sobel edge detector, wavelet transform

Procedia PDF Downloads 168
3775 A Plasmonic Mass Spectrometry Approach for Detection of Small Nutrients and Toxins

Authors: Haiyang Su, Kun Qian

Abstract:

We developed a novel plasmonic matrix-assisted laser desorption/ionization mass spectrometry (MALDI MS) approach to detect small nutrients and toxins in complex biological emulsion samples. We used silver nanoshells (SiO₂@Ag) with optimized structures as matrices and achieved direct analysis of ~6 nL of human breast milk without any enrichment or separation. We performed identification and quantitation of small nutrients and toxins with limits of detection down to 0.4 pmol (for melamine) and reaction times shortened to minutes, superior to the conventional biochemical methods currently in use. Our approach contributes to the near-future application of MALDI MS in a broad field and to the personalized design of plasmonic materials for real-case bioanalysis.

Keywords: plasmonic materials, laser desorption/ionization, mass spectrometry, small nutrients, toxins

Procedia PDF Downloads 205
3774 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks

Authors: Naghmeh Moradpoor Sheykhkanloo

Abstract:

Structured Query Language injection (SQLI) is a code injection technique in which malicious SQL statements are inserted into a given SQL database by simply using a web browser. Losing data, disclosing confidential information or even changing the value of data are the severe damages that an SQLI attack can cause to a given database. SQLI has also been rated as the number-one attack among the top ten web application threats by the Open Web Application Security Project (OWASP), an open community dedicated to enabling organisations to conceive, develop, acquire, operate, and maintain applications that can be trusted. In this paper, we propose an effective pattern recognition neural network model for the detection and classification of SQLI attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, to generate thousands of malicious and benign URLs; a URL classifier, to (1) classify each generated URL as either benign or malicious and (2) classify the malicious URLs into different SQLI attack categories; and an NN model, to (1) detect whether a given URL is malicious or benign and (2) identify the type of SQLI attack for each malicious URL. The model is first trained and then evaluated using thousands of benign and malicious URLs. The results of the experiments are presented to demonstrate the effectiveness of the proposed approach.
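
A minimal scikit-learn sketch of the URL-classification idea, with character n-gram features feeding a small neural network; the toy URLs and architecture are illustrative stand-ins, not the paper's generated dataset or exact model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

urls = [
    "http://site.com/page?id=7",
    "http://site.com/item?sku=42&sort=asc",
    "http://site.com/page?id=7' OR '1'='1",
    "http://site.com/page?id=1; DROP TABLE users--",
]
labels = [0, 0, 1, 1]                        # 0 = benign, 1 = SQLI

clf = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 3)),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
clf.fit(urls, labels)
print(clf.predict(["http://site.com/page?id=5 UNION SELECT passwd--"]))
```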

Keywords: neural networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection

Procedia PDF Downloads 459
3773 Principal Component Analysis on Colon Cancer Detection

Authors: N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Rita Magdalena, R. D. Atmaja, Sofia Saidah, Ocky Tiaramukti

Abstract:

Colon cancer, or colorectal cancer, is a type of cancer that attacks the last part of the human digestive system. Lymphoma and carcinoma are types of cancer that attack the human colon. Colon cancer causes the deaths of about half a million people every year. In Indonesia, colon cancer is the third most common cancer in women and the second in men. Unhealthy lifestyles, such as minimal fiber consumption, rare exercise and a lack of awareness of early detection, are factors behind the high number of colon cancer cases. The aim of this project is to produce a system that can detect images and classify them as lymphoma, carcinoma, or normal. The designed system used 198 colon tissue pathology images, consisting of 66 images of lymphoma, 66 images of carcinoma and 66 of normal/healthy colon tissue. The system classifies colon cancer starting from image preprocessing, through feature extraction using principal component analysis (PCA), to classification using the k-nearest neighbor (k-NN) method. The preprocessing stages are resizing, RGB-to-grayscale conversion, edge detection and, finally, histogram equalization. Tests were carried out with several k-NN input parameter settings. The result of this project is an image processing system that can detect and classify the type of colon cancer with high accuracy and low computation time.
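
A minimal scikit-learn sketch of the described pipeline (flattened grayscale images, PCA reduction, k-NN classification); synthetic arrays with injected class structure stand in for the 198 pathology images.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = rng.normal(size=(198, 64 * 64))          # flattened preprocessed images
y = np.repeat([0, 1, 2], 66)                 # lymphoma / carcinoma / normal
X += y[:, None] * 0.1                        # inject separable structure

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(PCA(n_components=20), KNeighborsClassifier(n_neighbors=5))
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```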

Keywords: carcinoma, colorectal cancer, k-nearest neighbor, lymphoma, principal component analysis

Procedia PDF Downloads 199
3772 Machine Learning Methods for Network Intrusion Detection

Authors: Mouhammad Alkasassbeh, Mohammad Almseidin

Abstract:

Network security engineers work to keep services available at all times by handling intruder attacks. An intrusion detection system (IDS) is one of the available mechanisms used to sense and classify any abnormal actions. Therefore, the IDS must always be up to date with the latest intruder attack signatures to preserve the confidentiality, integrity, and availability of the services. The speed of the IDS is a very important issue, as is learning new attacks. This research work illustrates how the Knowledge Discovery and Data Mining (KDD) dataset is very handy for testing and evaluating different machine learning techniques. It mainly focuses on the KDD preprocessing step in order to prepare a decent and fair experimental dataset. The J48, MLP, and Bayes network classifiers were chosen for this study. It was shown that the J48 classifier achieved the highest accuracy rate for detecting and classifying all KDD dataset attacks, which are of the types DOS, R2L, U2R, and PROBE.
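
A minimal sketch of the evaluation idea: J48 is WEKA's C4.5 implementation, and scikit-learn's CART-based DecisionTreeClassifier is used here as the closest stand-in, on a synthetic placeholder for the preprocessed KDD feature matrix.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import classification_report

rng = np.random.default_rng(8)
X = rng.normal(size=(2000, 41))              # KDD records have 41 features
y = rng.integers(0, 5, size=2000)            # normal, DOS, R2L, U2R, PROBE
X[:, 0] += y                                 # inject some class structure

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
names = ["normal", "DOS", "R2L", "U2R", "PROBE"]
print(classification_report(y_te, tree.predict(X_te), target_names=names))
```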

Keywords: IDS, DDoS, MLP, KDD

Procedia PDF Downloads 232
3771 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform

Authors: David Jurado, Carlos Ávila

Abstract:

Detection of early signs of breast cancer development is crucial to quickly diagnosing the disease and defining adequate treatment to increase the patient's survival probability. Computer-aided detection systems (CADs), along with modern data techniques such as machine learning (ML) and neural networks (NNs), have shown an overall improvement in digital mammography cancer diagnosis, reducing false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML- and NN-based algorithms rely on datasets that can bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. The algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications, which are highly correlated with breast cancer development in its early stages. Automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for the automatic design of a mask that extracts statistical features from the regions of interest. The results shown in this study prove the potential of this tool for further diagnostics and classification of mammographic images, owing to its low sensitivity to noisy images and low-contrast mammographies.
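
A minimal scikit-image sketch of the segmentation core: edge extraction followed by a circle Hough transform to localize a small, roughly circular high-contrast object. The radii, thresholds and synthetic image are assumptions.

```python
import numpy as np
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks
from skimage.draw import disk

image = np.zeros((128, 128))
rr, cc = disk((40, 60), 4)
image[rr, cc] = 1.0                          # synthetic microcalcification

edges = canny(image, sigma=1.0)              # edge extraction
radii = np.arange(2, 8)
hough = hough_circle(edges, radii)           # accumulator over candidate radii
_, cx, cy, r = hough_circle_peaks(hough, radii, total_num_peaks=1)
print(f"detected circle: centre=(row {cy[0]}, col {cx[0]}), radius={r[0]}")
```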

Keywords: breast cancer, segmentation, X-ray imaging, Hough transform, image analysis

Procedia PDF Downloads 74
3770 Detection of Cryptosporidium Oocysts by Acid-Fast Staining Method and PCR in Surface Water from Tehran, Iran

Authors: Mohamad Mohsen Homayouni, Niloofar Taghipour, Ahmad Reza Memar, Niloofar Khalaji, Hamed Kiani, Seyyed Javad Seyyed Tabaei

Abstract:

Background and Objective: Cryptosporidium is a coccidian protozoan parasite, and its oocysts in surface water are a global health problem. Due to the low number of parasites in water resources and the lack of laboratory culture, a rapid and sensitive method for detection of the organism in water resources is required. We applied modified acid-fast staining and PCR for the detection of Cryptosporidium spp. and analysed the genotypes in 55 samples collected from surface water. Methods: Over a period of nine months, 55 surface water samples were collected from five rivers in Tehran, Iran. The samples were filtered using cellulose acetate membrane filters. Initial identification of Cryptosporidium oocysts in the surface water samples was carried out by the acid-fast method; then, a nested PCR assay was designed for specific amplification and genotype analysis. Results: The modified Ziehl-Neelsen method revealed 5-20 Cryptosporidium oocysts per 10 liters. Five of the 55 (9.09%) surface water samples were found positive for Cryptosporidium spp. by the Ziehl-Neelsen test, and seven (12.7%) were found positive by nested PCR. The staining results were consistent with the PCR. Seven Cryptosporidium PCR products were successfully sequenced, and five gp60 subtypes were detected. Our findings for the gp60 gene revealed that all of the positive isolates were Cryptosporidium parvum and belonged to subtype families IIa and IId. Conclusion: Our investigation showed that the collected water samples were contaminated by Cryptosporidium, posing a potentially significant health hazard. This study provides the first report on the detection and genotyping of Cryptosporidium species from surface water samples in Iran, and its results are consistent with the low clinical incidence of this parasite in the community.

Keywords: Cryptosporidium spp., membrane filtration, subtype, surface water, Iran

Procedia PDF Downloads 406
3769 Design and Development of Novel Anion Selective Chemosensors Derived from Vitamin B6 Cofactors

Authors: Darshna Sharma, Suban K. Sahoo

Abstract:

The detection of intracellular fluoride in human HeLa cancer cells was achieved by chemosensors derived from vitamin B6 cofactors using a fluorescence imaging technique. These sensors were first synthesized by condensation of pyridoxal/pyridoxal phosphate with 2-amino(thio)phenol. The anion recognition ability was explored by experimental (UV-Vis, fluorescence and 1H NMR) and theoretical DFT [B3LYP/6-31G(d,p)] methods in DMSO and mixed DMSO-H2O systems. All the developed sensors showed both a naked-eye-detectable color change and a remarkable fluorescence enhancement in the presence of F- and AcO-. The anion recognition occurred through the formation of hydrogen-bonded complexes between these anions and the sensor, followed by partial deprotonation of the sensor. The detection limits of these sensors were down to the micro(nano)molar level of F- and AcO-.

Keywords: chemosensors, fluoride, acetate, turn-on, live cell imaging, DFT

Procedia PDF Downloads 396
3768 Efficient Passenger Counting in Public Transport Based on Machine Learning

Authors: Chonlakorn Wiboonsiriruk, Ekachai Phaisangittisagul, Chadchai Srisurangkul, Itsuo Kumazawa

Abstract:

Public transportation is a crucial aspect of passenger transportation, with buses playing a vital role in the transportation service. Passenger counting is an essential tool for organizing and managing transportation services. However, manual counting is a tedious and time-consuming task, which is why computer vision algorithms are being utilized to make the process more efficient. In this study, different object detection algorithms combined with passenger tracking are investigated to compare passenger counting performance. The system employs the EfficientDet algorithm, which has demonstrated superior performance in terms of speed and accuracy. Our results show that the proposed system can accurately count passengers in varying conditions with an accuracy of 94%.
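
A minimal sketch of the counting logic that typically follows detection and tracking: a passenger is counted when a track's centroid crosses a virtual door line. This is a generic illustration and does not reproduce the paper's EfficientDet or tracking details.

```python
import numpy as np

DOOR_Y = 100.0                                # virtual counting line (pixels)

def count_crossings(tracks):
    """tracks: list of per-frame (x, y) centroid sequences, one per person."""
    boardings = 0
    for tr in tracks:
        ys = np.asarray(tr)[:, 1]
        if ys[0] < DOOR_Y <= ys[-1]:          # track moved across the line
            boardings += 1
    return boardings

tracks = [
    [(50, 80), (52, 95), (53, 110)],          # crosses the line -> counted
    [(90, 60), (91, 70), (92, 75)],           # stays above -> not counted
]
print(f"passengers boarded: {count_crossings(tracks)}")
```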

Keywords: computer vision, object detection, passenger counting, public transportation

Procedia PDF Downloads 141
3767 The Design of Multiple Detection Parallel Combined Spread Spectrum Communication System

Authors: Lixin Tian, Wei Xue

Abstract:

Many jobs in society take place underground, such as mining, tunnel construction and subways, which are vital to the development of society. Once accidents occur in these places, the interruption of traditional wired communication is not conducive to rescue work. In order to realize the positioning, early warning and command functions for underground personnel and to improve rescue efficiency, it is necessary to develop and design an emergency ground communication system. Conventional underground communication is easily subjected to narrowband interference, and spread spectrum communication can be used to address this problem. However, general spread spectrum methods such as direct-sequence spread spectrum are inefficient, so parallel combined spread spectrum (PCSS) communication is proposed to improve efficiency. PCSS communication not only has the anti-interference ability and good concealment of a traditional spread spectrum system, but also a relatively high frequency-band utilization rate and a strong information transmission capability, so this technology has been widely used in practice. This paper presents a PCSS communication model: the multiple-detection parallel combined spread spectrum (MDPCSS) communication system. The principle of the MDPCSS system is described, namely that the sequence at the transmitting end is processed in blocks and cyclically shifted to facilitate multiple detection at the receiving end. Block diagrams of the transmitter and receiver of the MDPCSS system are introduced. The calculation formula for the system bit error rate (BER) is derived, and simulation and analysis of the system BER are completed. Comparison with common parallel PCSS communication shows that the proposed scheme can indeed reduce the BER and improve system performance. Furthermore, the influence of the selected pseudo-code length on the system BER is simulated and analyzed, with the conclusion that the longer the pseudo-code, the smaller the system error rate.
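
A minimal numpy sketch of the multiple-detection idea: the receiver tests all cyclic shifts of the PN code at once via FFT-based circular correlation and recovers the transmitted shift, i.e. the information carried by the cyclic shift. The code length, shift and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
N = 127
pn = rng.choice([-1.0, 1.0], size=N)          # stand-in for a real PN code

shift_tx = 37                                 # shift encodes the data symbol
rx = np.roll(pn, shift_tx) + rng.normal(0, 0.5, N)   # noisy received block

# Circular correlation against every cyclic shift in one shot:
corr = np.fft.ifft(np.fft.fft(rx) * np.conj(np.fft.fft(pn))).real
shift_rx = int(np.argmax(corr))
print(f"sent shift {shift_tx}, detected shift {shift_rx}")   # -> 37
```

Longer PN codes raise the correlation peak relative to the noise floor, which is consistent with the reported trend of lower BER for larger pseudo-code lengths.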

Keywords: cyclic shift, multiple detection, parallel combined spread spectrum, PN code

Procedia PDF Downloads 132