Search results for: real time anomaly detection
21229 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique
Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu
Abstract:
Recently, medical imaging, and specifically medical image processing, has become one of the most dynamically developing areas of medical science, leading to new approaches to the prevention, diagnosis, and treatment of various diseases. In the diagnosis of lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to incorrect diagnosis or sampling of lung tissue. Identification and demarcation of masses for detecting cancer within lung tissue are critical challenges in diagnosis. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, a novel lung cancer detection algorithm has been presented and validated through simulation, employing CT images and multilevel thresholding. The proposed technique consists of segmentation, feature extraction, and feature selection and classification. In more detail, features carrying useful information are selected after feature extraction. Eventually, the output image of lung cancer is obtained with reported scores of 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve refining the current feature extraction method to achieve more accurate resulting images, including further details available to machine vision systems to recognise objects in lung CT scan images.
Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing
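The abstract does not give implementation details of the multilevel thresholding step; as an illustrative sketch (the function name and the choice of two thresholds/three classes are assumptions), an exhaustive multi-Otsu search over the image histogram could look like:

```python
import numpy as np

def multi_otsu_two_thresholds(image, bins=64):
    """Exhaustive search for two thresholds (t1 < t2) maximizing
    between-class variance, i.e. a 3-class multilevel Otsu method."""
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    p = hist.astype(float) / hist.sum()
    levels = np.arange(bins)
    best, best_t = -1.0, (0, 0)
    for t1 in range(1, bins - 1):
        for t2 in range(t1 + 1, bins):
            var = 0.0
            for lo, hi in ((0, t1), (t1, t2), (t2, bins)):
                w = p[lo:hi].sum()          # class probability mass
                if w > 0:
                    mu = (levels[lo:hi] * p[lo:hi]).sum() / w
                    var += w * mu ** 2      # maximizing sum of w*mu^2 is the Otsu criterion
            if var > best:
                best, best_t = var, (t1, t2)
    return best_t
```

The two returned thresholds split a CT slice into background, tissue, and candidate-mass intensity classes; production code would use an optimized library routine rather than this brute-force search.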
Procedia PDF Downloads 99
21228 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning
Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah
Abstract:
Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.
Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning
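The clustering stage described above can be sketched without the autoencoder: the snippet below runs a minimal Lloyd's K-means on stand-in 2-D "embeddings" (in the paper these would be the convolutional autoencoder representations of frame patches; the farthest-point initialization is an assumption for reproducibility):

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal Lloyd's algorithm with deterministic farthest-point init."""
    # start from X[0], then repeatedly take the point farthest from all chosen centers
    centers = [X[0]]
    for _ in range(k - 1):
        d2 = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[d2.argmax()])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        # assign each embedding to its nearest center, then recompute centers
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels, centers
```

Any `(n_frames, embedding_dim)` array works as input; the returned labels are the view-related cluster assignments.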
Procedia PDF Downloads 31
21227 Security Issues on Smart Grid and Blockchain-Based Secure Smart Energy Management Systems
Authors: Surah Aldakhl, Dafer Alali, Mohamed Zohdy
Abstract:
The next generation of electricity grid infrastructure, known as the "smart grid," integrates smart ICT (information and communication technology) into existing grids to alleviate the drawbacks of existing one-way grid systems. The efficiency and dependability of future power systems are anticipated to increase significantly thanks to the smart grid, especially given the desire for renewable energy sources. The security of the smart grid's cyber infrastructure is a growing concern, however, as a result of the interconnection of significant power plants through communication networks. Cyber-attacks can destroy energy data and leak personal information of grid members, and can therefore result in serious incidents such as huge outages and the destruction of power network infrastructure. We therefore propose a secure smart energy management system based on blockchain as a remedy for this problem. The power transmission and distribution system may undergo a transformation as a result of the inclusion of optical fiber sensors and blockchain technology in smart grids. While optical fiber sensors allow real-time monitoring and management of electrical energy flow, blockchain offers a secure platform to safeguard the smart grid against cyberattacks and unauthorized access. Additionally, this integration makes it possible to see how energy is produced, distributed, and used in real time, increasing transparency. This strategy has advantages in terms of improved security, efficiency, dependability, and flexibility in energy management. An in-depth analysis of the advantages and drawbacks of combining blockchain technology with optical fiber sensors is provided in this paper.
Keywords: smart grids, blockchain, fiber optic sensor, security
Procedia PDF Downloads 118
21226 Evaluation of Ensemble Classifiers for Intrusion Detection
Authors: M. Govindarajan
Abstract:
One of the major developments in machine learning in the past decade is the ensemble method, which builds a highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing, and their performance is analyzed in terms of accuracy. A classifier ensemble is designed using Radial Basis Function (RBF) and Support Vector Machine (SVM) as base classifiers. The feasibility and benefits of the proposed approaches are demonstrated by means of standard intrusion detection datasets. The main originality of the proposed approach lies in its three main parts: a preprocessing phase, a classification phase, and a combining phase. A wide range of comparative experiments is conducted on standard intrusion detection datasets. The performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared to that of other standard ensemble methods: the homogeneous methods include error-correcting output codes (ECOC) and Dagging, and the heterogeneous methods include majority voting and stacking. The proposed ensemble methods provide a significant improvement in accuracy compared to individual classifiers: the proposed bagged RBF and SVM perform significantly better than ECOC and Dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Heterogeneous models also exhibit better results than homogeneous models on standard intrusion detection datasets.
Keywords: data mining, ensemble, radial basis function, support vector machine, accuracy
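The bagging-plus-majority-vote scheme described above can be sketched in a dependency-free way; a tiny nearest-centroid learner stands in for the RBF/SVM base classifiers (that substitution, and all names below, are assumptions for illustration):

```python
import numpy as np

class NearestCentroid:
    """Tiny stand-in base learner: classify by the nearest class centroid."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        d2 = ((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(-1)
        return self.classes_[d2.argmin(axis=1)]

def bagged_predict(X_train, y_train, X_test, n_estimators=11, seed=0):
    """Bagging: train each component classifier on a bootstrap sample,
    then combine the component decisions by majority vote."""
    rng = np.random.default_rng(seed)
    votes = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X_train), size=len(X_train))  # bootstrap resample
        votes.append(NearestCentroid().fit(X_train[idx], y_train[idx]).predict(X_test))
    votes = np.stack(votes)          # shape: (n_estimators, n_test)
    return np.array([np.bincount(col).argmax() for col in votes.T])
```

In the paper's setting the base learners would be RBF networks and SVMs trained on intrusion detection features, but the bootstrap-and-vote structure is identical.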
Procedia PDF Downloads 246
21225 Big Data Applications for the Transport Sector
Authors: Antonella Falanga, Armando Cartenì
Abstract:
Today, an unprecedented amount of data coming from several sources, including mobile devices, sensors, tracking systems, and online platforms, characterizes our lives. The term “big data” not only refers to the quantity of data but also to the variety and speed of data generation. These data hold valuable insights that, when extracted and analyzed, facilitate informed decision-making. The 4Vs of big data - velocity, volume, variety, and value - highlight essential aspects, showcasing the rapid generation, vast quantities, diverse sources, and potential value addition of these kinds of data. This surge of information has revolutionized many sectors, such as business for improving decision-making processes, healthcare for clinical record analysis and medical research, education for enhancing teaching methodologies, agriculture for optimizing crop management, finance for risk assessment and fraud detection, media and entertainment for personalized content recommendations, emergency for a real-time response during crisis/events, and also mobility for the urban planning and for the design/management of public and private transport services. Big data's pervasive impact enhances societal aspects, elevating the quality of life, service efficiency, and problem-solving capacities. However, during this transformative era, new challenges arise, including data quality, privacy, data security, cybersecurity, interoperability, the need for advanced infrastructures, and staff training. Within the transportation sector (the one investigated in this research), applications span planning, designing, and managing systems and mobility services. Among the most common big data applications within the transport sector are, for example, real-time traffic monitoring, bus/freight vehicle route optimization, vehicle maintenance, road safety and all the autonomous and connected vehicles applications. Benefits include a reduction in travel times, road accidents and pollutant emissions. 
Within these issues, proper transport demand estimation is crucial for sustainable transportation planning. Evaluating the impact of sustainable mobility policies starts with a quantitative analysis of travel demand. Achieving transportation decarbonization goals hinges on precise estimations of demand for individual transport modes. Emerging technologies, offering substantial big data at lower costs than traditional methods, play a pivotal role in this context. Starting from these considerations, this study explores the usefulness of big data for transport demand estimation. This research focuses on leveraging (big) data collected during the COVID-19 pandemic to estimate the evolution of mobility demand in Italy. Estimation results reveal that, in the post-COVID-19 era, there are more than 96 million national daily trips, about 2.6 trips per capita, with a mobile population of more than 37.6 million Italian travelers per day. Overall, this research allows us to conclude that big data enhances rational decision-making for mobility demand estimation, which is imperative for adeptly planning and allocating investments in transportation infrastructures and services.
Keywords: big data, cloud computing, decision-making, mobility demand, transportation
Procedia PDF Downloads 61
21224 Supervised/Unsupervised Mahalanobis Algorithm for Improving Performance for Cyberattack Detection over Communications Networks
Authors: Radhika Ranjan Roy
Abstract:
Deployment of machine learning (ML)/deep learning (DL) algorithms for cyberattack detection in operational communications networks (wireless and/or wire-line) is being delayed because of low performance parameters (e.g., recall, precision, and f₁-score). If datasets become imbalanced, which is the usual case for communications networks, performance tends to become worse. Complexity in reducing the dimensions of the feature sets to increase performance is also a huge problem. Mahalanobis algorithms have been widely applied in scientific research because Mahalanobis distance metric learning is a successful framework. In this paper, we have investigated the Mahalanobis binary classifier algorithm for increasing cyberattack detection performance over communications networks as a proof of concept. We have also found that the high-dimensional information in intermediate features, which is not utilized as much for classification tasks in ML/DL algorithms, is the main contributor to the improved performance of the Mahalanobis method, even for imbalanced and sparse datasets. With no feature reduction, MD offers uniform results for precision, recall, and f₁-score for the unbalanced and sparse NSL-KDD datasets.
Keywords: Mahalanobis distance, machine learning, deep learning, NSL-KDD, local intrinsic dimensionality, chi-square, positive semi-definite, area under the curve
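A minimal two-class Mahalanobis-distance classifier can be sketched as follows (the regularization constant and function names are assumptions; the paper's full pipeline on NSL-KDD is not reproduced here):

```python
import numpy as np

def fit_class(X, eps=1e-6):
    """Estimate the mean and (regularized) inverse covariance for one class."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + eps * np.eye(X.shape[1])  # keep invertible
    return mu, np.linalg.inv(cov)

def mahalanobis_sq(x, mu, icov):
    """Squared Mahalanobis distance of sample x to a class model."""
    d = x - mu
    return float(d @ icov @ d)

def predict(x, benign, attack):
    # label 1 (attack) if x is closer, in Mahalanobis distance, to the attack class
    return int(mahalanobis_sq(x, *attack) < mahalanobis_sq(x, *benign))
```

Because the distance is scaled by the per-class covariance, the decision rule adapts to feature correlations without any explicit dimensionality reduction, which matches the no-feature-reduction setting reported in the abstract.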
Procedia PDF Downloads 77
21223 Some New Bounds for a Real Power of the Normalized Laplacian Eigenvalues
Authors: Ayşe Dilek Maden
Abstract:
For a given simple connected graph, we present some new bounds, via a new approach, for a special topological index given by the sum of real-number powers of the non-zero normalized Laplacian eigenvalues. This approach not only allows deriving old and new bounds on this topic but also gives an idea of how some previous results in similar areas can be developed.
Keywords: degree Kirchhoff index, normalized Laplacian eigenvalue, spanning tree, simple connected graph
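For concreteness, the index in question can be stated as follows (a sketch of the standard definitions; the displayed relation is the well-known connection to the degree Kirchhoff index, not one of the paper's new bounds):

```latex
% Normalized Laplacian eigenvalues of a simple connected graph G on n
% vertices and m edges: \lambda_1 \ge \dots \ge \lambda_{n-1} > \lambda_n = 0.
% The index studied is the sum of a real power of the non-zero eigenvalues:
s_\alpha(G) = \sum_{i=1}^{n-1} \lambda_i^{\alpha}, \qquad \alpha \in \mathbb{R}.
% For \alpha = -1 this recovers the degree Kirchhoff index up to scale:
Kf^{*}(G) = 2m \, s_{-1}(G).
```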
Procedia PDF Downloads 365
21222 Determination of Marbofloxacin in Pig Plasma Using LC-MS/MS and Its Application to the Pharmacokinetic Studies
Authors: Jeong Woo Kang, MiYoung Baek, Ki-Suk Kim, Kwang-Jick Lee, ByungJae So
Abstract:
Introduction: A fast, easy, and sensitive detection method was developed and validated by liquid chromatography-tandem mass spectrometry for the determination of marbofloxacin in pig plasma, which was further applied to study the pharmacokinetics of marbofloxacin. Materials and Methods: The plasma sample (500 μL) was mixed with 1.5 mL of 0.1% formic acid in MeCN to precipitate plasma proteins. After shaking for 20 min, the mixture was centrifuged at 5,000 × g for 30 min. It was dried under a nitrogen flow at 50℃. A 500 μL aliquot of the sample was injected into the LC-MS/MS system. Chromatographic analysis was carried out with a mobile phase gradient consisting of 0.1% formic acid in D.W. (A) and 0.1% formic acid in MeCN (B) on a C18 reverse-phase column. Mass spectrometry was performed in positive ion mode using multiple reaction monitoring (MRM). Results and Conclusions: The method validation was performed in the sample matrix. Good linearities (R² > 0.999) were observed, and the quantified average recoveries of marbofloxacin were 87-92% at levels of 10 ng g⁻¹ to 100 ng g⁻¹. The coefficient of variation (CV) for the described method was less than 10% over the range of concentrations studied. The limits of detection (LOD) and quantification (LOQ) were 2 and 5 ng g⁻¹, respectively. This method has also been applied successfully to pharmacokinetic analysis of marbofloxacin after intravenous (IV), intramuscular (IM), and oral (PO) administration. The mean peak plasma concentration (Cmax) was 2,597 ng g⁻¹ at 0.25 h, 2,587 ng g⁻¹ at 0.44 h, and 2,355 ng g⁻¹ at 1.58 h for IV, IM, and PO, respectively. The area under the plasma concentration-time curve (AUC0–t) was 24.8, 29.0, and 25.2 h·μg/mL for IV, IM, and PO, respectively. The elimination half-life (T1/2) was 8.6, 13.1, and 9.5 h for IV, IM, and PO, respectively. Bioavailability (F) of marbofloxacin in pigs was 117 and 101% for IM and PO, respectively.
Based on these results, marbofloxacin faces no pharmacokinetic obstacles to the development of oral formulations such as tablets and capsules.
Keywords: marbofloxacin, LC-MS/MS, pharmacokinetics, chromatographic
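The reported Cmax, AUC0–t, and T1/2 are standard non-compartmental quantities; a sketch of how they are computed from a concentration-time profile (function names are illustrative, and taking the terminal slope from the last three points is an assumption):

```python
import numpy as np

def pk_summary(t, c):
    """Cmax/Tmax, AUC(0-t) by the linear trapezoidal rule, and terminal
    half-life from a log-linear fit of the last three time points."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    i = int(np.argmax(c))
    auc = float(((c[1:] + c[:-1]) / 2 * np.diff(t)).sum())   # trapezoidal rule
    k_el = -np.polyfit(t[-3:], np.log(c[-3:]), 1)[0]         # elimination rate constant
    return {"Cmax": c[i], "Tmax": t[i], "AUC0_t": auc, "T1_2": np.log(2) / k_el}
```

Bioavailability as reported in the abstract would then follow as the ratio of extravascular to IV AUC, dose-corrected.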
Procedia PDF Downloads 546
21221 Epileptic Seizure Onset Detection via Energy and Neural Synchronization Decision Fusion
Authors: Marwa Qaraqe, Muhammad Ismail, Erchin Serpedin
Abstract:
This paper presents a novel architecture for a patient-specific epileptic seizure onset detector using scalp electroencephalography (EEG). The proposed architecture is based on decision fusion of energy- and neural-synchronization-related features. Specifically, one level of the detector calculates the condition number (CN) of an EEG matrix to evaluate the amount of neural synchronization present within the EEG channels. In parallel, the detector evaluates the energy contained in four EEG frequency subbands. The information is then fed into two independent (parallel) classification units based on support vector machines to determine the onset of a seizure event. The decisions from the two classifiers are then combined according to two fusion techniques to determine a global decision. Experimental results demonstrate that the detector based on the AND fusion technique outperforms existing detectors with a sensitivity of 100% and a detection latency of 3 seconds, while it achieves a false alarm rate of 2.76 per hour. The OR fusion technique achieves a sensitivity of 100% and significantly improves detection latency (0.17 seconds), yet it yields 12 false alarms per hour.
Keywords: epilepsy, EEG, seizure onset, electroencephalography, neuron, detection
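The two feature levels described, channel synchronization via the condition number of an EEG matrix and energy in four frequency subbands, can be sketched as follows (the delta/theta/alpha/beta band edges are the usual clinical conventions, assumed here since the abstract does not list them):

```python
import numpy as np

def synchronization_cn(eeg):
    """Condition number of a channels-by-samples EEG matrix; highly
    synchronized (near-rank-deficient) channel sets yield large values."""
    return float(np.linalg.cond(eeg))

def subband_energies(x, fs, bands=((0.5, 4), (4, 8), (8, 13), (13, 30))):
    """Energy per EEG subband from the FFT power spectrum of one channel."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    return [float(power[(freqs >= lo) & (freqs < hi)].sum()) for lo, hi in bands]
```

In the paper, these two feature streams feed two parallel SVMs whose decisions are then AND/OR fused; that classification stage is omitted here.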
Procedia PDF Downloads 475
21220 Towards Printed Green Time-Temperature Indicator
Authors: Mariia Zhuldybina, Ahmed Moulay, Mirko Torres, Mike Rozel, Ngoc-Duc Trinh, Chloé Bois
Abstract:
To reduce the global waste of perishable goods, a solution for monitoring and tracing their environmental conditions is needed. Temperature is the most controllable environmental parameter determining the kinetics of physical, chemical, and microbial spoilage in food products. To store time-temperature information, a time-temperature indicator (TTI) is a promising solution. Printed electronics (PE) has shown great potential for producing customized electronic devices using flexible substrates and inks with different functionalities. We propose to fabricate a hybrid printed TTI using environmentally friendly materials. The real-time TTI profile can be stored and transmitted to a smartphone via Near Field Communication (NFC). To ensure environmental performance, the Canadian Green Electronics NSERC Network is developing green materials for ink formulations with different functionalities. In terms of substrate, paper-based electronics has gained great interest for use in a wide range of electronic systems because of its low setup and methodology costs, as well as its eco-friendly fabrication technologies. The main objective is to deliver a prototype of the TTI using small-scale printing techniques under typical printing conditions. All sub-components of the smart label, including a memristor, a battery, an antenna compatible with the NFC protocol, and a circuit compatible with integration performed by an offsite supplier, will be fully printed with flexography or flat-bed screen printing.
Keywords: NFC, printed electronics, time-temperature indicator, hybrid electronics
Procedia PDF Downloads 163
21219 Bringing Design Science Research Methodology into Real World Applications
Authors: Maya Jaber
Abstract:
In today's ever-changing world, organizational leaders will need to transform their organizations to meet the demands they face from employees, consumers, local and federal governments, and the global market. Change agents and leaders will need a new paradigm of thinking for creative problem solving and innovation in a time of uncertainty. A new framework, developed from Design Science Research foundations together with holistic design thinking methodologies (HTDM) and action research approaches, has been developed through Dr. Jaber's research. It combines these philosophies into a three-step process that can be utilized in practice for any sustainability, change, or project management application. This framework was developed to assist in teaching the implementation of her formalized holistic strategy framework, Integral Design Thinking (IDT). Her work focuses on real-world application for the streamlining and adoption of initiatives into organizational culture transformation. This paper discusses the foundations of this philosophy and the methods for its use in practice developed in Dr. Jaber's research.
Keywords: design science research, action research, critical thinking, design thinking, organizational transformation, sustainability management, organizational culture change
Procedia PDF Downloads 179
21218 A Step Towards Automating the Synthesis of a Scene Script
Authors: Americo Pereira, Ricardo Carvalho, Pedro Carvalho, Luis Corte-Real
Abstract:
Generating 3D content is a task mostly done by hand. It requires specific knowledge not only of the tools for the task but also of the fundamentals of a 3D environment. In this work, we show that automatic generation of content can be achieved from a scene script by leveraging existing tools, so that non-experts can easily engage in 3D content generation without spending vast amounts of time exploring and learning specific tools. This proposal carries several benefits, including flexible scene synthesis with different levels of detail. Our preliminary results show that the automatically generated content is comparable to content generated by users with little experience in 3D modeling, while vastly reducing the time required for generation, and supports flexible scenarios for visual scene visualization.
Keywords: 3D virtualization, multimedia, scene script, synthesis
Procedia PDF Downloads 264
21217 CSRFDtool: Automated Detection and Prevention of a Reflected Cross-Site Request Forgery
Authors: Alaa A. Almarzuki, Nora A. Farraj, Aisha M. Alshiky, Omar A. Batarfi
Abstract:
The number of internet users increases dramatically every year, and most of these users are exposed to the dangers of attackers in one way or another. The reason for this lies in the presence of many weaknesses that are not known to ordinary users; in addition, the lack of user awareness is considered the main reason for falling into attackers' snares. Cross-Site Request Forgery (CSRF) is placed on the list of the most dangerous security threats in the OWASP Top Ten for 2013. CSRF is an attack that forces a user's browser to send or perform an unwanted request or action without the user's awareness, by exploiting a valid session between the browser and the server. When a CSRF attack succeeds, it leads to many bad consequences: an attacker may reach private and personal information and modify it. This paper aims to detect and prevent a specific type of CSRF, called reflected CSRF, in which malicious code can be injected by the attacker. This paper explores how the CSRF Detection Extension prevents reflected CSRF by checking browser-specific information. Our evaluation shows that the proposed solution succeeds in preventing this type of attack.
Keywords: CSRF, CSRF detection extension, attackers, attacks
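The paper's tool works by checking browser-specific information; a widely used complementary server-side defense (not the paper's method, included here only for context) is the synchronizer token pattern, which binds an unguessable token to each session so a cross-site request cannot supply a valid one:

```python
import hashlib
import hmac
import secrets

SECRET = secrets.token_bytes(32)  # per-deployment server secret (illustrative)

def issue_token(session_id: str) -> str:
    """Derive a CSRF token bound to the user's session."""
    return hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def verify_token(session_id: str, submitted: str) -> bool:
    """Constant-time check that the submitted form token matches the session."""
    expected = issue_token(session_id)
    return hmac.compare_digest(expected, submitted)
```

The server embeds `issue_token(...)` in each form and rejects any state-changing request whose token fails `verify_token(...)`; an attacker's forged cross-site request has no way to read the token.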
Procedia PDF Downloads 412
21216 Image Fusion Based Eye Tumor Detection
Authors: Ahmed Ashit
Abstract:
Image fusion is a significant and efficient image processing method used for detecting different types of tumors, and has been used as an effective combination technique for obtaining high-quality images that combine the anatomy and physiology of an organ. It is a key element of large biomedical machines for diagnosing cancer, such as PET-CT scanners. This thesis aims to develop an image analysis system for the detection of eye tumors. Different image processing methods are used to extract the tumor and then mark it on the original image. The images are first smoothed using median filtering. The background of the image is subtracted and then added back to the original, resulting in a brighter area of interest (the tumor area). The images are adjusted to increase the intensity of their pixels, which leads to clearer and brighter images. Once the images are enhanced, their edges are detected using Canny operators, resulting in a segmented image comprising only the pupil and the tumor for the abnormal images, and the pupil alone for the normal images that have no tumor. The normal and abnormal images are collected from two sources: “Miles Research” and “Eye Cancer”. The computerized experimental results show that the developed image fusion based eye tumor detection system is capable of detecting the eye tumor and segmenting it to be superimposed on the original image.
Keywords: image fusion, eye tumor, canny operators, superimposed
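The smoothing, background-brightening, and edge steps above can be sketched end-to-end in plain numpy (a simple gradient-magnitude threshold stands in for the full Canny detector, and the flat background estimate is an assumption, so this is a simplified sketch of the pipeline, not the thesis implementation):

```python
import numpy as np

def median3x3(img):
    """3x3 median filter built from shifted copies of an edge-padded image."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    windows = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(windows), axis=0)

def enhance(img):
    """Smooth, estimate a (flat) background, and add the difference back
    to brighten the area of interest, as in the described pipeline."""
    s = median3x3(img.astype(float))
    background = s.mean()
    return np.clip(s + (s - background), 0, 255)

def edge_map(img, thresh=30.0):
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy) > thresh   # stand-in for Canny edge detection
```

On an abnormal image the resulting edge map would outline the pupil and the tumor; on a normal image, the pupil alone.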
Procedia PDF Downloads 361
21215 Development of Web Application for Warehouse Management System: A Case Study of Ceramics Factory
Authors: Thanaphat Suwanaklang, Supaporn Suwannarongsri
Abstract:
Presently, there are many industries in Thailand producing various products for both domestic distribution and export to foreign countries. The warehouse is one of the most important areas for businesses needing to store their products. Such businesses need a suitable warehouse management system to reduce storage time and use the available space as efficiently as possible. This paper proposes the development of a web application for a warehouse management system. One of the ceramics factories in Thailand is used as a case study. By applying the ABC analysis, fixed location, commodity system, ECRS, and 7-waste theories and principles, the web application for the warehouse management system of the selected ceramics factory is developed to design the optimal storage areas for groups of products and the optimal routes of forklifts. From experimental results, it was found that the warehouse management system developed via the web application can reduce the travel distance of forklifts and the time spent searching for storage areas by 100% compared with the conventional method. In addition, the entire storage area can be monitored online and in real time.
Keywords: warehouse management system, warehouse design method, logistics system, web application
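Of the principles listed, ABC analysis is the most directly algorithmic: items are ranked by annual usage value and split by cumulative share. A minimal sketch follows (the 80%/95% cut-offs are the usual textbook values, not necessarily those used by the case-study factory):

```python
def abc_classify(annual_usage, a_cut=0.80, b_cut=0.95):
    """Rank items by annual usage value and assign ABC classes by
    cumulative share: A = top ~80%, B = next ~15%, C = remainder."""
    total = sum(annual_usage.values())
    ranked = sorted(annual_usage, key=annual_usage.get, reverse=True)
    classes, cum = {}, 0.0
    for item in ranked:
        cum += annual_usage[item] / total
        classes[item] = "A" if cum <= a_cut else ("B" if cum <= b_cut else "C")
    return classes
```

Class-A products would then be assigned the fixed storage locations closest to the dispatch area, shortening the forklift routes the paper optimizes.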
Procedia PDF Downloads 135
21214 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record
Authors: Raghavi C. Janaswamy
Abstract:
In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). The data, in the form of clear text and images, are stored or processed in a relational format in most systems. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions have been predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it closely represents real-time data and is voluminous. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to get nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate the node embeddings and eventually performing the node classifications using those embeddings. The model predicts patient conditions ranging from common to rare situations. The outcome is deemed to open up opportunities for data querying toward better predictions and accuracy.
Keywords: electronic health record, graph neural network, heterogeneous data, prediction
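The message-passing step that PyG performs during training can be sketched in plain numpy for intuition; this is one GCN-style layer only (the two-layer model, training loop, autoencoder pre-training, and TigerGraph extraction from the paper are omitted):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: ReLU(D^-1/2 (A+I) D^-1/2 H W),
    i.e. each node mixes its neighbours' (and its own) features."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # symmetric normalization
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)
```

Here `A` is the adjacency matrix of the patient/condition graph, `H` the node feature matrix, and `W` a learned weight matrix; stacking such layers and applying a softmax over condition classes yields the node classification described above.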
Procedia PDF Downloads 85
21213 Microfluidic Impedimetric Biochip and Related Methods for Measurement Chip Manufacture and Counting Cells
Authors: Amina Farooq, Nauman Zafar Butt
Abstract:
This paper is about methods and tools for counting particles of interest, such as cells. It presents a microfluidic system with interconnected electronics on a flexible substrate, inlet-outlet ports and interface schemes, sensitive and selective detection of cells, and processing of cell counts at polymer interfaces in a microscale biosensor for use in the detection of target biological and non-biological cells. The development of fluidic channels, planar fluidic contact ports, integrated metal electrodes on a flexible substrate for impedance measurements, and a surface-modification plasma treatment as an intermediate bonding layer are all part of the fabrication process. Magnetron DC sputtering is used to deposit a double metal layer (Ti/Pt) over the polypropylene film. Defined zones are established and etched using a photoresist layer. Small fluid volumes, a reduced detection region, and electrical impedance measurements over a range of frequencies improve detection sensitivity and specificity for cell counts. The procedure involves continuous flow of fluid samples containing particles of interest through the microfluidic channels, counting all particles in a portion of the sample using the electrical differential counter, which generates a bipolar pulse for each passing cell, and calculating the total number of particles of interest originally in the fluid sample using a MATLAB program and signal processing. It is thus feasible to develop a robust and economical kit for cell counting in whole-blood samples using these methods and similar devices.
Keywords: impedance, biochip, cell counting, microfluidics
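The counting step, one pulse per passing cell, reduces to counting threshold crossings in the impedance signal. A minimal sketch (the paper uses MATLAB; Python with numpy stands in here, and counting only the positive lobe of each bipolar pulse is an assumption):

```python
import numpy as np

def count_cells(signal, thresh):
    """Count rising threshold crossings: each passing cell produces one
    pulse, so each entry into the above-threshold region counts once."""
    above = signal > thresh
    return int(np.count_nonzero(above[1:] & ~above[:-1]))
```

Choosing `thresh` well above the baseline ripple but below the pulse amplitude makes the count insensitive to noise.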
Procedia PDF Downloads 158
21212 Spatial Mapping and Change Detection of a Coastal Woodland Mangrove Habitat in Fiji
Authors: Ashneel Ajay Singh, Anish Maharaj, Havish Naidu, Michelle Kumar
Abstract:
Mangrove patches are foundation species located in estuarine areas. These patches provide a nursery, food source, and protection for numerous aquatic and intertidal as well as land-based organisms. Mangroves also help in coastal protection, maintain water clarity, and are one of the biggest sinks for blue carbon sequestration. In the Pacific Island countries, numerous coastal communities have a heavy socioeconomic dependence on coastal resources, and mangroves play a key ecological and economic role in structuring the availability of these resources. Fiji has a large mangrove patch located in the Votua area of the Ba province. Globally, mangrove populations continue to decline with changes in climatic conditions and anthropogenic activities. Baseline information from wetland maps and time-series change detection is an essential reference for the development of effective mangrove management plans. These maps reveal the status of the resource and the effects arising from anthropogenic activities and climate change. In this study, we used remote sensing and GIS tools for mapping and temporal change detection over a period of more than 20 years in Votua, Fiji, using Landsat imagery. The Landsat program started in 1972, initially as the Earth Resources Technology Satellite, and has since acquired millions of images of Earth. This archive allows mapping of temporal changes in mangrove forests. The mangrove plants consisted of the species Rhizophora stylosa, Rhizophora samoensis, Bruguiera gymnorrhiza, Lumnitzera littorea, Heritiera littoralis, Excoecaria agallocha and Xylocarpus granatum. Change detection analysis revealed a significant reduction in the mangrove patch over the years. This information serves as a baseline for the development and implementation of effective management plans for one of Fiji’s biggest mangrove patches.
Keywords: climate change, GIS, Landsat, mangrove, temporal change
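The abstract does not state which change-detection method was applied; one simple, commonly used approach for Landsat time series (an assumption here, not the study's documented workflow) thresholds a vegetation index per date and differences the resulting masks:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from Landsat NIR and red bands."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids division by zero

def vegetation_loss(ndvi_t0, ndvi_t1, thresh=0.3):
    """Fraction of pixels vegetated at t0 but no longer vegetated at t1."""
    v0, v1 = ndvi_t0 > thresh, ndvi_t1 > thresh
    return float((v0 & ~v1).mean())
```

The 0.3 NDVI cut-off is a common heuristic for dense vegetation; a real mangrove study would calibrate it (or use a classifier) against ground-truth plots.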
Procedia PDF Downloads 178
21211 Infrared Lightbox and iPhone App for Improving Detection Limit of Phosphate Detecting Dip Strips
Authors: H. Heidari-Bafroui, B. Ribeiro, A. Charbaji, C. Anagnostopoulos, M. Faghri
Abstract:
In this paper, we report the development of a portable and inexpensive infrared lightbox for improving the detection limits of paper-based phosphate devices. Commercial paper-based devices utilize the molybdenum blue protocol to detect phosphate in the environment. Although these devices are easy to use and have a long shelf life, their main deficiency is their low sensitivity, based on the qualitative results obtained via a color chart. To improve the results, we constructed a compact infrared lightbox that communicates wirelessly with a smartphone. The system measures the absorbance of radiation for the molybdenum blue reaction in the infrared region of the spectrum. It consists of a lightbox illuminated by four infrared light-emitting diodes, an infrared digital camera, a Raspberry Pi microcontroller, a mini-router, and an iPhone to control the microcontroller. An iPhone application was also developed to analyze images captured by the infrared camera in order to quantify phosphate concentrations. Additionally, the app connects to an online data center to present a highly scalable worldwide system for tracking and analyzing field measurements. In this study, the detection limits for two popular commercial devices were improved by a factor of 4 for the Quantofix devices (from 1.3 ppm using visible light to 300 ppb using infrared illumination) and a factor of 6 for the Indigo units (from 9.2 ppm to 1.4 ppm), with repeatability of less than or equal to 1.2% relative standard deviation (RSD). The system also provides more granular concentration information compared to the discrete color chart used by commercial devices, and it can be easily adapted for use in other applications.
Keywords: infrared lightbox, paper-based device, phosphate detection, smartphone colorimetric analyzer
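The quantification step in a system like this typically converts camera pixel intensities into an absorbance value and then reads the concentration off a linear calibration. A minimal numpy sketch of that idea (the synthetic image patches and the calibration slope are illustrative assumptions, not the authors' app code):

```python
import numpy as np

def absorbance(sample_roi, blank_roi):
    """Beer-Lambert-style absorbance from mean pixel intensity of a
    sample region of interest versus a blank: A = -log10(I / I0)."""
    return float(-np.log10(sample_roi.mean() / blank_roi.mean()))

def to_concentration(a, slope, intercept):
    """Invert a linear calibration A = slope * C + intercept."""
    return (a - intercept) / slope

# Synthetic camera patches standing in for infrared frames
blank = np.full((20, 20), 200.0)     # bright: little absorption
sample = np.full((20, 20), 100.0)    # darker: molybdenum blue absorbs
a = absorbance(sample, blank)        # -log10(0.5) ≈ 0.301
c_ppb = to_concentration(a, slope=0.001, intercept=0.0)  # hypothetical calibration
```

In practice the calibration slope and intercept would be fitted from dip strips exposed to known phosphate standards under the same infrared illumination.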
Procedia PDF Downloads 122
21210 Detection of Telomerase Activity as Cancer Biomarker Using Nanogap-Rich Au Nanowire SERS Sensor
Authors: G. Eom, H. Kim, A. Hwang, T. Kang, B. Kim
Abstract:
Telomerase activity is overexpressed in over 85% of human cancers while suppressed in normal somatic cells. Telomerase has therefore attracted attention as a universal cancer biomarker, and the development of effective telomerase activity detection methods is urgently demanded in cancer diagnosis and therapy. Herein, we report a nanogap-rich Au nanowire (NW) surface-enhanced Raman scattering (SERS) sensor for the detection of human telomerase activity. The nanogap-rich Au NW SERS sensors were prepared simply by uniformly depositing nanoparticles (NPs) on single-crystalline Au NWs. We measured SERS spectra of methylene blue (MB) from 60 different nanogap-rich Au NWs and obtained a relative standard deviation (RSD) of 4.80%, confirming the superb reproducibility of the nanogap-rich Au NW SERS sensors. The nanogap-rich Au NW SERS sensors enable us to detect telomerase activity at 0.2 cancer cells/mL. Furthermore, telomerase activity is detectable in 7 different cancer cell lines whereas it is undetectable in normal cell lines, which suggests the potential applicability of the nanogap-rich Au NW SERS sensor in cancer diagnosis. We expect that the present nanogap-rich Au NW SERS sensor can be useful in biomedical applications, including diverse biomarker sensing.
Keywords: cancer biomarker, nanowires, surface-enhanced Raman scattering, telomerase
Procedia PDF Downloads 347
21209 Simultaneous Detection of Dopamine and Uric Acid in the Presence of Ascorbic Acid at Physiological Level Using Anodized Multiwalled Carbon Nanotube–Polydimethylsiloxane Paste Electrode
Authors: Angelo Gabriel Buenaventura, Allan Christopher Yago
Abstract:
A carbon paste electrode (CPE) composed of multiwalled carbon nanotube (MWCNT) conducting particles and a polydimethylsiloxane (PDMS) binder was used for the simultaneous detection of dopamine (DA) and uric acid (UA) in the presence of ascorbic acid (AA) at physiological levels. The MWCNT-PDMS CPE was initially activated via potentiodynamic cycling in a basic (NaOH) solution, which resulted in enhanced electrochemical properties. Electrochemical impedance spectroscopy measurements revealed a significantly lower charge transfer resistance (Rct) for the OH⁻-activated MWCNT-PDMS CPE (Rct = 5.08 kΩ) as compared to the buffer (pH 7)-activated MWCNT-PDMS CPE (Rct = 25.9 kΩ). Reversibility analysis of the Fe(CN)₆³⁻/⁴⁻ redox couple on both electrodes showed that the OH⁻-activated CPE had a peak current ratio (Ia/Ic) of 1.11 at 100 mV/s, compared with 2.12 for the buffer-activated CPE, indicating electrochemically reversible behavior of the Fe(CN)₆³⁻/⁴⁻ redox couple on the OH⁻-activated CPE even at a relatively fast scan rate. An enhanced voltammetric signal for DA and significant peak separation between DA and UA were obtained using the OH⁻-activated MWCNT-PDMS CPE in the presence of 50 μM AA via the differential pulse voltammetry technique. The anodic peak currents, which appeared at 0.263 V and 0.414 V, increased linearly with increasing concentrations of DA and UA, respectively. Linear ranges of 25-100 μM were obtained for both DA and UA. The detection limit was determined to be 3.86 μM for DA and 5.61 μM for UA. These results indicate a practical approach to the simultaneous detection of important bio-organic molecules using a simple CPE composed of MWCNT and PDMS, with base anodization as the activation technique.
Keywords: anodization, ascorbic acid, carbon paste electrodes, dopamine, uric acid
Procedia PDF Downloads 283
21208 Molecular Epidemiology of Anthrax in Georgia
Authors: N. G. Vepkhvadze, T. Enukidze
Abstract:
Anthrax is a fatal zoonotic disease of animals and humans caused by strains of Bacillus anthracis, a spore-forming gram-positive bacillus, and is also well recognized as a potential agent of bioterrorism. Infection in humans is extremely rare in the developed world and is generally due to contact with infected animals or contaminated animal products. Testing for this zoonotic disease began in Georgia in 1907 and is still performed routinely at the State Laboratory of Agriculture of Georgia to provide accurate information and efficient testing results. Each clinical sample is analyzed by RT-PCR and bacteriology methods; this study used real-time PCR assays for the detection of B. anthracis that rely on plasmid-encoded targets together with a chromosomal marker to correctly differentiate pathogenic strains from non-anthracis Bacillus species. During the period 2015-2022, the State Laboratory of Agriculture (SLA) tested 250 clinical and environmental (soil) samples from several different regions of Georgia. In total, 61 of the 250 samples were positive during this period. Based on the results, anthrax cases are mostly present in Eastern Georgia, in areas with a high density of livestock, specifically the regions of Kakheti and Kvemo Kartli. All laboratory activities are performed in accordance with international quality standards, adhering to biosafety and biosecurity rules, by qualified and experienced personnel handling pathogenic agents. Laboratory testing plays the largest role in diagnosing animals with anthrax, helping the pertinent institutions to quickly confirm a diagnosis of anthrax and evaluate the epidemiological situation, which generates important data for further responses.
Keywords: animal disease, Bacillus anthracis, edp, laboratory molecular diagnostics
Procedia PDF Downloads 85
21207 Development of a Sensitive Electrochemical Sensor Based on Carbon Dots and Graphitic Carbon Nitride for the Detection of 2-Chlorophenol and Arsenic
Authors: Theo H. G. Moundzounga
Abstract:
Arsenic and 2-chlorophenol are priority pollutants that pose serious health threats to humans and ecology. An electrochemical sensor based on graphitic carbon nitride (g-C₃N₄) and carbon dots (CDs) was fabricated and used for the determination of arsenic and 2-chlorophenol. The g-C₃N₄/CDs nanocomposite was prepared via a microwave irradiation heating method and was drop-cast and dried on the surface of a glassy carbon electrode (GCE). Transmission electron microscopy (TEM), X-ray diffraction (XRD), photoluminescence (PL), Fourier transform infrared spectroscopy (FTIR) and UV-Vis diffuse reflectance spectroscopy (UV-Vis DRS) were used to characterize the structure and morphology of the nanocomposite. Electrochemical characterization was done by electrochemical impedance spectroscopy (EIS) and cyclic voltammetry (CV). The electrochemical behavior of arsenic and 2-chlorophenol on different electrodes (GCE, CDs/GCE, and g-C₃N₄/CDs/GCE) was investigated by differential pulse voltammetry (DPV). The results demonstrated that the g-C₃N₄/CDs/GCE significantly enhanced the oxidation peak current of both analytes. The detection sensitivity was greatly improved, suggesting that this new modified electrode has great potential for the determination of trace levels of arsenic and 2-chlorophenol. Experimental conditions that affect the electrochemical response of arsenic and 2-chlorophenol were studied; the oxidation peak currents displayed a good linear relationship to concentration for 2-chlorophenol (R² = 0.948, n = 5) and arsenic (R² = 0.9524, n = 5), with a linear range from 0.5 to 2.5 μM for both analytes and detection limits of 2.15 μM and 0.39 μM, respectively. The modified electrode was used to determine arsenic and 2-chlorophenol in spiked tap and effluent water samples by the standard addition method, and the results were satisfactory. These measurements suggest that the new modified electrode is a good alternative chemical sensor for the determination of other phenols.
Keywords: electrochemistry, electrode, limit of detection, sensor
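The standard addition quantification used for the spiked water samples works by adding known increments of analyte to the sample, fitting the signal-versus-added-concentration line, and extrapolating it back to zero signal; the unknown concentration is intercept/slope. A hedged numerical sketch (the data below are synthetic, not from the study):

```python
import numpy as np

def standard_addition(added, signal):
    """Estimate the unknown concentration from a standard-addition
    series by extrapolating the fitted line to zero signal:
    C_unknown = intercept / slope."""
    slope, intercept = np.polyfit(added, signal, 1)
    return intercept / slope

# Synthetic series: true unknown = 1.0 uM, sensitivity = 2.0 signal/uM
added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # added standard, uM
signal = 2.0 * (added + 1.0)                  # peak current, arbitrary units
c_unknown = standard_addition(added, signal)  # ≈ 1.0 uM
```

The same fit also matrix-matches the calibration, which is why standard addition is preferred for complex samples such as effluent water.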
Procedia PDF Downloads 142
21206 Detection and Tracking Approach Using an Automotive Radar to Increase Active Pedestrian Safety
Authors: Michael Heuer, Ayoub Al-Hamadi, Alexander Rain, Marc-Michael Meinecke
Abstract:
Vulnerable road users, e.g., pedestrians, account for a large share of fatal accidents. To reduce these statistics, car manufacturers are intensively developing suitable safety systems, for which fast and reliable environment recognition is a major challenge. In this paper, we describe a tracking approach based solely on a 24 GHz radar sensor. While common radar signal processing discards much information, we make use of a track-before-detect filter to incorporate raw measurements. We explain how the range-Doppler spectrum can help to indicate pedestrians and stabilize tracking, even in occlusion scenarios, compared to sensors in series production.
Keywords: radar, pedestrian detection, active safety, sensor
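For readers unfamiliar with the range-Doppler spectrum the approach builds on: a radar frame arranged as chirps × samples is transformed with a 2-D FFT, so a target appears as a peak indexed by range bin and radial-velocity (Doppler) bin. A minimal numpy sketch with one synthetic point target (array sizes and bin positions are illustrative, not taken from the paper's 24 GHz sensor):

```python
import numpy as np

def range_doppler_map(adc):
    """2-D FFT over a radar frame shaped [chirps, samples]:
    FFT along samples gives range bins, FFT along chirps gives
    Doppler bins (fftshifted so zero velocity is centered)."""
    rng = np.fft.fft(adc, axis=1)                           # fast time -> range
    dop = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)  # slow time -> Doppler
    return np.abs(dop)

# Synthetic frame: one target at range bin 10, Doppler bin +4
n_chirps, n_samples = 64, 128
c = np.arange(n_chirps)[:, None]
s = np.arange(n_samples)[None, :]
adc = np.exp(2j * np.pi * (10 * s / n_samples + 4 * c / n_chirps))
rd = range_doppler_map(adc)
peak = np.unravel_index(rd.argmax(), rd.shape)  # (4 + 64//2, 10) after fftshift
```

A track-before-detect filter operates on maps like `rd` directly, instead of first thresholding them into point detections, which is how weak pedestrian returns survive.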
Procedia PDF Downloads 528
21205 RP-HPLC Method Development and Its Validation for Simultaneous Estimation of Metoprolol Succinate and Olmesartan Medoxomil Combination in Bulk and Tablet Dosage Form
Authors: S. Jain, R. Savalia, V. Saini
Abstract:
A simple, accurate, precise, sensitive and specific RP-HPLC method was developed and validated for the simultaneous estimation of metoprolol succinate and olmesartan medoxomil in bulk and tablet dosage form. The RP-HPLC method showed adequate separation of metoprolol succinate and olmesartan medoxomil from their degradation products. The separation was achieved on a Phenomenex Luna ODS C18 column (250 mm × 4.6 mm i.d., 5 μm particle size) with an isocratic mixture of acetonitrile : 50 mM phosphate buffer (pH 4.0, adjusted with glacial acetic acid) in the ratio of 55:45 v/v. The mobile phase flow rate was 1.0 mL/min, the injection volume 20 μL, and the detection wavelength 225 nm. The retention times for metoprolol succinate and olmesartan medoxomil were 2.451 ± 0.1 min and 6.167 ± 0.1 min, respectively. The linearity of the proposed method was investigated in the ranges of 5-50 μg/mL and 2-20 μg/mL for metoprolol succinate and olmesartan medoxomil, respectively, with correlation coefficients of 0.999 and 0.9996. The limits of detection were 0.2847 μg/mL and 0.1251 μg/mL, and the limits of quantification were 0.8630 μg/mL and 0.3793 μg/mL, for metoprolol succinate and olmesartan medoxomil, respectively. The proposed method was validated as per ICH guidelines for linearity, accuracy, precision, specificity and robustness for the estimation of metoprolol succinate and olmesartan medoxomil in a commercially available tablet dosage form, and the results were found to be satisfactory. Thus, the developed and validated stability-indicating method can be used successfully for marketed formulations.
Keywords: metoprolol succinate, olmesartan medoxomil, RP-HPLC method, validation, ICH
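The reported LOD/LOQ pairs are consistent with the ICH Q2 formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the response and S the calibration slope (note 0.8630/0.2847 ≈ 10/3.3). A small sketch of the computation with deliberately hypothetical σ and slope values:

```python
def lod(sigma, slope):
    """ICH Q2(R1) limit of detection: 3.3 * sigma / S."""
    return 3.3 * sigma / slope

def loq(sigma, slope):
    """ICH Q2(R1) limit of quantification: 10 * sigma / S."""
    return 10.0 * sigma / slope

# Hypothetical residual standard deviation and calibration slope
sigma, slope = 0.1, 1.2
d = lod(sigma, slope)   # 0.275 (same units as the calibration x-axis)
q = loq(sigma, slope)   # 0.8333...
```

Whatever σ and S are, the LOQ/LOD ratio is fixed at 10/3.3 ≈ 3.03, which is a quick sanity check on published validation tables.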
Procedia PDF Downloads 313
21204 Immobilization of Cobalt Ions on F-Multi-Wall Carbon Nanotubes-Chitosan Thin Film: Preparation and Application for Paracetamol Detection
Authors: Shamima Akhter, Samira Bagheri, M. Shalauddin, Wan Jefrey Basirun
Abstract:
In the present study, a nanocomposite of f-MWCNTs-chitosan was prepared by the immobilization of Co(II) transition metal through a self-assembly method and used for the voltammetric determination of paracetamol (PA). The composite material was characterized by field emission scanning electron microscopy (FESEM) and energy-dispersive X-ray analysis (EDX). The electroactivity of the cobalt-immobilized f-MWCNTs, combined with the strongly adsorptive polymer chitosan, was assessed during the electro-oxidation of paracetamol. The resulting f-MWCNTs/CTS-Co-modified GCE showed electrocatalytic activity towards the oxidation of PA. The electrochemical performance was investigated using cyclic voltammetry (CV), electrochemical impedance spectroscopy (EIS) and differential pulse voltammetry (DPV) methods. Under favorable experimental conditions, differential pulse voltammetry showed a linear dynamic range for paracetamol from 0.1 to 400 µmol L⁻¹ with a detection limit of 0.01 µmol L⁻¹. The proposed sensor exhibited significant selectivity for paracetamol detection. The proposed method was successfully applied to the determination of paracetamol in commercial tablets and a human serum sample.
Keywords: nanomaterials, paracetamol, electrochemical technique, multi-wall carbon nanotube
Procedia PDF Downloads 200
21203 A Multi-Release Software Reliability Growth Models Incorporating Imperfect Debugging and Change-Point under the Simulated Testing Environment and Software Release Time
Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar
Abstract:
Testing during software development is a crucial step, as it makes the software more efficient and dependable. To estimate a software product's reliability through the mean value function, many software reliability growth models (SRGMs) were developed under the assumption that the operating and testing environments are the same. In practice, this does not hold: when the software works in a natural field environment, its reliability differs. This article discusses an SRGM comprising a change-point and imperfect debugging in a simulated testing environment, and then extends it in a multi-release direction. Initially, software is released to the market with few features; according to market demand, the software company upgrades the current version by adding new features as time passes. We have therefore proposed a generalized multi-release SRGM in which the change-point and imperfect-debugging concepts are addressed in a simulated testing environment. The failure-increasing-rate concept has been adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets. The results demonstrate that the proposed model fits the datasets better. We also discuss the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors
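As a concrete illustration of a change-point mean value function, here is the classic Goel-Okumoto NHPP form with a fault-detection rate that switches from b1 to b2 at the change point τ. This is a generic textbook form under our own parameter choices, not necessarily the exact model proposed in the paper:

```python
import math

def mean_value(t, a, b1, b2, tau):
    """Change-point Goel-Okumoto mean value function: expected
    cumulative number of faults detected by time t, with detection
    rate b1 before the change point tau and b2 after it.
    The exponent is accumulated piecewise so m(t) is continuous at tau."""
    if t <= tau:
        return a * (1.0 - math.exp(-b1 * t))
    return a * (1.0 - math.exp(-b1 * tau - b2 * (t - tau)))

# Hypothetical parameters: a = 100 total faults, rate doubles at tau = 10
m20 = mean_value(20.0, 100.0, 0.05, 0.1, 10.0)  # ≈ 77.69 faults
```

For the multi-release extension, each release k would get its own (a, b1, b2, τ) fitted to that release's failure data, with remaining faults carried into release k+1 to model imperfect debugging.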
Procedia PDF Downloads 72
21202 EEG Diagnosis Based on Phase Space with Wavelet Transforms for Epilepsy Detection
Authors: Mohmmad A. Obeidat, Amjed Al Fahoum, Ayman M. Mansour
Abstract:
Recognizing abnormal activity in brain function is a vital issue. To determine the type of abnormal activity, either a brain image or a brain signal is usually considered. Imaging localizes a defect within a brain area and relates this area to specific body functionalities. However, some functions may be disturbed without a visible change in the brain, as in epilepsy; in such cases, imaging may not reveal the symptoms of the problem. A cheaper yet efficient approach that can be utilized to detect abnormal activity is the measurement and analysis of electroencephalogram (EEG) signals. The main goal of this work is to devise a new method that facilitates the classification of abnormal and disordered activities within the brain directly from EEG signal processing, making it applicable in an on-line monitoring system.
Keywords: EEG, wavelet, epilepsy, detection
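The phase-space side of such a method is typically a time-delay (Takens) embedding of the EEG trace, or of its wavelet sub-band coefficients, after which features such as trajectory spread are computed in the embedded space. A minimal sketch with a toy sinusoidal signal (the embedding dimension and lag are illustrative choices, not the authors' pipeline):

```python
import numpy as np

def delay_embed(x, dim, lag):
    """Time-delay (Takens) embedding: row i of the result is
    [x[i], x[i+lag], ..., x[i+(dim-1)*lag]], tracing a trajectory
    in dim-dimensional phase space."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])

# Toy 'EEG' trace; a real pipeline would embed wavelet sub-bands
t = np.linspace(0, 1, 256, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)
ps = delay_embed(x, dim=3, lag=5)   # shape (246, 3)
```

A periodic signal traces a closed loop in this space, while seizure activity changes the trajectory's geometry, which is what phase-space features quantify.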
Procedia PDF Downloads 536
21201 Time Travel Testing: A Mechanism for Improving Renewal Experience
Authors: Aritra Majumdar
Abstract:
While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and also showcases how successful an organization is in holding on to its customers. It is well documented that the lion's share of profit typically comes from existing customers. Hence, seamless management of renewal journeys across different channels goes a long way toward improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach for both business and technology teams to enhance the customer experience when customers look to extend their partnership with the organization for a defined period of time. This whitepaper focuses on the key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it calls out some best practices and common accelerator implementation ideas that are generic across verticals like healthcare, insurance, etc. In this abstract, a high-level snapshot of these pillars is provided. Time travel planning: The first step in setting up a time travel testing roadmap is appropriate planning. Planning includes identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, deciding the frequency of time travel testing, preparing to handle renewal issues in production after time travel testing is done and, most importantly, planning for test automation during time travel testing. Time travel data preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequences based on defined parameters are key to successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section discusses the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board, and the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section covers the focus areas of enterprise automation and how automation testing can be leveraged to improve overall quality without compromising the project schedule. Along with the above-mentioned items, the white paper elaborates on the best practices to follow during time travel testing and some ideas pertaining to accelerator implementation. To sum up, this paper is based on the author's real-world experience with time travel testing. While actual customer names and program-related details are not disclosed, the paper highlights key learnings that will help other teams implement time travel testing successfully.
Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas
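At the unit level, the "time travel" idea often reduces to injecting the clock into renewal logic, so tests can move the business date forward or backward without touching system time. A minimal sketch (the function name, thresholds, and dates are hypothetical, not from any specific renewal platform):

```python
import datetime as dt

def renewal_offer(today, expiry):
    """Choose the renewal journey step from days remaining.
    'today' is injected rather than read from the system clock,
    so tests can time-travel forward or backward at will."""
    days_left = (expiry - today).days
    if days_left < 0:
        return "lapsed"
    if days_left <= 30:
        return "renew-now"
    return "no-action"

expiry = dt.date(2024, 6, 30)
# Travel forward to 10 days before expiry, then past expiry
near = renewal_offer(dt.date(2024, 6, 20), expiry)   # "renew-now"
after = renewal_offer(dt.date(2024, 7, 1), expiry)   # "lapsed"
```

Enterprise time travel applies the same principle at system scale, shifting dates in databases and batch schedules rather than a single function argument, but the testable seam is the same: the business date must be a controllable input.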
Procedia PDF Downloads 158
21200 Real Fictions: Converging Landscapes and Imagination in an English Village
Authors: Edoardo Lomi
Abstract:
A problem of central interest in anthropology concerns the ethnographic displacement of modernity’s conceptual sovereignty over that of native collectives worldwide. Part of this critical project has been the association of Western modernity with a dualist, naturalist ontology. Despite its demonstrated value for comparative work, this association often comes at the cost of reproducing ideas that lack an empirical ethnographic basis. This paper proposes a way forward by bringing to bear some of the results produced by an ethnographic study of a village in Wiltshire, South England. Due to its picturesque qualities, this village has served for decades as a ready-made set for fantasy movies and a backdrop to fictional stories. These forms of mediation have in turn generated some apparent paradoxes, such as fictitious characters that affect actual material changes, films that become more real than history, and animated stories that, while requiring material grounds to unfold, inhabit a time and space in other respects distinct from that of material processes. Drawing on ongoing fieldwork and interviews with locals and tourists, this paper considers the ways villagers engage with fiction as part of their everyday lives. The resulting image is one of convergence, in the same landscape, of people and things having different ontological status. This study invites reflection on the implications of this image for diversifying our imagery of Western lifeworlds. To this end, the notion of ‘real fictions’ is put forth, connecting the ethnographic blurring of modernist distinctions (such as sign and signified, mind and matter, materiality and immateriality) with discussions on anthropology’s own reliance on fictions for critical comparative work.
Keywords: England, ethnography, landscape, modernity, mediation, ontology, post-structural theory
Procedia PDF Downloads 121