Search results for: phishing detection
2115 Fabrication of a New Electrochemical Sensor Based on New Nanostructured Molecularly Imprinted Polypyrrole for Selective and Sensitive Determination of Morphine
Authors: Samaneh Nabavi, Hadi Shirzad, Arash Ghoorchian, Maryam Shanesaz, Reza Naderi
Abstract:
Morphine (MO), the most effective painkiller, is considered the reference against which analgesics are assessed. For biomedical applications, it is necessary to detect and maintain MO concentrations in blood and urine within safe ranges. To date, most techniques for detecting MO are expensive. Recently, many electrochemical sensors for the direct determination of MO have been constructed. The molecularly imprinted polymer (MIP) is a polymeric material with built-in functionality for the recognition of a particular chemical substance through its complementary cavity. This paper reports a sensor for MO using a combination of a molecularly imprinted polymer (MIP) and differential-pulse voltammetry (DPV). Electropolymerization of MO-doped polypyrrole initially yielded poor-quality films, but a well-doped, nanostructured film with increased impregnation was obtained at pH 12. Above pH 11, MO is in its anionic form. The effect of various experimental parameters, including pH, scan rate, and accumulation time, on the voltammetric response of MO was investigated. Under optimum conditions, the concentration of MO was determined using DPV in a linear range of 7.07 × 10⁻⁶ to 2.1 × 10⁻⁴ mol L⁻¹ with a correlation coefficient of 0.999 and a detection limit of 13.3 × 10⁻⁸ mol L⁻¹. The effect of common interferents, namely ascorbic acid (AA) and uric acid (UA), on the current response of MO was studied. The modified electrode can be used for the determination of MO spiked into urine samples, and excellent recovery results were obtained. The nanostructured polypyrrole films were characterized by field emission scanning electron microscopy (FESEM) and Fourier transform infrared (FTIR) spectroscopy.
Keywords: morphine detection, sensor, polypyrrole, nanostructure, molecularly imprinted polymer
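The linear range, correlation coefficient, and detection limit reported above are related through an ordinary least-squares calibration. The Python sketch below illustrates this with the common 3σ/slope convention for the detection limit; the peak currents and blank noise are invented for illustration, since the abstract reports only the final figures.

```python
# Sketch: linear DPV calibration over the reported range, with the detection
# limit taken as 3*sigma_blank/slope (a standard convention; the abstract does
# not state which formula was used). Currents and blank noise are illustrative.
import numpy as np

conc = np.array([7.07e-6, 5.0e-5, 1.0e-4, 2.1e-4])   # mol/L, inside the linear range
current = np.array([0.71, 5.0, 10.1, 21.0])          # peak current (uA), hypothetical

slope, intercept = np.polyfit(conc, current, 1)
r = np.corrcoef(conc, current)[0, 1]                 # correlation coefficient
sigma_blank = 0.0044                                 # uA, assumed blank noise
lod = 3 * sigma_blank / slope                        # ~1.3e-7 mol/L, near the reported LOD
print(f"r = {r:.4f}, LOD = {lod:.2e} mol/L")
```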
Procedia PDF Downloads 426
2114 FracXpert: Ensemble Machine Learning Approach for Localization and Classification of Bone Fractures in Cricket Athletes
Authors: Madushani Rodrigo, Banuka Athuraliya
Abstract:
In today's world of medical diagnosis and prediction, machine learning stands out as a strong tool, transforming older approaches to healthcare. This study analyzes the use of machine learning in the specialized domain of sports medicine, with a focus on the timely and accurate detection of bone fractures in cricket athletes. Failure to identify bone fractures in real time can result in malunion or non-union conditions. To ensure proper treatment and enhance the bone healing process, fracture locations and types must be identified accurately. Interpretation of X-ray images relies on the expertise and experience of medical professionals, and radiographic images are sometimes of low quality, leading to potential issues. Therefore, a proper approach is needed to accurately localize and classify fractures in real time. The research revealed that the optimal approach must employ appropriate radiographic image processing techniques and object detection algorithms that effectively localize and accurately classify all types of fractures with high precision and in a timely manner. To overcome the challenges of misidentifying fractures, a distinct model for fracture localization and classification has been implemented. The research also incorporates radiographic image enhancement and preprocessing techniques to overcome the limitations posed by low-quality images. A classification ensemble model has been implemented using ResNet18 and VGG16. In parallel, a fracture segmentation model has been implemented using an enhanced U-Net architecture. Combining the results of these two models, the FracXpert system can accurately localize exact fracture locations along with fracture types from 12 different fracture patterns: avulsion, comminuted, compressed, dislocation, greenstick, hairline, impacted, intraarticular, longitudinal, oblique, pathological, and spiral. The system generates a confidence score indicating the degree of confidence in the predicted result. The fracture segmentation model, based on the U-Net architecture, achieved a high accuracy of 99.94%, demonstrating its precision in identifying fracture locations. Simultaneously, the classification ensemble model built on ResNet18 and VGG16 achieved an accuracy of 81.0%, showcasing its ability to categorize the various fracture patterns, which is instrumental in the fracture treatment process. In conclusion, FracXpert is a promising ML application in sports medicine, demonstrating its potential to revolutionize fracture detection processes. By leveraging the power of ML algorithms, this study contributes to the advancement of diagnostic capabilities in cricket athlete healthcare, ensuring timely and accurate identification of bone fractures for the best treatment outcomes.
Keywords: multiclass classification, object detection, ResNet18, U-Net, VGG16
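To make the ensemble step concrete, here is a minimal sketch in Python/PyTorch (the framework is an assumption; the abstract does not state one) of combining ResNet18 and VGG16 predictions over the 12 fracture classes by averaging softmax outputs. The head sizes and the averaging scheme are illustrative, not the authors' exact design.

```python
# Sketch (PyTorch assumed): averaging ResNet18 and VGG16 softmax outputs over
# the 12 fracture classes; head sizes and the averaging scheme are illustrative.
import torch
import torch.nn.functional as F
from torchvision import models

NUM_CLASSES = 12  # avulsion, comminuted, ..., spiral

resnet = models.resnet18(weights=None)
resnet.fc = torch.nn.Linear(resnet.fc.in_features, NUM_CLASSES)
vgg = models.vgg16(weights=None)
vgg.classifier[6] = torch.nn.Linear(vgg.classifier[6].in_features, NUM_CLASSES)
resnet.eval(); vgg.eval()

def ensemble_predict(x: torch.Tensor):
    """x: batch of preprocessed radiographs; returns class ids and confidences."""
    with torch.no_grad():
        p = (F.softmax(resnet(x), dim=1) + F.softmax(vgg(x), dim=1)) / 2
    conf, cls = p.max(dim=1)    # 'conf' plays the role of the confidence score
    return cls, conf
```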
Procedia PDF Downloads 124
2113 Barriers to Tuberculosis Detection in Portuguese Prisons
Authors: M. F. Abreu, A. I. Aguiar, R. Gaio, R. Duarte
Abstract:
Background: Prison establishments constitute high-risk environments for the transmission and spread of tuberculosis (TB), given their epidemiological context and the difficulty of implementing preventive and control measures. Internationally, guidelines for the control and prevention of tuberculosis in prisons have been described as incomplete and heterogeneous, owing to several identified obstacles, for example, the scarcity of human resources and of funding for prison health services. In Portugal, a protocol was created in 2014 to define and standardize procedures for the detection and prevention of tuberculosis within prisons. Objective: The main objective of this study was to identify and describe barriers to tuberculosis detection in prisons of the Porto and Lisbon districts of Portugal. Methods: A cross-sectional study was conducted from 2 January 2018 to 30 June 2018. Semi-structured questionnaires were administered to healthcare professionals working in the prisons of the districts of Porto (n=6) and Lisbon (n=8). The inclusion criterion was work experience in the area of tuberculosis (diagnosis, treatment, or follow-up). The questionnaires were self-administered, in paper format. Descriptive analyses of the questionnaire variables were made using frequencies and medians. Afterwards, a hierarchical agglomerative cluster analysis was performed. After obtaining the clusters, the chi-square test was applied to study the association between the collected variables and the clusters, with a significance level of 0.05. Results: Of the 186 health professionals, 139 met the inclusion criteria and 82 were interviewed (62.2% participation). Most were female nurses, with a median age of 34 years, on term employment contracts. The cluster analysis identified two groups with different characteristics and behaviors regarding the procedures of the protocol. Statistically significant differences were found: members of cluster 1 (78% of participants) had worked in prisons for longer (p=0.003), with 45.3% working more than 4 years, while 50% of the members of cluster 2 had worked there for less than a year; cluster 1 also more frequently reported knowing and applying the procedures of the protocol (p=0.000). Both clusters frequently reported the need for theoretical-practical training in TB (p=0.000), especially in diagnosis, treatment, and prevention, and noted the scarcity of funding for prison health services (p=0.000). Cluster 1 also more frequently reported performing the TB screening procedures (periodic and contact screening) and the procedures for transferring a prisoner with the disease (p=0.000), and reported that the material/equipment for TB screening is accessible and available (p=0.000). From these clusters, we identified as barriers the scarcity of human resources, the need for theoretical-practical training in tuberculosis, inexperience of working in prison health services, and limited knowledge of the protocol procedures. Conclusions: The barriers found in this study are the same as those described internationally. The protocol is mostly being applied in Portuguese prisons. The study also showed the need to invest in human and material resources. This investigation bridged gaps in knowledge that could help prison health services optimize the care provided for early detection and for prisoners' adherence to tuberculosis treatment.
Keywords: barriers, health care professionals, prisons, protocol, tuberculosis
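A compact sketch of the analysis pipeline described above: agglomerative clustering into two groups followed by chi-square tests at the 0.05 level. The file name and the one-hot encoding of answers are assumptions for illustration.

```python
# Sketch: two-group agglomerative clustering of questionnaire answers, then a
# chi-square test per variable at the 0.05 level. File name and one-hot
# encoding are assumptions.
import pandas as pd
from sklearn.cluster import AgglomerativeClustering
from scipy.stats import chi2_contingency

df = pd.read_csv("questionnaires.csv")            # hypothetical responses table
X = pd.get_dummies(df)                            # encode categorical answers
clusters = AgglomerativeClustering(n_clusters=2).fit_predict(X)

for col in df.columns:
    chi2, p, _, _ = chi2_contingency(pd.crosstab(df[col], clusters))
    if p < 0.05:
        print(f"{col}: associated with cluster membership (p={p:.3f})")
```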
Procedia PDF Downloads 148
2112 A Geometric Based Hybrid Approach for Facial Feature Localization
Authors: Priya Saha, Sourav Dey Roy Jr., Debotosh Bhattacharjee, Mita Nasipuri, Barin Kumar De, Mrinal Kanti Bhowmik
Abstract:
Biometric face recognition technology (FRT) has gained much attention due to its extensive variety of applications from both security and non-security perspectives. It has emerged as a secure solution for the identification and verification of personal identity. Although other biometric methods, such as fingerprint and iris scans, are available, FRT has proved an efficient technology for its user-friendliness and contact-free operation. Accurate facial feature localization plays an important role in many facial analysis applications, including biometrics and emotion recognition, but certain factors make it a challenging task. On the human face, expressions arise from subtle movements of facial muscles and are influenced by internal emotional states. These non-rigid facial movements cause noticeable alterations in the locations and usual shapes of facial landmarks and sometimes create occlusions in facial feature areas, making face recognition a difficult problem. This paper proposes a new hybrid technique for automatic landmark detection in both neutral and expressive frontal and near-frontal face images. The method uses thresholding, sequential searching, and other image processing techniques to locate the landmark points on the face. In addition, a Graphical User Interface (GUI) based software was designed that automatically detects 16 landmark points around the eyes, nose, and mouth, the regions most affected by changes in the facial muscles. The proposed system has been tested on the widely used JAFFE and Cohn-Kanade databases, as well as on the DeitY-TU face database, which was created in the Biometrics Laboratory of Tripura University under a research project funded by the Department of Electronics & Information Technology, Govt. of India. The performance of the proposed method was evaluated in terms of error measure and accuracy. The method achieves a detection rate of 98.82% on the JAFFE database, 91.27% on the Cohn-Kanade database, and 93.05% on the DeitY-TU database. We also present a comparative study of the proposed method against techniques developed by other researchers. In future work, the located features will support emotion-oriented systems through Action Unit (AU) detection.
Keywords: biometrics, face recognition, facial landmarks, image processing
Procedia PDF Downloads 413
2111 Automated Video Surveillance System for Detection of Suspicious Activities during Academic Offline Examination
Authors: G. Sandhya Devi, G. Suvarna Kumar, S. Chandini
Abstract:
This research work aims to develop a system that analyzes and identifies students who indulge in malpractice or suspicious activities during an academic offline examination. Automated video surveillance provides an optimal solution that helps monitor the students and identify malpractice events immediately. The work is organized into three modules. The first module performs an impersonation check using a PCA-based face recognition method, cross-checking each examinee's profile against the database. This module also determines the presence or absence of a student by means of an image registration technique, wherein a grid is formed from all images registered by the frontal camera at the determined positions. The second module detects facial malpractices, such as a student engaging in conversation with another to obtain unauthorized information, based on a threshold range evaluated from the state of the mouth (open or closed). The third module identifies unauthorized material or gadgets used in the examination hall by training on positive samples of the object through various stages; here, a top-view camera feed is analyzed to detect the suspicious activities. The system automatically alerts the administration when any suspicious activity is identified, thereby reducing the error rate caused by manual monitoring. This work is an improvement over our previously published work on identifying suspicious activities of examinees in an offline examination.
Keywords: impersonation, image registration, incrimination, object detection, threshold evaluation
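As an illustration of the first module, the sketch below implements a PCA-based (eigenface-style) check in Python; the gallery file, component count, and distance threshold are assumptions, since the abstract does not give these parameters.

```python
# Sketch: eigenface-style impersonation check. The gallery file, component
# count, and distance threshold are assumptions.
import numpy as np
from sklearn.decomposition import PCA

gallery = np.load("enrolled_faces.npy")        # hypothetical (n_students, h*w) array
pca = PCA(n_components=50).fit(gallery)
gallery_proj = pca.transform(gallery)

def verify(face: np.ndarray, student_id: int, threshold: float = 2000.0) -> bool:
    """True if the captured face matches the enrolled profile closely enough."""
    probe = pca.transform(face.reshape(1, -1))
    return float(np.linalg.norm(probe - gallery_proj[student_id])) < threshold
```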
Procedia PDF Downloads 231
2110 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, which may be of irregular, digit, or character shape. Objects and internal objects are difficult to extract when the structure of the image contains a bulk of clusters. Estimation results are readily obtained when the sub-regional objects are identified using the SASK algorithm. The main focus is recognizing the number of internal objects in a given image in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally the hull detection system. Detecting sub-regional hulls can increase machine learning capability in character detection and can be extended to hull recognition even in irregularly shaped objects, such as intensity-based detection of black holes in space exploration. Layered hulls are those having structured layers inside; they are useful in military services and traffic applications for identifying the number of vehicles or persons. The proposed SASK algorithm is thus helpful for identifying such regions and can be useful in subsequent decision processes (e.g., clearing traffic or estimating the number of persons on an opposing side in war).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
Procedia PDF Downloads 351
2109 Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation
Authors: Mohammad Abu-Shaira, Weishi Shi
Abstract:
Adaptive learning, a commonly employed solution to drift, involves updating predictive models online during their operation to react to concept drifts, thereby serving as a critical component and natural extension of online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM (Long Short-Term Memory Stream Cruise Control Method), a drift-adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, short-term optimization and model recalibration for immediate adjustments, and, when necessary, long-term model recalibration to ensure deeper enhancements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, and their performance is assessed across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after drifts have significantly degraded model performance. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it an ideal choice for managing high-dimensional datasets and efficiently handling large-scale data. Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, rendering it a valuable solution for online regression models addressing concept drift.
Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression
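The monitor-quantify-adapt loop can be pictured as follows, in a hedged Python sketch: a sliding window of absolute error quantifies drift magnitude and triggers either a short-term update or a long-term recalibration. The thresholds, window size, and retrain_from_memory() helper are assumptions standing in for the framework's mechanisms; 'model' is any online regressor exposing predict/partial_fit (e.g., sklearn's SGDRegressor).

```python
# Sketch: the monitor-quantify-adapt loop. 'model' is any online regressor with
# predict/partial_fit; thresholds, window size, and retrain_from_memory() are
# assumptions standing in for the framework's short-/long-term recalibration.
from collections import deque
import numpy as np

window = deque(maxlen=200)        # sliding window of absolute errors
SHORT, LONG = 1.5, 3.0            # assumed drift-magnitude thresholds

def on_example(model, x, y):
    err = abs(model.predict([x])[0] - y)
    window.append(err)
    magnitude = err / (np.median(window) + 1e-9)   # crude drift magnitude
    if magnitude > LONG:
        model = retrain_from_memory(model)   # hypothetical long-term recalibration
    elif magnitude > SHORT:
        model.partial_fit([x], [y])          # immediate short-term adjustment
    return model
```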
Procedia PDF Downloads 17
2108 Detection of Leptospira interrogans in Kidney and Urine of Water Buffalo and Its Relationship with Histopathological and Serological Findings
Authors: M. R. Haji Hajikolaei, A. A. Nikvand, A. R. Ghadrdan, M. Ghorbanpoor, B. Mohammadian
Abstract:
This study was carried out on water buffalo to detect Leptospira interrogans in kidney and urine and to examine its relationship with serological and histopathological findings. Blood, urine, and kidney samples were taken immediately after slaughter from 353 water buffaloes at the Ahvaz abattoir in Khouzestan province, Iran. Sera were initially screened at a serum dilution of 1:100 against seven live antigens of Leptospira interrogans (pomona, hardjo, ballum, icterohemorrhagiae, tarasovi, australis, and grippotyphosa) using the microscopic agglutination test (MAT), and sera with positive results were titrated against the reacting antigens in serial twofold dilutions from 1:100 to 1:800. The kidney samples were embedded in paraffin wax, and 5-µm sections were stained routinely with haematoxylin and eosin (H&E). Polymerase chain reaction (PCR) examination was performed on urine and kidney using LipL32 gene primers. Antibodies against one or more serovars at dilutions ≥1:100 were detected in the sera. The most frequent reactor was hardjo (56.2%), followed by pomona (52.3%), australis (9.8%), tarassovi (5.9%), grippotyphosa (4.5%), and icterohaemorrhagiae (3.9%). L. interrogans was detected in 43 (12.2%) of the examined buffaloes: 26 (8.2%) kidney tissues and 14 (4.8%) urine samples were separately positive in PCR, and 3 (0.84%) were positive in both kidney and urine. Of the 153 (43.3%) buffaloes with positive MAT, 24 were simultaneously positive by PCR of kidney and/or urine samples. Renal lesions such as interstitial nephritis, acute tubular necrosis (ATN), pyelonephritis, glomerulonephritis, renal fibrosis, and hydronephrosis were found in 128 (36.3%) cases. Statistical analysis indicated no significant association between the results of MAT, PCR, and interstitial nephritis.
Keywords: leptospiral infection, PCR, MAT, histopathology, river buffalo
Procedia PDF Downloads 334
2107 Cybersecurity Strategies for Protecting Oil and Gas Industrial Control Systems
Authors: Gaurav Kumar Sinha
Abstract:
The oil and gas industry is a critical component of the global economy, relying heavily on industrial control systems (ICS) to manage and monitor operations. However, these systems are increasingly becoming targets for cyber-attacks, posing significant risks to operational continuity, safety, and environmental integrity. This paper explores comprehensive cybersecurity strategies for protecting oil and gas industrial control systems. It delves into the unique vulnerabilities of ICS in this sector, including outdated legacy systems, integration with IT networks, and the increased connectivity brought by the Industrial Internet of Things (IIoT). We propose a multi-layered defense approach that includes robust network security protocols, regular system updates and patch management, advanced threat detection and response mechanisms, and stringent access control measures. We illustrate the effectiveness of these strategies in mitigating cyber risks and ensuring the resilient and secure operation of oil and gas industrial control systems. The findings underscore the necessity of a proactive and adaptive cybersecurity framework to safeguard critical infrastructure in the face of evolving cyber threats.
Keywords: cybersecurity, industrial control systems, oil and gas, cyber-attacks, network security, IoT, threat detection, system updates, patch management, access control, cybersecurity awareness, critical infrastructure, resilience, cyber threats, legacy systems, IT integration, multi-layered defense, operational continuity, safety, environmental integrity
Procedia PDF Downloads 49
2106 Development of Biosensor Chip for Detection of Specific Antibodies to HSV-1
Authors: Zatovska T. V., Nesterova N. V., Baranova G. V., Zagorodnya S. D.
Abstract:
In recent years, biosensor technologies based on the phenomenon of surface plasmon resonance (SPR) have become increasingly used in biology and medicine. They make it possible to explore, in real time, the progress of binding between biomolecules and to identify agents that specifically interact with biologically active substances immobilized on the biosensor surface (biochips). Special attention is paid to the use of biosensor analysis for determining antibody-antigen interactions in the diagnostics of diseases caused by viruses and bacteria. According to the WHO, diseases caused by the herpes simplex virus (HSV) take second place (15.8%) after influenza as a cause of death from viral infections. Current diagnostics of HSV infection include PCR and ELISA assays; the latter allows determination of the degree of immune response to viral infection and the respective stages of its progress. In this regard, the search for new and accessible diagnostic methods is very important. This work aimed to develop a biosensor chip for the detection of specific antibodies to HSV-1 in human blood serum. Proteins of HSV-1 (strain US) were used as antigens. The viral particles were accumulated in MDBK cell culture and purified by differential centrifugation in a cesium chloride density gradient. Analysis of the HSV-1 proteins was performed by polyacrylamide gel electrophoresis and ELISA. The protein concentration was measured using a DeNovix DS-11 spectrophotometer. The device for detection of antigen-antibody interactions was an optoelectronic two-channel spectrometer, 'Plasmon-6', using the SPR phenomenon in the Kretschmann optical configuration; it was developed at the Lashkarev Institute of Semiconductor Physics of NASU. The carrier was a glass plate covered with a 45-nm gold film. Screening of human blood sera was performed using the 'HSV-1 IgG ELISA' test system (GenWay, USA). Development of the biosensor chip included optimization of the conditions of viral antigen sorption and of the analysis steps. For immobilization of viral proteins, a 0.2% solution of Dextran 17,200 (Sigma, USA) was used. Sorption of antigen took place at 4-8°C for 18-24 hours. After washing the chip three times with citrate buffer (pH 5.0), a 1% solution of BSA was applied to block the sites not occupied by viral antigen. A direct dependence was found between the amount of immobilized HSV-1 antigen and the SPR response. Using the obtained biochips, panels of 25 positive and 10 negative human sera (with respect to antibodies to HSV-1) were analyzed. The average SPR response was 185 a.s. for negative sera and from 312 to 1264 a.s. for positive sera. The SPR data agreed with the ELISA results in 96% of samples, proving the great potential of SPR in such research. The possibility of biochip regeneration was investigated, and it was shown that application of a 10 mM NaOH solution ruptures the intermolecular bonds; this allows the chip to be reused several times. Thus, in this study, a biosensor chip for the detection of specific antibodies to HSV-1 was successfully developed, expanding the range of diagnostic methods for this pathogen.
Keywords: biochip, herpes virus, SPR
Procedia PDF Downloads 417
2105 Molecular Characterization and Phylogenetic Analysis of Capripoxviruses from Outbreak in Iran 2021
Authors: Maryam Torabi, Habibi, Abdolahi, Mohammadi, Hassanzadeh, Darban Maghami, Baghi
Abstract:
Sheeppox virus (SPPV) and goatpox virus (GTPV), members of the genus Capripoxvirus (CaPV), cause considerable diseases of sheep and goats. They are responsible for economic losses through animal mortality and morbidity, the cost of vaccination, and restrictions on trade in animal products. Control and eradication of CaPV depend on early detection of outbreaks, so molecular detection and genetic analysis can effectively serve this aim. This study was undertaken to molecularly characterize the SPPV and GTPV strains that have been circulating in Iran. 120 skin papule and nodule biopsies were collected from different regions of Iran and examined for SPPV and GTPV using TaqMan real-time PCR. Some of the amplified genes were sequenced, and phylogenetic trees were constructed. Of the 120 samples analysed, 98 (81.6%) were positive for CaPV by real-time PCR, and most of them were SPPV. Ten positive samples were then sequenced and characterized by amplifying the CaPV ORF 103 gene. Sequencing and phylogenetic analysis of these positive samples revealed a high percentage of identity with SPPV isolates from different countries in the Middle East. In conclusion, molecular characterization revealed nearly complete identity with all recent SPPV strains in neighboring countries; further studies are required to monitor virus evolution and transmission pathways and to better understand the virus pathobiology, which will help with SPPV control.
Keywords: molecular epidemiology, Real-Time PCR, phylogenetic analysis, capripoxviruses
Procedia PDF Downloads 150
2104 A Comparative Study on Deep Learning Models for Pneumonia Detection
Authors: Hichem Sassi
Abstract:
Pneumonia, a respiratory infection, has garnered global attention due to its rapid transmission and relatively high mortality rates. Timely detection and treatment play a crucial role in significantly reducing mortality associated with pneumonia. Presently, X-ray diagnosis stands out as a reasonably effective method. However, the manual scrutiny of a patient's chest X-ray radiograph by a proficient practitioner usually requires 5 to 15 minutes. In situations where cases are concentrated, this places immense pressure on clinicians for timely diagnosis. Relying solely on the visual acumen of imaging doctors proves inefficient, particularly given the low speed of manual analysis. Therefore, the integration of artificial intelligence into the clinical image diagnosis of pneumonia becomes imperative. Additionally, AI recognition is notably rapid, with convolutional neural networks (CNNs) demonstrating superior performance compared to human counterparts in image identification tasks. To conduct our study, we utilized a dataset of chest X-ray images obtained from Kaggle, encompassing a total of 5216 training images and 624 test images, categorized into two classes: normal and pneumonia. Employing five mainstream network algorithms, we undertook a comprehensive analysis to classify these diseases within the dataset and compared the results. The integration of artificial intelligence, particularly through improved network architectures, stands as a transformative step towards more efficient and accurate clinical diagnoses across various medical domains.
Keywords: deep learning, computer vision, pneumonia, models, comparative study
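A skeleton of such a comparison in Python/PyTorch is sketched below; the abstract does not name the five networks, so this model list, the preprocessing, and the training details are assumptions.

```python
# Sketch: head-to-head comparison of mainstream CNNs on the Kaggle chest X-ray
# set (normal vs. pneumonia). The abstract does not name the five networks, so
# this model list, preprocessing, and training setup are assumptions.
import torch
from torchvision import datasets, models, transforms

tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),   # X-rays are single-channel
    transforms.ToTensor(),
])
test = datasets.ImageFolder("chest_xray/test", tfm)     # 624 images
test_loader = torch.utils.data.DataLoader(test, batch_size=32)

def accuracy(model):
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in test_loader:
            correct += (model(x).argmax(1) == y).sum().item()
            total += y.numel()
    return correct / total

for name, ctor in {"resnet18": models.resnet18, "vgg16": models.vgg16,
                   "densenet121": models.densenet121}.items():
    net = ctor(weights="IMAGENET1K_V1")
    # ...replace the final layer with a 2-class head and fine-tune on
    # chest_xray/train (5216 images) before evaluating...
    print(name, accuracy(net))
```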
Procedia PDF Downloads 65
2103 Detection of Muscle Swelling Using the CNTs-Based POC Wearable Strain Sensor
Authors: Nadeem Qaiser, Sherjeel Munsif Khan, Muhammad Mustafa Hussian, Vincent Tung
Abstract:
One of the emerging approaches to the detection of chronic diseases is point-of-care (POC) early monitoring of symptoms, which provides a state-of-the-art personalized healthcare system. Nowadays, wearable and flexible sensors are being used for analyzing sweat, glucose, blood pressure, and other skin conditions. However, localized jaw-bone swelling, called parotid swelling, caused by some viruses has never been tracked before. To track physical motion or deformations, strain sensors, especially piezoresistive ones, are widely used. This work reports, for the first time, a carbon nanotube (CNT)-based piezoresistive sensing patch that is highly flexible and stretchable and can record muscle deformations in real time. The developed patch offers an excellent gauge factor for in-plane stretching and spatial expansion with low hysteresis. To calibrate for volumetric muscle expansion, we fabricated a pneumatic actuator that undergoes volumetric expansion and thus redefined the gauge factor. Moreover, we employ a Bluetooth Low Energy system that can send information about muscle activity in real time to a smartphone app. We used COMSOL calculations to demonstrate the mechanical robustness of the patch. The experiments showed the sensing patch's excellent cyclability, making it well suited for personal healthcare and an excellent choice for real-time POC monitoring of human muscle swelling.
Keywords: piezoresistive strain sensor, FEM simulations, CNTs sensor, flexible
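For reference, the gauge factor quoted above is the relative resistance change per unit strain, GF = (ΔR/R₀)/ε. The short sketch below computes it; the numbers are illustrative, not measured data from the paper.

```python
# Worked sketch: piezoresistive gauge factor GF = (dR/R0) / strain.
# The values below are illustrative, not measured data.
def gauge_factor(r0: float, r: float, strain: float) -> float:
    """r0: unstrained resistance (ohm), r: strained resistance, strain: dL/L."""
    return ((r - r0) / r0) / strain

# e.g., a 2% resistance rise at 0.5% strain gives GF = 4
print(gauge_factor(1000.0, 1020.0, 0.005))  # -> 4.0
```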
Procedia PDF Downloads 88
2102 Muslim Husbands' Participation in Women's Health and Illness: A Descriptive Exploratory Study Applied to Muslim Women in Indonesia
Authors: Restuning Widiasih, Katherine Nelson, Joan Skinner
Abstract:
Muslim husbands have significant roles in the family, including roles in women's health and illness. However, studies exploring Muslim husbands' participation in women's health are limited. The objective of this study was to uncover Muslim husbands' participation in women's health and illness, including cancer prevention and screening. A descriptive exploratory approach was used involving 20 Muslim women from urban and rural areas of West Java Province, Indonesia. The women shared their experiences of their husbands' support and activities in women's health and illness. The interview data were analyzed using Comparative Analysis for Interview (CAI). The women perceived that husbands fully supported their health by providing opportunities for activities and by reminding them about healthy food, their workloads, and family planning. Husbands were actively involved when women faced health issues, including sharing knowledge and experience, discussing health problems, advising medical check-ups, and accompanying them to treatments. The analysis also found that husbands were less active and offered less advice regarding the prevention and early detection of cancer. This study highlights the significant involvement of Muslim husbands in women's health and illness, yet a lack of support from husbands related to screening and cancer prevention. This could hinder Muslim women's participation in health programs for cancer prevention and early detection. Health education programs to improve Muslim husbands' understanding of women's health are needed.
Keywords: descriptive exploratory study, Muslim husbands, Muslim women, women's health and illness
Procedia PDF Downloads 514
2101 Agarose Amplification Based Sequencing (AG-seq) Characterization of Cell-free RNA in Preimplantation Spent Embryo Medium
Authors: Huajuan Shi
Abstract:
Background: Biopsy of the preimplantation embryo may increase potential risks and concerns about embryo viability. Clinically discarded spent embryo medium (SEM) has entered the view of researchers, sparking interest in noninvasive embryo screening. However, one of the major restrictions is the extremely low quantity of cf-RNA, which is difficult to amplify efficiently and without bias using traditional methods. Hence, there is an urgent need for an efficient, low-bias amplification method that can comprehensively and accurately capture cf-RNA information and truly reveal the state of SEM cf-RNA. Result: In the present study, we established an agarose PCR amplification system that significantly improved amplification sensitivity and efficiency by ~90-fold and 9.29%, respectively. We applied agarose to sequencing library preparation (named AG-seq) to quantify and characterize cf-RNA in SEM. The number of detected cf-RNAs (3533 vs 598) and the coverage of the 3' end were significantly increased, and the noise in low-abundance gene detection was reduced. AG-seq revealed an increasing percentage of 5' end adenine and alternative splicing (AS) events among short fragments (<400 bp). Further, the profiles and characterizations of cf-RNA in spent cleavage medium (SCM) and spent blastocyst medium (SBM) indicated that 4-mer end motifs of cf-RNA fragments could remarkably differentiate embryo development stages. Significance: This study established an efficient and low-cost SEM amplification and library preparation method. Moreover, we successfully described the characteristics of SEM cf-RNA of preimplantation embryos using AG-seq, including abundance features and fragment lengths. AG-seq facilitates the study of cf-RNA as a noninvasive embryo screening biomarker and opens up potential clinical utilities of trace samples.
Keywords: cell-free RNA, agarose, spent embryo medium, RNA sequencing, non-invasive detection
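The 4-mer end-motif feature mentioned above can be computed directly from fragment sequences. Below is a minimal Python sketch; the input handling is assumed, since the abstract does not describe the pipeline.

```python
# Sketch: frequency of 4-mer end motifs across cf-RNA fragments; these
# frequency vectors are what would separate development stages (SCM vs. SBM).
from collections import Counter

def end_motif_profile(fragments, k=4):
    """fragments: iterable of fragment sequences (5'->3' strings)."""
    counts = Counter(seq[:k].upper() for seq in fragments if len(seq) >= k)
    total = sum(counts.values())
    return {motif: n / total for motif, n in counts.most_common()}

# e.g., end_motif_profile(["ACGTTT", "TTAGCA"]) -> {'ACGT': 0.5, 'TTAG': 0.5}
```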
Procedia PDF Downloads 93
2100 SEAWIZARD-Multiplex AI-Enabled Graphene Based Lab-On-Chip Sensing Platform for Heavy Metal Ions Monitoring on Marine Water
Authors: M. Moreno, M. Alique, D. Otero, C. Delgado, P. Lacharmoise, L. Gracia, L. Pires, A. Moya
Abstract:
Marine environments are increasingly threatened by heavy metal contamination, including mercury (Hg), lead (Pb), and cadmium (Cd), posing significant risks to ecosystems and human health. Traditional monitoring techniques often fail to provide the spatial and temporal resolution needed for real-time detection of these contaminants, especially in remote or harsh environments. SEAWIZARD addresses these challenges by leveraging the flexibility, adaptability, and cost-effectiveness of printed electronics, integrated with microfluidics, to develop a compact, portable, and reusable sensor platform designed specifically for real-time monitoring of heavy metal ions in seawater. The SEAWIZARD sensor is a multiparametric Lab-on-Chip (LoC) device, a miniaturized system that integrates several laboratory functions into a single chip, drastically reducing sample volumes and improving adaptability. The platform integrates three printed graphene electrodes for the simultaneous detection of Hg, Cd, and Pb via square wave voltammetry; these electrodes share the reference and counter electrodes to improve space efficiency. Additionally, it integrates printed pH and temperature sensors to correct environmental interferences that may affect the accuracy of metal detection. The pH sensor is based on a carbon electrode with electrodeposited iridium oxide, while the temperature sensor is graphene-based. A protective dielectric layer is printed on top of the sensor to safeguard it in harsh marine conditions. The use of flexible polyethylene terephthalate (PET) as the substrate enables the sensor to conform to various surfaces and operate in challenging environments. One of the key innovations of SEAWIZARD is its integrated microfluidic layer, fabricated from cyclic olefin copolymer (COC). This microfluidic component provides a controlled flow of seawater over the sensing area, allowing significantly improved detection limits compared to direct water sampling. The system's dual-channel design separates the detection of heavy metals from the measurement of pH and temperature, ensuring that each parameter is measured under optimal conditions; in addition, the temperature sensor is finely tuned with a serpentine-shaped microfluidic channel to ensure precise thermal measurements. SEAWIZARD also incorporates custom electronics that allow wireless data transmission via Bluetooth, facilitating rapid data collection and user interface integration. Embedded artificial intelligence further enhances the platform by providing an automated alarm system capable of detecting predefined metal concentration thresholds and issuing warnings when limits are exceeded. This predictive feature enables early warning of potential environmental disasters, such as industrial spills or toxic levels of heavy metal pollutants, making SEAWIZARD not just a detection tool but a comprehensive monitoring and early intervention system. In conclusion, SEAWIZARD represents a significant advancement in printed electronics applied to environmental sensing. By combining flexible, low-cost materials with advanced microfluidics, custom electronics, and AI-driven intelligence, SEAWIZARD offers a highly adaptable and scalable solution for real-time, high-resolution monitoring of heavy metals in marine environments. Its compact and portable design makes it an accessible, user-friendly tool with the potential to transform water quality monitoring practices and provide critical data to protect marine ecosystems from contamination-related risks.
Keywords: lab-on-chip, printed electronics, real-time monitoring, microfluidics, heavy metal contamination
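At its core, the alarm logic described above reduces to a threshold comparison per metal; a minimal sketch follows, with illustrative limits (the project's actual thresholds are configurable and not given in the abstract).

```python
# Sketch: issue a warning when a measured concentration crosses its predefined
# threshold. The limits below are placeholders, not regulatory values.
THRESHOLDS_UG_L = {"Hg": 1.0, "Pb": 10.0, "Cd": 5.0}

def exceeded(readings: dict) -> list:
    """readings: e.g. {'Hg': 0.4, 'Pb': 12.1} in ug/L; returns metals over limit."""
    return [m for m, c in readings.items() if c > THRESHOLDS_UG_L.get(m, float("inf"))]

print(exceeded({"Hg": 0.4, "Pb": 12.1, "Cd": 2.0}))  # -> ['Pb']
```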
Procedia PDF Downloads 35
2099 Fiber Stiffness Detection of GFRP Using Combined ABAQUS and Genetic Algorithms
Authors: Gyu-Dong Kim, Wuk-Jae Yoo, Sang-Youl Lee
Abstract:
Composite structures offer numerous advantages over conventional structural systems in the form of higher specific stiffness and strength, lower life-cycle costs, and benefits such as easy installation and improved safety. Recently, there has been a considerable increase in the use of composites in engineering applications and as wraps for seismic upgrading and repairs. However, these composites deteriorate with time because of outdated materials, excessive use, repetitive loading, climatic conditions, manufacturing errors, and deficiencies in inspection methods. In particular, damaged fibers in a composite result in significant degradation of structural performance. To reduce the failure probability of composites in service, techniques are required to assess their condition and prevent the continual growth of fiber damage. Condition assessment technology and nondestructive evaluation (NDE) techniques have provided various solutions for the safety of structures by detecting damage or defects from static or dynamic responses induced by external loading. A variety of techniques based on detecting changes in the static or dynamic behavior of isotropic structures has been developed over the last two decades. These methods, based on analytical approaches, are limited in their ability to deal with complex systems, primarily because of their limitations in handling different loading and boundary conditions. Recently, investigators have introduced direct search methods based on metaheuristics and artificial intelligence, such as genetic algorithms (GA), simulated annealing (SA), and neural networks (NN), and have applied them promisingly to the field of structural identification. Among them, GAs attract our attention because they do not require a considerable amount of data in advance when dealing with complex problems, and they make a global solution search possible, as opposed to classical gradient-based optimization techniques. In this study, we propose an alternative damage-detection technique that can determine the degraded stiffness distribution of vibrating laminated composites made of glass fiber-reinforced polymer (GFRP). The proposed method uses a modified form of the bivariate Gaussian distribution function to detect degraded stiffness characteristics. In addition, this study presents a method to detect fiber property variation in laminated composite plates from the micromechanical point of view. A finite element model is used to study the free vibrations of laminated composite plates with fiber stiffness degradation. To solve the inverse problem using the combined method, this study uses only the first mode shapes of the structure as the measured frequency data. In particular, the study focuses on the effect of the interaction among various parameters, such as fiber angles, layup sequences, and damage distributions, on fiber-stiffness damage detection.
Keywords: stiffness detection, fiber damage, genetic algorithm, layup sequences
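The GA-driven inverse problem can be sketched as follows in Python: candidate stiffness-degradation parameters are scored by the mismatch between measured and FE-predicted first-mode data, with the FE solve (ABAQUS in the paper) represented by a hypothetical fe_first_mode() call. Population size, generation count, and the operators are assumptions, not the authors' settings.

```python
# Sketch: GA search for the stiffness-degradation parameters that minimize the
# mismatch between measured and FE-predicted first-mode data. fe_first_mode()
# is a hypothetical stand-in for the ABAQUS run; GA settings are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def fitness(params, measured):
    predicted = fe_first_mode(params)            # hypothetical FE (ABAQUS) call
    return -np.linalg.norm(predicted - measured) # smaller mismatch = fitter

def ga(measured, n_params=4, pop=40, gens=100, mut=0.1):
    P = rng.uniform(0.0, 1.0, (pop, n_params))   # degradation factors in [0, 1]
    for _ in range(gens):
        scores = np.array([fitness(p, measured) for p in P])
        parents = P[np.argsort(scores)[-pop // 2:]]            # selection
        kids = (parents[rng.integers(0, len(parents), pop // 2)] +
                parents[rng.integers(0, len(parents), pop // 2)]) / 2  # crossover
        kids += mut * rng.normal(size=kids.shape)              # mutation
        P = np.vstack([parents, kids])
    return P[np.argmax([fitness(p, measured) for p in P])]
```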
Procedia PDF Downloads 277
2098 Use of Galileo Advanced Features in Maritime Domain
Authors: Olivier Chaigneau, Damianos Oikonomidis, Marie-Cecile Delmas
Abstract:
GAMBAS (Galileo Advanced features for the Maritime domain: Breakthrough Applications for Safety and security) is a project funded by the European Union Agency for the Space Programme (EUSPA) that aims to identify the search-and-rescue and ship-security-alert-system needs of maritime users (including operators and fishing stakeholders) and to develop operational concepts answering these needs. The general objective of the GAMBAS project is to support the deployment of Galileo's exclusive features in the maritime domain in order to improve safety and security at sea, detection of illegal activities and the associated surveillance means, and resilience to natural and human-induced emergency situations, and to develop, integrate, demonstrate, standardize, and disseminate these new capabilities. The project aims to demonstrate: improvement of SAR (Search And Rescue) and SSAS (Ship Security Alert System) detection and response to maritime distress through the integration of new features into the SSAS beacon, in terms of cost optimization, user-friendliness, integration of Galileo and OS-NMA (Open Service Navigation Message Authentication) reception for improved authenticated localization performance and reliability, and at-sea triggering capabilities; optimization of the responsiveness of RCCs (Rescue Co-ordination Centres) to distress situations affecting vessels; and adaptation of the MCCs (Mission Control Centres) and MEOLUT (Medium Earth Orbit Local User Terminal) to the data distribution of SSAS alerts.
Keywords: Galileo new advanced features, maritime, safety, security
Procedia PDF Downloads 94
2097 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models
Authors: V. Mantey, N. Findlay, I. Maddox
Abstract:
The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to examine manually. The question therefore becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery, with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in densely populated regions. The primary challenge with detecting small buildings there is both the spatial and the spectral resolution of the optical sensor: densely packed buildings with similar construction materials are difficult to separate, because of their similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that models trained until they overfit the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires less input data. The model developed for this study has also been fine-tuned using existing, open-source building vector datasets. This is particularly valuable in the context of disaster relief, where information is required within a very short time span. Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest, and the training period itself is shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher-quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source for quality-checking the detected buildings; this greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity; however, this study demonstrates that, in cases where input data are scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
Keywords: building detection, disaster relief, mask-RCNN, satellite mapping
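Given the mask-RCNN keyword, the deliberate-overfitting step might look like the following PyTorch sketch: fine-tune a pre-trained Mask R-CNN on one region's imagery and open-source footprints, and simply keep training past the usual validation gate. The region_loader and epoch count are hypothetical.

```python
# Sketch: fine-tune a pre-trained Mask R-CNN on one region and deliberately
# train past the usual early-stopping point. 'region_loader' (images plus
# building masks derived from open-source footprint vectors) and the epoch
# count are hypothetical.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

model = maskrcnn_resnet50_fpn(weights="DEFAULT")
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
model.train()

for epoch in range(200):                     # intentionally long; no validation gate
    for images, targets in region_loader:    # hypothetical regional dataset
        loss = sum(model(images, targets).values())   # detection loss dict
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```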
Procedia PDF Downloads 171
2096 A Comprehensive Characterization of Cell-free RNA in Spent Blastocyst Medium and Quality Prediction for Blastocyst
Authors: Huajuan Shi
Abstract:
Background: Biopsy of the preimplantation embryo may increase potential risks and concerns about embryo viability. Clinically discarded spent embryo medium (SEM) has entered the view of researchers, sparking interest in noninvasive embryo screening. However, one of the major restrictions is the extremely low quantity of cf-RNA, which is difficult to amplify efficiently and without bias using traditional methods. Hence, there is an urgent need for an efficient, low-bias amplification method that can comprehensively and accurately capture cf-RNA information and truly reveal the state of SEM cf-RNA. Result: In the present study, we established an agarose PCR amplification system that significantly improved amplification sensitivity and efficiency by ~90-fold and 9.29%, respectively. We applied agarose to sequencing library preparation (named AG-seq) to quantify and characterize cf-RNA in SEM. The number of detected cf-RNAs (3533 vs 598) and the coverage of the 3' end were significantly increased, and the noise in low-abundance gene detection was reduced. AG-seq revealed an increasing percentage of 5' end adenine and alternative splicing (AS) events among short fragments (<400 bp). Further, the profiles and characterizations of cf-RNA in spent cleavage medium (SCM) and spent blastocyst medium (SBM) indicated that 4-mer end motifs of cf-RNA fragments could remarkably differentiate embryo development stages. Significance: This study established an efficient and low-cost SEM amplification and library preparation method. Moreover, we successfully described the characteristics of SEM cf-RNA of preimplantation embryos using AG-seq, including abundance features and fragment lengths. AG-seq facilitates the study of cf-RNA as a noninvasive embryo screening biomarker and opens up potential clinical utilities of trace samples.
Keywords: cell-free RNA, agarose, spent embryo medium, RNA sequencing, non-invasive detection
Procedia PDF Downloads 65
2095 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code
Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader
Abstract:
In an attempt to enrich the lives of billions of people by providing proper information, security, and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes capable of protecting the data onboard satellites. This paper is aimed at detecting and correcting such errors using the Hamming code, which uses the concept of parity and parity bits to prevent single-bit errors onboard a satellite in Low Earth Orbit. The paper focuses on the study of Low Earth Orbit satellites and on generating the Hamming code matrix to be used for EDAC using computer programs. The most effective version of the Hamming code generated was the Hamming (16, 11, 4) version, using MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming codes and the Cyclic Redundancy Check (CRC), as well as the limitations of this scheme. This version of the Hamming code guarantees single-bit error correction as well as double-bit error detection. Furthermore, it has proved to be fast, with a checking time of 5.669 nanoseconds; it has a relatively higher code rate and lower bit overhead than the other versions, and it can detect a greater percentage of errors per code length than other EDAC schemes with similar capabilities. In conclusion, with proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system.
Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single error upset
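For readers who want the mechanics, below is a minimal Python sketch (the paper itself used MATLAB) of an extended Hamming (16, 11, 4) SECDED codec: four Hamming parity bits at the power-of-two positions plus an overall parity bit, giving single-error correction and double-error detection, as the abstract describes.

```python
# Sketch: extended Hamming (16, 11, 4) SECDED code - corrects any single-bit
# error and detects double-bit errors. A minimal illustration, not the paper's
# MATLAB implementation.
def encode(data11):
    """data11: list of 11 bits -> 16-bit codeword (overall parity at index 0)."""
    word = [0] * 16
    data_pos = [i for i in range(1, 16) if i & (i - 1)]   # non-powers-of-two
    for pos, bit in zip(data_pos, data11):
        word[pos] = bit
    for p in (1, 2, 4, 8):                                # Hamming parity bits
        word[p] = sum(word[i] for i in range(1, 16) if i & p) % 2
    word[0] = sum(word) % 2                               # overall (SECDED) parity
    return word

def decode(word):
    """Return (data11, status); status in {'ok', 'corrected', 'double-error'}."""
    word = word[:]
    syndrome = 0
    for p in (1, 2, 4, 8):
        if sum(word[i] for i in range(1, 16) if i & p) % 2:
            syndrome |= p
    overall = sum(word) % 2
    if syndrome and not overall:
        status = "double-error"            # detectable but not correctable
    elif syndrome:
        word[syndrome] ^= 1                # syndrome points at the flipped bit
        status = "corrected"
    elif overall:
        word[0] ^= 1                       # the overall parity bit itself flipped
        status = "corrected"
    else:
        status = "ok"
    data_pos = [i for i in range(1, 16) if i & (i - 1)]
    return [word[i] for i in data_pos], status

cw = encode([1, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1])
cw[6] ^= 1                       # inject a single-bit error
data, status = decode(cw)        # -> original 11 bits, status == 'corrected'
```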
Procedia PDF Downloads 130
2094 A Microwave and Millimeter-Wave Transmit/Receive Switch Subsystem for Communication Systems
Authors: Donghyun Lee, Cam Nguyen
Abstract:
Multi-band systems offer a great deal of benefit in modern communication and radar systems. In particular, multi-band antenna-array radar systems, with their extended frequency diversity, provide numerous advantages in detecting, identifying, locating, and tracking a wide range of targets, including enhanced detection coverage, accurate target location, reduced survey time and cost, increased resolution, improved reliability, and richer target information. Accurate calibration is a critical issue in antenna array systems: amplitude and phase errors in multi-band and multi-polarization antenna array transceivers result in inaccurate target detection, deteriorated resolution, and reduced reliability. Furthermore, a digital beamformer without RF-domain phase shifting is less immune to unfiltered interference signals, which can lead to receiver saturation in array systems. Therefore, implementing an integrated front-end architecture that can support a calibration function with low insertion loss and a filtering function at the farthest end of an array transceiver is of great interest. We report a dual K/Ka-band T/R/Calibration switch module with a quasi-elliptic dual-bandpass filtering function implementing a Q-enhanced metamaterial transmission line. A unique dual-band frequency response is incorporated in the reception and calibration paths of the proposed switch module, utilizing a composite right/left-handed metamaterial transmission line coupled with a Colpitts-style negative-resistance generation circuit. The fabricated, fully integrated T/R/Calibration switch module in 0.18-μm BiCMOS technology exhibits an insertion loss of 4.9-12.3 dB and isolation of more than 45 dB in the reception, transmission, and calibration modes of operation. In the reception and calibration modes, the dual-band frequency response centered at 24.5 and 35 GHz exhibits out-of-band rejection of more than 30 dB relative to the passbands below 10.5 GHz and above 59.5 GHz; the rejection between the passbands reaches more than 50 dB. In all modes of operation, the input 1-dB compression point (IP1-dB) is between 4 and 11 dBm. Acknowledgement: This paper was made possible by NPRP grant # 6-241-2-102 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors.
Keywords: microwaves, millimeter waves, T/R switch, wireless communications
Procedia PDF Downloads 161
2093 Thresholding Approach for Automatic Detection of Pseudomonas aeruginosa Biofilms from Fluorescence in situ Hybridization Images
Authors: Zonglin Yang, Tatsuya Akiyama, Kerry S. Williamson, Michael J. Franklin, Thiruvarangan Ramaraj
Abstract:
Pseudomonas aeruginosa is an opportunistic pathogen that forms surface-associated microbial communities (biofilms) on artificial implant devices and on human tissue. Biofilm infections are difficult to treat with antibiotics, in part because the bacteria in biofilms are physiologically heterogeneous. One measure of biological heterogeneity in a population of cells is the cellular concentration of ribosomes, which can be probed with fluorescently labeled nucleic acids: the fluorescent signal intensity following fluorescence in situ hybridization (FISH) analysis correlates with the cellular level of ribosomes. The goals here are to provide computationally and statistically robust approaches to automatically quantify cellular heterogeneity in biofilms from a large library of epifluorescence microscopy FISH images. In this work, the initial steps toward these goals were taken by developing an automated biofilm detection approach for use with FISH images. The approach allows rapid identification of biofilm regions from FISH images that are counterstained with fluorescent dyes. This methodology provides advances over other computational methods, allowing subtraction of spurious signals and non-biological fluorescent substrata. This method will be a robust and user-friendly approach that enables users to semi-automatically detect biofilm boundaries and extract intensity values from fluorescence images for quantitative analysis of biofilm heterogeneity.
Keywords: image informatics, Pseudomonas aeruginosa, biofilm, FISH, computer vision, data visualization
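One plausible realization of the thresholding step is sketched below with scikit-image; Otsu's rule, the minimum object size, and the file name are assumptions, as the abstract does not fix these choices.

```python
# Sketch: threshold a counterstained FISH channel, drop spurious small signals,
# and report per-region intensity as a ribosome-level proxy. Otsu's method and
# the size cutoff are assumed choices.
from skimage import io, filters, morphology, measure

img = io.imread("fish_channel.tif")            # hypothetical counterstain channel
mask = img > filters.threshold_otsu(img)       # global threshold
mask = morphology.remove_small_objects(mask, min_size=64)   # subtract spurious signal
labels = measure.label(mask)                   # connected biofilm regions

for region in measure.regionprops(labels, intensity_image=img):
    print(region.label, region.area, region.mean_intensity)
```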
Procedia PDF Downloads 137
2092 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, the models are based on the Dirichlet Process Mixture of Generalized Linear regression models (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data, aggregated at 15-minute intervals, from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states; the influence on the free-flow and congested states was estimated to be higher than on the transitional state in both the morning and evening peak periods. Estimation of the critical speed thresholds using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection
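As a rough stand-in for the DML model (which additionally regresses speed on occupancy within each component), the sketch below clusters 15-minute speed/occupancy records with a Dirichlet-process mixture prior via scikit-learn; the file and column names and the truncation level are assumptions.

```python
# Sketch: Dirichlet-process mixture over 15-minute speed/occupancy records.
# sklearn's BayesianGaussianMixture stands in for the paper's Bayesian DML;
# file/column names and the truncation level are assumptions.
import pandas as pd
from sklearn.mixture import BayesianGaussianMixture

data = pd.read_csv("i295_2015.csv")[["speed", "occupancy"]]   # hypothetical file
dpgmm = BayesianGaussianMixture(
    n_components=10,                          # truncation; surplus states get ~0 weight
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(data)

data["state"] = dpgmm.predict(data)           # e.g., free-flow / transitional / congested
print(dpgmm.weights_.round(3))                # effective number of traffic states
```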
Procedia PDF Downloads 294
2091 Building User Behavioral Models by Processing Web Logs and Clustering Mechanisms
Authors: Madhuka G. P. D. Udantha, Gihan V. Dias, Surangika Ranathunga
Abstract:
Today's Websites host very interesting applications, but only a few methodologies exist for analyzing user navigation through a Website and determining whether the site is being put to its intended use. Web logs are typically consulted only after a major attack or malfunction, yet they record a wealth of information about the users of a system. Analyzing web logs has become a challenge due to the huge log volume, and finding interesting patterns is difficult because of the size and distribution of the logs and the importance of minor details in each entry. Retrieving this information gives an idea of what users need, allows grouping users according to their various needs, and helps improve the site so that it is effective and efficient. The model we built detects attacks, system malfunctions, and anomalies. Logs become more complex as traffic volume and the size and complexity of the Website grow. This solution uses unsupervised techniques and is fully automated; expert knowledge is used only in validation. In our approach, we first clean and purify the logs to bring them to a common platform with a standard format and structure. After the cleaning module, the web session builder is executed. It outputs two files: a Web Sessions file and an Indexed URLs file. The Indexed URLs file contains the list of URLs accessed and their indices; the Web Sessions file lists the indices of each web session. The DBSCAN and EM algorithms are then applied iteratively and recursively to obtain the best clustering of the web sessions. Using homogeneity, completeness, V-measure, intra- and inter-cluster distance, and the silhouette coefficient as parameters, the algorithms self-evaluate to select better parameter values for subsequent runs. If a cluster is found to be too large, micro-clustering is used. The Cluster Signature Module annotates each cluster with a unique signature called a fingerprint: each cluster is fed to an Associative Rule Learning Module, and if an access sequence is output with confidence and support equal to 1, it is a potential signature for the cluster. The occurrences of that access sequence are then checked in the other clusters; if it is unique to the cluster under consideration, the cluster is annotated with the signature. These signatures support anomaly detection, cyber-attack prevention, real-time dashboards that visualize users accessing web pages, prediction of user actions, and various other applications for finance, university, and news and media Websites.
Keywords: anomaly detection, clustering, pattern recognition, web sessions
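The clustering step could look roughly like the following sketch (not the paper's implementation): sessions encoded as indexed-URL count vectors are clustered with DBSCAN over a few candidate eps values, and the silhouette coefficient provides the self-evaluation the abstract describes; the synthetic session data and the parameter grid are assumptions.

```python
# An illustrative sketch (not the paper's pipeline) of DBSCAN clustering of
# web sessions with silhouette-based self-evaluation, using scikit-learn.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import silhouette_score

def cluster_sessions(session_vectors: np.ndarray):
    """Try several eps values and keep the best-scoring DBSCAN clustering."""
    best_labels, best_score = None, -1.0
    for eps in (12.0, 16.0, 20.0, 24.0):
        labels = DBSCAN(eps=eps, min_samples=5).fit_predict(session_vectors)
        core = labels != -1  # exclude DBSCAN noise points
        # Silhouette needs at least two clusters among the non-noise points.
        if len(set(labels[core])) < 2:
            continue
        score = silhouette_score(session_vectors[core], labels[core])
        if score > best_score:
            best_labels, best_score = labels, score
    return best_labels, best_score

# Hypothetical sessions: two behavioral groups over 30 indexed URLs.
rng = np.random.default_rng(1)
sessions = np.vstack([rng.poisson(3.0, (100, 30)),
                      rng.poisson(8.0, (100, 30))]).astype(float)
labels, score = cluster_sessions(sessions)
print("best silhouette:", score)
```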
Procedia PDF Downloads 289
2090 Design and Simulation of a Radiation Spectrometer Using Scintillation Detectors
Authors: Waleed K. Saib, Abdulsalam M. Alhawsawi, Essam Banoqitah
Abstract:
The idea of this research is to design a radiation spectrometer using an LSO scintillation detector coupled to a C-series SiPM (silicon photomultiplier). The device detects gamma and X-ray radiation and is also designed to estimate the activity of source contamination. The SiPM detects light in the visible range above a threshold and reads the events as counts. Three gamma sources with various activities were used in the experiments: Cs-137, Am-241, and Co-60. These sources were applied in four experiments: operating the SiPM as a spectrometer, energy resolution, pile-up rejection, and efficiency. The SiPM is connected to an MCA to perform as a spectrometer. Cerium-doped lutetium silicate (Lu₂SiO₅), with a light yield of 26000 photons/MeV, was coupled with the SiPM. As a result, all the main features of Cs-137, Am-241, and Co-60 were identified in the MCA. The experiment shows how photon energy and the probability of interaction are inversely related: total attenuation decreases as photon energy increases. An analytical calculation was made to obtain the FWHM resolution for each gamma source. The FWHM resolution is 28.75% for Am-241 (59 keV), 7.85% for Cs-137 (662 keV), 4.46% for Co-60 (1173 keV), and 3.70% for Co-60 (1332 keV). Moreover, the experiment shows that the dead time and the number of counts decreased when pile-up rejection was disabled, and the FWHM decreased when pile-up rejection was enabled. Efficiencies were calculated at four distances from the detector: 2, 4, 8, and 16 cm. The detection efficiency was observed to decline exponentially with increasing distance from the detector face. Conclusively, the SiPM board operated with an LSO scintillator crystal as a spectrometer, and its energy resolution for the three gamma sources compared well with other PMTs.
Keywords: PMT, radiation, radiation detection, scintillation detectors, silicon photomultiplier, spectrometer
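The FWHM resolution figures can be reproduced in principle by fitting a Gaussian to a photopeak and dividing the FWHM by the peak energy. The sketch below (not the authors' analysis) does this for a synthetic Cs-137 662 keV peak with SciPy; the histogram values are invented.

```python
# A small worked sketch (not the authors' analysis) of estimating energy
# resolution: fit a Gaussian to the photopeak, convert sigma to FWHM, and
# report FWHM / peak-energy as a percentage (e.g., ~7.85% at 662 keV).
import numpy as np
from scipy.optimize import curve_fit

def gaussian(e, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((e - mu) / sigma) ** 2)

# Hypothetical MCA histogram around the Cs-137 662 keV photopeak.
energy = np.linspace(550, 780, 200)
counts = gaussian(energy, 1200, 662, 22) + np.random.default_rng(2).poisson(30, 200)

popt, _ = curve_fit(gaussian, energy, counts, p0=(1000, 660, 20))
amp, mu, sigma = popt
fwhm = 2 * np.sqrt(2 * np.log(2)) * abs(sigma)   # FWHM = 2.355 * sigma
print(f"FWHM = {fwhm:.1f} keV, resolution = {100 * fwhm / mu:.2f}% at {mu:.0f} keV")
```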
Procedia PDF Downloads 156
2089 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering
Abstract:
Carbon fiber reinforced polymer (CFRP) composites used in aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike: the internal structure can be damaged by excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. Assessing internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and the existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model relates the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the materials. A fault detection and identification (FDI) module uses the physics-based model and a particle filtering algorithm to identify the damage mode and calculate the probability of structural failure. Extensive simulation results substantiate the proposed fault diagnosis methodology in both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/Epoxy laminate under a simulated lightning strike.
Keywords: carbon composite, fault detection, fault identification, particle filter
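A bootstrap particle filter of the kind the FDI module relies on can be sketched in a few lines. The version below is not the paper's resistor-network model: it tracks a hidden resistance-like damage parameter through a random-walk state model, with all noise levels and the degradation trend assumed for illustration.

```python
# A minimal bootstrap particle filter sketch (not the paper's model) for
# tracking a hidden damage parameter from noisy resistance measurements.
import numpy as np

rng = np.random.default_rng(3)

def particle_filter(measurements, n_particles=1000,
                    process_std=0.02, meas_std=0.1):
    particles = rng.normal(1.0, 0.2, n_particles)  # prior over resistance
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z in measurements:
        # Propagate: assumed random-walk degradation model.
        particles += rng.normal(0.0, process_std, n_particles)
        # Weight by measurement likelihood (Gaussian sensor noise).
        weights *= np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        weights /= weights.sum()
        # Systematic resampling to avoid weight degeneracy.
        idx = np.searchsorted(np.cumsum(weights),
                              (rng.random() + np.arange(n_particles)) / n_particles)
        particles = particles[idx]
        weights = np.full(n_particles, 1.0 / n_particles)
        estimates.append(particles.mean())
    return np.array(estimates)

truth = 1.0 + 0.01 * np.arange(50)     # slowly rising resistance (hypothetical)
obs = truth + rng.normal(0, 0.1, 50)   # noisy sensor readings
print(particle_filter(obs)[-5:])
```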
Procedia PDF Downloads 196
2088 Advanced Magnetic Resonance Imaging in Differentiation of Neurocysticercosis and Tuberculoma
Authors: Rajendra N. Ghosh, Paramjeet Singh, Niranjan Khandelwal, Sameer Vyas, Pratibha Singhi, Naveen Sankhyan
Abstract:
Background: Tuberculoma and neurocysticercosis (NCC) are the two most common intracranial infections in developing countries. They often mimic each other on neuroimaging and, in the absence of typical imaging features, cause significant diagnostic dilemmas. Differentiation is extremely important to avoid empirical exposure to antitubercular medications or nonspecific treatment that allows disease progression. Purpose: To better characterize and differentiate CNS tuberculoma and NCC using morphological and multiple advanced functional MRI sequences. Material and Methods: A total of fifty untreated patients (20 with tuberculoma and 30 with NCC) were evaluated using conventional and advanced sequences, including CISS, SWI, DWI, DTI, magnetization transfer (MT), T2 relaxometry (T2R), perfusion, and spectroscopy. rCBV, ADC, FA, T2R, and MTR values and metabolite ratios were calculated from the lesion and normal parenchyma. Diagnosis was confirmed by typical biochemical, histopathological, and imaging features. Results: CISS was the most useful sequence for scolex detection (90% on CISS vs. 73% on routine sequences), and SWI showed higher scolex detection ability. Mean values of ADC, FA, and T2R from the lesion core and rCBV from the lesion wall differed significantly between tuberculoma and NCC (P < 0.05). Mean values for tuberculoma versus NCC were 3.36 vs. 1.3 for rCBV, 1.09 × 10⁻³ vs. 1.4 × 10⁻³ for ADC, 0.13 vs. 0.09 for FA, and 88.65 ms vs. 272.3 ms for T2R. Tuberculomas showed a high lipid peak, more choline, and lower creatine, with a Ch/Cr ratio > 1. The T2R value was the most significant parameter for differentiation, and cut-off values for each significant parameter are proposed. Conclusion: Quantitative MRI combined with conventional sequences can better characterize and differentiate similar-appearing tuberculoma and NCC and may be incorporated into the routine protocol, potentially avoiding brain biopsy and empirical therapy.
Keywords: advanced functional MRI, differentiation, neurocysticercosis, tuberculoma
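One common way such cut-off values are derived is ROC analysis with Youden's J statistic. The sketch below (not the study's method) does this for the T2R parameter using scikit-learn, with synthetic values generated around the reported group means rather than patient data.

```python
# A hedged sketch (not from the study) of deriving a cut-off for one MRI
# parameter (T2 relaxometry) via ROC analysis and Youden's J statistic.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(4)
# Synthetic T2R values (ms): NCC lesions long T2R, tuberculomas short,
# centered on the reported group means (272.3 ms vs. 88.65 ms).
t2r = np.concatenate([rng.normal(272, 40, 30), rng.normal(89, 15, 20)])
is_ncc = np.concatenate([np.ones(30), np.zeros(20)])

fpr, tpr, thresholds = roc_curve(is_ncc, t2r)
best = np.argmax(tpr - fpr)   # Youden's J = sensitivity + specificity - 1
print(f"suggested T2R cut-off: {thresholds[best]:.0f} ms "
      f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")
```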
Procedia PDF Downloads 569
2087 Automatic Detection of Traffic Stop Locations Using GPS Data
Authors: Areej Salaymeh, Loren Schwiebert, Stephen Remias, Jonathan Waddell
Abstract:
Extracting information from new data sources has emerged as a crucial task in many traffic planning processes, such as identifying traffic patterns, route planning, traffic forecasting, and locating infrastructure improvements. Given the advanced technologies used to collect Global Positioning System (GPS) data from dedicated GPS devices, GPS-equipped phones, and navigation tools, intelligent data analysis methodologies are necessary to mine this raw data. In this research, an automatic detection framework is proposed to identify and classify the locations of stopped GPS waypoints into two main categories: signalized intersections and highway congestion. Delaunay triangulation is used to perform this assessment in the clustering phase; while most existing clustering algorithms require assumptions about the data distribution, the Delaunay triangulation triangulates geographical data points without such assumptions. The proposed method starts by cleaning noise from the data and normalizing it. Next, the framework identifies stoppage points by calculating the traveled distance. The last step uses clustering to form groups of waypoints for signalized traffic and highway congestion, and a binary classifier, based on the length of the cluster, distinguishes highway congestion from signalized stop points. The proposed framework identifies stop positions and congestion points with high accuracy, in around 99.2% of trials, showing that even limited GPS data suffice to distinguish the two categories reliably.
Keywords: Delaunay triangulation, clustering, intelligent transportation systems, GPS data
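The Delaunay-based clustering idea can be sketched as follows (an illustration, not the authors' framework): triangulate the stop waypoints, discard edges longer than a distance threshold, and read clusters off as connected components of the remaining graph; the threshold value and synthetic coordinates are assumptions.

```python
# An illustrative sketch of Delaunay-based clustering of stop waypoints:
# long Delaunay edges bridge separate stop sites, so pruning them leaves
# one connected component per cluster.
import numpy as np
from scipy.spatial import Delaunay
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def delaunay_clusters(points: np.ndarray, max_edge: float):
    tri = Delaunay(points)
    # Collect every edge of every triangle.
    edges = set()
    for a, b, c in tri.simplices:
        edges.update({(a, b), (b, c), (a, c)})
    # Keep only edges shorter than the threshold.
    rows, cols = [], []
    for i, j in edges:
        if np.linalg.norm(points[i] - points[j]) <= max_edge:
            rows += [i, j]
            cols += [j, i]
    n = len(points)
    graph = coo_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))
    _, labels = connected_components(graph, directed=False)
    return labels

# Hypothetical projected waypoint coordinates (meters), two stop sites.
pts = np.vstack([np.random.default_rng(5).normal(c, 10, (40, 2))
                 for c in [(0, 0), (500, 300)]])
print(np.bincount(delaunay_clusters(pts, max_edge=50.0)))  # cluster sizes
```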
Procedia PDF Downloads 276
2086 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format
Authors: Maryam Fallahpoor, Biswajeet Pradhan
Abstract:
Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretation and thereby reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out because it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: Neuroimaging Informatics Technology Initiative (NIfTI) and Digital Imaging and Communications in Medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was labeled 1, while normal images were labeled 0. Subsequently, a U-Net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest area under the curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM-format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format
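The described volume preparation can be sketched as a short preprocessing function (not the study's exact pipeline): resample each scan to 128 × 128 × 60, clip intensities to the (−1000, 400) HU window, and normalize and zero-center; the linear interpolation order and the synthetic input volume are assumptions.

```python
# A rough sketch of the described CT volume preparation using SciPy.
import numpy as np
from scipy.ndimage import zoom

def preprocess_ct(volume: np.ndarray,
                  target_shape=(128, 128, 60),
                  hu_window=(-1000.0, 400.0)) -> np.ndarray:
    # Interpolate so every scan ends up with a uniform matrix/slice count.
    factors = [t / s for t, s in zip(target_shape, volume.shape)]
    resampled = zoom(volume, factors, order=1)  # linear interpolation (assumed)
    # Confine intensities to the lung-relevant HU window.
    clipped = np.clip(resampled, *hu_window)
    # Normalize to [0, 1], then zero-center, as described in the abstract.
    scaled = (clipped - hu_window[0]) / (hu_window[1] - hu_window[0])
    return scaled - scaled.mean()

# Hypothetical DICOM-derived volume: 512 x 512 with a variable slice count.
scan = np.random.default_rng(6).normal(-500, 300, (512, 512, 47))
print(preprocess_ct(scan).shape)   # -> (128, 128, 60)
```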
Procedia PDF Downloads 90