Search results for: change detection method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26074


24664 A Low-Power Two-Stage Seismic Sensor Scheme for Earthquake Early Warning System

Authors: Arvind Srivastav, Tarun Kanti Bhattacharyya

Abstract:

The north-eastern, Himalayan, and Eastern Ghats belts of India comprise earthquake-prone, remote, and hilly terrain. Earthquakes have caused enormous damage in these regions in the past. A wireless sensor network based earthquake early warning system (EEWS) is being developed to mitigate the damage caused by earthquakes. It consists of sensor nodes, distributed over the region, that perform majority voting on the output of the seismic sensors in their vicinity and relay a message to a base station to alert residents when an earthquake is detected. At the heart of the EEWS is a low-power two-stage seismic sensor that continuously tracks seismic events in the incoming three-axis accelerometer signal in the first stage and, in the presence of a seismic event, triggers the second-stage P-wave detector, which detects the onset of the P-wave in an earthquake event. The parameters of the P-wave detector have been optimized to minimize detection time and maximize detection accuracy. The working of the sensor scheme has been verified with data from seven earthquakes retrieved from IRIS. In all test cases, the scheme detected the onset of the P-wave accurately. It was also established that the P-wave onset detection time decreases linearly with increasing sampling rate: with the test data, the detection time for data sampled at 10 Hz was around 2 seconds, which reduced to 0.3 seconds for data sampled at 100 Hz.
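
The keywords name an STA/LTA event detector as the first stage. As a rough illustration of how such a trigger works, here is a minimal sketch; the window lengths and threshold are illustrative assumptions, not the paper's optimized parameters:

```python
import numpy as np

def sta_lta_trigger(signal, fs, sta_win=0.5, lta_win=10.0, threshold=3.0):
    """Classic STA/LTA event trigger on a single-axis acceleration trace.

    signal   : 1-D numpy array of acceleration samples
    fs       : sampling rate in Hz
    sta_win  : short-term averaging window in seconds (assumed value)
    lta_win  : long-term averaging window in seconds (assumed value)
    threshold: STA/LTA ratio above which an event is declared (assumed value)
    Returns the index of the first sample exceeding the threshold, or None.
    """
    nsta = max(1, int(sta_win * fs))
    nlta = max(1, int(lta_win * fs))
    energy = signal.astype(float) ** 2            # characteristic function
    csum = np.cumsum(np.insert(energy, 0, 0.0))   # running sums
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    n = min(len(sta), len(lta))                   # align windows at the trace end
    ratio = sta[-n:] / np.maximum(lta[-n:], 1e-12)
    hits = np.where(ratio > threshold)[0]
    if hits.size == 0:
        return None
    return len(signal) - n + hits[0]              # map back to a sample index
```

In a two-stage scheme like the one described, only the samples flagged by such a trigger would be passed to the P-wave detector, which is what keeps the sensor low-power.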

Keywords: earthquake early warning system, EEWS, STA/LTA, polarization, wavelet, event detector, P-wave detector

Procedia PDF Downloads 165
24663 Ambient Electrospray Deposition: An Efficient Technique to Immobilize Laccase on Cheap Electrodes With Unprecedented Reuse and Storage Performances

Authors: Mattea Carmen Castrovilli, Antonella Cartoni

Abstract:

Electrospray ionisation (ESI), a well-established technique widely used to produce ion beams of biomolecules in mass spectrometry (ESI-MS), can also be used for ambient soft landing of enzymes on a specific substrate. In this work, we show how the ambient electrospray deposition (ESD) technique can be successfully exploited for manufacturing a promising, environmentally friendly electrochemical amperometric laccase-based biosensor with unprecedented reuse and storage performance. These biosensors have been manufactured by spraying a 2 μg/μL laccase solution in 20% methanol onto a commercial carbon screen-printed electrode (C-SPE) using a custom ESD set-up. The laccase-based ESD biosensor has been tested against catechol compounds in the linear range 2-100 μM, with a limit of detection of 1.7 μM, without interference from cadmium, chromium, arsenic, and zinc and without any memory effects, but showing a matrix effect in lake and well water. The ESD biosensor shows enhanced performance compared to those fabricated with other immobilization methods, such as drop-casting. Indeed, it retains 100% activity after up to two months of storage under ambient conditions without any special care, provides working stability for up to 63 measurements on a freshly prepared electrode and 20 measurements on a one-year-old electrode subjected to redeposition, and fully tolerates reuse of the same electrode on subsequent days. The ESD method is a one-step, environmentally friendly method that allows the deposition of the bio-recognition layer without using any additional chemicals. The promising storage and working stability results also obtained with the more fragile lactate oxidase enzyme suggest these improvements should be attributed to the ESD technique rather than to the bioreceptor, highlighting how ESD could be useful in reducing pollution from disposable devices. Acknowledgment: The understanding at the molecular level of this promising biosensor by using different spectroscopies, microscopies, and analytical techniques is the subject of our PRIN 2022 project ESILARANTE.

Keywords: reuse, storage performance, immobilization, electrospray deposition, biosensor, laccase, catechol detection, green chemistry

Procedia PDF Downloads 45
24662 Bipolar Impulse Noise Removal and Edge Preservation in Color Images and Video Using Improved Kuwahara Filter

Authors: Reji Thankachan, Varsha PS

Abstract:

Both image capturing devices and human visual systems are nonlinear. Hence, nonlinear filtering methods outperform their linear counterparts in many applications. Linear methods are unable to remove impulse noise from images while preserving edges and fine details. In addition, linear algorithms are unable to remove signal-dependent or multiplicative noise in images. This paper presents an approach to denoise and smooth images and videos corrupted by bipolar impulse noise using an improved Kuwahara filter. It involves a two-stage algorithm consisting of noise detection followed by filtering. Numerous simulations demonstrate that the proposed method outperforms the existing method by eliminating the painting-like flattening effect along the local feature direction while preserving edges, with improvements in PSNR and MSE.
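
For reference, a minimal sketch of the classic (unimproved) Kuwahara filter on a grayscale image, which the paper builds on; it does not include the noise-detection stage or the edge-direction improvements described above:

```python
import numpy as np

def kuwahara(gray, radius=2):
    """Classic Kuwahara filter on a 2-D grayscale image.

    For every pixel, the four overlapping (radius+1) x (radius+1) quadrants
    around it are examined; the output is the mean of the quadrant with the
    smallest variance, which smooths noise while keeping edges."""
    h, w = gray.shape
    pad = np.pad(gray.astype(float), radius, mode="reflect")
    out = np.empty((h, w), dtype=float)
    k = radius + 1
    for y in range(h):
        for x in range(w):
            py, px = y + radius, x + radius
            quads = [
                pad[py - radius:py + 1, px - radius:px + 1],  # top-left
                pad[py - radius:py + 1, px:px + k],           # top-right
                pad[py:py + k, px - radius:px + 1],           # bottom-left
                pad[py:py + k, px:px + k],                    # bottom-right
            ]
            variances = [q.var() for q in quads]
            out[y, x] = quads[int(np.argmin(variances))].mean()
    return out
```

An improved, two-stage variant as described above would apply this smoothing only to pixels flagged as noisy by the detection stage.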

Keywords: bipolar impulse noise, Kuwahara, PSNR MSE, PDF

Procedia PDF Downloads 482
24661 An Image Processing Scheme for Skin Fungal Disease Identification

Authors: A. A. M. A. S. S. Perera, L. A. Ranasinghe, T. K. H. Nimeshika, D. M. Dhanushka Dissanayake, Namalie Walgampaya

Abstract:

Nowadays, skin fungal diseases are mostly found in people of tropical countries like Sri Lanka. A skin fungal disease is a particular kind of illness caused by fungi. These diseases have various harmful effects on the skin and keep spreading over time, so it is important to identify them at an early stage to keep them from spreading. This paper presents an automated skin fungal disease identification system implemented to speed up the diagnosis process by identifying skin fungal infections in digital images. An image of the diseased skin lesion is acquired, and a comprehensive computer vision and image processing scheme is used to process the image for disease identification. This includes colour analysis using the RGB and HSV colour models; texture classification using the Grey Level Run Length Matrix, Grey Level Co-occurrence Matrix, and Local Binary Pattern; object detection; shape identification; and more. This paper presents the approach and its outcome for the identification of the four most common skin fungal infections, namely Tinea Corporis, Sporotrichosis, Malassezia, and Onychomycosis. The main intention of this research is to provide an automated skin fungal disease identification system that increases diagnostic quality, shortens the time to diagnosis, and improves the efficiency of detection and successful treatment of skin fungal diseases.
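
As an illustration of the texture descriptors named above, a minimal sketch that extracts GLCM and LBP features from a grayscale lesion image with scikit-image; the distances, angles, and LBP parameters are illustrative choices, not the paper's settings:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern

def texture_features(gray_uint8):
    """Small GLCM + LBP feature vector from an 8-bit grayscale lesion image.

    GLCM properties (contrast, homogeneity, energy, correlation) and a
    uniform-LBP histogram are typical texture descriptors for this task."""
    glcm = graycomatrix(gray_uint8, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    glcm_feats = [graycoprops(glcm, prop).mean()
                  for prop in ("contrast", "homogeneity", "energy", "correlation")]

    lbp = local_binary_pattern(gray_uint8, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=np.arange(0, 11), density=True)

    return np.concatenate([glcm_feats, hist])
```

A classifier (and the colour and shape features mentioned in the abstract) would then be applied on top of such a feature vector.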

Keywords: Circularity Index, Grey Level Run Length Matrix, Grey Level Co-Occurrence Matrix, Local Binary Pattern, Object detection, Ring Detection, Shape Identification

Procedia PDF Downloads 212
24660 An Approach for Detection Efficiency Determination of High Purity Germanium Detector Using Cesium-137

Authors: Abdulsalam M. Alhawsawi

Abstract:

Estimation of a radiation detector's efficiency plays a significant role in calculating the activity of radioactive samples. Detector efficiency is measured using sources that emit photons at a variety of energies, from low to high, along the energy spectrum. Some photon energies are hard to cover in lab settings, either because check sources are hard to obtain or because the sources have short half-lives. This work aims to develop a method to determine the efficiency of a High Purity Germanium (HPGe) detector based on the 662 keV gamma-ray photon emitted by Cs-137. Cesium-137 is readily available in most labs with radiation detection and health physics applications and has a long half-life of ~30 years. Several photon efficiencies were calculated using the MCNP5 simulation code. The simulated efficiency of the 662 keV photon was used as a baseline to calculate efficiencies at other photon energies for both a point source and a Marinelli beaker geometry. For the case of a Marinelli beaker filled with water, the efficiency of the 59 keV low-energy photons from Am-241 was estimated with a 9% error compared to the MCNP5 simulated efficiency. The 1.17 and 1.33 MeV high-energy photons emitted by Co-60 had errors of 4% and 5%, respectively. The estimated errors are considered acceptable for calculating the activity of unknown samples, as they fall within the 95% confidence level.
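
The underlying bookkeeping is simple; a hedged sketch of how a full-energy-peak efficiency measured with Cs-137 is turned into an activity estimate (the counts, activity, and live time below are made-up illustrative numbers; scaling the efficiency to other photon energies is what the MCNP5 model provides in the paper):

```python
def peak_efficiency(net_counts, activity_bq, emission_prob, live_time_s):
    """Absolute full-energy-peak efficiency from a calibration measurement,
    e.g. the 662 keV line of a Cs-137 check source:
    efficiency = detected peak counts / photons emitted at that energy."""
    emitted = activity_bq * emission_prob * live_time_s
    return net_counts / emitted

def activity_from_peak(net_counts, efficiency, emission_prob, live_time_s):
    """Activity (Bq) of an unknown sample from its peak counts, given the
    efficiency at that photon energy (measured or scaled, e.g. with MCNP5)."""
    return net_counts / (efficiency * emission_prob * live_time_s)

# Illustrative numbers only (not taken from the paper):
eff_662 = peak_efficiency(net_counts=45000, activity_bq=37000,
                          emission_prob=0.851, live_time_s=600)
print(f"efficiency at 662 keV ~ {eff_662:.4f}")
```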

Keywords: MCNP5, Monte Carlo simulations, efficiency calculation, absolute efficiency, activity estimation, Cs-137

Procedia PDF Downloads 102
24659 Developing a Set of Primers Targeting Chondroitin Ac Lyase Gene for Specific and Sensitive Detection of Flavobacterium Columnare, a Causative Agent of Freshwater Columnaris

Authors: Mahmoud Mabrok, Channarong Rodkhum

Abstract:

Flavobacterium columnare is one of the devastating pathogens that cause noticeable economic losses in freshwater cultured fish. Like other filamentous bacteria, F. columnare tends to aggregate and flocculate in all kinds of media, which makes recognition of its colonies difficult. Molecular typing is therefore a fundamental tool for rapid and precise detection of this pathogen. The present study developed a species-specific PCR assay based on the unique cslA gene of F. columnare. The cslA gene sequences of 13 F. columnare strains retrieved from the GenBank database were aligned to identify a conserved homologous segment prior to primer design. The new primers yielded amplicons of 287 bp from F. columnare strains but not from related or other pathogens, unlike a previously published set, which lacked specificity and cross-reacted with F. indicum. The primers were sensitive and detected as few as 7 CFUs of bacteria and 3 pg of gDNA template. The sensitivity was reduced tenfold when tissue samples were used. These primers correctly identified all field isolates in a double-blind study, supporting their applicability for field detection.

Keywords: Columnaris infection, cslA gene, Flavobacterium columnare, PCR

Procedia PDF Downloads 109
24658 Thermal Analysis and Experimental Procedure of Integrated Phase Change Material in a Storage Tank

Authors: Chargui Ridha, Agrebi Sameh

Abstract:

The integration of phase change materials (PCMs) to store thermal energy during sunny periods and release it at night provides a complementary source of free energy that improves the system formed by a solar collector, a storage tank, and a heat exchanger. This paper is dedicated to the design of a thermal storage tank built around a PCM-based heat exchanger. The work is divided into two parts. An experimental part using paraffin as the PCM was carried out at the Laboratory of Thermal Processes of Borj Cedria in order to improve the performance of the system formed by coupling a flat solar collector with a thermal storage tank and subsequently determine the influence of the PCM on the whole system. This phase relies on measurement instrumentation, namely a differential scanning calorimeter (DSC) and a Hot Disk thermal analyzer, to determine the physical properties of the chosen paraffin (PCM). The second phase involves the detailed design of the PCM heat exchanger, which is incorporated into a thermal storage tank and coupled with a solar air collector installed at the Research and Technology Centre of Energy (CRTEn). A numerical part based on the TRNSYS and Fluent software, together with the finite volume method, was carried out for the storage tank systems in order to determine the temperature distribution in each chosen system.
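
The DSC measurements feed directly into the energy balance of the tank; a minimal sketch of the sensible-plus-latent heat stored by a charged PCM (the property values and mass below are illustrative assumptions, and melting is idealized at a single temperature rather than over the range a DSC actually reports):

```python
def pcm_stored_energy(mass_kg, cp_solid, cp_liquid, latent_heat,
                      t_init, t_melt, t_final):
    """Total heat (J) stored by a PCM charged from t_init to t_final (deg C),
    assuming a single melting temperature t_melt; cp values in J/kg.K and
    latent heat in J/kg would come from the DSC characterization."""
    if t_final <= t_melt:                          # PCM stays solid
        return mass_kg * cp_solid * (t_final - t_init)
    q_sensible_solid = mass_kg * cp_solid * (t_melt - t_init)
    q_latent = mass_kg * latent_heat
    q_sensible_liquid = mass_kg * cp_liquid * (t_final - t_melt)
    return q_sensible_solid + q_latent + q_sensible_liquid

# Illustrative values for 50 kg of paraffin heated from 25 C to 70 C:
q = pcm_stored_energy(50, 2000, 2200, 200e3, 25, 55, 70)
print(f"stored energy ~ {q/1e6:.1f} MJ")
```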

Keywords: phase change materials, storage tank, heat exchanger, flat plate collector

Procedia PDF Downloads 77
24657 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and to lead to system compromise, data leakage, or denial of service. C and C++ open-source code is now available for creating a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove irrelevant components and shorten dependencies. Moreover, we retain semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning networks such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we propose a neural network model which can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure performance. We made a comparative analysis between results derived from features containing a minimal text representation and features containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when semantic and syntactic information is used as the features, but require longer execution time, as the word embedding algorithm adds complexity to the overall system.
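
For orientation, a minimal Keras sketch of one of the simpler pipelines described (token-ID sequences of a function, embedded, then a BiLSTM with a binary "vulnerable" output); the vocabulary size, sequence length, and layer sizes are illustrative assumptions, not the authors' configuration:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_vuln_classifier(vocab_size=20000, embed_dim=100, max_tokens=500):
    """Token-sequence classifier: embedding -> BiLSTM -> probability of a
    buffer-overflow vulnerability. Hyperparameters are illustrative."""
    model = models.Sequential([
        layers.Input(shape=(max_tokens,)),
        layers.Embedding(vocab_size, embed_dim, mask_zero=True),
        layers.Bidirectional(layers.LSTM(64)),
        layers.Dense(32, activation="relu"),
        layers.Dense(1, activation="sigmoid"),   # vulnerable vs. not
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.Precision(),
                           tf.keras.metrics.Recall()])
    return model

# model.fit(X_train, y_train, validation_split=0.1, epochs=5) would train it
# on padded token-ID sequences of the preprocessed functions.
```

Pretrained GloVe or fastText vectors, as mentioned above, would be loaded into the Embedding layer's weights instead of training the embedding from scratch.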

Keywords: cyber security, vulnerability detection, neural networks, feature extraction

Procedia PDF Downloads 69
24656 Metamorphic Computer Virus Classification Using Hidden Markov Model

Authors: Babak Bashari Rad

Abstract:

A metamorphic computer virus uses different code transformation techniques to mutate its body across duplicated instances. The characteristics and function of new instances are mostly similar to those of their parents, but they cannot be easily detected by the majority of antivirus products on the market, as these depend on string signature-based detection techniques. The purpose of this research is to propose a Hidden Markov Model for the classification of metamorphic viruses in executable files. In the proposed solution, portable executable files are inspected to extract the instruction opcodes needed for the examination of the code. A Hidden Markov Model trained on portable executable files is employed to classify metamorphic viruses of the same family. The proposed model is able to generate and recognize common statistical features of mutated code. The model has been evaluated by examining it on a test data set. The performance of the model has been practically tested and evaluated based on False Positive Rate, Detection Rate, and Overall Accuracy. The results showed acceptable performance, with a high average Detection Rate of 99.7%.
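
A minimal sketch of the general idea using hmmlearn (CategoricalHMM; named MultinomialHMM in older releases): train a discrete-observation HMM on opcode-ID sequences of one virus family, then threshold the per-opcode log-likelihood of an unseen file. The state count and the threshold choice are assumptions, not the paper's settings:

```python
import numpy as np
from hmmlearn import hmm

def train_family_hmm(opcode_sequences, n_states=3):
    """Train an HMM on opcode-ID sequences from one metamorphic virus family.
    Each sequence is a list/array of integer opcode IDs."""
    X = np.concatenate(opcode_sequences).reshape(-1, 1)
    lengths = [len(s) for s in opcode_sequences]
    model = hmm.CategoricalHMM(n_components=n_states, n_iter=100,
                               random_state=0)
    model.fit(X, lengths)
    return model

def is_family_member(model, opcode_sequence, threshold):
    """Score a new executable: a per-opcode log-likelihood above a threshold
    (chosen from training and benign-file scores) flags it as a member."""
    seq = np.asarray(opcode_sequence).reshape(-1, 1)
    log_likelihood_per_opcode = model.score(seq) / len(opcode_sequence)
    return log_likelihood_per_opcode >= threshold
```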

Keywords: malware classification, computer virus classification, metamorphic virus, metamorphic malware, Hidden Markov Model

Procedia PDF Downloads 298
24655 Development, Evaluation and Scale-Up of a Mental Health Care Plan (MHCP) in Nepal

Authors: Nagendra P. Luitel, Mark J. D. Jordans

Abstract:

Globally, there is a significant gap between the number of individuals in need of mental health care and those who actually receive treatment. Evidence is accumulating that mental health services can be delivered effectively by primary health care workers through community-based programs and task-sharing approaches. Changing the role of specialist mental health workers from service delivery to building the clinical capacity of primary health care (PHC) workers could help reduce the treatment gap in low- and middle-income countries (LMICs). We developed a comprehensive mental health care plan in 2012 and evaluated its feasibility and effectiveness over the past three years. Initially, a mixed-methods formative study was conducted for the development of the mental health care plan (MHCP). Routine monitoring and evaluation data, including client flow and reports of satisfaction, were obtained from beneficiaries (n=135) during the pilot-testing phase. A repeated community survey (N=2040), a facility detection survey (N=4704), and a cohort study (N=576) were conducted to evaluate the MHCP. The resulting MHCP consists of twelve packages divided over the community, health facility, and healthcare organization platforms. Detection of mental health problems increased significantly after introducing the MHCP. Service implementation data support the real-life applicability of the MHCP, with reasonable treatment uptake. Currently, the MHCP has been implemented in the entire Chitwan district, where over 1400 people (438 with depression, 406 with psychosis, 181 with epilepsy, 360 with alcohol use disorder, and 51 others) have received mental health services from trained health workers. Key barriers were identified and addressed, namely dissatisfaction with privacy, perceived burden among health workers, high drop-out rates, and maintaining the supply of medicines. The results indicate that involving PHC workers in the detection and management of mental health problems is an effective strategy to minimize the treatment gap in mental health care in Nepal.

Keywords: mental health, Nepal, primary care, treatment gap

Procedia PDF Downloads 280
24654 Digital Image Forensics: Discovering the History of Digital Images

Authors: Gurinder Singh, Kulbir Singh

Abstract:

Digital multimedia content such as images, video, and audio can be tampered with easily due to the availability of powerful editing software. Multimedia forensics is devoted to analyzing this content by using various digital forensic techniques in order to validate its authenticity. Digital image forensics is dedicated to investigating the reliability of digital images by analyzing the integrity of the data and by reconstructing the historical information of an image related to its acquisition phase. In this paper, a survey of forgery detection is carried out, considering the most recent and promising digital image forensic techniques.

Keywords: Computer Forensics, Multimedia Forensics, Image Ballistics, Camera Source Identification, Forgery Detection

Procedia PDF Downloads 224
24653 Health of Riveted Joints with Active and Passive Structural Health Monitoring Techniques

Authors: Javad Yarmahmoudi, Alireza Mirzaee

Abstract:

Many active and passive structural health monitoring (SHM) techniques have been developed for the detection of defects in plates. Generally, riveted joints hold the plates together, and their failure may cause accidents. In this study, well-known active and passive methods were modified for evaluating the health of the riveted joints between plates. The active method generated Lamb waves and monitored their propagation by using lead zirconate titanate (PZT) disks. The signals were analyzed using wavelet transforms. The passive method used Fiber Bragg Grating (FBG) sensors and evaluated the spectral characteristics of the signals by using the Fast Fourier Transform (FFT). The results indicated that the existing methods designed for evaluating the health of individual plates may be used for the inspection of riveted joints with software modifications.
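
As a simple illustration of the passive, FFT-based part of the processing, a sketch that summarizes a sensor trace (for example an FBG strain signal) by its dominant spectral peaks, which can then be tracked over time as a damage indicator; the detrending and peak count are illustrative assumptions:

```python
import numpy as np

def dominant_frequencies(signal, fs, n_peaks=3):
    """FFT-based spectral summary of a sensor trace: returns the n_peaks
    frequencies carrying the most energy (in Hz), to be tracked over time."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                 # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    top = np.argsort(spectrum)[::-1][:n_peaks]      # largest-magnitude bins
    return sorted(freqs[top])
```

A shift of these peaks between the intact and damaged joint conditions is the kind of spectral change the passive method evaluates.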

Keywords: structural health monitoring, SHM, active SHM, passive SHM, fiber Bragg grating sensor, lead zirconate titanate, PZT

Procedia PDF Downloads 308
24652 Using Machine Learning to Build a Real-Time COVID-19 Mask Safety Monitor

Authors: Yash Jain

Abstract:

The US Centers for Disease Control and Prevention has recommended wearing masks to slow the spread of the virus. This research uses a video feed from a camera to conduct real-time classification of whether a person is wearing a mask correctly, wearing a mask incorrectly, or not wearing a mask at all. Utilizing two distinct datasets from the open-source website Kaggle, a mask detection network was trained. The first dataset used to train the model was titled 'Face Mask Detection' on Kaggle, and the second was titled 'Face Mask Dataset (YOLO Format)', which provided the data in the format needed to train the TinyYoloV3 model. Based on the data from Kaggle, two machine learning models were implemented and trained: a TinyYoloV3 real-time model and a two-stage neural network classifier. The two-stage neural network classifier first identifies distinct faces within the image and then classifies the state of the mask on each face: worn correctly, worn incorrectly, or no mask at all. The TinyYoloV3 model was used for the live feed as well as for comparison against the two-stage classifier and was trained using the Darknet neural network framework. The two-stage classifier attained a mean average precision (mAP) of 80%, while the model trained using TinyYoloV3 real-time detection had a mean average precision (mAP) of 59%. Overall, both models were able to correctly classify the scenarios of no mask, mask, and incorrectly worn mask.
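
A minimal sketch of the two-stage idea (a generic face detector followed by a per-face mask-state classifier); the Haar-cascade detector and the model file name are stand-ins for illustration, not the networks the author actually trained:

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

LABELS = ["mask_correct", "mask_incorrect", "no_mask"]

# Stage 1: a generic face detector (Haar cascade shipped with OpenCV).
# Stage 2: a trained 3-class mask-state classifier (hypothetical file name).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
mask_classifier = load_model("mask_state_classifier.h5")

def classify_frame(frame_bgr):
    """Return a list of (bounding_box, label) pairs for each detected face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        face = cv2.resize(frame_bgr[y:y + h, x:x + w], (128, 128)) / 255.0
        probs = mask_classifier.predict(face[np.newaxis], verbose=0)[0]
        results.append(((x, y, w, h), LABELS[int(np.argmax(probs))]))
    return results
```

Running this per frame of the camera feed gives the same kind of real-time output as the TinyYoloV3 path, only with detection and classification kept as separate stages.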

Keywords: datasets, classifier, mask-detection, real-time, TinyYoloV3, two-stage neural network classifier

Procedia PDF Downloads 137
24651 Hate Speech Detection Using Machine Learning: A Survey

Authors: Edemealem Desalegn Kingawa, Kafte Tasew Timkete, Mekashaw Girmaw Abebe, Terefe Feyisa, Abiyot Bitew Mihretie, Senait Teklemarkos Haile

Abstract:

Currently, hate speech is a growing challenge for society, individuals, policymakers, and researchers, as social media platforms make it easy to anonymously create and grow online friends and followers and provide an online forum for debate about specific issues of community life, culture, politics, and more. Despite this, research on identifying and detecting hate speech has not yet achieved satisfactory performance, which is why further research on this issue is constantly called for. This paper provides a systematic review of the literature in this field, with a focus on approaches such as word embedding techniques, machine learning, and deep learning technologies, as well as hate speech terminology and other state-of-the-art technologies, along with their challenges. We systematically reviewed the last six years of literature from ResearchGate and Google Scholar. Furthermore, limitations, challenges in algorithm selection and use, data collection and cleaning challenges, and future research directions are discussed in detail.

Keywords: Amharic hate speech, deep learning approach, hate speech detection review, Afaan Oromo hate speech detection

Procedia PDF Downloads 151
24650 Microfluidic Plasmonic Bio-Sensing of Exosomes by Using a Gold Nano-Island Platform

Authors: Srinivas Bathini, Duraichelvan Raju, Simona Badilescu, Muthukumaran Packirisamy

Abstract:

A bio-sensing method, based on the plasmonic properties of gold nano-islands, has been developed for the detection of exosomes in a clinical setting. The position of the gold plasmon band in the UV-visible spectrum depends on the size and shape of the gold nanoparticles as well as on the surrounding environment. When various chemical entities are adsorbed or bound, the gold plasmon band shifts toward longer wavelengths, and the shift is proportional to their concentration. Exosomes transport cargoes of molecules and genetic material to proximal and distal cells. Presently, the standard method for their isolation and quantification from body fluids is ultracentrifugation, which is not practical to implement in a clinical setting. Thus, a versatile and cutting-edge platform is required to selectively detect and isolate exosomes for further analysis at the clinical level. Instead of antibodies, the new sensing protocol makes use of a specially synthesized polypeptide (Vn96) to capture and quantify exosomes from different media by binding the heat shock proteins of the exosomes. The protocol has been established and optimized on a glass substrate in order to facilitate the next stage, namely the transfer of the protocol to a microfluidic environment. After each step of the protocol, the UV-Vis spectrum was recorded and the position of the gold Localized Surface Plasmon Resonance (LSPR) band was measured. The sensing process was modelled, taking into account the characteristics of the nano-island structure, prepared by thermal convection and annealing. The optimal molar ratios of the most important chemical entities involved in the detection of exosomes were calculated as well. Indeed, it was found that the results of the sensing process depend on two major steps: the molar ratio of streptavidin to biotin-PEG-Vn96 and, in the final step, the capture of exosomes by the biotin-PEG-Vn96 complex. The microfluidic device designed for sensing exosomes consists of a glass substrate sealed by a PDMS layer that contains the channel and a collecting chamber. In the device, the solutions of linker, cross-linker, etc., are pumped over the gold nano-islands, and an Ocean Optics spectrometer is used to measure the position of the Au plasmon band at each step of the sensing. The experiments have shown that the shift of the Au LSPR band is proportional to the concentration of exosomes, and thereby exosomes can be accurately quantified. An important advantage of the method is the ability to discriminate between exosomes having different origins.
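
Since the abstract states that the LSPR red-shift is proportional to exosome concentration, quantification reduces to a linear calibration; a sketch with entirely made-up calibration points (the real calibration would come from the spectra recorded with the Ocean Optics spectrometer):

```python
import numpy as np

def lspr_calibration(concentrations, peak_shifts_nm):
    """Fit the assumed linear relation between exosome concentration and LSPR
    band red-shift, and return a converter from measured shift to estimated
    concentration."""
    slope, intercept = np.polyfit(concentrations, peak_shifts_nm, 1)

    def estimate_concentration(shift_nm):
        return (shift_nm - intercept) / slope

    return slope, intercept, estimate_concentration

# Illustrative calibration points only (not measured values from the paper):
slope, b, to_conc = lspr_calibration([1e7, 5e7, 1e8, 5e8],   # exosomes/mL
                                     [0.8, 2.1, 3.4, 9.7])    # nm red-shift
print(f"estimated concentration for a 4.0 nm shift: {to_conc(4.0):.2e} /mL")
```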

Keywords: exosomes, gold nano-islands, microfluidics, plasmonic biosensing

Procedia PDF Downloads 155
24649 Accurate Positioning Method of Indoor Plastering Robot Based on Line Laser

Authors: Guanqiao Wang, Hongyang Yu

Abstract:

There is a lot of repetitive work in the traditional construction industry, and replacing manual labour with robots for these repetitive tasks can significantly improve production efficiency. Therefore, robots appear more and more frequently in the construction industry. Navigation and positioning are very important tasks for construction robots, and the requirements for positioning accuracy are very high. Traditional indoor robots mainly use radio-frequency or vision methods for positioning. Compared with ordinary robots, an indoor plastering robot needs to be positioned closer to the wall for plastering, so the requirements for positioning accuracy are higher, while traditional navigation and positioning methods have large errors: without an exact position, the wall cannot be plastered, or the plastering error is large. A new positioning method is proposed that is assisted by line lasers and uses image-processing-based positioning to refine the traditional positioning result. In operation, filtering, edge detection, the Hough transform, and other operations are performed on the images captured by the camera. Each time the position of the laser line is found, it is compared with the reference value, and the robot is moved or rotated to complete the positioning. The experimental results show that the actual positioning error is reduced to less than 0.5 mm by this accurate positioning method.
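
A minimal OpenCV sketch of the filter, edge detection, and Hough transform pipeline for locating the projected laser line in a camera frame; the thresholds, kernel sizes, and the choice of the longest segment are illustrative assumptions, not the paper's tuned values:

```python
import cv2
import numpy as np

def detect_laser_line(frame_bgr):
    """Locate the most prominent straight line in a frame, a rough stand-in
    for finding the projected laser line."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # noise filtering
    edges = cv2.Canny(blurred, 50, 150)                  # edge detection
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=10)
    if lines is None:
        return None
    # pick the longest detected segment as the laser line
    x1, y1, x2, y2 = max(lines[:, 0, :],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    return (x1, y1, x2, y2)

# The position of the returned segment can then be compared with the reference
# value to compute the correction the robot has to apply, as described above.
```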

Keywords: indoor plastering robot, navigation, precise positioning, line laser, image processing

Procedia PDF Downloads 127
24648 Evaluation of Osteoprotegrin (OPG) and Tumor Necrosis Factor A (TNF-A) Changes in Synovial Fluid and Serum in Dogs with Osteoarthritis; An Experimental Study

Authors: Behrooz Nikahval, Mohammad Saeed Ahrari-Khafi, Sakineh Behroozpoor, Saeed Nazifi

Abstract:

Osteoarthritis (OA) is a progressive and degenerative condition of the articular cartilage and other joint structures. It is essential to diagnose this condition as early as possible. The present research was performed to measure osteoprotegerin (OPG) and tumor necrosis factor α (TNF-α) in the synovial fluid and blood serum of dogs with a surgically transected cruciate ligament as a model of OA, to evaluate whether these parameters can be used for early diagnosis of OA. In the present study, four mature, clinically healthy dogs were selected to investigate the effect of experimental OA on OPG and TNF-α as a means of early detection. OPG and TNF-α were measured in synovial fluid and blood serum on days 0, 14, 28, 90, and 180 after surgical transection of the cranial cruciate ligament in one stifle joint. Statistical analysis of the results showed a significant increase in TNF-α in both synovial fluid and blood serum. OPG showed a decrease two weeks after OA induction; however, it fluctuated afterwards. In conclusion, TNF-α in both synovial fluid and blood serum could be used for early detection of OA; however, further research still needs to be conducted on OPG values in OA detection.

Keywords: osteoarthritis, osteoprotegerin, tumor necrosis factor α, synovial fluid, serum, dog

Procedia PDF Downloads 304
24647 Intelligent Fault Diagnosis for the Connection Elements of Modular Offshore Platforms

Authors: Jixiang Lei, Alexander Fuchs, Franz Pernkopf, Katrin Ellermann

Abstract:

Within the Space@Sea project, funded by the Horizon 2020 program, an island consisting of multiple platforms was designed. The platforms are connected by ropes and fenders. The connection is critical with respect to the safety of the whole system. Therefore, fault detection systems that could detect early warning signs of a possible failure in the connection elements are investigated. Previously, a model-based method using an Extended Kalman Filter was developed to detect the reduction of rope stiffness. This method detected several types of faults reliably, but some types of faults were much more difficult to detect. Furthermore, the model-based method is sensitive to environmental noise: when the wave height is low, a long time is needed to detect a fault, and the accuracy is not always satisfactory. In this sense, it is necessary to develop a more accurate and robust technique that can detect all rope faults under a wide range of operational conditions. Inspired by this work on the Space@Sea design, we introduce a fault diagnosis method based on deep neural networks. Our method can not only detect rope degradation by using the acceleration data from each platform but also estimate the contributions of the specific acceleration sensors using methods from explainable AI. In order to adapt to different operational conditions, the domain adaptation technique DANN (domain-adversarial neural network) is applied. The proposed model can accurately estimate rope degradation under a wide range of environmental conditions and helps users understand the relationship between the output and the contributions of each acceleration sensor.
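
A compact Keras sketch of the core DANN mechanism, a gradient-reversal layer feeding a domain classifier alongside the fault classifier; the layer sizes are illustrative, and the sketch omits practical details such as scheduling the reversal strength and masking fault labels for unlabeled target-domain data:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

class GradientReversal(layers.Layer):
    """Identity in the forward pass; multiplies the gradient by -lam in the
    backward pass, which is the adversarial trick behind DANN."""
    def __init__(self, lam=1.0, **kwargs):
        super().__init__(**kwargs)
        self.lam = lam

    def call(self, x):
        @tf.custom_gradient
        def _flip(x):
            def grad(dy):
                return -self.lam * dy
            return tf.identity(x), grad
        return _flip(x)

def build_dann(input_dim, n_fault_classes, n_domains=2):
    """Shared feature extractor with a fault head and a domain head; the domain
    head sees reversed gradients so the features become domain-invariant."""
    inp = layers.Input(shape=(input_dim,))
    feat = layers.Dense(64, activation="relu")(inp)
    feat = layers.Dense(32, activation="relu")(feat)
    fault_out = layers.Dense(n_fault_classes, activation="softmax",
                             name="fault")(feat)
    dom = GradientReversal(lam=1.0)(feat)
    dom_out = layers.Dense(n_domains, activation="softmax", name="domain")(dom)
    model = Model(inp, [fault_out, dom_out])
    model.compile(optimizer="adam",
                  loss={"fault": "sparse_categorical_crossentropy",
                        "domain": "sparse_categorical_crossentropy"})
    return model
```

Training minimizes the fault loss while the reversed gradient pushes the shared features to be indistinguishable across operating conditions (domains), which is how the method adapts to different sea states.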

Keywords: fault diagnosis, deep learning, domain adaptation, explainable AI

Procedia PDF Downloads 160
24646 Microfluidic Method for Measuring Blood Viscosity

Authors: Eunseop Yeom

Abstract:

Many cardiovascular diseases, such as thrombosis and atherosclerosis, can change biochemical molecules in plasma and red blood cells. These alterations lead to an excessive increase in blood viscosity, contributing to peripheral vascular diseases. In this study, a simple microfluidic-based method is used to measure blood viscosity. The microfluidic device is composed of two parallel side channels and a bridge channel. To estimate blood viscosity, the blood sample and a reference fluid are separately delivered into the inlets of the two parallel side channels using pumps. An interfacial line between the blood sample and the reference fluid forms when the outlet of one side channel is blocked. Since the width of this interfacial line is determined by the pressure ratio between the blood and reference flows, blood viscosity can be estimated by measuring this width. This microfluidic-based method can be used to evaluate variations in the viscosity of animal models with cardiovascular diseases under flow conditions.
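
The abstract does not give the exact width-to-viscosity relation; a common first-order model for parallel-stream microfluidic viscometers (assumed here, not taken from the paper) scales the viscosity ratio with the stream-width ratio corrected by the flow-rate ratio:

```python
def estimate_blood_viscosity(width_blood, width_ref, flow_blood, flow_ref,
                             viscosity_ref):
    """First-order co-flow viscometry estimate: assuming both streams share
    the same pressure gradient, mu_blood/mu_ref ~ (W_blood/W_ref)*(Q_ref/Q_blood).
    This is an illustrative model, not the relation used in the paper."""
    ratio = (width_blood / width_ref) * (flow_ref / flow_blood)
    return viscosity_ref * ratio

# Made-up example: equal flow rates, blood stream 1.6x wider than a
# 1 mPa.s reference fluid -> ~1.6 mPa.s apparent viscosity.
print(estimate_blood_viscosity(320, 200, 1.0, 1.0, 1.0))
```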

Keywords: blood viscosity, microfluidic chip, pressure, shear rate

Procedia PDF Downloads 353
24645 A Review on Aviation Emissions and Their Role in Climate Change Scenarios

Authors: J. Niemisto, A. Nissinen, S. Soimakallio

Abstract:

Aviation causes carbon dioxide (CO2) emissions and other climate forcers, which increase the contribution of aviation to climate change. The aviation industry and the number of air travellers are constantly increasing. The aviation industry has an ambitious goal to strongly cut net CO2 emissions. Modern fleets, alternative jet fuel technologies, and route optimisation are important technological tools for emission reduction, but faster approaches are needed as well. Emission trading systems, voluntary carbon offset compensation schemes, and taxation are already in operation. Global scenarios of the aviation industry and its greenhouse gas emissions and other climate forcers are discussed in this review study based on literature and other published data. The focus is on aviation in the Nordic countries, but the European and global situations are also considered. Different emission reduction technologies and compensation modes are examined. In addition, the role of aviation in a single passenger's (a Finnish consumer's) annual carbon footprint is analysed, and a comparison of available emission calculators and carbon offset systems is performed. Long-haul flights have a significant role in a single consumer's and a company's carbon footprint, but a remarkable change in global emission levels would require a huge change in attitudes towards flying.

Keywords: aviation, climate change, emissions, environment

Procedia PDF Downloads 189
24644 A Study to Examine the Use of Traditional Agricultural Practices to Fight the Effects of Climate Change

Authors: Rushva Parihar, Anushka Barua

Abstract:

The negative repercussions of a warming planet are already visible, with biodiversity loss, water scarcity, and extreme weather events becoming ever more frequent. The agriculture sector is perhaps the most impacted, and modern agriculture has failed to defend farmers from the effects of climate change. This, coupled with the added pressure of higher demand for food production caused by population growth, has only compounded the impact. Traditional agricultural practices that are rooted in indigenous knowledge have long safeguarded the delicate balance of the ecosystem through sustainable production techniques. This paper uses secondary data to explore traditional practices (such as Beejamrita, Jeevamrita, sheep penning, earthen bunding, and others) from around the world that have been developed over centuries and focuses on how they can be used to tackle contemporary issues arising from climate change (such as nutrient and water loss, soil degradation, and increased incidence of pests). Finally, the resulting framework is applied to the context of Indian agriculture as a means to combat climate change and improve food security, all while encouraging the documentation and transfer of local knowledge as a shared resource among farmers.

Keywords: sustainable food systems, traditional agricultural practices, climate smart agriculture, climate change, indigenous knowledge

Procedia PDF Downloads 109
24643 Investigation of Fire Damaged Concrete Using Nonlinear Resonance Vibration Method

Authors: Kang-Gyu Park, Sun-Jong Park, Hong Jae Yim, Hyo-Gyung Kwak

Abstract:

This paper attempts to evaluate the effect of fire damage on concrete by using the nonlinear resonance vibration method, one of the nonlinear nondestructive methods. Concrete exhibits not only a nonlinear stress-strain relation but also the hysteresis and discrete memory effects found in consolidated materials. Hysteretic materials typically show a shift in the resonance frequency, and this shift changes according to the degree of micro-damage. The degree of the shift can be obtained through the nonlinear resonance vibration method. Five exposure scenarios were considered in order to induce different levels of internal micro-damage. The effect of post-fire curing on fire-damaged concrete was also taken into account to confirm the change in internal damage. The hysteretic nonlinearity parameter was obtained from the amplitude-dependent resonance frequency shift after specific curing periods. In addition, the splitting tensile strength was measured on each sample to characterize the variation of residual strength. Then, a correlation between the hysteretic nonlinearity parameter and residual strength was proposed from the test results.
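
A minimal sketch of extracting a hysteretic nonlinearity parameter from amplitude-dependent resonance measurements, using the linear relation between relative frequency shift and strain amplitude commonly assumed in nonlinear resonance spectroscopy (the paper's exact formulation may differ):

```python
import numpy as np

def hysteretic_nonlinearity(strain_amplitudes, resonance_freqs):
    """Estimate alpha from the assumed relation (f0 - f)/f0 = alpha * strain,
    where f0 is the resonance frequency at the lowest excitation amplitude."""
    strain = np.asarray(strain_amplitudes, dtype=float)
    freqs = np.asarray(resonance_freqs, dtype=float)
    f0 = freqs[np.argmin(strain)]            # lowest-amplitude measurement as f0
    rel_shift = (f0 - freqs) / f0
    alpha, _ = np.polyfit(strain, rel_shift, 1)
    return alpha
```

A larger alpha for a given exposure scenario would indicate more internal micro-damage, which is the quantity correlated with the splitting tensile strength above.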

Keywords: nonlinear resonance vibration method, non linearity parameter, splitting tensile strength, micro damage, post-fire-curing, fire damaged concrete

Procedia PDF Downloads 251
24642 A Review on Application of Phase Change Materials in Textiles Finishing

Authors: Mazyar Ahrari, Ramin Khajavi, Mehdi Kamali Dolatabadi, Tayebeh Toliyat, Abosaeed Rashidi

Abstract:

Fabric, as the first and most common layer in permanent contact with human skin, is a very good interface for providing coverage as well as heat and cold insulation. Phase change materials (PCMs) are organic and inorganic compounds that can absorb and release noticeable amounts of latent heat during phase transitions between the solid and liquid phases over a low temperature range. PCMs undergo phase changes (liquid-solid and solid-liquid transitions) while absorbing and releasing heat; therefore, in order to be used over long periods, they should be encapsulated in polymeric shells, so-called microcapsules. Microencapsulation and nanoencapsulation methods have been developed in order to reduce the reactivity of a PCM with the outside environment, ease its handling, and decrease the diffusion and evaporation rates. Methods of incorporating PCMs into textiles, such as electrospinning, and of determining their thermal properties are summarized. Paraffin waxes attract a lot of attention due to their high thermal storage density, repeatability of phase change, thermal stability, small volume change during phase transition, chemical stability, non-toxicity, non-flammability, non-corrosiveness, and low cost, and they seem to play a key role in confronting climate change and global warming. In this article, we review the research concentrating on the characteristics of PCMs and on new materials and methods of microencapsulation.

Keywords: thermoregulation, microencapsulation, phase change materials, thermal energy storage, nanoencapsulation

Procedia PDF Downloads 369
24641 Change in Food Choice Behavior: Trend and Challenges

Authors: Gargi S. Kumar, Mrinmoyi Kulkarni

Abstract:

Food choice behavior is complex and determined by biological, psychological, socio-cultural, and economic factors. The past two decades have seen dramatic changes in food consumption patterns among urban Indian consumers. The objective of the current study was to evaluate perceptions about changes in food choice behavior. Ten participants [urban men and women] ranging in age from 40 to 65 were selected, and in-depth interviews were conducted with a set of open-ended questions. The recorded interviews were transcribed and thematically analyzed using inductive, open, and axial coding. The results identified themes that act as drivers and consequences of change in food choice behavior. Drivers such as globalization [with sub-themes of urbanization, education, income, and work environment], media and advertising, changing gender roles, women in the workforce, and changes in family structure have influenced food choice at both the individual and national levels. The consequences of changes in food choice were health implications, processed food consumption, food decisions driven by children, and eating out, among others. The study reveals that, over time, food choices change and evolve. However, it is interesting to note how market forces and culture interact to influence individual behavior and the overall food environment, which subsequently affects food choice and the health of the people.

Keywords: change, consequences, drivers, food choice, globalization

Procedia PDF Downloads 209
24640 Investigation of Suspected Viral Hepatitis Outbreaks in North India

Authors: Mini P. Singh, Manasi Majumdar, Kapil Goyal, Pvm Lakshmi, Deepak Bhatia, Radha Kanta Ratho

Abstract:

India is endemic for the Hepatitis E virus, and frequent waterborne outbreaks are reported. Conventional diagnosis rests on the detection of serum anti-HEV IgM antibodies, which may take 7-10 days to develop. Early diagnosis in such a situation is desirable for the initiation of prompt control measures. The present study compared three diagnostic methods in 60 samples collected during two suspected HEV outbreaks in the vicinity of Chandigarh, India. Anti-HEV IgM, HEV antigen, and HEV RNA could be detected in the serum samples of 52 (86.66%), 16 (26.66%), and 18 (30%) patients, respectively. The suitability of saliva samples for antibody detection was also evaluated in 21 paired serum-saliva samples. A total of 15 serum samples showed the presence of anti-HEV IgM antibodies, out of which 10 (10/15; 66.6%) were also positive for these antibodies in saliva (χ2 = 7.636, p = 0.0057), showing a concordance of 76.91%. The positivity of reverse transcriptase PCR and HEV antigen detection was 100% within one week of illness, which declined to 5-10% thereafter. The outbreak was attributed to HEV Genotype 1, Subtype 1a, and the clinical and environmental strains clustered together. HEV antigen and RNA were found to be early diagnostic markers with 96.66% concordance. The results indicate that saliva samples can be used as an alternative to serum samples in an outbreak situation.

Keywords: HEV-antigen, outbreak, phylogenetic analysis, saliva

Procedia PDF Downloads 393
24639 Grid Pattern Recognition and Suppression in Computed Radiographic Images

Authors: Igor Belykh

Abstract:

Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. Those artifacts may be visible directly or may cause a Moiré effect when a digital image is resized on a diagnostic monitor. In this paper, we propose an automated algorithm for grid artifact detection and suppression, which is still an open problem. Grid artifact detection is based on a statistical approach in the spatial domain. Grid artifact suppression is based on the design and application of a Kaiser band-stop filter transfer function that avoids ringing artifacts. Experimental results are discussed, and we conclude with a description of the advantages over existing approaches.
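
A minimal SciPy sketch of the suppression side, designing a Kaiser-window band-stop FIR filter around a detected grid frequency and applying it with zero-phase filtering along image rows; the attenuation, stop-band width, and row-wise orientation are illustrative assumptions rather than the paper's design:

```python
import numpy as np
from scipy import signal

def suppress_grid_lines(image, grid_freq, fs, width=0.5, ripple_db=60.0):
    """Attenuate a periodic grid artifact along image rows.

    grid_freq and fs are the detected grid frequency and the sampling
    frequency in the same units (e.g. cycles/mm); width is the half-width of
    the stop band in the same units."""
    # Kaiser design: estimate filter length and beta for the requested attenuation
    numtaps, beta = signal.kaiserord(ripple_db, width / (0.5 * fs))
    numtaps |= 1                                    # force odd length (type I FIR)
    taps = signal.firwin(numtaps,
                         [grid_freq - width, grid_freq + width],
                         window=("kaiser", beta), pass_zero="bandstop", fs=fs)
    # zero-phase filtering row by row so image content is not shifted
    return np.apply_along_axis(lambda row: signal.filtfilt(taps, [1.0], row),
                               axis=1, arr=image.astype(float))
```

The grid frequency passed in would come from the statistical detection step described above; zero-phase application is one common way to avoid introducing new phase-related artifacts.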

Keywords: grid, computed radiography, pattern recognition, image processing, filtering

Procedia PDF Downloads 260
24638 A Comparative Study of Medical Image Segmentation Methods for Tumor Detection

Authors: Mayssa Bensalah, Atef Boujelben, Mouna Baklouti, Mohamed Abid

Abstract:

Image segmentation has a fundamental role in analysis and interpretation for many applications. The automated segmentation of organs and tissues throughout the body using computed imaging has been increasing rapidly. Indeed, it represents one of the most important parts of clinical diagnostic tools. In this paper, we present a thorough literature review of recent methods for tumour segmentation from medical images, which are briefly explained together with the recent contributions of various researchers. The study then compares these methods in order to define new directions for developing and improving the performance of tumour-area segmentation from medical images.

Keywords: features extraction, image segmentation, medical images, tumor detection

Procedia PDF Downloads 147
24637 Early Diagnosis of Myocardial Ischemia Based on Support Vector Machine and Gaussian Mixture Model by Using Features of ECG Recordings

Authors: Merve Begum Terzi, Orhan Arikan, Adnan Abaci, Mustafa Candemir

Abstract:

Acute myocardial infarction is a major cause of death in the world; therefore, its fast and reliable diagnosis is a major clinical need. ECG is the most important diagnostic methodology used to make decisions about the management of cardiovascular diseases. In patients with acute myocardial ischemia, temporary chest pain together with changes in the ST segment and T wave of the ECG occur shortly before the start of myocardial infarction. In this study, a technique that detects changes in the ST/T sections of the ECG is developed for the early diagnosis of acute myocardial ischemia. For this purpose, a database of real ECG recordings was constituted, containing records from 75 patients presenting symptoms of chest pain who underwent elective percutaneous coronary intervention (PCI). 12-lead ECGs of the patients were recorded before and during the PCI procedure. Two ECG epochs, the pre-inflation ECG acquired before any catheter insertion and the occlusion ECG acquired during balloon inflation, are analyzed for each patient. By using the pre-inflation and occlusion recordings, ECG features that are critical in the detection of acute myocardial ischemia are identified, and the most discriminative features are extracted. A classification technique based on the support vector machine (SVM) approach, operating with linear and radial basis function (RBF) kernels, is developed to detect ischemic events by using ST-T-derived joint features from the non-ischemic and ischemic states of the patients. The dataset is randomly divided into training and testing sets, and the training set is used to optimize the SVM hyperparameters by using the grid-search method and 10-fold cross-validation. SVMs are designed specifically for each patient by tuning the kernel parameters in order to obtain the optimal classification performance. As a result of applying the developed classification technique to real ECG recordings, it is shown that the proposed technique provides highly reliable detection of anomalies in ECG signals. Furthermore, to develop a detection technique that can be used in the absence of an ECG recording obtained during the healthy state, the detection of acute myocardial ischemia based on ECG recordings obtained during ischemia is also investigated. For this purpose, a Gaussian mixture model (GMM) is used to represent the joint pdf of the most discriminating ECG features of myocardial ischemia. Then, a Neyman-Pearson type of approach is developed to provide detection of outliers that would correspond to acute myocardial ischemia. The Neyman-Pearson decision strategy is applied by computing the average log-likelihood values of ECG segments and comparing them with a range of threshold values. For different discrimination threshold values and numbers of ECG segments, the probability of detection and probability of false alarm are computed, and the corresponding ROC curves are obtained. The results indicate that an increasing number of ECG segments provides higher performance for GMM-based classification. Moreover, the comparison between the performance of the SVM- and GMM-based classification showed that SVM provides higher classification performance over the ECG recordings of a considerable number of patients.
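
A minimal scikit-learn sketch of the GMM plus Neyman-Pearson thresholding idea described above; the component count, feature layout, and which class the GMM models are illustrative assumptions, and sweeping the threshold is what traces out the ROC curve:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_feature_gmm(feature_vectors, n_components=3):
    """Fit a GMM to the joint distribution of ST-T-derived ECG feature vectors
    (one row per analyzed beat or segment)."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                          random_state=0)
    gmm.fit(feature_vectors)
    return gmm

def neyman_pearson_decision(gmm, segment_features, threshold):
    """Compare a segment's average log-likelihood under the fitted GMM with a
    threshold chosen to meet a target false-alarm probability."""
    avg_loglik = gmm.score(segment_features)     # mean log-likelihood per sample
    return avg_loglik >= threshold, avg_loglik

# Sweeping `threshold` over a range and recording detection / false-alarm
# rates on labeled segments yields the ROC curves discussed in the abstract.
```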

Keywords: ECG classification, Gaussian mixture model, Neyman–Pearson approach, support vector machine

Procedia PDF Downloads 141
24636 Performance of the Aptima® HIV-1 Quant Dx Assay on the Panther System

Authors: Siobhan O’Shea, Sangeetha Vijaysri Nair, Hee Cheol Kim, Charles Thomas Nugent, Cheuk Yan William Tong, Sam Douthwaite, Andrew Worlock

Abstract:

The Aptima® HIV-1 Quant Dx Assay is a fully automated assay on the Panther system. It is based on Transcription-Mediated Amplification and real-time detection technologies. The assay is intended for monitoring HIV-1 viral load in plasma specimens and for the detection of HIV-1 in plasma and serum specimens. Nine hundred and seventy-nine specimens selected at random from routine testing at St Thomas' Hospital, London, were anonymised and used to compare the performance of the Aptima HIV-1 Quant Dx assay and the Roche COBAS® AmpliPrep/COBAS® TaqMan® HIV-1 Test, v2.0. Two hundred and thirty-four specimens gave quantitative HIV-1 viral load results in both assays. The quantitative results reported by the Aptima assay were comparable to those reported by the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 Test, v2.0, with a linear regression slope of 1.04 and an intercept of -0.097. The Aptima assay detected HIV-1 in more samples than the Roche assay. This was not due to a lack of specificity of the Aptima assay, because the assay gave 99.83% specificity when testing plasma specimens from 600 HIV-1 negative individuals. To understand the reason for this higher detection rate, low-level panels made from the HIV-1 3rd International Standard (NIBSC 10/152) and clinical samples of various subtypes were tested side by side in both assays. The Aptima assay was more sensitive than the Roche assay. The good sensitivity, specificity, and agreement with other commercial assays make the HIV-1 Quant Dx Assay appropriate for both viral load monitoring and the detection of HIV-1 infections.

Keywords: HIV viral load, Aptima, Roche, Panther system

Procedia PDF Downloads 353
24635 A New Computational Package for Using in CFD and Other Problems (Third Edition)

Authors: Mohammad Reza Akhavan Khaleghi

Abstract:

This paper presents the changes made to the Reduced Finite Element Method (RFEM) whose result is intended to be the most powerful numerical method proposed so far (some forms of the method are so powerful that they can approximate even the most complex equations as simply as the Laplace equation). The Finite Element Method (FEM) is a powerful numerical method that has been used successfully to solve problems in various scientific and engineering fields, including CFD. Many algorithms based on FEM have been formulated, but none are used in popular CFD software; in this area, the Finite Volume Method (FVM) holds a full monopoly owing to its better efficiency and adaptability to the physics of the problems compared with FEM. It does not seem that FEM could compete with FVM unless it were fundamentally changed. This paper presents those changes, and the result is a powerful method with much better performance than FVM and other computational methods across all subjects. This method is intended not to compete with the finite volume method but to replace it.

Keywords: reduced finite element method, new computational package, new finite element formulation, new higher-order form, new isogeometric analysis

Procedia PDF Downloads 96