Search results for: web proxy detection
2126 Development of a Direct Immunoassay for Human Ferritin Using Diffraction-Based Sensing Method
Authors: Joel Ballesteros, Harriet Jane Caleja, Florian Del Mundo, Cherrie Pascual
Abstract:
Diffraction-based sensing was utilized in the quantification of human ferritin in blood serum to provide an alternative to the label-based immunoassays currently used in clinical diagnostics and research. The diffraction intensity was measured by the diffractive optics technology or dotLab™ system. Two methods were evaluated in this study: direct immunoassay and direct sandwich immunoassay. In the direct immunoassay, human ferritin was captured by human ferritin antibodies immobilized on an avidin-coated sensor, while the direct sandwich immunoassay had an additional step for the binding of a detector human ferritin antibody on the analyte complex. Both methods were repeatable, with coefficient of variation values below 15%. The direct sandwich immunoassay had a linear response from 10 to 500 ng/mL, which is wider than the 100-500 ng/mL of the direct immunoassay. The direct sandwich immunoassay also has a higher calibration sensitivity, with a value of 0.004 Diffractive Intensity (ng mL⁻¹)⁻¹ compared to the 0.002 Diffractive Intensity (ng mL⁻¹)⁻¹ of the direct immunoassay. The limit of detection and limit of quantification values of the direct immunoassay were found to be 29 ng/mL and 98 ng/mL, respectively, while the direct sandwich immunoassay has a limit of detection (LOD) of 2.5 ng/mL and a limit of quantification (LOQ) of 8.2 ng/mL. In terms of accuracy, the direct immunoassay had a percent recovery of 88.8-93.0% in PBS, while the direct sandwich immunoassay had 94.1-97.2%. Based on the results, the direct sandwich immunoassay is the better diffraction-based immunoassay in terms of accuracy, LOD, LOQ, linear range, and sensitivity. The direct sandwich immunoassay was utilized in the determination of human ferritin in blood serum, and the results were validated by Chemiluminescent Magnetic Immunoassay (CMIA). The calculated Pearson correlation coefficient was 0.995 and the p-values of the paired-sample t-test were greater than 0.05, which shows that the results of the direct sandwich immunoassay were comparable to those of CMIA and that it could be utilized as an alternative analytical method.
Keywords: biosensor, diffraction, ferritin, immunoassay
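As an illustration of how such detection and quantification limits are commonly derived (a minimal sketch, not the authors' code, using the standard 3.3σ/S and 10σ/S formulas; the calibration data below are hypothetical placeholders):

# Minimal sketch: LOD/LOQ from a linear calibration via LOD = 3.3*sigma/S,
# LOQ = 10*sigma/S. Concentrations and responses are made-up placeholders.
import numpy as np

conc = np.array([10, 50, 100, 250, 500])          # ng/mL, hypothetical standards
resp = np.array([0.04, 0.21, 0.41, 1.01, 2.02])   # diffractive intensity, hypothetical

S, intercept = np.polyfit(conc, resp, 1)          # calibration slope and intercept
residuals = resp - (S * conc + intercept)
sigma = residuals.std(ddof=2)                     # SD of regression residuals (n - 2)

lod = 3.3 * sigma / S
loq = 10.0 * sigma / S
print(f"LOD = {lod:.1f} ng/mL, LOQ = {loq:.1f} ng/mL")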
Procedia PDF Downloads 352
2125 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but simultaneously, vulnerabilities and security threats have significantly increased. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-Test and ANOVA statistical tests were employed, and results were measured using accuracy, sensitivity, and specificity indicators of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the comprehensive proposed model reduced cyber-attacks by an average of 85%. Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on the obtained results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and continuous training of human resources.
Keywords: data protection, digital technologies, information security, modern management
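For reference, a minimal sketch of how the reported accuracy, sensitivity, and specificity indicators are computed from binary intrusion predictions (the labels below are hypothetical; this is not the study's code):

# Minimal sketch: accuracy, sensitivity, specificity from a confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 1])  # 1 = attack, 0 = benign (placeholders)
y_pred = np.array([0, 0, 1, 1, 0, 0, 1, 0, 1, 1])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)   # true positive rate
specificity = tn / (tn + fp)   # true negative rate
print(accuracy, sensitivity, specificity)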
Procedia PDF Downloads 28
2124 Experimental-Numerical Inverse Approaches in the Characterization and Damage Detection of Soft Viscoelastic Layers from Vibration Test Data
Authors: Alaa Fezai, Anuj Sharma, Wolfgang Mueller-Hirsch, André Zimmermann
Abstract:
Viscoelastic materials have been widely used in the automotive industry over the last few decades with different functionalities. Besides their main application as a simple and efficient surface damping treatment, they may ensure optimal operating conditions for on-board electronics as thermal interface or sealing layers. The dynamic behavior of viscoelastic materials is generally dependent on many environmental factors, the most important being temperature and strain rate or frequency. Prior to the reliability analysis of systems including viscoelastic layers, it is, therefore, crucial to accurately predict the dynamic and lifetime behavior of these materials. This includes the identification of the dynamic material parameters under critical temperature and frequency conditions along with a precise damage localization and identification methodology. The goal of this work is twofold. The first part aims at applying an inverse viscoelastic material-characterization approach for a wide frequency range and under different temperature conditions. To this end, dynamic measurements are carried out on a single lap joint specimen using an electrodynamic shaker and an environmental chamber. The specimen consists of aluminum beams assembled to adapter plates through a viscoelastic adhesive layer. The experimental setup is reproduced in finite element (FE) simulations, and frequency response functions (FRFs) are calculated. The parameters of both the generalized Maxwell model and the fractional derivatives model are identified through an optimization algorithm minimizing the difference between the simulated and the measured FRFs. The second goal of the current work is to guarantee on-line detection of damage, i.e., delamination in the viscoelastic bonding of the described specimen, during frequency-monitored end-of-life testing. For this purpose, an inverse technique, which determines the damage location and size based on the modal frequency shift and on the change of the mode shapes, is presented. This includes a preliminary FE model-based study correlating the delamination location and size to the change in the modal parameters, and a subsequent experimental validation achieved through dynamic measurements of specimens with different, pre-generated crack scenarios and comparison to the virgin specimen. The main advantage of the inverse characterization approach presented in the first part resides in its ability to adequately identify the damping and stiffness behavior of soft viscoelastic materials over a wide frequency range and under critical temperature conditions. Classic forward characterization techniques such as dynamic mechanical analysis are usually subject to limitations under critical temperature and frequency conditions due to the material behavior of soft viscoelastic materials. Furthermore, the inverse damage detection described in the second part guarantees an accurate prediction of not only the damage size but also its location using a simple test setup, and therefore outlines the significance of inverse numerical-experimental approaches in predicting the dynamic behavior of soft bonding layers applied in automotive electronics.
Keywords: damage detection, dynamic characterization, inverse approaches, vibration testing, viscoelastic layers
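A minimal sketch of the kind of FRF-matching parameter identification described above, fitting a one-term generalized Maxwell model to a measured complex modulus by least squares (the data, model order, and starting values are assumptions, not the authors' FE-based setup):

# Minimal sketch: identify generalized Maxwell parameters by minimizing the
# misfit to a (here synthetic) measured complex modulus.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
omega = np.logspace(0, 4, 50)                                  # rad/s
E_true = 1e6 + 5e5 * (1j * omega * 1e-3) / (1 + 1j * omega * 1e-3)
E_meas = E_true * (1 + 0.02 * rng.standard_normal(omega.size)) # synthetic "measurement"

def maxwell(params, w):
    E_inf, E1, tau1 = params                      # one-term generalized Maxwell model
    return E_inf + E1 * (1j * w * tau1) / (1 + 1j * w * tau1)

def residual(params):
    diff = maxwell(params, omega) - E_meas
    return np.concatenate([diff.real, diff.imag]) # least_squares needs real residuals

fit = least_squares(residual, x0=[5e5, 1e5, 1e-2], bounds=(0, np.inf))
print(fit.x)  # identified E_inf, E1, tau1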
Procedia PDF Downloads 204
2123 Contactless Electromagnetic Detection of Stress Fluctuations in Steel Elements
Authors: M. A. García, J. Vinolas, A. Hernando
Abstract:
Steel is nowadays one of the most important structural materials because of its outstanding mechanical properties. Therefore, in order to pursue a sustainable economic model and to optimize the use of extensive resources, new methods to monitor and prevent failure of steel-based facilities are required. Classical mechanical tests, as for instance building testing, are invasive and destructive. Moreover, for facilities where the steel element is embedded (as in reinforced concrete), these techniques are directly inapplicable. Hence, non-invasive monitoring techniques that prevent failure without altering the structural properties of the elements are required. Among them, electromagnetic methods are particularly suitable for non-invasive inspection of the mechanical state of steel-based elements. Magnetoelastic coupling effects induce a modification of the electromagnetic properties of an element upon applied stress. Since most steels are ferromagnetic because of their large Fe content, it is possible to inspect their structure and state in a non-invasive way. We present here a distinct electromagnetic method for the contactless evaluation of internal stress in steel-based elements. In particular, this method relies on measuring the magnetic induction between two coils with the steel specimen between them. We found that the alteration of the electromagnetic properties of the steel specimen under applied stress induced changes in the induction that allowed us to detect stress well below half of the elastic limit of the material. Hence, it represents an outstanding non-invasive method to prevent failure in steel-based facilities. We describe the theoretical model, present experimental results to validate it, and finally show a practical application for the detection of stress and inhomogeneities in train railways.
Keywords: magnetoelastic, magnetic induction, mechanical stress, steel
Procedia PDF Downloads 48
2122 Flexible Ethylene-Propylene Copolymer Nanofibers Decorated with Ag Nanoparticles as Effective 3D Surface-Enhanced Raman Scattering Substrates
Authors: Yi Li, Rui Lu, Lianjun Wang
Abstract:
With the rapid development of the chemical industry, the consumption of volatile organic compounds (VOCs) has increased extensively. In the process of VOC production and application, plenty of them have been released into the environment. As a result, this has led not only to pollution problems in soil and groundwater but also to harm to human beings. Thus, it is important to develop a sensitive and cost-effective analytical method for trace VOC detection in the environment. Surface-enhanced Raman spectroscopy (SERS), as one of the most sensitive optical analytical techniques, with rapid response, pinpoint accuracy, and noninvasive detection, has been widely used for ultratrace analysis. Based on plasmon resonance on the nanoscale metallic surface, SERS technology can detect even single molecules owing to the abundant nanogaps (i.e., 'hot spots') on the nanosubstrate. In this work, self-supported flexible silver nitrate (AgNO3)/ethylene-propylene copolymer (EPM) hybrid nanofibers were fabricated by electrospinning. After an in-situ chemical reduction using ice-cold sodium borohydride as the reducing agent, numerous silver nanoparticles were formed on the nanofiber surface. By adjusting the reduction time and AgNO3 content, the morphology and dimensions of the silver nanoparticles could be controlled. According to the principles of solid-phase extraction, hydrophobic substances are more likely to partition into the hydrophobic EPM membrane in an aqueous environment, while water and other polar components are excluded from the analytes. Through enrichment by the EPM fibers, the number of hydrophobic molecules located on the 'hot spots' generated by the criss-crossed nanofibers is greatly increased, which further enhances the SERS signal intensity. The as-prepared Ag/EPM hybrid nanofibers were first employed to detect a common SERS probe molecule (p-aminothiophenol) with a detection limit down to 10⁻¹² M, which demonstrated excellent SERS performance. To further study the application of the fabricated substrate for monitoring hydrophobic substances in water, several typical VOCs, such as benzene, toluene, and p-xylene, were selected as model compounds. The results showed that the characteristic peaks of these target analytes in a mixed aqueous solution could be distinguished even at a concentration of 10⁻⁶ M after a multi-peak Gaussian fitting process, including the C-H bending (850 cm⁻¹) and C-C ring stretching (1581 cm⁻¹, 1600 cm⁻¹) of benzene; the C-H bending (844 cm⁻¹, 1151 cm⁻¹), C-C ring stretching (1001 cm⁻¹), and CH3 bending vibration (1377 cm⁻¹) of toluene; and the C-H bending (829 cm⁻¹) and C-C stretching (1614 cm⁻¹) of p-xylene. The SERS substrate has remarkable advantages, combining the enrichment capacity of EPM and the Raman enhancement of the Ag nanoparticles. Meanwhile, the huge specific surface area resulting from electrospinning is beneficial for increasing the number of adsorption sites and promotes 'hot spot' formation. In summary, this work provides powerful potential for rapid, on-site, and accurate detection of trace VOCs using a portable Raman spectrometer.
Keywords: electrospinning, ethylene-propylene copolymer, silver nanoparticles, SERS, VOCs
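A minimal sketch of the multi-peak Gaussian fitting step mentioned above, applied to a synthetic two-band spectrum (the spectrum and initial guesses are placeholders, not the authors' data):

# Minimal sketch: resolve overlapping Raman bands with a sum-of-Gaussians fit.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, c1, w1, a2, c2, w2):
    return (a1 * np.exp(-((x - c1) / w1) ** 2)
            + a2 * np.exp(-((x - c2) / w2) ** 2))

rng = np.random.default_rng(0)
shift = np.linspace(950, 1050, 200)                      # Raman shift, cm^-1
spectrum = two_gaussians(shift, 1.0, 1001, 4, 0.4, 1030, 6)
spectrum += 0.02 * rng.standard_normal(shift.size)       # synthetic noise

p0 = [1, 1000, 5, 0.5, 1028, 5]                          # rough initial guesses
popt, _ = curve_fit(two_gaussians, shift, spectrum, p0=p0)
print(popt)  # fitted amplitudes, centers, widths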
Procedia PDF Downloads 159
2121 Design of Bacterial Pathogens Identification System Based on Scattering of Laser Beam Light and Classification of Binned Plots
Authors: Mubashir Hussain, Mu Lv, Xiaohan Dong, Zhiyang Li, Bin Liu, Nongyue He
Abstract:
Detection and classification of microbes have a vast range of applications in biomedical engineering, especially in the detection, characterization, and quantification of bacterial contaminants. Different techniques are emerging in biomedical engineering for the identification of pathogens. The latest technology uses light scattering, which is capable of identifying different pathogens without any need for biochemical processing. In the Bacterial Pathogens Identification System (BPIS), a laser beam passes through the sample and light scatters off it. An assembly of photodetectors surrounds the sample at different angles to detect the scattering of light. The algorithm of the system consists of two parts: (a) library files and (b) a comparator. Library files contain data of known species of bacterial microbes in the form of binned plots, while the comparator compares the data of an unknown sample with the library files. Using the collected data of an unknown bacterial species, the highest voltage values are stored in the form of peaks and arranged in 3D histograms to find the frequency of occurrence. The resulting data are compared with the library files of known bacterial species. If the sample data match any library file of a known bacterial species, the sample is identified as that microbe. An experiment was performed to identify three different bacteria: Enterococcus faecalis, Pseudomonas aeruginosa, and Escherichia coli. By applying the algorithm using the library files of the given samples, the results were promising. This system is potentially applicable to several biomedical areas, especially those related to cell morphology.
Keywords: microbial identification, laser scattering, peak identification, binned plots classification
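A minimal sketch of the assumed comparator logic, matching a binned scattering pattern of an unknown sample against library histograms (all data, bin choices, and the similarity measure are illustrative assumptions, not the BPIS source):

# Minimal sketch: nearest-library-histogram classification of a binned pattern.
import numpy as np

rng = np.random.default_rng(0)
bins = np.linspace(1.0, 4.0, 33)                  # shared bin edges for all patterns
library = {
    "E. faecalis": np.histogram(rng.normal(2.0, 0.3, 1000), bins=bins)[0],
    "P. aeruginosa": np.histogram(rng.normal(2.6, 0.4, 1000), bins=bins)[0],
    "E. coli": np.histogram(rng.normal(3.1, 0.2, 1000), bins=bins)[0],
}
sample = np.histogram(rng.normal(3.1, 0.2, 1000), bins=bins)[0]

def similarity(a, b):
    a = a / a.sum(); b = b / b.sum()              # normalize bin frequencies
    return -np.linalg.norm(a - b)                 # higher value = more similar

match = max(library, key=lambda k: similarity(sample, library[k]))
print("identified as:", match)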
Procedia PDF Downloads 146
2120 Evaluation of Simulated Noise Levels through the Analysis of Temperature and Rainfall: A Case Study of Nairobi Central Business District
Authors: Emmanuel Yussuf, John Muthama, John Ng'ang'A
Abstract:
Noise levels have been increasing all over the world in the last decade. Many factors contribute to this increase, which is causing health-related effects in humans. Developing countries are not left out of the picture, as they are still growing and advancing their development. Motor vehicles are increasing on urban roads; there is an increase in infrastructure due to the rising population, an increasing number of industries to provide goods, and many other activities. All these activities lead to high noise levels in cities. This study was conducted in Nairobi's Central Business District (CBD) with the main objective of simulating noise levels in order to understand the noise to which people within the urban area are exposed, in relation to weather parameters, namely temperature, rainfall, and wind field. The study was achieved using the Neighbourhood Proximity Model and Time Series Analysis, with data obtained from proxies/remote sensing by satellites, in order to establish the noise levels to which people of Nairobi CBD are exposed. The findings showed that there is an increase in temperature (0.1°C per year) and a decrease in precipitation (40 mm per year), while the noise levels in the area are increasing. The study also found that the noise levels to which people in Nairobi CBD are exposed were roughly between 61 and 63 decibels and have been increasing, a level which is high and likely to cause adverse physical and psychological effects on the human body; air temperature, precipitation, and wind contribute greatly to the spread of noise. As noise reduction measures, the use of soundproof materials in buildings close to busy roads, the implementation of strict laws for the most emitting sources, and further research on the subject were recommended. The data used for this study ranged from the year 2000 to 2015, rainfall being in millimeters (mm), temperature in degrees Celsius (°C), and the urban form characteristics in meters (m).
Keywords: simulation, noise exposure, weather, proxy
Procedia PDF Downloads 378
2119 Data Compression in Ultrasonic Network Communication via Sparse Signal Processing
Authors: Beata Zima, Octavio A. Márquez Reyes, Masoud Mohammadgholiha, Jochen Moll, Luca de Marchi
Abstract:
This document presents an approach for using compressed sensing in signal encoding and information transfer within a guided wave sensor network comprised of specially designed frequency steerable acoustic transducers (FSATs). Wave propagation in a damaged plate was simulated using the commercial FEM-based software COMSOL. Guided waves were excited by means of FSATs, characterized by the special shape of their electrodes, and modeled using PIC255 piezoelectric material. The special shape of the FSAT allows wave energy to be focused in a certain direction according to the frequency components of its actuation signal, which makes a larger monitored area available. The process begins when an FSAT detects and records a reflection from damage in the structure; this signal is then encoded and prepared for transmission using a combined approach based on Compressed Sensing Matching Pursuit and Quadrature Amplitude Modulation (QAM). After the signal has been encoded into binary characters, the information is transmitted between the nodes in the network. The message reaches the last node, where it is finally decoded and processed to be used for damage detection and localization purposes. The main aim of the investigation is to determine the location of detected damage using the reconstructed signals. The study demonstrates that the special steerable capabilities of FSATs not only facilitate the detection of damage but also permit transmitting the damage information to a chosen area in a specific direction of the investigated structure.
Keywords: data compression, ultrasonic communication, guided waves, FEM analysis
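A minimal sketch of the sparse encoding idea described above (an assumption-laden illustration, not the authors' implementation): with an orthonormal DCT dictionary, matching pursuit reduces to keeping the largest-magnitude coefficients, which are what would then be QAM-modulated and transmitted:

# Minimal sketch: sparse (matching-pursuit style) encoding of a toy echo signal.
import numpy as np
from scipy.fft import dct, idct

n, k = 256, 10
t = np.arange(n)
signal = np.sin(2 * np.pi * 0.05 * t) * np.exp(-t / 80.0)    # toy damage echo

coeffs = dct(signal, norm="ortho")               # orthonormal DCT dictionary
idx = np.argsort(np.abs(coeffs))[-k:]            # k strongest atoms
sparse = np.zeros(n); sparse[idx] = coeffs[idx]  # this sparse vector gets transmitted

recon = idct(sparse, norm="ortho")
err = np.linalg.norm(signal - recon) / np.linalg.norm(signal)
print(f"relative reconstruction error with {k}/{n} coefficients: {err:.3f}")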
Procedia PDF Downloads 123
2118 The Determinants of Enterprise Risk Management: Literature Review, and Future Research
Authors: Sylvester S. Horvey, Jones Mensah
Abstract:
The growing complexities and dynamics in the business environment have led to a new approach to risk management, known as enterprise risk management (ERM). ERM is a system and an approach to managing the risks of an organization in an integrated manner to achieve the corporate goals and strategic objectives. Regardless of the diversities in the business environment, ERM has become an essential factor in managing individual and business risks because ERM is believed to enhance shareholder value and firm growth. Despite the growing body of literature on ERM, the question of what factors drive ERM remains little studied. This study provides a comprehensive literature review of the main factors that contribute to ERM implementation. Google Scholar was the leading search engine used to identify empirical literature, and the review spanned 2000 to 2020. Articles published in Scimago-ranked and Scopus-indexed journals were examined. Thirteen firm characteristics and sixteen articles were considered for the empirical review. Most empirical studies agreed that firm size, institutional ownership, industry type, auditor type, industrial diversification, earnings volatility, stock price volatility, and the presence of an internal auditor had a positive relationship with ERM adoption, with firm size, institutional ownership, auditor type, and type of industry most often found to be statistically significant. Other factors, such as financial leverage, profitability, asset opacity, international diversification, and firm complexity, revealed inconclusive results. The growing literature on ERM is not without limitations; hence, this study suggests that further research should examine ERM determinants within a new geographical context while considering a new and robust way of measuring ERM rather than relying on a simple proxy (dummy) for ERM measurement. Other firm characteristics such as organizational culture and context, corporate scandals and losses, and governance could be considered determinants of ERM adoption.
Keywords: enterprise risk management, determinants, ERM adoption, literature review
Procedia PDF Downloads 172
2117 Array Type Miniaturized Ultrasonic Sensors for Detecting Sinkhole in the City
Authors: Won Young Choi, Kwan Kyu Park
Abstract:
The road depressions occurring in urban areas recently differ, in both cause and generation mechanism, from the sinkholes occurring in limestone areas. The main causes of sinkholes occurring in city centers are the loss of soil due to the damage of old underground buried materials and groundwater discharge due to large underground excavation works. Sinkholes in urban areas are mostly detected using Ground Penetrating Radar (GPR). However, because GPR is based on electromagnetic waves, it is challenging to implement a compact system or to detect water-saturated conditions. Although many ultrasonic underground detection studies have been conducted, near-surface detection (several tens of cm to several meters) has so far relied on bulky systems using geophones as receivers. The goal of this work is to fabricate a miniaturized sinkhole detection system based on low-cost ultrasonic transducers with a 40 kHz resonant frequency, high transmission pressure, and high receiving sensitivity. Motivated by biomedical ultrasonic imaging methods, we detect air layers below the ground, such as under asphalt, through the pulse-echo method. To improve image quality, a multi-channel linear array system is implemented, and images are acquired by the classical synthetic aperture imaging method. We present a successful feasibility test of a multi-channel sinkhole detector based on ultrasonic transducers. In this work, we present and analyze images obtained by single-channel pulse-echo imaging and by synthetic aperture imaging.
Keywords: road depression, sinkhole, synthetic aperture imaging, ultrasonic transducer
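A minimal sketch of the classical synthetic aperture (delay-and-sum) reconstruction mentioned above (geometry, propagation speed, and the recorded A-scans are hypothetical placeholders, not the authors' data):

# Minimal sketch: pulse-echo delay-and-sum imaging over a small transducer array.
import numpy as np

c = 400.0            # m/s, assumed effective propagation speed (illustrative)
fs = 1e6             # Hz, sampling rate
elements = np.linspace(-0.1, 0.1, 8)                 # x-positions of 8 transducers [m]
n_samples = 2048
rng = np.random.default_rng(0)
ascans = rng.standard_normal((len(elements), n_samples))  # placeholder recorded echoes

xs = np.linspace(-0.2, 0.2, 64)
zs = np.linspace(0.05, 0.5, 64)
image = np.zeros((len(zs), len(xs)))
for iz, z in enumerate(zs):
    for ix, x in enumerate(xs):
        for ie, ex in enumerate(elements):           # pulse-echo: same element Tx/Rx
            r = np.hypot(x - ex, z)
            s = int(round(2 * r / c * fs))           # two-way travel time -> sample index
            if s < n_samples:
                image[iz, ix] += ascans[ie, s]
print(image.shape)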
Procedia PDF Downloads 141
2116 Intrusion Detection in Cloud Computing Using Machine Learning
Authors: Faiza Babur Khan, Sohail Asghar
Abstract:
With the emergence of distributed environments, cloud computing is proving to be the most stimulating computing paradigm shift in computer technology, resulting in spectacular expansion in the IT industry. Many companies have augmented their technical infrastructure by adopting cloud resource sharing architecture. Cloud computing has opened doors to unlimited opportunities, from application and platform availability to expandable storage and the provision of computing environments. However, from a security viewpoint, an added risk level is introduced by clouds, weakening protection mechanisms and making privacy, data security, and on-demand service harder to guarantee. Issues of trust, confidentiality, and integrity are elevated due to the multitenant resource sharing architecture of the cloud. Trust or reliability of the cloud refers to its capability of providing the needed services precisely and unfailingly. Confidentiality is the ability of the architecture to ensure that only authorized parties access its private data. It also guarantees integrity to protect the data from being fabricated by an unauthorized user. So, in order to assure the provision of a secured cloud, a roadmap or model is obligatory to analyze a security problem, design mitigation strategies, and evaluate solutions. The aim of the paper is twofold: first, to highlight the factors that make cloud security critical, along with alleviation strategies; and second, to propose an intrusion detection model that identifies attackers in a preventive way using a machine learning Random Forest classifier with an accuracy of 99.8%. This model uses a small number of features. A comparison with other classifiers is also presented.
Keywords: cloud security, threats, machine learning, random forest, classification
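A minimal sketch of a Random Forest intrusion classifier of the kind proposed (shown on a synthetic stand-in dataset; the feature set and parameters are illustrative, not the paper's):

# Minimal sketch: train and evaluate a Random Forest on synthetic IDS-like data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           random_state=42)         # placeholder for IDS features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))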
Procedia PDF Downloads 319
2115 Method Validation for Heavy Metal Determination in Spring Water and Sediments
Authors: Habtamu Abdisa
Abstract:
Spring water is particularly valuable due to its high mineral content, which is beneficial for human health. However, anthropogenic activities usually disturb the natural levels of its composition, which can cause adverse health effects. Regular monitoring of naturally occurring environmental resources is of great concern in the world today. Spectrophotometry is one of the best methods for qualifying and quantifying the mineral contents of environmental water samples. This research was conducted to evaluate the quality of spring water with respect to its heavy metal composition. A grab sampling technique was employed to collect representative samples, including duplicates. The samples were then acidified with concentrated HNO3 to a pH below 2 and stored at 4°C. The samples were digested and analyzed for cadmium (Cd), chromium (Cr), manganese (Mn), copper (Cu), iron (Fe), and zinc (Zn) following method validation. Atomic Absorption Spectrometry (AAS) was utilized for the sample analysis. Quality control measures, including blanks, duplicates, and certified reference materials (CRMs), were implemented to ensure the accuracy and precision of the analytical results. Of the metals analyzed in the water samples, Cd and Cr were found to be below the detection limit. However, the concentrations of Mn, Cu, Fe, and Zn ranged from mean values of 0.119-0.227 mg/L, 0.142-0.166 mg/L, 0.183-0.267 mg/L, and 0.074-0.181 mg/L, respectively. Sediment analysis revealed mean concentration ranges of 348.31-429.21 mg/kg, 0.23-0.28 mg/kg, 18.73-22.84 mg/kg, 2.76-3.15 mg/kg, 941.84-1128.56 mg/kg, and 42.39-66.53 mg/kg for Mn, Cd, Cu, Cr, Fe, and Zn, respectively. The study results established that the evaluated spring water and its associated sediment met the regulatory standards and guidelines for heavy metal concentrations. Furthermore, this research can enhance the quality assurance and control processes for environmental sample analysis, ensuring the generation of reliable data.
Keywords: method validation, heavy metal, spring water, sediment, method detection limit
Procedia PDF Downloads 67
2114 Foot-and-Mouth Virus Detection in Asymptomatic Dairy Cows without Foot-and-Mouth Disease Outbreak
Authors: Duanghathai Saipinta, Tanittian Panyamongkol, Witaya Suriyasathaporn
Abstract:
Animal management aims to provide a suitable environment for animals, allowing maximal productivity. Prevention of disease is an important part of animal management. Foot-and-mouth disease (FMD) is a highly contagious viral disease in cattle and is an economically important animal disease worldwide. Monitoring for the FMD virus on farms is a useful management practice for the prevention of FMD outbreaks. A recent publication indicated that samples collected by nasal swab can be used for monitoring FMD in symptomatic cows. Therefore, the objective of this study was to determine the presence of FMD virus in asymptomatic dairy cattle using nasal swab samples during the absence of an FMD outbreak. The study was conducted from December 2020 to June 2021 using 185 dairy cattle with no clinical signs of FMD in Chiang Mai Province, Thailand. Cows were selected at random, nasal mucosal swabs were used to collect samples from the selected cows, and the samples were then evaluated for the presence of FMD virus using a real-time RT-PCR assay. In total, the FMD virus was detected in 4.9% of the dairy cattle, comprising 2 dairy farms in Mae-on (8 samples; 9.6%) and 1 farm in the Chai-Prakan district (1 sample; 1.2%). Interestingly, both farms in Mae-on experienced FMD outbreaks six months after this detection. This indicates that the FMD virus present in asymptomatic cattle might be related to a subsequent outbreak of FMD. The outbreak demonstrates the presence of the virus in the environment. In conclusion, monitoring of FMD can be performed by nasal swab collection. Further investigation is needed to show whether the FMD virus present in asymptomatic cattle could be the cause of a subsequent FMD outbreak or not.
Keywords: cattle, foot-and-mouth disease, nasal swab, real-time RT-PCR assay
Procedia PDF Downloads 230
2113 The Dangers of Attentional Inertia in the Driving Task
Authors: Catherine Thompson, Maryam Jalali, Peter Hills
Abstract:
The allocation of visual attention is critical when driving and anything that limits attention will have a detrimental impact on safety. Engaging in a secondary task reduces the amount of attention directed to the road because drivers allocate resources towards this task, leaving fewer resources to process driving-relevant information. Yet the dangers associated with a secondary task do not end when the driver returns their attention to the road. Instead, the attentional settings adopted to complete a secondary task may persist to the road, affecting attention, and therefore affecting driver performance. This 'attentional inertia' effect was investigated in the current work. Forty drivers searched for hazards in driving video clips while their eye-movements were recorded. At varying intervals they were instructed to attend to a secondary task displayed on a tablet situated to their left-hand side. The secondary task consisted of three separate computer games that induced horizontal, vertical, and random eye movements. Visual search and hazard detection in the driving clips were compared across the three conditions of the secondary task. Results showed that the layout of information in the secondary task, and therefore the allocation of attention in this task, had an impact on subsequent search in the driving clips. Vertically presented information reduced the wide horizontal spread of search usually associated with accurate driving and had a negative influence on the detection of hazards. The findings show the additional dangers of engaging in a secondary task while driving. The attentional inertia effect has significant implications for semi-autonomous and autonomous vehicles in which drivers have greater opportunity to direct their attention away from the driving task.
Keywords: attention, eye-movements, hazard perception, visual search
Procedia PDF Downloads 162
2112 Shark Detection and Classification with Deep Learning
Authors: Jeremy Jenrette, Z. Y. C. Liu, Pranav Chimote, Edward Fox, Trevor Hastie, Francesco Ferretti
Abstract:
Suitable shark conservation depends on well-informed population assessments. Direct methods such as scientific surveys and fisheries monitoring are adequate for defining population statuses, but species-specific indices of abundance and distribution coming from these sources are rare for most shark species. We can rapidly fill these information gaps by boosting media-based remote monitoring efforts with machine learning and automation. We created a database of shark images by sourcing 24,546 images covering 219 species of sharks from the web application sharkPulse and the social network Instagram. We used object detection to extract shark features and inflate this database to 53,345 images. We packaged object-detection and image classification models into a Shark Detector bundle. We developed the Shark Detector to recognize and classify sharks from videos and images using transfer learning and convolutional neural networks (CNNs). We applied these models to common data-generation approaches for sharks: boosting training datasets, processing baited remote camera footage and online videos, and data-mining Instagram. We examined the accuracy of each model and tested genus and species prediction correctness as a function of training data quantity. The Shark Detector located sharks in baited remote footage and YouTube videos with an average accuracy of 89%, and classified located subjects to the species level with 69% accuracy (n = 8 species). The Shark Detector sorted heterogeneous datasets of images sourced from Instagram with 91% accuracy and classified species with 70% accuracy (n = 17 species). Data-mining Instagram can inflate training datasets and increase the Shark Detector's accuracy as well as facilitate archiving of historical and novel shark observations. Base accuracy of genus prediction was 68% across 25 genera. The average base accuracy of species prediction within each genus class was 85%. The Shark Detector can classify 45 species. All data-generation methods were processed without manual interaction. As media-based remote monitoring strives to dominate methods for observing sharks in nature, we developed an open-source Shark Detector to facilitate common identification applications. Prediction accuracy of the software pipeline increases as more images are added to the training dataset. We provide public access to the software on our GitHub page.
Keywords: classification, data mining, Instagram, remote monitoring, sharks
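A minimal sketch of the transfer-learning setup described above (backbone choice, input size, and training details are assumptions; the actual pipeline lives on the authors' GitHub page):

# Minimal sketch: frozen pretrained backbone plus a new species-classification head.
import tensorflow as tf

num_species = 45
base = tf.keras.applications.MobileNetV2(include_top=False, weights="imagenet",
                                         input_shape=(224, 224, 3), pooling="avg")
base.trainable = False                      # freeze pretrained ImageNet features

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(num_species, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # image datasets not shown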
Procedia PDF Downloads 120
2111 Evaluation of Commercials by Psychological Changes in Consumers’ Physiological Characteristics
Authors: Motoki Seguchi, Fumiko Harada, Hiromitsu Shimakawa
Abstract:
There are many local companies in the countryside that carefully produce and sell products, including crafts and foods made with traditional methods. These companies are likely to use commercials to advertise their products. However, it is difficult for companies to judge whether the commercials they create are having an impact on consumers. Therefore, to create effective commercials, this study researches what kinds of gimmicks in commercials affect what kinds of consumers. This study proposes a method for extracting psychological change points from the physiological characteristics of consumers while they are watching commercials and for estimating the gimmicks in a commercial that affect consumer engagement. In this method, change point detection is applied to pupil size for estimating gimmicks that affect consumers' emotional engagement, and to EDA for estimating gimmicks that affect cognitive engagement. A questionnaire is also used to estimate the commercials that influence behavioral engagement. As a result of estimating the gimmicks that influence consumer engagement using this method, it was found that there are some common features among the gimmicks. To influence cognitive engagement, it was found to be useful to include flashback scenes, the messages to be conveyed, the company's name, and the company's logos as gimmicks. It was also found that flashback scenes and story climaxes were useful in influencing emotional engagement. Furthermore, it was found that the use of storytelling commercials may or may not be useful, depending on which behaviors are desired from which consumers. The study also estimated the gimmicks that influence consumers for each target group and found that the useful gimmicks differ slightly between students and working adults. By using this method, it is possible to understand which gimmicks in a commercial affect which type of consumer engagement. Therefore, the results of this study can be used as a reference for the gimmicks that should be included in commercials when companies create their commercials in the future.
Keywords: change point detection, estimating engagement, physiological characteristics, psychological changes, watching commercials
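A minimal sketch of offline change-point detection on a pupil-size trace, using the PELT algorithm from the open-source 'ruptures' package as one possible tool (the signal and penalty are placeholders, and this is not the study's code):

# Minimal sketch: detect mean shifts in a synthetic pupil-diameter trace.
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(1)
pupil = np.concatenate([rng.normal(3.0, 0.05, 300),    # baseline, mm
                        rng.normal(3.4, 0.05, 200),    # dilation after a gimmick
                        rng.normal(3.1, 0.05, 300)])

algo = rpt.Pelt(model="l2").fit(pupil)
change_points = algo.predict(pen=5)        # indices where the mean level shifts
print(change_points)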
Procedia PDF Downloads 184
2110 Quantitative Analysis of (+)-Catechin and (-)-Epicatechin in Pentace burmanica Stem Bark by HPLC
Authors: Thidarat Duangyod, Chanida Palanuvej, Nijsiri Ruangrungsi
Abstract:
Pentace burmanica Kurz., belonging to the Malvaceae family, is commonly used as an anti-diarrheal in Thai traditional medicine. A method for the quantification of (+)-catechin and (-)-epicatechin in P. burmanica stem bark from 12 different Thai markets by reverse-phase high performance liquid chromatography (HPLC) was investigated and validated. The analysis was performed on a Shimadzu DGU-20A3 HPLC system equipped with a Shimadzu SPD-M20A photodiode array detector. The separation was accomplished with an Inertsil ODS-3 column (5 µm, 4.6 × 250 mm) using 0.1% formic acid in water (A) and 0.1% formic acid in acetonitrile (B) as the mobile phase at a flow rate of 1 ml/min. Isocratic elution was set at 20% B for 15 min, and the column temperature was maintained at 40°C. Detection was at a wavelength of 280 nm. Both (+)-catechin and (-)-epicatechin were present in the ethanolic extract of P. burmanica stem bark. The content of (-)-epicatechin was found to be 59.74 ± 1.69 µg/mg of crude extract. In contrast, the quantitation of the (+)-catechin content was omitted because of its small amount. The method was linear over a range of 5-200 µg/ml with good coefficients of determination (r² > 0.99) for (+)-catechin and (-)-epicatechin. Limit of detection values were found to be 4.80 µg/ml for (+)-catechin and 5.14 µg/ml for (-)-epicatechin. Limits of quantitation of (+)-catechin and (-)-epicatechin were 14.54 µg/ml and 15.57 µg/ml, respectively. Good repeatability and intermediate precision (%RSD < 3) were found in this study. The average recoveries of (+)-catechin and (-)-epicatechin were in the range of 91.11-97.02% and 88.53-93.78%, respectively, with %RSD less than 2. The peak purity indices of the catechins were more than 0.99. The results suggested that the HPLC method is precise and accurate and can be conveniently used for (+)-catechin and (-)-epicatechin determination in ethanolic extracts of P. burmanica stem bark. Moreover, the stem bark of P. burmanica was found to be a rich source of (-)-epicatechin.
Keywords: pentace burmanica, (+)-catechin, (-)-epicatechin, high performance liquid chromatography
Procedia PDF Downloads 452
2109 Investigation of Leptospira Infection in Stray Animals in Thailand: Leptospirosis Risk Reduction in Human
Authors: Ruttayaporn Ngasaman, Saowakon Indouang, Usa Chethanond
Abstract:
Leptospirosis is a zoonosis of public health concern in Thailand. Humans and animals are often infected by contact with contaminated water. Infected animals play an important role in transmitting leptospira to both humans and other hosts via urine. In humans, it can cause a wide range of symptoms, some of which may present as mild flu-like symptoms, including fever, vomiting, and jaundice. Without treatment, leptospirosis can lead to kidney damage, meningitis, liver failure, respiratory distress, and even death. The prevalence of leptospirosis in stray animals in Thailand is unknown. The aim of this study was to investigate leptospira infection in stray animals, including dogs and cats, in Songkhla province, Thailand. A total of 434 blood samples were collected from 370 stray dogs and 64 stray cats during a population control program from 2014 to 2018. A screening test using latex agglutination for the detection of antibodies against Leptospira interrogans in serum samples showed 29.26% (127/434) positive. There were 120 positive samples from stray dogs and 7 positive samples from stray cats. Detection by polymerase chain reaction specific to the LipL32 gene of Leptospira interrogans showed 1.61% (7/434) positive. Stray cats (5/64) showed a higher prevalence than stray dogs (2/370). Although the detected rate of active infection was low, the seroprevalence was high. This result indicated that the stray animals were not actively infected at the time of sample collection but had been infected in the past or were in a latent period of infection. They may act as a reservoir for domestic animals and humans that share the same environment. In order to prevent and reduce the risk of leptospira infection in humans, stray animals should receive health checks, vaccination, and disease treatment.
Keywords: leptospirosis, stray animals, risk reduction, Thailand
Procedia PDF Downloads 132
2108 Multiple Etiologies and Incidences of Co-Infections in Childhood Diarrhea in a Hospital Based Screening Study in Odisha, India
Authors: Arpit K. Shrivastava, Nirmal K. Mohakud, Subrat Kumar, Priyadarshi S. Sahu
Abstract:
Acute diarrhea is one of the major causes of morbidity and mortality among children less than five years of age. Multiple etiologies have been implicated in infectious gastroenteritis causing acute diarrhea. In our study, fecal samples (n=165) were collected from children (<5 years) presenting with symptoms of acute diarrhea. Samples were screened for viral, bacterial, and parasitic etiologies, including Rotavirus, Adenovirus, diarrhoeagenic Escherichia coli (EPEC, EHEC, STEC, O157, O111), Shigella spp., Salmonella spp., Vibrio cholerae, Cryptosporidium spp., and Giardia spp. The overall results from our study showed that 57% of children below 5 years of age with acute diarrhea were positive for at least one infectious etiology. Diarrhoeagenic Escherichia coli was detected as the major etiological agent (29.09%), followed by Rotavirus (24.24%), Shigella (21.21%), Adenovirus (5.45%), Cryptosporidium (2.42%), and Giardia (0.60%). Among the different DEC strains, EPEC was detected at a significantly higher rate in children <2 years of age in comparison to the >2 years age group (p = 0.001). Concurrent infections with two or more pathogens were observed in 47 of 165 (28.48%) cases, with a predominant incidence in <2-year-old children (66.66%) compared to children in the 2 to 5 years age group. Co-infection of Rotavirus with Shigella was the most frequent combination, detected in 17.94% of cases, followed by Rotavirus with EPEC (15.38%) and Shigella with STEC (12.82%). Detection of multiple infectious etiologies and diagnosis of the right causative agent(s) can immensely help in the better management of acute childhood diarrhea. In the future, more studies focusing on the detection of cases with concurrent infections should be carried out, as we believe that the etiological agents might complement each other's strategies of pathogenesis, resulting in severe diarrhea.
Keywords: children, co-infection, infectious diarrhea, Odisha
Procedia PDF Downloads 335
2107 S. cerevisiae Strains Co-Cultured with Isochrysis Galbana Create Greater Biomass for Biofuel Production than Nannochloropsis sp.
Authors: Madhalasa Iyer
Abstract:
The increase in sustainable practices has encouraged the research and production of alternative fuels. New techniques of bio-flocculation with the addition of yeast and bacteria strains have increased the efficiency of biofuel production. Fatty acid methyl ester (FAME) analysis in previous research has indicated that yeast can serve as a plausible enhancer for microalgal lipid production. This research aims to identify the yeast and microalgae treatment group that produces the largest algal biomass. The mass of the dried algae is used as a proxy for TAG production, which correlates with the cultivation of biofuels. The study uses a model bioreactor created and built using PVC pipes, an 8-port sprinkler system manifold, a CO2 aquarium tank, and disposable water bottles to grow the microalgae. Nannochloropsis sp. and Isochrysis galbana were inoculated separately in experimental groups 1 and 2 with no treatments, and in experimental groups 3 and 4 with each alga co-cultured with Saccharomyces cerevisiae in a medium of standard garden stone fertilizer. S. cerevisiae was grown in a petri dish on nutrient agar medium before inoculation. A Secchi stick was used before extraction to collect data on the optical density of the microalgae. A biomass estimator was then used to measure the approximate production of biomass. The microalgae were grown and extracted with a French press to analyze secondary measurements using the dried biomass. The experimental units of Isochrysis galbana treated with the baker's yeast strains showed an increase in the overall mass of the dried algae. S. cerevisiae proved to be a helpful addition to the culture, supporting the growth of the algae. The increase in productivity of this fuel source supports the possible replacement of non-renewable sources with more promising renewable alternatives. This research furthers the notion that yeast and yeast mutants can be engineered to be employed in efficient biofuel creation.
Keywords: biofuel, co-culture, S. cerevisiae, microalgae, yeast
Procedia PDF Downloads 107
2106 A Facile Nanocomposite of Graphene Oxide Reinforced Chitosan/Poly-Nitroaniline Polymer as a Highly Efficient Adsorbent for Extracting Polycyclic Aromatic Hydrocarbons from Tea Samples
Authors: Adel M. Al-Shutairi, Ahmed H. Al-Zahrani
Abstract:
Tea is a popular beverage drunk by millions of people throughout the globe. Tea has considerable health advantages, including antioxidant, antibacterial, antiviral, chemopreventive, and anticarcinogenic properties. As a result of environmental pollution (atmospheric deposition) and the production process, tea leaves may also contain a variety of dangerous substances, such as polycyclic aromatic hydrocarbons (PAHs). In this study, a graphene oxide reinforced chitosan/poly-nitroaniline polymer was prepared to develop a sensitive and reliable solid phase extraction (SPE) method for the extraction of PAH7 from tea samples, followed by high-performance liquid chromatography with fluorescence detection. The prepared adsorbent was validated in terms of linearity, limit of detection, limit of quantification, recovery (%), accuracy (%), and precision (%) for the determination of PAH7 (benzo[a]pyrene, benzo[a]anthracene, benzo[b]fluoranthene, chrysene, benzo[k]fluoranthene, dibenzo[a,h]anthracene, and benzo[g,h,i]perylene) in tea samples. The concentrations were determined in two types of tea commercially available in Saudi Arabia: black tea and green tea. The maximum mean Σ7PAHs was 68.23 ± 0.02 µg kg⁻¹ in black tea samples and 26.68 ± 0.01 µg kg⁻¹ in green tea samples. The minimum mean Σ7PAHs was 37.93 ± 0.01 µg kg⁻¹ in black tea samples and 15.26 ± 0.01 µg kg⁻¹ in green tea samples. The mean value of benzo[a]pyrene in black tea samples ranged from 6.85 to 12.17 µg kg⁻¹, with two samples exceeding the standard level (10 µg kg⁻¹) established by the European Union (EU), while in green tea it ranged from 1.78 to 2.81 µg kg⁻¹. Low levels of Σ7PAHs were detected in green tea samples in comparison with black tea samples.
Keywords: polycyclic aromatic hydrocarbons, CS, PNA and GO, black/green tea, solid phase extraction, Saudi Arabia
Procedia PDF Downloads 95
2105 Improvement of Microscopic Detection of Acid-Fast Bacilli for Tuberculosis by Artificial Intelligence-Assisted Microscopic Platform and Medical Image Recognition System
Authors: Hsiao-Chuan Huang, King-Lung Kuo, Mei-Hsin Lo, Hsiao-Yun Chou, Yusen Lin
Abstract:
The most robust and economical method for the laboratory diagnosis of TB is to identify mycobacterial bacilli (AFB) under acid-fast staining, despite its disadvantages of low sensitivity and labor intensiveness. Though digital pathology has become popular in medicine, an automated microscopic system for microbiology is still not available. A new AI-assisted automated microscopic system, consisting of a microscopic scanner and a recognition program powered by big data and deep learning, may significantly increase the sensitivity of TB smear microscopy. Thus, the objective was to evaluate such an automatic system for the identification of AFB. A total of 5,930 smears were enrolled for this study. An intelligent microscope system (TB-Scan, Wellgen Medical, Taiwan) was used for microscopic image scanning and AFB detection. 272 AFB smears were used for transfer learning to increase the accuracy. Referee medical technicians served as the gold standard for result discrepancies. Results showed that, for a total of 1,726 AFB smears, the automated system's accuracy, sensitivity, and specificity were 95.6% (1,650/1,726), 87.7% (57/65), and 95.9% (1,593/1,661), respectively. Compared to culture, the sensitivity of the human technicians was only 33.8% (38/142); however, the automated system achieved 74.6% (106/142), which is significantly higher than the human technicians, and this is the first such automated microscope system for TB smear testing in a controlled trial. This automated system could achieve higher TB smear sensitivity and laboratory efficiency and may complement molecular methods (e.g., GeneXpert) to reduce the total cost of TB control. Furthermore, such an automated system is capable of remote access via the internet and can be deployed in areas with limited medical resources.
Keywords: TB smears, automated microscope, artificial intelligence, medical imaging
Procedia PDF Downloads 227
2104 Predicting Returns Volatilities and Correlations of Stock Indices Using Multivariate Conditional Autoregressive Range and Return Models
Authors: Shay Kee Tan, Kok Haur Ng, Jennifer So-Kuen Chan
Abstract:
This paper extends the conditional autoregressive range (CARR) model to a multivariate CARR (MCARR) model, and further to a two-stage MCARR-return model, to model and forecast the volatilities, correlations, and returns of multiple financial assets. The first-stage model fits the scaled realised Parkinson volatility measures of the individual series and their pairwise sums of indices to the MCARR model to obtain in-sample estimates and forecasts of the volatilities of these individual and pairwise sum series. Covariances are then calculated to construct the fitted variance-covariance matrix of returns, which is imputed into the stage-two return model to capture the heteroskedasticity of the assets' returns. We investigate different choices of mean function to describe the volatility dynamics. Empirical applications are based on the Standard and Poor's 500, Dow Jones Industrial Average, and Dow Jones United States Financial Services indices. Results show that the stage-one MCARR models using asymmetric mean functions give better in-sample model fits than those based on symmetric mean functions. They also provide better out-of-sample volatility forecasts than those using CARR models, based on two robust loss functions, with the scaled realised open-to-close volatility measure as the proxy for the unobserved true volatility. We also find that the stage-two return models with constant means and multivariate Student-t errors give better in-sample fits than the Baba, Engle, Kraft, and Kroner type of generalized autoregressive conditional heteroskedasticity (BEKK-GARCH) models. The estimates and forecasts of value-at-risk (VaR) and conditional VaR based on the best MCARR-return models for each asset are provided and tested using the Kupiec test to confirm the accuracy of the VaR forecasts.
Keywords: range-based volatility, correlation, multivariate CARR-return model, value-at-risk, conditional value-at-risk
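For reference, a minimal sketch of the Kupiec proportion-of-failures test used to backtest the VaR forecasts (this is the standard likelihood-ratio formula; the sample numbers are illustrative):

# Minimal sketch: Kupiec POF test for unconditional VaR coverage.
import numpy as np
from scipy.stats import chi2

def kupiec_pof(n, x, p):
    """n = observations, x = VaR violations, p = nominal level (e.g. 0.01)."""
    pi_hat = x / n
    lr = -2 * (np.log((1 - p) ** (n - x) * p ** x)
               - np.log((1 - pi_hat) ** (n - x) * pi_hat ** x))
    return lr, chi2.sf(lr, df=1)           # LR statistic and p-value

print(kupiec_pof(n=1000, x=14, p=0.01))    # reject coverage if p-value < 0.05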
Procedia PDF Downloads 99
2103 Retrospective Evaluation of Vector-borne Infections in Cats Living in Germany (2012-2019)
Authors: I. Schäfer, B. Kohn, M. Volkmann, E. Müller
Abstract:
Introduction: Blood-feeding arthropods transmit parasitic, bacterial, or viral pathogens to domestic animals and wildlife. Vector-borne infections are gaining significance due to the increase in travel, the import of domestic animals from abroad, and the changing climate in Europe. Aims of the study: The main objective of this retrospective study was to assess the prevalence of vector-borne infections in cats for which a 'Feline Travel Profile' had been requested. Material and Methods: This retrospective study included test results from cats for which a 'Feline Travel Profile' established by LABOKLIN had been requested by veterinarians between April 2012 and December 2019. This profile contains direct detection methods via polymerase chain reaction (PCR) for Hepatozoon spp. and Dirofilaria spp. as well as indirect detection methods via immunofluorescence antibody test (IFAT) for Ehrlichia spp. and Leishmania spp. This profile was expanded to include an IFAT for Rickettsia spp. from July 2015 onwards. The prevalence of the different vector-borne infectious agents was calculated. Results: A total of 602 cats were tested using the 'Feline Travel Profile'. Positive test results were as follows: Rickettsia spp. IFAT 54/442 (12.2%), Ehrlichia spp. IFAT 68/602 (11.3%), Leishmania spp. IFAT 21/602 (3.5%), Hepatozoon spp. PCR 51/595 (8.6%), and Dirofilaria spp. PCR 1/595 cats (0.2%). Co-infections with more than one pathogen were detected in 22/602 cats. Conclusions: 170/602 cats (28.2%) tested positive for at least one vector-borne pathogen. Infections with multiple pathogens were detected in 3.7% of the cats. The data emphasize the importance of considering vector-borne infections as potential differential diagnoses in cats.
Keywords: arthropod-transmitted infections, feline vector-borne infections, Germany, laboratory diagnostics
Procedia PDF Downloads 165
2102 Vibration Based Damage Detection and Stiffness Reduction of Bridges: Experimental Study on a Small Scale Concrete Bridge
Authors: Mirco Tarozzi, Giacomo Pignagnoli, Andrea Benedetti
Abstract:
Structural systems are often subjected to degradation processes due to different kinds of phenomena, such as unexpected loadings, ageing of the materials, and fatigue cycles. This is especially true for bridges, for which safety evaluation is crucial for the purpose of planning maintenance. This paper discusses the experimental evaluation of stiffness reduction from frequency changes due to a uniform damage scenario. For this purpose, a 1:4 scaled bridge has been built in the laboratory of the University of Bologna. It is made of concrete, and its cross section is composed of a slab linked to four beams. This concrete deck is 6 m long and 3 m wide, and its natural frequencies have been identified dynamically by exciting it with an impact hammer, a dropping weight, or by walking on it randomly. After that, a set of loading cycles was applied to the bridge in order to produce a uniformly distributed crack pattern. During the loading phase, both the cracking moment and the yielding moment were reached. In order to define the relationship between frequency variation and loss of stiffness, the natural frequencies of the bridge were identified before and after the occurrence of the damage corresponding to each load step. The behavior of breathing cracks and their effect on the natural frequencies have been taken into account in the analytical calculations. By using an exponential function derived from a large number of experimental tests in the literature, it has been possible to predict the stiffness reduction from the frequency variation measurements. During the load tests, crack opening and mid-span vertical displacement were also monitored.
Keywords: concrete bridge, damage detection, dynamic test, frequency shifts, operational modal analysis
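For a linear elastic system, natural frequency scales with the square root of stiffness, so a measured frequency drop maps directly to a stiffness reduction; a minimal worked sketch with illustrative frequencies follows (this is the textbook relation, not the authors' exponential breathing-crack model):

# Minimal sketch: stiffness ratio from a frequency shift, k_d/k_0 = (f_d/f_0)^2.
f_undamaged = 12.40   # Hz, hypothetical first natural frequency before loading
f_damaged = 11.15     # Hz, hypothetical frequency after a load step

stiffness_ratio = (f_damaged / f_undamaged) ** 2
print(f"residual stiffness: {100 * stiffness_ratio:.1f}% "
      f"(reduction of {100 * (1 - stiffness_ratio):.1f}%)")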
Procedia PDF Downloads 183
2101 Mutation Analysis of the ATP7B Gene in 43 Vietnamese Wilson’s Disease Patients
Authors: Huong M. T. Nguyen, Hoa A. P. Nguyen, Mai P. T. Nguyen, Ngoc D. Ngo, Van T. Ta, Hai T. Le, Chi V. Phan
Abstract:
Wilson’s disease (WD) is an autosomal recessive disorder of copper metabolism caused by mutations in the gene encoding the copper-transporting P-type ATPase (ATP7B). The mechanism of this disease is the failure of hepatic excretion of copper into bile, which leads to copper deposits in the liver and other organs. The ATP7B gene is located on the long arm of chromosome 13 (13q14.3). This study aimed to investigate gene mutations in Vietnamese patients with WD and to make presymptomatic diagnoses for their family members. Forty-three WD patients and their 65 siblings were screened for ATP7B gene mutations. Genomic DNA was extracted from peripheral blood samples; the 21 exons and exon-intron boundaries of the ATP7B gene were analyzed by direct sequencing. Four novel mutations ([R723=; H724Tfs*34], V1042Cfs*79, D1027H, and IVS6+3A>G) were identified among the 20 mutations detected, which together accounted for 87.2% of the total. The mutation S105* occurred at a high rate (32.6%) in this study. The hotspot regions of ATP7B were found at exons 2, 16, and 8 and intron 14, in 39.6%, 11.6%, 9.3%, and 7% of patients, respectively. Among nine homozygous/compound heterozygous siblings of the WD patients, three individuals were identified as asymptomatic by screening for the probands’ mutations; they could begin treatment upon diagnosis. In conclusion, 20 different mutations were detected in 43 WD patients, four of which were novel: [R723=; H724Tfs*34], V1042Cfs*79, D1027H, and IVS6+3A>G. The mutation S105* is the most prevalent and may be considered a biomarker for a rapid detection assay in the diagnosis of WD patients. Exons 2, 8, and 16 and intron 14 should be screened first in Vietnamese WD patients. Given the risk profile of WD, genetic testing of presymptomatic patients is also useful for diagnosis and treatment. Keywords: ATP7B gene, mutation detection, presymptomatic diagnosis, Vietnamese Wilson’s disease
Procedia PDF Downloads 378
2100 Detection of Patient Roll-Over Using High-Sensitivity Pressure Sensors
Authors: Keita Nishio, Takashi Kaburagi, Yosuke Kurihara
Abstract:
Recent advances in medical technology have increased average life expectancy; however, the total time for which patients are prescribed complete bed rest has also increased. Since maintaining a constant lying posture can cause pressure ulcers (bedsores), the development of a system to detect patient roll-over becomes imperative. For this purpose, extant studies have proposed the use of cameras, and favorable results have been reported. Continuous on-camera monitoring, however, tends to violate patient privacy. We have previously proposed an unconstrained bio-signal measurement system that can detect body motion during sleep without violating the patient’s privacy. In this study, we therefore propose a roll-over detection method based on the data obtained from this bio-signal measurement system. Signals recorded by the sensor were assumed to comprise respiration, pulse, body-motion, and noise components. Compared with the respiration and pulse components, the body-motion component generates large vibrations during roll-over, so analyzing it facilitates detection of roll-over events. The large vibration associated with the roll-over motion strongly affects the root mean square (RMS) value of the body-motion component, calculated over short 10 s segments of the time series. The RMS value of each segment was compared to a threshold value set in advance; if the RMS value in any segment exceeded the threshold, the corresponding data were considered to indicate the occurrence of a roll-over (see the sketch below). In order to validate the proposed method, we conducted an experiment. A bi-directional microphone was adopted as a high-sensitivity pressure sensor and placed between the mattress and the bed frame. Recorded signals passed through an analog band-pass filter (BPF) operating over the 0.16-16 Hz bandwidth; the BPF passes the respiration, pulse, and body-motion components whilst removing the noise component. The output of the BPF was A/D converted at a sampling frequency of 100 Hz, and the measurement time was 480 seconds. Five subjects participated, yielding ten recordings in total. Subjects lay on a mattress in the supine position and, upon the investigator’s instruction, rolled over into four different positions: supine to left lateral, left lateral to prone, prone to right lateral, and right lateral to supine. The recorded data were divided into 48 segments of 10 s each, and the RMS value of each segment was calculated. The system was evaluated by the agreement between the investigator’s instructions and the detected segments, and an accuracy of 100% was achieved. While reviewing the time series of the recorded data, segments indicating roll-over showed a large amplitude; however, clear differences between decubitus positions and the roll-over motion could not be confirmed. Extant camera-based approaches have a disadvantage in terms of patient privacy, whereas the proposed method detects patient roll-over precisely without violating it. As a future prospect, estimating the decubitus position before and after roll-over could be attempted; since clear differences between decubitus positions and the roll-over motion could not be confirmed in this paper, future studies could utilize the respiration and pulse components. Keywords: bedsore, high-sensitivity pressure sensor, roll-over, unconstrained bio-signal measurement
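To make the detection step concrete, the following minimal sketch (not the authors’ code) implements the segment-wise RMS thresholding described above. A digital Butterworth filter stands in for the analog 0.16-16 Hz BPF, and the signal, threshold, and filter order are hypothetical:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100           # sampling frequency (Hz), as in the abstract
SEG_LEN = 10 * FS  # 10 s segments

def bandpass(signal, low=0.16, high=16.0, fs=FS, order=4):
    """Digital stand-in for the analog 0.16-16 Hz band-pass filter."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    return filtfilt(b, a, signal)

def detect_rollover(signal, threshold):
    """Return indices of 10 s segments whose RMS exceeds the threshold."""
    filtered = bandpass(signal)
    n_seg = len(filtered) // SEG_LEN
    segments = filtered[: n_seg * SEG_LEN].reshape(n_seg, SEG_LEN)
    rms = np.sqrt(np.mean(segments**2, axis=1))
    return np.flatnonzero(rms > threshold)

# Hypothetical usage: a 480 s recording yields 48 segments.
raw = np.random.randn(480 * FS) * 0.01               # stand-in for sensor output
raw[200 * FS : 203 * FS] += np.random.randn(3 * FS)  # simulated roll-over burst
print(detect_rollover(raw, threshold=0.05))          # e.g. [20]
```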
Procedia PDF Downloads 120
2099 Detection and Quantification of Ochratoxin A in Food by Aptasensor
Authors: Moez Elsaadani, Noel Durand, Brice Sorli, Didier Montet
Abstract:
Governments and international bodies are trying to improve food safety systems in order to prevent, reduce, or avoid the increase of foodborne diseases. This food risk is one of the major concerns for humanity. Contamination by mycotoxins is a threat to the health and lives of humans and animals. One of the most common mycotoxins contaminating feed and foodstuffs is Ochratoxin A (OTA), a secondary metabolite produced by Aspergillus and Penicillium strains. OTA has a chronic toxic effect and has proved to be mutagenic, nephrotoxic, teratogenic, immunosuppressive, and carcinogenic. On the other hand, because of their high stability, specificity, and affinity, and their easy chemical synthesis, aptamer-based methods are being applied to OTA biosensing as an alternative to traditional analytical techniques. In this work, five aptamers were tested to confirm, qualitatively and quantitatively, their binding with OTA. At the same time, three different analytical methods were tested and compared based on their ability to detect and quantify OTA. The best protocol established to separate free OTA from aptamer-bound OTA involved an ultrafiltration method in green coffee solution. OTA was quantified by HPLC-FLD to calculate the binding percentage of all five aptamers (see the formula below). The most effective aptamer (87% binding with OTA) was selected as the biorecognition element, and its electrical response (variation of electrical properties) in the presence of OTA is being studied so that it can be paired with a radio-frequency identification (RFID) device. This device, characterized by its low cost, speed, and simple wireless information transmission, will combine knowledge of molecular sensors for mycotoxins (aptamers) with an electronic device that links and quantifies the information and makes it available to operators. Keywords: aptamer, aptasensor, detection, Ochratoxin A
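For reference, the binding percentage implied by this ultrafiltration protocol can be written out explicitly. This is the standard relation for such assays, under the assumption that HPLC-FLD measures the free (unbound) OTA concentration in the filtrate:

```latex
% Binding percentage from an ultrafiltration assay (standard relation,
% assuming C_total is the spiked OTA and C_free the concentration
% measured in the filtrate by HPLC-FLD):
\[
  B\,(\%) = \frac{C_{\mathrm{total}} - C_{\mathrm{free}}}{C_{\mathrm{total}}} \times 100
\]
% e.g. C_total = 100 ng/mL and C_free = 13 ng/mL give B = 87%,
% consistent with the best aptamer reported above.
```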
Procedia PDF Downloads 179
2098 Telemedicine Services in Ophthalmology: A Review of Studies
Authors: Nasim Hashemi, Abbas Sheikhtaheri
Abstract:
Telemedicine is the use of telecommunication and information technologies to provide health care services, which would often not be consistently available otherwise, to people in remote rural communities. Teleophthalmology is a branch of telemedicine that delivers eye care through digital medical equipment and telecommunications technology. Teleophthalmology can thus overcome geographical barriers and improve the quality, access, and affordability of eye health care services. Since teleophthalmology has been widely applied in recent years, the aim of this study was to determine its different applications worldwide. To this end, three bibliographic databases (Medline, ScienceDirect, Scopus) were comprehensively searched with these keywords: eye care, eye health care, primary eye care, diagnosis, detection, and screening of different eye diseases, in conjunction with telemedicine, telehealth, teleophthalmology, e-services, and information technology. All types of papers were included in the study with no time restriction, and the search covered the period up to 2015. Finally, 70 articles were surveyed. We classified the results based on the ‘type of eye problems covered’ and the ‘type of telemedicine services’. From the perspective of health care levels, there are three levels of eye health care: primary, secondary, and tertiary eye care. From the perspective of eye care services, the main application of teleophthalmology in primary eye care was the diagnosis of eye diseases such as diabetic retinopathy, macular edema, strabismus, and age-related macular degeneration. The main application in secondary and tertiary eye care was the screening of eye problems, i.e., diabetic retinopathy, astigmatism, and glaucoma screening. Teleconsultation between health care providers and ophthalmologists, as well as education and training sessions for patients, were other types of teleophthalmology services worldwide. Real-time, store-and-forward, and hybrid methods were the main forms of communication, used according to the IT infrastructure between the sending and receiving centers. For specialists, early detection of serious age-related ophthalmic disease in the population, screening of eye disease processes, consultation in emergency cases, and comprehensive eye examination were the most important benefits of teleophthalmology. For patients, the main advantages were the cost-effectiveness of teleophthalmology projects, resulting from reduced transportation and accommodation costs, access to affordable eye care services, and the possibility of receiving specialist opinions. Teleophthalmology brings valuable secondary and tertiary care to remote areas. Applying teleophthalmology for detection, treatment, and screening purposes, and expanding its use in new applications such as eye surgery, will be a key tool in promoting public health and integrating eye care into primary health care. Keywords: applications, telehealth, telemedicine, teleophthalmology
Procedia PDF Downloads 373
2097 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EVs), low-energy-consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several notable features, such as high temporal resolution (up to 1 Mframe/s) and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity: the sensor only captures pixels whose intensity changes, and produces no signal in areas without any intensity change. This makes it more energy efficient than conventional sensors such as RGB cameras, because redundant data are never produced. On the other hand, the data are difficult to handle because their format is completely different from an RGB image: the acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a timestamp; it carries no intensity such as RGB values. Since existing algorithms therefore cannot be used directly, a new processing algorithm has to be designed to cope with DVS data. To overcome these data-format differences, most prior art accumulates events into frames and feeds them to deep learning models such as convolutional neural networks (CNNs) for object detection and recognition. However, even then it is difficult to achieve good performance, due to the lack of intensity information. Although polarity is often used as intensity in place of an RGB pixel value, polarity information is clearly not rich enough. In this context, we propose using the timestamp information as the data representation fed to deep learning. Concretely, we first build frames over a fixed time period and then assign each pixel an intensity according to the timestamp of its event; for example, a recent event receives a high value (a sketch of this construction follows the abstract). We expect this representation to capture the features of moving objects in particular, because the timestamps encode movement direction and speed. Using the proposed method, we built our own dataset with a DVS mounted on a parked car, to develop an application for a surveillance system that can detect persons around the car. We consider the DVS one of the ideal sensors for surveillance because it can run for a long time with low energy consumption in a static scene. For comparison, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours but feeds polarity information to the CNN. We then measured the object detection performance of the benchmark and of our method on the same dataset. As a result, our method exceeded the benchmark by up to 7 points in F1 score. Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
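A minimal sketch of the timestamp-based frame construction described above follows. The abstract does not specify the normalization, so a linear recency weighting is assumed here, and the (x, y, t, p) event layout and sensor size are hypothetical:

```python
import numpy as np

def timestamp_frame(events, t_start, t_end, height, width):
    """Build a frame whose pixel values encode event recency.

    `events` is assumed to be an array of (x, y, t, p) rows, as described
    in the abstract; the linear recency weighting in [0, 1] is one
    plausible choice, since the abstract does not specify it.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    # Process events in timestamp order so the most recent event at each
    # pixel wins; polarity p is ignored in this representation.
    for x, y, t, p in events[np.argsort(events[:, 2])]:
        if t_start <= t < t_end:
            frame[int(y), int(x)] = (t - t_start) / (t_end - t_start)
    return frame

# Hypothetical usage: three events within a 10 ms window on a 4x4 sensor.
evts = np.array([[0, 0, 0.001, 1], [1, 2, 0.005, -1], [1, 2, 0.009, 1]])
print(timestamp_frame(evts, 0.0, 0.010, 4, 4))  # pixel (2, 1) holds 0.9
```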
Procedia PDF Downloads 97