Search results for: forest fire detection
3301 Unsupervised Neural Architecture for Saliency Detection
Authors: Natalia Efremova, Sergey Tarasenko
Abstract:
We propose a novel neural network architecture for visual saliency detection which utilizes neurophysiologically plausible mechanisms for the extraction of salient regions. The model is strongly inspired by recent findings in neurophysiology and aims to simulate the bottom-up processes of human selective attention. Two types of features were analyzed: color and direction of maximum variance. The mechanism we employ for processing those features is PCA, implemented by means of normalized Hebbian learning and waves of spikes. To evaluate the performance of our model, we conducted a psychological experiment. Comparison of the simulation results with those of the experiment indicates good performance of our model.
Keywords: neural network models, visual saliency detection, normalized Hebbian learning, Oja's rule, psychological experiment
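The PCA-by-normalized-Hebbian-learning mechanism named in the abstract can be sketched with Oja's rule, whose repeated updates drive a linear neuron's weight vector to the unit principal eigenvector of the input covariance, i.e. the direction of maximum variance. The toy data and learning rate below are illustrative assumptions, not the authors' settings.

```python
# Oja's rule (normalized Hebbian learning): w <- w + eta * y * (x - y * w),
# with neuron output y = w . x. Repeated updates converge to the unit
# principal eigenvector of the input covariance (direction of max variance).

def oja_step(w, x, eta):
    y = sum(wi * xi for wi, xi in zip(w, x))        # neuron output y = w . x
    return [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]

def principal_direction(samples, eta=0.01, epochs=500):
    w = [1.0, 0.0]                                   # arbitrary initial weights
    for _ in range(epochs):
        for x in samples:
            w = oja_step(w, x, eta)
    return w

# Zero-mean toy data lying exactly along the direction (2, 1).
samples = [(2.0, 1.0), (-2.0, -1.0), (4.0, 2.0), (-4.0, -2.0)]
w = principal_direction(samples)
# w is approximately the unit vector (2, 1)/sqrt(5), i.e. (0.894, 0.447)
```

Note how the `- y * w` term normalizes the weights implicitly; plain Hebbian learning (`w += eta * y * x`) would grow without bound.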
Procedia PDF Downloads 350
3300 An Experimental Study on the Thermal Properties of Concrete Aggregates in Relation to Their Mineral Composition
Authors: Kyung Suk Cho, Heung Youl Kim
Abstract:
Analysis of the petrologic characteristics and thermal properties of crushed aggregates for concrete, such as granite, gneiss, dolomite, shale and andesite, found that the rock-forming minerals determine the thermal properties of the aggregates. The thermal expansion coefficients of aggregates containing abundant quartz increased rapidly at 573 degrees due to the quartz transition. The mass of aggregate containing carbonate minerals decreased rapidly at 750 degrees due to decarbonation, while its specific heat capacity increased relatively. The mass of aggregates containing hydrated silicate minerals decreased more significantly, and their specific heat capacities were greater compared with aggregates containing feldspar or quartz. It is deduced that the hydroxyl group (OH) in the hydrated silicates dissolved as its bond became loose at high temperatures. Aggregates containing mafic minerals turned red at high temperatures due to oxidation reactions. Moreover, a comparison of cooling methods showed that rapid cooling using water resulted in more reduction in aggregate mass than slow cooling at room temperature. In order to observe the fire resistance performance of concrete made with the same aggregates as coarse aggregate, the mass loss and compressive strength reduction factor at 200, 400, 600 and 800 degrees were measured. It was found from the analysis of granite and gneiss that the difference in thermal expansion coefficients between cement paste and aggregates caused by the quartz transition at 573 degrees resulted in thermal stress inside the concrete and thus triggered concrete cracking. The ferromagnesian hydrated silicates in andesite and shale caused greater reduction in both initial stiffness and mass compared with other aggregates. However, the thermal expansion coefficient of andesite and shale was similar to that of cement paste. Since they were low in thermal conductivity and high in specific heat capacity, concrete cracking was relatively less severe. 
Being slow in heat transfer, they were judged to be materials of high heat capacity.
Keywords: crushed aggregates, fire resistance, thermal expansion, heat transfer
Procedia PDF Downloads 229
3299 Lung Cancer Detection and Multi-Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years; early diagnosis, detection and prediction widen the treatment options beyond risky invasive surgery and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for earlier detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) for image enhancement gives the best results. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid and eccentricity) are taken for the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; feature extraction provides the Region of Interest (ROI) given as input to the classifier. Two levels of classification are employed: K-Nearest Neighbor (KNN) is used for determining whether the patient's condition is normal or abnormal, while an Artificial Neural Network (ANN) is used for identifying the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technology shows encouraging results for real-time information and online detection in future research.
Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI
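The DWT feature-extraction step can be illustrated with a single-level 1-D Haar decomposition; this is a generic sketch of the transform, not the authors' MATLAB pipeline, and the input row of intensities is made up.

```python
import math

def haar_dwt(signal):
    """Single-level 1-D Haar DWT: returns (approximation, detail) coefficients.
    The approximation carries smoothed content; the detail carries edges,
    which is what makes DWT coefficients useful as tumor/texture features."""
    assert len(signal) % 2 == 0, "signal length must be even"
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

row = [4.0, 6.0, 10.0, 12.0]          # e.g. one row of pixel intensities
approx, detail = haar_dwt(row)
```

The transform is orthonormal, so the total energy of the coefficients equals that of the input, a handy sanity check when building feature vectors.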
Procedia PDF Downloads 156
3298 Faster, Lighter, More Accurate: A Deep Learning Ensemble for Content Moderation
Authors: Arian Hosseini, Mahmudul Hasan
Abstract:
To address the increasing need for efficient and accurate content moderation, we propose an efficient and lightweight deep classification ensemble structure. Our approach is based on a combination of simple visual features, designed for high-accuracy classification of violent content with low false positives. Our ensemble architecture utilizes a set of lightweight models with narrowed-down color features, and we apply it to both images and videos. We evaluated our approach using a large dataset of explosion and blast contents and compared its performance to popular deep learning models such as ResNet-50. Our evaluation results demonstrate significant improvements in prediction accuracy, while benefiting from 7.64x faster inference and lower computation cost. While our approach is tailored to explosion detection, it can be applied to other similar content moderation and violence detection use cases as well. Based on our experiments, we propose a "think small, think many" philosophy in classification scenarios. We argue that transforming a single, large, monolithic deep model into a verification-based step model ensemble of multiple small, simple, and lightweight models with narrowed-down visual features can possibly lead to predictions with higher accuracy.
Keywords: deep classification, content moderation, ensemble learning, explosion detection, video processing
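One reading of the "verification-based step model ensemble" idea is a cascade: a sample is flagged only if every small model in turn agrees, which suppresses false positives and lets easy negatives exit after one cheap check. The threshold "models" and color-feature names below are fabricated placeholders, purely illustrative of the structure.

```python
def make_threshold_check(feature, threshold):
    """A 'small model': fires if one narrowed-down feature exceeds a threshold."""
    return lambda sample: sample[feature] > threshold

def cascade_predict(sample, checks):
    """Flag positive only if every verifier agrees; all() short-circuits at the
    first rejection, so easy negatives cost a single cheap model evaluation."""
    return all(check(sample) for check in checks)

# Hypothetical narrowed-down color features for explosion-like frames.
checks = [
    make_threshold_check("mean_red", 0.55),     # explosions skew red/orange
    make_threshold_check("brightness", 0.60),   # and are bright
    make_threshold_check("flicker", 0.30),      # and change quickly frame-to-frame
]

frame_a = {"mean_red": 0.8, "brightness": 0.9, "flicker": 0.7}   # explosion-like
frame_b = {"mean_red": 0.8, "brightness": 0.2, "flicker": 0.1}   # just a red wall
flag_a = cascade_predict(frame_a, checks)
flag_b = cascade_predict(frame_b, checks)
```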
Procedia PDF Downloads 57
3297 Clustered Regularly Interspaced Short Palindromic Repeat/Cas9-Based Lateral Flow and Fluorescence Diagnostics for Rapid Pathogen Detection
Authors: Mark Osborn
Abstract:
Clustered, regularly interspaced short palindromic repeat (CRISPR/Cas) proteins can be designed to bind specified DNA and RNA sequences and hold great promise for the accurate detection of nucleic acids for diagnostics. Commercially available reagents were integrated into a CRISPR/Cas9-based lateral flow assay that can detect severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequences with single-base specificity. This approach requires minimal equipment and represents a simplified platform for field-based deployment. A rapid, multiplex fluorescence CRISPR/Cas9 nuclease cleavage assay capable of detecting and differentiating SARS-CoV-2, influenza A and B, and respiratory syncytial virus in a single reaction was also developed. These findings provide proof of principle for CRISPR/Cas9 point-of-care diagnosis that can detect specific SARS-CoV-2 strain(s). Further, Cas9 cleavage allows for a scalable fluorescent platform for identifying respiratory viral pathogens with overlapping symptomology. Collectively, this approach is a facile platform for diagnostics with broad application to user-defined sequence interrogation and detection.
Keywords: CRISPR/Cas9, lateral flow assay, SARS-CoV-2, single-nucleotide resolution
Procedia PDF Downloads 185
3296 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning
Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza
Abstract:
The following work presents a proposal for the autonomous navigation of mobile robots, implemented on an omnidirectional Kuka Youbot robot. We have been able to integrate the Robot Operating System (ROS) and machine learning algorithms. ROS mainly provides two distributions, ROS Hydro and ROS Kinetic. ROS Hydro manages the nodes for odometry, kinematics, and path planning with statistical and probabilistic, global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra's algorithm. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects which can lie on points of the planned trajectory, obstructing the path of the Kuka Youbot. Detection is managed by an artificial vision module under a trained neural network based on the Single Shot Multibox Detector (SSD), where the main dynamic objects for detection are human beings and domestic animals, among other objects. When objects are detected, the system modifies the trajectory or waits for the dynamic obstacle to move. Finally, the obstacles are removed from the planned trajectory, and the Kuka Youbot can reach its goal thanks to the machine learning algorithms.
Keywords: autonomous navigation, machine learning, path planning, robotic operative system, open source computer vision library
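The global planner mentioned above relies on Dijkstra's algorithm; a minimal grid version (obstacles marked 1, 4-connected moves, unit cost) looks roughly like this. The occupancy map is invented for illustration, not taken from the paper.

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Shortest path cost on a 4-connected occupancy grid (1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                      # stale queue entry, skip it
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None                           # goal unreachable

occupancy = [                             # 0 = free, 1 = obstacle
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
cost = dijkstra_grid(occupancy, (0, 0), (2, 0))   # detour around the wall
```

In the paper's setup, a detected dynamic obstacle would simply flip cells to 1 and trigger a replan.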
Procedia PDF Downloads 178
3295 Study on an Integrated Real-Time Sensor in Droplet-Based Microfluidics
Authors: Tien-Li Chang, Huang-Chi Huang, Zhao-Chi Chen, Wun-Yi Chen
Abstract:
Droplet-based microfluidic devices are used as micro-reactors for chemical and biological assays. Hence, the precise addition of reagents into the droplets is essential for this function in the scope of lab-on-a-chip applications. To obtain the characteristics of droplets (size, velocity, pressure, and frequency of production), this study describes an integrated on-chip method of real-time signal detection. By controlling and manipulating the fluids, the flow behavior can be obtained in the droplet-based microfluidics. The detection method uses a type of infrared sensor. From the varieties of droplets in the microfluidic devices, the real-time conditions of velocity and pressure are obtained from the sensors. Here the microfluidic devices are fabricated from polydimethylsiloxane (PDMS). To measure the droplets, sensor signal acquisition and LabVIEW program control must be established in the microchannel devices. The devices can generate droplets of different sizes, where the flow rate of the oil phase is fixed at 30 μl/hr and the flow rate of the water phase ranges from 20 μl/hr to 80 μl/hr. The experimental results demonstrate that the sensors are able to measure the time difference of droplets at different velocities at voltages from 0 V to 2 V. Consequently, a maximum droplet speed of 1.6 mm/s and the related flow behaviors were measured, which can help develop and integrate practical microfluidic applications.
Keywords: microfluidic, droplets, sensors, single detection
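The velocity and production-frequency readouts described above follow from simple timing: velocity is the sensing-point spacing divided by the droplet's transit time, and frequency is the reciprocal of the droplet-to-droplet period. The spacing, transit time and period below are hypothetical numbers, not measurements from the paper.

```python
def droplet_metrics(sensor_spacing_mm, transit_time_s, period_s):
    """Velocity from the transit time between two sensing points of known
    spacing, and production frequency from the droplet-to-droplet period."""
    velocity_mm_per_s = sensor_spacing_mm / transit_time_s
    frequency_hz = 1.0 / period_s
    return velocity_mm_per_s, frequency_hz

# Hypothetical readings: 0.8 mm sensor spacing, 0.5 s transit, 2 s period.
velocity, frequency = droplet_metrics(0.8, 0.5, 2.0)
```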
Procedia PDF Downloads 495
3294 Gassing Tendency of Natural Ester Based Transformer Oils: Low Alkane Generation in Stray Gassing Behaviour
Authors: Thummalapalli CSM Gupta, Banti Sidhiwala
Abstract:
Mineral oils of naphthenic and paraffinic type have traditionally been used as insulating liquids in transformer applications to protect the solid insulation from moisture and to ensure effective heat transfer/cooling. The performance of these types of oils has been proven in the field over many decades, and transformer condition has been successfully monitored and diagnosed through oil properties and dissolved gas analysis methods. Different types of gases effectively represent various types of faults due to components or operating conditions. A large database on dissolved gas analysis for mineral oil based transformer oils has been generated in the industry, along with various models for fault prediction and analysis, and oil specifications and standards have also been modified to include stray gassing limits, which cover low-temperature faults and serve as an effective preventive maintenance tool: knowing the reasons for the breakdown of electrical insulating materials and related components can be of great benefit. Natural esters have seen a rise in popularity in recent years due to their "green" credentials. Some of their benefits include biodegradability, a higher fire point, improvement in the load capability of the transformer, and improved solid insulation life compared with mineral oils. However, the evolution of stray gases such as hydrogen and hydrocarbons like methane (CH4) and ethane (C2H6) shows very high values, much higher than the limits of the mineral oil standards. Though standards for these types of esters are yet to evolve, the high hydrocarbon gas values of the products available in the market are of concern and might be interpreted as a fault in transformer operation. 
The current paper focuses on developing a natural ester based transformer oil that, by standard test methods, shows much lower levels of stray gassing than the products currently available; experimental results under various test conditions are presented and the underlying mechanism is explained.
Keywords: biodegradability, fire point, dissolved gas analysis, stray gassing
Procedia PDF Downloads 99
3293 Desertification of Earth and Reverting Strategies
Authors: V. R. Venugopal
Abstract:
Human beings evolved 200,000 years ago in an area which is now the Sahara desert and lived all along in the northern part of Africa. It was around 10,000 to 15,000 years ago that they moved out of Africa. Various ancient civilizations, mainly the Egyptian, Mesopotamian, Indus valley and the Chinese Yellow River valley civilizations, developed and perished before the beginning of the Christian era. Strangely, the regions where all these civilizations flourished are now deserts. After the ancient civilizations, two major religions of the world, Christianity and Islam, evolved. These too evolved in the regions of Jerusalem and Mecca, which now lie in the deserts of present-day Israel and Saudi Arabia. Human activity from its very origin has thus centered on areas which are now deserts. This is only because wherever Man lived in large numbers he turned the land into desert. Unfortunately, this is not the case with the ancient ages alone. Over the last 500 years the forest cover of the earth has been reduced by 80 percent. Even more recently, just over the last four decades, the human population has doubled but the number of bugs, beetles, worms and butterflies (micro fauna) has declined by 45%. Deforestation and defaunation are the first signs of desertification, and desertification is a process parallel to the extinction of life. There is every possibility that soon most of the earth will be desert. This writer has been involved in forestation and the increase of fauna as a profession for twenty years, and this is a report of the efforts made in the process, the results obtained and the concepts generated to revert the ongoing desertification of this earth. This paper highlights how desertification can be reverted by applying these basic principles: 1) Man is not the owner of this earth and has no right to destroy vegetation and micro fauna. 2) Landowners shall not have the freedom to do anything they wish with the land. 
3) The land that is under agriculture shall be reduced by at least half. 4) Irrigation and modern technology shall be used for forest growth as well. 5) Farms shall have substantial permanent vegetation, and the practice of 'all in, all out' shall stop.
Keywords: desertification, extinction, micro fauna, reverting
Procedia PDF Downloads 313
3292 Application of Support Vector Machines in Fault Detection and Diagnosis of Power Transmission Lines
Authors: I. A. Farhat, M. Bin Hasan
Abstract:
A developed approach for the protection of power transmission lines using the Support Vector Machines (SVM) technique is presented. In this paper, the SVM technique is utilized for the classification and isolation of faults in power transmission lines. Accurate fault classification and location results are obtained for all possible types of short circuit faults. As in distance protection, the approach utilizes the post-fault voltage and current samples as inputs. The main advantage of the method introduced here is that it can easily be extended to any power transmission line.
Keywords: fault detection, classification, diagnosis, power transmission line protection, support vector machines (SVM)
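A minimal sense of the SVM classification step can be given by a linear soft-margin SVM trained with hinge-loss sub-gradient descent (a Pegasos-style scheme, not the authors' implementation). The two "fault classes" and their 2-D features are fabricated stand-ins for the paper's post-fault voltage/current samples.

```python
def train_linear_svm(data, labels, lam=0.01, eta=0.1, epochs=200):
    """Hinge-loss sub-gradient descent for a linear SVM: minimize
    lam/2 * ||w||^2 + mean(max(0, 1 - y * (w.x + b))), labels y in {-1, +1}."""
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            w = [(1 - eta * lam) * wi for wi in w]        # regularization decay
            if margin < 1:                                # inside margin: push out
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

# Toy 2-D features for two fault classes (entirely made up).
X = [(2.0, 2.0), (3.0, 1.0), (2.5, 2.5), (-2.0, -1.0), (-3.0, -2.0), (-1.0, -2.0)]
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
preds = [predict(w, b, x) for x in X]
```

A multi-class fault classifier (the short-circuit types the abstract mentions) would typically combine several such binary SVMs one-vs-rest.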
Procedia PDF Downloads 561
3291 The Benefits of Security Culture for Improving Physical Protection Systems at Detection and Radiation Measurement Laboratory
Authors: Ari S. Prabowo, Nia Febriyanti, Haryono B. Santosa
Abstract:
The security function, known as a Physical Protection System (PPS), serves to detect, delay and respond. The Physical Protection System in the Detection and Radiation Measurement Laboratory needs to be improved continually by using internal resources. The nuclear security culture offers some potential to support this research. The study starts by identifying the weaknesses of the security function and the strengths of the security culture. Secondly, the strengths of the security culture are implemented in the laboratory management. Finally, a simulation was done to measure their effectiveness. Some changes occurred in laboratory personnel behaviors and procedures; all became more prudent. The results showed a good influence of nuclear security culture on laboratory security functions.
Keywords: laboratory, physical protection system, security culture, security function
Procedia PDF Downloads 190
3290 Cyber-Med: Practical Detection Methodology of Cyber-Attacks Aimed at Medical Devices Eco-Systems
Authors: Nir Nissim, Erez Shalom, Tomer Lancewiki, Yuval Elovici, Yuval Shahar
Abstract:
Background: A Medical Device (MD) is an instrument, machine, implant, or similar device that includes a component intended for the diagnosis, cure, treatment, or prevention of disease in humans or animals. Medical devices play increasingly important roles in health services eco-systems, including: (1) patient diagnostics and monitoring; (2) medical treatment and surgery; and (3) patient life support devices and stabilizers. MDs are part of the medical device eco-system and are connected to the network, sending vital information to the internal medical information systems of medical centers that manage this data. Wireless components (e.g. Wi-Fi) are often embedded within medical devices, enabling doctors and technicians to control and configure them remotely. All these functionalities, roles, and uses of MDs make them attractive targets of cyber-attacks launched for many malicious goals; this trend is likely to increase significantly over the next several years, with increased awareness regarding MD vulnerabilities, the enhancement of potential attackers' skills, and expanded use of medical devices. Significance: We propose to develop and implement Cyber-Med, a unique collaborative project of Ben-Gurion University of the Negev and the Clalit Health Services Health Maintenance Organization. Cyber-Med focuses on the development of a comprehensive detection framework that relies on a critical attack repository that we aim to create. Cyber-Med will allow researchers and companies to better understand the vulnerabilities of and attacks on medical devices, as well as providing a comprehensive platform for developing detection solutions. Methodology: The Cyber-Med detection framework will consist of two independent, but complementary detection approaches: one for known attacks, and the other for unknown attacks. 
These modules incorporate novel ideas and algorithms inspired by our team's domains of expertise, including cyber security, biomedical informatics, advanced machine learning, and temporal data mining techniques. The establishment and maintenance of Cyber-Med's up-to-date attack repository will strengthen the capabilities of Cyber-Med's detection framework. Major Findings: Based on our initial survey, we have already found more than 15 types of vulnerabilities and possible attacks aimed at MDs and their eco-system. Many of these attacks target individual patients who use devices such as pacemakers and insulin pumps. In addition, such attacks are also aimed at MDs that are widely used by medical centers, such as MRIs, CTs, and dialysis machines; the information systems that store patient information; protocols such as DICOM; standards such as HL7; and medical information systems such as PACS. However, current detection tools, techniques, and solutions generally fail to detect both the known and unknown attacks launched against MDs. Very little research has been conducted to protect these devices from cyber-attacks, since most of the development and engineering efforts are aimed at the devices' core medical functionality, the contribution to patients' healthcare, and the business aspects associated with the medical device.
Keywords: medical device, cyber security, attack, detection, machine learning
Procedia PDF Downloads 358
3289 Landscape Pattern Evolution and Optimization Strategy in Wuhan Urban Development Zone, China
Abstract:
With the rapid development of urbanization in China, environmental protection is under severe pressure, so analyzing and optimizing the landscape pattern is an important measure to ease the pressure on the ecological environment. This paper takes the Wuhan Urban Development Zone as the research object and studies its landscape pattern evolution and quantitative optimization strategy. First, remote sensing image data from 1990 to 2015 were interpreted using Erdas software. Next, the landscape pattern indices at the landscape, class, and patch levels were studied based on Fragstats. Then five indicators of the ecological environment, based on the National Environmental Protection Standard of China, were selected to evaluate the impact of landscape pattern evolution on the ecological environment. Besides, the cost distance analysis of ArcGIS was applied to simulate wildlife migration, thus indirectly measuring the improvement of ecological environment quality. The results show that the area of construction land increased by 491%, while bare land, sparse grassland, forest, farmland and water decreased by 82%, 47%, 36%, 25% and 11% respectively; they were mainly converted into construction land. On the landscape level, the landscape indices all showed a downward trend: the Number of Patches (NP), Landscape Shape Index (LSI), Connection Index (CONNECT), Shannon's Diversity Index (SHDI) and Aggregation Index (AI) decreased by 2778, 25.7, 0.042, 0.6, and 29.2% respectively, all of which indicates that the NP, the degree of aggregation and the landscape connectivity declined. On the class level, for construction land and forest, CPLAND, TCA, AI and LSI ascended, but the Distribution Statistics Core Area (CORE_AM) decreased. As for farmland, water, sparse grassland and bare land, CPLAND, TCA, DIVISION, Patch Density (PD) and LSI descended, yet patch fragmentation and CORE_AM increased. 
On the patch level, the patch area, patch perimeter and shape index of water, farmland and bare land continued to decline. The three indices of forest patches increased overall, those of sparse grassland decreased as a whole, and those of construction land increased. It is obvious that urbanization greatly influenced the landscape evolution: the ecological diversity and landscape heterogeneity of ecological patches clearly dropped, and the Habitat Quality Index continuously declined by 14%. Therefore, an optimization strategy based on greenway network planning is raised for discussion. This paper contributes to the study of landscape pattern evolution in planning and design and to the research on the spatial layout of urbanization.
Keywords: landscape pattern, optimization strategy, ArcGIS, Erdas, landscape metrics, landscape architecture
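The metrics above all start from delineating patches in a classified raster; Fragstats' Number of Patches (NP) can be sketched as 4-connected component labeling per land-cover class. The tiny raster and class codes here are invented for illustration.

```python
from collections import deque

def count_patches(raster, cls):
    """Number of Patches (NP) for one class: count of 4-connected components."""
    rows, cols = len(raster), len(raster[0])
    seen = set()
    patches = 0
    for r in range(rows):
        for c in range(cols):
            if raster[r][c] == cls and (r, c) not in seen:
                patches += 1                      # found a new patch; flood-fill it
                queue = deque([(r, c)])
                seen.add((r, c))
                while queue:
                    cr, cc = queue.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and raster[nr][nc] == cls and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            queue.append((nr, nc))
    return patches

# 'F' = forest, 'C' = construction land, 'W' = water (illustrative raster).
raster = [
    ["F", "F", "C", "W"],
    ["F", "C", "C", "W"],
    ["W", "W", "C", "F"],
]
np_forest = count_patches(raster, "F")            # two separate forest patches
np_construction = count_patches(raster, "C")      # one contiguous patch
```

A rising NP for a class between two dates signals fragmentation; a falling NP with rising class area signals aggregation, which is how such counts feed the trends reported above.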
Procedia PDF Downloads 168
3288 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit
Authors: Davit Mirzoyan, Ararat Khachatryan
Abstract:
A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal, the logical value of which depends on the process corner only. The signal can be used in both digital and analog circuits for testing and compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increased detection accuracy and decreased power consumption and area. Due to its simplicity, the presented circuit can be easily modified to monitor parametric variations of only n-type and p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e. the ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.
Keywords: detection, monitoring, process corner, process variation
Procedia PDF Downloads 526
3287 Fault Diagnosis in Induction Motor
Authors: Kirti Gosavi, Anita Bhole
Abstract:
The paper demonstrates the simulation and steady-state performance of a three-phase squirrel cage induction motor and the detection of broken rotor bar faults using MATLAB. This simulation model is successfully used in detecting broken rotor bars in induction machines. A dynamic model using a PWM inverter and mathematical modelling of the motor is developed. The dynamic simulation of the small-power induction motor is one of the key steps in the validation of the design process of the motor drive system, and it is needed for eliminating inadvertent design errors and the resulting errors in the prototype construction and testing. The simulation model will be helpful in detecting faults in a three-phase induction motor using motor current signature analysis.
Keywords: squirrel cage induction motor, pulse width modulation (PWM), fault diagnosis, induction motor
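Motor current signature analysis locates broken-bar faults through sideband components at f(1 ± 2ks) around the supply frequency, where s is the per-unit slip; a quick calculation of where to look in the current spectrum, with illustrative nameplate numbers rather than the paper's motor data:

```python
def slip(sync_speed_rpm, rotor_speed_rpm):
    """Per-unit slip s = (n_s - n) / n_s."""
    return (sync_speed_rpm - rotor_speed_rpm) / sync_speed_rpm

def broken_bar_sidebands(supply_hz, s, k_max=2):
    """Broken-rotor-bar sideband frequencies f(1 +/- 2ks), k = 1..k_max.
    In a healthy motor these components are negligible; a broken bar
    raises their amplitude in the stator current spectrum."""
    return [(supply_hz * (1 - 2 * k * s), supply_hz * (1 + 2 * k * s))
            for k in range(1, k_max + 1)]

# Illustrative 4-pole, 50 Hz motor: n_s = 1500 rpm, measured speed 1440 rpm.
s = slip(1500.0, 1440.0)                  # 0.04
bands = broken_bar_sidebands(50.0, s)     # [(46.0, 54.0), (42.0, 58.0)]
```

In a full MCSA workflow, one would FFT the simulated stator current and inspect the amplitude at exactly these frequencies.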
Procedia PDF Downloads 633
3286 The Role of Sustainable Financing Models for Smallholder Tree Growers in Ghana
Authors: Raymond Awinbilla
Abstract:
The call for tree planting has long been set in motion by the government of Ghana. The Forestry Commission encourages plantation development through numerous interventions, including formulating policies and enacting legislation. However, forest policies have failed, and that has generated major concern over the vast gap between the intentions of national policies and the realities established. This study addresses three objectives: 1) assessing the farmers' response and contribution to the tree planting initiative, 2) identifying socio-economic factors hindering the development of smallholder plantations as a livelihood strategy, and 3) determining the level of support available for smallholder tree growers and the factors influencing it. The field work was done in 12 farming communities in Ghana. The article shows that farmers have responded to the call for tree planting and have planted both exotic and indigenous tree species. Farmers have converted 17.2% (369.48 ha) of their total land into plantations and have no problem with land tenure. Operational and marketing constraints include lack of funds for operations, delay in payment, low price of wood, manipulation of price by buyers, documentation by buyers, and no ready market for harvested wood products. Environmental institutions encourage tree planting; the only exception is the Lands Commission. Support available to farmers includes capacity building in silvicultural practices, organisation of farmers, and linkage to markets and finance. Efforts by the Government of Ghana to enhance forest resources in the country could rely on the input of local populations.
Keywords: livelihood strategy, marketing constraints, environmental institutions, silvicultural practices
Procedia PDF Downloads 60
3285 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively) as well as 5% damped elastic pseudospectral accelerations at different periods (PSA) are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates potential benefits from employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forest, and Support Vector Machine. The algorithms are adjusted to quantify the event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitude 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The main reason for considering this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. 
The accuracy of the models in predicting intensity measures, the generalization capability of the models for future data, as well as the usability of the models are discussed in the evaluation process. The results indicate the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data is available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data is available.
Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
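The conventional baseline referred to above is a fixed-form regression such as ln(PGA) = a + b·M + c·ln(R); the machine learning alternatives avoid fixing this form in advance. A minimal least-squares fit of such a baseline (coefficients, functional form, and synthetic records are all invented for illustration, solved by Gaussian elimination on the 3x3 normal equations):

```python
import math

def fit_gmpe(records):
    """Least-squares fit of ln(PGA) = a + b*M + c*ln(R) via normal equations."""
    X = [[1.0, m, math.log(r)] for m, r, _ in records]
    z = [math.log(pga) for _, _, pga in records]
    # Build A = X^T X and v = X^T z, then solve the 3x3 system.
    A = [[sum(X[i][p] * X[i][q] for i in range(len(X))) for q in range(3)]
         for p in range(3)]
    v = [sum(X[i][p] * z[i] for i in range(len(X))) for p in range(3)]
    for p in range(3):                         # forward elimination with pivoting
        piv = max(range(p, 3), key=lambda row: abs(A[row][p]))
        A[p], A[piv] = A[piv], A[p]
        v[p], v[piv] = v[piv], v[p]
        for row in range(p + 1, 3):
            f = A[row][p] / A[p][p]
            for q in range(p, 3):
                A[row][q] -= f * A[p][q]
            v[row] -= f * v[p]
    coeffs = [0.0, 0.0, 0.0]
    for p in (2, 1, 0):                        # back substitution
        coeffs[p] = (v[p] - sum(A[p][q] * coeffs[q] for q in (1, 2) if q > p)) / A[p][p]
    return coeffs                              # [a, b, c]

# Synthetic (magnitude, distance_km, PGA) generated exactly from a=-4, b=1.2, c=-1.5.
def pga(m, r):
    return math.exp(-4.0 + 1.2 * m - 1.5 * math.log(r))

records = [(m, r, pga(m, r)) for m in (3.0, 4.0, 5.0, 5.8) for r in (10.0, 50.0, 200.0)]
a, b, c = fit_gmpe(records)                    # recovers roughly (-4.0, 1.2, -1.5)
```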
Procedia PDF Downloads 124
3284 Abnormality Detection of Persons Living Alone Using Daily Life Patterns Obtained from Sensors
Authors: Ippei Kamihira, Takashi Nakajima, Taiyo Matsumura, Hikaru Miura, Takashi Ono
Abstract:
In this research, the goal was the construction of a system in which multiple sensors are used to observe the daily life behavior of persons living alone (while respecting their privacy). This information is used to judge conditions such as poor physical condition or a fall in the home, so that these abnormal conditions can be made known to relatives and third parties. The daily life patterns of persons living alone are expressed by the number of sensor responses within each set time period. By comparing the data against the prior two weeks, it was possible to judge a situation as 'normal' when the person was in a good physical condition or as 'abnormal' when the person was in a bad physical condition.
Keywords: sensors, elderly living alone, abnormality detection, lifestyle habit
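The two-week comparison can be sketched as: count sensor responses per time slot, average the daily totals over the 14 baseline days, and flag a day whose activity drops far below that baseline. The 50% threshold and the counts are invented, not the authors' calibration.

```python
def is_abnormal(baseline_days, today, drop_ratio=0.5):
    """Flag 'abnormal' when today's total sensor responses fall below
    drop_ratio times the mean daily total of the prior two weeks."""
    assert len(baseline_days) == 14, "expects two weeks of baseline data"
    mean_total = sum(sum(day) for day in baseline_days) / len(baseline_days)
    return sum(today) < drop_ratio * mean_total

# Sensor response counts per time slot (morning, afternoon, evening).
baseline = [[12, 15, 9]] * 14          # about 36 responses/day when all is well
normal_day = [11, 14, 10]              # 35 responses: ordinary activity
quiet_day = [3, 2, 1]                  # 6 responses: possible fall or illness
flag_normal = is_abnormal(baseline, normal_day)
flag_quiet = is_abnormal(baseline, quiet_day)
```

A real deployment would compare slot by slot (inactivity at unusual hours matters more than the daily total) and notify relatives only after the flag persists, to limit false alarms.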
Procedia PDF Downloads 255
3283 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate
Authors: Angela Maria Fasnacht
Abstract:
Detection of anomalies due to contaminants’ presence, also known as early detection systems in water treatment plants, has become a critical point that deserves an in-depth study for their improvement and adaptation to current requirements. The design of these systems requires a detailed analysis and processing of the data in real-time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman’s Correlation, Factor Analysis, Cross-Correlation, and k-fold Cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis could be considered a vital step to be able to develop advanced models focused on machine learning that allows optimized data management in real-time, applied to early detection systems in water treatment processes. These techniques facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify the possible correlations between the measured parameters and the presence of the glyphosate contaminant in the single-pass system. The interaction between the initial concentration of glyphosate and the location of the sensors on the reading of the reported parameters was studied.
Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive
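As an illustration of one of the statistical methods named above, here is a small self-contained Spearman's rank correlation; the dose-conductivity readings are invented for the example:

```python
def rankdata(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical conductivity readings rising monotonically with glyphosate dose
dose = [0.0, 0.5, 1.0, 2.0, 4.0]
conductivity = [210, 214, 221, 230, 247]
print(spearman(dose, conductivity))  # 1.0 for a perfectly monotonic relation
```

Because it works on ranks, Spearman's rho detects monotonic (not only linear) sensor-contaminant relations, which is why it suits heterogeneous water-quality parameters.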
Procedia PDF Downloads 125
3282 Real-Time Quantitative Polymerase Chain Reaction Assay for the Detection of microRNAs Using Bi-Directional Extension Sequences
Authors: Kyung Jin Kim, Jiwon Kwak, Jae-Hoon Lee, Soo Suk Lee
Abstract:
MicroRNAs (miRNA) are a class of endogenous, single-stranded, small, non-protein-coding RNA molecules, typically 20-25 nucleotides long. They are thought to broadly regulate the expression of other genes by binding to the 3’-untranslated regions (3’-UTRs) of specific mRNAs. The detection of miRNAs is very important for understanding the function of these molecules and for the diagnosis of a variety of human diseases. However, detection of miRNAs is very challenging because of their short length and high sequence similarities within miRNA families. So, a simple-to-use, low-cost, and highly sensitive method for the detection of miRNAs is desirable. In this study, we demonstrate a novel bi-directional extension (BDE) assay. In the first step, a specific linear RT primer is hybridized to 6-10 base pairs from the 3’-end of a target miRNA molecule and then reverse transcribed to generate a cDNA strand. After reverse transcription, the cDNA was hybridized at its 3’-end to the BDE sequence, which served as the PCR template. The PCR template was amplified in a SYBR green-based quantitative real-time PCR. To prove the concept, we used human brain total RNA. It could be detected quantitatively over a range of seven orders of magnitude with excellent linearity and reproducibility. To evaluate the performance of the BDE assay, we compared its sensitivity and specificity with those of a commercially available poly (A) tailing method, using let-7e miRNA extracted from A549 human epithelial lung cancer cells. The BDE assay displayed good performance compared with the poly (A) tailing method in terms of specificity and sensitivity; the CT values differed by 2.5, and the melting curve was sharper than that of the poly (A) tailing method. We have demonstrated an innovative, cost-effective BDE assay that allows improved sensitivity and specificity in the detection of miRNAs. 
Dynamic range of the SYBR green-based RT-qPCR for miR-145 could be represented quantitatively over a range of 7 orders of magnitude, from 0.1 pg to 1.0 μg of human brain total RNA. Finally, the BDE assay for the detection of miRNA species such as let-7e shows good performance compared with a poly (A) tailing method in terms of specificity and sensitivity. Thus, BDE provides a simple, low-cost, and highly sensitive assay for various miRNAs and should make significant contributions to research on miRNA biology and to the application of disease diagnostics with miRNAs as targets.
Keywords: bi-directional extension (BDE), microRNA (miRNA), poly (A) tailing assay, reverse transcription, RT-qPCR
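As a side calculation of what a seven-order dynamic range implies for CT spacing: assuming near-100% amplification efficiency (the template doubles each cycle, a textbook assumption rather than a value reported here), a 10-fold change in input shifts the CT by log2(10) ≈ 3.32 cycles. A sketch:

```python
import math

def delta_ct(fold_change, efficiency=1.0):
    """Expected CT shift for a given fold change in input template.
    efficiency = 1.0 means perfect doubling each cycle."""
    return math.log(fold_change, 1.0 + efficiency)

# One order of magnitude of input -> about 3.32 cycles at 100% efficiency
print(delta_ct(10))
# The reported 7-order range (0.1 pg to 1.0 ug) spans about 23.3 cycles
print(delta_ct(1e7))
```

This is why a linear standard curve of CT versus log input, with slope near -3.32, is the usual evidence of assay linearity.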
Procedia PDF Downloads 168
3281 Video Heart Rate Measurement for the Detection of Trauma-Related Stress States
Authors: Jarek Krajewski, David Daxberger, Luzi Beyer
Abstract:
Finding objective and non-intrusive measurements of emotional and psychopathological states (e.g., post-traumatic stress disorder, PTSD) is an important challenge. The approach proposed here uses photoplethysmographic imaging (PPGI) applied to facial RGB camera videos to estimate heart rate levels. A pipeline for processing the raw image signal is proposed, containing different preprocessing approaches, e.g., Independent Component Analysis, Non-negative Matrix Factorization, and various other artefact correction approaches. Under resting and constant light conditions, we reached a sensitivity of 84% for pulse peak detection. The results indicate that PPGI can be a suitable solution for providing heart rate data for the non-intrusive assessment of post-traumatic stress states.
Keywords: heart rate, PTSD, PPGI, stress, preprocessing
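A rough sketch of the final step of such a pipeline: estimating beats per minute by counting pulse peaks in an already-cleaned trace. The synthetic 1.2 Hz waveform and 30 fps camera rate are assumptions for the example, not parameters from the study:

```python
import math

def estimate_bpm(signal, fps):
    """Estimate heart rate by counting local maxima in a cleaned PPGI trace."""
    peaks = sum(
        1
        for i in range(1, len(signal) - 1)
        if signal[i - 1] < signal[i] > signal[i + 1]
    )
    duration_s = len(signal) / fps
    return peaks / duration_s * 60.0

# Synthetic 1.2 Hz pulse waveform sampled at a 30 fps camera rate for 10 s
fps, seconds = 30, 10
trace = [math.sin(2 * math.pi * 1.2 * i / fps) for i in range(fps * seconds)]
print(estimate_bpm(trace, fps))  # 72 bpm for a 1.2 Hz pulse
```

Real PPGI traces need the preprocessing named in the abstract first; naive peak counting on a raw trace would count motion artefacts as beats.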
Procedia PDF Downloads 127
3280 Mike Hat: Coloured-Tape-in-Hat as a Head Circumference Measuring Instrument for Early Detection of Hydrocephalus in an Infant
Authors: Nyimas Annissa Mutiara Andini
Abstract:
Every year, children develop hydrocephalus during the first year of life. If it is not treated, hydrocephalus can lead to brain damage, a loss of mental and physical abilities, and even death. Treatment begins with a proper diagnosis, using examinations aimed at detecting hydrocephalus early. One examination that can be done is head circumference measurement. An increased head circumference is the first and main sign of hydrocephalus, especially in infants (0-1 year of age). Head circumference is a measurement of the largest area of a child's head: the distance from above the eyebrows and ears around the back of the head, taken with a measuring tape. If the head circumference of an infant is larger than normal, the infant might have hydrocephalus. With early diagnosis and timely treatment of hydrocephalus, most children can recover successfully. There are some problems with early detection of hydrocephalus using a regular tape for head circumference measurement. One of them is the infant's comfort: the infant must feel comfortable throughout the measurement for the examination to give a proper result. For that, a helpful accessory, such as a hat, can be used. This paper describes the possibility of using a head circumference measuring instrument for early detection of hydrocephalus in an infant with a mike hat, a coloured-tape-in-hat. At birth, an infant's head circumference is about 35 centimeters. For the first three months, it increases by about 2 centimeters per month; for the second three months, by 1 cm per month; and for the following six months, by 0.5 cm per month, ending at an average of 47 centimeters. This schedule is compared with the WHO's head circumference growth chart. The shape of this tape-in-hat is similar to an upper-arm measurement tape, and its length is about 47 centimeters. 
It contains twelve different colours, ranged by age. If a reading falls outside the normal colour, the infant potentially suffers from hydrocephalus. This examination should be done monthly. If two consecutive measurements remain in the abnormal head circumference range, or if the head circumference grows rapidly, the infant should be referred to a pediatrician. There is a pink hat for girls and a blue hat for boys. Based on this paper, this measurement can be used to support the early detection of hydrocephalus in an infant.
Keywords: head circumference, hydrocephalus, infant, mike hat
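The growth schedule above can be written as a small lookup; the ±2 cm tolerance standing in for the colour bands is an illustrative assumption, not a WHO cutoff:

```python
def expected_hc_cm(age_months):
    """Expected head circumference (cm) per the schedule above: 35 cm at birth,
    +2 cm/month for months 1-3, +1 cm/month for months 4-6,
    +0.5 cm/month for months 7-12."""
    hc = 35.0
    for m in range(1, age_months + 1):
        if m <= 3:
            hc += 2.0
        elif m <= 6:
            hc += 1.0
        else:
            hc += 0.5
    return hc

def colour_band(measured_cm, age_months, tolerance_cm=2.0):
    """Map a measurement to 'normal' or 'refer' relative to the expected value;
    the +/-2 cm tolerance is illustrative only."""
    expected = expected_hc_cm(age_months)
    return "normal" if abs(measured_cm - expected) <= tolerance_cm else "refer"

print(expected_hc_cm(12))     # 47.0 cm, matching the stated one-year average
print(colour_band(52.0, 12))  # abnormally large head -> refer
```

The schedule is internally consistent: 35 + 3×2 + 3×1 + 6×0.5 = 47 cm at twelve months, the average cited in the abstract.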
Procedia PDF Downloads 269
3279 Electrical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption towards energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one method of load monitoring used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features. Appliance features are required for the accurate identification of the household devices. In this research work, we aim to develop a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on its power demand, and then detecting the times at which each selected appliance changes state. In order to fit with the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical tool that simulates the behaviour of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect. Also, it facilitates the extraction of specific features used for general appliance modeling. 
In addition to this, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector of the values delimiting the appliance's state transitions. After this, appliance signatures are formed from the extracted power, geometrical, and statistical features. Afterwards, those signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and real data from the Reference Energy Disaggregation Dataset (REDD). For that, we compute confusion-matrix-based performance metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as detection techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
Keywords: electrical disaggregation, DTW, general appliance modeling, event detection
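The DTW step mentioned above can be sketched with the classic dynamic-programming recurrence; the appliance power profiles below are invented for the example:

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping with absolute-difference cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# Two power profiles (watts per minute) of the same appliance cycle, one shifted
cycle = [0, 0, 5, 80, 80, 80, 5, 0]
shifted = [0, 5, 80, 80, 80, 5, 0, 0]
other = [0, 0, 0, 10, 10, 0, 0, 0]
print(dtw_distance(cycle, shifted))  # small: same shape despite the shift
print(dtw_distance(cycle, other))    # large: a different appliance signature
```

This tolerance to time shifts is what makes DTW attractive for matching appliance signatures at the (1/60) Hz sampling rate, where event timing is coarse.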
Procedia PDF Downloads 80
3278 Early Gastric Cancer Prediction from Diet and Epidemiological Data Using Machine Learning in Mizoram Population
Authors: Brindha Senthil Kumar, Payel Chakraborty, Senthil Kumar Nachimuthu, Arindam Maitra, Prem Nath
Abstract:
Gastric cancer is predominantly caused by demographic and diet factors as compared to other cancer types. The aim of the study is to predict Early Gastric Cancer (EGC) from diet and lifestyle factors using supervised machine learning algorithms. For this study, 160 healthy individuals and 80 cases were selected who had been followed for 3 years (2016-2019) at Civil Hospital, Aizawl, Mizoram. A dataset containing 11 features that are core risk factors for gastric cancer was extracted. Supervised machine learning algorithms: Logistic Regression, Naive Bayes, Support Vector Machine (SVM), Multilayer Perceptron, and Random Forest were used to analyze the dataset using Python Jupyter Notebook Version 3. The classification results were evaluated using the metrics: minimum false positives, Brier score, accuracy, precision, recall, F1 score, and the Receiver Operating Characteristic (ROC) curve. Data analysis showed Naive Bayes - 88, 0.11; Random Forest - 83, 0.16; SVM - 77, 0.22; Logistic Regression - 75, 0.25; and Multilayer Perceptron - 72, 0.27 with respect to accuracy (in percent) and Brier score. The Naive Bayes algorithm outperforms the others, with a very low false positive rate and Brier score as well as good accuracy. The Naive Bayes classification results in predicting EGC were very satisfactory using only diet and lifestyle factors, which will be very helpful for physicians to educate patients and the public; thereby, mortality from gastric cancer can be reduced or avoided with this knowledge mining work.
Keywords: early gastric cancer, machine learning, diet, lifestyle characteristics
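A hedged sketch of the Naive Bayes plus Brier score evaluation on synthetic data; the feature distributions and class separation below are assumptions (the study's real diet/lifestyle dataset is not reproduced here), though the 160/80 group sizes mirror the abstract:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, brier_score_loss
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# 160 controls and 80 cases with 11 hypothetical diet/lifestyle features;
# cases are shifted upward by one unit per feature (illustrative only).
n_controls, n_cases = 160, 80
X = np.vstack([
    rng.normal(0.0, 1.0, size=(n_controls, 11)),
    rng.normal(1.0, 1.0, size=(n_cases, 11)),
])
y = np.array([0] * n_controls + [1] * n_cases)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=1, stratify=y)
clf = GaussianNB().fit(X_tr, y_tr)

acc = accuracy_score(y_te, clf.predict(X_te))
brier = brier_score_loss(y_te, clf.predict_proba(X_te)[:, 1])
print(round(acc, 3), round(brier, 3))
```

The Brier score is the mean squared error of the predicted probabilities, so lower is better; reporting it alongside accuracy, as the study does, rewards well-calibrated classifiers rather than merely correct ones.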
Procedia PDF Downloads 164
3277 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences
Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng
Abstract:
Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, the Kalman filter was used to retain a complete trajectory for each human object. Finally, the motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events. In summary, we have explored a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated in an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatch, fighting, etc.).
Keywords: motion detection, motion tracking, trajectory analysis, video surveillance
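The Kalman filter step for bridging occlusions can be sketched with a constant-velocity model; the track, noise covariances, and occlusion window below are illustrative assumptions, not the study's tuned values:

```python
import numpy as np

# Constant-velocity Kalman filter, as commonly used to retain a trajectory
# through occlusion: state is [x, y, vx, vy]; measurements are (x, y).
dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)  # process noise
R = 1.0 * np.eye(2)   # measurement noise

x = np.zeros(4)
P = 10.0 * np.eye(4)

true_track = [(t * 2.0, t * 1.0) for t in range(20)]  # moves (+2, +1) per frame
for t, (zx, zy) in enumerate(true_track):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Simulated occlusion: no measurement update during frames 8-11,
    # so the filter coasts on its velocity estimate.
    if 8 <= t <= 11:
        continue
    # Update
    z = np.array([zx, zy])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P

print(x[:2])  # close to the true final position (38, 19)
```

Coasting on the predicted velocity during the occluded frames is what keeps each person's trajectory complete, which the trajectory-derivative analysis then consumes.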
Procedia PDF Downloads 549
3276 Subpixel Corner Detection for Monocular Camera Linear Model Research
Authors: Guorong Sui, Xingwei Jia, Fei Tong, Xiumin Gao
Abstract:
Camera calibration is a fundamental issue in high-precision non-contact measurement, and it is necessary to analyze and study the reliability and application range of the linear model often used in camera calibration. According to the imaging features of monocular cameras, a camera model based on image pixel coordinates and three-dimensional space coordinates is built. Using our own customized template, the image pixel coordinates are obtained by the subpixel corner detection method. Without considering the aberration of the optical system, feature extraction and linearity analysis of the line segments in the template are performed. Moreover, the experiment is repeated 11 times while varying the measuring distance. Finally, the linearity of the camera is obtained by fitting the 11 groups of data. The camera model measurement results show that the relative error does not exceed 1%, and the repeated measurement error is no more than 0.1 mm in magnitude. Meanwhile, the model shows some measurement differences across regions and object distances. The experimental results show that this linear model is simple and practical and has good linearity within a certain object distance range. These results provide a solid basis for establishing the linear camera model and have potential value for practical engineering measurement.
Keywords: camera linear model, geometric imaging relationship, image pixel coordinates, three dimensional space coordinates, sub-pixel corner detection
Procedia PDF Downloads 279
3275 Assay for SARS-Cov-2 on Chicken Meat
Authors: R. Mehta, M. Ghogomu, B. Schoel
Abstract:
Reports appeared in 2020 about China detecting SARS-Cov-2 (Covid-19) on frozen meat, shrimp, and food packaging material. In this study, we examined the use of swabs for the detection of Covid-19 on meat samples, with chicken breast (CB) used as a model. Methods: Heat-inactivated SARS-Cov-2 virus (IV) from Microbiologics was loaded onto the CB, swabbing was done, and the recovered inactivated virus was subjected to the Macherey-Nagel NucleoSpin RNA Virus kit for RNA isolation according to the manufacturer's instructions. For RT-PCR, the IDT 2019-nCoV RUO Covid-19 test kit was used with the Taqman Fast Virus 1-step master mix. The limit of detection (LOD) of the viral load recovered from the CB was determined under various conditions: first on frozen CB where the IV was introduced in a defined area, then on frozen CB with the IV spread out, and finally on thawed CB. Results: The lowest amount of IV that could be reliably detected on frozen CB was a load of 1,000 - 2,000 IV copies, with the IV loaded on one spot of about 1 square inch. Next, the IV was spread out over a whole frozen CB of about 16 square inches. The IV could be recovered at a lowest load of 4,000 to 8,000 copies. Furthermore, the effects of temperature change on viral load recovery were investigated, i.e., if raw unfrozen meat became contaminated and remained for 1 hour at 4°C or was refrozen. The amount of IV successfully recovered from the CB kept at 4°C and from the refrozen CB was similar to the recovery obtained by loading the IV directly onto the frozen CB. In conclusion, a swab-based assay was successfully established for the detection of SARS-Cov-2 on frozen or raw (unfrozen) CB with a minimal load of up to 8,000 copies spread over 16 square inches.
Keywords: assay, COVID-19, meat, SARS-Cov-2
Procedia PDF Downloads 204
3274 Clinical Efficacy of Indigenous Software for Automatic Detection of Stages of Retinopathy of Prematurity (ROP)
Authors: Joshi Manisha, Shivaram, Anand Vinekar, Tanya Susan Mathews, Yeshaswini Nagaraj
Abstract:
Retinopathy of prematurity (ROP) is abnormal blood vessel development in the retina of the eye in a premature infant. The principal object of the invention is to provide a technique for detecting the demarcation line and ridge in a given ROP image, which facilitates early detection of ROP in stage 1 and stage 2. The demarcation line is an indicator of Stage 1 ROP, and the ridge is the hallmark of typical Stage 2 ROP. Thirty RetCam images of Asian Indian infants obtained during routine ROP screening were used for the analysis. A graphical user interface was developed to detect the demarcation line/ridge and to extract ground truth. This novel algorithm uses multilevel vessel enhancement to enhance tubular structures in the digital ROP images. It has been observed that the orientation of the demarcation line/ridge is normal to the direction of the blood vessels, which is used for the identification of the ridge/demarcation line. Quantitative analysis is presented based on gold standard images marked by an expert ophthalmologist. Image-based analysis was based on the length and position of the detected ridge. In the image-based evaluation, the average sensitivity and positive predictive value were found to be 92.30% and 85.71%, respectively. In the pixel-based evaluation, the average sensitivity, specificity, positive predictive value, and negative predictive value achieved were 60.38%, 99.66%, 52.77%, and 99.75%, respectively.
Keywords: ROP, ridge, multilevel vessel enhancement, biomedical
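The evaluation metrics reported above follow directly from confusion-matrix counts. The counts below are hypothetical, chosen only to roughly reproduce the reported pixel-based figures:

```python
def screening_metrics(tp, fp, tn, fn):
    """Standard screening metrics from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # recall of true ridge pixels
        "specificity": tn / (tn + fp),  # recall of true background pixels
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical pixel counts for one screening image
m = screening_metrics(tp=320, fp=287, tn=84000, fn=210)
print({k: round(v, 4) for k, v in m.items()})
```

Note the pattern these numbers illustrate: with a huge background class, specificity and NPV are near 1 almost automatically, while PPV stays low; this is why sensitivity and PPV are the informative pair for rare-structure detection.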
Procedia PDF Downloads 413
3273 A Study on Soil Micro-Arthropods Assemblage in Selected Plantations in The Nilgiris, Tamilnadu
Authors: J. Dharmaraj, C. Gunasekaran
Abstract:
Invertebrates are reliable ecological indicators of disturbance in forest ecosystems, and they respond to environmental changes more quickly than other fauna. Among these, terrestrial invertebrates are vital to functioning ecosystems, contributing to processes such as decomposition, nutrient cycling, and soil fertility. The natural ecosystems of the forests have been subject to various types of disturbance, which lead to a decline of flora and fauna. The comparative diversity of micro-arthropods in natural forest, wattle plantation, and eucalyptus plantations was studied in the Nilgiris. The study area was divided into five major sites (Emerald (Site I), Thalaikundha (Site II), Kodapmund (Site III), Aravankad (Site IV), Kattabettu (Site V)). The research was conducted during the period from March 2014 to August 2014. The leaf and soil samples were collected and isolated using the Berlese funnel extraction method. Specimens were isolated and identified according to their morphology (Balogh 1972). The results of the present study clearly showed variation in soil pH, NPK (major nutrients), and organic carbon among the study sites. The chemical components of the plantation leaf litter decreased the diversity of micro-arthropods and the decomposition rate, leading to low amounts of carbon and other nutrients in the soil. Moreover, eucalyptus and wattle plantations decrease the availability of ground water to other plants and micro-arthropods and hence affect soil fertility. Hence, the present study suggests minimizing the growth of wattle and eucalyptus plantations in natural areas, which may help reduce the decline of the forests.
Keywords: micro-arthropods, assemblage, berlese funnel, morphology, NPK, nilgiris
Procedia PDF Downloads 310
3272 Analysis of a IncResU-Net Model for R-Peak Detection in ECG Signals
Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar
Abstract:
Cardiovascular Diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology. It is a non-invasive, pain-free procedure that measures the heart's electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations of the ECG's form, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged in time, which can further complicate visual diagnosis and considerably delay disease detection. In this context, deep learning methods have risen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large data sets and can provide early and precise diagnoses. Therefore, the cardiology field is one of the areas that can benefit most from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is to detect R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model's ability to generalize its outcomes. The performance of the model in detecting R-peaks in clean and noisy ECGs is presented. The model is able to detect R-peaks in the presence of various types of noise, including data it has not been trained on. 
It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences in the normal cardiac activity of their patients.
Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks
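As a baseline contrast to the deep model, R-peak detection is often prototyped as thresholded local maxima with a refractory period; the synthetic spike train below is an assumption for the example, not data from the study:

```python
def detect_r_peaks(ecg, fs, threshold=0.5, refractory_s=0.2):
    """Baseline R-peak detector: local maxima above a threshold, suppressing
    any candidate within the refractory window of the previous peak."""
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if ecg[i] >= threshold and ecg[i - 1] < ecg[i] > ecg[i + 1]:
            if i - last >= refractory:
                peaks.append(i)
                last = i
    return peaks

# Synthetic ECG: flat baseline with sharp R-spikes every 0.8 s, sampled at 250 Hz
fs = 250
ecg = [0.0] * (10 * fs)
for beat in range(0, 10 * fs, int(0.8 * fs)):
    ecg[beat] = 0.4
    ecg[beat + 1] = 1.0  # R-peak sample
    ecg[beat + 2] = 0.3

peaks = detect_r_peaks(ecg, fs)
print(len(peaks))  # 13 spikes fit in the 10 s window (first beat at t = 0)
```

Such a fixed-threshold baseline breaks down exactly where the abstract motivates deep learning: morphology changes, baseline wander, and noise that shift R-peak amplitude.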
Procedia PDF Downloads 189