Search results for: tree detection
2433 Multiple Etiologies and Incidences of Co-Infections in Childhood Diarrhea in a Hospital Based Screening Study in Odisha, India
Authors: Arpit K. Shrivastava, Nirmal K. Mohakud, Subrat Kumar, Priyadarshi S. Sahu
Abstract:
Acute diarrhea is one of the major causes of morbidity and mortality among children less than five years of age. Multiple etiologies have been implicated for infectious gastroenteritis causing acute diarrhea. In our study fecal samples (n=165) were collected from children (<5 years) presenting with symptoms of acute diarrhea. Samples were screened for viral, bacterial, and parasitic etiologies such as Rotavirus, Adenovirus, Diarrhoeagenic Escherichia coli (EPEC, EHEC, STEC, O157, O111), Shigella spp., Salmonella spp., Vibrio cholera, Cryptosporidium spp., and Giardia spp. The overall results from our study showed that 57% of children below 5 years of age with acute diarrhea were positive for at least one infectious etiology. Diarrhoeagenic Escherichia coli was detected to be the major etiological agent (29.09%) followed by Rotavirus (24.24%), Shigella (21.21%), Adenovirus (5.45%), Cryptosporidium (2.42%), and Giardia (0.60%). Among the different DEC strains, EPEC was detected significantly higher in <2 years children in comparison to >2 years age group (p =0.001). Concurrent infections with two or more pathogens were observed in 47 of 160 (28.48%) cases with a predominant incidence particularly in <2-year-old children (66.66%) compared to children of 2 to 5 years age group. Co-infection of Rotavirus with Shigella was the most frequent combination, which was detected in 17.94% cases, followed by Rotavirus with EPEC (15.38%) and Shigella with STEC (12.82%). Detection of multiple infectious etiologies and diagnosis of the right causative agent(s) can immensely help in better management of acute childhood diarrhea. In future more studies focusing on the detection of cases with concurrent infections must be carried out, as we believe that the etiological agents might be complementing each other’s strategies of pathogenesis resulting in severe diarrhea.Keywords: children, co-infection, infectious diarrhea, Odisha
Procedia PDF Downloads 336
2432 A Facile Nanocomposite of Graphene Oxide Reinforced Chitosan/Poly-Nitroaniline Polymer as a Highly Efficient Adsorbent for Extracting Polycyclic Aromatic Hydrocarbons from Tea Samples
Authors: Adel M. Al-Shutairi, Ahmed H. Al-Zahrani
Abstract:
Tea is a popular beverage drunk by millions of people throughout the globe. Tea has considerable health advantages, in-cluding antioxidant, antibacterial, antiviral, chemopreventive, and anticarcinogenic properties. As a result of environmental pollution (atmospheric deposition) and the production process, tealeaves may also include a variety of dangerous substances, such as polycyclic aromatic hydrocarbons (PAHs). In this study, graphene oxide reinforced chitosan/poly-nitroaniline polymer was prepared to develop a sensitive and reliable solid phase extraction method (SPE) for extraction of PAH7 in tea samples, followed by high-performance liquid chromatography- fluorescence detection. The prepared adsorbent was validated in terms of linearity, the limit of detection, the limit of quantification, recovery (%), accuracy (%), and precision (%) for the determination of the PAH7 (benzo[a]pyrene, benzo[a]anthracene, benzo[b]fluoranthene, chrysene, benzo[b]fluoranthene, Dibenzo[a,h]anthracene and Benzo[g,h,i]perylene) in tea samples. The concentration was determined in two types of tea commercially available in Saudi Arabia, including black tea and green tea. The maximum mean of Σ7PAHs in black tea samples was 68.23 ± 0.02 ug kg-1 and 26.68 ± 0.01 ug kg-1 in green tea samples. The minimum mean of Σ7PAHs in black tea samples was 37.93 ± 0.01 ug kg-1 and 15.26 ± 0.01 ug kg-1 in green tea samples. The mean value of benzo[a]pyrene in black tea samples ranged from 6.85 to 12.17 ug kg-1, where two samples exceeded the standard level (10 ug kg-1) established by the European Union (UE), while in green tea ranged from 1.78 to 2.81 ug kg-1. Low levels of Σ7PAHs in green tea samples were detected in comparison with black tea samples.Keywords: polycyclic aromatic hydrocarbons, CS, PNA and GO, black/green tea, solid phase extraction, Saudi Arabia
Procedia PDF Downloads 96
2431 Improvement of Microscopic Detection of Acid-Fast Bacilli for Tuberculosis by Artificial Intelligence-Assisted Microscopic Platform and Medical Image Recognition System
Authors: Hsiao-Chuan Huang, King-Lung Kuo, Mei-Hsin Lo, Hsiao-Yun Chou, Yusen Lin
Abstract:
The most robust and economical method for laboratory diagnosis of TB is to identify mycobacterial bacilli (AFB) under acid-fast staining, despite its disadvantages of low sensitivity and labor intensiveness. Though digital pathology has become popular in medicine, an automated microscopic system for microbiology is still not available. A new AI-assisted automated microscopic system, consisting of a microscopic scanner and a recognition program powered by big data and deep learning, may significantly increase the sensitivity of TB smear microscopy. Thus, the objective is to evaluate such an automated system for the identification of AFB. A total of 5,930 smears were enrolled for this study. An intelligent microscope system (TB-Scan, Wellgen Medical, Taiwan) was used for microscopic image scanning and AFB detection. 272 AFB smears were used for transfer learning to increase the accuracy. Referee medical technicians served as the gold standard in cases of result discrepancy. Results showed that, for a total of 1,726 AFB smears, the automated system's accuracy, sensitivity and specificity were 95.6% (1,650/1,726), 87.7% (57/65), and 95.9% (1,593/1,661), respectively. Compared to culture, the sensitivity of human technicians was only 33.8% (38/142); however, the automated system achieved 74.6% (106/142), which is significantly higher than human technicians, and this is the first such automated microscope system for TB smear testing in a controlled trial. This automated system could achieve higher TB smear sensitivity and laboratory efficiency and may complement molecular methods (e.g., GeneXpert) to reduce the total cost of TB control. Furthermore, such an automated system is capable of remote access via the internet and can be deployed in areas with limited medical resources.
Keywords: TB smears, automated microscope, artificial intelligence, medical imaging
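As a quick illustration of how the figures quoted above are derived, the sketch below recomputes accuracy, sensitivity and specificity from the raw counts in the abstract (1,650/1,726, 57/65 and 1,593/1,661); the split into true/false positives and negatives is inferred from those ratios and is not taken from the original report.

```python
# Recompute the reported performance metrics from the quoted counts.
# The TP/FN/TN/FP breakdown is inferred from the ratios in the abstract
# (57/65 positives, 1,593/1,661 negatives) and is purely illustrative.
tp, fn = 57, 65 - 57          # smears correctly / incorrectly called AFB-positive
tn, fp = 1593, 1661 - 1593    # smears correctly / incorrectly called AFB-negative

accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

print(f"accuracy    {accuracy:.3f}")     # ~0.956 (1,650/1,726)
print(f"sensitivity {sensitivity:.3f}")  # ~0.877 (57/65)
print(f"specificity {specificity:.3f}")  # ~0.959 (1,593/1,661)
```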
Procedia PDF Downloads 231
2430 Design and Implementation of an Effective Machine Learning Approach to Crime Prediction and Prevention
Authors: Ashish Kumar, Kaptan Singh, Amit Saxena
Abstract:
Today, it is believed that crimes have the greatest impact on a person's ability to progress financially and personally. Identifying places where individuals should not go is crucial for preventing crimes and is one of the key considerations. As society and technologies have advanced significantly, so have crimes and the harm they wreak. When there is a concentration of people in one place and changes happen quickly, crime is even harder to prevent. Because of this, many crime prevention strategies have been embraced as a component of the development of smart cities in numerous cities. However, crimes can occur anywhere; all that is required is to identify the pattern of their occurrences, which will help to lower the crime rate. In this paper, an analysis related to crime has been carried out; information related to crimes was collected from all over India and can be accessed from anywhere. The purpose of this paper is to investigate the relationship between several factors and India's crime rate. The review covers information related to every state of India and its associated regions for the period between 2001 and 2014. However, the various classes of violations have a marginally different scope over the years.
Keywords: K-nearest neighbor, random forest, decision tree, pre-processing
Procedia PDF Downloads 95
2429 Retrospective Evaluation of Vector-borne Infections in Cats Living in Germany (2012-2019)
Authors: I. Schäfer, B. Kohn, M. Volkmann, E. Müller
Abstract:
Introduction: Blood-feeding arthropods transmit parasitic, bacterial, or viral pathogens to domestic animals and wildlife. Vector-borne infections are gaining significance due to the increase of travel, import of domestic animals from abroad, and the changing climate in Europe. Aims of the study: The main objective of this retrospective study was to assess the prevalence of vector-borne infections in cats in which a ‘Feline Travel Profile’ had been conducted. Material and Methods: This retrospective study included test results from cats for which a ‘Feline Travel Profile’ established by LABOKLIN had been requested by veterinarians between April 2012 and December 2019. This profile contains direct detection methods via polymerase chain reaction (PCR) for Hepatozoon spp. and Dirofilaria spp. as well as indirect detection methods via immunofluorescence antibody test (IFAT) for Ehrlichia spp. and Leishmania spp. This profile was expanded to include an IFAT for Rickettsia spp. from July 2015 onwards. The prevalence of the different vector-borne infectious agents was calculated. Results: A total of 602 cats were tested using the ‘Feline Travel Profile’. Positive test results were as follows: Rickettsia spp. IFAT 54/442 (12.2%), Ehrlichia spp. IFAT 68/602 (11.3%), Leishmania spp. IFAT 21/602 (3.5%), Hepatozoon spp. PCR 51/595 (8.6%), and Dirofilaria spp. PCR 1/595 cats (0.2%). Co-infections with more than one pathogen could be detected in 22/602 cats. Conclusions: 170/602 cats (28.2%) were tested positive for at least one vector-borne pathogen. Infections with multiple pathogens could be detected in 3.7% of the cats. The data emphasizes the importance of considering vector-borne infections as potential differential diagnoses in cats.Keywords: arthopod-transmitted infections, feline vector-borne infections, Germany, laboratory diagnostics
Procedia PDF Downloads 169
2428 Vibration Based Damage Detection and Stiffness Reduction of Bridges: Experimental Study on a Small Scale Concrete Bridge
Authors: Mirco Tarozzi, Giacomo Pignagnoli, Andrea Benedetti
Abstract:
Structural systems are often subjected to degradation processes due to different kind of phenomena like unexpected loadings, ageing of the materials and fatigue cycles. This is true especially for bridges, in which their safety evaluation is crucial for the purpose of a design of planning maintenance. This paper discusses the experimental evaluation of the stiffness reduction from frequency changes due to uniform damage scenario. For this purpose, a 1:4 scaled bridge has been built in the laboratory of the University of Bologna. It is made of concrete and its cross section is composed by a slab linked to four beams. This concrete deck is 6 m long and 3 m wide, and its natural frequencies have been identified dynamically by exciting it with an impact hammer, a dropping weight, or by walking on it randomly. After that, a set of loading cycles has been applied to this bridge in order to produce a uniformly distributed crack pattern. During the loading phase, either cracking moment and yielding moment has been reached. In order to define the relationship between frequency variation and loss in stiffness, the identification of the natural frequencies of the bridge has been performed, before and after the occurrence of the damage, corresponding to each load step. The behavior of breathing cracks and its effect on the natural frequencies has been taken into account in the analytical calculations. By using a sort of exponential function given from the study of lot of experimental tests in the literature, it has been possible to predict the stiffness reduction through the frequency variation measurements. During the load test also crack opening and middle span vertical displacement has been monitored.Keywords: concrete bridge, damage detection, dynamic test, frequency shifts, operational modal analysis
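The abstract relates frequency shifts to stiffness loss through an empirically fitted exponential function; that fit is not reproduced here. The sketch below only illustrates the classical single-degree-of-freedom starting point, f = (1/2π)√(k/m), under which a measured frequency drop maps to a stiffness ratio of (f_damaged/f_intact)²; the modal frequencies used are hypothetical.

```python
import numpy as np

# Minimal illustration (not the authors' fitted exponential law): for a
# single-degree-of-freedom idealisation, f = (1/2*pi)*sqrt(k/m), so a drop in
# natural frequency maps to a stiffness ratio of (f_damaged / f_intact)**2.
f_intact = np.array([12.4, 34.1, 55.8])    # Hz, hypothetical identified modes
f_damaged = np.array([11.6, 32.0, 53.1])   # Hz, after a loading step

stiffness_ratio = (f_damaged / f_intact) ** 2
stiffness_loss = 1.0 - stiffness_ratio

for mode, loss in enumerate(stiffness_loss, start=1):
    drop = 1.0 - f_damaged[mode - 1] / f_intact[mode - 1]
    print(f"mode {mode}: frequency drop {drop:.1%}, estimated stiffness loss {loss:.1%}")
```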
Procedia PDF Downloads 185
2427 The Employment of Unmanned Aircraft Systems for Identification and Classification of Helicopter Landing Zones and Airdrop Zones in Calamity Situations
Authors: Marielcio Lacerda, Angelo Paulino, Elcio Shiguemori, Alvaro Damiao, Lamartine Guimaraes, Camila Anjos
Abstract:
Accurate information about the terrain is extremely important in disaster management activities or conflict. This paper proposes the use of the Unmanned Aircraft Systems (UAS) at the identification of Airdrop Zones (AZs) and Helicopter Landing Zones (HLZs). In this paper we consider the AZs the zones where troops or supplies are dropped by parachute, and HLZs areas where victims can be rescued. The use of digital image processing enables the automatic generation of an orthorectified mosaic and an actual Digital Surface Model (DSM). This methodology allows obtaining this fundamental information to the terrain’s comprehension post-disaster in a short amount of time and with good accuracy. In order to get the identification and classification of AZs and HLZs images from DJI drone, model Phantom 4 have been used. The images were obtained with the knowledge and authorization of the responsible sectors and were duly registered in the control agencies. The flight was performed on May 24, 2017, and approximately 1,300 images were obtained during approximately 1 hour of flight. Afterward, new attributes were generated by Feature Extraction (FE) from the original images. The use of multispectral images and complementary attributes generated independently from them increases the accuracy of classification. The attributes of this work include the Declivity Map and Principal Component Analysis (PCA). For the classification four distinct classes were considered: HLZ 1 – small size (18m x 18m); HLZ 2 – medium size (23m x 23m); HLZ 3 – large size (28m x 28m); AZ (100m x 100m). The Decision Tree method Random Forest (RF) was used in this work. RF is a classification method that uses a large collection of de-correlated decision trees. Different random sets of samples are used as sampled objects. The results of classification from each tree and for each object is called a class vote. The resulting classification is decided by a majority of class votes. In this case, we used 200 trees for the execution of RF in the software WEKA 3.8. The classification result was visualized on QGIS Desktop 2.12.3. Through the methodology used, it was possible to classify in the study area: 6 areas as HLZ 1, 6 areas as HLZ 2, 4 areas as HLZ 3; and 2 areas as AZ. It should be noted that an area classified as AZ covers the classifications of the other classes, and may be used as AZ, HLZ of large size (HLZ3), medium size (HLZ2) and small size helicopters (HLZ1). Likewise, an area classified as HLZ for large rotary wing aircraft (HLZ3) covers the smaller area classifications, and so on. It was concluded that images obtained through small UAV are of great use in calamity situations since they can provide data with high accuracy, with low cost, low risk and ease and agility in obtaining aerial photographs. This allows the generation, in a short time, of information about the features of the terrain in order to serve as an important decision support tool.Keywords: disaster management, unmanned aircraft systems, helicopter landing zones, airdrop zones, random forest
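The classification itself was run in WEKA 3.8, but the Random Forest idea described above (200 de-correlated decision trees, each casting a class vote, with the majority deciding the class) can be sketched in scikit-learn as below; the features and labels are synthetic placeholders for the declivity-map and PCA attributes.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for per-object attributes (e.g. declivity, PCA components);
# the real study used ~1,300 UAS images processed in WEKA 3.8.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(int) + 2 * (X[:, 2] > 0).astype(int)  # 0..3 -> HLZ1, HLZ2, HLZ3, AZ

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 200 de-correlated trees; the predicted class is the majority of the class votes.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
print("held-out accuracy:", rf.score(X_test, y_test))

# The per-tree class votes behind a single prediction:
votes = np.array([tree.predict(X_test[:1])[0] for tree in rf.estimators_], dtype=int)
print("class votes for the first object:", np.bincount(votes, minlength=4))
```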
Procedia PDF Downloads 177
2426 Mondoc: Informal Lightweight Ontology for Faceted Semantic Classification of Hypernymy
Authors: M. Regina Carreira-Lopez
Abstract:
Lightweight ontologies seek to concrete union relationships between a parent node, and a secondary node, also called "child node". This logic relation (L) can be formally defined as a triple ontological relation (LO) equivalent to LO in ⟨LN, LE, LC⟩, and where LN represents a finite set of nodes (N); LE is a set of entities (E), each of which represents a relationship between nodes to form a rooted tree of ⟨LN, LE⟩; and LC is a finite set of concepts (C), encoded in a formal language (FL). Mondoc enables more refined searches on semantic and classified facets for retrieving specialized knowledge about Atlantic migrations, from the Declaration of Independence of the United States of America (1776) and to the end of the Spanish Civil War (1939). The model looks forward to increasing documentary relevance by applying an inverse frequency of co-ocurrent hypernymy phenomena for a concrete dataset of textual corpora, with RMySQL package. Mondoc profiles archival utilities implementing SQL programming code, and allows data export to XML schemas, for achieving semantic and faceted analysis of speech by analyzing keywords in context (KWIC). The methodology applies random and unrestricted sampling techniques with RMySQL to verify the resonance phenomena of inverse documentary relevance between the number of co-occurrences of the same term (t) in more than two documents of a set of texts (D). Secondly, the research also evidences co-associations between (t) and their corresponding synonyms and antonyms (synsets) are also inverse. The results from grouping facets or polysemic words with synsets in more than two textual corpora within their syntagmatic context (nouns, verbs, adjectives, etc.) state how to proceed with semantic indexing of hypernymy phenomena for subject-heading lists and for authority lists for documentary and archival purposes. Mondoc contributes to the development of web directories and seems to achieve a proper and more selective search of e-documents (classification ontology). It can also foster on-line catalogs production for semantic authorities, or concepts, through XML schemas, because its applications could be used for implementing data models, by a prior adaptation of the based-ontology to structured meta-languages, such as OWL, RDF (descriptive ontology). Mondoc serves to the classification of concepts and applies a semantic indexing approach of facets. It enables information retrieval, as well as quantitative and qualitative data interpretation. The model reproduces a triple tuple ⟨LN, LE, LT, LCF L, BKF⟩ where LN is a set of entities that connect with other nodes to concrete a rooted tree in ⟨LN, LE⟩. LT specifies a set of terms, and LCF acts as a finite set of concepts, encoded in a formal language, L. Mondoc only resolves partial problems of linguistic ambiguity (in case of synonymy and antonymy), but neither the pragmatic dimension of natural language nor the cognitive perspective is addressed. To achieve this goal, forthcoming programming developments should target at oriented meta-languages with structured documents in XML.Keywords: hypernymy, information retrieval, lightweight ontology, resonance
Procedia PDF Downloads 126
2425 Mutation Analysis of the ATP7B Gene in 43 Vietnamese Wilson’s Disease Patients
Authors: Huong M. T. Nguyen, Hoa A. P. Nguyen, Mai P. T. Nguyen, Ngoc D. Ngo, Van T. Ta, Hai T. Le, Chi V. Phan
Abstract:
Wilson’s disease (WD) is an autosomal recessive disorder of the copper metabolism, which is caused by a mutation in the copper-transporting P-type ATPase (ATP7B). The mechanism of this disease is the failure of hepatic excretion of copper to bile, and leads to copper deposits in the liver and other organs. The ATP7B gene is located on the long arm of chromosome 13 (13q14.3). This study aimed to investigate the gene mutation in the Vietnamese patients with WD, and make a presymptomatic diagnosis for their familial members. Forty-three WD patients and their 65 siblings were identified as having ATP7B gene mutations. Genomic DNA was extracted from peripheral blood samples; 21 exons and exon-intron boundaries of the ATP7B gene were analyzed by direct sequencing. We recognized four mutations ([R723=; H724Tfs*34], V1042Cfs*79, D1027H, and IVS6+3A>G) in the sum of 20 detectable mutations, accounting for 87.2% of the total. Mutation S105* was determined to have a high rate (32.6%) in this study. The hotspot regions of ATP7B were found at exons 2, 16, and 8, and intron 14, in 39.6 %, 11.6 %, 9.3%, and 7 % of patients, respectively. Among nine homozygote/compound heterozygote siblings of the patients with WD, three individuals were determined as asymptomatic by screening mutations of the probands. They would begin treatment after diagnosis. In conclusion, 20 different mutations were detected in 43 WD patients. Of this number, four novel mutations were explored, including [R723=; H724Tfs*34], V1042Cfs*79, D1027H, and IVS6+3A>G. The mutation S105* is the most prevalent and has been considered as a biomarker that can be used in a rapid detection assay for diagnosis of WD patients. Exons 2, 8, and 16, and intron 14 should be screened initially for WD patients in Vietnam. Based on risk profile for WD, genetic testing for presymptomatic patients is also useful in diagnosis and treatment.Keywords: ATP7B gene, mutation detection, presymptomatic diagnosis, Vietnamese Wilson’s disease
Procedia PDF Downloads 381
2424 Evaluation of Robust Feature Descriptors for Texture Classification
Authors: Jia-Hong Lee, Mei-Yi Wu, Hsien-Tsung Kuo
Abstract:
Texture is an important characteristic of real and synthetic scenes. Texture analysis plays a critical role in inspecting surfaces and provides important techniques for a variety of applications. Although several descriptors have been presented to extract texture features, object recognition remains a difficult task due to the complex aspects of texture. Recently, robust and scale-invariant image features such as SIFT, SURF and ORB have been successfully used in image retrieval and object recognition. In this paper, we compare the performance of these feature descriptors, combined with k-means clustering, for texture classification. Different classifiers, including K-NN, Naive Bayes, Back Propagation Neural Network, Decision Tree and KStar, were applied to three texture image sets: UIUCTex, KTH-TIPS and Brodatz. Experimental results reveal that SIFT achieves the best average accuracy rate on UIUCTex and KTH-TIPS, while SURF is advantaged on the Brodatz texture set. The BP neural network works best among all the classifiers used in the test set classification.
Keywords: texture classification, texture descriptor, SIFT, SURF, ORB
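A minimal sketch of the bag-of-visual-words pipeline implied above (SIFT descriptors, a k-means codebook, histogram features, then a classifier such as K-NN) is given below; the image paths, codebook size and classifier settings are placeholders, not the configuration used in the paper.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

def sift_descriptors(path):
    """Extract SIFT descriptors from one grayscale texture image."""
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = cv2.SIFT_create().detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 128))

def bow_histogram(desc, kmeans):
    """Quantise descriptors against the k-means codebook and normalise."""
    words = kmeans.predict(desc)
    hist = np.bincount(words, minlength=kmeans.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

# Placeholder paths/labels standing in for e.g. UIUCTex or KTH-TIPS images.
train_paths, train_labels = ["texture_a.png", "texture_b.png"], [0, 1]
train_desc = [sift_descriptors(p) for p in train_paths]

kmeans = KMeans(n_clusters=100, random_state=0).fit(np.vstack(train_desc))
X_train = np.array([bow_histogram(d, kmeans) for d in train_desc])

clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, train_labels)
print(clf.predict([bow_histogram(sift_descriptors("query.png"), kmeans)]))
```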
Procedia PDF Downloads 371
2423 Regression Model Evaluation on Depth Camera Data for Gaze Estimation
Authors: James Purnama, Riri Fitri Sari
Abstract:
We investigate the machine learning algorithm selection problem for depth-image-based eye gaze estimation, with respect to the essential difficulty of reducing the number of required training samples and the training time. Statistics-based prediction accuracy is increasingly used to assess and evaluate prediction or estimation in gaze estimation. This article evaluates Root Mean Squared Error (RMSE) and R-Squared statistical analysis to assess machine learning methods on depth camera data for gaze estimation. Four machine learning methods have been evaluated: Random Forest Regression, Regression Tree, Support Vector Machine (SVM), and Linear Regression. The experimental results show that Random Forest Regression has the lowest RMSE and the highest R-Squared, which means that it is the best among the methods compared.
Keywords: gaze estimation, gaze tracking, eye tracking, kinect, regression model, orange python
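A minimal sketch of this evaluation protocol, assuming scikit-learn implementations of the four regressors and synthetic data in place of the depth-camera features, is shown below; RMSE and R-Squared are computed on a held-out split.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic stand-in for depth-image features vs. gaze-angle targets.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = 3.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(scale=0.3, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "random forest": RandomForestRegressor(random_state=0),
    "regression tree": DecisionTreeRegressor(random_state=0),
    "SVM (SVR)": SVR(),
    "linear regression": LinearRegression(),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = np.sqrt(mean_squared_error(y_te, pred))
    print(f"{name:18s} RMSE={rmse:.3f}  R^2={r2_score(y_te, pred):.3f}")
```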
Procedia PDF Downloads 539
2422 Detection of Patient Roll-Over Using High-Sensitivity Pressure Sensors
Authors: Keita Nishio, Takashi Kaburagi, Yosuke Kurihara
Abstract:
Recent advances in medical technology have served to enhance average life expectancy. However, the total time for which the patients are prescribed complete bedrest has also increased. With patients being required to maintain a constant lying posture- also called bedsore- development of a system to detect patient roll-over becomes imperative. For this purpose, extant studies have proposed the use of cameras, and favorable results have been reported. Continuous on-camera monitoring, however, tends to violate patient privacy. We have proposed unconstrained bio-signal measurement system that could detect body-motion during sleep and does not violate patient’s privacy. Therefore, in this study, we propose a roll-over detection method by the date obtained from the bi-signal measurement system. Signals recorded by the sensor were assumed to comprise respiration, pulse, body motion, and noise components. Compared the body-motion and respiration, pulse component, the body-motion, during roll-over, generate large vibration. Thus, analysis of the body-motion component facilitates detection of the roll-over tendency. The large vibration associated with the roll-over motion has a great effect on the Root Mean Square (RMS) value of time series of the body motion component calculated during short 10 s segments. After calculation, the RMS value during each segment was compared to a threshold value set in advance. If RMS value in any segment exceeded the threshold, corresponding data were considered to indicate occurrence of a roll-over. In order to validate the proposed method, we conducted experiment. A bi-directional microphone was adopted as a high-sensitivity pressure sensor and was placed between the mattress and bedframe. Recorded signals passed through an analog Band-pass Filter (BPF) operating over the 0.16-16 Hz bandwidth. BPF allowed the respiration, pulse, and body-motion to pass whilst removing the noise component. Output from BPF was A/D converted with the sampling frequency 100Hz, and the measurement time was 480 seconds. The number of subjects and data corresponded to 5 and 10, respectively. Subjects laid on a mattress in the supine position. During data measurement, subjects—upon the investigator's instruction—were asked to roll over into four different positions—supine to left lateral, left lateral to prone, prone to right lateral, and right lateral to supine. Recorded data was divided into 48 segments with 10 s intervals, and the corresponding RMS value for each segment was calculated. The system was evaluated by the accuracy between the investigator’s instruction and the detected segment. As the result, an accuracy of 100% was achieved. While reviewing the time series of recorded data, segments indicating roll-over tendencies were observed to demonstrate a large amplitude. However, clear differences between decubitus and the roll-over motion could not be confirmed. Extant researches possessed a disadvantage in terms of patient privacy. The proposed study, however, demonstrates more precise detection of patient roll-over tendencies without violating their privacy. As a future prospect, decubitus estimation before and after roll-over could be attempted. Since in this paper, we could not confirm the clear differences between decubitus and the roll-over motion, future studies could be based on utilization of the respiration and pulse components.Keywords: bedsore, high-sensitivity pressure sensor, roll-over, unconstrained bio-signal measurement
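The segment-wise RMS thresholding described above can be sketched as follows; the sampling rate (100 Hz), 10 s segments and 480 s recording mirror the abstract, while the threshold value and the synthetic body-motion signal are illustrative assumptions.

```python
import numpy as np

FS = 100          # Hz, sampling rate after A/D conversion
SEGMENT_S = 10    # seconds per analysis segment
THRESHOLD = 0.5   # amplitude units; would need tuning on real recordings

def detect_rollover(signal, fs=FS, segment_s=SEGMENT_S, threshold=THRESHOLD):
    """Return indices of 10 s segments whose RMS exceeds the threshold."""
    seg_len = fs * segment_s
    hits = []
    for i in range(len(signal) // seg_len):
        seg = signal[i * seg_len:(i + 1) * seg_len]
        if np.sqrt(np.mean(seg ** 2)) > threshold:
            hits.append(i)
    return hits

# Synthetic body-motion component: quiet respiration plus one large burst.
t = np.arange(0, 480, 1 / FS)                    # 480 s -> 48 segments
x = 0.05 * np.sin(2 * np.pi * 0.25 * t)          # respiration-like baseline
x[20000:20500] += 2.0 * np.random.default_rng(0).normal(size=500)  # roll-over burst
print("segments flagged as roll-over:", detect_rollover(x))
```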
Procedia PDF Downloads 121
2421 Forest Fire Burnt Area Assessment in a Part of West Himalayan Region Using Differenced Normalized Burnt Ratio and Neural Network Approach
Authors: Sunil Chandra, Himanshu Rawat, Vikas Gusain, Triparna Barman
Abstract:
Forest fires are a recurrent phenomenon in the Himalayan region owing to the presence of vulnerable forest types, topographical gradients, climatic weather conditions, and anthropogenic pressure. The present study focuses on the identification of forest fire-affected areas in a small part of the West Himalayan region using a differential normalized burnt ratio method and spectral unmixing methods. The study area has a rugged terrain with the presence of sub-tropical pine forest, montane temperate forest, and sub-alpine forest and scrub. The major reason for fires in this region is anthropogenic in nature, with the practice of human-induced fires for getting fresh leaves, scaring wild animals to protect agricultural crops, grazing practices within reserved forests, and igniting fires for cooking and other reasons. The fires caused by the above reasons affect a large area on the ground, necessitating its precise estimation for further management and policy making. In the present study, two approaches have been used for carrying out a burnt area analysis. The first approach followed for burnt area analysis uses a differenced normalized burnt ratio (dNBR) index approach that uses the burnt ratio values generated using the Short-Wave Infrared (SWIR) band and Near Infrared (NIR) bands of the Sentinel-2 image. The results of the dNBR have been compared with the outputs of the spectral mixing methods. It has been found that the dNBR is able to create good results in fire-affected areas having homogenous forest stratum and with slope degree <5 degrees. However, in a rugged terrain where the landscape is largely influenced by the topographical variations, vegetation types, tree density, the results may be largely influenced by the effects of topography, complexity in tree composition, fuel load composition, and soil moisture. Hence, such variations in the factors influencing burnt area assessment may not be effectively carried out using a dNBR approach which is commonly followed for burnt area assessment over a large area. Hence, another approach that has been attempted in the present study utilizes a spectral mixing method where the individual pixel is tested before assigning an information class to it. The method uses a neural network approach utilizing Sentinel-2 bands. The training and testing data are generated from the Sentinel-2 data and the national field inventory, which is further used for generating outputs using ML tools. The analysis of the results indicates that the fire-affected regions and their severity can be better estimated using spectral unmixing methods, which have the capability to resolve the noise in the data and can classify the individual pixel to the precise burnt/unburnt class.Keywords: categorical data, log linear modeling, neural network, shifting cultivation
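The dNBR computation referred to above follows the standard definition NBR = (NIR - SWIR)/(NIR + SWIR) and dNBR = NBR_pre - NBR_post; a minimal sketch on hypothetical Sentinel-2 reflectance tiles is shown below, with commonly used (but not universal) severity break-points as assumed thresholds.

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance arrays."""
    return (nir - swir) / (nir + swir + 1e-9)

# Hypothetical pre- and post-fire Sentinel-2 reflectance tiles
# (band 8 for NIR and band 12 for SWIR would be a typical choice).
rng = np.random.default_rng(0)
nir_pre = rng.uniform(0.3, 0.5, (100, 100))
swir_pre = rng.uniform(0.1, 0.2, (100, 100))
nir_post, swir_post = nir_pre * 0.6, swir_pre * 1.8   # burnt pixels darken in NIR, brighten in SWIR

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

# Common severity break-points; these thresholds are assumptions, not the study's.
severity = np.digitize(dnbr, [0.1, 0.27, 0.44, 0.66])
print("pixels per severity class:", np.bincount(severity.ravel(), minlength=5))
```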
Procedia PDF Downloads 56
2420 Detection and Quantification of Ochratoxin A in Food by Aptasensor
Authors: Moez Elsaadani, Noel Durand, Brice Sorli, Didier Montet
Abstract:
Governments and international instances are trying to improve the food safety system to prevent, reduce or avoid the increase of food borne diseases. This food risk is one of the major concerns for the humanity. The contamination by mycotoxins is a threat to the health and life of humans and animals. One of the most common mycotoxin contaminating feed and foodstuffs is Ochratoxin A (OTA), which is a secondary metabolite, produced by Aspergillus and Penicillium strains. OTA has a chronic toxic effect and proved to be mutagenic, nephrotoxic, teratogenic, immunosuppressive, and carcinogenic. On the other side, because of their high stability, specificity, affinity, and their easy chemical synthesis, aptamer based methods are applied to OTA biosensing as alternative to traditional analytical technique. In this work, five aptamers have been tested to confirm qualitatively and quantitatively their binding with OTA. In the same time, three different analytical methods were tested and compared based on their ability to detect and quantify the OTA. The best protocol that was established to quantify free OTA from linked OTA involved an ultrafiltration method in green coffee solution with. OTA was quantified by HPLC-FLD to calculate the binding percentage of all five aptamers. One aptamer (The most effective with 87% binding with OTA) has been selected to be our biorecognition element to study its electrical response (variation of electrical properties) in the presence of OTA in order to be able to make a pairing with a radio frequency identification (RFID). This device, which is characterized by its low cost, speed, and a simple wireless information transmission, will implement the knowledge on the mycotoxins molecular sensors (aptamers), an electronic device that will link the information, the quantification and make it available to operators.Keywords: aptamer, aptasensor, detection, Ochratoxin A
Procedia PDF Downloads 183
2419 Tree Species Classification Using Effective Features of Polarimetric SAR and Hyperspectral Images
Authors: Milad Vahidi, Mahmod R. Sahebi, Mehrnoosh Omati, Reza Mohammadi
Abstract:
Forest management organizations need information to perform their work effectively. Remote sensing is an effective method to acquire information about the Earth. Two datasets of remote sensing images were used to classify forested regions. Firstly, all extractable features were derived from the hyperspectral and PolSAR images. The optical features were spectral indexes related to chemical and water contents, structural indexes, effective bands, and absorption features. The PolSAR features were the original data, target decomposition components, and SAR discriminator features. Secondly, particle swarm optimization (PSO) and genetic algorithms (GA) were applied to select the optimal features. The support vector machine (SVM) classifier was then used to classify the image. The results showed that the combination of PSO and SVM had higher overall accuracy than the other cases, providing an overall accuracy of about 90.56%. The effective features were the spectral indexes, the bands in the shortwave infrared (SWIR) and visible ranges, and certain PolSAR features.
Keywords: hyperspectral, PolSAR, feature selection, SVM
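The wrapper idea behind the PSO/GA feature selection described above (scoring each candidate feature subset by the cross-validated accuracy of an SVM) can be sketched as below; for brevity, a plain random-subset search stands in for the PSO and GA search strategies, and the data are synthetic.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# Synthetic stand-in for stacked PolSAR + hyperspectral features per pixel.
X, y = make_classification(n_samples=600, n_features=30, n_informative=8, random_state=0)

def fitness(mask):
    """Wrapper fitness: cross-validated SVM accuracy on the selected features."""
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X[:, mask], y, cv=3).mean()

# Random-subset search used here as a compact stand-in for the PSO/GA search.
rng = np.random.default_rng(0)
best_mask, best_fit = None, -1.0
for _ in range(50):
    mask = rng.random(X.shape[1]) < 0.3      # candidate binary feature mask
    f = fitness(mask)
    if f > best_fit:
        best_mask, best_fit = mask, f

print(f"selected {best_mask.sum()} features, CV accuracy {best_fit:.3f}")
```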
Procedia PDF Downloads 419
2418 Telemedicine Services in Ophthalmology: A Review of Studies
Authors: Nasim Hashemi, Abbas Sheikhtaheri
Abstract:
Telemedicine is the use of telecommunication and information technologies to provide health care services that would often not be consistently available in distant rural communities to people at these remote areas. Teleophthalmology is a branch of telemedicine that delivers eye care through digital medical equipment and telecommunications technology. Thus, teleophthalmology can overcome geographical barriers and improve quality, access, and affordability of eye health care services. Since teleophthalmology has been widespread applied in recent years, the aim of this study was to determine the different applications of teleophthalmology in the world. To this end, three bibliographic databases (Medline, ScienceDirect, Scopus) were comprehensively searched with these keywords: eye care, eye health care, primary eye care, diagnosis, detection, and screening of different eye diseases in conjunction with telemedicine, telehealth, teleophthalmology, e-services, and information technology. All types of papers were included in the study with no time restriction. We conducted the search strategies until 2015. Finally 70 articles were surveyed. We classified the results based on the’type of eye problems covered’ and ‘the type of telemedicine services’. Based on the review, from the ‘perspective of health care levels’, there are three level for eye health care as primary, secondary and tertiary eye care. From the ‘perspective of eye care services’, the main application of teleophthalmology in primary eye care was related to the diagnosis of different eye diseases such as diabetic retinopathy, macular edema, strabismus and aged related macular degeneration. The main application of teleophthalmology in secondary and tertiary eye care was related to the screening of eye problems i.e. diabetic retinopathy, astigmatism, glaucoma screening. Teleconsultation between health care providers and ophthalmologists and also education and training sessions for patients were other types of teleophthalmology in world. Real time, store–forward and hybrid methods were the main forms of the communication from the perspective of ‘teleophthalmology mode’ which is used based on IT infrastructure between sending and receiving centers. In aspect of specialists, early detection of serious aged-related ophthalmic disease in population, screening of eye disease processes, consultation in an emergency cases and comprehensive eye examination were the most important benefits of teleophthalmology. Cost-effectiveness of teleophthalmology projects resulted from reducing transportation and accommodation cost, access to affordable eye care services and receiving specialist opinions were also the main advantages of teleophthalmology for patients. Teleophthalmology brings valuable secondary and tertiary care to remote areas. So, applying teleophthalmology for detection, treatment and screening purposes and expanding its use in new applications such as eye surgery will be a key tool to promote public health and integrating eye care to primary health care.Keywords: applications, telehealth, telemedicine, teleophthalmology
Procedia PDF Downloads 375
2417 Event Data Representation Based on Time Stamp for Pedestrian Detection
Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita
Abstract:
In association with the wave of electric vehicles (EV), low energy consumption systems have become more and more important. One of the key technologies to realize low energy consumption is a dynamic vision sensor (DVS), or we can call it an event sensor, neuromorphic vision sensor and so on. This sensor has several features, such as high temporal resolution, which can achieve 1 Mframe/s, and a high dynamic range (120 DB). However, the point that can contribute to low energy consumption the most is its sparsity; to be more specific, this sensor only captures the pixels that have intensity change. In other words, there is no signal in the area that does not have any intensity change. That is to say, this sensor is more energy efficient than conventional sensors such as RGB cameras because we can remove redundant data. On the other side of the advantages, it is difficult to handle the data because the data format is completely different from RGB image; for example, acquired signals are asynchronous and sparse, and each signal is composed of x-y coordinate, polarity (two values: +1 or -1) and time stamp, it does not include intensity such as RGB values. Therefore, as we cannot use existing algorithms straightforwardly, we have to design a new processing algorithm to cope with DVS data. In order to solve difficulties caused by data format differences, most of the prior arts make a frame data and feed it to deep learning such as Convolutional Neural Networks (CNN) for object detection and recognition purposes. However, even though we can feed the data, it is still difficult to achieve good performance due to a lack of intensity information. Although polarity is often used as intensity instead of RGB pixel value, it is apparent that polarity information is not rich enough. Considering this context, we proposed to use the timestamp information as a data representation that is fed to deep learning. Concretely, at first, we also make frame data divided by a certain time period, then give intensity value in response to the timestamp in each frame; for example, a high value is given on a recent signal. We expected that this data representation could capture the features, especially of moving objects, because timestamp represents the movement direction and speed. By using this proposal method, we made our own dataset by DVS fixed on a parked car to develop an application for a surveillance system that can detect persons around the car. We think DVS is one of the ideal sensors for surveillance purposes because this sensor can run for a long time with low energy consumption in a NOT dynamic situation. For comparison purposes, we reproduced state of the art method as a benchmark, which makes frames the same as us and feeds polarity information to CNN. Then, we measured the object detection performances of the benchmark and ours on the same dataset. As a result, our method achieved a maximum of 7 points greater than the benchmark in the F1 score.Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption
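A minimal sketch of the proposed timestamp-based representation is given below: events inside a frame window are rendered so that pixel intensity decays with event age, so recent motion appears brightest. The window length, decay constant and event stream are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def timestamp_frame(events, shape, window_s=0.033, decay_s=0.010):
    """Build one frame whose pixel intensity encodes event recency.

    events: structured array of (x, y, t, p); polarity p is ignored here
    because the timestamp itself carries the motion cue. Events are assumed
    to be time-ordered, so the latest event at a pixel sets its value.
    """
    frame = np.zeros(shape, dtype=np.float32)
    t_end = events["t"].max()
    recent = events[events["t"] > t_end - window_s]
    # Exponential decay: high intensity is given to the most recent events.
    frame[recent["y"], recent["x"]] = np.exp(-(t_end - recent["t"]) / decay_s)
    return frame

# Tiny synthetic event stream (x, y, t, p) standing in for DVS output.
dtype = [("x", int), ("y", int), ("t", float), ("p", int)]
events = np.array([(10, 5, 0.001, 1), (11, 5, 0.020, -1), (12, 5, 0.032, 1)], dtype=dtype)
print(timestamp_frame(events, shape=(32, 32)).max())
```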
Procedia PDF Downloads 101
2416 Theoretical Modeling of Mechanical Properties of Eco-Friendly Composites Derived from Sugar Palm
Authors: J. Sahari, S. M. Sapuan
Abstract:
Eco-friendly composites have been successfully prepared using the sugar palm tree as a source. The effect of fibre content on the mechanical properties of sugar palm fibre/sugar palm starch (SPF/SPS) biocomposites was studied, and the experimental tensile properties (tensile strength and modulus) of the biocomposites were compared with existing theories of reinforcement. The biocomposites were prepared with different amounts of fibres (i.e., 10%, 20% and 30% by weight). The mechanical properties of plasticized SPS improved with the incorporation of fibres. Both approaches (experimental and theoretical) show that the Young's modulus of the biocomposites consistently increases when sugar palm fibre (SPF) is placed into the sugar palm starch (SPS) matrix. A surface morphological study by scanning electron microscopy showed a homogeneous distribution of fibres and matrix with good adhesion, which plays an important role in improving the mechanical properties of biocomposites. The observed deviations between the experimental and theoretical values are explained by the simplifying model assumptions applied to the configuration of the composites, in particular the sugar palm starch composites.
Keywords: eco-friendly, biocomposite, mechanical, experimental, theoretical
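The abstract compares measured moduli against existing theories of reinforcement without naming a specific model; purely as an assumed example of such a theory, the classical rule of mixtures is sketched below. The symbols and numbers (E for modulus, fibre fractions treated as volume fractions for simplicity) are illustrative and are not the authors' data.

```python
def rule_of_mixtures(E_fibre, E_matrix, V_fibre):
    """Upper-bound (Voigt) estimate of composite modulus: E_c = Vf*Ef + (1-Vf)*Em."""
    return V_fibre * E_fibre + (1.0 - V_fibre) * E_matrix

# Illustrative values only (not measurements from the paper).
E_fibre, E_matrix = 3.0e9, 0.1e9            # Pa
for vf in (0.10, 0.20, 0.30):               # fractions mirroring the 10/20/30 wt% mixes
    E_c = rule_of_mixtures(E_fibre, E_matrix, vf)
    print(f"V_f={vf:.2f}: E_c = {E_c / 1e9:.2f} GPa")
```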
Procedia PDF Downloads 445
2415 Transdisciplinary Methodological Innovation: Connecting Natural and Social Sciences Research through a Training Toolbox
Authors: Jessica M. Black
Abstract:
Although much of natural and social science research aims to enhance human flourishing and address social problems, the training within the two fields is significantly different across theory, methodology, and implementation of results. Social scientists are trained in social, psychological, and to the extent that it is relevant to their discipline, spiritual development, theory, and accompanying methodologies. They tend not to receive training or learn about accompanying methodology related to interrogating human development and social problems from a biological perspective. On the other hand, those in the natural sciences, and for the purpose of this work, human biological sciences specifically – biology, neuroscience, genetics, epigenetics, and physiology – are often trained first to consider cellular development and related methodologies, and may not have opportunity to receive formal training in many of the foundational principles that guide human development, such as systems theory or person-in-environment framework, methodology related to tapping both proximal and distal psycho-social-spiritual influences on human development, and foundational principles of equity, justice and inclusion in research design. There is a need for disciplines heretofore siloed to know one another, to receive streamlined, easy to access training in theory and methods from one another and to learn how to build interdisciplinary teams that can speak and act upon a shared research language. Team science is more essential than ever, as are transdisciplinary approaches to training and research design. This study explores the use of a methodological toolbox that natural and social scientists can use by employing a decision-making tree regarding project aims, costs, and participants, among other important study variables. The decision tree begins with a decision about whether the researcher wants to learn more about social sciences approaches or biological approaches to study design. The toolbox and platform are flexible, such that users could also choose among modules, for instance, reviewing epigenetics or community-based participatory research even if those are aspects already a part of their home field. To start, both natural and social scientists would receive training on systems science, team science, transdisciplinary approaches, and translational science. Next, social scientists would receive training on grounding biological theory and the following methodological approaches and tools: physiology, (epi)genetics, non-invasive neuroimaging, invasive neuroimaging, endocrinology, and the gut-brain connection. Natural scientists would receive training on grounding social science theory, and measurement including variables, assessment and surveys on human development as related to the developing person (e.g., temperament and identity), microsystems (e.g., systems that directly interact with the person such as family and peers), mesosystems (e.g., systems that interact with one another but do not directly interact with the individual person, such as parent and teacher relationships with one another), exosystems (e.g., spaces and settings that may come back to affect the individual person, such as a parent’s work environment, but within which the individual does not directly interact, macrosystems (e.g., wider culture and policy), and the chronosystem (e.g., historical time, such as the generational impact of trauma). 
Participants will be able to engage with the toolbox and one another to foster increased transdisciplinary work.
Keywords: methodology, natural science, social science, transdisciplinary
Procedia PDF Downloads 117
2414 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel
Authors: F. M. Pisano, M. Ciminello
Abstract:
Aerospace, mechanical, and civil engineering infrastructures can take advantages from damage detection and identification strategies in terms of maintenance cost reduction and operational life improvements, as well for safety scopes. The challenge is to detect so called “barely visible impact damage” (BVID), due to low/medium energy impacts, that can progressively compromise the structure integrity. The occurrence of any local change in material properties, that can degrade the structure performance, is to be monitored using so called Structural Health Monitoring (SHM) systems, in charge of comparing the structure states before and after damage occurs. SHM seeks for any "anomalous" response collected by means of sensor networks and then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for information extraction about the current health conditions of the structure under investigation. Visual Analytics can support the processing of monitored measurements offering data navigation and exploration tools leveraging the native human capabilities of understanding images faster than texts and tables. Herein, a SHM system enrichment by integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structure health conditions examined by a Principal Component Analysis based algorithm.Keywords: interactive dashboards, optical fibers, structural health monitoring, visual analytics
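The abstract only states that the damage-detection algorithm is Principal Component Analysis based; one common way such an algorithm works, flagging records whose reconstruction error against a healthy-state PCA subspace exceeds a baseline threshold, is sketched below with synthetic sensor data as an assumption.

```python
import numpy as np
from sklearn.decomposition import PCA

# Rows = monitored snapshots, columns = strain channels (e.g. optical-fibre sensors).
rng = np.random.default_rng(0)
baseline = rng.normal(size=(200, 12))                  # healthy-state records
test = np.vstack([rng.normal(size=(20, 12)),
                  rng.normal(size=(5, 12)) + 3.0])     # last 5 rows mimic damage

pca = PCA(n_components=4).fit(baseline)

def reconstruction_error(X):
    """Distance between each record and its projection onto the healthy subspace."""
    return np.linalg.norm(X - pca.inverse_transform(pca.transform(X)), axis=1)

threshold = np.percentile(reconstruction_error(baseline), 99)
outliers = np.where(reconstruction_error(test) > threshold)[0]
print("snapshots flagged as anomalous:", outliers)
```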
Procedia PDF Downloads 125
2413 On an Approach for Rule Generation in Association Rule Mining
Authors: B. Chandra
Abstract:
In Association Rule Mining, much attention has been paid to developing algorithms for large (frequent/closed/maximal) itemsets, but very little attention has been paid to improving the performance of rule generation algorithms. Rule generation is an important part of Association Rule Mining. In this paper, a novel approach named NARG (Association Rule using Antecedent Support) is proposed for rule generation; it uses a memory-resident data structure named FCET (Frequent Closed Enumeration Tree) to find frequent/closed itemsets. In addition, the computational speed of NARG is enhanced by giving importance to the rules that have lower antecedent support. A comparative performance evaluation of NARG against fast association rule mining algorithms for rule generation has been carried out on synthetic datasets and real-life datasets (taken from the UCI Machine Learning Repository). The performance analysis shows that NARG is computationally faster than the existing algorithms for rule generation.
Keywords: knowledge discovery, association rule mining, antecedent support, rule generation
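The sketch below is not the authors' NARG/FCET implementation; it only illustrates the core observation, that for a fixed itemset support a lower antecedent support yields a higher confidence (confidence = support(itemset) / support(antecedent)), using a toy support table.

```python
from itertools import combinations

# Toy support table for frequent itemsets (fraction of transactions); illustrative only.
support = {
    frozenset("A"): 0.60, frozenset("B"): 0.50, frozenset("C"): 0.35,
    frozenset("AB"): 0.40, frozenset("AC"): 0.30, frozenset("BC"): 0.28,
    frozenset("ABC"): 0.25,
}

def rules_from_itemset(itemset, min_conf=0.7):
    """Generate rules X -> (itemset minus X), trying low-support antecedents first,
    since a small antecedent support is the quickest route to high confidence."""
    items = frozenset(itemset)
    antecedents = [frozenset(c) for r in range(1, len(items))
                   for c in combinations(items, r)]
    antecedents.sort(key=lambda a: support[a])   # lower antecedent support first
    for a in antecedents:
        conf = support[items] / support[a]
        if conf >= min_conf:
            yield set(a), set(items - a), conf

for lhs, rhs, conf in rules_from_itemset("ABC"):
    print(f"{lhs} -> {rhs}  confidence={conf:.2f}")
```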
Procedia PDF Downloads 326
2412 Non-Revenue Water Management in Palestine
Authors: Samah Jawad Jabari
Abstract:
Water is the most important and valuable resource not only for human life but also for all living things on the planet. The water supply utilities should fulfill the water requirement quantitatively and qualitatively. Drinking water systems are exposed to both natural (hurricanes and flood) and manmade hazards (risks) that are common in Palestine. Non-Revenue Water (NRW) is a manmade risk which remains a major concern in Palestine, as the NRW levels are estimated to be at a high level. In this research, Hebron city water distribution network was taken as a case study to estimate and audit the NRW levels. The research also investigated the state of the existing water distribution system in the study area by investigating the water losses and obtained more information on NRW prevention and management practices. Data and information have been collected from the Palestinian Water Authority (PWA) and Hebron Municipality (HM) archive. In addition to that, a questionnaire has been designed and administered by the researcher in order to collect the necessary data for water auditing. The questionnaire also assessed the views of stakeholder in PWA and HM (staff) on the current status of the NRW in the Hebron water distribution system. The important result obtained by this research shows that NRW in Hebron city was high and in excess of 30%. The main factors that contribute to NRW were the inaccuracies in billing volumes, unauthorized consumption, and the method of estimating consumptions through faulty meters. Policy for NRW reduction is available in Palestine; however, it is clear that the number of qualified staff available to carry out the activities related to leak detection is low, and that there is a lack of appropriate technologies to reduce water losses and undertake sufficient system maintenance, which needs to be improved to enhance the performance of the network and decrease the level of NRW losses.Keywords: non-revenue water, water auditing, leak detection, water meters
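The arithmetic behind an NRW figure such as "in excess of 30%" follows the standard water-balance definition, NRW = (system input volume - billed authorized consumption) / system input volume; the sketch below uses made-up volumes, not Hebron data.

```python
def nrw_percentage(system_input, billed_authorized):
    """Non-Revenue Water: the share of system input that earns no revenue."""
    return 100.0 * (system_input - billed_authorized) / system_input

# Illustrative annual volumes in cubic metres (not actual Hebron figures).
system_input = 10_000_000
billed_authorized = 6_800_000   # billed metered + billed unmetered consumption
print(f"NRW = {nrw_percentage(system_input, billed_authorized):.1f}%")  # 32.0%
```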
Procedia PDF Downloads 300
2411 Non-Destructive Technique for Detection of Voids in the IC Package Using Terahertz-Time Domain Spectrometer
Authors: Sung-Hyeon Park, Jin-Wook Jang, Hak-Sung Kim
Abstract:
In recent years, Terahertz (THz) time-domain spectroscopy (TDS) imaging method has been received considerable interest as a promising non-destructive technique for detection of internal defects. In comparison to other non-destructive techniques such as x-ray inspection method, scanning acoustic tomograph (SAT) and microwave inspection method, THz-TDS imaging method has many advantages: First, it can measure the exact thickness and location of defects. Second, it doesn’t require the liquid couplant while it is very crucial to deliver that power of ultrasonic wave in SAT method. Third, it didn’t damage to materials and be harmful to human bodies while x-ray inspection method does. Finally, it exhibits better spatial resolution than microwave inspection method. However, this technology couldn’t be applied to IC package because THz radiation can penetrate through a wide variety of materials including polymers and ceramics except of metals. Therefore, it is difficult to detect the defects in IC package which are composed of not only epoxy and semiconductor materials but also various metals such as copper, aluminum and gold. In this work, we proposed a special method for detecting the void in the IC package using THz-TDS imaging system. The IC package specimens for this study are prepared by Packaging Engineering Team in Samsung Electronics. Our THz-TDS imaging system has a special reflection mode called pitch-catch mode which can change the incidence angle in the reflection mode from 10 o to 70 o while the others have transmission and the normal reflection mode or the reflection mode fixed at certain angle. Therefore, to find the voids in the IC package, we investigated the appropriate angle as changing the incidence angle of THz wave emitter and detector. As the results, the voids in the IC packages were successfully detected using our THz-TDS imaging system.Keywords: terahertz, non-destructive technique, void, IC package
Procedia PDF Downloads 474
2410 Efficient Recommendation System for Frequent and High Utility Itemsets over Incremental Datasets
Authors: J. K. Kavitha, D. Manjula, U. Kanimozhi
Abstract:
Mining frequent and high utility itemsets has gained much significance in recent years. When data arrive sporadically, incremental and interactive rule mining and utility mining approaches can be adopted to handle users' dynamic environmental needs and avoid redundancies by reusing previous data structures and mining results. The dependence on recommendation systems has risen exponentially since the advent of search engines. This paper proposes a model for building a recommendation system that suggests frequent and high utility itemsets over dynamic datasets for a cluster-based location prediction strategy to predict users' trajectories, using the Efficient Incremental Rule Mining (EIRM) algorithm and the Fast Update Utility Pattern Tree (FUUP) algorithm. Comprehensive experimental evaluations show that this scheme delivers excellent performance.
Keywords: data sets, recommendation system, utility item sets, frequent item sets mining
Procedia PDF Downloads 295
2409 Problems and Solutions in the Application of ICP-MS for Analysis of Trace Elements in Various Samples
Authors: Béla Kovács, Éva Bódi, Farzaneh Garousi, Szilvia Várallyay, Áron Soós, Xénia Vágó, Dávid Andrási
Abstract:
In agriculture for analysis of elements in different food and food raw materials, moreover environmental samples generally flame atomic absorption spectrometers (FAAS), graphite furnace atomic absorption spectrometers (GF-AAS), inductively coupled plasma optical emission spectrometers (ICP-OES) and inductively coupled plasma mass spectrometers (ICP-MS) are routinely applied. An inductively coupled plasma mass spectrometer (ICP-MS) is capable for analysis of 70-80 elements in multielemental mode, from 1-5 cm3 volume of a sample, moreover the detection limits of elements are in µg/kg-ng/kg (ppb-ppt) concentration range. All the analytical instruments have different physical and chemical interfering effects analysing the above types of samples. The smaller the concentration of an analyte and the larger the concentration of the matrix the larger the interfering effects. Nowadays there is very important to analyse growingly smaller concentrations of elements. From the above analytical instruments generally the inductively coupled plasma mass spectrometer is capable of analysing the smallest concentration of elements. The applied ICP-MS instrument has Collision Cell Technology (CCT) also. Using CCT mode certain elements have better (smaller) detection limits with 1-3 magnitudes comparing to a normal ICP-MS analytical method. The CCT mode has better detection limits mainly for analysis of selenium, arsenic, germanium, vanadium and chromium. To elaborate an analytical method for trace elements with an inductively coupled plasma mass spectrometer the most important interfering effects (problems) were evaluated: 1) Physical interferences; 2) Spectral interferences (elemental and molecular isobaric); 3) Effect of easily ionisable elements; 4) Memory interferences. Analysing food and food raw materials, moreover environmental samples an other (new) interfering effect emerged in ICP-MS, namely the effect of various matrixes having different evaporation and nebulization effectiveness, moreover having different quantity of carbon content of food and food raw materials, moreover environmental samples. In our research work the effect of different water-soluble compounds furthermore the effect of various quantity of carbon content (as sample matrix) were examined on changes of intensity of the applied elements. So finally we could find “opportunities” to decrease or eliminate the error of the analyses of applied elements (Cr, Co, Ni, Cu, Zn, Ge, As, Se, Mo, Cd, Sn, Sb, Te, Hg, Pb, Bi). To analyse these elements in the above samples, the most appropriate inductively coupled plasma mass spectrometer is a quadrupole instrument applying a collision cell technique (CCT). The extent of interfering effect of carbon content depends on the type of compounds. The carbon content significantly affects the measured concentration (intensities) of the above elements, which can be corrected using different internal standards.Keywords: elements, environmental and food samples, ICP-MS, interference effects
Procedia PDF Downloads 5042408 Using Machine Learning to Predict Answers to Big-Five Personality Questions
Authors: Aadityaa Singla
Abstract:
The big five personality traits are as follows: openness, conscientiousness, extraversion, agreeableness, and neuroticism. In order to get insight into their personality, many people turn to these categories, each of which has a different meaning and set of characteristics. This information is important not only to individuals but also to career professionals and psychologists, who can use it for candidate assessment or job recruitment. The links between AI and psychology have been well studied in cognitive science, but this application is still a rather novel development. It is possible for various AI classification models to accurately predict the answer to a personality question from ten input questions. This contrasts with the hundred questions a person normally has to answer to gain a complete picture of their five personality traits. To approach this problem, various AI classification models were trained on a dataset to predict what a user would answer, and each model's prediction was compared to the user's actual response. Normally there are five answer choices (a 20% chance of a correct guess), and the models exceed that value to different degrees, proving their significance. Using an MLP classifier, a decision tree, a linear model, and K-nearest neighbors, test accuracies of 86.643%, 54.625%, 47.875%, and 52.125% were obtained, respectively. These approaches show that there is potential for more nuanced personality predictions in the future.Keywords: machine learning, personality, big five personality traits, cognitive science
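A minimal sketch of the modelling setup described above follows: several scikit-learn classifiers are trained to predict a user's answer to one questionnaire item from ten other answers and compared against the 20% chance baseline. The CSV file name, column names, and hyperparameters are hypothetical assumptions, not the study's actual dataset or configuration.

```python
# Minimal sketch: predict the answer to one Big-Five item from ten other items
# using several classifiers, then compare test accuracy to the 20% chance level.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

df = pd.read_csv("big_five_responses.csv")          # hypothetical file
X = df[[f"Q{i}" for i in range(1, 11)]]             # ten input items (assumed column names)
y = df["Q11"]                                       # target item, answers coded 1-5

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)

models = {
    "MLP": MLPClassifier(max_iter=500, random_state=0),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Linear model (logistic regression)": LogisticRegression(max_iter=1000),
    "K-nearest neighbors": KNeighborsClassifier(n_neighbors=5),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    # A naive baseline for a 5-choice item is 20% accuracy.
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```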
Procedia PDF Downloads 1472407 South African Breast Cancer Mutation Spectrum: Pitfalls to Copy Number Variation Detection Using Internationally Designed Multiplex Ligation-Dependent Probe Amplification and Next Generation Sequencing Panels
Authors: Jaco Oosthuizen, Nerina C. Van Der Merwe
Abstract:
The National Health Laboratory Services in Bloemfontein has been the diagnostic testing facility for familial breast cancer since 1997 and has tested 1830 patients. From this cohort, 540 were comprehensively screened using High-Resolution Melting Analysis or Next Generation Sequencing for the presence of point mutations and/or indels. Approximately 90% of these patients still remain undiagnosed, as they are BRCA1/2 negative. Multiplex ligation-dependent probe amplification was initially added to screen for copy number variation, but with the introduction of next generation sequencing in 2017 it was superseded and is currently used as a confirmation assay. The aim was to investigate the viability of utilizing internationally designed copy number variation detection assays, based mostly on European/Caucasian genomic data, within a South African context. The multiplex ligation-dependent probe amplification technique is based on the hybridization and subsequent ligation of multiple probes to a targeted exon. The ligated probes are amplified using conventional polymerase chain reaction, followed by fragment analysis by means of capillary electrophoresis. The experimental design of the assay was performed according to the guidelines of MRC-Holland. For BRCA1 (P002-D1) and BRCA2 (P045-B3), both multiplex assays were validated, and results were confirmed using a secondary probe set for each gene. The next generation sequencing technique is based on target amplification via multiplex polymerase chain reaction, after which the amplicons are sequenced in parallel on a semiconductor chip. Amplified read counts are visualized as relative copy numbers to determine the median of the absolute values of all pairwise differences. Various experimental parameters such as DNA quality, quantity, and signal intensity or read depth were verified using positive and negative patients previously tested internationally. DNA quality and quantity proved to be the critical factors during the verification of both assays. The quantity influenced the relative copy number frequency directly, whereas the quality of the DNA and its salt concentration influenced denaturation consistency in both assays. Multiplex ligation-dependent probe amplification produced false positives due to ligation failure when a variant was present within the ligation site. Next generation sequencing produced false positives due to read dropout when primer sequences did not meet optimal multiplex binding kinetics because of population variants in the primer binding site. The analytical sensitivity and specificity for the South African population have been proven. Verification resulted in repeatable reactions with regard to the detection of relative copy number differences. Both the multiplex ligation-dependent probe amplification and next generation sequencing multiplex panels need to be optimized to accommodate South African polymorphisms present within the country's genetically diverse ethnic groups, to reduce the false-positive copy number variation rate and increase performance efficiency.Keywords: familial breast cancer, multiplex ligation-dependent probe amplification, next generation sequencing, South Africa
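A minimal sketch of the read-count step described above is given below: amplicon read counts are converted to relative copy numbers and the median of the absolute values of all pairwise differences is computed as a noise metric. The read counts are invented for illustration, and the study's actual normalization pipeline is not reproduced.

```python
# Minimal sketch: amplicon read counts -> relative copy numbers -> median of the
# absolute values of all pairwise differences (a common noise metric for
# amplicon-based CNV calling). Counts below are invented for illustration.
import numpy as np

def relative_copy_number(sample_counts, reference_counts):
    """Normalize each amplicon's reads by total depth, then express the sample
    relative to a normal (two-copy) reference; ~1.0 suggests two copies,
    ~0.5 a heterozygous deletion, ~1.5 a duplication."""
    sample = np.asarray(sample_counts, dtype=float)
    reference = np.asarray(reference_counts, dtype=float)
    return (sample / sample.sum()) / (reference / reference.sum())

def median_abs_pairwise_diff(values):
    """Median of |x_i - x_j| over all amplicon pairs; high values flag noisy runs."""
    values = np.asarray(values, dtype=float)
    diffs = np.abs(values[:, None] - values[None, :])
    return float(np.median(diffs[np.triu_indices(len(values), k=1)]))

# Example with invented read counts for five amplicons.
sample_reads = [510, 260, 495, 505, 250]       # two amplicons with reduced depth
normal_reads = [500, 480, 510, 495, 515]
rcn = relative_copy_number(sample_reads, normal_reads)
print(np.round(rcn, 2), "pairwise-diff median:", round(median_abs_pairwise_diff(rcn), 2))
```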
Procedia PDF Downloads 2332406 Coagulase Negative Staphylococci: Phenotypic Characterization and Antimicrobial Susceptibility Pattern
Authors: Lok Bahadur Shrestha, Narayan Raj Bhattarai, Basudha Khanal
Abstract:
Introduction: Coagulase-negative staphylococci (CoNS) are normal commensals of human skin and mucous membranes. The study was carried out to determine the prevalence of CoNS among clinical isolates, to characterize them to species level, and to compare three conventional methods for the detection of biofilm formation. Objectives: to characterize the clinically significant coagulase-negative staphylococci to species level, to compare three phenotypic methods for the detection of biofilm formation, and to study the antimicrobial susceptibility pattern of the isolates. Methods: CoNS isolates were obtained from various clinical samples over a period of 1 year. Characterization to species level was done using biochemical tests, and biofilm formation was studied by the tube adherence, Congo red agar, and tissue culture plate methods. Results: Among 71 CoNS isolates, seven species were identified. S. epidermidis was the most common species, followed by S. saprophyticus and S. haemolyticus. Antimicrobial susceptibility testing of CoNS documented 90% resistance to ampicillin. Resistance to cefoxitin and ceftriaxone was observed in 55% of the isolates. We detected biofilm formation in 71.8% of isolates. The sensitivity of the tube adherence method was 82%, while that of the Congo red agar method was 78%. Conclusion: Among the 71 CoNS isolates, S. epidermidis was the most common, followed by S. saprophyticus and S. haemolyticus. Biofilm formation was detected in 71.8% of the isolates. All of the methods were effective at detecting biofilm-producing CoNS strains. Biofilm-forming strains are more resistant to antibiotics than biofilm non-formers.Keywords: CoNS, Congo red agar, bloodstream infections, foreign body-related infections, tissue culture plate
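For clarity, the sketch below shows how the sensitivity figures quoted above are typically derived against a reference method (by convention the tissue culture plate method). The 2x2 counts used here are hypothetical illustrations only; the study's raw counts are not reported in the abstract.

```python
# Minimal sketch of sensitivity/specificity against a reference biofilm method.
# The counts are hypothetical; only the formulas reflect the standard definitions.
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical example: 50 biofilm producers by the reference method, of which
# the tube adherence method detects 41.
sens, spec = sensitivity_specificity(tp=41, fn=9, tn=18, fp=3)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```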
Procedia PDF Downloads 2002405 A Visualization Classification Method for Identifying the Decayed Citrus Fruit Infected by Fungi Based on Hyperspectral Imaging
Authors: Jiangbo Li, Wenqian Huang
Abstract:
Early detection of fungal infection in citrus fruit is one of the major problems in the postharvest commercialization process. The automatic and nondestructive detection of infected fruits is still a challenge for the citrus industry. At present, the visual inspection of rotten citrus fruits is commonly performed by workers, either through ultraviolet-induced fluorescence technology or by manual sorting in citrus packinghouses, to remove fruits with fungal infection. However, the former entails a number of problems because exposing people to this kind of lighting is potentially hazardous to human health, and the latter is very inefficient. Oranges were used as the research object. This study focuses on this problem and proposes an effective method based on Vis-NIR hyperspectral imaging in the wavelength range of 400-1000 nm with a spectroscopic resolution of 2.8 nm. In this work, three normalization approaches were applied prior to analysis to reduce the effect of sample curvature on spectral profiles, and mean normalization was found to be the most effective pretreatment for decreasing spectral variability due to curvature. Then, principal component analysis (PCA) was applied to a dataset composed of average spectra from decayed and normal tissue to reduce the dimensionality of the data and to observe the ability of Vis-NIR hyperspectra to discriminate between the two classes. It was observed that normal and decayed spectra were separable along the resulting first principal component (PC1) axis. Subsequently, five wavelengths (bands) centered at 577, 702, 751, 808, and 923 nm were selected as the characteristic wavelengths by analyzing the loadings of PC1. A multispectral combination image was generated from the five selected characteristic wavelength images. Based on this multispectral combination image, the intensity-slicing pseudocolor image processing method was used to generate a 2-D visual classification image that enhances the contrast between normal and decayed tissue. Finally, an image segmentation algorithm for the detection of decayed fruit was developed based on the pseudocolor image coupled with a simple thresholding method. For the 238 independent test samples, comprising fruits infected by Penicillium digitatum and normal fruits, the total success rates were 100% and 97.5%, respectively; the proposed algorithm was also used to identify oranges infected by Penicillium italicum with 100% identification accuracy, indicating that the proposed multispectral algorithm is effective and has the potential to be applied in the citrus industry.Keywords: citrus fruit, early rotten, fungal infection, hyperspectral imaging
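A minimal sketch of the band-selection step described above follows: mean-normalize the pixel spectra, run PCA, and pick the characteristic wavelengths where the PC1 loadings have the largest magnitude. The hypercube shape, wavelength grid, and number of selected bands are assumptions for illustration (random data stands in for real ROI spectra, so the selected bands will not match those reported), and the intensity-slicing/thresholding step that builds the final pseudocolor classification image is not reproduced.

```python
# Minimal sketch of PC1-loading-based band selection on mean-normalized spectra.
import numpy as np
from sklearn.decomposition import PCA

n_pixels, n_bands = 500, 215                      # assumed sizes
wavelengths = np.linspace(400, 1000, n_bands)     # nm, matching the 400-1000 nm range
spectra = np.random.rand(n_pixels, n_bands)       # stand-in for real ROI spectra

# Mean normalization: divide each pixel spectrum by its own mean to suppress
# brightness differences caused by fruit curvature.
spectra_norm = spectra / spectra.mean(axis=1, keepdims=True)

pca = PCA(n_components=3)
pca.fit(spectra_norm)
pc1_loadings = pca.components_[0]

# Select the five bands with the largest absolute PC1 loadings.
selected = np.argsort(np.abs(pc1_loadings))[-5:]
print("characteristic wavelengths (nm):", np.sort(wavelengths[selected]).round(0))
```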
Procedia PDF Downloads 3042404 Formation Control for Linear Multi-Robot System with Switched Directed Topology and Time-Varying Delays
Authors: Yaxiao Zhang, Yangzhou Chen
Abstract:
This study investigates the formation problem for high-order continuous-time multi-robot systems with a bounded, symmetric, time-varying delay protocol under switched directed communication topologies. By using a linear transformation, the formation problem is transformed into the stability analysis of a switched delay system. Under the assumption that each communication topology has a directed spanning tree, sufficient conditions are presented in terms of linear matrix inequalities (LMIs) under which the multi-robot system achieves a desired formation by trading off among the pre-existing topologies with the help of an average dwell time scheme. A numerical example is presented to illustrate the effectiveness of the obtained results.Keywords: multi-robot systems, formation, switched directed topology, symmetric time-varying delay, average dwell time, linear matrix inequalities (LMIs)
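For background, the average dwell time constraint used in this kind of switched-system analysis is typically stated as below. This is the standard textbook form with multiple Lyapunov functions, given only as orientation; it is not necessarily the exact set of conditions derived in the paper.

```latex
% Standard average dwell time condition for a switched system (background only;
% the paper's own LMI conditions are not reproduced here).
\begin{align*}
  &N_\sigma(t_0, t) \le N_0 + \frac{t - t_0}{\tau_a}, \qquad \forall\, t \ge t_0, \\
  &V_i(x) \le \mu V_j(x) \ (\mu \ge 1), \qquad \dot V_i(x(t)) \le -\lambda V_i(x(t)), \\
  &\text{stability is guaranteed whenever } \tau_a > \tau_a^{*} = \frac{\ln \mu}{\lambda}.
\end{align*}
```

Here N_σ(t_0, t) is the number of switches on the interval (t_0, t), τ_a is the average dwell time, the V_i are Lyapunov functions associated with the individual topologies, and μ bounds their mutual ratio at switching instants.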
Procedia PDF Downloads 536