Search results for: fault detection and classification
1950 Studies on Pesticide Usage Pattern and Farmers Knowledge on Pesticide Usage and Technologies in Open Field and Poly House Conditions
Authors: B. Raghu, Shashi Vemuri, Ch. Sreenivasa Rao
Abstract:
The survey on pesticide use pattern was carried out by interviewing farmers growing chilli in open fields and poly houses, using a questionnaire prepared to assess their knowledge and practices in crop cultivation and their general awareness of pesticide recommendations and use. Education levels of poly house farmers are higher than those of open field farmers: 57.14% of poly house farmers are high-school educated, whereas 35% of open field farmers are illiterate. Most farmers raise a nursery for 35 days and cultivate less than 0.5 acre under poly houses in summer and rabi, and less than 1 acre in open fields during kharif. Awareness of pesticide-related issues varies between poly house and open field farmers, with some commonality: 28.57% of poly house farmers know about recommended pesticides, while only 10% of open field farmers are aware of this issue. Although, in general, all farmers contact pesticide dealers for recommendations, poly house farmers prefer to contact scientists (35.71%) and open field farmers prefer to contact agricultural officers (33.33%). Most farmers are unaware of pesticide classification and the toxicity symbols on packaging. Farmers are aware of the endosulfan ban, but only 21.42% of poly house and 11.66% of open field farmers know about the ban of monocrotophos on vegetables. Very few farmers know about pesticide residues and related issues, but they know that washing helps to reduce contamination.
Keywords: open field, pesticide usage, polyhouses, residues survey
Procedia PDF Downloads 468
1949 Biosensors for Parathion Based on Au-Pd Nanoparticles Modified Electrodes
Authors: Tian-Fang Kang, Chao-Nan Ge, Rui Li
Abstract:
An electrochemical biosensor for the determination of organophosphorus pesticides was developed based on electrochemical co-deposition of Au and Pd nanoparticles on a glassy carbon electrode (GCE). Energy dispersive spectroscopy (EDS) was used for characterization of the surface structure. Scanning electron microscopy (SEM) demonstrates that the films are uniform and the nanoclusters are homogeneously distributed on the GCE surface. Acetylcholinesterase (AChE) was immobilized on the Au-Pd nanoparticle modified electrode (Au-Pd/GCE) by cross-linking with glutaraldehyde. The electrochemical behavior of thiocholine at the biosensor (AChE/Au-Pd/GCE) was studied. The biosensor exhibited a substantial electrocatalytic effect on the oxidation of thiocholine. The peak current of linear scan voltammetry (LSV) of thiocholine at the biosensor is proportional to the concentration of acetylthiocholine chloride (ATCl) over the range of 2.5 × 10⁻⁶ to 2.5 × 10⁻⁴ M in 0.1 M phosphate buffer solution (pH 7.0). The percent inhibition of acetylcholinesterase was proportional to the logarithm of parathion concentration in the range of 4.0 × 10⁻⁹ to 1.0 × 10⁻⁶ M. The detection limit of parathion was 2.6 × 10⁻⁹ M. The proposed method exhibited high sensitivity and good reproducibility.
Keywords: acetylcholinesterase, Au-Pd nanoparticles, electrochemical biosensors, parathion
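A minimal sketch of how the reported log-linear inhibition calibration could be used in practice, assuming illustrative (not the paper's) calibration points and a simple least-squares fit of percent inhibition against log10 concentration:

```python
import numpy as np

# Hypothetical calibration data: percent AChE inhibition measured at known
# parathion concentrations (M) spanning the reported 4.0e-9 to 1.0e-6 M range.
conc = np.array([4.0e-9, 1.0e-8, 1.0e-7, 5.0e-7, 1.0e-6])
inhibition = np.array([12.0, 22.0, 48.0, 66.0, 74.0])  # illustrative values

# Fit inhibition = a * log10(concentration) + b
a, b = np.polyfit(np.log10(conc), inhibition, deg=1)

def estimate_concentration(percent_inhibition):
    """Invert the calibration line to estimate parathion concentration (M)."""
    return 10 ** ((percent_inhibition - b) / a)

print(f"slope = {a:.2f} % per decade, intercept = {b:.2f} %")
print(f"estimated concentration at 40% inhibition: {estimate_concentration(40.0):.2e} M")
```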
Procedia PDF Downloads 407
1948 Methodology for Assessing Spatial Equity of Urban Green Space
Authors: Asna Anchalan, Anjana Bhagyanathan
Abstract:
Urban green space plays an important role in providing health (physical and mental well-being), economic, and environmental benefits for urban residents and neighborhoods. Ensuring equitable distribution of urban green space is vital for equal access to these benefits. This study develops a methodology for assessing the spatial equity of urban green spaces in the Indian context. Through a systematic literature review, the research trends, parameters, data, and tools in use are identified. Research in this domain has increased rapidly since 2020, with COVID-19 acting as a catalyst. Indian documents use varying terminologies, definitions, and classifications of urban green spaces; the terminology, definition, and classification for this study were settled after reviewing several Indian documents, master plans, and research papers. The parameters identified for assessing spatial equity are availability, proximity, accessibility, and socio-economic disparity. Criteria for evaluating each parameter were identified from diverse research papers. A research gap was identified in the absence of a comprehensive approach encompassing all four parameters. The outcome of this study is a methodology that addresses these gaps, providing a practical tool applicable across diverse Indian cities.
Keywords: urban green space, spatial equity, accessibility, proximity, methodology
Procedia PDF Downloads 58
1947 Use of Predictive Food Microbiology to Determine the Shelf-Life of Foods
Authors: Fatih Tarlak
Abstract:
Predictive microbiology is an important field of food microbiology that uses predictive models to describe microbial growth in different food products. Predictive models estimate the growth of microorganisms quickly, efficiently, and cost-effectively compared to traditional enumeration methods, which are laborious, expensive, and time-consuming. The mathematical models used in predictive microbiology are mainly categorised as primary and secondary models. Primary models are mathematical equations that describe growth data as a function of time under constant environmental conditions. Secondary models describe the effects of environmental factors, such as temperature, pH, and water activity (aw), on the parameters of the primary models, including the maximum specific growth rate and lag phase duration, which are the most critical growth kinetic parameters. The combination of primary and secondary models provides valuable information for setting limits for the quantitative detection of microbial spoilage and for assessing product shelf-life.
Keywords: shelf-life, growth model, predictive microbiology, simulation
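A minimal sketch of the primary/secondary model combination described above, assuming a modified Gompertz primary model and a Ratkowsky square-root secondary model (common choices, not named in the abstract) with illustrative parameter values:

```python
import numpy as np

def gompertz_growth(t, n0, nmax, mu_max, lag):
    """Modified Gompertz primary model: log10 counts versus time (h)."""
    A = nmax - n0
    return n0 + A * np.exp(-np.exp(mu_max * np.e / A * (lag - t) + 1))

def sqrt_secondary(temp_c, b=0.03, t_min=-2.0):
    """Ratkowsky square-root secondary model for the maximum specific growth rate."""
    return (b * (temp_c - t_min)) ** 2

# Illustrative shelf-life check: time to reach a spoilage level of 7 log10 CFU/g
temps = [4, 8, 12]                   # storage temperatures (deg C)
times = np.linspace(0, 400, 4001)    # hours
for T in temps:
    mu = sqrt_secondary(T)
    counts = gompertz_growth(times, n0=3.0, nmax=9.0, mu_max=mu, lag=20.0)
    shelf_life = times[np.argmax(counts >= 7.0)] if np.any(counts >= 7.0) else float("inf")
    print(f"{T} degC: mu_max = {mu:.3f} log10/h, predicted shelf-life ~ {shelf_life:.0f} h")
```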
Procedia PDF Downloads 211
1946 Face Tracking and Recognition Using Deep Learning Approach
Authors: Degale Desta, Cheng Jian
Abstract:
The most important factor in identifying a person is their face. Even identical twins have their own distinct faces. As a result, identification and face recognition are needed to tell one person from another. A face recognition system is a verification tool used to establish a person's identity using biometrics. Nowadays, face recognition is a common technique used in a variety of applications, including home security systems, criminal identification, and phone unlock systems. Such a system is more secure because it only requires a facial image instead of other dependencies such as a key or card. Face detection and face identification are the two phases that typically make up a human recognition system. This paper explains the idea behind designing and creating a face recognition system using deep learning with Azure ML and Python's OpenCV. Face recognition is a task that can be accomplished using deep learning, and given the accuracy of this method, it appears to be a suitable approach. To show how accurate the suggested face recognition system is, experimental results are given: 98.46% accuracy using Fast-RCNN, along with the performance of the algorithms under different training conditions.
Keywords: deep learning, face recognition, identification, fast-RCNN
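A minimal sketch of the face detection phase, using OpenCV's bundled Haar cascade as a simpler stand-in for the Fast-RCNN detector reported in the paper; the image file name is hypothetical:

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade (a stand-in for the
# Fast-RCNN detector described in the paper).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def detect_faces(image_path):
    """Return bounding boxes (x, y, w, h) of faces found in an image."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Example usage (hypothetical file name):
# for (x, y, w, h) in detect_faces("subject_01.jpg"):
#     print(f"face at x={x}, y={y}, size={w}x{h}")
```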
Procedia PDF Downloads 140
1945 Reactive Power Control with Plug-In Electric Vehicles
Authors: Mostafa Dastori, Sirus Mohammadi
Abstract:
While plug-in electric vehicles (PEVs) potentially have the capability to fulfill the energy storage needs of the electric grid, the degradation of the battery during this operation makes it less preferable to auto manufacturers and consumers. On the other hand, the on-board chargers can also supply energy storage system applications such as reactive power compensation, voltage regulation, and power factor correction without engaging the battery with the grid, thereby preserving its lifetime. This paper presents the design motives of single-phase on-board chargers in detail and classifies the chargers based on their future vehicle-to-grid usage. The pros and cons of each ac–dc topology are discussed to shed light on their suitability for reactive power support. The paper also presents and analyzes the differences between charging-only operation and capacitive reactive power operation, which results in increased demand on the dc-link capacitor (more charge/discharge cycles and increased second-harmonic ripple current). Moreover, the battery state of charge is spared from losses during reactive power operation, but the converter output power must be limited below its rated power to keep the same stress on the dc-link capacitor.
Keywords: energy storage system, battery unit, cost, optimal sizing, plug-in electric vehicles (PEVs), smart grid
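A minimal sketch of the apparent-power constraint that limits how much reactive power an on-board charger can supply alongside active charging, assuming an illustrative 6.6 kVA rating:

```python
import math

def reactive_capability(s_rated_kva, p_active_kw):
    """Reactive power (kVAr) an on-board charger can supply while delivering
    p_active_kw of real power, limited by its apparent power rating:
    Q = sqrt(S^2 - P^2)."""
    if abs(p_active_kw) > s_rated_kva:
        raise ValueError("active power exceeds the charger rating")
    return math.sqrt(s_rated_kva ** 2 - p_active_kw ** 2)

# Illustrative 6.6 kVA single-phase on-board charger
for p in (0.0, 3.3, 6.0):
    q = reactive_capability(6.6, p)
    print(f"P = {p:.1f} kW -> available Q = {q:.2f} kVAr")
```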
Procedia PDF Downloads 343
1944 A Machine Learning-based Study on the Estimation of the Threat Posed by Orbital Debris
Authors: Suhani Srivastava
Abstract:
This research delves into the classification of orbital debris through machine learning (ML): it categorizes the intensity of the threat orbital debris poses through multiple ML models to gain insight into effectively estimating the danger specific orbital debris can pose to future space missions. As the space industry expands, orbital debris becomes a growing concern in Low Earth Orbit (LEO) because increasing debris pollution can jeopardize space missions. Moreover, detecting orbital debris and identifying its characteristics has become a major concern in Space Situational Awareness (SSA), and prior methods relying solely on physics can become inconvenient in the face of the growing issue. Thus, this research approaches orbital debris concerns through machine learning, an efficient and more convenient alternative, to detect the potential threat certain orbital debris poses. We found that logistic regression worked best, with 98% accuracy, and this research provides insight into the accuracies of specific machine learning models when classifying orbital debris. Our work would help provide spacecraft manufacturers with guidelines for mitigating risks, and it would help aerospace engineers identify the kinds of protection that should be incorporated into objects traveling in LEO through the predictions our models provide.
Keywords: aerospace, orbital debris, machine learning, space, space situational awareness, nasa
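A minimal sketch of the logistic-regression classification reported as most accurate, assuming hypothetical debris features and labels rather than the study's dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical debris features: [radar cross-section (m^2), altitude (km), inclination (deg)]
X = rng.normal(loc=[0.5, 800, 70], scale=[0.3, 150, 20], size=(500, 3))
# Hypothetical threat label: larger, lower-altitude objects labelled as higher threat
y = ((X[:, 0] > 0.5) & (X[:, 1] < 800)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, model.predict(X_test)):.2%}")
```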
Procedia PDF Downloads 23
1943 An Overview of Onshore and Offshore Wind Turbines
Authors: Mohammad Borhani, Afshin Danehkar
Abstract:
With the increase in population and the upward trend of energy demand, mankind has looked for suppliers that guarantee a stable supply of energy, unlike fossil fuels, which, in addition to widespread emission of greenhouse gases (one of the main factors in the destruction of the ozone layer), will be exhausted in the not-so-distant future. In this regard, one sustainable way of supplying energy is the use of wind converters, which convert wind energy into electricity. For this reason, this research focused on wind turbines and their installation conditions. The main classification of wind turbines is based on the axis of rotation, which divides them into two groups: horizontal axis and vertical axis. With the advancement of technology, both types can be installed and operated in man-made environments such as cities, villages, airports, and other human settings. The main difference between offshore and onshore wind turbines is their installation and foundation. Offshore foundations are usually divided into five types: monopile, jacket, tripile, gravity-based, and floating offshore wind turbines. Installation in a wind power plant requires an arrangement that produces electric power efficiently: the distance between turbines is usually 5 to 7 times the rotor diameter along the wind direction, and turbines are more efficient if spaced 3 to 5 times the rotor diameter perpendicular to the wind direction.
Keywords: wind farms, Savonius, Darrieus, offshore wind turbine, renewable energy
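A minimal sketch applying the spacing rule quoted above (5 to 7 rotor diameters along the wind, 3 to 5 across it) to estimate the land footprint of an illustrative layout:

```python
def farm_footprint(rotor_diameter_m, rows, cols,
                   downwind_spacing_d=7, crosswind_spacing_d=4):
    """Approximate land footprint (km^2) of a rectangular turbine layout using
    the rule-of-thumb spacings of 5-7 D downwind and 3-5 D crosswind."""
    length_m = (rows - 1) * downwind_spacing_d * rotor_diameter_m
    width_m = (cols - 1) * crosswind_spacing_d * rotor_diameter_m
    return (length_m / 1000) * (width_m / 1000)

# Illustrative 5 x 4 layout of turbines with 120 m rotors
print(f"approx. footprint: {farm_footprint(120, rows=5, cols=4):.2f} km^2")
```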
Procedia PDF Downloads 117
1942 Analysis of Linguistic Disfluencies in Bilingual Children’s Discourse
Authors: Sheena Christabel Pravin, M. Palanivelan
Abstract:
Speech disfluencies are common in spontaneous speech. The primary purpose of this study was to distinguish linguistic disfluencies from stuttering disfluencies in bilingual Tamil–English (TE) speaking children. The secondary purpose was to determine whether their disfluencies are mediated by native language dominance and/or by an early onset of developmental stuttering in childhood. A detailed study was carried out to identify the prosodic and acoustic features that uniquely represent the disfluent regions of speech. This paper focuses on statistical modeling of repetitions, prolongations, pauses, and interjections in a speech corpus of bilingual spontaneous utterances from school-going children in English and Tamil. Two classifiers, Hidden Markov Models (HMM) and the Multilayer Perceptron (MLP), a class of feed-forward artificial neural network, were compared in the classification of disfluencies. The results of the classifiers document the patterns of disfluency in spontaneous speech samples of school-aged children and distinguish between Children Who Stutter (CWS) and Children with Language Impairment (CLI). The ability of the models to classify the disfluencies was measured in terms of F-measure, recall, and precision.
Keywords: bi-lingual, children who stutter, children with language impairment, hidden markov models, multi-layer perceptron, linguistic disfluencies, stuttering disfluencies
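A minimal sketch of how the reported evaluation metrics (precision, recall, F-measure) would be computed for a CWS-versus-CLI classifier, using hypothetical labels:

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Hypothetical predictions from a disfluency classifier:
# 1 = child who stutters (CWS), 0 = child with language impairment (CLI)
y_true = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 0, 1, 1, 1, 0, 1, 0]

print(f"precision: {precision_score(y_true, y_pred):.2f}")
print(f"recall:    {recall_score(y_true, y_pred):.2f}")
print(f"F-measure: {f1_score(y_true, y_pred):.2f}")
```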
Procedia PDF Downloads 217
1941 Rapid Method for the Determination of Acid Dyes by Capillary Electrophoresis
Authors: Can Hu, Huixia Shi, Hongcheng Mei, Jun Zhu, Hongling Guo
Abstract:
Textile fibers are important trace evidence and are frequently encountered in criminal investigations. A significant aspect of fiber evidence examination is the determination of fiber dyes. Although several instrumental methods have been developed for dye detection, their analysis speed is not yet fast enough, and a rapid dye analysis method is still needed to further improve the efficiency of case handling. Capillary electrophoresis has the advantages of high separation speed and high separation efficiency and is an ideal method for the rapid analysis of fiber dyes. In this paper, acid dyes used for protein fiber dyeing were determined by a short-end injection capillary electrophoresis technique developed for this purpose. Five acid red dyes with similar structures were successfully baseline separated within 5 min. The separation reproducibility is fairly good, as the relative standard deviation of the retention time is 0.51%. The established method is rapid and accurate and has great potential to be applied in forensic settings.
Keywords: acid dyes, capillary electrophoresis, fiber evidence, rapid determination
Procedia PDF Downloads 144
1940 A Review of Antimicrobial Strategy for Cotton Textile
Abstract:
Cotton textile has a large specific surface with good adhesion and water-storage properties, which provides conditions for the growth and settlement of biological organisms. In addition, soil, dust, and solutes from sweat can also be sources of nutrients for microorganisms. Generally speaking, algae can grow on textiles under very moist conditions, providing nutrients for fungal and bacterial growth. Fungi cause multiple problems to textiles, including discolouration, coloured stains, and fibre damage. Bacteria can damage fibre and cause unpleasant odours with a slick and slimy feel. In addition, microbes can disrupt manufacturing processes such as textile dyeing, printing, and finishing operations through reduction of viscosity, fermentation, and mold formation. Therefore, there is a large demand for antimicrobially finished textiles capable of avoiding or limiting microbial fibre degradation or biofouling, bacterial incidence, odour generation, and the spreading or transfer of pathogens. In this review, the main antimicrobial strategies for cotton textile are reviewed. First, the classes of bacteria and germs commonly found on cotton textiles are introduced. The chemistry of antimicrobial finishing is then discussed, and the types of antimicrobial treatment are summarized. Finally, the application and evaluation of antimicrobial treatments on cotton textile are discussed.
Keywords: antimicrobial, cotton, textile, review
Procedia PDF Downloads 365
1939 Male Versatile Sexual Offenders in Taiwan
Authors: Huang Yueh Chen, Sheng Ang Shen
Abstract:
Purpose: Sexual assault has always been a crime of high public concern in Taiwan, and people assume that the careers of sexual offenders tend to be highly specialized. This study analyzes the criminal careers and risk factors of offenders by means of an alternative classification. Methods: A total of 145 sexual offenders released on parole or at sentence expiration from 2009 to 2011 were studied through analysis of official documents such as the ‘Re-infringement risk assessment report’ and the ‘case assessment report’. Results: In the ‘Various Types of Crimes’ section of the criminal career analysis, ‘versatile sexual offenders’ scored highest, followed by ‘adult sexual offenders’, at about 2.5, representing more than 1.5 kinds of non-sexual crimes in addition to sexual crimes. Differently specialized sexual offenders differed significantly on the ‘Sexual Assault Experiences in Children and School’, ‘Static 99 Levels’, ‘Pre-Commuted Substance Use’, ‘Excited Deviant Sexual Behavior’, ‘Various Types of Crimes’, ‘Sexual Crime in Forerunner’, and ‘Type of Index Crime’ items. Conclusions: Resources continue to be devoted to specialized offenders; the character of first-time sexual offenders requires further research, and making the public aware of how diversified offenders differ from the traditional assumption of specialized offending can reduce unnecessary panic in society.
Keywords: versatile sexual offender, specialized sexual offender, criminal career, risk factor
Procedia PDF Downloads 166
1938 A Passive Digital Video Authentication Technique Using Wavelet Based Optical Flow Variation Thresholding
Authors: R. S. Remya, U. S. Sethulekshmi
Abstract:
Detecting the authenticity of a video is an important issue in digital forensics, as video is used as silent evidence in court, for example in child pornography and movie piracy cases, insurance claims, cases involving scientific fraud, and traffic monitoring. The biggest threat to video data is the availability of modern open video editing tools, which enable easy editing of videos without leaving any trace of tampering. In this paper, we propose an efficient passive method for detecting inter-frame video tampering, its type, and its location by estimating the optical flow of wavelet features of adjacent frames and thresholding the variation in the estimated feature. The performance of the algorithm was compared with z-score thresholding and achieved an efficiency above 95% on all the tested databases. The proposed method works well for videos with dynamic (forensics) as well as static (surveillance) backgrounds.
Keywords: discrete wavelet transform, optical flow, optical flow variation, video tampering
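A minimal sketch of flow-variation thresholding on adjacent frames, using dense Farneback optical flow on raw frames as a simplified stand-in for the wavelet-feature flow used in the paper (the thresholding here is essentially the z-score baseline the authors compare against); the video path is hypothetical:

```python
import cv2
import numpy as np

def flow_variation(video_path, threshold=3.0):
    """Flag candidate inter-frame tampering points where the mean optical-flow
    magnitude deviates from the sequence by more than `threshold` standard
    deviations (a simplified stand-in for the paper's wavelet-feature flow)."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError(f"cannot read {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    magnitudes = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        magnitudes.append(np.linalg.norm(flow, axis=2).mean())
        prev_gray = gray
    cap.release()
    mags = np.array(magnitudes)
    z = (mags - mags.mean()) / (mags.std() + 1e-9)
    return np.where(np.abs(z) > threshold)[0]  # indices of suspicious transitions

# Example usage (hypothetical file):
# print(flow_variation("surveillance_clip.mp4"))
```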
Procedia PDF Downloads 359
1937 Optimal Allocation of Battery Energy Storage Considering Stiffness Constraints
Authors: Felipe Riveros, Ricardo Alvarez, Claudia Rahmann, Rodrigo Moreno
Abstract:
Around the world, many countries have committed to decarbonizing their electricity systems. Under this global drive, converter-interfaced generators (CIG) such as wind and photovoltaic generation appear as cornerstones to achieve these energy targets. Despite its benefits, an increasing use of CIG brings several technical challenges in power systems, especially from a stability viewpoint. Among the key differences are limited short-circuit current capacity, the inertia-less characteristic of CIG, and response times within the electromagnetic timescale. Along with the integration of CIG into the power system, one enabling technology for the energy transition towards low-carbon power systems is the battery energy storage system (BESS). Because of the flexibility that BESS provides in power system operation, its integration allows for mitigating the variability and uncertainty of renewable energies, thus optimizing the use of existing assets and reducing operational costs. Another characteristic of BESS is that it can also support power system stability by injecting reactive power during faults, providing short-circuit currents, and delivering fast frequency response. However, most methodologies for sizing and allocating BESS in power systems are based on economic aspects and do not exploit the benefits that BESS can offer to system stability. In this context, this paper presents a methodology for determining the optimal allocation of battery energy storage systems (BESS) in weak power systems with high levels of CIG. Unlike traditional economic approaches, this methodology incorporates stability constraints to allocate BESS, aiming to mitigate instability issues arising from weak grid conditions with low short-circuit levels. The proposed methodology offers valuable insights for power system engineers and planners seeking to maintain grid stability while harnessing the benefits of renewable energy integration. The methodology is validated on the reduced Chilean electrical system. The results show that integrating BESS with stability criteria into a power system with high levels of CIG contributes to decarbonizing and strengthening the network in a cost-effective way while sustaining system stability. This paper lays a foundation for understanding the benefits of integrating BESS in electrical power systems and coordinating their placement in future converter-dominated power systems.
Keywords: battery energy storage, power system stability, system strength, weak power system
Procedia PDF Downloads 61
1936 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements in order to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment, based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is adopted to abstract the main cues from the data, and the SelectKBest strategy is used to select the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, are investigated to evaluate the performance of the system and accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
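A minimal sketch of the described chain (scaling, SelectKBest feature selection, classification) as a scikit-learn pipeline, assuming hypothetical windowed IMU features rather than the benchmark dataset:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(600, 40))      # hypothetical windowed IMU features
y = rng.integers(0, 4, size=600)    # hypothetical locomotion classes

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(score_func=f_classif, k=20)),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])

scores = cross_val_score(pipeline, X, y, cv=5)
print(f"5-fold accuracy: {scores.mean():.2%} +/- {scores.std():.2%}")
```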
Procedia PDF Downloads 11
1935 A Network Approach to Analyzing Financial Markets
Authors: Yusuf Seedat
Abstract:
The necessity to understand global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered to be complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable way of studying how complex stock markets function. We also look at how social network analysis techniques and metrics are used to understand the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network.
Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks
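A minimal sketch of a correlation-threshold stock network with community detection, assuming hypothetical return series and an arbitrary threshold:

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)
tickers = ["AAA", "BBB", "CCC", "DDD", "EEE", "FFF"]
returns = rng.normal(size=(250, len(tickers)))   # hypothetical daily returns
corr = np.corrcoef(returns, rowvar=False)

# Connect stocks whose return correlation exceeds a chosen threshold
G = nx.Graph()
G.add_nodes_from(tickers)
threshold = 0.05
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        if abs(corr[i, j]) > threshold:
            G.add_edge(tickers[i], tickers[j], weight=corr[i, j])

# Community detection groups stocks that move together
communities = greedy_modularity_communities(G)
for k, community in enumerate(communities):
    print(f"community {k}: {sorted(community)}")
```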
Procedia PDF Downloads 191
1934 Artificial Intelligence-Based Detection of Individuals Suffering from Vestibular Disorder
Authors: Dua Hişam, Serhat İkizoğlu
Abstract:
Identifying the problem behind a balance disorder is one of the most interesting topics in the medical literature. This study advances the development of artificial intelligence (AI) algorithms by applying multiple machine learning (ML) models to sensory gait data collected from humans in order to classify between healthy people and those suffering from vestibular system (VS) problems. Although AI is widely utilized as a diagnostic tool in medicine, AI models have not been used to perform feature extraction and identify VS disorders through training on raw data. In this study, three machine learning (ML) models, the Random Forest classifier (RF), Extreme Gradient Boosting (XGB), and K-Nearest Neighbor (KNN), were trained to detect VS disorder, and the performance of the algorithms was compared using accuracy, recall, precision, and F1-score. With an accuracy of 95.28%, the Random Forest classifier (RF) was the most accurate model.
Keywords: vestibular disorder, machine learning, random forest classifier, k-nearest neighbor, extreme gradient boosting
Procedia PDF Downloads 69
1933 Food and Feeding Habit of Clarias anguillaris in Tagwai Reservoir, Minna, Niger State, Nigeria
Authors: B. U. Ibrahim, A. Okafor
Abstract:
Sixty-two (62) samples of Clarias anguillaris, comprising 29 males and 33 females, were collected from Tagwai Reservoir and used for the study. Body measurements indicated that fish of different sizes were collected. Males, females, and combined sexes had mean standard lengths and total lengths of 26.56±4.99 and 31.13±6.43, 27.17±5.21 and 30.62±5.43, and 26.88±5.08 and 30.86±5.88 cm, respectively, and mean weights of 241.10±96.27, 225.75±78.66, and 232.93±86.95 g, respectively. Eight items (fish, insects, plant materials, sand grains, crustaceans, algae, detritus, and unidentified items) were eaten as food by Clarias anguillaris in Tagwai Reservoir. The frequency of occurrence and numerical methods used in stomach content analysis indicated that fish ranked highest, followed by insects, while algae ranked lowest. The frequency of stomach fullness of Clarias anguillaris showed a low percentage of empty stomachs (21.00%) and a high percentage of stomachs with food (79.00%), indicating high food abundance and high feeding intensity during the study period. Classification of the fish based on feeding habits showed that Clarias anguillaris in this study is an omnivore, because it consumed both plant and animal materials.
Keywords: stomach content, feeding habit, Clarias anguillaris, Tagwai Reservoir
Procedia PDF Downloads 597
1932 GPU Based High Speed Error Protection for Watermarked Medical Image Transmission
Authors: Md Shohidul Islam, Jongmyon Kim, Ui-pil Chong
Abstract:
Medical images are an integral part of e-health care and e-diagnosis systems. Medical image watermarking is widely used to protect patients’ information from malicious alteration and manipulation. The watermarked medical images are transmitted over the internet among patients and primary and referred physicians. The images are highly prone to corruption in the wireless transmission medium due to various noises, deflections, and refractions. Distortion in the received images leads to faulty watermark detection and inappropriate disease diagnosis. To address this issue, this paper incorporates an error correction code (ECC), the (8, 4) Hamming code, into an existing watermarking system. In addition, we implement the computationally complex ECC on a graphics processing unit (GPU) to accelerate it and support real-time requirements. Experimental results show that the GPU achieves considerable speedup over the sequential CPU implementation while maintaining 100% ECC efficiency.
Keywords: medical image watermarking, e-health system, error correction, Hamming code, GPU
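A minimal reference sketch of (8, 4) extended Hamming encoding and single-error correction in plain Python; the paper's contribution is the GPU implementation, which would parallelize this logic per code word:

```python
def hamming84_encode(nibble):
    """Encode 4 data bits (list of 0/1) into an extended Hamming(8,4) code word:
    [p1, p2, d0, p4, d1, d2, d3, overall_parity]."""
    d0, d1, d2, d3 = nibble
    p1 = d0 ^ d1 ^ d3
    p2 = d0 ^ d2 ^ d3
    p4 = d1 ^ d2 ^ d3
    codeword = [p1, p2, d0, p4, d1, d2, d3]
    overall = 0
    for bit in codeword:
        overall ^= bit
    return codeword + [overall]

def hamming84_decode(codeword):
    """Correct a single bit error and return (data_bits, status)."""
    bits = list(codeword)
    s1 = bits[0] ^ bits[2] ^ bits[4] ^ bits[6]
    s2 = bits[1] ^ bits[2] ^ bits[5] ^ bits[6]
    s4 = bits[3] ^ bits[4] ^ bits[5] ^ bits[6]
    syndrome = s1 + 2 * s2 + 4 * s4
    overall_ok = (sum(bits) % 2) == 0
    if syndrome and overall_ok:
        return None, "double error detected"        # uncorrectable
    if syndrome:                                    # single error inside the (7,4) part
        bits[syndrome - 1] ^= 1
        status = f"corrected bit {syndrome}"
    elif not overall_ok:                            # error in the overall parity bit
        status = "corrected overall parity bit"
    else:
        status = "no error"
    return [bits[2], bits[4], bits[5], bits[6]], status

# Example: flip one bit in transit and recover the nibble
sent = hamming84_encode([1, 0, 1, 1])
received = sent.copy()
received[4] ^= 1
print(hamming84_decode(received))  # ([1, 0, 1, 1], 'corrected bit 5')
```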
Procedia PDF Downloads 290
1931 Comparison of Concentration of Heavy Metals in PM2.5 Analyzed in Three Different Global Research Institutions Using X-Ray Fluorescence
Authors: Sungroul Kim, Yeonjin Kim
Abstract:
This study compared the concentrations of heavy metals analyzed from the same samples with three X-ray fluorescence (XRF) spectrometers at three different global research institutions: PAN (a branch of Malvern Panalytical, Seoul, South Korea), RTI (Research Triangle Institute, NC, U.S.A.), and the aerosol laboratory at Harvard University, Boston, U.S.A. To achieve our research objectives, indoor air filter samples were collected at the homes (n=24) of adult or child asthmatics and then analyzed at PAN, followed consecutively by Harvard University and RTI. Descriptive statistics were computed for data comparison, along with correlation and simple regression analysis using R version 4.0.3. As a result, the detection rates of most heavy metals analyzed at the three institutions were about 90%. Of the 25 elements commonly analyzed among those institutions, 16 elements showed an R² (coefficient of determination) of 0.7 or higher (10 components were 0.9 or higher). The findings of this study demonstrated that XRF is a useful device ensuring reproducibility and compatibility for measuring heavy metals in PM2.5 collected from the indoor air of asthmatics’ homes.
Keywords: heavy metals, indoor air quality, PM2.5, X-ray fluorescence
Procedia PDF Downloads 200
1930 Single Cell and Spatial Transcriptomics: A Beginners Viewpoint from the Conceptual Pipeline
Authors: Leo Nnamdi Ozurumba-Dwight
Abstract:
Messenger ribonucleic acid (mRNA) molecules encode proteins. These protein-encoding mRNA molecules (which collectively constitute the transcriptome), when analyzed by RNA sequencing (RNA-seq), unveil the nature of gene expression in the cell. The gene expression profiles obtained provide clues to cellular traits and their dynamics, which can be studied in relation to function and response. RNA-seq is a practical concept in genomics, as it enables the detection and quantitative analysis of mRNA molecules. Single-cell and spatial transcriptomics both present avenues for exploring the genomic characteristics of single cells and pooled cells from investigated biological tissue samples in disease conditions such as cancer, autoimmune diseases, and hematopoietic diseases, among others. Single-cell transcriptomics allows a direct assessment of each building unit of tissue (the cell) during diagnosis and molecular gene expression studies. A typical technique to achieve this is single-cell RNA sequencing (scRNA-seq), which enables high-throughput gene expression studies. However, this technique generates gene expression data for many cells without the cells’ positional coordinates within the tissue. As science develops, the use of complementary, pre-established tissue reference maps built with molecular and bioinformatics techniques has sprung forth and is now used to resolve this setback, producing both levels of data in one shot of scRNA-seq analysis. This is an emerging conceptual approach for integrative and progressively dependable transcriptomics analysis. It can support in-situ analysis for a better understanding of tissue functional organization, unveil new biomarkers for early-stage detection of diseases and therapeutic targets in drug development, and expose the nature of cell-to-cell interactions. These are vital genomic signatures and characterizations for clinical applications. Over the past decades, RNA-seq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. On the other side, spatial transcriptomics is tissue-level based and is utilized to study biological specimens with heterogeneous features. It reveals the gross identity of investigated mammalian tissues, which can then be used to study cell differentiation, track cell-line trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to build up the genomic signatures assessed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics, applied to varying quantities of cells and with avenues for appropriate resolution, both have made the study of gene expression from mRNA molecules interesting, progressive, and developmental, helping to tackle health challenges head-on.
Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression
Procedia PDF Downloads 122
1929 Recognizing an Individual, Their Topic of Conversation and Cultural Background from 3D Body Movement
Authors: Gheida J. Shahrour, Martin J. Russell
Abstract:
The 3D body movement signals captured during human-human conversation include clues not only to the content of people’s communication but also to their culture and personality. This paper is concerned with the automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects and arranged them into groups according to their culture. We arranged each group into pairs, and each pair communicated with each other about different topics. A state-of-the-art recognition system was applied to the problems of person, culture, and topic recognition, borrowing modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining 77.78%, 55.47%, and 39.06% accuracy from the person, culture, and topic recognition systems, respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition, respectively. Although direct comparison among these three recognition systems is difficult, our person recognition system appears to perform best for both GMM and GMM-SVM, suggesting that inter-subject differences (i.e., subjects’ personality traits) are a major source of variation. When these traits were removed from the culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from the culture and topic recognition systems, respectively.
Keywords: person recognition, topic recognition, culture recognition, 3D body movement signals, variability compensation
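A minimal sketch of GMM-based classification in the style described (one Gaussian mixture per class, scored by log-likelihood), assuming hypothetical body-movement feature vectors:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)

# Hypothetical body-movement feature vectors for three subjects (classes)
train = {label: rng.normal(loc=label, scale=1.0, size=(200, 6)) for label in range(3)}
test_X = np.vstack([rng.normal(loc=label, scale=1.0, size=(50, 6)) for label in range(3)])
test_y = np.repeat(np.arange(3), 50)

# One GMM per class; classify by the highest log-likelihood
models = {label: GaussianMixture(n_components=4, random_state=0).fit(X)
          for label, X in train.items()}
scores = np.column_stack([models[label].score_samples(test_X) for label in range(3)])
pred = scores.argmax(axis=1)
print(f"accuracy: {(pred == test_y).mean():.2%}")
```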
Procedia PDF Downloads 541
1928 Price Heterogeneity in Establishing Real Estate Composite Price Index as Underlying Asset for Property Derivatives in Russia
Authors: Andrey Matyukhin
Abstract:
Russian official statistics have shown a steady decline in residential real estate prices for several consecutive years. Price risk in real estate markets thus affects various groups of economic agents, namely individuals, construction companies, and financial institutions. Potential use of property derivatives might help mitigate the adverse consequences of negative price dynamics. Unless a sustainable price indicator is developed, however, settlement of such instruments imposes constraints on the counterparties involved and restricts the development of the real estate market. The study addresses geographical and classification heterogeneity in real estate prices by means of variance analysis across various groups of real estate properties. In conclusion, we determine an optimal sample structure of representative real estate assets with a sufficient level of price homogeneity. A composite price indicator based on this sample would have a higher level of robustness and reliability, hence improving liquidity in the market for property derivatives through underlying standardization. Unlike the majority of existing real estate price indices, calculated on a country-wide basis, the optimal indices for the Russian market should be constructed at the city level.
Keywords: price homogeneity, property derivatives, real estate price index, real estate price risk
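A minimal sketch of a variance analysis across property classes, assuming hypothetical price-per-square-metre samples and a one-way ANOVA as the homogeneity test:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical price per square metre (thousand RUB) for three property classes
economy = rng.normal(loc=110, scale=15, size=60)
comfort = rng.normal(loc=140, scale=18, size=60)
business = rng.normal(loc=190, scale=25, size=60)

f_stat, p_value = stats.f_oneway(economy, comfort, business)
print(f"F = {f_stat:.1f}, p = {p_value:.3g}")
if p_value < 0.05:
    print("price heterogeneity across classes: build separate sub-indices")
else:
    print("prices homogeneous enough for a single composite index")
```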
Procedia PDF Downloads 307
1927 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges
Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars
Abstract:
In the medical field, applications related to human experiments are frequently linked to reduced sample sizes, which makes the training of machine learning models quite sensitive and therefore not very robust or generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a few trials per subject. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One naive approach usually applied by data scientists consists in transforming the entire database before the resampling phase. However, this can cause a model’s performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explored the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady State Visually Evoked Potentials). We also studied potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms, and it should be applied after the resampling phase to avoid data leakage and improve results.
Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting
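A minimal sketch contrasting a leaky workflow (scaling fitted on the whole dataset before cross-validation) with a leak-free pipeline that fits the scaler only on the training folds, assuming hypothetical SSVEP feature vectors:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 64))     # hypothetical SSVEP feature vectors (small sample)
y = rng.integers(0, 2, size=120)   # hypothetical class labels

# Leaky: the scaler sees the whole dataset before cross-validation
X_leaky = StandardScaler().fit_transform(X)
leaky_scores = cross_val_score(SVC(), X_leaky, y, cv=5)

# Leak-free: scaling is fitted only on the training folds inside the pipeline
safe_scores = cross_val_score(make_pipeline(StandardScaler(), SVC()), X, y, cv=5)

print(f"leaky estimate:     {leaky_scores.mean():.2%}")
print(f"leak-free estimate: {safe_scores.mean():.2%}")
```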
Procedia PDF Downloads 153
1926 Detecting Cyberbullying, Spam and Bot Behavior and Fake News in Social Media Accounts Using Machine Learning
Authors: M. D. D. Chathurangi, M. G. K. Nayanathara, K. M. H. M. M. Gunapala, G. M. R. G. Dayananda, Kavinga Yapa Abeywardena, Deemantha Siriwardana
Abstract:
Due to the growing popularity of social media platforms, there are various concerns, mostly cyberbullying, spam, bot accounts, and the spread of incorrect information. To develop a risk score calculation system as a thorough method for deciphering and exposing unethical social media profiles, this research explores the most suitable algorithms, to the best of our knowledge, for detecting the mentioned concerns. Multiple models, such as Naïve Bayes, CNN, KNN, Stochastic Gradient Descent, and Gradient Boosting classifiers, were examined, and the best results were used in the development of the risk score system. For cyberbullying, the logistic regression algorithm achieved an accuracy of 84.9%, while the spam-detecting MLP model reached 98.02% accuracy. The Random Forest algorithm identified bot accounts with 91.06% accuracy, and 84% accuracy was achieved for fake news detection using SVM.
Keywords: cyberbullying, spam behavior, bot accounts, fake news, machine learning
Procedia PDF Downloads 36
1925 Tax Evasion with Mobility between the Regular and Irregular Sectors
Authors: Xavier Ruiz Del Portal
Abstract:
This paper incorporates mobility between the legal and black economies into a model of tax evasion with endogenous labor supply, in which underreporting is possible in one sector but impossible in the other. We find that the effects along the extensive margin (the number of evaders) are more robust and conclusive than those along the intensive margin (hours of illegal work) usually considered in the literature. In particular, it is shown that the following policies reduce the number of evaders: (a) larger and more progressive evasion penalties; (b) higher detection probabilities; (c) an increase in the legal sector wage rate; (d) a decrease in the moonlighting wage rate; (e) higher costs of creating opportunities to evade; (f) fewer opportunities to evade; and (g) greater psychological costs of tax evasion. When tax concealment and illegal work are also taken into account, the effects do not vary significantly under the assumptions in Cowell (1985), except that policies (a) and (b) hold only for low- and middle-income groups and policies (e) and (f) only for high-income groups.
Keywords: income taxation, tax evasion, extensive margin responses, the penalty system
Procedia PDF Downloads 155
1924 Grating Scale Thermal Expansion Error Compensation for Large Machine Tools Based on Multiple Temperature Detection
Authors: Wenlong Feng, Zhenchun Du, Jianguo Yang
Abstract:
To decrease the grating scale thermal expansion error, a novel method based on multiple temperature detection is proposed. Several temperature sensors are installed on the grating scale, and their readings are recorded. The temperature at every point on the grating scale is calculated by interpolating between adjacent sensors. According to the thermal expansion principle, the grating scale thermal expansion error model can be established by integrating over the variations of position and temperature. A novel compensation method is proposed in this paper. By applying the established error model, the grating scale thermal expansion error is decreased by 90% compared with no compensation. The residual positioning error of the grating scale is less than 15 µm/10 m, and the accuracy of the machine tool is significantly improved.
Keywords: thermal expansion error of grating scale, error compensation, machine tools, integral method
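A minimal sketch of the compensation principle described above, assuming an illustrative expansion coefficient and sensor layout: temperatures are interpolated between sensors and integrated along the scale to estimate the expansion error at a position:

```python
import numpy as np

ALPHA = 11.5e-6   # assumed thermal expansion coefficient of the scale (1/K)
T_REF = 20.0      # reference temperature (deg C)

# Hypothetical temperatures recorded by sensors mounted along a 10 m grating scale
sensor_pos = np.array([0.0, 2.5, 5.0, 7.5, 10.0])        # m
sensor_temp = np.array([21.2, 21.8, 22.5, 22.1, 21.6])   # deg C

def expansion_error(position_m, n=500):
    """Estimate the thermal expansion error (um) at a scale position by
    integrating alpha * (T(s) - T_ref) over 0..position, with T(s) obtained
    by linear interpolation between adjacent sensors."""
    s = np.linspace(0.0, position_m, n)
    t = np.interp(s, sensor_pos, sensor_temp)
    return ALPHA * np.trapz(t - T_REF, s) * 1e6   # metres -> micrometres

for x in (2.0, 5.0, 10.0):
    print(f"x = {x:4.1f} m : expansion error ~ {expansion_error(x):6.1f} um")
```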
Procedia PDF Downloads 366
1923 Estimating Knowledge Flow Patterns of Business Method Patents with a Hidden Markov Model
Authors: Yoonjung An, Yongtae Park
Abstract:
Knowledge flows are a critical source of faster technological progress and stronger economic growth. Knowledge flows have been accelerated dramatically with the establishment of a patent system, in which each patent is required by law to disclose sufficient technical information for the invention to be recreated. Patent analysis has thus been widely used to investigate technological knowledge flows. However, the existing research is limited in terms of both subject and approach. In particular, most previous studies did not cover business method (BM) patents, although they are as important a driver of knowledge flows as other patents. In addition, these studies usually focus on static analysis of knowledge flows; some use approaches that incorporate the time dimension, yet they still fail to trace the truly dynamic process of knowledge flows. Therefore, we investigate dynamic patterns of knowledge flows driven by BM patents using a Hidden Markov Model (HMM). An HMM is a popular statistical tool for modeling a wide range of time series data, with no general theoretical limit in regard to statistical pattern classification. Accordingly, it enables characterizing knowledge patterns that may differ by patent, sector, country, and so on. We run the model on sets of backward citations and forward citations to compare the patterns of knowledge utilization and knowledge dissemination.
Keywords: business method patents, dynamic pattern, Hidden-Markov Model, knowledge flow
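A minimal sketch of fitting a Gaussian HMM to a citation time series with the hmmlearn package, assuming a hypothetical yearly forward-citation sequence and two latent knowledge-flow states:

```python
import numpy as np
from hmmlearn import hmm

# Hypothetical yearly forward-citation counts for one business-method patent
citations = np.array([0, 1, 1, 3, 6, 9, 12, 10, 11, 5, 4, 2, 1], dtype=float)
X = citations.reshape(-1, 1)

# Two latent knowledge-flow states (e.g., dormant vs. diffusing), fitted by EM
model = hmm.GaussianHMM(n_components=2, covariance_type="diag",
                        n_iter=200, random_state=0)
model.fit(X)
states = model.predict(X)

for year, (count, state) in enumerate(zip(citations, states)):
    print(f"year {year:2d}: citations = {count:4.0f} -> latent state {state}")
```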
Procedia PDF Downloads 328
1922 The Impact of Childhood Cancer on the Quality of Life of Survivor: A Qualitative Analysis of Functionality and Participation
Authors: Catarina Grande, Barbara Mota
Abstract:
The main goal of the present study was to understand the impact of childhood cancer on the quality of life of survivors and the extent to which the oncologic disease affects the functionality and participation of survivors at present, compared to the time of diagnosis. Six survivors of pediatric cancer participated in the study. Participants were interviewed using a semi-structured interview adapted from two instruments in the literature, the QALY and the QLACS, and piloted in a previous study. The study follows a qualitative approach using content analysis, allowing the identification of categories and subcategories. The units of meaning were subsequently mapped to codes of the International Classification of Functioning, Disability and Health for Children and Youth (ICF-CY), which contributed to a more detailed analysis of the impact on the quality of life of survivors in the domains under study. The results showed significant changes between the moment of diagnosis and the present moment, specifically at the microsystem level of the survivor. Regarding functionality and participation, the results show that body functions are the most affected domain, with the emotional component currently having the greatest impact on the quality of life of survivors. The present study identified a set of codes for the development of an ICF-CY core set for pediatric cancer survivors and indicated the need for future studies to validate and deepen these findings.
Keywords: cancer, participation, quality of life, survivor
Procedia PDF Downloads 237
1921 Nondestructive Testing for Reinforced Concrete Buildings with Active Infrared Thermography
Authors: Huy Q. Tran, Jungwon Huh, Kiseok Kwak, Choonghyun Kang
Abstract:
Infrared thermography (IRT) has been proven to be a good method for nondestructive evaluation of concrete materials. In buildings, it has been used in a broad range of applications, such as subsurface defect inspection, energy loss assessment, and moisture detection. The purpose of this research is to consider the qualitative and quantitative performance of the IRT technique for assessing reinforced concrete deterioration. An experiment with three different heating regimes was conducted on a concrete slab in the laboratory. The thermal characteristics of the IRT method, i.e., absolute contrast and observation time, were investigated. A linear relationship between the observation time and the real depth was established, with an R-squared of 0.931. The results showed that the absolute contrast above a defective area increases with the size of the delamination and the heating time. In addition, the depth of delamination can be predicted using the relationship proposed in this study.
Keywords: concrete building, infrared thermography, nondestructive evaluation, subsurface delamination
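A minimal sketch of the reported linear time-depth relationship, assuming hypothetical (observation time, depth) pairs rather than the experimental data:

```python
import numpy as np
from scipy import stats

# Hypothetical (observation time, actual defect depth) pairs from an IRT test
obs_time_s = np.array([40, 75, 130, 210, 320, 450])
depth_mm = np.array([10, 20, 30, 40, 50, 60])

result = stats.linregress(obs_time_s, depth_mm)
print(f"depth ~= {result.slope:.3f} * t + {result.intercept:.2f}  (R^2 = {result.rvalue ** 2:.3f})")

def predict_depth(observation_time_s):
    """Predict delamination depth (mm) from the thermographic observation time (s)."""
    return result.slope * observation_time_s + result.intercept

print(f"predicted depth at t = 180 s: {predict_depth(180):.1f} mm")
```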
Procedia PDF Downloads 283