Search results for: entity extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2269

739 Arsenic Speciation in Cicer arietinum: A Terrestrial Legume That Contains Organoarsenic Species

Authors: Anjana Sagar

Abstract:

Arsenic-poisoned groundwater is a major concern in South Asia. Arsenic enters the food chain not only through drinking water but also through the use of arsenic-polluted water for irrigation. Arsenic is highly toxic in its inorganic forms, whereas organic forms of arsenic are comparatively less toxic. In terrestrial plants, the inorganic form of arsenic predominates; however, we found that a significant proportion of organic arsenic was present in the roots and shoots of a staple legume, chickpea (Cicer arietinum L.). Chickpea plants were raised in pot culture on soils spiked with 0-70 mg arsenate per kg soil. Total arsenic concentrations of chickpea shoots and roots, determined by inductively coupled plasma mass spectrometry (ICP-MS), ranged from 0.76 to 20.26 and 2.09 to 16.43 µg g⁻¹ dry weight, respectively. Arsenic species were obtained by a methanol/water extraction method and analyzed by high-performance liquid chromatography (HPLC) coupled with ICP-MS. Dimethylarsinic acid (DMA) was the only organic arsenic species found, accounting for 0.02 to 3.16% of total shoot arsenic and 0 to 6.93% of total root arsenic, respectively. To investigate the source of the organic arsenic in chickpea plants, arsenic species in the rhizosphere soils of the plants were also examined. The absence of organic arsenic in the soils suggests that DMA is formed within the plants. The present investigation provides useful information for a better understanding of the distribution of arsenic species in terrestrial legume plants.

Keywords: arsenic, arsenic speciation, dimethylarsinic acid, organoarsenic

Procedia PDF Downloads 138
738 Bioflocculation Using the Purified Wild Strain of P. aeruginosa Culture in Wastewater Treatment

Authors: Mohammad Hajjartabar, Tahereh Kermani Ranjbar

Abstract:

P. aeruginosa EF2 was isolated and identified from human infection sources in our previous study. The present study was performed to characterize the bioflocculant produced by this bacterium and its role in flocculation during activated sludge wastewater treatment. The bacterium was inoculated and grown in TSB and peptone water media in an orbital shaker at 250 rpm for 5 days at 35 °C. After the incubation period, culture broths of the bacterial strain were collected and washed, and the bacterial concentration was adjusted. To extract the bacterial bioflocculant, the culture was centrifuged at 6000 rpm for 20 min at 4 °C to remove bacterial cells. The supernatant was decanted, and the pellet containing the bioflocculant was dried at 105 °C to a constant weight according to APHA (2005). The chemical composition of the extracted bioflocculant was then analyzed. An activated sludge sample, obtained from the aeration tank of a wastewater treatment plant in Tehran, was first mixed thoroughly. After addition of the bioflocculant, floc density improved with increasing bioflocculant dose. The results of this study strongly suggest that the extracted bioflocculant played a significant role in flocculation of the wastewater sample. The use of wild bacteria and nutrient regulation techniques, rather than genetic manipulation, opens a wide area for future investigation into improving wastewater treatment processes, and may offer a path toward more effective bioflocculants obtained from purified microbial cultures.

Keywords: wastewater treatment, P. aeruginosa, sludge treatment

Procedia PDF Downloads 156
737 Assessment of DNA Degradation Using Comet Assay: A Versatile Technique for Forensic Application

Authors: Ritesh K. Shukla

Abstract:

Degradation of the macromolecules (DNA, RNA, and protein) in biological samples is a major challenge in forensic investigation, as it can mislead the interpretation of results. Currently, there are no precise methods available to circumvent this problem, so preliminary methods are urgently needed. To this end, the Comet assay is one of the most versatile, rapid, and sensitive molecular biology techniques for assessing DNA degradation, even in very low amounts of sample. Conveniently, the method does not require any additional DNA extraction and isolation steps: samples are embedded directly on an agarose pre-coated microscopic slide, and electrophoresis is performed on the same slide after the lysis step. After electrophoresis, the slide is stained with a DNA-binding dye and observed under a fluorescence microscope equipped with Komet software. With this technique, the extent of DNA degradation can be assessed, allowing samples to be screened before DNA fingerprinting to determine whether they are suitable for DNA analysis. The technique can address not only DNA degradation but also many other forensic challenges, such as estimating the time since deposition of biological fluids, repairing genetic material from degraded biological samples, and early estimation of time since death. With this study, an attempt was made to explore the application of the well-known Comet assay in the field of forensic science; the assay should open new avenues in forensic research and development.

Keywords: comet assay, DNA degradation, forensic, molecular biology

Procedia PDF Downloads 155
736 Machine Learning and Deep Learning Approach for People Recognition and Tracking in Crowd for Safety Monitoring

Authors: A. Degale Desta, Cheng Jian

Abstract:

Deep learning applications in computer vision are advancing rapidly, making it possible to monitor public spaces and quickly identify potentially anomalous behaviour in crowd scenes. The purpose of the current work is therefore to improve the safety of people in crowd events subject to panic behaviour by introducing the innovative idea of Aggregation of Ensembles (AOE), which uses pre-trained ConvNets and a pool of classifiers to find anomalies in video data with packed scenes. Because algorithms and architectures such as K-means, KNN, CNN, SVD, Faster R-CNN, and YOLOv5 learn different levels of semantic representation from crowd videos, the proposed approach leverages an ensemble of variously fine-tuned convolutional neural networks (CNNs), allowing enriched feature sets to be extracted. In addition, a long short-term memory neural network forecasts future feature values, and a handcrafted feature that takes the peculiarities of the crowd into consideration helps the model understand human behavior. Experiments are run on well-known datasets of panic situations to assess the effectiveness and precision of the suggested method. The results reveal that, compared with state-of-the-art methodologies, the system produces better and more promising results in terms of accuracy and processing speed.
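The core AOE idea, pooling feature sets from several fine-tuned networks into one enriched representation, can be sketched abstractly. In this minimal illustration the "networks" are stand-in functions (the real system would use fine-tuned ConvNets), and their per-frame feature vectors are simply concatenated before any downstream classifier:

```python
def aggregate_features(extractors, frame):
    """Concatenate the feature vectors produced by an ensemble of
    extractors (stand-ins for fine-tuned ConvNets) into one enriched set."""
    features = []
    for extract in extractors:
        features.extend(extract(frame))
    return features

# Hypothetical extractors: each maps a frame (here a flat list of pixel
# intensities) to a small feature vector.
mean_intensity = lambda f: [sum(f) / len(f)]
spread = lambda f: [max(f) - min(f)]

frame = [0.2, 0.8, 0.5, 0.9]
enriched = aggregate_features([mean_intensity, spread], frame)
# enriched now holds one value per extractor: the mean and the range.
```

The design point is that each ensemble member contributes a different view of the same frame, so the concatenated vector is richer than any single extractor's output.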

Keywords: action recognition, computer vision, crowd detecting and tracking, deep learning

Procedia PDF Downloads 161
735 Understanding Cognitive Fatigue From FMRI Scans With Self-supervised Learning

Authors: Ashish Jaiswal, Ashwin Ramesh Babu, Mohammad Zaki Zadeh, Fillia Makedon, Glenn Wylie

Abstract:

Functional magnetic resonance imaging (fMRI) is a neuroimaging technique that records neural activations in the brain by capturing the blood oxygen level in different regions, based on the task performed by a subject. Given fMRI data, the problem of predicting a person's state of cognitive fatigue has not been investigated to its full extent. This paper tackles the issue as a multi-class classification problem, dividing the state of cognitive fatigue into six levels ranging from no fatigue to extreme fatigue. We built a spatio-temporal model that uses convolutional neural networks (CNNs) for spatial feature extraction and a long short-term memory (LSTM) network for temporal modeling of 4D fMRI scans. We also applied a self-supervised method called MoCo (Momentum Contrast) to pre-train our model on the public BOLD5000 dataset and fine-tuned it on our labeled dataset to predict cognitive fatigue. Our novel dataset contains fMRI scans from Traumatic Brain Injury (TBI) patients and healthy controls (HCs) performing a series of N-back cognitive tasks. This method establishes a state-of-the-art technique for analyzing cognitive fatigue from fMRI data and outperforms previous approaches to this problem.
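The CNN-for-space / LSTM-for-time factorization described above can be illustrated schematically: each 3-D volume in the 4-D scan is reduced to a spatial feature, and the resulting sequence is then summarized temporally. The poolings below are toy stand-ins for the trained CNN and LSTM, shown only to make the data flow concrete:

```python
def spatial_features(volume):
    """Stand-in for the CNN: reduce one 3-D volume (nested lists of voxel
    values) to a single scalar by global mean pooling."""
    voxels = [v for plane in volume for row in plane for v in row]
    return sum(voxels) / len(voxels)

def temporal_summary(scan_4d):
    """Stand-in for the LSTM: map each timepoint's volume to a spatial
    feature, then pool the sequence over time."""
    seq = [spatial_features(vol) for vol in scan_4d]
    return seq, sum(seq) / len(seq)

# Toy 4-D scan: two timepoints, each a 2x2x2 volume.
scan = [
    [[[0, 1], [1, 0]], [[1, 1], [0, 0]]],   # t = 0
    [[[2, 2], [2, 2]], [[2, 2], [2, 2]]],   # t = 1
]
seq, summary = temporal_summary(scan)
print(seq)  # [0.5, 2.0]
```

In the actual model, `spatial_features` is a learned CNN producing a vector per timepoint and `temporal_summary` is an LSTM whose final state feeds a six-way classifier; this sketch only mirrors that two-stage structure.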

Keywords: fMRI, brain imaging, deep learning, self-supervised learning, contrastive learning, cognitive fatigue

Procedia PDF Downloads 189
734 Influence of Ground Granulated Blast Furnace Slag on Geotechnical Characteristics of Jarosite Waste

Authors: Chayan Gupta, Arun Prasad

Abstract:

The rapid evolution of industrialization is causing a scarcity of precious land, so there is a vital need for the R&D community to derive sustainable, economic, and social benefits from the large-scale utilization of waste. The current study examines the influence of a steel industry waste, ground granulated blast furnace slag (GGBS), on the geotechnical properties of jarosite waste (a solid residue of the hydrometallurgical extraction of zinc). Strength tests (unconfined compression, qu, and splitting tensile strength, qt) were conducted on jarosite-GGBS blends (10-30% GGBS) at different curing periods (7, 28, and 90 days). The results indicate that both qu and qt increase with GGBS content and curing period. The increased strength with the addition of GGBS is also evident in the microstructural study, which shows larger agglomeration of the jarosite-GGBS blend particles. A freezing-thawing (F-T) durability analysis was also conducted for all jarosite-GGBS blends; the reduction in unconfined compressive strength after five successive F-T cycles improved from 62% for natural jarosite to 48%, 42%, and 34% at 7, 14, and 28 days of curing, respectively, for stabilized samples containing 30% GGBS. It can be concluded from this study that blending a cementing additive (GGBS) with jarosite waste results in a significant improvement in geotechnical characteristics.

Keywords: jarosite, GGBS, strength characteristics, microstructural study, durability analysis

Procedia PDF Downloads 168
733 Determination of Physicochemical Properties, Bioaccessibility of Phenolics and Antioxidant Capacity of Mineral Enriched Linden Herbal Tea Beverage

Authors: Senem Suna, Canan Ece Tamer, Ömer Utku Çopur

Abstract:

In this research, dried linden (Tilia argentea) leaves and blossoms were used as the raw material for producing a mineral-enriched herbal tea beverage. To this end, 1% dried linden was infused with boiling water (100 °C) for 5 minutes. After cooling, sucrose, citric acid, ascorbic acid, natural lemon flavor, and natural mineral water were added. Beverage samples were plate-filtered, filled into 200-mL glass bottles, capped, and pasteurized at 98 °C for 15 minutes. Water-soluble dry matter, titratable acidity, ascorbic acid, pH, minerals (Fe, Ca, Mg, K, Na), color (L*, a*, b*), turbidity, bioaccessible phenolics, and antioxidant capacity were analyzed. Water-soluble dry matter, titratable acidity, and ascorbic acid were determined as 7.66±0.28 g/100 g, 0.13±0.00 g/100 mL, and 19.42±0.62 mg/100 mL, respectively. pH was measured as 3.69. Fe, Ca, Mg, K, and Na contents of the beverage were 0.12±0.00, 115.48±0.05, 34.72±0.14, 48.67±0.43, and 85.72±1.01 mg/L, respectively. Color was measured as 13.63±0.05, -4.33±0.05, and 3.06±0.05 for the L*, a*, and b* values. Turbidity was 0.69±0.07 NTU. Bioaccessible phenolics were determined as 312.82±5.91 mg GAE/100 mL. Antioxidant capacities of the chemical (MeOH:H₂O:HCl) and physiological (in vitro digestive enzymatic) extracts were also evaluated by the DPPH (27.59±0.53 and 0.17±0.02 μmol trolox/mL), FRAP (21.01±0.97 and 13.27±0.19 μmol trolox/mL), and CUPRAC (44.71±9.42 and 2.80±0.64 μmol trolox/mL) methods. As a result, enrichment with natural mineral water is proposed for developing functional and nutritional value together with good commercialization potential.

Keywords: linden, herbal tea beverage, bioaccessibility, antioxidant capacity

Procedia PDF Downloads 174
732 Polymorphisms of STAT5A and DGAT1 Genes and Their Associations with Milk Trait in Egyptian Goats

Authors: Othman Elmahdy Othman

Abstract:

The objectives of this study were to identify polymorphisms in the STAT5A gene (using Restriction Fragment Length Polymorphism) and the DGAT1 gene (using Single-Strand Conformation Polymorphism) among three Egyptian goat breeds (Barki, Zaraibi, and Damascus), and to investigate the effect of their genotypes on milk composition traits of Zaraibi goats. One hundred and fifty blood samples were collected for DNA extraction: 60 from Zaraibi, 40 from Damascus, and 50 from Barki breeds. Fat, protein, and lactose percentages were determined in Zaraibi goat milk using an automatic milk analyzer. Two genotypes each, CC and CT (for STAT5A) and C-C- and C-C+ (for DGAT1), were identified in the three Egyptian goat breeds with different frequencies. The associations between these genotypes and milk fat, protein, and lactose were determined in the Zaraibi breed. The results showed that the STAT5A genotypes had significant effects on milk yield, protein, fat, and lactose, with the CT genotype superior to CC. Regarding the DGAT1 polymorphism, an association was found only with milk fat: animals with the C-C+ genotype had greater milk fat than animals with the C-C- genotype. The association of combined genotypes with milk traits showed that does heterozygous for both genes outperformed homozygous does: animals with the CTC-C+ combination had higher milk yield, fat, and protein than those with the CCC-C- genotype. In conclusion, the results show that the C/T and C-/C+ SNPs of the STAT5A and DGAT1 genes, respectively, may be useful markers in marker-assisted selection programs to improve goat milk composition.

Keywords: DGAT1, genetic polymorphism, milk trait, STAT5A

Procedia PDF Downloads 163
731 Deep Feature Augmentation with Generative Adversarial Networks for Class Imbalance Learning in Medical Images

Authors: Rongbo Shen, Jianhua Yao, Kezhou Yan, Kuan Tian, Cheng Jiang, Ke Zhou

Abstract:

This study proposes a generative adversarial network (GAN) framework that performs synthetic sampling in feature space, i.e., feature augmentation, to address the class imbalance problem in medical image analysis. A feature extraction network is first trained to convert images into feature space. The GAN framework then incorporates adversarial learning to train a feature generator for the minority class by playing a minimax game with a discriminator. The feature generator generates minority-class features from arbitrary latent distributions to balance the data between the majority and minority classes. Additionally, a data cleaning technique, the Tomek link, is employed to remove undesirable conflicting features introduced by the feature augmentation and thus establish well-defined class clusters for training. The experiments evaluate the proposed method on two medical image analysis tasks: mass classification on mammograms and cancer metastasis classification on histopathological images. Experimental results suggest that the proposed method obtains superior or comparable performance over state-of-the-art counterparts, improving accuracy by more than 1.5 percentage points over all of them.
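A Tomek link is a pair of samples from opposite classes that are each other's nearest neighbors; such pairs mark borderline or conflicting points, and removing one member of each pair (here it would be the conflicting augmented feature) cleans the class boundary. A minimal sketch of the detection step on toy 1-D features (a production pipeline would more likely use imbalanced-learn's TomekLinks on the GAN-augmented feature vectors):

```python
def tomek_links(X, y):
    """Return index pairs (i, j), i < j, that form Tomek links:
    i and j are mutual nearest neighbors with different labels."""
    def nearest(i):
        # Index of the closest other sample by squared Euclidean distance.
        return min((j for j in range(len(X)) if j != i),
                   key=lambda j: sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
    links = []
    for i in range(len(X)):
        j = nearest(i)
        if y[i] != y[j] and nearest(j) == i and i < j:
            links.append((i, j))
    return links

# Toy features: the 0-labeled point at 0.9 and the 1-labeled point at 1.0
# sit on the class boundary and form the only Tomek link.
X = [[0.0], [0.9], [1.0], [2.0]]
y = [0, 0, 1, 1]
print(tomek_links(X, y))  # [(1, 2)]
```

The O(n²) nearest-neighbor search is fine for a sketch; real feature sets would use a spatial index or vectorized distances.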

Keywords: class imbalance, synthetic sampling, feature augmentation, generative adversarial networks, data cleaning

Procedia PDF Downloads 127
730 Lessons from Implementation of a Network-Wide Safety Huddle in Behavioral Health

Authors: Deborah Weidner, Melissa Morgera

Abstract:

The model of care delivery in the Behavioral Health Network (BHN) is integrated across all five regions of Hartford Healthcare and thus spans the entirety of the state of Connecticut, with care provided in seven inpatient settings and over 30 ambulatory outpatient locations. While safety has been a core priority of the BHN in alignment with High Reliability practices, safety initiatives have historically been facilitated locally in each region or within each entity, with interventions implemented locally as opposed to throughout the network. To address this, the BHN introduced a network wide Safety Huddle during 2022. Launched in January, the BHN Safety Huddle brought together internal stakeholders, including medical and administrative leaders, along with executive institute leadership, quality, and risk management. By bringing leaders together and introducing a network-wide safety huddle into the way we work, the benefit has been an increase in awareness of safety events occurring in behavioral health areas as well as increased systemization of countermeasures to prevent future events. One significant discussion topic presented in huddles has pertained to environmental design and patient access to potentially dangerous items, addressing some of the most relevant factors resulting in harm to patients in inpatient and emergency settings for behavioral health patients. The safety huddle has improved visibility of potential environmental safety risks through the generation of over 15 safety alerts cascaded throughout the BHN and also spurred a rapid improvement project focused on standardization of patient belonging searches to reduce patient access to potentially dangerous items on inpatient units. Safety events pertaining to potentially dangerous items decreased by 31% as a result of standardized interventions implemented across the network and as a result of increased awareness. 
A second positive outcome originating from the BHN Safety Huddle was the implementation of a recommendation to increase the emergency Narcan® (naloxone) supply on hand in ambulatory settings of the BHN, after incidents of accidental overdose required higher doses of naloxone. By increasing the emergency supply of naloxone on hand in all ambulatory and residential settings, colleagues are better prepared to respond should a patient experience an overdose while on site. Lastly, discussions in the safety huddle spurred a new initiative within the BHN to improve responsiveness to assaultive incidents through a consultation service. This consult service, aligned with one of the network's improvement priorities to reduce harm events related to assaultive incidents, was borne out of huddle discussions in which it was identified that additional interventions may be needed in the clinical care of patients who experience multiple and/or frequent safety events.

Keywords: quality, safety, behavioral health, risk management

Procedia PDF Downloads 83
729 Embodied Neoliberalism and the Mind as Tool to Manage the Body: A Descriptive Study Applied to Young Australian Amateur Athletes

Authors: Alicia Ettlin

Abstract:

Amid the rise of neoliberalism to the leading economic policy model in Western societies in the 1980s, people have started to internalise a neoliberal way of thinking, whereby the human body has become an entity that can and needs to be precisely managed through free yet rational decision-making processes. The neoliberal citizen has consequently become an entrepreneur of the self who is free, independent, rational, productive and responsible for themselves, their health and wellbeing as well as their appearance. The focus on individuals as entrepreneurs who manage their bodies through the rationally thinking mind has, however, become increasingly criticised for viewing the social actor as ‘disembodied’, as a detached, social actor whose powerful mind governs over the passive body. On the other hand, the discourse around embodiment seeks to connect rational decision-making processes to the dominant neoliberal discourse which creates an embodied understanding that the body, just as other areas of people’s lives, can and should be shaped, monitored and managed through cognitive and rational thinking. This perspective offers an understanding of the body regarding its connections with the social environment that reaches beyond the debates around mind-body binary thinking. Hence, following this argument, body management should not be thought of as either solely guided by embodied discourses nor as merely falling into a mind-body dualism, but rather, simultaneously and inseparably as both at once. The descriptive, qualitative analysis of semi-structured in-depth interviews conducted with young Australian amateur athletes between the age of 18 and 24 has shown that most participants are interested in measuring and managing their body to create self-knowledge and self-improvement. The participants thereby connected self-improvement to weight loss, muscle gain or simply staying fit and healthy. 
Self-knowledge here refers to body measurements, including weight, BMI, or body fat percentage. Self-management and self-knowledge, which rely on one another for rational and well-considered decisions, are both characteristic values of the neoliberal doctrine. Many participants also connected this neoliberal way of thinking about and looking after the body to rewarding themselves for their discipline, hard work, or achievement of specific body management goals (e.g., eating chocolate for reaching a daily step count goal). A few participants, however, showed resistance against these neoliberal values, in particular against the precise monitoring and management of the body with the help of self-tracking devices. Ultimately, however, most participants seem to have internalised the dominant discourses around self-responsibility and, by association, a sense of duty to discipline their bodies in normative ways. Even those who indicated resistance against body work and body management practices that follow neoliberal thinking and measurement systems are aware of, and have internalised, the concept of the rationally operating mind that needs, or should decide, how to look after the body in terms of health and appearance ideals. The discussion of the collected data thereby shows that embodiment and the mind/body dualism constitute two connected, rather than separate or opposing, concepts.

Keywords: dualism, embodiment, mind, neoliberalism

Procedia PDF Downloads 163
728 Mineral Deposits in Spatial Planning Systems – Review of European Practices

Authors: Alicja Kot-Niewiadomska

Abstract:

Securing sustainable access to raw materials is vital for the growth of the European economy and for the goals laid down in the Europe 2020 strategy. One of the most important sources of mineral raw materials is primary deposits, and their efficient management, including extraction, will ensure the competitiveness of the European economy. A critical element of this approach is mineral deposit safeguarding, and its most important tool is spatial planning. Safeguarding of deposits should be understood as safeguarding access to the land and protecting the area against development that may potentially prevent the use of the deposit and the necessary mining activities. Many European Union countries have successfully integrated their mineral policy and spatial policy, ensuring a proper place for mineral deposits in their spatial planning systems. These systems, in turn, are widely recognized as the most important mineral deposit safeguarding tool, the essence of which is to ensure long-term access to resources. The examples of Austria, Portugal, Slovakia, the Czech Republic, Sweden, and the United Kingdom discussed in this paper are often cited as good practices in this area. Although none of these countries has managed to avoid social and environmental conflicts related to mining activities, the solutions they implement certainly deserve special attention, and for many countries, including Poland, they can be a potential source of solutions for improving the protection of mineral deposits.

Keywords: mineral deposits, land use planning, mineral deposit safeguarding, European practices

Procedia PDF Downloads 171
727 Optimization of Multistage Extractor for the Butanol Separation from Aqueous Solution Using Ionic Liquids

Authors: Dharamashi Rabari, Anand Patel

Abstract:

n-Butanol can be regarded as a potential biofuel. Being resistant to corrosion and having a high calorific value, butanol is a very attractive energy source compared with ethanol. Bio-butanol can be produced by the ABE (acetone, butanol, ethanol) fermentation process, carried out mostly by the bacterium Clostridium acetobutylicum. The major drawback of the process is that butanol concentrations higher than 10 g/L delay the growth of the microbes, resulting in a low yield; this calls for simultaneous separation of butanol from the fermentation broth. Two hydrophobic ionic liquids (ILs) were chosen: 1-butyl-1-methylpiperidinium bis(trifluoromethylsulfonyl)imide [bmPIP][Tf₂N] and 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide [hmim][Tf₂N]. The binary interaction parameters for both ternary systems, i.e., [bmPIP][Tf₂N] + water + n-butanol and [hmim][Tf₂N] + water + n-butanol, were taken from literature generated with the NRTL model. Particle swarm optimization (PSO) with the isothermal sum rate (ISR) method was used to optimize the cost of the liquid-liquid extractor. For the [hmim][Tf₂N] + water + n-butanol system, PSO shows an 84% success rate with eight stages and a solvent flow rate of 461 kmol/hr. For the [bmPIP][Tf₂N] + water + n-butanol system, the optimum was three stages with a 269.95 kmol/hr solvent flow rate. Moreover, both ILs were very efficient, as the loss of IL to the raffinate phase was negligible.
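The optimization layer of this work, PSO searching over extractor design variables, can be sketched independently of the ISR cost evaluation. Below, a minimal PSO minimizes a stand-in quadratic cost whose minimum is placed at the optimum reported for the [hmim][Tf₂N] system (8 stages, 461 kmol/hr); the real cost function, bounds, and PSO hyperparameters are not given in the abstract, so everything numeric here is illustrative:

```python
import random

def pso(cost, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization over box bounds; returns the
    best position found and its cost."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + pull toward personal best + pull toward global best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            v = cost(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# Stand-in cost with its minimum at 8 stages and 461 kmol/hr solvent flow;
# in the study this role is played by the ISR extractor-cost evaluation.
cost = lambda x: (x[0] - 8) ** 2 + ((x[1] - 461) / 50) ** 2
best, val = pso(cost, bounds=[(2, 20), (100, 800)])
```

A production run would also round the stage count to an integer and repeat from many seeds, which is presumably how the reported 84% success rate was measured.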

Keywords: particle swarm optimization, isothermal sum rate method, success rate, extraction

Procedia PDF Downloads 122
726 Technology in the Calculation of People Health Level: Design of a Computational Tool

Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna

Abstract:

Background: The concept of health has evolved throughout history. The health level is determined by the individual's own perception and is a dynamic process over time, so variations can be seen from one moment to the next. Knowing the health of the patients you care for therefore facilitates decision-making in the treatment of care. Objective: To design a technological tool that calculates a person's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical formalization of knowledge, and work with an expert group. Study period: September 2015 to the present. Results: A computational tool for use by health personnel has been designed. It has 11 variables, each of which can be given a value from 1 to 5, with 1 the minimum and 5 the maximum. By adding the results of the 11 variables, we obtain a magnitude at a given time: the person's health level. The health calculator can represent a person's health level at a point in time, establishing temporal cuts that are useful for determining the individual's evolution over time. Conclusion: Information and Communication Technologies (ICT) enable training and help in various disciplinary areas, and their relevance in the field of health should be highlighted. Based on the formalization of health, care acts can be directed towards the propositional elements of the concept above; these care acts will modify the person's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
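The scoring rule described is simple enough to sketch directly: 11 variables, each scored 1 to 5, summed to a single magnitude per assessment time (the abstract does not name the variables, so they are treated here as an anonymous list):

```python
def health_level(scores):
    """Health level as the sum of 11 variables, each scored from
    1 (minimum) to 5 (maximum); the result ranges from 11 to 55."""
    if len(scores) != 11:
        raise ValueError("exactly 11 variables are required")
    if any(not 1 <= s <= 5 for s in scores):
        raise ValueError("each variable must score between 1 and 5")
    return sum(scores)

# Temporal cuts: one total per assessment time tracks the evolution.
timeline = [health_level([3] * 11), health_level([4] * 11)]
print(timeline)  # [33, 44]
```

A rising sequence of totals indicates improvement between cuts, which is the comparison the tool uses to represent a patient's evolution over time.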

Keywords: calculator, care, eHealth, health

Procedia PDF Downloads 264
725 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception

Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu

Abstract:

Opinion mining (OM) is a natural language processing (NLP) problem of determining the polarity of opinions, mostly represented on a positive-neutral-negative axis. Data for OM are usually collected from various social media platforms. In an era where social media has considerable control over companies' futures, it is worth understanding social media and acting accordingly. OM comes to the fore here because, as the scale of discussion about companies increases, it becomes infeasible to gauge opinion at the individual level, so companies opt to automate the process by applying machine learning (ML) approaches to their data. For the last two decades, OM, or sentiment analysis (SA), has mainly been performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to bag-of-n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. The transfer learning paradigm, commonly used in computer vision (CV) problems, has lately come to shape NLP approaches and language models (LMs). This gave a sudden rise to the usage of pretrained language models (PTMs), which contain language representations obtained by training on large datasets with self-supervised learning objectives. PTMs are further fine-tuned on a specialized downstream task dataset to produce efficient models for various NLP tasks such as OM, named-entity recognition (NER), question answering (QA), and so forth. In this study, traditional and modern NLP approaches are evaluated for OM on a sizable corpus belonging to a large private company, containing about 76,000 comments in Turkish: an SVM with a bag of n-grams, and two chosen pre-trained models, the multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT).
MUSE is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. BERT, a monolingual model in our case, is based on transformer neural networks; it uses masked language modeling and next-sentence prediction tasks that allow bidirectional training of the transformers. During training, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since experiments showed their contribution to model performance to be insignificant, even though Turkish is a highly agglutinative and inflective language. The results show that deep learning methods with pre-trained models and fine-tuning achieve about an 11% improvement over the SVM for OM: the BERT model achieved around 94% prediction accuracy, the MUSE model around 88%, and the SVM around 83%. The multilingual MUSE model thus outperforms the SVM but still performs worse than the monolingual BERT model.
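The traditional baseline's representation, a bag of n-gram counts fed to an SVM, can be sketched directly. This is a word-level version with hypothetical Turkish input; a real pipeline would typically build these counts with scikit-learn's CountVectorizer and train a LinearSVC on them:

```python
from collections import Counter

def bag_of_ngrams(text, n_range=(1, 2)):
    """Count word n-grams (unigrams and bigrams by default) in a comment;
    the resulting sparse count vector is the SVM's input representation."""
    tokens = text.lower().split()
    counts = Counter()
    for n in range(n_range[0], n_range[1] + 1):
        for i in range(len(tokens) - n + 1):
            counts[" ".join(tokens[i:i + n])] += 1
    return counts

# Hypothetical Turkish comment (not from the study's corpus).
bag = bag_of_ngrams("çok iyi çok güzel")
print(bag["çok"])      # 2
print(bag["çok iyi"])  # 1
```

The contrast with the PTM approaches is that these counts are purely surface-level, whereas MUSE and BERT map the same comment into learned dense representations.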

Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish

Procedia PDF Downloads 146
724 Investigation of Rifampicin and Isoniazid Resistance Mutated Genes in Mycobacterium Tuberculosis Isolated From Patients

Authors: Seyyed Mohammad Amin Mousavi Sagharchi, Alireza Mahmoudi Nasab, Tim Bakker

Abstract:

Introduction: Mycobacterium tuberculosis (MTB) is, to the best of our knowledge, one of the most adaptable bacteria in existence. It causes tuberculosis (TB), a rapidly spreading disease that kills millions of people around the world. MTB can escape anti-tuberculosis (AT) drugs by acquiring mutations in key genes, creating new resistance patterns. Method and materials: MTB was isolated from the sputum specimens of 35 patients in hospitals in Tehran province and detected by culture on Löwenstein-Jensen (LJ) medium and microscopic examination. DNA was extracted from established bacterial colonies by an enzymatic extraction method, amplified by the polymerase chain reaction (PCR), and evaluated by reverse hybridization for detection of resistance genes using the GenoType MTBDRplus assay. Results: 21 of the isolated specimens (about 60%) carried a mutation in the rpoB gene, conferring rifampicin resistance (the most prevalent), and 8 (about 22.8%) carried a mutation in the katG or inhA genes, conferring isoniazid resistance. Also, 4 specimens (about 11.4%) carried no mutation, and 2 (about 5.7%) carried mutations in all three genes, making them resistant to both drugs mentioned above. Conclusion: Rifampicin and isoniazid are two essential AT drugs used in the first line of treatment. Resistance mutations in the rpoB, katG, and inhA genes lead to ineffective treatment with these drugs.

Keywords: mycobacterium tuberculosis, tuberculosis, drug resistance, isoniazid, rifampicin

Procedia PDF Downloads 96
723 Electrokinetic Regulation of Flow in Microcrack Reservoirs

Authors: Aslanova Aida Ramiz

Abstract:

One of the important aspects of rheophysical problems in oil and gas extraction is the regulation of the thermohydrodynamic properties of liquid systems using physical and physicochemical methods. The constituent parts of real fluid systems in oil and gas production are practically non-conducting, non-magnetically-active components. From a structural point of view, real heterogeneous hydrocarbon systems consist of an infinite number of microscopic local ion-electrostatic cores distributed in the volume of the dispersion medium. According to Cohen's rule, double electric layers form at the contact boundaries of components (oil-gas, oil-water, water-condensate, etc.) in a heterogeneous system, and as a result each real fluid system can be represented as a complex composition of a set of local electrostatic fields. The electrokinetic properties of this structure are characterized by a certain electrode potential. Prof. F.H. Valiyev called this potential the α-factor and proposed that many natural and technological rheophysical processes (effects) are essentially electrokinetic in nature, and that by changing the α-factor it is possible to adjust the physical properties of real fluid systems, including their thermohydrodynamic parameters. Based on this idea, extensive research was conducted, and the possibility of reducing hydraulic resistance and improving rheological properties in real liquid systems by reducing the electrical potential with various physical and chemical methods was experimentally demonstrated.

Keywords: microcracked, electrode potential, hydraulic resistance, Newtonian fluid, rheophysical properties

Procedia PDF Downloads 77
722 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria

Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur

Abstract:

The Cultural Park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetland areas to be preserved and managed in both time and space. Managing such a large area is complex and requires large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information, etc.), so conventional and traditional methods are quite difficult to apply. Remote sensing, given its efficiency in environmental applications, has become an indispensable tool for this kind of study; remotely sensed imaging data have proven very useful over the last decade, aiding in domains such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we attempt to automatically extract wetland areas using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites, both high-resolution multispectral imagers with a 30 m resolution. We used images acquired over several areas of interest in the Cultural Park of Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The results show that wetland areas can be accurately distinguished from other land-use themes through careful exploitation of the spectral indices.
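As an illustration of the spectral-index approach, the widely used normalized difference water index (NDWI), computed from green and near-infrared reflectance, can flag likely water or wetland pixels; the reflectance values and threshold below are invented for illustration and are not from the study:

```python
def ndwi(green, nir):
    """Normalized Difference Water Index from green and NIR reflectance."""
    return (green - nir) / (green + nir)

def classify_wetland(green_band, nir_band, threshold=0.0):
    """Flag pixels whose NDWI exceeds a threshold as water/wetland."""
    return [ndwi(g, n) > threshold for g, n in zip(green_band, nir_band)]

# Illustrative reflectance values for four pixels: water reflects little
# NIR, so the first two pixels score high and the last two score low.
green = [0.08, 0.10, 0.30, 0.25]
nir = [0.02, 0.04, 0.45, 0.40]
mask = classify_wetland(green, nir)
```

A real workflow would combine several such indices, band by band, before thresholding the fraction of wetland occupation.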

Keywords: multispectral data, EO1, landsat, wetlands, Ahaggar, Algeria

Procedia PDF Downloads 377
721 Optical Design and Modeling of Micro Light-Emitting Diodes for Display Applications

Authors: Chaya B. M., C. Dhanush, Inti Sai Srikar, Akula Pavan Parvatalu, Chirag Gowda R

Abstract:

Recently, there has been a lot of interest in µ-LED technology because of its exceptional qualities, including self-emission, high visibility, low power consumption, rapid response, and longevity. III-nitride light-emitting diodes (LEDs) are finding increasing use as miniaturization technology advances, in applications such as lighting sources, visible light communication (VLC) devices, and high-power devices. The use of micro-LED displays in place of traditional display technologies like liquid crystal displays (LCDs) and organic light-emitting diodes (OLEDs) is one of the most prominent recent advances, and may even represent the next generation of displays. The development of fully integrated, multifunctional devices and the incorporation of extra capabilities into micro-LED displays, such as sensing, light detection, and solar cells, are the pillars of advanced technology. Because of the wide range of applications for micro-LED technology, the effectiveness and dependability of these devices under numerous harsh conditions are becoming increasingly important, and considerable research has been conducted to overcome the shortcomings of micro-LED devices. In this paper, different micro-LED design structures are proposed in order to achieve optimized optical properties. The devices' light extraction efficiency (LEE) has also been boosted in order to attain improved external quantum efficiency (EQE).
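The relationship this line of work relies on, that external quantum efficiency is the product of internal quantum efficiency and light extraction efficiency (EQE = IQE × LEE), can be sketched numerically; the efficiency values below are illustrative assumptions, not measured results:

```python
def external_quantum_efficiency(iqe, lee):
    """EQE is the product of internal quantum efficiency and
    light extraction efficiency (both as fractions in [0, 1])."""
    if not (0.0 <= iqe <= 1.0 and 0.0 <= lee <= 1.0):
        raise ValueError("efficiencies must be fractions between 0 and 1")
    return iqe * lee

# Illustrative: raising LEE from 0.3 to 0.5 at a fixed IQE of 0.8
# lifts EQE from 0.24 to 0.40, which is why structural designs that
# improve light out-coupling matter even when the active region is fixed.
eqe_before = external_quantum_efficiency(0.8, 0.3)
eqe_after = external_quantum_efficiency(0.8, 0.5)
```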

Keywords: finite difference time domain, light out coupling efficiency, far field intensity, power density, quantum efficiency, flat panel displays

Procedia PDF Downloads 79
720 Carbon Nanomaterials from Agricultural Wastes for Adsorption of Organic Pollutions

Authors: Magdalena Blachnio, Viktor Bogatyrov, Mariia Galaburda, Anna Derylo-Marczewska

Abstract:

Agricultural waste materials from a traditional oil mill, and residues left after supercritical extraction of natural raw materials, were used for the preparation of carbon nanomaterials (activated carbons) by two different methods. Chemical activation using acetic acid and physical activation with a gaseous agent (carbon dioxide) were chosen as mild and environmentally friendly routes. The effect of the influential factors (type of raw material, temperature, and activation agent) on the porous structure characteristics of the materials was discussed using N₂ adsorption/desorption isotherms at 77 K. Furthermore, scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD), and X-ray photoelectron spectroscopy (XPS) were employed to examine the physicochemical properties of the obtained sorbents. Selection of a raw material and optimization of the synthesis conditions made it possible to obtain cheap sorbents with a targeted pore distribution enabling effective adsorption of the model organic pollutants in multicomponent systems. The adsorption behavior (capacity and rate) of the chosen activated carbons was estimated using crystal violet (CV), 4-chlorophenoxyacetic acid (4-CPA), and 2,4-dichlorophenoxyacetic acid (2,4-D) as the adsorbates. Both the rate and the adsorption capacity of the organics on the sorbents indicated that the activated carbons could be effectively used in sewage treatment plants. The mechanisms of organics adsorption were studied and correlated with the activated carbons' properties.

Keywords: activated carbon, adsorption equilibrium, adsorption kinetics, organics adsorption

Procedia PDF Downloads 177
719 An Engineered Epidemic: Big Pharma's Role in the Opioid Crisis

Authors: Donna L. Roberts

Abstract:

2019 marked 23 years since Purdue Pharma launched its flagship drug, OxyContin, which unleashed an unprecedented epidemic touching celebrities and ordinary citizens alike, in metropolitan, suburban, and rural areas and at all levels of socioeconomic status. From rural Appalachia to East LA, individuals, families, and communities have been devastated by a trajectory of addiction that often began with the legitimate prescription of a painkiller for anything from a tooth extraction to a sports injury to recovery from surgery or chronic arthritis. Far from being a serendipitous progression of events, the proliferation of this new breed of 'miracle drug' was instead a carefully crafted marketing program aimed at both the medical community and ordinary citizens. This research represents an in-depth investigation of the evolution of the marketing, distribution, and promotion of prescription opioids by pharmaceutical companies and its relationship to the propagation of the opioid crisis. Specifically, key components of Purdue Pharma's aggressive marketing campaign, including its bonus system and sales incentives, were analyzed in the context of the sociopolitical environment that essentially created the proverbial 'perfect storm' for the changing manner in which pain is treated in the U.S. The analysis of this series of events clearly indicates its role in, first, the increase in the prescription of opioids for non-terminal pain relief and, subsequently, the incidence of related addiction, overdose, and death. Through this examination of the conditions that facilitated and maintained this drug crisis, perhaps we can begin to chart a course toward its resolution.

Keywords: addiction, opioid, opioid crisis, Purdue Pharma

Procedia PDF Downloads 121
718 The Abundance and Distribution of Locally Important Species Along Different Altitude: The Case of Mountain Damota, Wolaita South Ethiopia

Authors: Tamirat Solomon, Tadesse Faltamo, Belete Limani

Abstract:

This study was conducted on the mountain Damota of Wolaita to assess the abundance and spatial distribution of two locally important indigenous medicinal plants on the mountain landscape. A total of 130 plots measuring 20x20 m were established along eight systematically laid transect lines. In each plot, the abundance and distribution of Hagenia abyssinica (a tree) and Pentas schiperiana Vatke (a shrub) were evaluated. The abundance and distribution of H. abyssinica were evaluated by measuring height and DBH for mature trees and counting seedlings and saplings, whereas P. schiperiana Vatke was assessed by counting individuals in each plot. Across all study plots, a total of 485 H. abyssinica and 760 P. schiperiana Vatke individuals were recorded. The distribution of both species increased with altitude, and the highest abundance was recorded in the altitude range between 2332 and 2661 m a.s.l. At altitudes below 2320 m a.s.l., however, the species' distribution and abundance decreased, indicating either the ecological preference of the species or the influence of extraction by the local community surrounding the mountain. On average, only 28 seedlings/ha of H. abyssinica and 146/ha of P. schiperiana Vatke were recorded in the study areas, showing a tendency of decline in the abundance and distribution of both species. Finally, we recommend management interventions for these socially important species, which are under threat on the mountain landscape.

Keywords: indigenous medicinal plants, H. abyssinica, P. schiperiana, distribution, abundance, socio-economic importance

Procedia PDF Downloads 122
717 Validation of Mapping Historical Linked Data to International Committee for Documentation (CIDOC) Conceptual Reference Model Using Shapes Constraint Language

Authors: Ghazal Faraj, András Micsik

Abstract:

Shapes Constraint Language (SHACL), a World Wide Web Consortium (W3C) language, expresses well-defined constraints as RDF graphs called "shapes graphs". These shapes graphs validate other resource description framework (RDF) graphs, called "data graphs". The structural features of SHACL permit generating a variety of conditions to evaluate string-matching patterns, value types, and other constraints. Moreover, the SHACL framework supports higher-level validation by expressing more complex conditions in languages such as the SPARQL Protocol and RDF Query Language (SPARQL). SHACL has two parts: SHACL Core and SHACL-SPARQL. SHACL Core includes the shapes that cover the most frequent constraint components, while SHACL-SPARQL is an extension that allows SHACL to express more complex, customized constraints. Validating the efficacy of dataset mapping is an essential component of data reconciliation, as linking different datasets is an ongoing process. The conventional validation methods are a semantic reasoner and SPARQL queries. The former checks formalization errors and data-type inconsistencies, while the latter detects data contradictions. After executing SPARQL queries, the retrieved information must be checked manually by an expert; this methodology is time-consuming and inaccurate, as it does not test the mapping model comprehensively. Therefore, there is a serious need for a new methodology that covers all validation aspects for linking and mapping diverse datasets. Our goal is to develop a new approach that achieves optimal validation outcomes. The first step towards this goal is implementing SHACL to validate the mapping between the International Committee for Documentation (CIDOC) conceptual reference model (CRM) and one of its ontologies. To initiate this project successfully, a thorough understanding of both the source and target ontologies was required.
Subsequently, the proper environment to run SHACL and its shapes graphs was determined. As a case study, we applied SHACL to a CIDOC-CRM dataset after running the Pellet reasoner via the Protégé program. The applied validation falls under multiple categories: a) data-type validation, which checks whether the source data is mapped to the correct data type, for instance, whether a birthdate is typed as xsd:dateTime and linked to a Person entity via the crm:P82a_begin_of_the_begin property; b) data integrity validation, which detects inconsistent data, for instance, whether a person's birthdate occurred before any of the linked event creation dates. The expected results of our work are: 1) highlighting validation techniques and categories, and 2) selecting the most suitable techniques for the various categories of validation tasks. The next step is to establish a comprehensive validation model and generate SHACL shapes automatically.
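A minimal shapes graph for the data-type check described in (a) might look like the following sketch; the ex: prefix and shape name are our assumptions, and the direct path from the birth event to the date literal is a simplification (in the full CRM the date is normally reached through an E52 Time-Span):

```turtle
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix ex:  <http://example.org/shapes#> .

# Illustrative data-type constraint: the value reached via
# crm:P82a_begin_of_the_begin must be a single xsd:dateTime literal.
ex:BirthDateShape
    a sh:NodeShape ;
    sh:targetClass crm:E67_Birth ;
    sh:property [
        sh:path crm:P82a_begin_of_the_begin ;
        sh:datatype xsd:dateTime ;
        sh:maxCount 1 ;
    ] .
```

A SHACL processor run against the data graph then reports every birth event whose date is missing, duplicated, or mistyped.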

Keywords: SHACL, CIDOC-CRM, SPARQL, validation of ontology mapping

Procedia PDF Downloads 253
716 Numerical Simulation of a Point Absorber Wave Energy Converter Using OpenFOAM in Indian Scenario

Authors: Pooja Verma, Sumana Ghosh

Abstract:

There is a growing need for alternative ways of power generation worldwide, attributable to the limited reserves of fossil fuels, environmental pollution, the increasing cost of conventional fuels, and the low energy-conversion efficiency of existing systems. In this context, one of the potential alternatives for power generation is wave energy. However, it is difficult to estimate the amount of electrical energy generated in irregular sea conditions by experimental or analytical methods. Therefore, in this work a numerical wave tank is developed using the computational fluid dynamics software OpenFOAM, with the waves2Foam utility used to carry out the simulations. The computational domain is a tank of dimensions 5 m × 1.5 m × 1 m with a floating object of dimensions 0.5 m × 0.2 m × 0.2 m. Regular waves are generated at the inlet of the wave tank according to Stokes second-order theory. The main objective of the present study is to validate the numerical model against existing experimental data, and it shows good agreement with the existing experimental data for floater displacement. The model is then used to estimate the energy extracted by the movement of such a point absorber in real sea conditions. Scaled-down wave properties, such as wave height and wavelength, are used as input parameters, and seasonal variations are also considered.
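Generating regular waves at the inlet requires the wavenumber (and hence wavelength) for a given wave period and water depth, which follows from the linear dispersion relation ω² = gk·tanh(kh). A sketch of solving it by fixed-point iteration, with illustrative tank-scale values:

```python
import math

def wavenumber(period, depth, g=9.81, tol=1e-10):
    """Solve the linear dispersion relation w^2 = g*k*tanh(k*h)
    for the wavenumber k by fixed-point iteration."""
    omega = 2.0 * math.pi / period
    k = omega * omega / g  # deep-water initial guess
    for _ in range(200):
        k_new = omega * omega / (g * math.tanh(k * depth))
        if abs(k_new - k) < tol:
            break
        k = k_new
    return k

# Illustrative tank-scale wave: a 1.2 s period in 1 m of water
k = wavenumber(1.2, 1.0)
wavelength = 2.0 * math.pi / k
```

Stokes second-order theory then adds a bound harmonic on top of this linear solution, but the wavenumber itself still comes from the same dispersion relation.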

Keywords: OpenFOAM, numerical wave tank, regular waves, floating object, point absorber

Procedia PDF Downloads 352
715 A Systematic Review Examining the Experimental Methodology behind in Vivo Testing of Hiatus Hernia and Diaphragmatic Hernia Mesh

Authors: Whitehead-Clarke T., Beynon V., Banks J., Karanjia R., Mudera V., Windsor A., Kureshi A.

Abstract:

Introduction: Mesh implants are regularly used to help repair both hiatus hernias (HH) and diaphragmatic hernias (DH). In vivo studies are used to test not only mesh safety but, increasingly, comparative efficacy. Our work examines the field of in vivo mesh testing for HH and DH models to establish current practices and standards. Method: This systematic review was registered with PROSPERO. The Medline and Embase databases were searched for relevant in vivo studies. 44 articles were identified and underwent abstract review, at which point 22 were excluded. 4 further studies were excluded after full-text review, leaving 18 to undergo data extraction. Results: Of the 18 studies identified, 9 used an in vivo HH model and 9 a DH model. 5 studies undertook mechanical testing on tissue samples, all uniaxial in nature. Testing strip widths ranged from 1 to 20 mm (median 3 mm). Testing speeds varied from 1.5 to 60 mm/minute. On histology, the most commonly assessed structural and cellular factors were neovascularization and macrophages, respectively (n=9 each). Structural analysis was mostly qualitative, whereas cellular analysis was equally likely to be quantitative. 11 studies assessed adhesion formation, of which 8 used one of four scoring systems. 8 studies measured mesh shrinkage. Discussion: In vivo studies assessing mesh for HH and DH repair are uncommon. Within this relatively young field, we encourage surgical and materials-testing institutions to discuss its standardisation.

Keywords: hiatus, diaphragmatic, hernia, mesh, materials testing, in vivo

Procedia PDF Downloads 214
714 A Lexicographic Approach to Obstacles Identified in the Ontological Representation of the Tree of Life

Authors: Sandra Young

Abstract:

The biodiversity literature is vast and heterogeneous. In today's data age, a number of data integration and standardisation initiatives aim to facilitate simultaneous access to the literature across biodiversity domains for research and forecasting purposes. Ontologies are being used increasingly to organise this information, but the rationalisation intrinsic to ontologies can hit obstacles when faced with the fluidity and inconsistency found in the domains comprising biodiversity. Essentially the problem is a conceptual one: biological taxonomies are formed on the basis of specific, physical specimens, yet nomenclatural rules are used to provide labels to describe these physical objects, and these labels are ambiguous representations of the physical specimen. An example of this is the genus name Melpomene, the scientific nomenclatural representation of a genus of ferns but also of a genus of spiders. The physical specimens for each of these are vastly different, but they have been assigned the same nomenclatural reference. While there is much research into the conceptual stability of the taxonomic concept versus the nomenclature used, to the best of our knowledge no research has yet looked empirically at the literature to see the conceptual plurality or singularity of the use of these species' names, the linguistic representations of physical entities. Language itself uses words as symbols to represent real-world concepts, whether physical entities or otherwise, and as such lexicography has a well-founded history in the conceptual mapping of words in context for dictionary making. This makes it an ideal candidate to explore this problem. The lexicographic approach uses corpus-based analysis to look at word use in context, with a specific focus on collocated word frequencies (the frequencies of words used in specific grammatical and collocational contexts).
It allows for inconsistencies and contradictions in the source data and in fact includes these in the word characterisation so that 100% of the available evidence is counted. Corpus analysis is indeed suggested as one of the ways to identify concepts for ontology building, because of its ability to look empirically at data and show patterns in language usage, which can indicate conceptual ideas which go beyond words themselves. In this sense it could potentially be used to identify if the hierarchical structures present within the empirical body of literature match those which have been identified in ontologies created to represent them. The first stages of this research have revealed a hierarchical structure that becomes apparent in the biodiversity literature when annotating scientific species’ names, common names and more general names as classes, which will be the focus of this paper. The next step in the research is focusing on a larger corpus in which specific words can be analysed and then compared with existing ontological structures looking at the same material, to evaluate the methods by means of an alternative perspective. This research aims to provide evidence as to the validity of the current methods in knowledge representation for biological entities, and also shed light on the way that scientific nomenclature is used within the literature.
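Collocated word frequencies of the kind described above can be sketched with a simple windowed co-occurrence count; the mini-corpus and window size below are illustrative assumptions, not the study's data:

```python
from collections import Counter

def collocates(tokens, node, window=2):
    """Count words co-occurring with a node word within +/- window tokens."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok == node:
            lo = max(0, i - window)
            hi = min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[tokens[j]] += 1
    return counts

# Illustrative mini-corpus: the ambiguous name "melpomene" appears once
# in a fern context and once in a spider context.
corpus = ("the fern genus melpomene grows on trees "
          "the spider genus melpomene builds funnel webs").split()
freqs = collocates(corpus, "melpomene")
```

Contrasting collocate profiles (here "fern" versus "spider") are exactly the kind of evidence that signals conceptual plurality behind a single name.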

Keywords: ontology, biodiversity, lexicography, knowledge representation, corpus linguistics

Procedia PDF Downloads 137
713 Enhancement Production and Development of Hot Dry Rock System by Using Supercritical CO2 as Working Fluid Instead of Water to Advance Indonesia's Geothermal Energy

Authors: Dhara Adhnandya Kumara, Novrizal Novrizal

Abstract:

Hot Dry Rock (HDR) is a geothermal energy resource that is abundant in many provinces of Indonesia. Exploiting heat from HDR requires a method that injects fluid into the subsurface to crack the rock and sweep up the heat. Water is commonly used as the working fluid but is known to be less effective in some respects. Recent research has found that supercritical CO2 (SCCO2) can be used to replace water as the working fluid. By studying heat transfer efficiency, pumping power, and the characteristics of the returning fluid, we can decide how effectively SCCO2 replaces water as the working fluid. The methods to study those parameters quantitatively can be obtained from pre-existing studies that observe the returning fluids from the same reservoir at the same pumping power. The results show that SCCO2 works better than water. The density contrast between cold and hot SCCO2 yields higher buoyancy in the system, allowing the fluid to circulate with lower pumping power; in addition, the lower viscosity of SCCO2 results in a higher circulation flow rate. The interaction between SCCO2 and minerals in the reservoir can induce dehydration of the minerals and enhancement of rock porosity and permeability, while the dissolution and transport of minerals by SCCO2 are unlikely because SCCO2 is a poor solvent, which reduces mineral scaling in the system. Under these conditions, using SCCO2 as the working fluid for HDR extraction would give great advantages in advancing geothermal energy in Indonesia.

Keywords: geothermal, supercritical CO2, injection fluid, hot dry rock

Procedia PDF Downloads 217
712 Dyeing of Wool and Silk with Soxhlet Water Extracted Natural Dye from Dacryodes macrophylla Fruits and Study of Antimicrobial Properties of Extract

Authors: Alvine Sandrine Ndinchout, D. P. Chattopadhyay, Moundipa Fewou Paul, Nyegue Maximilienne Ascension, Varinder Kaur, Sukhraj Kaur, B. H. Patel

Abstract:

Dacryodes macrophylla is a species of the Burseraceae family that is widespread in Cameroon, Equatorial Guinea, and Gabon. The only part of D. macrophylla known to be used is the pulp contained in the fruit. This very juicy pulp is consumed directly and used in making juices. During consumption, the fruit leaves a dark blackish colour on fingers and garments. This observation suggests that D. macrophylla fruits may be a good source of natural dye, probably with good fastness properties on textile materials. To the best of our knowledge, however, D. macrophylla has not yet been investigated as a potential source of natural dye. The natural dye was extracted using water as the solvent by the Soxhlet extraction method. The extracted colorant was characterized by spectroscopic studies such as UV/Visible and further tested for antimicrobial activity against gram-negative (Vibrio cholerae, Escherichia coli, Salmonella enterica serotype Typhi, Shigella flexneri) and gram-positive (Listeria monocytogenes, Staphylococcus aureus) bacteria. The water extract of D. macrophylla showed antimicrobial activity against S. enterica. The fastness properties of the dyed fabrics were fair to good. Taken together, these results indicate that D. macrophylla can be used as a natural dye not only in textiles but also in other domains like food coloring.

Keywords: antimicrobial activity, natural dye, silk, wash fastness, wool

Procedia PDF Downloads 175
711 Distribution of Spotted Fever Group in Ixodid Ticks, Domestic Cattle and Buffalos of Faisalabad District, Punjab, Pakistan

Authors: Muhammad Sohail Sajid, Qurat-ul-Ain, Zafar Iqbal, Muhammad Nisar Khan, Asma Kausar, Adil Ejaz

Abstract:

Rickettsiosis, caused by spotted fever group rickettsiae (SFGR), is considered an emerging infectious disease from both public health and veterinary perspectives. The present study reports the distribution of SFGR in host (buffalo and cattle) and vector (tick) populations, determined through gene-specific PCR amplification targeting the outer membrane protein gene (ompA). Tick and blood samples were collected from district Faisalabad using standard protocols and convenience sampling. Ticks were dissected to extract the salivary glands (SG). Blood and tick SG pools were subjected to DNA extraction and amplification of ompA using PCR. The overall prevalence of SFGR was 21.5% in blood and 33.6% in ticks. Hyalomma anatolicum was the tick more frequently associated with SFGR, compared to Rhipicephalus sp. A higher prevalence of SFGR was reported in the cattle population (25%) than in buffalo (17.07%). Seasonally, SFGR prevalence was higher in spring (48.1%, 26.32%, and 17.76%) than in winter (27.9%, 21.43%, and 15.38%) in the vector, cattle, and buffalo populations, respectively. Sequencing analysis indicated that rickettsial endosymbionts were associated with ticks of the study area. These results provide baseline information about the prevalence of SFGR in vector and host populations.

Keywords: Rickettsia, livestock, polymerase chain reaction, sequencing, ticks, vectors

Procedia PDF Downloads 269
710 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by using a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio, and additionally apply a wavelet transform to further suppress noise while preserving the integrity of the underlying signals. This approach significantly improves data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data-processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
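Signal averaging (stacking) of repeated transients, the first step described above, can be sketched in a few lines; the synthetic decay curve, noise level, and trace count below are illustrative assumptions, not the FASTSNAP system's actual parameters:

```python
import random

def stack(traces):
    """Average repeated TEM transients sample-by-sample (signal stacking)."""
    n = len(traces)
    return [sum(samples) / n for samples in zip(*traces)]

random.seed(42)
# Idealized decaying transient, sampled at 20 time gates
true_signal = [1.0 / (1.0 + 0.5 * t) for t in range(20)]
# 64 repeated noisy acquisitions of the same transient; for uncorrelated
# noise, stacking N traces improves SNR by roughly sqrt(N).
traces = [[s + random.gauss(0.0, 0.2) for s in true_signal] for _ in range(64)]
stacked = stack(traces)

# Residual error of the stack versus a single raw trace
err_single = max(abs(a - b) for a, b in zip(traces[0], true_signal))
err_stacked = max(abs(a - b) for a, b in zip(stacked, true_signal))
```

In a full workflow the stacked transient would then be passed to the wavelet-based denoising stage, which targets the correlated noise that averaging cannot remove.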

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 85