Search results for: fault detection and classification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5549


1079 Using Cyclic Structure to Improve Inference on Network Community Structure

Authors: Behnaz Moradijamei, Michael Higgins

Abstract:

Identifying community structure is a critical task in analyzing social media data sets, which are often modeled as networks. Statistical models such as the stochastic block model have proven able to explain the structure of communities in real-world network data. In this work, we develop a goodness-of-fit test to examine the existence of community structure by using a distinguishing property of networks: cyclic structures are more prevalent within communities than across them. To better understand how communities are shaped by the cyclic structure of the network, rather than just the number of edges, we introduce a novel method for deciding on the existence of communities. We exploit these structures by incorporating the renewal non-backtracking random walk (RNBRW) into the existing goodness-of-fit test. RNBRW is an important variant of the random walk in which the walk is prohibited from returning to a node in exactly two steps, and terminates and restarts once it completes a cycle. We investigate the use of RNBRW to improve the performance of existing goodness-of-fit tests for community detection algorithms based on the spectral properties of the adjacency matrix. Our proposed test of community structure is based on the probability distribution of the eigenvalues of the normalized retracing probability matrix derived from RNBRW. We make use of asymptotic results on this distribution when there is no community structure, i.e., the asymptotic distribution under the null hypothesis. Moreover, we provide a theoretical foundation for our statistic by obtaining the true mean and a tight lower bound on the variance of the RNBRW edge weights.
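
The walk described above can be sketched in a few lines. The following is a minimal pure-Python illustration (not the authors' implementation) of how RNBRW accumulates weight on cycle-closing ("retracing") edges; the example graph is invented:

```python
import random
from collections import defaultdict

def rnbrw_edge_weights(adj, n_walks=500, seed=0):
    """Toy RNBRW: walk without backtracking (never return to a node in
    exactly two steps) until a cycle closes, then increment the weight of
    the cycle-closing ("retracing") edge and restart."""
    rng = random.Random(seed)
    weights = defaultdict(int)
    nodes = sorted(adj)
    for _ in range(n_walks):
        u = rng.choice(nodes)
        v = rng.choice(sorted(adj[u]))
        visited = {u}
        while True:
            choices = [w for w in adj[v] if w != u]  # forbid backtracking
            if not choices:
                break                                # dead end: restart
            w = rng.choice(choices)
            if w in visited:                         # cycle completed
                weights[frozenset((v, w))] += 1
                break
            visited.add(v)
            u, v = v, w
    return weights

# two triangles joined by a bridge edge (2, 3); the bridge lies on no
# cycle, so it can never be a retracing edge
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
w = rnbrw_edge_weights(adj)
```

On this toy graph the bridge edge accumulates zero weight while the intra-triangle edges accumulate all of it, which is exactly the within-community bias the test exploits.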

Keywords: hypothesis testing, RNBRW, network inference, community structure

Procedia PDF Downloads 140
1078 Non-Destructive Testing of Selective Laser Melting Products

Authors: Luca Collini, Michele Antolotti, Diego Schiavi

Abstract:

At present, complex geometries, shrinking production times, rapidly increasing demand, and high quality-standard requirements make non-destructive (ND) control of additively manufactured components an indispensable means. On the other hand, a technology gap and the lack of standards regulating the methods and the acceptance criteria make the NDT of these components a stimulating field still to be fully explored. To date, penetrant testing, acoustic wave, tomography, radiography, and semi-automated ultrasound methods have been tested on metal-powder-based products. External defects, distortion, surface porosity, roughness, texture, internal porosity, and inclusions are the typical defects in the focus of testing. Detection of density and layer compactness has also been attempted on stainless steels by the ultrasonic scattering method. In this work, the authors present and discuss radiographic and ultrasonic ND testing of additively manufactured Ti₆Al₄V and Inconel parts obtained by selective laser melting (SLM) technology. In order to test the possibilities offered by the radiographic method, both X-rays and γ-rays are tried on a set of specifically designed specimens realized by SLM. The specimens contain a family of defects representing those most commonly found, such as cracks and lack of fusion. The tests are also applied to real parts of various complexity and thickness. A set of practical indications and acceptance criteria is finally drawn.

Keywords: non-destructive testing, selective laser melting, radiography, UT method

Procedia PDF Downloads 133
1077 A Straightforward Method for Determining Inorganic Selenium Speciations by Graphite Furnace Atomic Absorption Spectroscopy in Water Samples

Authors: Sahar Ehsani, David James, Vernon Hodge

Abstract:

In this experimental study, total selenium in solution was measured with Graphite Furnace Atomic Absorption Spectroscopy, GFAAS, then chemical reactions with sodium borohydride were used to reduce selenite to hydrogen selenide. Hydrogen selenide was then stripped from the solution by purging the solution with nitrogen gas. Since the two main speciations in oxic waters are usually selenite, Se(IV) and selenate, Se(VI), it was assumed that after Se(IV) is removed, the remaining total selenium was Se(VI). Total selenium measured after stripping gave Se(VI) concentration, and the difference of total selenium measured before and after stripping gave Se(IV) concentration. An additional step of reducing Se(VI) to Se(IV) was performed by boiling the stripped solution under acidic conditions, then removing Se(IV) by a chemical reaction with sodium borohydride. This additional procedure of removing Se(VI) from the solution is useful in rare cases where the water sample is reducing and contains selenide speciation. In this study, once Se(IV) and Se(VI) were both removed from the water sample, the remaining total selenium concentration was zero. The method was tested to determine Se(IV) and Se(VI) in both purified water and synthetic irrigation water spiked with Se(IV) and Se(VI). Average recovery of spiked samples of diluted synthetic irrigation water was 99% for Se(IV) and 97% for Se(VI). Detection limits of the method were 0.11 µg L⁻¹ and 0.32 µg L⁻¹ for Se(IV) and Se(VI), respectively.
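
The speciation arithmetic is a simple difference. A sketch, with made-up concentrations rather than the paper's data:

```python
def selenium_speciation(total_before, total_after_strip):
    """Infer Se(IV)/Se(VI) in an oxic water sample (concentrations in µg/L).

    Stripping with sodium borohydride and N2 purging removes Se(IV) as
    hydrogen selenide, so the residual total Se is attributed to Se(VI)
    and Se(IV) is obtained by difference.
    """
    se_vi = total_after_strip
    se_iv = total_before - total_after_strip
    return se_iv, se_vi

# hypothetical sample: 5.0 µg/L total Se before stripping, 3.2 µg/L after
se_iv, se_vi = selenium_speciation(5.0, 3.2)  # Se(IV) ≈ 1.8, Se(VI) = 3.2
```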

Keywords: analytical method, graphite furnace atomic absorption spectroscopy, selenate, selenite, selenium speciations

Procedia PDF Downloads 134
1076 Fuzzy Logic Classification Approach for Exponential Data Set in Health Care System for Predication of Future Data

Authors: Manish Pandey, Gurinderjit Kaur, Meenu Talwar, Sachin Chauhan, Jagbir Gill

Abstract:

Health-care management systems are of great interest because they provide simple and fast management of all aspects relating to a patient, not only medical ones. Moreover, there are more and more cases of pathologies in which diagnosis and treatment can only be carried out using medical imaging techniques. With ever-increasing prevalence, medical images are directly acquired in, or converted into, digital form for storage as well as subsequent retrieval and processing. Data mining is the process of extracting information from large data sets through algorithms and techniques drawn from the fields of statistics, machine learning, and database management systems. Forecasting is a prediction of what will occur in the future, and it is an uncertain process. Owing to this uncertainty, the accuracy of a forecast is as important as the outcome predicted by forecasting the independent variables. A forecast control should be used to establish whether the accuracy of the forecast is within satisfactory limits. Fuzzy regression methods have commonly been used to develop consumer preference models that correlate engineering characteristics with consumer preferences for a new product; consumer preference models provide a platform whereby product developers can decide on engineering characteristics in order to satisfy consumer preferences before developing the product. Recent analysis shows that these fuzzy regression methods are commonly used to model customer preferences. We propose testing the strength of an exponential regression model over a linear regression model.
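
The closing proposal, comparing an exponential regression model against a linear one, can be illustrated with synthetic data; the growth rate 0.12 and the series below are invented for the sketch:

```python
import numpy as np

t = np.arange(1.0, 13.0)         # 12 months of hypothetical observations
y = 50.0 * np.exp(0.12 * t)      # exponential-growth data (noise-free)

# linear model y = a*t + b, fitted by least squares
a, b = np.polyfit(t, y, 1)
ss_lin = np.sum((y - (a * t + b)) ** 2)

# exponential model y = c*exp(k*t), fitted as a straight line in log space
k, ln_c = np.polyfit(t, np.log(y), 1)
ss_exp = np.sum((y - np.exp(ln_c + k * t)) ** 2)

# ss_exp << ss_lin here, i.e. the exponential model fits far better
```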

Keywords: health-care management systems, fuzzy regression, data mining, forecasting, fuzzy membership function

Procedia PDF Downloads 269
1075 Image Processing of Scanning Electron Microscope Micrograph of Ferrite and Pearlite Steel for Recognition of Micro-Constituents

Authors: Subir Gupta, Subhas Ganguly

Abstract:

In this paper, we demonstrate a new area of application of image processing to metallurgical images, to develop more opportunities for structure-property-correlation-based approaches to alloy design. The present exercise focuses on the development of image processing tools suitable for phase segmentation, grain boundary detection, and recognition of micro-constituents in SEM micrographs of ferrite and pearlite steels. A comprehensive set of micrographs has been experimentally developed, encompassing variation of the ferrite and pearlite volume fractions and images taken at different magnifications (500X, 1000X, 1500X, 2000X, 3000X and 5000X) under a scanning electron microscope. The variation in volume fraction has been achieved using four different plain carbon steels containing 0.1, 0.22, 0.35 and 0.48 wt% C, heat treated under annealing and normalizing treatments. The obtained pool of micrographs was arbitrarily divided into two parts to develop training and testing sets. The statistical recognition features for the ferrite and pearlite constituents have been developed by learning from the training set of micrographs. The obtained features for microstructure pattern recognition were then applied to the test set. The analysis of the results shows that the developed strategy can successfully detect the micro-constituents across the wide range of magnifications and volume fractions of the constituents in the structure with an accuracy of about +/- 5%.
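
As a minimal illustration of the kind of segmentation involved, one might estimate a phase fraction by global thresholding; this is far simpler than the learned statistical features of the paper, and the threshold value and toy image below are assumptions:

```python
import numpy as np

def phase_fraction(image, threshold=128):
    """Estimate the dark-phase area fraction of a grayscale micrograph by
    global thresholding; in etched ferrite-pearlite steel the pearlite
    regions typically appear darker than the ferrite matrix."""
    image = np.asarray(image)
    return float(np.mean(image < threshold))

# toy 10x10 "micrograph": top 3 rows bright (ferrite), rest dark (pearlite)
toy = np.zeros((10, 10))
toy[:3, :] = 200
pf = phase_fraction(toy)  # → 0.7
```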

Keywords: SEM micrograph, metallurgical image processing, ferrite pearlite steel, microstructure

Procedia PDF Downloads 192
1074 Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education

Authors: Wade Ghribi, Abdelmoty M. Ahmed, Ahmed Said Badawy, Belgacem Bouallegue

Abstract:

In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in educational data mining techniques to find new hidden information in students' learning behavior, particularly to uncover early symptoms of at-risk students. On the other hand, data with noise, outliers, and irrelevant information may lead to incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable student performance prediction model for higher education institutions. Data was gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian state university. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are the ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are the supervised learning techniques. Hyperparameters of the ensemble learning systems were fine-tuned to provide enhanced performance and optimal output. The findings imply that combining features of students' behavior from the e-learning and student information systems using Majority Vote produced better outcomes than the other ensemble techniques.
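
A hard majority vote over base-classifier predictions, the combiner the paper found best, reduces to a few lines. The labels below are illustrative, not the study's data:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine hard predictions from several base classifiers.

    predictions: list of equal-length label sequences, one per classifier.
    Ties are broken by the order in which labels were first seen.
    """
    combined = []
    for votes in zip(*predictions):
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# hypothetical per-student predictions from three base models
dt_pred  = ["pass", "pass", "fail", "pass"]
svm_pred = ["pass", "fail", "fail", "fail"]
ann_pred = ["pass", "pass", "fail", "pass"]
combined = majority_vote([dt_pred, svm_pred, ann_pred])
```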

Keywords: educational data mining, student performance prediction, e-learning, classification, ensemble learning, higher education

Procedia PDF Downloads 96
1073 Guidelines to Designing Generic Protocol for Responding to Chemical, Biological, Radiological and Nuclear Incidents

Authors: Mohammad H. Yarmohammadian, Mehdi Nasr Isfahani, Elham Anbari

Abstract:

Introduction: Awareness of the use of chemical, biological, and nuclear agents in everyday industrial and non-industrial incidents has increased recently; release of these materials can be accidental or intentional. Since hospitals are at the forefront of confronting Chemical, Biological, Radiological, and Nuclear (CBRN) incidents, the goal of the present research was to provide a generic protocol for CBRN incidents through a comparative review of the CBRN protocols and guidelines of different countries and a review of various books, handbooks, and papers. Method: The integrative approach of research synthesis was adopted in this study. First, a simple narrative review of programs, books, handbooks, and papers about response to CBRN incidents in different countries was carried out. Then the most important and functional information was discussed in focus group sessions in the form of a generic protocol and subsequently confirmed. Results: Findings indicated that most countries have various protocols, guidelines, and handbooks for hazardous materials or CBRN incidents. The final outcome of the research synthesis was a 50-page generic protocol whose main topics included an introduction, the definition and classification of CBRN agents, the four major phases of the incident and disaster management cycle, the hospital response management plan, equipment, and recommended supplies and antidotes for decontamination (radiological/nuclear, chemical, biological); each of these also had subtopics. Conclusion: In the majority of international protocols, guidelines, and handbooks, as well as international and Iranian books and papers, there is an emphasis on the importance of the incident command system, determining the safety degree of decontamination zones, maps of decontamination zones, the decontamination process, triage classifications, personal protective equipment, and supplies and antidotes for decontamination; these are the minimum requirements for such incidents and are also consistent with the provided generic protocol.

Keywords: hospital, CBRN, decontamination, generic protocol, CBRN Incidents

Procedia PDF Downloads 287
1072 Sound Analysis of Young Broilers Reared under Different Stocking Densities in Intensive Poultry Farming

Authors: Xiaoyang Zhao, Kaiying Wang

Abstract:

The choice of stocking density in poultry farming is a potential way of determining the welfare level of poultry. However, it is difficult to assess stocking densities in poultry farming because of many variables, such as species, age and weight, feeding regime, house structure, and geographical location across different broiler houses. A method is proposed in this paper to measure the differences among young broilers reared under different stocking densities by sound analysis. Vocalisations of broilers were recorded and analysed under different stocking densities to identify the relationship between sounds and stocking densities. Recordings were made continuously for three-week-old chickens in order to evaluate the variation of sounds emitted by the animals from the beginning. The experimental trial was carried out in an indoor broiler farm; the audio recording procedure lasted 5 days. Broilers were divided into 5 groups; the stocking density treatments were 8/m², 10/m², 12/m² (96 birds/pen), 14/m², and 16/m², and all conditions, including ventilation and feed, were kept the same across groups except for stocking density. The recording and analysis of the chickens' sounds were made noninvasively. Sound recordings were manually analysed and labelled using sound analysis software (GoldWave Digital Audio Editor). After the sound acquisition process, Mel Frequency Cepstrum Coefficients (MFCC) were extracted from the sound data, and a Support Vector Machine (SVM) was used as an early detector and classifier. This preliminary study, conducted in an indoor broiler farm, shows that this method can be used to classify the sounds of chickens under different densities economically (only a cheap microphone and recorder are needed); the classification accuracy is 85.7%. This method can predict the optimum stocking density of broilers when complemented by animal welfare indicators, animal productivity indicators, and so on.
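
The feature-extraction step can be sketched with NumPy alone. This is a simplified cepstral transform (a full MFCC pipeline would insert a mel filterbank before the DCT), shown only to outline the MFCC-then-SVM chain; the synthetic signal is an assumption:

```python
import numpy as np

def simple_cepstral_features(signal, frame_len=256, n_coeff=12):
    """Frame the signal, take the log magnitude spectrum per frame, and
    apply a DCT-II to obtain cepstral coefficients (one row per frame)."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    spectrum = np.abs(np.fft.rfft(frames * np.hanning(frame_len), axis=1))
    log_spec = np.log(spectrum + 1e-10)
    # DCT-II basis applied to the log spectrum
    k = log_spec.shape[1]
    n = np.arange(k)
    dct = np.cos(np.pi * np.outer(np.arange(n_coeff), (2 * n + 1) / (2.0 * k)))
    return log_spec @ dct.T

# a synthetic 1024-sample "vocalisation" yields a (4 frames x 12 coeffs) matrix
feats = simple_cepstral_features(np.sin(np.linspace(0.0, 100.0, 1024)))
```

Rows of such a matrix would then be fed to an SVM trained on recordings labelled by stocking density.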

Keywords: broiler, stocking density, poultry farming, sound monitoring, Mel Frequency Cepstrum Coefficients (MFCC), Support Vector Machine (SVM)

Procedia PDF Downloads 151
1071 Identification of Blood Biomarkers Unveiling Early Alzheimer's Disease Diagnosis Through Single-Cell RNA Sequencing Data and Autoencoders

Authors: Hediyeh Talebi, Shokoofeh Ghiam, Changiz Eslahchi

Abstract:

Traditionally, Alzheimer's disease research has focused on genes with significant fold changes, potentially neglecting subtle but biologically important alterations. Our study introduces an integrative approach that highlights genes crucial to the underlying biological processes, regardless of their fold-change magnitude. Alzheimer's single-cell RNA-seq data for peripheral blood mononuclear cells (PBMC) were extracted from the Gene Expression Omnibus (GEO). After quality control, normalization, scaling, batch effect correction, and clustering, differentially expressed genes (DEGs) were identified with adjusted p-values less than 0.05. These DEGs were categorized by cell type, resulting in four datasets, each corresponding to a distinct cell type. To distinguish between cells from healthy individuals and those with Alzheimer's, an adversarial autoencoder with a classifier was employed, allowing the separation of healthy and diseased samples. To identify the most influential genes in this classification, the weight matrices of the network's encoder and classifier components were multiplied, and the analysis focused on the top 20 genes. The analysis revealed that while some of these genes exhibit a high fold change, others do not. These genes, which may be overlooked by previous methods due to their low fold change, were shown to be significant in our study. The findings highlight the critical role of genes with subtle alterations in diagnosing Alzheimer's disease, a facet frequently overlooked by conventional methods. These genes demonstrate remarkable discriminatory power, underscoring the need to integrate biological relevance with statistical measures in gene prioritization. This integrative approach enhances our understanding of the molecular mechanisms in Alzheimer's disease and provides a promising direction for identifying potential therapeutic targets.
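
The gene-ranking step, multiplying the encoder and classifier weight matrices and keeping the top 20 input genes, can be sketched with random stand-in weights; the dimensions and weights below are invented, not trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_latent, n_classes = 100, 16, 2

# stand-ins for trained weights: encoder maps genes -> latent space,
# the classifier head maps latent space -> disease classes
W_enc = rng.normal(size=(n_latent, n_genes))
W_clf = rng.normal(size=(n_classes, n_latent))

# propagate class weights back to the input genes and rank by magnitude
gene_scores = np.abs(W_clf @ W_enc).sum(axis=0)   # one score per gene
top20 = np.argsort(gene_scores)[::-1][:20]        # most influential genes
```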

Keywords: alzheimer's disease, single-cell RNA-seq, neural networks, blood biomarkers

Procedia PDF Downloads 53
1070 Molecular Identification and Genotyping of Human Brucella Strains Isolated in Kuwait

Authors: Abu Salim Mustafa

Abstract:

Brucellosis is a zoonotic disease endemic in Kuwait. Human brucellosis can be caused by several Brucella species, with Brucella melitensis causing the most severe and Brucella abortus the least severe disease. Furthermore, relapses are common after successful chemotherapy of patients. The classical biochemical methods of culture and serology for identification of Brucellae provide information about the species and serotypes only. However, to differentiate between relapse and reinfection and for epidemiological investigations, the identification of genotypes using molecular methods is essential. In this study, four molecular methods [16S rRNA gene sequencing, real-time PCR, enterobacterial repetitive intergenic consensus (ERIC)-PCR, and multilocus variable-number tandem-repeat analysis (MLVA)-16] were evaluated for the identification and typing of 75 strains of Brucella isolated in Kuwait. The 16S rRNA gene sequencing suggested that all the strains were B. melitensis, and real-time PCR confirmed their species identity as B. melitensis. The ERIC-PCR band profiles produced a dendrogram of 75 branches, suggesting each strain to be of a unique type. The cluster classification, based on ~80% similarity, divided all the ERIC genotypes into two clusters, A and B. Cluster A consisted of 9 ERIC genotypes (A1-A9) corresponding to 9 individual strains. Cluster B comprised 13 ERIC genotypes (B1-B13), with B5 forming the largest cluster of 51 strains. MLVA-16 identified all isolates as B. melitensis and divided them into 71 MLVA types. The cluster analysis of MLVA-16 types suggested that most of the strains in Kuwait originated from the East Mediterranean region, a few from the African group, and one new genotype closely matched the West Mediterranean region. In conclusion, this work demonstrates that B. melitensis, the most pathogenic species of Brucella, is prevalent in Kuwait. Furthermore, MLVA-16 is the best of the molecular methods, as it can identify Brucella species and genotypes as well as determine their origin in the global context. Supported by Kuwait University Research Sector grants MI04/15 and SRUL02/13.

Keywords: Brucella, ERIC-PCR, MLVA-16, RT-PCR, 16S rRNA gene sequencing

Procedia PDF Downloads 380
1069 Evaluation of Brca1/2 Mutational Status among Algerian Familial Breast Cancer

Authors: Arab M., Ait Abdallah M., Zeraoulia N., Boumaza H., Aoutia M., Griene L., Ait Abdelkader B.

Abstract:

Breast and ovarian cancer are respectively the first and fourth leading causes of cancer among women in Algeria. A family history of cancer is the most important risk factor, and in most families with breast and/or ovarian cancer, the pattern of cancer in the family can be attributed to mutations in the BRCA1/2 genes. Objectives: the aim of our study is to investigate the spectrum of BRCA1/2 germline mutations in familial breast and/or ovarian cancer and to determine the prevalence and nature of BRCA1/2 mutations in Algeria. Methods: we determined the prevalence of BRCA1/2 mutations within a cohort of 161 probands selected according to the Eisinger score. Double-stranded Sanger sequencing of all coding exons of BRCA1/2, including flanking intronic regions, was performed. Results: we identified a total of 23 distinct deleterious mutations (class 5): 12 different mutations in BRCA1 (52%) and 11 in BRCA2 (48%). 78% (18/23) were protein-truncating and 22% (5/23) were missense mutations. Three novel deleterious mutations were identified that have not been described in public mutation databases. One new mutation was found in two unrelated patients. The overall mutation detection rate in our study is 28.5% (46/161). Moreover, a UVS c7783 located in BRCA2 was found in two unrelated probands and segregates in the two families. Conclusion: our results suggest a large spectrum of BRCA1/2 mutations in Algerian breast/ovarian cancer families. Investigation of the nature and prevalence of BRCA1/2 mutations in Algerian families is ongoing in a larger study; 80 probands are currently under investigation. This study may therefore identify the genetic particularities of Algerian breast/ovarian cancer.

Keywords: BRCA1/2 mutations, hereditary breast cancer, Algerian women, prevalence

Procedia PDF Downloads 168
1068 Evaluating Robustness of Conceptual Rainfall-runoff Models under Climate Variability in Northern Tunisia

Authors: H. Dakhlaoui, D. Ruelland, Y. Tramblay, Z. Bargaoui

Abstract:

To evaluate the impact of climate change on water resources at the catchment scale, not only future climate projections are necessary but also robust rainfall-runoff models that remain fairly reliable under changing climate conditions. This study aims at assessing the robustness of three conceptual rainfall-runoff models (GR4j, HBV and IHACRES) on five basins in Northern Tunisia under long-term climate variability. Their robustness was evaluated according to a differential split-sample test based on a climate classification of the observation period regarding precipitation and temperature conditions simultaneously. The studied catchments are situated in a region where climate change is likely to have significant impacts on runoff, and they already suffer from scarcity of water resources. They cover the main hydrographical basins of Northern Tunisia (High Medjerda, Zouaraâ, Ichkeul and Cap Bon), which produce the majority of surface water resources in Tunisia. The streamflow regime of the basins can be considered natural, since these basins are located upstream from storage dams and in areas where withdrawals are negligible. A 30-year common period (1970‒2000) was considered to capture a large spread of hydro-climatic conditions. Calibration was based on the Kling-Gupta Efficiency (KGE) criterion, while the evaluation of model transferability was performed according to the Nash-Sutcliffe efficiency criterion and volume error. The three hydrological models were shown to behave similarly under climate variability. The models proved better able to simulate the runoff pattern when transferred toward wetter periods than when transferred toward drier periods. The limits of transferability are beyond -20% of precipitation and +1.5 °C of temperature relative to the calibration period. The deterioration of model robustness could partly be explained by the climate dependency of some parameters.
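
The calibration criterion used, the Kling-Gupta Efficiency, decomposes model error into correlation, variability, and bias terms. A sketch (the flow series below is invented):

```python
import numpy as np

def kge(sim, obs):
    """Kling-Gupta Efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2),
    where r is the linear correlation between simulated and observed flows,
    alpha the ratio of their standard deviations, and beta the ratio of
    their means (a perfect simulation scores 1)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = np.array([3.1, 5.4, 2.2, 8.9, 4.0, 6.3])  # hypothetical runoff (m3/s)
sim = obs * 1.1                                  # a uniformly biased simulation
```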

Keywords: rainfall-runoff modelling, hydro-climate variability, model robustness, uncertainty, Tunisia

Procedia PDF Downloads 286
1067 Preparedness for Microbial Forensics Evidence Collection on Best Practice

Authors: Victor Ananth Paramananth, Rashid Muniginin, Mahaya Abd Rahman, Siti Afifah Ismail

Abstract:

Safety issues, scene protection, and appropriate evidence collection must be handled at any bio-crime scene. In any bio-incident or bio-crime event, there will be a scene, or multiple scenes, to be cordoned off for investigation. Evidence collection is critical in determining the type of microbe or toxin, its lethality, and its source. As a consequence, a proper sampling method is required from the start of the investigation. The most significant challenges for the crime scene officer are deciding where to obtain samples, the best sampling method, and the sample sizes needed. Since evidence at a crime scene may be in liquid, viscous, or powder form, crime scene officers have difficulty determining which tools to use for sampling. To maximize sample collection, appropriate tools for the sampling methods are necessary. This study aims to assist the crime scene officer in collecting liquid, viscous, and powder biological samples in sufficient quantity while preserving sample quality. In this research, observational tests on sample collection were performed using UV light with liquid, viscous, and powder samples to assess quantity and quality. The density of the light emission varies with the method of collection and the sample type. The best tools for collecting sufficient amounts of liquid, viscous, and powdered samples can be identified by observing the UV light. Instead of active microorganisms, an invisible powder is used to assess adequate sample collection during a crime scene investigation with various collection tools. The liquid, powdered, and viscous samples collected using different tools were analyzed using Fourier transform infrared-attenuated total reflectance (FTIR-ATR) spectroscopy. FTIR spectroscopy is commonly used for rapid discrimination, classification, and identification of intact microbial cells. The liquid, viscous, and powdered samples collected using the various tools were successfully observed using UV light. Furthermore, FTIR-ATR analysis showed that the collected samples were sufficient in quantity while preserving their quality.

Keywords: biological sample, crime scene, collection tool, UV light, forensic

Procedia PDF Downloads 189
1066 Arabic Lexicon Learning to Analyze Sentiment in Microblogs

Authors: Mahmoud B. Rokaya

Abstract:

The study of opinion mining and sentiment analysis includes the analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, social networks, reviews, forum discussions, microblogs, and Twitter leads to a parallel growth in the field of sentiment analysis. The field tries to develop effective tools that make it possible to capture the trends of people. There are two approaches in the field: lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon that includes sentiment words and phrases with assigned numeric scores. These scores reveal whether sentiment phrases are positive or negative, their intensity, and/or their emotional orientation. Manual creation of lexicons is hard, which brings the need for adaptive automated methods of generating a lexicon. The proposed method generates dynamic lexicons based on the corpus and then classifies text using these lexicons. In the proposed method, different approaches are combined to generate lexicons from text. The proposed method classifies tweets into 5 classes instead of just positive or negative classes. The sentiment classification problem is written as an optimization problem; finding optimal sentiment lexicons is the goal of the optimization process. The solution was produced using mathematical programming approaches to find the best lexicon for classifying texts. A genetic algorithm was written to find the optimal lexicon. Then, extraction of a meta-level feature was done based on the optimal lexicon. The experiments were conducted on several datasets. The results, in terms of accuracy, recall, and F-measure, outperformed the state-of-the-art methods proposed in the literature on some of the datasets. Based on the sentiment lexicons produced by the algorithm, a better understanding can be achieved of the Arabic language, the culture of Arab Twitter users, and the sentiment orientation of words in different contexts.
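
A genetic algorithm over integer-scored lexicons can be sketched in miniature. The five words, the toy tweets, and the fitness definition below are all invented for illustration and only use two sentiment classes rather than the paper's five:

```python
import random

WORDS = ["good", "bad", "great", "awful", "ok"]
TWEETS = [(["good", "great"], 1), (["bad", "awful"], -1),
          (["great", "ok"], 1), (["awful"], -1), (["good"], 1)]

def fitness(lexicon):
    """Fraction of tweets whose summed word scores match the label's sign."""
    hits = sum((sum(lexicon[w] for w in ws) > 0) == (label > 0)
               for ws, label in TWEETS)
    return hits / len(TWEETS)

def evolve(pop_size=30, generations=40, seed=1):
    rng = random.Random(seed)
    # each individual assigns every word an integer score in [-2, 2]
    pop = [{w: rng.randint(-2, 2) for w in WORDS} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)       # selection: keep the fittest
        parents = pop[: pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = {w: rng.choice((a[w], b[w])) for w in WORDS}  # crossover
            if rng.random() < 0.3:                                # mutation
                child[rng.choice(WORDS)] = rng.randint(-2, 2)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```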

Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation

Procedia PDF Downloads 174
1065 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insights into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints specifically measuring a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow the benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been successfully conducted and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest does not have a gold standard but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine. We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods and allow for the consolidation of conclusions.

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 271
1064 Stability-Indicating High-Performance Thin-Layer Chromatography Method for Estimation of Naftopidil

Authors: P. S. Jain, K. D. Bobade, S. J. Surana

Abstract:

A simple, selective, precise and stability-indicating high-performance thin-layer chromatographic (HPTLC) method for the analysis of naftopidil, both in bulk and in pharmaceutical formulation, has been developed and validated. The method employed HPTLC aluminium plates precoated with silica gel as the stationary phase. The solvent system consisted of hexane: ethyl acetate: glacial acetic acid (4:4:2 v/v). The system was found to give a compact spot for naftopidil (Rf value of 0.43±0.02). Densitometric analysis of naftopidil was carried out in the absorbance mode at 253 nm. Linear regression analysis of the calibration plots showed a good linear relationship, with r² = 0.999±0.0001 with respect to peak area, in the concentration range 200-1200 ng per spot. The method was validated for precision, recovery and robustness. The limits of detection and quantification were 20.35 and 61.68 ng per spot, respectively. Naftopidil was subjected to acid and alkali hydrolysis, oxidation and thermal degradation, and was found to degrade under all four conditions. The degraded product was well resolved from the pure drug, with a significantly different Rf value. Statistical analysis confirms that the method is repeatable, selective and accurate for the estimation of the investigated drug. The proposed HPTLC method can be applied for the identification and quantitative determination of naftopidil in bulk drug and pharmaceutical formulation.
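Detection and quantification limits of this kind are typically derived from the calibration fit using the standard ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope and σ the residual standard deviation. A minimal sketch on hypothetical calibration data (not the paper's measurements):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def lod_loq(x, y):
    """ICH-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S,
    with sigma the residual standard deviation of the fit."""
    slope, intercept = fit_line(x, y)
    residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    sigma = (sum(r * r for r in residuals) / (len(x) - 2)) ** 0.5
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration: peak area vs amount (ng per spot), 200-1200 range
amounts = [200, 400, 600, 800, 1000, 1200]
areas = [1510, 2980, 4490, 6020, 7485, 9010]
lod, loq = lod_loq(amounts, areas)
print(f"LOD = {lod:.1f} ng/spot, LOQ = {loq:.1f} ng/spot")
```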

Keywords: naftopidil, HPTLC, validation, stability, degradation

Procedia PDF Downloads 390
1063 Text Localization in Fixed-Layout Documents Using Convolutional Networks in a Coarse-to-Fine Manner

Authors: Beier Zhu, Rui Zhang, Qi Song

Abstract:

Text contained within fixed-layout documents such as ID cards, invoices, cheques, and passports can be of great semantic value and so requires high localization accuracy. Recently, algorithms based on deep convolutional networks have achieved high performance on text detection tasks. However, for text localization in fixed-layout documents, such algorithms detect word bounding boxes individually, which ignores the layout information. This paper presents a novel architecture built on convolutional neural networks (CNNs). A global text localization network and a regional bounding-box regression network are introduced to tackle the problem in a coarse-to-fine manner. The text localization network simultaneously locates word bounding points, taking the layout information into account. The bounding-box regression network takes features pooled from arbitrarily sized RoIs as input and refines the localizations. The two networks share their convolutional features and are trained jointly. A typical type of fixed-layout document, the ID card, is selected to evaluate the effectiveness of the proposed system. The networks are trained on data cropped from natural scene images and on synthetic data produced by a synthetic text generation engine. Experiments show that our approach locates word bounding boxes with high accuracy and achieves state-of-the-art performance.
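The refinement step of a bounding-box regression network is commonly parameterised as R-CNN-style deltas: the centre shift is scaled by the coarse box size and the width/height are rescaled exponentially. The sketch below shows that standard parameterisation with hypothetical numbers; it is not the authors' exact implementation.

```python
import math

def refine_box(box, deltas):
    """Apply R-CNN-style regression deltas (dx, dy, dw, dh) to a coarse
    box given as (cx, cy, w, h): the centre shift is scaled by the box
    size, width and height are scaled exponentially."""
    cx, cy, w, h = box
    dx, dy, dw, dh = deltas
    return (cx + dx * w, cy + dy * h, w * math.exp(dw), h * math.exp(dh))

# Coarse word box from the global localization network (hypothetical numbers)
coarse = (120.0, 40.0, 80.0, 20.0)
# Deltas predicted by the regional bounding-box regression network
refined = refine_box(coarse, (0.05, -0.10, 0.10, 0.00))
print(refined)  # centre shifted, width enlarged by ~10%
```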

Keywords: bounding box regression, convolutional networks, fixed-layout documents, text localization

Procedia PDF Downloads 184
1062 The Use of Geographic Information System for Selecting Landfill Sites in Osogbo

Authors: Nureni Amoo, Sunday Aroge, Oluranti Akintola, Hakeem Olujide, Ibrahim Alabi

Abstract:

This study investigated the optimum landfill site in Osogbo in order to identify a suitable solid waste dumpsite for proper waste management in the capital city. Despite an increase in alternative techniques for disposing of waste, landfilling remains the primary means of waste disposal, and changing attitudes in many parts of the world have been accompanied by changes in laws and policies regarding the environment and waste disposal. Selecting the most suitable site for a landfill can avoid adverse ecological and socio-economic effects. Industrial and economic development, along with population growth in Osogbo town, generates a tremendous amount of solid waste within the region. Factors such as the scarcity of land, the lifespan of the landfill, and environmental considerations warrant that scientific and fundamental studies be carried out to determine the suitability of a landfill site. The analysis of spatial data and the consideration of regulations and accepted criteria are important elements in site selection. This paper presents a multi-criteria decision-making method using a geographic information system (GIS) integrated with the fuzzy logic multi-criteria decision making (FMCDM) technique for landfill site suitability evaluation. Using the fuzzy logic method (classification of suitable areas on a 0 to 1 scale), information layers related to drainage, soil, land use/land cover, slope, and geology were superposed in the study. Based on the results obtained, five potential sites suitable for the construction of a landfill are proposed, two of which belong to the most suitable zone, while the existing waste disposal site falls in the unsuitable zone.
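A fuzzy overlay of this kind can be sketched per raster cell as a linear membership function (0 to 1) for each criterion followed by a weighted sum; the criteria, thresholds, and weights below are illustrative assumptions, not the study's calibrated values.

```python
def fuzzy_linear(value, worst, best):
    """Linear fuzzy membership: 0 at 'worst', 1 at 'best', clipped to [0, 1].
    Works for both increasing and decreasing preference."""
    t = (value - worst) / (best - worst)
    return max(0.0, min(1.0, t))

def suitability(cell, weights):
    """Weighted overlay of fuzzified criterion layers for one raster cell."""
    return sum(weights[k] * cell[k] for k in weights)

# Hypothetical cell: distances in metres, slope in percent
cell = {
    "dist_drainage": fuzzy_linear(900, worst=0, best=1000),    # farther is safer
    "dist_roads":    fuzzy_linear(400, worst=2000, best=200),  # nearer is cheaper
    "slope":         fuzzy_linear(4, worst=15, best=0),        # flatter is better
}
weights = {"dist_drainage": 0.5, "dist_roads": 0.3, "slope": 0.2}
score = suitability(cell, weights)
print(f"suitability = {score:.2f}")  # 0 (unsuitable) to 1 (most suitable)
```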

Keywords: fuzzy logic multi-criteria decision making, geographic information system, landfill, suitable site, waste disposal

Procedia PDF Downloads 129
1061 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone, enabling end users to program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also using Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel than in ISPM, since they directly selected the cells using the mouse. These results suggest that natural language can support end-user software engineering and help overcome the present bottleneck of relying on professional developers.
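The idea of resolving a natural-language command against an inferred table schema can be sketched as follows. The paper infers headers and data ranges with machine learning; this toy version substitutes a naive first-row-is-header rule and keyword matching, so all names and the query grammar are illustrative assumptions.

```python
def infer_schema(grid):
    """Naive schema inference: treat the first row as the header and the
    remaining rows as data columns (the paper infers this with ML)."""
    header, *rows = grid
    return {name.lower(): [row[i] for row in rows]
            for i, name in enumerate(header)}

def answer(query, schema):
    """Resolve a simple natural-language command: the first recognised
    verb is the operation, the first header word found in the query
    names the target column."""
    ops = {"sum": sum, "max": max, "min": min}
    words = query.lower().split()
    op = next(ops[w] for w in words if w in ops)
    column = next(schema[w] for w in words if w in schema)
    return op(column)

grid = [["Item", "Price", "Qty"],
        ["pen", 2, 10],
        ["book", 8, 3],
        ["bag", 15, 1]]
schema = infer_schema(grid)
print(answer("sum the price column", schema))  # 25
```

A query like "max qty" resolves the same way: the verb picks the operation, the inferred header picks the range, so the user never types a cell address.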

Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet

Procedia PDF Downloads 295
1060 Medical versus Non-Medical Students' Opinions about Academic Stress Management Using Unconventional Therapies

Authors: Ramona-Niculina Jurcau, Ioana-Marieta Jurcau, Dong Hun Kwak, Nicolae-Alexandru Colceriu

Abstract:

Background: Stress management (SM) is a topic of great academic interest and equally a task to accomplish. In addition, the beneficial role of unconventional therapies (UCT) in stress modulation is recognized. Aims: The aim was to evaluate medical (MS) versus non-medical students’ (NMS) opinions about academic stress management (ASM) using UCT. Methods: MS (n=103, third-year males and females) and NMS (n=112, males and females, from humanities faculties, different years of study), outside their academic program, voluntarily answered a questionnaire concerning: a) Classification of the four most important academic stress factors; b) The extent to which their daily life influences academic stress; c) The most important SM methods they know; d) Which of these methods they apply; e) The UCT they know or have heard about; f) Which of these they know to have stress modulation effects; g) Which of these UCT participants use or would like to use for modulating stress, and whether participants use UCT of their own choosing or following a specialist consultation in those therapies (SCT); h) Whether they have heard of the following UCT and what opinion they have (on a visual analogue scale) about their use (following SCT) for ASM: phytotherapy (PT), apitherapy (AT), homeopathy (H), ayurvedic medicine (AM), traditional Chinese medicine (TCM), music therapy (MT), color therapy (CT), forest therapy (FT). Results: Among the four most important academic stress factors, for MS more than for NMS, are: a busy schedule, the large amount of information taught, the high level of performance required, and reduced time for relaxing. The most important SM methods that MS and NMS know are, hierarchically: listening to music, meeting friends, playing sport, hiking, sleep, regular breaks, seeing the positive side, and faith; these, NMS more than MS partially apply to themselves. The UCT about which MS, and to a lesser extent NMS, have heard are phytotherapy, apitherapy, acupuncture, and reiki.
Of these UCT, participants know that some plants, bee products and music have stress modulation effects; they use or would like to use for ASM (the majority without SCT) certain teas, honey and music. Most MS and only some NMS have heard about PT, AT, TCM, and MT, and much less about H, AM, CT, and FT. NMS, more than MS, would use these UCT following SCT. Conclusions: 1) Academic stress is similarly reflected in MS and NMS opinions. 2) MS and NMS apply similar but very few UCT for stress modulation. 3) The information that MS and NMS have about UCT and their application to ASM is limited. 4) It is remarkable that MS, and especially NMS, are open to the use of UCT for ASM following SCT.

Keywords: academic stress, stress management, stress modulation, medical students, non-medical students, unconventional therapies

Procedia PDF Downloads 346
1059 Assessment of the Number of Damaged Buildings from a Flood Event Using Remote Sensing Technique

Authors: Jaturong Som-ard

Abstract:

The heavy rainfall from 3rd to 22nd January 2017 swamped much of Ranot district in southern Thailand, causing substantial economic and social losses. The major objective of this study is to detect the flooding extent using Sentinel-1A data and to identify the number of damaged buildings in the area. The data were collected at two stages: pre-flood and during the flood event. Calibration, speckle filtering, geometric correction, and histogram thresholding were performed on the data, based on intensity spectral values, to classify thematic maps. The maps were used to identify the flooding extent through change detection, together with building footprints digitized and collected on JOSM desktop. The number of damaged buildings was counted within the flooding extent with respect to the building data. The total flooded area was 181.45 sq. km, occurring mostly in the Ban Khao, Ranot, Takhria, and Phang Yang sub-districts, respectively. The Ban Khao sub-district was the most affected because it lies at a lower altitude and closer to the Thale Noi and Thale Luang lakes than the others. The numbers of damaged buildings were highest in Khlong Daen (726 features), Tha Bon (645 features), and Ranot sub-district (604 features), respectively. The final flood extent map may be very useful for the planning, prevention and management of flood-prone areas, and the building damage map can support quick response, recovery and mitigation in the affected areas by the organizations concerned.
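The thresholding-plus-change-detection step can be sketched on a tiny grid: open water backscatters weakly in SAR, so a pixel whose intensity drops below a threshold during the event (but was above it before) is flagged as flooded, and digitized buildings are counted inside the mask. The grids, threshold, and building points below are hypothetical, not the study's data.

```python
def flood_mask(pre, during, threshold):
    """Change detection on SAR backscatter (dB): a pixel is flooded if
    its intensity fell below 'threshold' during the event but was above
    it before (open water backscatters weakly)."""
    return [[during[r][c] < threshold <= pre[r][c]
             for c in range(len(pre[0]))] for r in range(len(pre))]

def count_damaged(buildings, mask):
    """Count digitized building points (row, col) that fall inside the mask."""
    return sum(1 for r, c in buildings if mask[r][c])

# Hypothetical 3x3 backscatter grids (dB) and building locations
pre    = [[-6, -7, -5], [-8, -6, -7], [-5, -6, -6]]
during = [[-6, -16, -5], [-17, -6, -15], [-5, -6, -6]]
mask = flood_mask(pre, during, threshold=-12)
print(count_damaged([(0, 1), (1, 0), (2, 2)], mask))  # 2
```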

Keywords: flooding extent, Sentinel-1A data, JOSM desktop, damaged buildings

Procedia PDF Downloads 181
1058 Polycode Texts in Communication of Antisocial Groups: Functional and Pragmatic Aspects

Authors: Ivan Potapov

Abstract:

Background: The aim of this paper is to investigate polycode texts in the communication of youth antisocial groups. Nowadays, the notion of a text has numerous interpretations. Besides all the approaches to defining a text, we must take into account semiotic and cultural-semiotic ones. Rapidly developing IT, world globalization, and new ways of coding information increase the role of the cultural-semiotic approach. However, the development of computer technologies also leads to changes in the text itself. Polycode texts play an increasingly important role in the everyday communication of the younger generation. Therefore, research on the functional and pragmatic aspects of both verbal and non-verbal content is quite important. Methods and Material: For this survey, we applied a combination of four methods of text investigation: intention analysis, content analysis, and semantic and syntactic analysis. Using these methods provided us with information on general text properties, the content of transmitted messages, and each communicant’s intentions. Besides, during our research, we established the social background; therefore, we could distinguish intertextual connections between certain types of polycode texts. As the sources of research material, we used 20 public channels in the popular messenger Telegram and data extracted from smartphones belonging to arrested members of antisocial groups. Findings: This investigation lets us assert that polycode texts can be characterized as highly intertextual language units. Moreover, we could outline a classification of these texts based on communicants’ intentions. The most common types of antisocial polycode texts are calls to illegal actions and agitation. What is more, each type has its own semantic core, depending on the sphere of communication, whereas the syntactic structure is universal for most polycode texts.
Conclusion: Polycode texts play an important role in online communication. The results of this investigation demonstrate that in some social groups the use of these texts has a destructive influence on the younger generation and clearly needs further research.

Keywords: text, polycode text, internet linguistics, text analysis, context, semiotics, sociolinguistics

Procedia PDF Downloads 124
1057 Detection and Distribution Pattern of Prevalent Genotypes of Hepatitis C in a Tertiary Care Hospital of Western India

Authors: Upasana Bhumbla

Abstract:

Background: Hepatitis C virus (HCV) is a major cause of chronic hepatitis, which can further lead to cirrhosis of the liver and hepatocellular carcinoma. Worldwide, the burden of hepatitis C infection has become a serious threat to the human race. HCV has population-specific genotypes, which provide valuable epidemiological and therapeutic information. Genotyping and assessment of viral load in HCV patients are important for planning therapeutic strategies. The aim is to examine the changing trends in the prevalence and genotypic distribution of hepatitis C virus in a tertiary care hospital in Western India. Methods: In this retrospective study, blood samples were collected and tested for anti-HCV antibodies by ELISA in the Department of Microbiology. In seropositive hepatitis C patients, quantification of HCV-RNA was done by real-time PCR, and in HCV-RNA-positive samples, genotyping was conducted. Results: A total of 114 patients seropositive for anti-HCV were recruited into the study, of whom 79 (69.29%) were HCV-RNA positive. Of these positive samples, 54 were further subjected to genotype determination using real-time PCR. The genotype was not detected in 24 samples due to low viral load; 30 samples were positive for a genotype. Conclusion: Knowledge of the genotype is crucial for the management of HCV infection and the prediction of prognosis. Patients infected with HCV genotypes 1 and 4 have to receive interferon and ribavirin for 48 weeks, and show a poor sustained viral response when tested 24 weeks after completion of therapy. On the contrary, patients infected with HCV genotypes 2 and 3 are reported to have a better response to therapy.

Keywords: hepatocellular, genotype, ribavarin, seropositive

Procedia PDF Downloads 123
1056 DNA Fingerprinting of Some Major Genera of Subterranean Termites (Isoptera) (Anacanthotermes, Psammotermes and Microtermes) from Western Saudi Arabia

Authors: AbdelRahman A. Faragalla, Mohamed H. Alqhtani, Mohamed M. M. Ahmed

Abstract:

Saudi Arabia is currently beset by a diverse assemblage of subterranean termite fauna inflicting heavy damage on valued human property in homes, storage facilities and warehouses, on agricultural and horticultural crops including okra, sweet pepper, tomatoes, sorghum, date palm trees and citruses, and in many forest domains and desert oases. The most pressing priority is to use modern technologies to overcome the painstaking obstacle of taxonomic identification of these noxious pests, which could lead to effective pest control in both infested agricultural commodities and field crops. Our study applied DNA fingerprinting technologies to generate basic information on the genetic similarity between three predominant families containing the most destructive termite species. The methodology included DNA extraction and isolation from members of the major families and the use of randomly selected primers with PCR amplification of the nucleotide sequences. GC content and annealing temperatures for all primers, PCR amplification and agarose gel electrophoresis were also determined, in addition to the scoring and analysis of Random Amplification of Polymorphic DNA (RAPD-PCR). A phylogenetic analysis of the different species was performed with a statistical computer program on the basis of the RAPD-DNA results and represented as a dendrogram based on the average band-sharing ratio between species. Our study aims to shed more light on this subject, which may lead to a clearer display of the kinship and relatedness of species in an ambitious undertaking to arrive at a correct taxonomic classification of termite species and to discover sibling species, so that a rational pest management strategy can be delineated.
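The band-sharing ratio underlying such a dendrogram is commonly computed as the Nei-and-Li (Dice) coefficient, F = 2·n_ab / (n_a + n_b), over presence/absence band profiles; the profiles below are hypothetical, not the study's gels.

```python
def band_sharing(a, b):
    """Nei-and-Li band-sharing ratio between two presence/absence band
    profiles: F = 2 * shared bands / (bands in a + bands in b)."""
    shared = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    return 2 * shared / (sum(a) + sum(b))

# Hypothetical RAPD profiles (1 = band present at a given gel position)
anacanthotermes = [1, 1, 0, 1, 0, 1, 1, 0]
psammotermes    = [1, 0, 0, 1, 1, 1, 0, 0]
microtermes     = [0, 1, 1, 0, 1, 0, 1, 1]

print(band_sharing(anacanthotermes, psammotermes))  # 2*3/(5+4) ≈ 0.667
print(band_sharing(anacanthotermes, microtermes))   # 2*2/(5+5) = 0.4
```

Averaging these pairwise ratios over all primers gives the similarity matrix from which a clustering method (e.g. UPGMA) draws the dendrogram.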

Keywords: DNA fingerprinting, Western Saudi Arabia, DNA primers, RAPD

Procedia PDF Downloads 418
1055 Robust Heart Rate Estimation from Multiple Cardiovascular and Non-Cardiovascular Physiological Signals Using Signal Quality Indices and Kalman Filter

Authors: Shalini Rankawat, Mansi Rankawat, Rahul Dubey, Mazad Zaveri

Abstract:

Physiological signals such as the electrocardiogram (ECG) and arterial blood pressure (ABP) in the intensive care unit (ICU) are often seriously corrupted by noise, artifacts, and missing data, which lead to errors in the estimation of heart rate (HR) and to false alarms from ICU monitors. Clinical support in the ICU requires the most reliable heart rate estimation possible. Cardiac activity, because of its relatively high electrical energy, may introduce artifacts into electroencephalogram (EEG), electrooculogram (EOG), and electromyogram (EMG) recordings. This paper presents a robust heart rate estimation method based on the detection of R-peaks of ECG artifacts in EEG, EMG and EOG signals, using an energy-based function and a novel signal quality index (SQI) assessment technique. SQIs of the physiological signals (EEG, EMG, and EOG) were obtained by correlating the nonlinear energy operator (Teager energy) of these signals with either the ECG or the ABP signal. HR is estimated from the ECG, ABP, EEG, EMG, and EOG signals by separate Kalman filters based on the individual SQIs. Data fusion of the HR estimates is then performed by weighting each estimate by the Kalman filters’ SQI-modified innovations. The fused HR estimate is more accurate and robust than any of the individual estimates. The method was evaluated on the MIMIC II database of PhysioNet, recorded from bedside monitors of ICU patients, and provides an accurate HR estimate even in the presence of noise and artifacts.
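The fusion idea can be sketched as a quality-weighted average: each channel's HR estimate contributes in proportion to its SQI, so artifact-corrupted channels are down-weighted. This is a deliberate simplification of the paper's Kalman-innovation weighting, and the per-channel numbers are hypothetical.

```python
def fuse_heart_rates(estimates, sqis):
    """SQI-weighted fusion of per-signal HR estimates: each channel's
    contribution is proportional to its signal quality index."""
    total = sum(sqis)
    return sum(hr * q for hr, q in zip(estimates, sqis)) / total

# Hypothetical per-channel HR estimates (bpm) and SQIs in [0, 1];
# the EEG channel is artifact-corrupted, hence its low SQI
channels = ["ECG", "ABP", "EEG", "EMG", "EOG"]
hr  = [72.0, 74.0, 90.0, 71.0, 73.0]
sqi = [0.9, 0.8, 0.1, 0.6, 0.6]
print(f"fused HR = {fuse_heart_rates(hr, sqi):.1f} bpm")
```

Despite the 90 bpm outlier from the noisy EEG channel, the fused estimate stays near the consensus of the high-quality channels, which is the behaviour the paper exploits.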

Keywords: ECG, ABP, EEG, EMG, EOG, ECG artifacts, Teager-Kaiser energy, heart rate, signal quality index, Kalman filter, data fusion

Procedia PDF Downloads 689
1054 The Confiscation of Ill-Gotten Gains in Pollution: The Taiwan Experience and the Interaction between Economic Analysis of Law and Environmental Economics Perspectives

Authors: Chiang-Lead Woo

Abstract:

In reply to serious environmental problems, the Taiwan government has recently adjusted several articles to suit the needs of environmental protection, such as the amendment to article 190-1 of the Taiwan Criminal Code. This legislative change is an improvement in that it cancels the limitation of ‘endangering public safety’; at the same time, article 190-1 moves from a concrete offense to an abstract crime of danger. The public thus looks forward to whether the imposition of fines or penalties for environmental crime works efficiently against pollution through its deterrent effect. However, following the addition of article 38-2 to the Taiwan Criminal Code, the confiscation system seems a controversial means of restraining ill-gotten gains. Most prior studies focused on comparisons between the Administrative Penalty Law and the Criminal Code on environmental issues in Taiwan; recently, more and more studies have emphasized the calculation of ill-gotten gains. Hence, this paper examines the deterrent effect in environmental crime from the perspectives of the economic analysis of law and environmental economics. The analysis shows that the deterrent effect works only if there is an extremely high probability (equal to 100 percent) of an environmental crime case being prosecuted criminally by the Taiwan Environmental Protection Agency. Therefore, this paper suggests strengthening the confiscation system by supplementing the System of Environmental and Economic Accounting, reasonable deterrent fines, input management, real-time systems for the detection of pollution, a whistleblower system, environmental education, and the modernization of law.
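The core of this economic argument is a Becker-style expected-penalty comparison: polluting is unprofitable only when the probability of prosecution times the total sanction (fine plus confiscated gains) is at least the ill-gotten gain. The numbers below are purely illustrative, not figures from the paper.

```python
def deters(gain, fine, confiscation, p_prosecution):
    """Becker-style deterrence check: the offence is deterred only if
    the expected sanction, p * (fine + confiscation), is at least the
    ill-gotten gain."""
    return p_prosecution * (fine + confiscation) >= gain

# Hypothetical currency units: gain from illegal dumping vs sanctions
gain, fine, confiscation = 10_000_000, 6_000_000, 10_000_000
print(deters(gain, fine, confiscation, p_prosecution=1.0))  # True
print(deters(gain, fine, confiscation, p_prosecution=0.5))  # False
```

This illustrates the paper's point: when the sanction only modestly exceeds the gain, deterrence collapses unless prosecution is nearly certain.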

Keywords: confiscation, ecosystem services, environmental crime, ill-gotten gains, the deterrent effect, the system of environmental and economic accounting

Procedia PDF Downloads 159
1053 Comparison of Several Diagnostic Methods for Detecting Bovine Viral Diarrhea Virus Infection in Cattle

Authors: Azizollah Khodakaram- Tafti, Ali Mohammadi, Ghasem Farjanikish

Abstract:

Bovine viral diarrhea virus (BVDV) is one of the most important viral pathogens of cattle worldwide, belonging to the genus Pestivirus, family Flaviviridae. The aim of the present study was to compare several diagnostic methods and determine the prevalence of BVDV infection for the first time in dairy herds of Fars province, Iran. For initial screening, a total of 400 blood samples were randomly collected from 12 industrial dairy herds and analyzed by reverse transcription (RT)-PCR on the buffy coat. In the second step, blood samples and ear notch biopsies were collected from 100 cattle of the infected farms and tested by antigen capture ELISA (ACE), RT-PCR and immunohistochemistry (IHC). Nested RT-PCR (outer primers 0I100/1400R and inner primers BD1/BD2) detected acute infection in 16 of the 400 buffy coat samples (4%) in the initial screening. Also, 8 out of 100 samples (2%) were identified as persistently infected (PI) by all of the diagnostic tests alike, including RT-PCR, ACE and IHC on buffy coat, serum and skin samples, respectively. Immunoreactivity for BVDV antigen, appearing as brown, coarsely to finely granular staining, was observed within the cytoplasm of the epithelial cells of the epidermis and hair follicles and also in subcutaneous stromal cells. These findings confirm the importance of monitoring BVDV infection in the cattle of this region and suggest the detection and elimination of PI calves for the control and eradication of this disease.

Keywords: antigen capture ELISA, bovine viral diarrhea virus, immunohistochemistry, RT-PCR, cattle

Procedia PDF Downloads 352
1052 Crustal Scale Seismic Surveys in Search for Gawler Craton Iron Oxide Cu-Au (IOCG) under Very Deep Cover

Authors: E. O. Okan, A. Kepic, P. Williams

Abstract:

Iron oxide copper gold (IOCG) deposits constitute important sources of copper and gold in Australia, especially since the discovery of the supergiant Olympic Dam deposit in 1975. They are considered to be metasomatic expressions of large crustal-scale alteration events occasioned by intrusive action, and are in most cases associated with felsic igneous rocks, commonly potassic igneous magmatism, with the deposits ranging from ~2.2 to 1.5 Ga in age. For the past two decades, geological, geochemical and potential-field methods have been used to identify the structures hosting these deposits, followed up by drilling. Though these methods have largely been successful for shallow targets, at greater depths their low resolution limits them to mapping only very large deposits with sufficient contrast. As the search for ore bodies under regolith cover continues, driven by the depletion of near-surface deposits, there is a compelling need to develop new exploration technology for ore bodies at 1-4 km, the current mining depth range. The seismic reflection method represents this new technology, as it offers a distinct advantage over all other geophysical techniques because of its great depth of penetration and superior spatial resolution maintained with depth. Further, in many different geological scenarios, it offers greater ‘3D mapability’ of units within the stratigraphic boundary. Despite these superior attributes, crustal-scale seismic surveys have not been proposed because there has been no compelling argument of economic benefit for such work. For the seismic reflection method to be used at these scales (hundreds to thousands of square kilometres covered), the technical risks or the survey costs have to be reduced.
In addition, as most IOCG deposits have a large footprint due to their association with intrusions and large fault zones, we hypothesized that these deposits can be found mainly by looking for the seismic signatures of intrusions along prospective structures. In this study, we present two such cases: the Olympic Dam and Vulcan iron-oxide copper-gold (IOCG) deposits, both located in the Gawler craton, South Australia. Results from our 2D modelling experiments revealed that seismic reflection surveys using 20 m geophone and 40 m shot spacing are a feasible exploration tool for locating IOCG deposits, even when they are hosted in very complex structures. The migrated sections were not only able to identify and trace the various layers and complex structures but also showed reflections around the edges of intrusive packages. The presence of such intrusions was clearly detected over the 100 m to 1000 m depth range without loss of resolution. The modelled seismic images match the available real seismic data and have the hypothesized characteristics; thus, the seismic method appears to be a valid exploration tool for finding IOCG deposits. We therefore propose that 2D seismic surveying is viable for IOCG exploration, as it can detect mineralised intrusive structures along known favourable corridors. This would help to reduce the exploration risk associated with locating undiscovered resources, as well as supporting life-of-mine studies that enable better development decisions from the very beginning.
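A receiver spacing such as the 20 m used here is usually checked against the standard spatial anti-aliasing rule of thumb, f_max = v_min / (2·Δx·sin θ), where v_min is the slowest velocity and θ the steepest dip to be imaged. The velocity and dip below are hypothetical survey-design assumptions, not parameters from this study.

```python
import math

def max_unaliased_frequency(spacing_m, v_min, max_dip_deg):
    """Spatial anti-aliasing rule of thumb: with receiver spacing dx,
    reflections dipping up to theta stay unaliased up to
    f_max = v_min / (2 * dx * sin(theta))."""
    return v_min / (2 * spacing_m * math.sin(math.radians(max_dip_deg)))

# Hypothetical parameters: 20 m geophones, slow near-surface velocity
# of 2500 m/s, dips up to 30 degrees
f = max_unaliased_frequency(20.0, 2500.0, 30.0)
print(f"unaliased up to {f:.0f} Hz")  # 125 Hz
```

Under these assumptions, 20 m spacing comfortably covers the frequency band typical of crustal reflection data, which is consistent with the survey geometry proposed above.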

Keywords: crustal scale, exploration, IOCG deposit, modelling, seismic surveys

Procedia PDF Downloads 319
1051 Sociolinguistic and Classroom Functions of Using Code-Switching in CLIL Context

Authors: Khatuna Buskivadze

Abstract:

The aim of the present study is to investigate the sociolinguistic and classroom functions and the frequency of teacher code-switching (CS) in a Content and Language Integrated Learning (CLIL) lesson. Nowadays, Georgian society strives to become part of the European world, and the English language itself plays a role in forming new generations with European values. Based on our research conducted in 2019, out of all 114 private schools in Tbilisi, full CLIL programs are taught in 7 schools, while only some subjects using CLIL are taught in 3 schools. The goal of that earlier research was to define the features of CLIL methodology in the teaching of English, using Georgian private high schools as an example. Taking the Georgian reality and cultural features into account, a modified version of the questionnaire based on the classification of CS use in the ESL classroom proposed by Ferguson (2009) was used. The qualitative research revealed students’ and teachers’ attitudes towards teacher code-switching in the CLIL lesson. Both qualitative and quantitative research were conducted: observation of the teacher’s lessons (recordings of the teacher’s online lessons), an interview, and a questionnaire among the Maths teacher’s 20 high school students. We came to several conclusions, some of which are given here. The Maths teacher’s CS behavior mostly serves (1) the conversational function of interjection and (2) the classroom functions of introducing unfamiliar materials and topics, explaining difficult concepts, maintaining classroom discipline, and structuring the lesson. The teacher and 13 students have negative attitudes towards using only Georgian in teaching Maths: the higher the students’ level of English, the more negative their attitude towards using Georgian in the classroom. Although all the students are Georgian, their competence in English is higher than in Georgian; therefore, they consider English an inseparable part of their identities.
The overall results of this case study of teaching Maths (educational discourse) in one of the private schools in Tbilisi will be presented at the conference.

Keywords: attitudes, bilingualism, code-switching, CLIL, conversation analysis, interactional sociolinguistics

Procedia PDF Downloads 150
1050 Contribution of Automated Early Warning Score Usage to Patient Safety

Authors: Phang Moon Leng

Abstract:

The automated early warning score is a newly developed clinical decision tool used to streamline and improve the process of obtaining a patient’s vital signs, so that a clinical decision can be made at an earlier stage to prevent the patient from deteriorating further. The technology provides an immediate update on the score and the clinical decision to be taken based on the outcome. This paper studies whether an automated early warning score system has assisted the hospital in the early detection and escalation of clinical deterioration and improved patient outcomes. The hospital adopted the Modified Early Warning Score (MEWS) scoring system and the MEWS clinical response on Philips IntelliVue Guardian automated early warning score equipment, and studied whether the process was made leaner, whether the use of the technology improved the nurses’ usage and experience, and whether it improved patient care and outcomes. It was found that the steps required to obtain vital signs were significantly reduced and that vital signs are taken more frequently. The number of deaths and the length of stay decreased significantly, as clinical decisions can be made and escalated more quickly with the automated EWS. The equipment has also improved work efficiency by removing the need to document vital signs in the patient’s EMR. The technology streamlines clinical decision-making, allows faster care and intervention, and improves the overall patient outcome, which translates to better care for patients.
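Automating MEWS amounts to applying fixed scoring bands to each vital sign and summing. The sketch below uses one widely cited published variant of the bands; exact thresholds and escalation triggers vary between hospitals, so treat these as illustrative, not the hospital's configuration.

```python
def mews(resp_rate, heart_rate, systolic_bp, temp_c, avpu):
    """Modified Early Warning Score, one widely cited variant of the
    scoring bands; hospitals tune these, so the values are illustrative."""
    score = 0
    # Respiratory rate (breaths/min)
    if resp_rate < 9: score += 2
    elif resp_rate <= 14: score += 0
    elif resp_rate <= 20: score += 1
    elif resp_rate <= 29: score += 2
    else: score += 3
    # Heart rate (bpm)
    if heart_rate <= 40: score += 2
    elif heart_rate <= 50: score += 1
    elif heart_rate <= 100: score += 0
    elif heart_rate <= 110: score += 1
    elif heart_rate <= 129: score += 2
    else: score += 3
    # Systolic blood pressure (mmHg)
    if systolic_bp <= 70: score += 3
    elif systolic_bp <= 80: score += 2
    elif systolic_bp <= 100: score += 1
    elif systolic_bp <= 199: score += 0
    else: score += 2
    # Temperature (deg C)
    if temp_c < 35.0: score += 2
    elif temp_c < 38.5: score += 0
    else: score += 2
    # Level of consciousness (AVPU)
    score += {"alert": 0, "voice": 1, "pain": 2, "unresponsive": 3}[avpu]
    return score

print(mews(16, 105, 95, 38.7, "voice"))  # 1+1+1+2+1 = 6
```

In an automated system this score is computed the moment the vitals arrive from the monitor, and a high total (here 6) would trigger the configured clinical response without any manual charting step.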

Keywords: automated early warning score, clinical quality and safety, patient safety, medical technology

Procedia PDF Downloads 170