Search results for: automated document processing

3110 Two Years Retrospective Study of Body Fluid Cultures Obtained from Patients in the Intensive Care Unit of General Hospital of Ioannina

Authors: N. Varsamis, M. Gerasimou, P. Christodoulou, S. Mantzoukis, G. Kolliopoulou, N. Zotos

Abstract:

Purpose: Body fluids (pleural, peritoneal, synovial, pericardial, cerebrospinal) are an important element in the detection of microorganisms; for this reason, it is important to examine them in Intensive Care Unit (ICU) patients. Material and Method: Body fluids are transported in sterile containers and enriched as soon as possible with Tryptic Soy Broth (TSB). After one day of incubation, the broth is poured onto selective media: Blood, MacConkey No. 2, Chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. These selective media are incubated for 2 days. After this period, if any microbial colonies are detected, Gram staining is performed. The isolated organisms are then identified by biochemical techniques on the automated MicroScan system (Siemens), followed by a sensitivity test on the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test is verified by a Kirby-Bauer-based plate test. Results: In 2017 the Laboratory of Microbiology received 60 samples of body fluids from the ICU: 6 peritoneal fluid specimens, 18 pleural fluid specimens, and 36 cerebrospinal fluid specimens. 36 positive cultures were obtained. S. epidermidis was identified in 18 specimens, S. haemolyticus in 6, and E. faecium in 12. Conclusions: The results show low detection of microorganisms in body fluid cultures.
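
As a quick arithmetic check, the specimen and isolate counts reported above can be tallied in a few lines (a minimal sketch using only the numbers in this abstract; the dictionary layout is ours, not the authors'):

```python
# Counts as reported in the abstract above.
specimens = {"peritoneal": 6, "pleural": 18, "cerebrospinal": 36}
isolates = {"S. epidermidis": 18, "S. haemolyticus": 6, "E. faecium": 12}

total = sum(specimens.values())       # 60 samples received in 2017
positive = sum(isolates.values())     # 36 positive cultures
print(f"positive culture rate: {positive / total:.0%}")  # 60%
```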

Keywords: body fluids, culture, intensive care unit, microorganisms

Procedia PDF Downloads 187
3109 Mastering Test Automation: Bridging Gaps for Seamless QA

Authors: Rohit Khankhoje

Abstract:

The rapid evolution of software development practices has given rise to an increasing demand for efficient and effective test automation. The paper titled "Mastering Test Automation: Bridging Gaps for Seamless QA" delves into the crucial aspects of test automation, addressing the obstacles faced by organizations in achieving flawless quality assurance. The paper highlights the importance of bridging knowledge gaps within organizations, emphasizing the necessity for management to acquire a deeper comprehension of test automation scenarios, coverage, report trends, and the importance of communication. To tackle these challenges, this paper introduces innovative solutions, including the development of an automation framework that seamlessly integrates with test cases and reporting tools like TestRail and Jira. This integration facilitates the automatic recording of bugs in Jira, enhancing bug reporting and communication between manual QA and automation teams, while TestRail receives every newly added automated test case as soon as it becomes part of the automation suite. The paper demonstrates how this framework empowers management by providing clear insights into ongoing automation activities, bug origins, trend analysis, and test case specifics. "Mastering Test Automation" serves as a comprehensive guide for organizations aiming to enhance their quality assurance processes through effective test automation. It not only identifies the common pitfalls and challenges but also offers practical solutions to bridge the gaps, resulting in a more streamlined and efficient QA process.
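
The kind of integration described can be illustrated with a hedged sketch: on an automated test failure, a bug is filed in Jira through its standard REST API. The endpoint `/rest/api/2/issue` is Jira's documented API, but the instance URL, credentials, project key, and the `file_bug` helper are assumptions for illustration, not the paper's framework:

```python
import requests

JIRA_URL = "https://example.atlassian.net"  # assumed instance URL
AUTH = ("bot@example.com", "api-token")     # assumed credentials

def file_bug(test_name: str, error_log: str) -> str:
    """File a Jira bug for a failed automated test and return its issue key."""
    payload = {
        "fields": {
            "project": {"key": "QA"},  # assumed project key
            "summary": f"Automated test failed: {test_name}",
            "description": error_log,
            "issuetype": {"name": "Bug"},
        }
    }
    resp = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
    resp.raise_for_status()
    return resp.json()["key"]  # e.g. "QA-123"
```

A test runner's failure hook would call `file_bug(...)` so that manual QA sees the defect in Jira without any hand-off step.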

Keywords: automation framework, API integration, test automation, test management tools

Procedia PDF Downloads 56
3108 Processing Big Data: An Approach Using Feature Selection

Authors: Nikat Parveen, M. Ananthi

Abstract:

Big data is one of the emerging technologies; it collects data from various sensors, and those data are used in many fields. Data retrieval is one of the major issues, since exactly the data that are needed must be extracted. In this paper, a large data set is processed using feature selection. Feature selection helps to choose the data that are actually needed to process and execute the task. The key value is what points to the exact data available in the storage space. Here the available data are streamed, and R-Center is proposed to achieve this task.
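
The paper's R-Center method is not detailed in this abstract; as a generic stand-in, the following sketch shows the feature selection step with scikit-learn's SelectKBest (the synthetic data and the choice of k=5 are assumptions):

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 50))           # streamed sensor records, 50 raw features
y = (X[:, 3] + X[:, 7] > 0).astype(int)   # the task depends on only a few features

# Keep only the features that actually matter for the task at hand.
selector = SelectKBest(score_func=f_classif, k=5).fit(X, y)
X_reduced = selector.transform(X)
print("selected feature indices:", selector.get_support(indices=True))
```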

Keywords: big data, key value, feature selection, retrieval, performance

Procedia PDF Downloads 323
3107 Non-Destructive Testing of Selective Laser Melting Products

Authors: Luca Collini, Michele Antolotti, Diego Schiavi

Abstract:

At present, complex geometries, shrinking production times, rapidly increasing demand, and high quality standards make non-destructive (ND) control an indispensable means of verifying additively manufactured components. On the other hand, a technology gap and the lack of standards regulating the methods and the acceptance criteria make the NDT of these components a stimulating field still to be fully explored. To date, penetrant testing, acoustic wave, tomography, radiography, and semi-automated ultrasound methods have been tested on metal-powder-based products. External defects, distortion, surface porosity, roughness, texture, internal porosity, and inclusions are the typical defects in the focus of testing. Detection of density and layer compactness has also been tried on stainless steels by the ultrasonic scattering method. In this work, the authors present and discuss radiographic and ultrasound ND testing of additively manufactured Ti₆Al₄V and Inconel parts obtained by the selective laser melting (SLM) technology. In order to test the possibilities offered by the radiographic method, both X-rays and γ-rays are tried on a set of specifically designed specimens produced by SLM. The specimens contain a family of defects representing the most commonly found types, such as cracks and lack of fusion. The tests are also applied to real parts of various complexity and thickness. A set of practical indications and acceptance criteria is finally drawn.

Keywords: non-destructive testing, selective laser melting, radiography, UT method

Procedia PDF Downloads 129
3106 Stimulation of Stevioside Accumulation on Stevia rebaudiana (Bertoni) Shoot Culture Induced with Red LED Light in TIS RITA® Bioreactor System

Authors: Vincent Alexander, Rizkita Esyanti

Abstract:

Leaves of Stevia rebaudiana contain steviol glycosides, mainly comprising stevioside, a natural sweetener compound that is 100-300 times sweeter than sucrose. The current cultivation method of Stevia rebaudiana in Indonesia has yet to reach its optimum efficiency and productivity for producing stevioside as a safe sugar-substitute sweetener for people with diabetes. An alternative method that is not limited by environmental factors is the in vitro temporary immersion system (TIS) culture method using the recipient for automated immersion (RITA®) bioreactor. The aim of this research was to evaluate the effect of red LED light induction on shoot growth and stevioside accumulation in the TIS RITA® bioreactor system, as an endeavour to increase secondary metabolite synthesis. The results showed that stevioside accumulation in the TIS RITA® bioreactor system induced with red LED light for one hour during the night was higher than in the TIS RITA® bioreactor system without red LED light induction, i.e., 71.04 ± 5.36 μg/g and 42.92 ± 5.40 μg/g, respectively. The biomass growth rate reached 0.072 ± 0.015/day for the red-LED-induced TIS RITA® bioreactor system, whereas the system without induction reached only 0.046 ± 0.003/day. Productivity of Stevia rebaudiana shoots induced with red LED light was 0.065 g/L medium/day, whilst that of shoots without any induction was 0.041 g/L medium/day. Sucrose, salt, and inorganic nutrient consumption in both bioreactor media increased as biomass increased. It can be concluded that Stevia rebaudiana shoots in the TIS RITA® bioreactor induced with red LED light produce more biomass and accumulate a higher stevioside concentration than in the bioreactor without any light induction.
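
The quoted biomass growth rates are consistent with the standard specific growth rate formula µ = ln(Wf/W0)/t; a minimal sketch (the 21-day culture period and the fresh weights below are assumptions for illustration, not values from this abstract):

```python
import math

def specific_growth_rate(w0: float, wf: float, days: float) -> float:
    """mu = ln(Wf / W0) / t, in day^-1."""
    return math.log(wf / w0) / days

# Illustrative initial/final fresh weights over an assumed 21-day run.
mu = specific_growth_rate(w0=1.0, wf=4.5, days=21)
print(f"mu = {mu:.3f} per day")  # ~0.072/day, the order reported for the red-LED system
```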

Keywords: LED, Stevia rebaudiana, stevioside, TIS RITA®

Procedia PDF Downloads 355
3105 Verbal Working Memory in Sequential and Simultaneous Bilinguals: An Exploratory Study

Authors: Archana Rao R., Deepak P., Chayashree P. D., Darshan H. S.

Abstract:

Cognitive abilities in bilinguals have been widely studied over the last few decades. Bilingualism has been found to extensively facilitate the ability to store and manipulate information in Working Memory (WM). The mechanism of WM includes primary memory, attentional control, and secondary memory, each of which makes a contribution to WM. Much research has attempted to measure WM capabilities through both verbal (phonological) and nonverbal (visuospatial) tasks. Since there is much speculation regarding the relationship between WM and bilingualism, further investigation is required to understand the nature of WM in bilinguals, i.e., with respect to sequential and simultaneous bilinguals. Hence the present study aimed to highlight the verbal working memory abilities of sequential and simultaneous bilinguals with respect to the processing and recall of nouns and verbs. Two groups of bilinguals aged between 18-30 years were considered for the study. Group 1 consisted of 20 (10 males and 10 females) sequential bilinguals who had acquired L1 (Kannada) before the age of 3 and had exposure to L2 (English) for a period of 8-10 years. Group 2 consisted of 20 (10 males and 10 females) simultaneous bilinguals who had acquired both L1 and L2 before the age of 3. Working memory abilities were assessed using two tasks and a set of stimuli presented in gradation of complexity; the stimuli included frequent and infrequent nouns and verbs. The tasks required the participants to judge the correctness of each sentence while simultaneously remembering its last word, and the participants were instructed to recall the words at the end of each set. The results indicated no significant difference between sequential and simultaneous bilinguals in processing nouns and verbs, which could be attributed to the participants' proficiency level in L1 and the similar cognitive abilities of the two groups. Recall of nouns was better than that of verbs, possibly because of the complex argument structure involved in verbs. The authors also found that the frequency of occurrence of nouns and verbs had an effect on WM abilities. A difference was also found across gradation, due to the load imposed on the central executive function and the phonological loop.

Keywords: bilinguals, nouns, verbs, working memory

Procedia PDF Downloads 111
3104 Indian Business-Papers in Industrial Revolution 4.0: A Paradigm Shift

Authors: Disha Batra

Abstract:

The Industrial Revolution 4.0 is quite different, and a paradigm shift is underway in the media industry. With the advent of automated journalism and social media platforms, newspaper organizations have changed the way news is gathered and reported. The emergence of the fourth industrial revolution in the early 21st century has forced newspapers to adopt changing technologies to remain relevant. This paper investigates the content of Indian business-papers in the era of the fourth industrial revolution and how these organizations have evolved in the time of convergence. The study is a content analysis of the top three Indian business dailies, as per the IRS (Indian Readership Survey) 2017, over a decade. A parametric analysis of different parameters (source of information, use of illustrations, advertisements, layout, framing, etc.) has been done in order to identify the distinct adaptations and modifications made by these dailies. The paper also dwells upon a thematic analysis of these newspapers in order to explore the coverage given to various sub-themes of EBF (economic, business, and financial) journalism. Further, this study reveals the effect of high-speed algorithm-based trading, in the aftermath of the fourth industrial revolution, on the creative and investigative aspects of delivering financial stories in these newspapers. The study indicates an ongoing paradigm shift in the business newspaper industry, with a change in the sources of information gathering along with a subtle increase in the coverage of financial news stories over time.

Keywords: business-papers, business news, financial news, industrial revolution 4.0

Procedia PDF Downloads 102
3103 Development of the Internal Educational Quality Assurance System of Suan Sunandha Rajabhat University

Authors: Nipawan Tharasak, Sajeewan Darbavasu

Abstract:

This research aims 1) to study the opinions, problems, and obstacles concerning the internal educational quality assurance system at the individual and university levels, and 2) to propose an approach to the development of the quality assurance system of Suan Sunandha Rajabhat University. A study of problems and obstacles concerning the internal educational quality assurance system of the university was conducted with a sample group consisting of staff and quality assurance committee members of the year 2010. There were 152 respondents, and 5 executives were interviewed. Tools used in the research were document analysis, structured interview questions, and questionnaires with a 5-point rating scale. Reliability was 0.981. Data were analyzed using percentage, mean, and standard deviation, together with content analysis. Results can be divided into 3 main points: (1) Implementation of the internal quality assurance system of the university: overall, input, process, and output factors received high scores. When each item was considered, preparation, planning, monitoring and evaluation, and the use of evaluation results for reporting and improvement received high scores; however, the process factor received only an average score. (2) Problems and obstacles: the personnel responsible still lack understanding of the indicators and criteria of quality assurance. (3) Development approach: staff should be encouraged to develop a better understanding of the quality assurance system; a database system for quality assurance should be developed; and the results and suggestions should be applied in the next year's development planning.

Keywords: development system, internal quality assurance, education, educational quality assurance

Procedia PDF Downloads 279
3102 Comparison of Tribological and Mechanical Properties of White Metal Produced by Laser Cladding and Conventional Methods

Authors: Jae-Il Jeong, Hoon-Jae Park, Jung-Woo Cho, Yang-Gon Kim, Jin-Young Park, Joo-Young Oh, Si-Geun Choi, Seock-Sam Kim, Young Tae Cho, Chan Gyu Kim, Jong-Hyoung Kim

Abstract:

Bearing components are strongly required to show reduced vibration and wear in order to achieve high durability and a long lifetime. In industry, bearing durability is improved by surface treatment of the bearing surface using the centrifugal casting or gravity casting production method. However, this manufacturing method has caused problems such as long processing times, high defect rates, and harmful health effects. To solve these problems, laser cladding deposition treatment is available, which provides fast processing and good adhesion. Therefore, optimum conditions for white metal laser deposition should be studied to minimize bearing contact axis wear using laser cladding techniques. In this study, we deposit a soft white metal layer on SCM440, which is mainly used for shafts and bolts. In the laser deposition process, the laser power, powder feed rate, and laser head speed are controlled to find the optimal conditions. We also measure hardness using a micro Vickers tester and perform FE-SEM (Field Emission Scanning Electron Microscopy) and EDS (Energy Dispersive Spectroscopy) analyses to study the mechanical properties and surface characteristics under various parameter changes. Furthermore, this paper suggests the optimum conditions for laser cladding deposition to apply in industrial fields. This work was supported by the Industrial Innovation Project of the Korea Evaluation Institute of Industrial Technology (KEIT), granted financial resources from the Ministry of Trade, Industry & Energy, Republic of Korea (Research no. 10051653).

Keywords: laser deposition, bearing, white metal, mechanical properties

Procedia PDF Downloads 249
3101 Measuring the Effect of Co-Composting Oil Sludge with Pig, Cow, Horse and Poultry Manures on the Degradation of Selected Polycyclic Aromatic Hydrocarbon Concentrations

Authors: Ubani Onyedikachi, Atagana Harrison Ifeanyichukwu, Thantsha Mapitsi Silvester

Abstract:

Components of oil sludge (PAHs) are known to be cytotoxic, mutagenic, and potentially carcinogenic compounds; bacteria and fungi, however, have been found to degrade PAHs to innocuous compounds. This study is aimed at measuring the effect of pig, cow, horse, and poultry manures on the degradation of selected PAHs present in oil sludge. Soil spiked with oil sludge was co-composted separately with each manure in a ratio of 2:1 (w/w) spiked soil:manure and with wood chips in a ratio of 2:1 (w/v) spiked soil:wood chips. A control was set up in the same way but without manure. The mixtures were incubated for 10 months at room temperature. Compost piles were turned weekly, and the moisture level was maintained between 50% and 70%. Moisture level, pH, temperature, CO2 evolution, and oxygen consumption were measured monthly, and the ash content at the end of experimentation. The highest temperature reached was 27.5 °C in all compost heaps, pH ranged from 5.5 to 7.8, and CO2 evolution was highest in poultry manure at 18.78 μg/dwt/day. Microbial growth and activities were enhanced; the bacteria identified were Bacillus, Arthrobacter, and Staphylococcus species. Percentage reduction in PAHs was measured using an automated Soxhlet extractor with dichloromethane, coupled with gas chromatography/mass spectrometry (GC/MS). Results from the PAH measurements showed reductions between 77% and 99%. Co-composting of spiked soils with animal manures enhanced the reduction of PAHs.
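
Percentage reduction of each PAH is a simple before/after calculation; a minimal sketch (the concentrations below are invented placeholders, not study data):

```python
def percent_reduction(initial_mg_kg: float, final_mg_kg: float) -> float:
    """Percent loss of a PAH over the composting period."""
    return 100.0 * (initial_mg_kg - final_mg_kg) / initial_mg_kg

# Illustrative GC/MS readings for one PAH before and after 10 months.
print(f"{percent_reduction(120.0, 6.0):.0f}% reduction")  # 95%
```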

Keywords: animal manures, bioremediation, co-composting, oil refinery sludge, PAHs

Procedia PDF Downloads 253
3100 Establishment of Precision System for Underground Facilities Based on 3D Absolute Positioning Technology

Authors: Yonggu Jang, Jisong Ryu, Woosik Lee

Abstract:

The study aims to address the limitations of existing underground facility exploration equipment in terms of exploration depth range, relative depth measurement, data processing time, and human-centered interpretation of ground penetrating radar (GPR) images. It proposes the use of 3D absolute positioning technology to develop a precision underground facility exploration system that can accurately survey up to a depth of 5 m and measure the 3D absolute location of underground facilities. Both software and hardware technologies were developed to build the system. The software technologies include absolute positioning, ground-surface location synchronization for the GPR exploration equipment, AI interpretation of GPR exploration images, and composite data processing based on integrated underground space maps; in more detail, the software builds a precise 3D DEM, synchronizes the GPR sensor's 3D ground-surface coordinates, automatically analyzes and detects underground facility information in GPR exploration images, and improves accuracy through comparative analysis of the three-dimensional location information. The hardware comprises a vehicle-type exploration system and a cart-type exploration system. Data were collected using the developed system, the GPR exploration images were analyzed using AI technology, and the three-dimensional location information of the explored underground facilities was compared to the integrated underground space map. The proposed precision exploration system contributes significantly to establishing precise location information for underground facilities, which is crucial for underground safety management in Korea, and improves the accuracy and efficiency of exploration.
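
The ground-surface location synchronization step can be illustrated by interpolating GNSS fixes onto GPR trace timestamps (a hedged sketch; the `tag_traces` helper and the sample values are assumptions for illustration, not the system's actual software):

```python
import numpy as np

def tag_traces(trace_t: np.ndarray, gps_t: np.ndarray, gps_xyz: np.ndarray) -> np.ndarray:
    """Interpolate GNSS positions (x, y, z) onto GPR trace timestamps."""
    return np.stack(
        [np.interp(trace_t, gps_t, gps_xyz[:, i]) for i in range(3)], axis=1
    )

# Illustrative data: three GNSS fixes and five GPR traces within the same window.
gps_t = np.array([0.0, 1.0, 2.0])                       # seconds
gps_xyz = np.array([[0.0, 0.0, 35.0],
                    [1.2, 0.1, 35.0],
                    [2.4, 0.2, 35.1]])                   # metres, absolute frame
trace_t = np.linspace(0.0, 2.0, 5)
print(tag_traces(trace_t, gps_t, gps_xyz))               # one (x, y, z) per trace
```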

Keywords: 3D absolute positioning, AI interpretation of GPR exploration images, complex data processing, integrated underground space maps, precision exploration system for underground facilities

Procedia PDF Downloads 47
3099 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides a road map for future development in FreqAI.
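
FreqAI is open source, but the sketch below deliberately avoids its actual API; it only illustrates the outlier-removal idea described above, i.e., train on a sliding window and refuse to act on prediction points that fall outside the training data's parameter space (the simple z-score threshold is an assumed stand-in for the framework's own dissimilarity measures):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def train_and_guard(X_train, y_train, X_new, k: float = 3.0):
    """Train on recent data; mask new points far from the training cloud."""
    model = GradientBoostingRegressor().fit(X_train, y_train)
    mean, std = X_train.mean(axis=0), X_train.std(axis=0) + 1e-9
    # Per-feature z-score distance of each candidate point from training data.
    z = np.abs((X_new - mean) / std).max(axis=1)
    inliers = z < k                    # reject parameter-space outliers
    preds = model.predict(X_new)
    preds[~inliers] = np.nan           # do not act on outlier predictions
    return preds, inliers
```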

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 75
3098 A Proposal of Advanced Key Performance Indicators for Assessing Six Performances of Construction Projects

Authors: Wi Sung Yoo, Seung Woo Lee, Youn Kyoung Hur, Sung Hwan Kim

Abstract:

Large-scale construction projects are continuously increasing in number, and the need for tools to monitor and evaluate project success is emphasized. At the construction industry level, there are limitations in deriving performance evaluation factors that reflect the diversity of construction sites, and in systems that can objectively evaluate and manage performance. Additionally, there are difficulties in integrating the structured and unstructured data generated at construction sites and deriving improvements. In this study, we propose Key Performance Indicators (KPIs) to enable performance evaluation that reflects the increased diversity of construction sites and the unstructured data generated, and we present a model for measuring performance with the derived indicators. The comprehensive performance of a unit construction site is assessed in 6 areas (Time, Cost, Quality, Safety, Environment, Productivity) through 26 indicators. We collect performance indicator information from 30 construction sites that meet legal standards and have been successfully completed, and we apply data augmentation and optimization techniques to establish measurement standards for each indicator. In other words, the KPIs for construction site performance evaluation presented in this study provide standards for evaluating performance in six areas using institutional requirement data and document data. This can be expanded to establish a performance evaluation system considering the scale and type of construction project. The indicators are also expected to serve as a comprehensive index for the construction industry and as basic data for tracking competitiveness at the national level and establishing policies.
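
A composite site score over the six areas can be sketched as a weighted aggregation of normalized indicators (all scores and the equal weights below are assumptions for illustration; the study's actual measurement standards are not reproduced):

```python
# Normalized area scores (0-1) for one site, aggregated with assumed weights.
areas = {"time": 0.82, "cost": 0.74, "quality": 0.91,
         "safety": 0.88, "environment": 0.79, "productivity": 0.70}
weights = {k: 1 / 6 for k in areas}  # equal weighting as a placeholder

composite = sum(areas[k] * weights[k] for k in areas)
print(f"composite KPI score: {composite:.2f}")  # 0.81
```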

Keywords: key performance indicator, performance measurement, structured and unstructured data, data augmentation

Procedia PDF Downloads 15
3097 Surgical Prep-Related Burns in Laterally Positioned Hip Procedures

Authors: B. Kenny, M. Dixon, A. Boshell

Abstract:

The use of alcoholic surgical prep was recently introduced at the Royal Newcastle Centre for elective procedures. In the past 3 months there have been a significant number of burns believed to be related to 'pooling' of this surgical prep in patients undergoing procedures in which they are placed in the lateral position with hip bolsters. The aim of the audit was to determine the reason for the burns, analyze what pre-existing factors may contribute to their development, and identify what can be changed to prevent further burns from occurring. All patients undergoing a procedure performed on the hip who were placed in the lateral position with sacral and anterior superior iliac spine (ASIS) support with 'bolsters' were included in the audit. Patients who developed a 'burn' were recorded; details of the surgery, demographics, surgical prep used, and length of surgery were obtained, and photographs were taken to document each burn. Measures were then taken to prevent further burns, and their efficacy was documented. Overall, 14 patients developed burns over the ipsilateral ASIS. Of these, 13 had undergone total hip arthroplasty (THA) and 1 removal of a femoral nail. All patients had Chlorhexidine 0.5% in Alcohol 70% Tinted Red surgical preparation or Betadine Alcoholic Skin Prep (70% EtOH). Patients were set up in the standard lateral decubitus position with sacral and bilateral ASIS bolsters covered with Velband. 86% of patients were found to have pre-existing hypersensitivities to various substances. There is very little literature besides a few case reports on surgical prep-related burns. The case reports that do exist relate to tourniquet-related burns, and there is no mention in the literature of 'bolster'-related burns. The burns are hypothesized to be caused by pooling of the alcoholic solution, which is amplified by the use of Velband.

Keywords: arthroplasty, chemical burns, wounds, rehabilitation

Procedia PDF Downloads 283
3096 Arabic Light Word Analyser: Roles with Deep Learning Approach

Authors: Mohammed Abu Shquier

Abstract:

This paper introduces a word segmentation method using a novel BP-LSTM-CRF architecture for processing semantic output training. The objective of such morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also encompasses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which together provide justification for updating this system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. Therefore, it is necessary to have an automatic analysis tool that takes into account the word sense and not only the morpho-syntactic category. Moreover, these systems are also based on statistical/stochastic models. Such stochastic models, for example HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc. As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage is very low for dialectal Arabic, it is important to investigate deeply how dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that handling dialectal variability can help improve analysis.
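
As a rough illustration of the segmentation back-end (the paper's BP-LSTM-CRF is not reproduced here; this sketch is a plain character-level BiLSTM tagger in PyTorch, with the CRF layer omitted and all sizes assumed):

```python
import torch
import torch.nn as nn

class BiLSTMSegmenter(nn.Module):
    """Tags each character with B (begins a word) or I (inside a word)."""
    def __init__(self, vocab_size: int, emb: int = 64, hidden: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.lstm = nn.LSTM(emb, hidden, bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, 2)   # 2 tags: B, I

    def forward(self, char_ids):              # (batch, seq_len)
        h, _ = self.lstm(self.embed(char_ids))
        return self.out(h)                    # (batch, seq_len, 2) logits

model = BiLSTMSegmenter(vocab_size=5000)
logits = model(torch.randint(0, 5000, (1, 12)))  # one 12-character input
print(logits.argmax(-1))                          # predicted B/I tag sequence
```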

Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN

Procedia PDF Downloads 23
3095 GA3C for Anomalous Radiation Source Detection

Authors: Chia-Yi Liu, Bo-Bin Xiao, Wen-Bin Lin, Hsiang-Ning Wu, Liang-Hsun Huang

Abstract:

In order to reduce the risk of radiation damage that personnel may suffer during operations in a radiation environment, the use of automated guided vehicles to assist or replace on-site personnel has become a key technology and an important trend. In this paper, we demonstrate a proof of concept for an autonomous self-learning radiation source searcher in an unknown environment without a map. The research uses the GPU version of the Asynchronous Advantage Actor-Critic network (GA3C) of deep reinforcement learning to search for radiation sources. The searcher network, based on the GA3C architecture, learned and improved in a self-directed manner how to search for an anomalous radiation source through training over 1 million episodes in three simulation environments. In each training episode, the radiation source position, radiation source intensity, and starting position are all set randomly within one simulation environment. The input to the searcher network is the fused data from a 2D laser scanner and an RGB-D camera, together with the value of the radiation detector. The output actions are the linear and angular velocities. The searcher network is trained in a simulation environment to accelerate the learning process. The well-performing searcher network is then deployed to a real unmanned vehicle, Dashgo E2, which mounts a YDLIDAR G4 LIDAR, an Intel D455 RGB-D camera, and a radiation detector made by the Institute of Nuclear Energy Research. In the field experiment, the unmanned vehicle was able to search out an 18.5 MBq Na-22 radiation source by itself while simultaneously avoiding obstacles, without human interference.
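
The actor-critic core can be sketched as a network with a shared torso, a policy head over discretized velocity commands, and a value head (layer sizes, the 364-dimensional fused observation, and the 9-action set are assumptions, not the paper's configuration):

```python
import torch
import torch.nn as nn

class ActorCritic(nn.Module):
    """Shared torso with policy and value heads, as in A3C/GA3C."""
    def __init__(self, obs_dim: int, n_actions: int):
        super().__init__()
        self.torso = nn.Sequential(nn.Linear(obs_dim, 256), nn.ReLU(),
                                   nn.Linear(256, 128), nn.ReLU())
        self.policy = nn.Linear(128, n_actions)  # discretized (v, w) commands
        self.value = nn.Linear(128, 1)           # state-value estimate

    def forward(self, obs):
        h = self.torso(obs)
        return torch.softmax(self.policy(h), dim=-1), self.value(h)

# Fused observation: laser scan + depth features + radiation counter reading.
net = ActorCritic(obs_dim=364, n_actions=9)
probs, v = net(torch.randn(1, 364))  # action distribution and value for one state
```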

Keywords: deep reinforcement learning, GA3C, source searching, source detection

Procedia PDF Downloads 99
3094 Intelligent Process and Model Applied for E-Learning Systems

Authors: Mafawez Alharbi, Mahdi Jemmali

Abstract:

E-learning is a developing area, especially in education, and can provide several benefits to learners. An intelligent system that collects all components satisfying user preferences is therefore important. This research presents an approach capable of personalizing e-information and giving users what they need according to their preferences. The proposal builds knowledge from successive evaluations made by the user and, in addition, can learn from the user's habits. Finally, we show a walk-through to demonstrate how the intelligent process works.

Keywords: artificial intelligence, architecture, e-learning, software engineering, processing

Procedia PDF Downloads 175
3093 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis

Authors: Mehrnaz Mostafavi

Abstract:

The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
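
The sentence-level classification step can be sketched with a standard text classification pipeline (the sentences, labels, and the TF-IDF/logistic regression choice below are illustrative stand-ins; the study's SQL layer and actual feature set are not reproduced):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy report sentences labeled 1 (mentions a lung nodule) or 0 (does not).
sentences = ["A 6 mm nodule is seen in the right upper lobe.",
             "No focal pulmonary lesion identified.",
             "Stable 4 mm left lower lobe nodule.",
             "Heart size is normal."]
labels = [1, 0, 1, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(sentences, labels)
print(clf.predict(["There is a new 8 mm nodule."]))  # -> [1]
```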

Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans

Procedia PDF Downloads 65
3092 Multiscale Process Modeling Analysis for the Prediction of Composite Strength Allowables

Authors: Marianna Maiaru, Gregory M. Odegard

Abstract:

During the processing of high-performance thermoset polymer matrix composites, chemical reactions occur during elevated pressure and temperature cycles, causing the constituent monomers to crosslink and form a molecular network that gradually can sustain stress. As the crosslinking process progresses, the material naturally experiences a gradual shrinkage due to the increase in covalent bonds in the network. Once the cured composite completes the cure cycle and is brought to room temperature, the thermal expansion mismatch of the fibers and matrix causes additional residual stresses to form. These compounded residual stresses can compromise the reliability of the composite material and affect the composite strength. Composite process modeling is greatly complicated by the multiscale nature of the composite architecture. At the molecular level, the degree of cure controls the local shrinkage and thermal-mechanical properties of the thermoset. At the microscopic level, the local fiber architecture and packing affect the magnitudes and locations of residual stress concentrations. At the macroscopic level, the layup sequence controls the nature of crack initiation and propagation due to residual stresses. The goal of this research is to use molecular dynamics (MD) and finite element analysis (FEA) to predict the residual stresses in composite laminates and the corresponding effect on composite failure. MD is used to predict the polymer shrinkage and thermomechanical properties as a function of degree of cure. This information is used as input into FEA to predict the residual stresses on the microscopic level resulting from the complete cure process. Virtual testing is subsequently conducted to predict strength allowables. Experimental characterization is used to validate the modeling.
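
The matrix strain that drives the residual stresses can be estimated, to first order, as cure shrinkage plus the fiber/matrix thermal mismatch on cool-down; a minimal one-dimensional sketch (all property values are illustrative assumptions, not from the study):

```python
def matrix_residual_strain(shrink: float, alpha_m: float, alpha_f: float,
                           t_cure: float, t_room: float) -> float:
    """1D estimate: cure shrinkage plus fiber/matrix CTE-mismatch strain."""
    thermal = (alpha_m - alpha_f) * (t_cure - t_room)
    return shrink + thermal

# Illustrative epoxy/carbon values (CTEs in 1/K, temperatures in deg C).
eps = matrix_residual_strain(shrink=0.02, alpha_m=55e-6, alpha_f=0.5e-6,
                             t_cure=180.0, t_room=20.0)
print(f"residual matrix strain ~ {eps:.4f}")  # ~0.0287
```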

Keywords: molecular dynamics, finite element analysis, processing modeling, multiscale modeling

Procedia PDF Downloads 80
3091 Evaluation of the Benefit of Anti-Endomysial IgA and Anti-Tissue Transglutaminase IgA Antibodies for the Diagnosis of Coeliac Disease in a University Hospital, 2010-2016

Authors: Recep Keşli, Onur Türkyılmaz, Hayriye Tokay, Kasım Demir

Abstract:

Objective: Coeliac disease (CD) is a primary small intestine disorder caused by high sensitivity to gluten, which is present in cereal crops, and is characterized by inflammation of the small intestine mucosa. The goal of this study was to determine and compare the sensitivity and specificity values of anti-endomysial IgA (EMA IgA) (IFA) and anti-tissue transglutaminase IgA (anti-tTG IgA) (ELISA) antibodies in the diagnosis of patients with suspected CD. Methods: One thousand two hundred and seventy-three patients who presented to the gastroenterology and pediatric polyclinics of Afyon Kocatepe University ANS Research and Practice Hospital between 23.09.2010 and 30.05.2016 were included in the study. Serum samples were investigated for EMA positivity by the immunofluorescence method (Euroimmun, Luebeck, Germany). In order to determine the quantitative value of anti-tTG IgA (EIA) (Orgentec, Mainz, Germany), a fully automated ELISA device (Alisei, Seac, Firenze, Italy) was used. Results: Out of 1273 patients, 160 were diagnosed with coeliac disease according to the ESPGHAN 2012 diagnostic criteria; of these 160 CD patients, 120 were female and 40 were male. The EMA specificity and sensitivity were calculated as 98% and 80%, respectively. The specificity and sensitivity of anti-tTG IgA were determined as 99% and 96%, respectively. Conclusion: The specificity of EMA for CD was excellent, because all EMA-positive patients (n = 144) were diagnosed with CD. The presence of human anti-tTG IgA was found to be a reliable marker for diagnosis and follow-up of CD. Diagnosis of CD should be established on the clinical and serologic profiles together.
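
Sensitivity and specificity follow directly from the confusion counts; a minimal sketch (the counts below are illustrative reconstructions consistent with the reported EMA percentages, not published data):

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts: 160 CD patients, 1113 non-CD patients.
sens, spec = sens_spec(tp=128, fn=32, tn=1091, fp=22)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 80%, 98%
```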

Keywords: anti-endomysial antibody, anti-tTG IgA, coeliac disease, immunofluorescence assay (IFA)

Procedia PDF Downloads 246
3090 The Link between Migration Status and Occupational Health and Safety of Filipino Migrant Workers in South Korea

Authors: Lito M. Amit, Venecio U. Ultra, Young Woong Song

Abstract:

The purpose of this study was to document the prevalence and types of work-related health and safety problems among Filipino migrant workers and the link between their migration status and occupational health and safety (OHS) problems. We conducted a survey among 116 Filipino migrant workers, both legal and undocumented. To assess the various forms of occupational health problems, we utilized the Korean Occupational Stress Scale (KOSS), the Nordic Musculoskeletal Questionnaire (NMQ), and a validated health and safety questionnaire. A focus group discussion (FGD) was also conducted to record relevant information not captured by the questionnaires. Descriptive data were presented as frequencies with percentages, means, and standard deviations. Chi-square tests and logistic regression analyses were performed to estimate the degree of association between variables (p < 0.05). Among the eight subscales of the KOSS, inadequate social support (2.48), organizational injustice (2.57), and lack of reward (2.52) were experienced by the workers. There was a 44.83% prevalence of musculoskeletal disorders, with the arm/elbow region having the highest rate, followed by the shoulder and low back regions. Inadequate social support, discomfort in the organizational climate, and overall MSD prevalence showed significant relationships with migration status (p < 0.05), and there was a positive association between migration status and seven items under language and communication. In summary, a positive association was seen between migration status and some of the OHS problems of Filipino migrant workers in Korea; undocumented workers in this study appeared more vulnerable to these stressors than those employed legally.
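
The chi-square association test mentioned above can be sketched with SciPy (the 2x2 counts are invented so that overall MSD prevalence matches the reported 44.83%; they are not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Illustrative 2x2 table: migration status (rows) vs. MSD present/absent (cols).
table = np.array([[22, 33],    # legal workers:        MSD yes / no
                  [30, 31]])   # undocumented workers: MSD yes / no
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # association is significant if p < 0.05
```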

Keywords: Filipino workers, migration status, occupational health and safety, undocumented workers

Procedia PDF Downloads 113
3089 A Qualitative Study Examining the Process of EFL Course Design from the Perspectives of Teachers

Authors: Iman Al Khalidi

Abstract:

Recently, English has become the language of globalization and technology. In turn, this has resulted in a seemingly bewildering array of influences and trends in the domain of the TESOL curriculum. In light of these changes, higher education has to provide a new and more powerful kind of education: it should prepare students to be more engaged citizens, more capable of solving complex problems at work, and well prepared to lead a meaningful life. In response to this, universities, colleges, schools, and departments have to work in light of the requirements and challenges of the global and technological era. Consequently, they have to focus on the adoption of a contemporary curriculum in line with the pedagogical shift from a teaching-centered to a learning-centered approach. Ideally, there has been noticeable emphasis on the crucial importance of developing and professionalizing teachers in order to engage them in the process of curriculum development and action research. This is a qualitative study that aims at understanding and exploring the process of designing EFL courses by teachers at the tertiary level, from the perspectives of the participants, in a professional context in TESOL: the Department of English of a private college in Oman. It is a case study that stands on the philosophy of the qualitative approach. It employs multiple methods for collecting qualitative data: semi-structured interviews with teachers, focus group discussions with students, and document analysis. The collected data have been analyzed qualitatively by adopting Miles and Huberman's approach, using procedures of reduction, coding, displaying, and conclusion drawing and verification.

Keywords: course design, components of course design, case study, data analysis

Procedia PDF Downloads 526
3088 A Qualitative Study Examining the Process of Course Design from the Perspectives of Teachers

Authors: Iman Al Khalidi

Abstract:

Recently, English has become the language of globalization and technology. In turn, this has resulted in a seemingly bewildering array of influences and trends in the domain of the TESOL curriculum. In light of these changes, higher education has to provide a new and more powerful kind of education: it should prepare students to be more engaged citizens, more capable of solving complex problems at work, and well prepared to lead a meaningful life. In response to this, universities, colleges, schools, and departments have to work in light of the requirements and challenges of the global and technological era. Consequently, they have to focus on the adoption of a contemporary curriculum in line with the pedagogical shift from a teaching-centered to a learning-centered approach. Ideally, there has been noticeable emphasis on the crucial importance of developing and professionalizing teachers in order to engage them in the process of curriculum development and action research. This is a qualitative study that aims at understanding and exploring the process of designing EFL courses by teachers at the tertiary level, from the perspectives of the participants, in a professional context in TESOL: the Department of English of a private college in Oman. It is a case study that stands on the philosophy of the qualitative approach. It employs multiple methods for collecting qualitative data: semi-structured interviews with teachers, focus group discussions with students, and document analysis. The collected data have been analyzed qualitatively by adopting Miles and Huberman's approach, using procedures of reduction, coding, displaying, and conclusion drawing and verification.

Keywords: course design, components of course design, case study, data analysis

Procedia PDF Downloads 431
3087 Bactericidal Efficacy of Quaternary Ammonium Compound on Carriers with Food Additive Grade Calcium Hydroxide against Salmonella Infantis and Escherichia coli

Authors: M. Shahin Alam, Satoru Takahashi, Mariko Itoh, Miyuki Komura, Mayuko Suzuki, Natthanan Sangsriratanakul, Kazuaki Takehara

Abstract:

Cleaning and disinfection are key components of routine biosecurity in livestock farming and the food processing industry. The usage of suitable disinfectants at their proper concentrations is an important factor for a successful biosecurity program. Disinfectants have optimum bactericidal and virucidal efficacies at temperatures above 20°C, but very few studies on the application and effectiveness of disinfectants at low temperatures have been done. In the present study, the bactericidal efficacies of food additive grade calcium hydroxide (FdCa(OH)₂), a quaternary ammonium compound (QAC), and their mixture were investigated under different conditions, including contact time, organic material (fetal bovine serum: FBS), and temperature, in both suspension and carrier tests. Salmonella Infantis and Escherichia coli, which are the most prevalent gram-negative bacteria in commercial poultry housing and the food processing industry, were used in this study. Initially, we evaluated these disinfectants at two different temperatures (4°C and room temperature (RT) (25°C ± 2°C)) and 7 contact times (0, 5, and 30 sec; 1, 3, 20, and 30 min), with suspension tests either in the presence or absence of 5% FBS. Secondly, we investigated the bactericidal efficacies of these disinfectants by carrier tests (rubber, stainless steel, and plastic) at the same temperatures and 4 contact times (30 sec; 1, 3, and 5 min). Then, we compared the bactericidal efficacies of each disinfectant with that of their mixture, as follows. When QAC was diluted with redistilled water (dW2) at 1:500 (QACx500) to obtain a final didecyl-dimethylammonium chloride (DDAC) concentration of 200 ppm, it could inactivate Salmonella Infantis within 5 sec at RT either with or without 5% FBS in the suspension test; however, at 4°C it required 30 min in the presence of 5% FBS. FdCa(OH)₂ solution alone could inactivate the bacteria within 1 min both at RT and 4°C, even with 5% FBS. When FdCa(OH)₂ powder was added at a final concentration of 0.2% to QACx500 (Mix500), the mixture could inactivate the bacteria within 30 sec and 5 sec, respectively, with or without 5% FBS at 4°C. The findings from the suspension test indicated that low temperature inhibited the bactericidal efficacy of QAC, whereas Mix500 was effective regardless of short contact time and low temperature, even with 5% FBS. In the carrier test, a single disinfectant required a bit more time to inactivate the bacteria on rubber and plastic surfaces than on stainless steel. However, Mix500 could inactivate S. Infantis on rubber, stainless steel, and plastic surfaces within 30 sec and 1 min, respectively, at RT and 4°C; for E. coli, it required only 30 sec at both temperatures. Thus, synergistic effects were observed on different carriers at both temperatures. For a successful enhancement of biosecurity during winter, disinfectants should be selected that achieve optimum efficacy against the target pathogens at short contact times. The present study's findings help farmers to make proper strategies for the application of disinfectants in livestock farming and the food processing industry.

Keywords: carrier, food additive grade calcium hydroxide (FdCa(OH)₂), quaternary ammonium compound, synergistic effects

Procedia PDF Downloads 282
3086 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, this paper establishes a specific set of three-level data rights. It analyzes the cases Google v. Vidal-Hall, Halliday v. Creation Consumer Finance, Douglas v. Hello Limited, Campbell v. MGN, and Imerman v. Tchenquiz, and concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will be beneficial for establishing the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 10
3085 Building Atmospheric Moisture Diagnostics: Environmental Monitoring and Data Collection

Authors: Paula Lopez-Arce, Hector Altamirano, Dimitrios Rovas, James Berry, Bryan Hindle, Steven Hodgson

Abstract:

Efficient mould remediation and accurate diagnostics of the moisture that leads to condensation and mould growth in dwellings are largely untapped. A number of factors are contributing to the rising trend of excessive moisture in homes, mainly linked with modern living, increased levels of occupation, and rising fuel costs, as well as with making homes more energy efficient. Environmental monitoring by means of data collection through logger sensors and survey forms has been performed in a range of buildings from different UK regions. Air and surface temperature and relative humidity values of residential areas affected by condensation and/or mould issues were recorded. Additional measurements were taken in different trials by changing the type, location, and position of the loggers. In some instances, IR thermal images and ventilation rates were also acquired. Results have been interpreted together with key environmental parameters by processing and connecting data from loggers and survey questionnaires, in buildings both with and without moisture issues. Monitoring exercises carried out during winter and spring show the importance of developing and following accurate protocols for guidance, in order to obtain consistent, repeatable, and comparable results and to improve the performance of environmental monitoring. A model and a protocol are being developed to build a diagnostic tool, with the goal of performing a simple but precise residential atmospheric moisture diagnosis that distinguishes the cause behind condensation and mould generation, i.e., a ventilation, insulation, or heating system issue. This research shows the relevance of monitoring and processing environmental data to assign moisture risk levels and determine the origin of condensation or mould when dealing with excess atmospheric moisture in a building.
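
Condensation risk from logged air temperature and relative humidity is commonly assessed against the dew point; a sketch using the standard Magnus approximation (a generic formula, not necessarily the project's exact protocol):

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Magnus approximation for dew point temperature in Celsius."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A surface colder than the dew point of the nearby air risks condensation.
print(f"dew point: {dew_point_c(20.0, 65.0):.1f} C")  # ~13.2 C
```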

Keywords: environmental monitoring, atmospheric moisture, protocols, mould

Procedia PDF Downloads 126
3084 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review

Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha

Abstract:

Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. Natural language processing techniques are a natural fit for classifying this text and supporting unbiased decision-making, yet properly classifying such textual information in a given context has proved difficult. We therefore conducted a systematic review of the literature on sentiment classification and the AI-based techniques used for it, in order to better understand how to design and develop a robust and more accurate sentiment classifier that can correctly distinguish, with high accuracy, social media text in a given context between hate speech and inverted compliments. We evaluated over 250 articles from digital sources such as ScienceDirect, ACM, Google Scholar, and IEEE Xplore and narrowed them down to 31 studies. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources such as Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques such as CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. Python outperformed Java for sentiment analyzer development owing to its simplicity and its AI library ecosystem. Based on these findings, we make recommendations for future research.
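To make the finding on hybrid architectures concrete, the following is a minimal sketch of a CNN+LSTM sentiment classifier in TensorFlow/Keras, the family of models the review found strongest. The vocabulary size, sequence length, layer widths, and the binary hate-speech target are illustrative assumptions, not values taken from the reviewed studies.

from tensorflow.keras import layers, models

VOCAB_SIZE, SEQ_LEN = 20000, 100  # assumed tokenizer settings

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN,), dtype="int32"),
    layers.Embedding(VOCAB_SIZE, 128),        # token embeddings
    layers.Conv1D(64, 5, activation="relu"),  # CNN stage: local n-gram features
    layers.MaxPooling1D(pool_size=2),
    layers.LSTM(64),                          # LSTM stage: long-range context
    layers.Dense(1, activation="sigmoid"),    # 1 = hate speech, 0 = otherwise
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()

A CNN+GRU or CNN+BERT variant would swap out the recurrent stage or the embedding stage accordingly.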

Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text

Procedia PDF Downloads 103
3083 Biomass Enhancement of Stevia (Stevia rebaudiana Bertoni) Shoot Culture in Temporary Immersion System (TIS) RITA® Bioreactor Optimized in Two Different Immersion Periods

Authors: Agustine Melviana, Rizkita Esyanti

Abstract:

The stevia plant contains steviol glycosides, which are estimated to be 300 times sweeter than sucrose. In Indonesia, however, conventional (in vivo) propagation of Stevia rebaudiana has proved ineffective, yielding poor results, so an alternative propagation method is needed; one option is in vitro culture. A large quantity of stevia biomass can be multiplied in a relatively short period using TIS RITA® (Recipient for Automated Temporary Immersion System). The objective of this study was to evaluate the effect of the medium immersion period on growth and on the bioconversion of medium into shoot biomass. Shoot cultures of S. rebaudiana were grown in full-strength MS medium supplemented with 1 ppm kinetin. RITA® bioreactors were set up with two different immersion periods, 15 min (RITA® 15) and 30 min (RITA® 30), scheduled every 6 hours, and incubated for 21 days. The results indicated that the immersion period affected both biomass and growth rate (µ). The 30-minute immersion showed a greater percentage of shoot multiplication (93.44 ± 0.83%), percentage of leaf growth (85.24 ± 5.99%), growth rate (0.042 ± 0.001 g/day), and productivity (0.066 g/L medium/day) than the 15-minute immersion (76.90 ± 4.85%, 79.73 ± 7.76%, 0.045 ± 0.004 g/day, and 0.045 g/L medium/day, respectively). Biomass gain reached 1.702 ± 0.114 g in RITA® 30, versus only 0.953 ± 0.093 g in RITA® 15. Additionally, the consumption patterns of sucrose, minerals, and inorganic compounds followed the growth of plant biomass in both systems. In conclusion, the bioconversion efficiency from medium to biomass is better in RITA® 30 than in RITA® 15.
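The abstract reports growth rate in g/day and productivity in g/L medium/day; the short Python sketch below shows the calculations those units imply. The 21-day incubation and the final RITA® 30 biomass come from the abstract, while the initial biomass and medium volume are assumed for illustration only.

def growth_rate(w0_g, wf_g, days):
    # Mean biomass gain per day (g/day), as the abstract's unit implies
    return (wf_g - w0_g) / days

def productivity(w0_g, wf_g, volume_l, days):
    # Biomass gained per litre of medium per day (g/L medium/day)
    return (wf_g - w0_g) / (volume_l * days)

W0, WF, DAYS, VOLUME_L = 0.80, 1.702, 21, 0.65  # W0 and VOLUME_L are assumptions

print(f"growth rate:  {growth_rate(W0, WF, DAYS):.3f} g/day")
print(f"productivity: {productivity(W0, WF, VOLUME_L, DAYS):.3f} g/L medium/day")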

Keywords: intensity period, shoot culture, Stevia rebaudiana, TIS RITA®

Procedia PDF Downloads 236
3082 The Nature and Impacts of 2015 Indian Unofficial Blockade in Nepal

Authors: Jhabakhar Aryal, Kesh Bahadur Rana, Durga Prasad Neupane

Abstract:

This research analyzes the nature and impacts of the 2015 unofficial blockade of Nepal, a significant event that triggered an economic and humanitarian crisis. While official channels denied any involvement, Nepal perceived the blockade as orchestrated by India over concerns about the newly adopted constitution and alleged infringements of Madheshi rights. The study adopts a qualitative approach, using semi-structured interviews, document analysis, and content analysis to gather data from various perspectives. Employing a "colonial hangover" lens, it investigates whether colonial legacies continue to influence postcolonial inter-state dynamics, focusing on India's potential attempt to exert influence over Nepal. The findings suggest that the 2015 blockade had profound consequences for Nepal, potentially reflecting lingering colonial power dynamics in the region. Despite India's denials, a significant portion of Nepalis perceived the blockade as an act of external pressure. Examining these perceptions offers valuable insights into postcolonial relations and their impact on regional stability. The 2015 unofficial blockade serves as a critical case study for understanding the complex interplay of internal dynamics, external influences, and historical legacies in shaping the geopolitics of the region. This research contributes to a deeper understanding of these factors and their ongoing implications for Nepal and its relationship with India.

Keywords: blockade, unofficial, constitution, Madhesis, India, Nepal, postcolonial, regional stability, geopolitics

Procedia PDF Downloads 46
3081 User-Awareness from Eye Line Tracing During Specification Writing to Improve Specification Quality

Authors: Yoshinori Wakatake

Abstract:

Many defects found after the release of software packages are caused by the omission of necessary test items from test specifications. Poor test specifications are detected by manual review, which imposes a high human load. Preventing omissions depends on the end-user awareness of test specification writers: if test specifications were written while envisioning the behavior of end-users, the number of omitted test items would be greatly reduced. This paper draws attention to the fact that writers who can achieve this differ from those who cannot, not only in the richness of their descriptions but also in their gaze behavior. It proposes a method to estimate the degree of user-awareness of writers by analyzing their gaze information while they write test specifications. We conduct an experiment to obtain the gaze information of test specification writers, and the resulting specifications are automatically classified from this gaze information using a Random Forest model, with high accuracy. By examining the explanatory variables that prove important, we identify the behavioral features that distinguish high-quality test specifications from others: pupil diameter and the number and duration of blinks. The paper also investigates the automatically classified test specifications to discuss the writing characteristics of each quality level. The proposed method enables automatic classification of test specifications, and it helps prevent test item omissions because it reveals the writing features that high-quality test specifications should satisfy.
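A minimal sketch of the classification step follows, using scikit-learn's RandomForestClassifier on the three gaze features the paper identifies (pupil diameter, blink count, blink duration). The synthetic data, feature distributions, and hyperparameters are illustrative assumptions; only the choice of model and features follows the abstract.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
# Feature columns: mean pupil diameter (mm), blink count, mean blink duration (ms)
X = np.column_stack([
    rng.normal(3.5, 0.5, n),
    rng.poisson(15, n),
    rng.normal(150.0, 30.0, n),
])
y = rng.integers(0, 2, n)  # 1 = high-quality specification, 0 = otherwise

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
print("importances:", clf.feature_importances_)  # which gaze features matter most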

Keywords: blink, eye tracking, gaze information, pupil diameter, quality improvement, specification document, user-awareness

Procedia PDF Downloads 52