Search results for: ion torrent personal genome machine (PGM)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5298

2628 Perception of Violence through the Drawing: A Research with Mexican University Students

Authors: Yessica Martinez Soto, Cesar E. Jimenez Yanez, Margarita Barak Velasquez, Yaralin Aceves Villanueva

Abstract:

The presence of violent behavior in society is growing rapidly, which causes people to live in an environment of constant tension due to fear of becoming victims of violent acts. It is up to social scientists to carry out analyses in this regard in order to identify the different ways in which violence is normalized among people. This research work focuses on investigating the perception of violence among Mexican university students through the technique of drawing. To carry out this research, we worked with 67 university students from the Autonomous University of Baja California in Mexico, who drew an image of how they understood the concept of violence. Their works showed us a variety of emotions, actions, and elements that they relate and link to violence. One of the methodological tools for recognizing and establishing the link between the knowledge of a concept in discourse and in practice is graphic representation, that is, drawing. Although a drawing gives us a personal interpretation of the reality of each artist, the repetition of elements and the representation of similar situations allowed us to identify the degrees of incidence of the different types of violence and the areas in which it manifests itself.

Keywords: college students, Mexico, social representations, violence

Procedia PDF Downloads 218
2627 Using Discrete Event Simulation Approach to Reduce Waiting Times in Computed Tomography Radiology Department

Authors: Mwafak Shakoor

Abstract:

The purpose of this study was to reduce patient waiting times, improve system throughput, and improve resource utilization in the radiology department. A discrete event simulation model was developed using Arena simulation software to investigate different alternatives for improving overall system delivery, based on scenarios that add resources, given the linkage between patient waiting times and resource availability. The study revealed that no additional investment is needed to procure an additional scanner; instead, hospital management can deploy managerial tactics to enhance machine utilization and reduce the long waiting times in the department.
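
As an illustration of the queueing logic behind such a model (the study itself used Arena), a minimal discrete event simulation sketch in Python with SimPy could look as follows; all parameter values are hypothetical:

```python
import random
import simpy

SCAN_TIME_MIN = 12       # assumed mean CT scan duration (minutes)
ARRIVAL_MEAN_MIN = 15    # assumed mean patient inter-arrival time (minutes)
NUM_SCANNERS = 1         # resource capacity under test

waiting_times = []

def patient(env, scanner):
    arrival = env.now
    with scanner.request() as req:   # queue for the CT scanner
        yield req
        waiting_times.append(env.now - arrival)
        yield env.timeout(random.expovariate(1 / SCAN_TIME_MIN))

def arrivals(env, scanner):
    while True:
        yield env.timeout(random.expovariate(1 / ARRIVAL_MEAN_MIN))
        env.process(patient(env, scanner))

env = simpy.Environment()
scanner = simpy.Resource(env, capacity=NUM_SCANNERS)
env.process(arrivals(env, scanner))
env.run(until=8 * 60)                # simulate one 8-hour shift

print(f"Mean wait: {sum(waiting_times) / len(waiting_times):.1f} min "
      f"({len(waiting_times)} patients scanned)")
```

Re-running the same model with NUM_SCANNERS = 2, or with changed scan times, is the kind of what-if comparison the study performs with its added-resource scenarios.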

Keywords: discrete event simulation, radiology department, arena, waiting time, healthcare modeling, computed tomography

Procedia PDF Downloads 581
2626 Investigating the Encouraging Factors for Scholarly Works Contribution towards Institutional Repository: A Case Study at a Malaysian University

Authors: Mohd Rashid bin Ab Hamid, Noor Azura binti Omar, Zainol Bin Mustafa

Abstract:

Purpose: The aim of this paper is to study the factors encouraging the contribution of scholarly works to an institutional repository among academicians at a Malaysian university. Methods: This paper uses a questionnaire to collect data on respondents' perceptions of the institutional repository efforts at the university under study. Several encouraging factors were identified and measured using descriptive statistics. The factors relate to content contribution, i.e. personal, professional, organizational and technological factors. Findings: The study found that all four encouraging factors are related to the contribution of scholarly works at the university by academicians. Research limitations: This study is based on a single case, and generalization to all Malaysian universities should be made with caution. Practical implications: The university library should look into these four encouraging factors in order to enhance contributions from academicians towards the repository. Originality/value: This paper provides basic information for knowledge management officers at the university to guide further efforts to attract more contributions.

Keywords: institutional repository, information retrieval, information storage and retrieval

Procedia PDF Downloads 549
2625 Intuitive Decision Making When Facing Risks

Authors: Katharina Fellnhofer

Abstract:

The more information and knowledge that technology provides, the more important are profoundly human skills like intuition, the skill of using nonconscious information. As our world becomes more complex, shaken by crises, and characterized by uncertainty, time pressure, ambiguity, and rapidly changing conditions, intuition is increasingly recognized as a key human asset. However, due to methodological limitations of sample size or time frame or a lack of real-world or cross-cultural scope, precisely how to measure intuition when facing risks on a nonconscious level remains unclear. In light of the measurement challenge related to intuition’s nonconscious nature, a technique is introduced to measure intuition via hidden images as nonconscious additional information to trigger intuition. This technique has been tested in a within-subject fully online design with 62,721 real-world investment decisions made by 657 subjects in Europe and the United States. Bayesian models highlight the technique’s potential to measure skill at using nonconscious information for conscious decision making. Over the long term, solving the mysteries of intuition and mastering its use could be of immense value in personal and organizational decision-making contexts.

Keywords: cognition, intuition, investment decisions, methodology

Procedia PDF Downloads 71
2624 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, a limitation to the time constraints provided for quality assurance of complex software systems. Hence, a computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation to predict the implementation of unknown test case specifications. Based on this support, a test engineer solely has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge by means of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is ‘Subset Accuracy’. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still 60%, which is better than the current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is thereby independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
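
A rough sketch of the multi-label idea (not the authors' implementation) with scikit-learn, mapping hypothetical test-step text to sets of automation components and reporting subset accuracy:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Hypothetical historical data: test-step text -> automation components used
steps = [
    "Switch on ignition and check dashboard warning lamps",
    "Send CAN message and verify ECU response",
    "Switch on ignition and verify ECU response",
]
components = [
    {"IgnitionDriver", "DashboardChecker"},
    {"CanBusDriver", "EcuVerifier"},
    {"IgnitionDriver", "EcuVerifier"},
]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(components)            # multi-label indicator matrix

model = make_pipeline(
    TfidfVectorizer(),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(steps, Y)

# Subset accuracy: a prediction counts only if *all* labels match exactly
print("Subset accuracy:", accuracy_score(Y, model.predict(steps)))
print(mlb.inverse_transform(model.predict(["Send CAN message and check lamps"])))
```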

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 115
2623 Operator Efficiency Study for Assembly Line Optimization at Semiconductor Assembly and Test

Authors: Rohana Abdullah, Md Nizam Abd Rahman, Seri Rahayu Kamat

Abstract:

The operator efficiency aspect is gaining importance in ensuring the optimized usage of resources, especially in the semi-automated manufacturing environment. This paper addresses a case study carried out to solve operator efficiency and line balancing issues at a semiconductor assembly and test manufacturing facility. A Man-to-Machine (M2M) work study technique is used to study the operators' current utilization and determine the optimum allocation of operators to machines. Critical factors such as operator activity, activity frequency, and operator competency level are considered to gain insight into the parameters that affect operator utilization. Equipment standard time and overall equipment effectiveness (OEE) information are also gathered and analyzed to achieve a balanced and optimized production.
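
For reference, the OEE figure the study gathers is conventionally the product Availability x Performance x Quality; a minimal sketch with hypothetical shift data:

```python
def oee(planned_time_min, downtime_min, ideal_cycle_time_min, total_count, good_count):
    """Overall Equipment Effectiveness = Availability x Performance x Quality."""
    run_time = planned_time_min - downtime_min
    availability = run_time / planned_time_min
    performance = (ideal_cycle_time_min * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Hypothetical data for one machine over an 8-hour shift
print(f"OEE = {oee(480, 45, 0.8, 500, 488):.1%}")
```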

Keywords: operator efficiency, optimized production, line balancing, industrial and manufacturing engineering

Procedia PDF Downloads 715
2622 Power Control of DFIG in WECS Using Backstepping and Sliding Mode Controller

Authors: Abdellah Boualouch, Ahmed Essadki, Tamou Nasser, Ali Boukhriss, Abdellatif Frigui

Abstract:

This paper presents a power control scheme for a Doubly Fed Induction Generator (DFIG) used in a Wind Energy Conversion System (WECS) connected to the grid. The proposed control strategy employs two nonlinear controllers, a backstepping controller (BSC) and a sliding-mode controller (SMC), to directly calculate the required rotor control voltage so as to eliminate the instantaneous errors of active and reactive power. The advantages of BSC and SMC are presented, and the performance and robustness of the two controllers are compared. First, we present a model of the wind turbine and the DFIG machine, then a synthesis of the controllers and their application to DFIG power control. Simulation results for a 1.5 MW grid-connected DFIG system are provided using MATLAB/Simulink.
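
As a generic illustration of the sliding-mode principle only (not the rotor-voltage controller synthesized in the paper), a minimal sketch that drives a tracking error to zero under a bounded disturbance, with an assumed switching gain and surface slope:

```python
import numpy as np

dt, steps = 1e-3, 5000
K, lam = 10.0, 2.0          # assumed switching gain and sliding-surface slope
e, e_dot = 1.0, 0.0         # initial tracking error (e.g. normalized active-power error)

for k in range(steps):
    s = e_dot + lam * e                  # sliding surface s = e_dot + lambda * e
    u = -K * np.sign(s)                  # discontinuous switching control
    disturbance = 2.0 * np.sin(k * dt)   # bounded unknown disturbance
    e_ddot = u + disturbance             # simple double-integrator error dynamics
    e_dot += e_ddot * dt
    e += e_dot * dt

print(f"Tracking error after {steps * dt:.0f} s: {e:.4f}")
```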

Keywords: backstepping, DFIG, power control, sliding-mode, WECS

Procedia PDF Downloads 582
2621 Object Detection Based on Plane Segmentation and Features Matching for a Service Robot

Authors: António J. R. Neves, Rui Garcia, Paulo Dias, Alina Trifan

Abstract:

With the aging of the world population and the continuous growth in technology, service robots are increasingly explored nowadays as alternatives to healthcare givers or personal assistants for elderly or disabled people. Any service robot should be capable of interacting with the human companion, receiving commands, navigating through the environment, either known or unknown, and recognizing objects. This paper proposes an approach to object recognition based on the use of depth information and color images for a service robot. We present a study of two of the most used methods for object detection, where 3D data is used to detect the position of the objects to be classified, which are found on horizontal surfaces. Since most of the objects of interest accessible to service robots are on these surfaces, the proposed 3D segmentation reduces the processing time and simplifies the scene for object recognition. The first approach to object recognition is based on color histograms, while the second is based on the use of the SIFT and SURF feature descriptors. We present comparative experimental results obtained with a real service robot.
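
A minimal sketch of the first, color-histogram approach with OpenCV; the image file names are hypothetical and the histogram settings illustrative:

```python
import cv2

def hsv_histogram(path, bins=(8, 8, 8)):
    """Build a normalized HSV colour histogram for one object image."""
    image = cv2.imread(path)
    hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1, 2], None, bins, [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()

# Hypothetical files: one stored model per object class, one segmented tabletop region
model = hsv_histogram("mug_model.png")
query = hsv_histogram("tabletop_segment.png")

# Correlation close to 1.0 suggests the segmented object matches the stored model
score = cv2.compareHist(model, query, cv2.HISTCMP_CORREL)
print(f"Histogram similarity: {score:.3f}")
```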

Keywords: object detection, feature descriptors, SIFT, SURF, depth images, service robots

Procedia PDF Downloads 529
2620 Online Authenticity Verification of a Biometric Signature Using Dynamic Time Warping Method and Neural Networks

Authors: Gałka Aleksandra, Jelińska Justyna, Masiak Albert, Walentukiewicz Krzysztof

Abstract:

An offline signature is a well-known, however not the safest, way to verify identity. Nowadays, to ensure proper authentication, e.g. in banking systems, multimodal verification is more widely used. In this paper, online signature analysis based on dynamic time warping (DTW) coupled with machine learning approaches is presented. In our research, signatures made with biometric pens were gathered. Signature features, as well as their forgeries, are described. For verification of authenticity, various methods were used, including convolutional neural networks using the DTW matrix and a multilayer perceptron using sums of DTW matrix paths. System efficiency was evaluated on signatures and signature forgeries collected on the same day. Results are presented and discussed in this paper.
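
For clarity, a minimal sketch of the DTW cost computation on two short, hypothetical pen-signal sequences (the study applies it to full biometric-pen signatures and feeds the resulting matrices to neural networks):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping cost between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Hypothetical pen-pressure traces from a reference signature and a questioned signature
reference = np.array([0.1, 0.4, 0.8, 0.9, 0.5, 0.2])
questioned = np.array([0.1, 0.3, 0.7, 0.9, 0.9, 0.4, 0.1])
print(f"DTW cost: {dtw_distance(reference, questioned):.2f}")
```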

Keywords: dynamic time warping, handwritten signature verification, feature-based recognition, online signature

Procedia PDF Downloads 152
2619 BART Matching Method: Using Bayesian Additive Regression Tree for Data Matching

Authors: Gianna Zou

Abstract:

Propensity score matching (PSM), introduced by Paul R. Rosenbaum and Donald Rubin in 1983, is a popular statistical matching technique which tries to estimate treatment effects by taking into account covariates that could impact the efficacy of study medication in clinical trials. PSM can be used to reduce the bias due to confounding variables. However, PSM assumes that the response values are normally distributed. In some cases, this assumption may not hold. In this paper, a machine learning method, Bayesian Additive Regression Tree (BART), is used as a more robust method of matching. BART can work well when models are misspecified, since it can be used to model heterogeneous treatment effects. Moreover, it has the capability to handle non-linear main effects and multiway interactions. In this research, a BART Matching Method (BMM) is proposed to provide a more reliable matching method than PSM. By comparing the analysis results from PSM and BMM, we show that BMM performs well and has better prediction capability when the response values are not normally distributed.
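
As a point of reference, the PSM baseline that BMM is compared against can be sketched on synthetic data as below (illustrative only; the paper's BART-based matching itself is not reproduced here):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical trial data: two covariates, confounded binary treatment, continuous outcome
X = rng.normal(size=(200, 2))
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))
y = 2.0 * treated + X[:, 0] + rng.normal(scale=0.5, size=200)   # true effect = 2.0

# Propensity score: probability of treatment given covariates
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 1:1 nearest-neighbour matching on the propensity score
controls = np.where(treated == 0)[0]
effects = []
for i in np.where(treated == 1)[0]:
    j = controls[np.argmin(np.abs(ps[controls] - ps[i]))]
    effects.append(y[i] - y[j])

print(f"Estimated treatment effect (PSM): {np.mean(effects):.2f}")
```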

Keywords: BART, Bayesian, matching, regression

Procedia PDF Downloads 135
2618 Classification Based on Deep Neural Cellular Automata Model

Authors: Yasser F. Hassan

Abstract:

Deep learning is a branch of machine learning science with great achievements in research and applications. Cellular neural networks are regarded as arrays of nonlinear analog processors, called cells, connected in a way that allows parallel computation. The paper discusses how to use a deep learning structure to represent a neural cellular automata model. The proposed learning technique in the cellular automata model is examined from the perspective of the deep learning structure. A deep neural cellular automata system modifies each neuron based on the behavior of the individual and its decision as a result of multi-level deep structure learning. The paper presents the architecture of the model, and the results of simulating the approach are given. Results from the implementation enrich the deep neural cellular automata system and shed light on the concept formulation of the model and the learning in it.

Keywords: cellular automata, neural cellular automata, deep learning, classification

Procedia PDF Downloads 175
2617 The Application of a Hybrid Neural Network for Recognition of a Handwritten Kazakh Text

Authors: Almagul Assainova, Dariya Abykenova, Liudmila Goncharenko, Sergey Sybachin, Saule Rakhimova, Abay Aman

Abstract:

The recognition of handwritten Kazakh text is a relevant objective today for the digitization of materials. The study presents a hybrid neural network model for handwriting recognition, which includes a convolutional neural network and a multi-layer perceptron. Each network includes 1024 input neurons and 42 output neurons. The model is implemented in a program written in the Python programming language using the EMNIST database and the NumPy, Keras, and TensorFlow modules. The neural network was trained on the specific letters of the Kazakh alphabet ә, ғ, қ, ң, ө, ұ, ү, h, і. The neural network model and the program created on its basis can be used in electronic document management systems to digitize Kazakh text.
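
A minimal Keras sketch of such a hybrid architecture, assuming 32x32 grayscale inputs (matching the 1024 input neurons reported) and 42 output classes; the layer sizes are illustrative, not the authors' exact configuration:

```python
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 42             # output neurons reported in the abstract
INPUT_SHAPE = (32, 32, 1)    # assumed: 1024 input neurons = 32x32 grayscale images

# Convolutional branch feeding a multilayer-perceptron head
model = keras.Sequential([
    keras.Input(shape=INPUT_SHAPE),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),    # MLP part of the hybrid network
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(x_train, y_train, epochs=10, validation_split=0.1)  # EMNIST-style data
```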

Keywords: handwriting recognition system, image recognition, Kazakh font, machine learning, neural networks

Procedia PDF Downloads 248
2616 Application of Deep Learning in Top Pair and Single Top Quark Production at the Large Hadron Collider

Authors: Ijaz Ahmed, Anwar Zada, Muhammad Waqas, M. U. Ashraf

Abstract:

We demonstrate the performance of a very efficient tagger, based on deep neural network algorithms, applied to hadronically decaying top quark pairs as signal and compared with QCD multi-jet background events. A significant enhancement of performance in boosted top quark events is observed with our limited computing resources. We also compare modern machine learning approaches and perform a multivariate analysis of boosted top-pair as well as single top quark production through the weak interaction at a √s = 14 TeV proton-proton collider. The most relevant known background processes are incorporated. Through the techniques of Boosted Decision Trees (BDT), likelihood, and Multilayer Perceptron (MLP), the analysis is trained to assess the performance in comparison with the conventional cut-and-count approach.

Keywords: top tagger, multivariate, deep learning, LHC, single top

Procedia PDF Downloads 98
2615 Like Life Itself: Elemental Affordances in the Creation of Transmedia Storyworlds-The Four Broken Hearts Case Study

Authors: Muhammad Babar Suleman

Abstract:

Transgressing the boundaries of the real and the virtual, the temporal and the spatial and the personal and the political, Four Broken Hearts is a hybrid storyworld encompassing film, live performance, location-based experiences and social media. The project is scheduled for launch early next year and is currently a work-in-progress undergoing initial user testing. The story of Four Broken Hearts is being told by taking each of the classic elements of fiction- character, setting, exposition, climax and denouement - and bringing them ‘to life’ in the medium that conveys them to the highest degree of mimesis: Characters are built and explored through social media, Setting is experienced through location-based storytelling, the Backstory is fleshed out using film and the Climax is performed as an immersive drama. By taking advantage of what each medium does best while complementing the other mediums, Four Broken Hearts is presented in the form of a rich transmedia experience that allows audiences to explore the story world across many different platforms while still tying it all together within a cohesive narrative. This article presents an investigation of the project’s narrative outputs produced so far.

Keywords: narratology, storyworld, transmedia, narrative, storytelling

Procedia PDF Downloads 292
2614 Linac Quality Controls Using An Electronic Portal Imaging Device

Authors: Domingo Planes Meseguer, Raffaele Danilo Esposito, Maria Del Pilar Dorado Rodriguez

Abstract:

Monthly quality control checks for a radiation therapy linac may be performed in a simple and efficient way once they have been standardized and protocolized. On the other hand, these checks, despite being mandatory, require non-negligible execution times in terms of machine time and operator time. In addition, the amount of disposable material that may be needed, together with the use of commercial software to perform them, must be taken into account. With the aim of optimizing and standardizing the mechanical-geometric checks and the multileaf collimator checks, we decided to implement a protocol which makes use of the Electronic Portal Imaging Device (EPID) available on our linacs. The user is guided step by step by the software during the whole procedure. Acquired images are automatically analyzed by our programs, all of them written using only free software.

Keywords: quality control checks, linac, radiation oncology, medical physics, free software

Procedia PDF Downloads 185
2613 Understanding Context and Its Effects in the Implementation of Modern Foreign Language Curriculum in Vietnam

Authors: Ngoc T. Bui

Abstract:

The key issue for teachers of a modern foreign language is the creation of a pedagogic environment, and this means that an understanding of context is vital. A pedagogic environment addresses the following: time, feedback, relations with other people, curriculum integration, forms of knowledge, resources, and control in the pedagogic relationship. In this light, this multiple case study of the implementation of a modern foreign language curriculum focuses on exploring Vietnamese contexts and participants' perceptions of factors that may affect the implementation process, in order to examine thoroughly how the communicative language teaching (CLT) curriculum is being implemented in second language classrooms. A mixed methods approach is utilized to investigate contextual and personal factors that may affect teachers' implementation of curriculum and pedagogical reform in Vietnam. This project therefore has the capability to provide stakeholders with useful information and to identify further changes and measures to solve potential problems and ensure the achievement of the curriculum goals. The expected outcomes may also lead to intercultural language teaching guidelines to support English as a foreign language (EFL) teachers with curriculum design, planning, and how to create a pedagogic environment in which to best implement it.

Keywords: communicative language teaching, context, curriculum implementation, modern foreign language, pedagogic environment

Procedia PDF Downloads 253
2612 Features for Measuring Credibility on Facebook Information

Authors: Kanda Runapongsa Saikaew, Chaluemwut Noyunsan

Abstract:

Nowadays social media information, such as news, links, images, or videos, is shared extensively. However, the effectiveness of disseminating information through social media is undermined by quality problems: less fact checking, more bias, and frequent rumors. Many researchers have investigated credibility on Twitter, but there are no research reports about information credibility on Facebook. This paper proposes features for measuring the credibility of Facebook information. We developed a system for assessing credibility on Facebook. First, we developed an FB credibility evaluator for measuring the credibility of each post by manual human labelling. We then collected the training data for creating a model using a Support Vector Machine (SVM). Secondly, we developed a Chrome extension of FB credibility for Facebook users to evaluate the credibility of each post. Based on the usage analysis of our FB credibility Chrome extension, about 81% of users' responses agree with the credibility suggested automatically by the proposed system.
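
A minimal sketch of an SVM-based credibility classifier; the per-post features and labels below are hypothetical (the actual feature set is the paper's contribution):

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical per-post features: [likes, shares, comments, has_link, account_age_days]
X = [
    [250, 40, 30, 1, 2100],
    [5, 0, 1, 0, 15],
    [80, 12, 9, 1, 900],
    [3, 1, 0, 0, 30],
]
y = [1, 0, 1, 0]   # manually labelled: 1 = credible, 0 = not credible

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, y)
print(model.predict([[120, 20, 15, 1, 1500]]))   # credibility of a new post
```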

Keywords: facebook, social media, credibility measurement, internet

Procedia PDF Downloads 343
2611 Experimental and Analytical Dose Assessment of Patient's Family Members Treated with I-131

Authors: Marzieh Ebrahimi, Vahid Changizi, Mohammad Reza Kardan, Seyed Mahdi Hosseini Pooya, Parham Geramifar

Abstract:

Radiation exposure to the patient's family members is one of the major concerns during thyroid cancer radionuclide therapy. The aim of this study was to measure the total effective dose to family members by means of thermoluminescence personal dosimeters and to compare it with doses calculated by analytical methods. Eighty-five adult family members of fifty-one patients volunteered to participate in this study. Considering the range of dose rates from 15 µSv/h to 120 µSv/h at the patients' release time, the calculated mean and median dose values of family members were 0.45 mSv and 0.28 mSv, respectively. Moreover, almost all family members' doses were measured to be less than the dose constraint of 5 mSv recommended by the Basic Safety Standards. Considering influence parameters such as patient dose rate and administered activity, the total effective doses of family members were calculated by the TEDE and NRC formulas and compared with the experimental results. The results indicated that it is fruitful to use quantitative calculations when releasing patients treated with I-131 and for correct estimation of patients' family doses.
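
As an illustration of the analytical side, a sketch of an NRC Regulatory Guide 8.39-style estimate of the dose to a bystander; both the formula structure and the I-131 parameter values below are quoted from memory and should be verified against the guide before any clinical use:

```python
def nrc_dose_rem(gamma_const, activity_mci, half_life_days, occupancy, distance_cm):
    """Conservative physical-decay-only estimate of the total dose (rem) to a person
    spending a fraction `occupancy` of the time at `distance_cm` from the patient."""
    return 34.6 * gamma_const * activity_mci * half_life_days * occupancy / distance_cm ** 2

# Assumed I-131 values: exposure-rate constant ~2.2 R*cm^2/(mCi*h),
# physical half-life 8.04 d, 25% occupancy at 1 m from the patient, 30 mCi administered
dose = nrc_dose_rem(2.2, 30.0, 8.04, 0.25, 100.0)
print(f"Estimated family-member dose: {dose:.2f} rem (~{dose * 10:.1f} mSv)")
```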

Keywords: effective dose, thermoluminescence, I-131, thyroid cancer

Procedia PDF Downloads 383
2610 Phthalate Exposure among Roma Population in Slovakia

Authors: Miroslava Šidlovská, Ida Petrovičová, Tomáš Pilka, Branislav Kolena

Abstract:

Phthalates are ubiquitous environmental pollutants, well known for their endocrine-disrupting activity in the human organism. The aim of our study was to investigate, by biological monitoring, the exposure to phthalates of a Roma ethnic group, i.e. children and adults from 5 families (n=29, average age 11.8 ± 7.6 years) living in western Slovakia. Additionally, we analysed associations between anthropometric measures, questionnaire data (i.e. socio-economic status, eating and drinking habits, use of personal care products and household conditions) and the concentrations of phthalate metabolites. We used high performance liquid chromatography and tandem mass spectrometry (HPLC-MS/MS) to analyse urine samples and determine the concentrations of the phthalate metabolites monoethyl phthalate (MEP), mono-n-butyl phthalate (MnBP), mono-iso-butyl phthalate (MiBP), mono(2-ethyl-5-hydroxyhexyl) phthalate (5OH-MEHP), mono(2-ethyl-5-oxohexyl) phthalate (5oxo-MEHP) and mono(2-ethylhexyl) phthalate (MEHP). Our results indicate that ethnicity, lower socioeconomic status and different housing conditions in the Roma population can affect the urinary concentration of phthalate metabolites.

Keywords: biomonitoring, ethnicity, human exposure, phthalate metabolites

Procedia PDF Downloads 288
2609 Competitive Advantages of a Firm without Fundamental Technology: A Case Study of Sony, Casio and Nintendo

Authors: Kiyohiro Yamazaki

Abstract:

The purpose of this study is to examine how a firm without fundamental technology is able to gain a competitive advantage. This paper examines three case studies: Sony in the flat display TV industry, Casio in the digital camera industry, and Nintendo in the home game machine industry. This paper maintains that firms without fundamental technology construct two advantages: an economic advantage and an organizational advantage. The economic advantage means that the firm can select either high-tech or cheap devices from several device makers and can change between alternatives cheaply and quickly. The organizational advantage means that a firm without fundamental technology is not restricted by organizational inertia and cognitive restraints, and can exercise this characteristic as a strength.

Keywords: firm without fundamental technology, economic advantage, organizational advantage, Sony, Casio, Nintendo

Procedia PDF Downloads 275
2608 Association of Genetically Proxied Cholesterol-Lowering Drug Targets and Head and Neck Cancer Survival: A Mendelian Randomization Analysis

Authors: Danni Cheng

Abstract:

Background: Preclinical and epidemiological studies have reported potential protective effects of low-density lipoprotein cholesterol (LDL-C) lowering drugs on head and neck squamous cell cancer (HNSCC) survival, but the causality was not consistent. Genetic variants associated with LDL-C lowering drug targets can predict the effects of their therapeutic inhibition on disease outcomes. Objective: We aimed to evaluate the causal association of genetically proxied cholesterol-lowering drug targets and circulating lipid traits with cancer survival in HNSCC patients stratified by human papillomavirus (HPV) status, using two-sample Mendelian randomization (MR) analyses. Method: Single-nucleotide polymorphisms (SNPs) in the gene regions of LDL-C lowering drug targets (HMGCR, NPC1L1, CETP, PCSK9, and LDLR) associated with LDL-C levels in a genome-wide association study (GWAS) from the Global Lipids Genetics Consortium (GLGC) were used to proxy LDL-C lowering drug action. SNPs proxying circulating lipids (LDL-C, HDL-C, total cholesterol, triglycerides, apolipoprotein A and apolipoprotein B) were also derived from the GLGC data. Genetic associations of these SNPs with cancer survival were derived from 1,120 HPV-positive oropharyngeal squamous cell carcinoma (OPSCC) and 2,570 non-HPV-driven HNSCC patients in the VOYAGER program. We estimated the causal associations of LDL-C lowering drugs and circulating lipids with HNSCC survival using the inverse-variance weighted method. Results: Genetically proxied HMGCR inhibition was significantly associated with worse overall survival (OS) in non-HPV-driven HNSCC patients (inverse-variance weighted hazard ratio (HR IVW), 2.64 [95% CI, 1.28-5.43]; P = 0.01) but better OS in HPV-positive OPSCC patients (HR IVW, 0.11 [95% CI, 0.02-0.56]; P = 0.01). Estimates for NPC1L1 were strongly associated with worse OS in both total HNSCC (HR IVW, 4.17 [95% CI, 1.06-16.36]; P = 0.04) and non-HPV-driven HNSCC patients (HR IVW, 7.33 [95% CI, 1.63-32.97]; P = 0.01). Similarly, genetically proxied PCSK9 inhibition was significantly associated with poor OS in non-HPV-driven HNSCC (HR IVW, 1.56 [95% CI, 1.02-2.39]). Conclusion: Genetically proxied long-term HMGCR inhibition was significantly associated with decreased OS in non-HPV-driven HNSCC and increased OS in HPV-positive OPSCC, while genetically proxied NPC1L1 and PCSK9 inhibition was associated with worse OS in total and non-HPV-driven HNSCC patients. Further research is needed to understand whether these drugs have consistent associations with head and neck tumor outcomes.
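
For readers unfamiliar with the method, the fixed-effect inverse-variance weighted estimator used here can be sketched on hypothetical per-SNP summary statistics as follows:

```python
import numpy as np

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Fixed-effect inverse-variance weighted Mendelian randomization estimate."""
    bx = np.asarray(beta_exposure)
    by = np.asarray(beta_outcome)
    w = 1.0 / np.asarray(se_outcome) ** 2            # weights from outcome standard errors
    beta_ivw = np.sum(w * bx * by) / np.sum(w * bx ** 2)
    se_ivw = np.sqrt(1.0 / np.sum(w * bx ** 2))
    return beta_ivw, se_ivw

# Hypothetical per-SNP effects on LDL-C (exposure) and on log-HR of survival (outcome)
bx = [0.12, 0.08, 0.15, 0.10]
by = [0.060, 0.035, 0.080, 0.045]
se = [0.020, 0.025, 0.030, 0.022]

beta, se_beta = ivw_estimate(bx, by, se)
print(f"IVW log-HR per unit LDL-C: {beta:.3f} (SE {se_beta:.3f}), HR = {np.exp(beta):.2f}")
```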

Keywords: Mendelian randomization analysis, head and neck cancer, cancer survival, cholesterol, statin

Procedia PDF Downloads 88
2607 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator in order to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, in the end, the outcome of treatment for every single patient. Therefore, international recommendations strongly advise setting up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, which are used on a daily, weekly, monthly or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated CIRS 062QA phantom and the QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software which enables fast and simple evaluation of CT QA parameters using the phantom provided with the CT simulator. On the other hand, the recommendations contain additional tests, which were done with the CIRS phantom. Also, legislation on ionizing radiation protection requires CT testing at defined intervals. Taking into account the requirements of the law, the built-in tests of the CT simulator, and international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were the following: CT number accuracy, field uniformity, the complete CT to ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within +/- 5 HU of the value at commissioning; field uniformity within +/- 10 HU in selected ROIs; the complete CT to ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of not more than 5%; spatial and contrast resolution tests must comply with the tests obtained at commissioning, otherwise the machine requires service; the result of the image noise test must fall within 20% of the baseline value; slice thickness must meet manufacturer specifications; and patient table stability under longitudinal transfer of a loaded table must not show more than 2 mm vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of the CT simulator's functionality and its clinical effectiveness in radiation treatment planning. The legal requirement for the clinic is to set up its own QA programme with minimum testing, but it remains the user's decision whether additional testing, as recommended by international organizations, will be implemented, so as to improve the overall quality of the radiation treatment planning procedure, as the quality of the CT images used for radiation treatment planning influences the delineation of the tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
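
The tolerances listed above lend themselves to simple automation; a minimal sketch comparing hypothetical monthly measurements against those limits:

```python
# Hypothetical monthly measurements checked against the tolerances quoted above
baseline = {"ct_number_water_hu": 0.0, "noise_sd_hu": 5.0}
measured = {"ct_number_water_hu": 3.2, "uniformity_delta_hu": 7.5, "noise_sd_hu": 5.6}

checks = {
    "CT number accuracy (+/- 5 HU)":
        abs(measured["ct_number_water_hu"] - baseline["ct_number_water_hu"]) <= 5.0,
    "Field uniformity (+/- 10 HU)":
        abs(measured["uniformity_delta_hu"]) <= 10.0,
    "Image noise (within 20% of baseline)":
        abs(measured["noise_sd_hu"] - baseline["noise_sd_hu"]) / baseline["noise_sd_hu"] <= 0.20,
}

for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```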

Keywords: CT simulator, radiotherapy, quality control, QA programme

Procedia PDF Downloads 516
2606 Using Computer Vision and Machine Learning to Improve Facility Design for Healthcare Facility Worker Safety

Authors: Hengameh Hosseini

Abstract:

Design of large healthcare facilities - such as hospitals, multi-service line clinics, and nursing facilities - that can accommodate patients with wide-ranging disabilities is a challenging endeavor and one that is poorly understood among healthcare facility managers, administrators, and executives. An even less understood extension of this problem is the implication of weakly or insufficiently accommodative facility design for healthcare workers in physically intensive jobs who may also suffer from a range of disabilities and who are therefore at increased risk of workplace accident and injury. Combine this reality with the vast range of facility types, ages, and designs, and the problem of universal accommodation becomes even more daunting and complex. In this study, we focus on the implications of facility design for healthcare workers with low vision who also have physically active jobs. The points of difficulty are myriad: health service infrastructure, the equipment used in health facilities, and transport to and from appointments and other services can all pose a barrier to health care if they are inaccessible, less accessible, or simply less comfortable for people with various disabilities. We conduct a series of surveys and interviews with employees and administrators of 7 facilities of a range of sizes and ownership models in the Northeastern United States, and combine that corpus with in-facility observations and data collection to identify five major points of failure common to all the facilities that we concluded could pose safety threats to employees with vision impairments, ranging from very minor to severe. We determine that lack of design empathy is a major commonality among facility management and ownership. We subsequently propose three methods for remedying this lack of empathy-informed design and the dangers it poses to employees: the use of an existing open-source Augmented Reality application to simulate the low-vision experience for designers and managers; the use of a machine learning model we develop to automatically infer facility shortcomings from large datasets of recorded patient and employee reviews and feedback; and the use of a computer vision model fine-tuned on images of each facility to infer and predict facility features, locations, and workflows that could again pose meaningful dangers to visually impaired employees of each facility. After conducting a series of real-world comparative experiments with each of these approaches, we conclude that each is a viable solution under particular sets of conditions, and we characterize the range of facility types, workforce composition profiles, and work conditions under which each of these methods would be most apt and successful.

Keywords: artificial intelligence, healthcare workers, facility design, disability, visually impaired, workplace safety

Procedia PDF Downloads 92
2605 Comparison of Tensile Strength and Folding Endurance of (FDM Process) 3D Printed ABS and PLA Materials

Authors: R. Devicharan

Abstract:

Within a short span of time, 3D printing is expected to play a vital role in our lives. The possibilities for creativity and speed in manufacturing through the various 3D printing processes are infinite. This study is performed on the FDM (Fused Deposition Modelling) method of 3D printing, which is one of the predominant 3D printing technologies. The study focuses on the physical properties of the objects produced by 3D printing, which determine the applications of the 3D printed objects. This paper specifically aims at studying the tensile strength and the folding endurance of objects 3D printed by the FDM (Fused Deposition Modelling) method using the ABS (Acrylonitrile Butadiene Styrene) and PLA (Polylactic Acid) plastic materials. The study is performed in a controlled environment with specific machine settings. Appropriate tables and graphs are plotted, and research analysis techniques are utilized to analyse, verify and validate the experimental results.

Keywords: FDM process, 3D printing, ABS for 3D printing, PLA for 3D printing, rapid prototyping

Procedia PDF Downloads 588
2604 Narrative Inquiry into Teachers’ Experiences of Empathy in English Language Teaching

Authors: Yao Chen

Abstract:

Empathy is crucial for teachers working with teenagers in secondary school. Despite this, little attention has been paid to English language teachers' experiences of empathy in class. Empathy contains cognitive, emotional, and behavioral components that are manifested in teaching practice. This qualitative study focused on how Chinese ELT teachers expressed empathy in interaction with students in public high schools and private institutions, and on what factors might lead them to show empathy in different ways. Four participants were invited to attend individual interviews to share stories about their empathic experiences. Classroom observation was conducted to investigate teachers' language use in teaching and their non-verbal communication with students, in order to witness their empathic behavior. Through thematic analysis, three main themes relevant to different types of empathy in teachers' interaction with students were generated: 1) perspective taking, 2) emotional connection, and 3) action taking. Based on the participants' accounts of their personal experiences, the discussion draws conclusions about the reasons for the differences in how they expressed empathy. The results underline the significance of empathy in building rapport with students and motivating their language learning. Further implications for the role of empathy in ELT teachers' professional development are also discussed.

Keywords: teacher empathy, experiences, interaction with students, ELT class

Procedia PDF Downloads 49
2603 Adhesion of Sputtered Copper Thin Films Deposited on Flexible Substrates

Authors: Rwei-Ching Chang, Bo-Yu Su

Abstract:

Adhesion of copper thin films deposited on a polyethylene terephthalate substrate by direct current sputtering with different sputtering parameters is discussed in this work. The effects of plasma treatment for 0, 5, and 10 minutes on the thin film properties are investigated first. Various argon flow rates of 40, 50, and 60 standard cubic centimeters per minute (sccm), deposition powers of 30, 40, and 50 W, and film thicknesses of 100, 200, and 300 nm are also discussed. A 3-dimensional surface profilometer, a micro scratch machine, and an optical microscope are used to characterize the thin film properties. The results show that increasing the plasma treatment time on the polyethylene terephthalate surface affects the roughness and critical load of the films. The critical load increases as the plasma treatment time increases. When the plasma treatment time was increased from 5 minutes to 10 minutes, the adhesion increased from 8.20 mN to 13.67 mN. When the argon flow rate was decreased from 60 sccm to 40 sccm, the adhesion increased from 8.27 mN to 13.67 mN. The adhesion is also increased at higher power, where the adhesion increased from 13.67 mN to 25.07 mN as the power increased from 30 W to 50 W. The adhesion of the film increases from 13.67 mN to 21.41 mN as the film thickness increases from 100 nm to 300 nm. Comparing all the deposition parameters indicates that the changes in power and thickness bring the greatest improvement in film adhesion.

Keywords: flexible substrate, sputtering, adhesion, copper thin film

Procedia PDF Downloads 121
2602 Predicting OpenStreetMap Coverage by Means of Remote Sensing: The Case of Haiti

Authors: Ran Goldblatt, Nicholas Jones, Jennifer Mannix, Brad Bottoms

Abstract:

Accurate, complete, and up-to-date geospatial information is the foundation of successful disaster management. When the 2010 Haiti Earthquake struck, accurate and timely information on the distribution of critical infrastructure was essential for the disaster response community for effective search and rescue operations. Existing geospatial datasets such as Google Maps did not have comprehensive coverage of these features. In the days following the earthquake, many organizations released high-resolution satellite imagery, catalyzing a worldwide effort to map Haiti and support the recovery operations. Of these organizations, OpenStreetMap (OSM), a collaborative project to create a free editable map of the world, used the imagery to support volunteers in digitizing roads, buildings, and other features, creating the most detailed map of Haiti in existence in just a few weeks. However, large portions of the island are still not fully covered by OSM. There is an increasing need for a tool to automatically identify which areas in Haiti, as well as in other countries vulnerable to disasters, are not fully mapped. The objective of this project is to leverage different types of remote sensing measurements, together with machine learning approaches, in order to identify geographical areas where OSM coverage of building footprints is incomplete. Several remote sensing measures and derived products were assessed as potential predictors of OSM building footprint coverage, including: intensity of light emitted at night (based on VIIRS measurements), spectral indices derived from the Sentinel-2 satellite (normalized difference vegetation index (NDVI), normalized difference built-up index (NDBI), soil-adjusted vegetation index (SAVI), urban index (UI)), surface texture (based on Sentinel-1 SAR measurements), elevation, and slope. Additional remote sensing derived products, such as Hansen Global Forest Change, DLR's Global Urban Footprint (GUF), and the World Settlement Footprint (WSF), were also evaluated as predictors, as well as the OSM street and road network (including junctions). A supervised classification with a random forest classifier resulted in the prediction of 89% of the variation of OSM building footprint area in a given cell. These predictions allowed for the identification of cells that are predicted to be covered but are actually not mapped yet. With these results, this methodology could be adapted to any location to assist with preparing for future disastrous events and to assure that essential geospatial information is available to support the response and recovery efforts during and following major disasters.
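
A minimal sketch of the supervised workflow described above, using a random forest regressor on synthetic per-cell predictors (the real predictors and labels come from the remote sensing products and OSM data listed in the abstract):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic per-cell predictors: night-time light, NDVI, NDBI, slope, road-junction count
X = rng.random((1000, 5))
osm_building_area = 3 * X[:, 0] + 1.5 * X[:, 4] - 2 * X[:, 1] + rng.normal(0, 0.3, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, osm_building_area, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

# R^2 on held-out cells plays the role of the 89% of variation explained in the abstract;
# cells whose predicted coverage greatly exceeds the mapped area are candidates for mapping
print(f"Variance explained: {r2_score(y_test, model.predict(X_test)):.2f}")
```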

Keywords: disaster management, Haiti, machine learning, OpenStreetMap, remote sensing

Procedia PDF Downloads 110
2601 The Mentoring in Professional Development of University Teachers

Authors: Nagore Guerra Bilbao, Clemente Lobato Fraile

Abstract:

Mentoring is provided by professionals with a higher level of experience and competence as part of the professional development of university faculty. This paper explores the characteristics of the mentoring provided by teachers participating in the development of an active methodology program run at the University of the Basque Country: it examines and analyzes mentors' performance with the aim of providing empirical evidence regarding its value as a lifelong learning strategy for teaching staff. A total of 183 teachers were trained during the first three programs. The analysis method uses a coding technique and is based on flexible, systematic guidelines for gathering and analyzing qualitative data. The results confirm the conception of mentoring as a methodological innovation in higher education. In short, university teachers in general assessed the mentoring they received positively, considering it to be a valid, useful strategy in their professional development. They highlighted the methodological expertise of their mentors and underscored how they monitored the learning process of the active method and provided guidance and advice when necessary. Finally, they also drew attention to traits such as availability, personal commitment and flexibility. However, a minority voiced criticism of some aspects of the performance of some mentors.

Keywords: higher education, mentoring, professional development, university teachers

Procedia PDF Downloads 224
2600 An Eco-Translatology Approach to the Translation of Spanish Tourism Advertising in Digital Communication in Chinese

Authors: Mingshu Liu, Laura Santamaria, Xavier Carmaniu Mainadé

Abstract:

As one of the sectors most affected by the COVID-19 pandemic, tourism is facing challenges in revitalizing the industry. At the same time, this is a good opportunity to take advantage of digital communication as an effective tool for tourism promotion. Our proposal aims to verify the linguistic operations used on online platforms in China. The research is carried out based on the theory of Eco-translatology put forward by Gengshen Hu, whose contribution focuses on the translator's adaptation to the ecosystem environment and the three elaborated parameters (linguistic, cultural and communicative). We also relate it to Even-Zohar's and Toury's theoretical postulates on the polysystem to elaborate an interdisciplinary methodology. Such a methodology allows us to analyze forms of personal address and phraseology in the target text. As for the corpus, we adopt the official Spanish-language website of Turismo de España as the source text and the postings on the two major social networks in China, Weibo and WeChat, in 2019. Through qualitative analysis, we conclude that, in the tourism advertising campaign on Chinese social networks, chengyu (Chinese phraseology) and honorific titles are used very frequently.

Keywords: digital communication, eco-translatology, polysystem theory, tourism advertising

Procedia PDF Downloads 215
2599 Verification of Geophysical Investigation during Subsea Tunnelling in Qatar

Authors: Gary Peach, Furqan Hameed

Abstract:

The Musaimeer outfall tunnel is one of the longest storm water tunnels in the world, with a total length of 10.15 km. The tunnel will accommodate surface and rain water received from the drainage networks of 270 km of urban areas in southern Doha, with a pumping capacity of 19.7 m³/sec. The tunnel is excavated by a Tunnel Boring Machine (TBM) through the Rus Formation, Midra Shales, and Simsima Limestone. Water inflows at high pressure, complex mixed ground, and weaker ground strata prone to karstification, with the presence of vertical and lateral fractures connected to the sea bed, were also encountered during mining. In addition to the pre-tender geotechnical investigations, the Contractor carried out a supplementary offshore geophysical investigation in order to fine-tune the existing results of the geophysical and geotechnical investigations. An electrical resistivity tomography (ERT) survey and a seismic reflection survey were carried out. The offshore geophysical survey was performed, and interpretations of rock mass conditions were made, to provide an overall picture of underground conditions along the tunnel alignment. This allowed the critical tunnelling areas and cutter head interventions to be planned accordingly. Karstification was monitored with a non-intrusive radar system facility installed on the TBM. The Boring Electric Ahead Monitoring (BEAM) system was installed at the cutter head and was able to predict the rock mass up to 3 tunnel diameters ahead of the cutter head. The BEAM system was provided with an online facility for real-time monitoring of rock mass conditions, which were then correlated with the rock mass conditions predicted during the interpretation phase of the offshore geophysical surveys. Further correlation was carried out using samples of the rock mass taken during tunnel face inspections and excavated material produced by the TBM. The BEAM data was continuously monitored to check the variations in resistivity and percentage frequency effect (PFE) of the ground. This system provided information about rock mass condition, potential karst risk, and potential water inflow. The BEAM system was found to be more than 50% accurate in picking up the difficult ground conditions and faults predicted in the geotechnical interpretative report before the start of tunnelling operations. Upon completion of the project, it was concluded that the combined use of different geophysical investigation results allows the execution stage to be carried out with more confidence and less geotechnical risk. The approach used for the prediction of rock mass conditions in the Geotechnical Interpretative Report (GIR) and the geophysical reflection and electrical resistivity tomography (ERT) surveys was concluded to be reliable, as the same rock mass conditions were encountered during tunnelling operations.

Keywords: tunnel boring machine (TBM), subsea, karstification, seismic reflection survey

Procedia PDF Downloads 218