Search results for: RFPM-type machine
1313 Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network
Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah
Abstract:
Recently, facial emotion recognition (FER) has become increasingly important for understanding the state of the human mind, yet accurately classifying emotion from the face remains a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER that benefits from deep learning, especially CNN and VGG16. First, the data is pre-processed with cleaning and rotation. We then augment the data and feed it to our FER model, which contains five convolution layers and five pooling layers; a softmax classifier in the output layer recognizes the emotions. The paper also reviews prior work on deep-learning-based facial emotion recognition. Experiments show that our model outperforms other methods on the same FER2013 database, yielding a recognition rate of 92%. We also put forward some suggestions for future work.
Keywords: CNN, deep learning, facial emotion recognition, machine learning
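A minimal Keras sketch of the kind of network described above (five convolution and five pooling layers followed by a softmax classifier) is shown below; it assumes 48x48 grayscale FER2013 images and seven emotion classes, and the filter counts and training settings are illustrative, not the authors' configuration.

```python
# Minimal sketch (not the authors' code): five conv + five pooling blocks and a
# softmax output, assuming 48x48 grayscale FER2013 images and 7 emotion classes.
from tensorflow import keras
from tensorflow.keras import layers

def build_fer_cnn(input_shape=(48, 48, 1), num_classes=7):
    model = keras.Sequential([layers.Input(shape=input_shape)])
    for filters in (32, 64, 128, 256, 256):          # five conv + five pooling layers
        model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(pool_size=2))
    model.add(layers.Flatten())
    model.add(layers.Dense(128, activation="relu"))
    model.add(layers.Dense(num_classes, activation="softmax"))  # softmax classifier
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_fer_cnn()
model.summary()
```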
Procedia PDF Downloads 93
1312 Optimization of the Control Scheme for Human Extremity Exoskeleton
Authors: Yang Li, Xiaorong Guan, Cheng Xu
Abstract:
In order to design a suitable control scheme for a human extremity exoskeleton, an interaction force control scheme with a traditional PI controller was first presented, and a simulation study of the exoskeleton's electromechanical system was carried out in MATLAB/Simulink. Analysis of the simulation results showed that the traditional PI controller is not well suited to every movement speed of the human body. A fuzzy self-adaptive PI controller was therefore proposed to solve this problem, and its superiority and feasibility were confirmed by both simulation and experimental results.
Keywords: human extremity exoskeleton, interaction force control scheme, simulation study, fuzzy self-adaptive PI controller, man-machine coordinated walking, bear payload
Procedia PDF Downloads 361
1311 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies - it kills an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
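The ensemble idea above can be illustrated with a short scikit-learn sketch that trains the three model families and averages their predicted probabilities; the features and data below are synthetic placeholders, not the study's Los Angeles dataset.

```python
# Illustrative sketch of the ensemble step: train logistic regression, random
# forest, and a neural network, then average their predicted probabilities.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in features: season, weekend flag, weather forecasts, and a
# lagged pollutant reading; the target is a binary "unhealthy air" label.
rng = np.random.default_rng(0)
n = 3000
X = np.column_stack([rng.integers(0, 4, n),            # season
                     rng.integers(0, 2, n),            # weekend flag
                     rng.normal(25, 8, n),             # temperature forecast
                     rng.normal(10, 4, n),             # wind forecast
                     rng.gamma(3.0, 10.0, n)])         # yesterday's PM2.5
y = (0.05 * X[:, 4] - 0.1 * X[:, 3] + rng.normal(0, 1, n)) > 1.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = [LogisticRegression(max_iter=1000),
          RandomForestClassifier(n_estimators=300, random_state=0),
          MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)]

probas = []
for m in models:
    m.fit(X_tr, y_tr)
    probas.append(m.predict_proba(X_te))

avg_proba = np.mean(probas, axis=0)                    # average the three models
y_pred = models[0].classes_[avg_proba.argmax(axis=1)]
print("ensemble accuracy:", accuracy_score(y_te, y_pred))
```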
Procedia PDF Downloads 124
1310 Investigation of the Cooling and Uniformity Effectiveness in a Sinter Packed Bed
Authors: Uzu-Kuei Hsu, Chang-Hsien Tai, Kai-Wun Jin
Abstract:
When sinters are discharged from the sintering machine into the cooler, their non-uniform distribution leads to uneven cooling. This makes the temperature difference of the sinters leaving the cooler so large that the conveyors are deformed by the heat. The present work applies a CFD method with a porous media model to investigate the thermo-flowfield phenomena in a sinter cooler. Experimental data are used to determine the porosity (ε), permeability (κ), inertial coefficient (F), specific heat (Cp) and effective thermal conductivity (keff) of the sinter packed beds. The physical model is a similar geometry whose Darcy number (Da) matches that of the sinter cooler. The Cooling Index (CI) and Uniformity Index (UI) are used to analyze the thermo-flowfield in the sinter packed bed and obtain the cooling performance of the sinter cooler.
Keywords: porous media, sinter, cooling index (CI), uniformity index (UI), CFD
Procedia PDF Downloads 400
1309 An Improvement Study for Mattress Manufacturing Line with a Simulation Model
Authors: Murat Sarı, Emin Gundogar, Mumtaz Ipek
Abstract:
Nowadays, in the furniture sector, competition for market share and the variety and changeability of production force firms to reengineer their manufacturing lines to increase productivity. In this study, the spring mattress manufacturing line of a furniture manufacturing firm is analyzed analytically. The aim is to find the bottlenecks of production in order to balance the flow of semi-finished material. Four base points must be investigated in the bottleneck elimination process: bottlenecks in Method, Material, Machine and Man (workforce) resources. These bottlenecks are investigated, and various scenarios are created for reconfiguration of the manufacturing system. Probable near-optimal alternatives are determined by system models built in Arena simulation software.
Keywords: bottleneck search, buffer stock, furniture sector, simulation
Procedia PDF Downloads 356
1308 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms
Authors: Nebi Gedik
Abstract:
Breast cancer is one of the significant and continuing public health problems in the world. Early detection is very important in fighting the disease, and mammography has been one of the most common and reliable methods for detecting it in the early stages. However, reading mammograms is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing an accurate and uniform evaluation of masses in mammograms. In this study, a multiresolution statistical method for classifying digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is organized into certain groups of coefficients, treated independently. The CAD system is designed by calculating statistical features from each group of coefficients, and the classification is performed with a support vector machine (SVM).
Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram
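A hedged sketch of the classification stage only: the wave atom coefficients are assumed to be computed elsewhere, each coefficient group is reduced to simple statistics, and an SVM is trained on the resulting feature vectors. The data here is synthetic, not mammogram coefficients.

```python
# Sketch of the feature + SVM stage (not the authors' pipeline); wave atom
# coefficient groups are assumed given and are replaced by random stand-ins.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def group_statistics(coefficient_groups):
    """Reduce each coefficient group (e.g. one wave atom band) to simple statistics."""
    feats = []
    for g in coefficient_groups:
        g = np.ravel(g)
        feats += [g.mean(), g.std(), skew(g), kurtosis(g), np.abs(g).max()]
    return np.array(feats)

# Synthetic stand-in data: 40 "mammograms", each with 4 coefficient groups.
rng = np.random.default_rng(0)
labels = [0] * 20 + [1] * 20                       # 0 = normal, 1 = abnormal
X = np.vstack([group_statistics([rng.normal(scale=1 + 0.3 * lbl, size=256)
                                 for _ in range(4)])
               for lbl in labels])
y = np.array(labels)

clf = SVC(kernel="rbf", C=10, gamma="scale")
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```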
Procedia PDF Downloads 222
1307 Using Discrete Event Simulation Approach to Reduce Waiting Times in Computed Tomography Radiology Department
Authors: Mwafak Shakoor
Abstract:
The purpose of this study was to reduce patient waiting times, improve system throughput and improve resource utilization in a radiology department. A discrete event simulation model was developed in Arena simulation software to investigate alternatives for improving overall service delivery, based on resource-addition scenarios that exploit the linkage between patient waiting times and resource availability. The study revealed that no additional investment is needed to procure another scanner; instead, hospital management can deploy managerial tactics to enhance machine utilization and reduce the long waiting times in the department.
Keywords: discrete event simulation, radiology department, Arena, waiting time, healthcare modeling, computed tomography
Procedia PDF Downloads 591
1306 Multi-Label Approach to Facilitate Test Automation Based on Historical Data
Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally
Abstract:
The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which in most cases conflicts with the time constraints provided for quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation for predicting the implementation of unknown test case specifications. With this support, a test engineer only has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge in the form of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is 'Subset Accuracy'. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still 60%, which is better than the current state of the art. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique reduces the time needed to establish test automation and is independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
Keywords: machine learning, multi-class, multi-label, supervised learning, test automation
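The core idea, learning a multi-label mapping from test step descriptions to automation components and scoring it with Subset Accuracy, can be sketched as follows; the steps, component names, and data are invented for illustration and are not from the evaluated systems.

```python
# Minimal sketch of the idea (not the authors' pipeline): map test step text to
# the set of automation components that implement it; evaluate with subset accuracy.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

steps = ["switch ignition on and check CAN bus wakeup",
         "send diagnostic request and verify positive response",
         "switch ignition off and check bus sleep",
         "verify diagnostic session after ignition on"]
components = [["SetIgnition", "CheckBusState"],
              ["SendDiagRequest", "CheckResponse"],
              ["SetIgnition", "CheckBusState"],
              ["SetIgnition", "SendDiagRequest"]]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(components)

clf = make_pipeline(TfidfVectorizer(),
                    OneVsRestClassifier(LogisticRegression(max_iter=1000)))
clf.fit(steps, Y)

Y_pred = clf.predict(steps)
# accuracy_score on a multi-label indicator matrix is exactly "Subset Accuracy":
# a prediction only counts if the full component set matches.
print("subset accuracy:", accuracy_score(Y, Y_pred))
```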
Procedia PDF Downloads 131
1305 Operator Efficiency Study for Assembly Line Optimization at Semiconductor Assembly and Test
Authors: Rohana Abdullah, Md Nizam Abd Rahman, Seri Rahayu Kamat
Abstract:
Operator efficiency is gaining importance in ensuring optimized usage of resources, especially in semi-automated manufacturing environments. This paper addresses a case study carried out to solve operator efficiency and line balancing issues at a semiconductor assembly and test manufacturer. A Man-to-Machine (M2M) work study technique is used to study current operator utilization and determine the optimum allocation of operators to machines. Critical factors such as operator activity, activity frequency and operator competency level are considered to gain insight into the parameters that affect operator utilization. Equipment standard time and overall equipment efficiency (OEE) information are also gathered and analyzed to achieve balanced and optimized production.
Keywords: operator efficiency, optimized production, line balancing, industrial and manufacturing engineering
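For reference, a small illustration of the overall equipment efficiency (OEE) figure mentioned above, computed in the usual way as the product of availability, performance and quality rates; the numbers are made up, not the case-study measurements.

```python
# OEE is conventionally the product of availability, performance and quality rates.
def oee(availability, performance, quality):
    return availability * performance * quality

print(f"OEE = {oee(0.90, 0.85, 0.98):.1%}")   # -> 75.0% for these example rates
```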
Procedia PDF Downloads 729
1304 Power Control of DFIG in WECS Using Backstepping and Sliding Mode Controller
Authors: Abdellah Boualouch, Ahmed Essadki, Tamou Nasser, Ali Boukhriss, Abdellatif Frigui
Abstract:
This paper presents a power control scheme for a Doubly Fed Induction Generator (DFIG) used in a Wind Energy Conversion System (WECS) connected to the grid. The proposed control strategy employs two nonlinear controllers, a backstepping controller (BSC) and a sliding-mode controller (SMC), to directly calculate the required rotor control voltage so as to eliminate the instantaneous errors of active and reactive power. The advantages of BSC and SMC are presented, and the performance and robustness of the two control strategies are compared. First, we present a model of the wind turbine and the DFIG machine, then a synthesis of the controllers and their application to DFIG power control. Simulation results on a 1.5 MW grid-connected DFIG system are provided using MATLAB/Simulink.
Keywords: backstepping, DFIG, power control, sliding mode, WECS
Procedia PDF Downloads 592
1303 Flow Visualization and Mixing Enhancement in Y-Junction Microchannel with 3D Acoustic Streaming Flow Patterns Induced by Trapezoidal Triangular Structure Using High-Viscosity Liquids
Authors: Ayalew Yimam Ali
Abstract:
The Y-shaped microchannel can be used to mix miscible or immiscible fluids with different viscosities. However, mixing at the entrance of the Y-junction microchannel is difficult because of the micro-scale laminar flow of the two miscible high-viscosity water-glycerol fluids. One of the most promising methods to improve mixing performance and diffusive mass transfer under laminar flow is acoustic streaming (AS), a time-averaged, second-order steady streaming that can produce rolling motion in the microchannel by oscillating a low-frequency acoustic transducer and inducing an acoustic wave in the flow field. The 3D trapezoidal triangular structure spine used in this study was created with CNC cutting tools, which were used to machine a microchannel mold carrying the spine along the longitudinal mixing region of the Y-junction. The molds, with sharp edge tip angles of 30° and a trapezoidal triangular sharp-edge tip depth of 0.3 mm, were machined from PMMA (polymethyl methacrylate) glass on an advanced CNC machine, and the channel was manufactured in PDMS (polydimethylsiloxane), built up longitudinally on the top surface of the Y-junction microchannel using soft-lithography nanofabrication strategies. Micro-particle image velocimetry (μPIV) was used to visualize the 3D rolling steady acoustic streaming and to study mixing enhancement with the high-viscosity miscible fluids for different structure longitudinal lengths, channel widths, volume flow rates, oscillation frequencies, and amplitudes. The streaming velocity and vorticity fields show vorticity up to 16 times higher than in the absence of acoustic streaming, and mixing performance was evaluated at various amplitudes, flow rates, and frequencies using grayscale pixel intensity in MATLAB. Mixing experiments were performed using a fluorescent green dye solution with de-ionized water on one inlet side of the channel and the de-ionized water-glycerol mixture on the other inlet side of the Y-channel, and the degree of mixing was found to improve greatly, from 67.42% without acoustic streaming to 96.83% with acoustic streaming. The results show that mixing of the two miscible high-viscosity fluids, otherwise governed by laminar transport phenomena, is enhanced by the formation of a new, intense, three-dimensional steady streaming rolling motion at high volume flow rates around the junction mixing zone.
Keywords: microfabrication, 3D acoustic streaming flow visualization, micro-particle image velocimetry, mixing enhancement
Procedia PDF Downloads 20
1302 Online Authenticity Verification of a Biometric Signature Using Dynamic Time Warping Method and Neural Networks
Authors: Gałka Aleksandra, Jelińska Justyna, Masiak Albert, Walentukiewicz Krzysztof
Abstract:
An offline signature is a well-known, but not the safest, way to verify identity. Nowadays, to ensure proper authentication, e.g. in banking systems, multimodal verification is more widely used. In this paper, online signature analysis based on dynamic time warping (DTW) coupled with machine learning approaches is presented. In our research, signatures made with biometric pens were gathered, and signature features as well as their forgeries are described. For verification of authenticity, various methods were used, including convolutional neural networks operating on the DTW matrix and a multilayer perceptron operating on sums of DTW matrix paths. System efficiency was evaluated on signatures and signature forgeries collected on the same day. Results are presented and discussed in this paper.
Keywords: dynamic time warping, handwritten signature verification, feature-based recognition, online signature
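A plain-Python sketch of the dynamic time warping core used to compare two signature signals is given below; it is a generic O(n·m) implementation with stand-in signals, not the authors' feature pipeline.

```python
# DTW accumulated-cost distance between two 1-D sequences (e.g. pen pressure over time).
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

genuine = np.sin(np.linspace(0, 3 * np.pi, 120))            # stand-in pressure signal
forgery = np.sin(np.linspace(0, 3 * np.pi, 100)) * 0.7 + 0.1

print("DTW(genuine, genuine):", dtw_distance(genuine, genuine))
print("DTW(genuine, forgery):", dtw_distance(genuine, forgery))
```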
Procedia PDF Downloads 174
1301 BART Matching Method: Using Bayesian Additive Regression Tree for Data Matching
Authors: Gianna Zou
Abstract:
Propensity score matching (PSM), introduced by Paul R. Rosenbaum and Donald Rubin in 1983, is a popular statistical matching technique which tries to estimate treatment effects by taking into account covariates that could impact the efficacy of the study medication in clinical trials. PSM can be used to reduce the bias due to confounding variables. However, PSM assumes that the response values are normally distributed, and in some cases this assumption may not hold. In this paper, a machine learning method, the Bayesian Additive Regression Tree (BART), is used as a more robust matching method. BART can work well when models are misspecified, since it can be used to model heterogeneous treatment effects; moreover, it can handle non-linear main effects and multiway interactions. In this research, a BART Matching Method (BMM) is proposed as a more reliable alternative to PSM. A comparison of the analysis results from PSM and BMM shows that BMM performs well and has better prediction capability when the response values are not normally distributed.
Keywords: BART, Bayesian, matching, regression
Procedia PDF Downloads 145
1300 Classification Based on Deep Neural Cellular Automata Model
Authors: Yasser F. Hassan
Abstract:
Deep learning is a branch of machine learning that has seen great achievements in research and applications. Cellular neural networks are regarded as arrays of nonlinear analog processors, called cells, connected in a way that allows parallel computation. The paper discusses how to use a deep learning structure to represent a neural cellular automata model. The proposed learning technique in the cellular automata model is examined from the perspective of deep learning structure. A deep neural cellular automata system modifies each neuron based on the behavior of the individual cell and its decision, as a result of multi-level deep structure learning. The paper presents the architecture of the model, and simulation results of the approach are given. Results from the implementation enrich the deep neural cellular automata system and shed light on the concept formulation of the model and the learning within it.
Keywords: cellular automata, neural cellular automata, deep learning, classification
Procedia PDF Downloads 194
1299 The Application of a Hybrid Neural Network for Recognition of a Handwritten Kazakh Text
Authors: Almagul Assainova, Dariya Abykenova, Liudmila Goncharenko, Sergey Sybachin, Saule Rakhimova, Abay Aman
Abstract:
The recognition of handwritten Kazakh text is a relevant objective today for the digitization of materials. The study presents a hybrid neural network model for handwriting recognition that combines a convolutional neural network and a multi-layer perceptron. Each network includes 1024 input neurons and 42 output neurons. The model is implemented in a program written in the Python programming language using the EMNIST database and the NumPy, Keras, and TensorFlow modules. The network was trained on such specific letters of the Kazakh alphabet as ә, ғ, қ, ң, ө, ұ, ү, h, і. The neural network model and the program created on its basis can be used in electronic document management systems to digitize Kazakh text.
Keywords: handwriting recognition system, image recognition, Kazakh font, machine learning, neural networks
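A hedged Keras sketch of a hybrid network in the spirit described above: a convolutional branch and a multi-layer perceptron branch share one 1024-pixel (32x32) input and are merged before a 42-class softmax output. The layer sizes are assumptions, not the paper's configuration.

```python
# Hybrid CNN + MLP sketch with 1024 inputs and 42 output classes (sizes assumed).
from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(1024,))                    # 1024 input neurons
image = layers.Reshape((32, 32, 1))(inputs)

# Convolutional branch
c = layers.Conv2D(32, 3, activation="relu", padding="same")(image)
c = layers.MaxPooling2D(2)(c)
c = layers.Conv2D(64, 3, activation="relu", padding="same")(c)
c = layers.MaxPooling2D(2)(c)
c = layers.Flatten()(c)

# Multi-layer perceptron branch
p = layers.Dense(512, activation="relu")(inputs)
p = layers.Dense(256, activation="relu")(p)

merged = layers.concatenate([c, p])
outputs = layers.Dense(42, activation="softmax")(merged)   # 42 output neurons

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```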
Procedia PDF Downloads 260
1298 Application of Deep Learning in Top Pair and Single Top Quark Production at the Large Hadron Collider
Authors: Ijaz Ahmed, Anwar Zada, Muhammad Waqas, M. U. Ashraf
Abstract:
We demonstrate the performance of a very efficient tagger, based on deep neural network algorithms, applied to hadronically decaying top quark pairs as signal and compared with QCD multi-jet background events. A significant enhancement of performance for boosted top quark events is observed with our limited computing resources. We also compare modern machine learning approaches and perform a multivariate analysis of boosted top-pair as well as single top quark production through the weak interaction at a √s = 14 TeV proton-proton collider, incorporating the most relevant known background processes. Using Boosted Decision Tree (BDT), likelihood, and Multilayer Perceptron (MLP) techniques, the analysis is trained and its performance is compared with the conventional cut-and-count approach.
Keywords: top tagger, multivariate, deep learning, LHC, single top
Procedia PDF Downloads 110
1297 Linac Quality Controls Using an Electronic Portal Imaging Device
Authors: Domingo Planes Meseguer, Raffaele Danilo Esposito, Maria Del Pilar Dorado Rodriguez
Abstract:
Monthly quality control checks for a radiation therapy linac may be performed in a simple and efficient way once they have been standardized and protocolized. On the other hand, these checks, despite being mandatory, require non-negligible execution time in terms of both machine time and operator time. The amount of disposable material that may be needed, together with the commercial software required to perform them, must also be taken into account. With the aim of optimizing and standardizing the mechanical-geometric checks and multileaf collimator checks, we decided to implement a protocol that makes use of the Electronic Portal Imaging Device (EPID) available on our linacs. The user is guided step by step by the software during the whole procedure, and the acquired images are automatically analyzed by our programs, all of them written using only free software.
Keywords: quality control checks, linac, radiation oncology, medical physics, free software
Procedia PDF Downloads 199
1296 Features for Measuring Credibility on Facebook Information
Authors: Kanda Runapongsa Saikaew, Chaluemwut Noyunsan
Abstract:
Nowadays, social media information, such as news, links, images, or videos, is shared extensively. However, information disseminated through social media often lacks quality: less fact checking, more bias, and many rumors. Many researchers have investigated credibility on Twitter, but there are no research reports on the credibility of information on Facebook. This paper proposes features for measuring the credibility of Facebook information. We developed a system for credibility on Facebook. First, we built an FB credibility evaluator that measures the credibility of each post through manual human labelling. We then collected the training data and created a model using a Support Vector Machine (SVM). Secondly, we developed an FB credibility Chrome extension that allows Facebook users to evaluate the credibility of each post. Based on the usage analysis of our FB credibility Chrome extension, about 81% of users’ responses agree with the credibility suggested automatically by the proposed system.
Keywords: Facebook, social media, credibility measurement, internet
Procedia PDF Downloads 355
1295 Competitive Advantages of a Firm without Fundamental Technology: A Case Study of Sony, Casio and Nintendo
Authors: Kiyohiro Yamazaki
Abstract:
The purpose of this study is to examine how a firm without fundamental technology is able to gain a competitive advantage. The paper examines three case studies: Sony in the flat-panel TV industry, Casio in the digital camera industry, and Nintendo in the home game machine industry. It maintains that firms without fundamental technology construct two advantages: an economic advantage and an organizational advantage. The economic advantage is that the firm can select either high-tech or cheap devices from several device makers and can change between these alternatives cheaply and quickly. The organizational advantage is that a firm without fundamental technology is not restricted by organizational inertia and cognitive restraints and can exercise this freedom as a strength.
Keywords: firm without fundamental technology, economic advantage, organizational advantage, Sony, Casio, Nintendo
Procedia PDF Downloads 287
1294 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator
Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić
Abstract:
Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, in the end, the outcome of treatment for every single patient. Therefore, international recommendations strongly advise setting up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, performed on a daily, weekly, monthly or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated CIRS 062QA phantom and the QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software which enables fast and simple evaluation of CT QA parameters using the phantom provided with the simulator. The recommendations, however, contain additional tests, which were done with the CIRS phantom. Also, legislation on ionizing radiation protection requires CT testing at defined intervals. Taking into account the legal requirements, the built-in tests of the CT simulator, and the international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were the following: CT number accuracy, field uniformity, complete CT-to-ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within +/- 5 HU of the value at commissioning; field uniformity within +/- 10 HU in selected ROIs; the complete CT-to-ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of not more than 5%; spatial and contrast resolution tests must comply with those obtained at commissioning, otherwise the machine requires service; the image noise test result must fall within 20% of the base value; slice thickness must meet manufacturer specifications; and patient table stability under longitudinal transfer of the loaded table must not show more than 2 mm vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of CT simulator functionality and its clinical effectiveness in radiation treatment planning. The legal requirement for the clinic is to set up its own QA programme with a minimum of testing, but it remains the user's decision whether additional testing, as recommended by international organizations, is implemented in order to improve the overall quality of the radiation treatment planning procedure, since the quality of the CT images used for planning influences the delineation of the tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
Keywords: CT simulator, radiotherapy, quality control, QA programme
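The tolerance limits listed in the Results section translate naturally into a small checking routine; the sketch below applies those limits (±5 HU CT number, ±10 HU uniformity, 5% CT-to-ED deviation, 20% noise change, 2 mm table deviation) to invented measurement values, not clinical data.

```python
# Illustrative QC check against commissioning baselines; measured values are made up.
def qc_report(measured, baseline, limits):
    report = {}
    for test, value in measured.items():
        kind, tol = limits[test]
        if kind == "absolute":
            deviation = value - baseline[test]
        else:  # relative tolerance, as a fraction of the baseline value
            deviation = (value - baseline[test]) / baseline[test]
        report[test] = (deviation, "PASS" if abs(deviation) <= tol else "FAIL")
    return report

limits = {"ct_number_water_HU": ("absolute", 5),    # +/- 5 HU of commissioning value
          "uniformity_HU":      ("absolute", 10),   # +/- 10 HU in selected ROIs
          "ct_to_ed_point":     ("relative", 0.05), # 5% of commissioning curve
          "noise_HU":           ("relative", 0.20), # 20% of base value
          "table_vertical_mm":  ("absolute", 2)}    # 2 mm under load

baseline = {"ct_number_water_HU": 0, "uniformity_HU": 0,
            "ct_to_ed_point": 1.095, "noise_HU": 6.0, "table_vertical_mm": 0}
measured = {"ct_number_water_HU": 3, "uniformity_HU": -7,
            "ct_to_ed_point": 1.12, "noise_HU": 7.5, "table_vertical_mm": 1.2}

for test, (dev, status) in qc_report(measured, baseline, limits).items():
    print(f"{test:20s} deviation={dev:+.3f}  {status}")
```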
Procedia PDF Downloads 529
1293 Using Computer Vision and Machine Learning to Improve Facility Design for Healthcare Facility Worker Safety
Authors: Hengameh Hosseini
Abstract:
Design of large healthcare facilities, such as hospitals, multi-service line clinics, and nursing facilities, that can accommodate patients with wide-ranging disabilities is a challenging endeavor and one that is poorly understood among healthcare facility managers, administrators, and executives. An even less-understood extension of this problem is the implications of weakly or insufficiently accommodative facility design for healthcare workers in physically intensive jobs who may also suffer from a range of disabilities and who are therefore at increased risk of workplace accident and injury. Combine this reality with the vast range of facility types, ages, and designs, and the problem of universal accommodation becomes even more daunting and complex. In this study, we focus on the implications of facility design for healthcare workers suffering from low vision who also have physically active jobs. The points of difficulty are myriad: health service infrastructure, the equipment used in health facilities, and transport to and from appointments and other services can all pose a barrier to health care if they are inaccessible, less accessible, or even simply less comfortable for people with various disabilities. We conducted a series of surveys and interviews with employees and administrators of seven facilities of a range of sizes and ownership models in the Northeastern United States, and combined that corpus with in-facility observations and data collection to identify five major points of failure common to all the facilities, which we concluded could pose safety threats, ranging from very minor to severe, to employees with vision impairments. We determine that lack of design empathy is a major commonality among facility management and ownership. We subsequently propose three methods for remedying this lack of empathy-informed design and the dangers it poses to employees: the use of an existing open-source augmented reality application to simulate the low-vision experience for designers and managers; the use of a machine learning model we develop to automatically infer facility shortcomings from large datasets of recorded patient and employee reviews and feedback; and the use of a computer vision model fine-tuned on images of each facility to infer and predict facility features, locations, and workflows that could pose meaningful dangers to visually impaired employees. After conducting a series of real-world comparative experiments with each of these approaches, we conclude that each of them is a viable solution under particular sets of conditions, and finally characterize the range of facility types, workforce composition profiles, and work conditions under which each of these methods would be most apt and successful.
Keywords: artificial intelligence, healthcare workers, facility design, disability, visually impaired, workplace safety
Procedia PDF Downloads 114
1292 Automatic Calibration of Agent-Based Models Using Deep Neural Networks
Authors: Sima Najafzadehkhoei, George Vega Yon
Abstract:
This paper presents an approach for calibrating Agent-Based Models (ABMs) efficiently, utilizing Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, which are a core framework in the study of epidemiology. Our method replicates parameter values from observed trajectory curves, enhancing the accuracy of predictions when compared to traditional calibration techniques. Through the use of simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one where the number of susceptible, infected, and recovered individuals is fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
Keywords: ABM, calibration, CNN, LSTM, epidemiology
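A hedged sketch of the calibration idea: simulate SIR infected-count curves for random (beta, gamma) pairs and train a small LSTM to regress the parameters back from the trajectory. The parameter ranges and network sizes are assumptions, not the authors' settings.

```python
# Simulate SIR curves, then learn (beta, gamma) from the infected trajectory.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def simulate_sir(beta, gamma, n=1000, i0=10, days=60):
    s, i, r = n - i0, i0, 0
    infected = []
    for _ in range(days):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        infected.append(i)
    return np.array(infected)

rng = np.random.default_rng(0)
params = np.column_stack([rng.uniform(0.1, 0.5, 2000),    # beta
                          rng.uniform(0.05, 0.2, 2000)])  # gamma
curves = np.stack([simulate_sir(b, g) for b, g in params])[..., None]

model = keras.Sequential([layers.Input(shape=(60, 1)),
                          layers.LSTM(32),
                          layers.Dense(16, activation="relu"),
                          layers.Dense(2)])                # predicts (beta, gamma)
model.compile(optimizer="adam", loss="mse")
model.fit(curves / curves.max(), params, epochs=5, batch_size=64, verbose=0)

test_curve = simulate_sir(0.3, 0.1)[None, :, None] / curves.max()
print("recovered (beta, gamma):", model.predict(test_curve, verbose=0)[0])
```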
Procedia PDF Downloads 23
1291 Comparison of Tensile Strength and Folding Endurance of (FDM Process) 3D Printed ABS and PLA Materials
Authors: R. Devicharan
Abstract:
Within a short span, 3D printing is expected to play a vital role in our lives. The possibilities for creativity and speed in manufacturing through various 3D printing processes are immense. This study is performed on the FDM (Fused Deposition Modelling) method, one of the predominant 3D printing technologies, and focuses on the physical properties of printed objects, which determine their applications. The paper specifically studies the tensile strength and folding endurance of objects 3D printed by the FDM method using ABS (Acrylonitrile Butadiene Styrene) and PLA (Polylactic Acid) plastic materials. The study is performed in a controlled environment with specific machine settings. Appropriate tables and graphs are plotted, and research analysis techniques are utilized to analyse, verify and validate the experimental results.
Keywords: FDM process, 3D printing, ABS for 3D printing, PLA for 3D printing, rapid prototyping
Procedia PDF Downloads 597
1290 Adhesion of Sputtered Copper Thin Films Deposited on Flexible Substrates
Authors: Rwei-Ching Chang, Bo-Yu Su
Abstract:
Adhesion of copper thin films deposited on polyethylene terephthalate substrate by direct current sputtering with different sputtering parameters is discussed in this work. The effects of plasma treatment with 0, 5, and 10 minutes on the thin film properties are investigated first. Various argon flow rates at 40, 50, 60 standard cubic centimeters per minute (sccm), deposition power at 30, 40, 50 W, and film thickness at 100, 200, 300 nm are also discussed. The 3-dimensional surface profilometer, micro scratch machine, and optical microscope are used to characterize the thin film properties. The results show that the increase of the plasma treatment time on the polyethylene terephthalate surface affects the roughness and critical load of the films. The critical load increases as the plasma treatment time increases. When the plasma treatment time was adjusted from 5 minutes to 10 minutes, the adhesion increased from 8.20 mN to 13.67 mN. When the argon flow rate is decreased from 60 sccm to 40 sccm, the adhesion increases from 8.27 mN to 13.67 mN. The adhesion is also increased by the condition of higher power, where the adhesion increased from 13.67 mN to 25.07 mN as the power increases from 30 W to 50 W. The adhesion of the film increases from 13.67 mN to 21.41 mN as the film thickness increases from 100 nm to 300 nm. Comparing all the deposition parameters, it indicates the change of the power and thickness has much improvement on the film adhesion.
Keywords: flexible substrate, sputtering, adhesion, copper thin film
Procedia PDF Downloads 129
1289 Predicting OpenStreetMap Coverage by Means of Remote Sensing: The Case of Haiti
Authors: Ran Goldblatt, Nicholas Jones, Jennifer Mannix, Brad Bottoms
Abstract:
Accurate, complete, and up-to-date geospatial information is the foundation of successful disaster management. When the 2010 Haiti Earthquake struck, accurate and timely information on the distribution of critical infrastructure was essential to the disaster response community for effective search and rescue operations. Existing geospatial datasets such as Google Maps did not have comprehensive coverage of these features. In the days following the earthquake, many organizations released high-resolution satellite imagery, catalyzing a worldwide effort to map Haiti and support the recovery operations. Of these organizations, OpenStreetMap (OSM), a collaborative project to create a free editable map of the world, used the imagery to enable volunteers to digitize roads, buildings, and other features, creating the most detailed map of Haiti in existence in just a few weeks. However, large portions of the island are still not fully covered by OSM. There is an increasing need for a tool to automatically identify which areas in Haiti, as well as in other countries vulnerable to disasters, are not fully mapped. The objective of this project is to leverage different types of remote sensing measurements, together with machine learning approaches, in order to identify geographical areas where OSM coverage of building footprints is incomplete. Several remote sensing measures and derived products were assessed as potential predictors of OSM building footprint coverage, including: intensity of light emitted at night (based on VIIRS measurements), spectral indices derived from the Sentinel-2 satellite (normalized difference vegetation index (NDVI), normalized difference built-up index (NDBI), soil-adjusted vegetation index (SAVI), urban index (UI)), surface texture (based on Sentinel-1 SAR measurements), elevation and slope. Additional remote sensing derived products, such as Hansen Global Forest Change, DLR's Global Urban Footprint (GUF), and the World Settlement Footprint (WSF), were also evaluated as predictors, as was the OSM street and road network (including junctions). A supervised classification with a random forest classifier predicted 89% of the variation in OSM building footprint area in a given cell. These predictions allowed for the identification of cells that are predicted to be covered but are actually not mapped yet. With these results, this methodology could be adapted to any location to assist in preparing for future disastrous events and ensure that essential geospatial information is available to support the response and recovery efforts during and following major disasters.
Keywords: disaster management, Haiti, machine learning, OpenStreetMap, remote sensing
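The prediction step can be sketched with a random forest regressor mapping per-cell covariates to OSM building-footprint area, with permutation importance standing in for the mean-decrease-in-accuracy ranking; the feature values below are synthetic placeholders, not the Haiti data.

```python
# Random forest sketch: per-cell remote sensing covariates -> OSM footprint area.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_cells = 2000
X = pd.DataFrame({"nightlights": rng.gamma(2.0, 5.0, n_cells),
                  "ndvi": rng.uniform(-0.1, 0.8, n_cells),
                  "ndbi": rng.uniform(-0.5, 0.5, n_cells),
                  "sar_texture": rng.normal(0, 1, n_cells),
                  "slope": rng.uniform(0, 30, n_cells),
                  "road_junctions": rng.poisson(3, n_cells)})
# Synthetic target loosely tied to built-up signals, for demonstration only.
y = (500 * X["ndbi"].clip(0) + 20 * X["nightlights"] + 30 * X["road_junctions"]
     + rng.normal(0, 50, n_cells))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out cells:", rf.score(X_te, y_te))

imp = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)
for name, score in sorted(zip(X.columns, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name:15s} {score:.3f}")
```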
Procedia PDF Downloads 122
1288 Verification of Geophysical Investigation during Subsea Tunnelling in Qatar
Authors: Gary Peach, Furqan Hameed
Abstract:
Musaimeer outfall tunnel is one of the longest storm water tunnels in the world, with a total length of 10.15 km. The tunnel will accommodate surface and rain water received from the drainage networks from 270 km of urban areas in southern Doha, with a pumping capacity of 19.7 m³/s. The tunnel is excavated by a Tunnel Boring Machine (TBM) through the Rus Formation, Midra Shales, and Simsima Limestone. Water inflows at high pressure, complex mixed ground, and weaker ground strata prone to karstification, with vertical and lateral fractures connected to the sea bed, were also encountered during mining. In addition to pre-tender geotechnical investigations, the Contractor carried out a supplementary offshore geophysical investigation in order to fine-tune the existing results of the geophysical and geotechnical investigations. Electrical resistivity tomography (ERT) and a seismic reflection survey were carried out. The offshore geophysical survey was performed, and interpretations of rock mass conditions were made to provide an overall picture of underground conditions along the tunnel alignment. This allowed the critical tunnelling areas and cutter head interventions to be planned accordingly. Karstification was monitored with a non-intrusive radar system installed on the TBM. The Boring Electric Ahead Monitoring (BEAM) system was installed at the cutter head and was able to predict the rock mass up to 3 tunnel diameters ahead of the cutter head. The BEAM system was provided with an online facility for real-time monitoring of rock mass conditions, which were then correlated with the rock mass conditions predicted during the interpretation phase of the offshore geophysical surveys. Further correlation was carried out with samples of the rock mass taken during tunnel face inspections and from excavated material produced by the TBM. The BEAM data was continuously monitored to check the variations in resistivity and percentage frequency effect (PFE) of the ground. This system provided information about rock mass condition, potential karst risk, and potential water inflow. The BEAM system was found to be more than 50% accurate in picking up the difficult ground conditions and faults predicted in the geotechnical interpretative report before the start of tunnelling operations. Upon completion of the project, it was concluded that the combined use of different geophysical investigation results allows the execution stage to be carried out with more confidence and less geotechnical risk. The approach used for the prediction of rock mass conditions in the Geotechnical Interpretative Report (GIR), the seismic reflection surveys, and the electrical resistivity tomography (ERT) surveys was concluded to be reliable, as the same rock mass conditions were encountered during tunnelling operations.
Keywords: tunnel boring machine (TBM), subsea, karstification, seismic reflection survey
Procedia PDF Downloads 243
1287 Using Single Decision Tree to Assess the Impact of Cutting Conditions on Vibration
Authors: S. Ghorbani, N. I. Polushin
Abstract:
Vibration during the machining process is crucial since it affects the cutting tool, the machine, and the workpiece, leading to tool wear, tool breakage, and unacceptable surface roughness. This paper applies a nonparametric statistical method, the single decision tree (SDT), to identify factors affecting vibration in the machining process. Workpiece material (AISI 1045 steel, AA2024 aluminum alloy, A48 class 30 gray cast iron), cutting tool (conventional, cutting tool with holes in the toolholder, cutting tool filled with epoxy-granite), tool overhang (41-65 mm), spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev) and depth of cut (0.05-0.15 mm) were used as input variables, while vibration was the output parameter. It is concluded that workpiece material is the most important parameter for natural frequency, followed by cutting tool and overhang.
Keywords: cutting condition, vibration, natural frequency, decision tree, CART algorithm
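A small sketch of the CART-style analysis described above: fit a single decision tree on the listed cutting-condition factors and inspect which ones drive the splits. The measurements are randomly generated stand-ins, not the experimental data.

```python
# Single decision tree (CART) sketch on the cutting-condition factors listed above.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
n = 200
X = pd.DataFrame({"material": rng.integers(0, 3, n),      # 0=steel, 1=Al alloy, 2=cast iron
                  "tool_type": rng.integers(0, 3, n),     # 0=conventional, 1=holes, 2=epoxy-granite
                  "overhang_mm": rng.uniform(41, 65, n),
                  "speed_rpm": rng.uniform(630, 1000, n),
                  "feed_mm_rev": rng.uniform(0.05, 0.075, n),
                  "depth_mm": rng.uniform(0.05, 0.15, n)})
vibration = (0.5 * X["material"] + 0.3 * X["tool_type"]
             + 0.02 * X["overhang_mm"] + rng.normal(0, 0.1, n))

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, vibration)
print(export_text(tree, feature_names=list(X.columns)))
print(dict(zip(X.columns, tree.feature_importances_.round(3))))
```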
Procedia PDF Downloads 334
1286 Uplift Modeling Approach to Optimizing Content Quality in Social Q/A Platforms
Authors: Igor A. Podgorny
Abstract:
TurboTax AnswerXchange is a social Q/A system supporting users working on federal and state tax returns. Content quality and popularity in the AnswerXchange can be predicted with propensity models using attributes of the question and answer. Using uplift modeling, we identify features of questions and answers that can be modified during the question-asking and question-answering experience in order to optimize AnswerXchange content quality. We demonstrate that adding details to questions always results in increased question popularity that can be used to promote good quality content. Responding to close-ended questions assertively improves content quality in the AnswerXchange in 90% of cases. Answering knowledge questions with web links increases the likelihood of receiving a negative vote from 60% of the askers. Our findings provide a rationale for employing the uplift modeling approach for AnswerXchange operations.
Keywords: customer relationship management, human-machine interaction, text mining, uplift modeling
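One common way to operationalize the uplift idea is the two-model approach sketched below, which estimates how a content change (e.g. adding details to a question) shifts the probability of a good outcome; the data and features are invented, not AnswerXchange data, and the two-model recipe is only one of several uplift techniques.

```python
# Two-model uplift sketch: model the outcome separately for "treated" (details
# added) and "control" posts, then take the difference in predicted probabilities.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({"question_length": rng.integers(20, 400, n),
                   "has_details": rng.integers(0, 2, n),       # "treatment" flag
                   "is_close_ended": rng.integers(0, 2, n)})
# Synthetic outcome: details help, especially for close-ended questions.
p_good = 0.3 + 0.2 * df["has_details"] + 0.1 * df["has_details"] * df["is_close_ended"]
df["good_quality"] = rng.random(n) < p_good

features = ["question_length", "is_close_ended"]
treated, control = df[df["has_details"] == 1], df[df["has_details"] == 0]

m_t = GradientBoostingClassifier().fit(treated[features], treated["good_quality"])
m_c = GradientBoostingClassifier().fit(control[features], control["good_quality"])

# Uplift = predicted outcome with the change minus predicted outcome without it.
uplift = m_t.predict_proba(df[features])[:, 1] - m_c.predict_proba(df[features])[:, 1]
print("average estimated uplift of adding details:", uplift.mean().round(3))
```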
Procedia PDF Downloads 243
1285 LED Lighting Interviews and Assessment in Forest Machines
Authors: Rauno Pääkkönen, Fabriziomaria Gobba, Leena Korpinen
Abstract:
The objective of the study is to assess the implementation of LED lighting in forest machine work in the dark. In addition, the paper covers a wide variety of important and relevant safety and health parameters. In modern, computerized work in the cab of forest machines, artificial illumination is demanding when performing duties such as visual inspection of wood and computer calculations. We interviewed entrepreneurs and gathered the following as the most pertinent themes: (1) safety, (2) practical problems, and (3) work with LED lighting. The most important comments concerned the practical problems of LED lighting. We found indications of technical problems in implementing LED lighting, such as snow and dirt on the surfaces of lamps that dim the emission of light. Moreover, service work in the dark forest is dangerous and increases the risk of on-site accidents. We also concluded that the amount of blue light reaching the eyes should be assessed, especially when the drivers are working in a semi-dark cab.
Keywords: forest machines, health, LED, safety
Procedia PDF Downloads 429
1284 Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real-world images within classical image processing and machine vision frameworks have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has adapted to the statistics of real-world images through evolution. There are two well-known, successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. We also investigate the response of the feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the phase selectivity of the feature detectors in both models is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
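The ICA part of such an analysis can be sketched with scikit-learn's FastICA applied to small image patches; on real natural images, Gabor-like feature detectors typically emerge. The image below is synthetic so that the script stays self-contained.

```python
# Estimate ICA feature detectors from 8x8 image patches (synthetic stand-in image).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
image = rng.normal(size=(256, 256)).cumsum(axis=0).cumsum(axis=1)  # smooth stand-in "image"

# Sample 5000 patches of 8x8 pixels and flatten them into rows.
patch = 8
patches = np.array([image[r:r + patch, c:c + patch].ravel()
                    for r, c in zip(rng.integers(0, 256 - patch, 5000),
                                    rng.integers(0, 256 - patch, 5000))])
patches -= patches.mean(axis=1, keepdims=True)          # remove the DC component

ica = FastICA(n_components=32, whiten="unit-variance", max_iter=500, random_state=0)
ica.fit(patches)
filters = ica.components_.reshape(32, patch, patch)     # learned "feature detectors"
print("filter bank shape:", filters.shape)
```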
Procedia PDF Downloads 338