Search results for: covering machine
1506 Classification Based on Deep Neural Cellular Automata Model
Authors: Yasser F. Hassan
Abstract:
Deep learning is a branch of machine learning that has achieved great success in research and applications. Cellular neural networks are regarded as arrays of nonlinear analog processors, called cells, connected in a way that allows parallel computation. The paper discusses how to use a deep learning structure to represent a neural cellular automata model. The proposed learning technique for the cellular automata model is examined from the perspective of deep learning structure. A deep neural cellular automata system modifies each neuron based on the behavior of the individual cell and its decision, as a result of multi-level deep structure learning. The paper presents the architecture of the model, and simulation results for the approach are given. Results from the implementation enrich the deep neural cellular automata system and shed light on the formulation of the model and the learning within it.
Keywords: cellular automata, neural cellular automata, deep learning, classification
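The abstract does not give implementation details; as a minimal sketch of the general neural cellular automaton idea it describes (the grid size, the two-layer update rule, and the NumPy formulation are all assumptions), each cell can be updated in parallel by a small learned network applied to its 3x3 neighborhood:

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 32                                   # grid of cells (assumed size)
state = rng.standard_normal((H, W))

# a tiny multi-level per-cell update: two weight layers over the 3x3 neighborhood
W1 = 0.1 * rng.standard_normal((9, 16))
W2 = 0.1 * rng.standard_normal((16, 1))

def step(state):
    padded = np.pad(state, 1, mode="wrap")
    # gather each cell's 3x3 neighborhood as a 9-vector
    neigh = np.stack([padded[i:i + H, j:j + W]
                      for i in range(3) for j in range(3)], axis=-1)
    hidden = np.tanh(neigh.reshape(-1, 9) @ W1)   # first level
    return (hidden @ W2).reshape(H, W)            # second level: new cell states

for _ in range(10):                               # all cells update in parallel each step
    state = step(state)
```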
Procedia PDF Downloads 199
1505 The Application of a Hybrid Neural Network for Recognition of a Handwritten Kazakh Text
Authors: Almagul Assainova, Dariya Abykenova, Liudmila Goncharenko, Sergey Sybachin, Saule Rakhimova, Abay Aman
Abstract:
Recognition of handwritten Kazakh text is a relevant objective today for the digitization of materials. The study presents a hybrid neural network model for handwriting recognition that combines a convolutional neural network and a multi-layer perceptron. Each network has 1024 input neurons and 42 output neurons. The model is implemented in a program written in Python using the EMNIST database and the NumPy, Keras, and TensorFlow modules. The network was trained on specific letters of the Kazakh alphabet: ә, ғ, қ, ң, ө, ұ, ү, h, і. The neural network model and the program built on it can be used in electronic document management systems to digitize Kazakh text.
Keywords: handwriting recognition system, image recognition, Kazakh font, machine learning, neural networks
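A minimal Keras sketch consistent with the sizes stated above (1024 inputs treated as a 32x32 image, 42 output classes); the layer counts and filter sizes are assumptions, not the authors' exact architecture:

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(32, 32, 1)),            # 1024 input neurons = 32x32 grayscale image
    layers.Conv2D(32, 3, activation="relu"),    # convolutional stage
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),       # multi-layer perceptron stage
    layers.Dense(42, activation="softmax"),     # 42 output classes (letters)
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, ...) would then be run on EMNIST-style letter images
```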
Procedia PDF Downloads 264
1504 Application of Deep Learning in Top Pair and Single Top Quark Production at the Large Hadron Collider
Authors: Ijaz Ahmed, Anwar Zada, Muhammad Waqas, M. U. Ashraf
Abstract:
We demonstrate the performance of a very efficient tagger, based on deep neural network algorithms, applied to hadronically decaying top quark pairs as signal and compared against QCD multi-jet background events. A significant enhancement of performance in boosted top quark events is observed with our limited computing resources. We also compare modern machine learning approaches and perform a multivariate analysis of boosted top-pair as well as single top quark production through the weak interaction at a √s = 14 TeV proton-proton collider. The most relevant known background processes are incorporated. Using the techniques of Boosted Decision Tree (BDT), likelihood, and Multilayer Perceptron (MLP), the analysis is trained and its performance compared with the conventional cut-and-count approach.
Keywords: top tagger, multivariate, deep learning, LHC, single top
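A generic sketch of the multivariate signal-versus-background training described (scikit-learn stands in for whatever analysis toolkit the authors used; the kinematic features and event samples are invented placeholders):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier   # BDT-style classifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# placeholder kinematic features (e.g., jet pT, mass, substructure variables) for
# signal (boosted top pairs) and background (QCD multi-jet) events
X = np.vstack([rng.normal(1.0, 1.0, (5000, 4)),     # signal
               rng.normal(0.0, 1.0, (5000, 4))])    # background
y = np.r_[np.ones(5000), np.zeros(5000)]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, clf in [("BDT", GradientBoostingClassifier()),
                  ("MLP", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))   # compare against a cut-and-count baseline
```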
Procedia PDF Downloads 111
1503 Linac Quality Controls Using An Electronic Portal Imaging Device
Authors: Domingo Planes Meseguer, Raffaele Danilo Esposito, Maria Del Pilar Dorado Rodriguez
Abstract:
Monthly quality control checks for a radiation therapy linac may be performed in a simple and efficient way once they have been standardized and protocolized. On the other hand, these checks, despite being imperative, require non-negligible execution time in terms of both machine time and operator time. The amount of disposable material that may be needed, together with the use of commercial software, must also be taken into account. With the aim of optimizing and standardizing mechanical-geometric checks and multileaf collimator checks, we decided to implement a protocol that makes use of the Electronic Portal Imaging Device (EPID) available on our linacs. The software guides the user step by step through the whole procedure. Acquired images are automatically analyzed by our programs, all of which are written using only free software.
Keywords: quality control checks, linac, radiation oncology, medical physics, free software
Procedia PDF Downloads 201
1502 Features for Measuring Credibility on Facebook Information
Authors: Kanda Runapongsa Saikaew, Chaluemwut Noyunsan
Abstract:
Nowadays social media information, such as news, links, images, or videos, is shared extensively. However, information disseminated through social media often lacks quality: less fact checking, more bias, and many rumors. Many researchers have investigated credibility on Twitter, but there is no research report on the credibility of information on Facebook. This paper proposes features for measuring the credibility of Facebook information. We developed a credibility system for Facebook. First, we developed an FB credibility evaluator that measures the credibility of each post via manual human labelling. We then collected this training data to create a model using a Support Vector Machine (SVM). Second, we developed an FB credibility Chrome extension that lets Facebook users evaluate the credibility of each post. Based on usage analysis of our Chrome extension, about 81% of users' responses agree with the credibility suggested automatically by the proposed system.
Keywords: facebook, social media, credibility measurement, internet
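A compact sketch of the SVM step described above (the per-post features and labels are invented placeholders; the abstract does not list the actual feature set):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# placeholder per-post features: [likes, shares, has_link, text_length]
X = rng.random((1000, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # stand-in for human credibility labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)     # train on human-labelled posts
print("agreement with labels:", clf.score(X_te, y_te))
```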
Procedia PDF Downloads 356
1501 Competitive Advantages of a Firm without Fundamental Technology: A Case Study of Sony, Casio and Nintendo
Authors: Kiyohiro Yamazaki
Abstract:
The purpose of this study is to examine how a firm without fundamental technology is able to gain competitive advantage. The paper examines three case studies: Sony in the flat display TV industry, Casio in the digital camera industry, and Nintendo in the home game machine industry. It maintains that firms without fundamental technology construct two advantages: an economic advantage and an organizational advantage. The economic advantage is that the firm can select either high-tech or cheap devices from among several device makers, and can switch between alternatives cheaply and quickly. The organizational advantage is that a firm without fundamental technology is not restricted by organizational inertia and cognitive constraints, and can exercise this as a strength.
Keywords: firm without fundamental technology, economic advantage, organizational advantage, Sony, Casio, Nintendo
Procedia PDF Downloads 288
1500 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator
Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić
Abstract:
Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator in order to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, in the end, the outcome of treatment for every single patient. Therefore, international recommendations strongly advise setting up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, performed on a daily, weekly, monthly, or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated CIRS 062QA phantom, and a QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software which enables fast and simple evaluation of CT QA parameters using the phantom provided with it. On the other hand, the recommendations contain additional tests, which were done with the CIRS phantom. Also, legislation on ionizing radiation protection requires CT testing at defined intervals. Taking into account the requirements of the law, the built-in tests of the CT simulator, and international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were the following: CT number accuracy, field uniformity, the complete CT-to-ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within +/- 5 HU of the value at commissioning; field uniformity within +/- 10 HU in selected ROIs; the complete CT-to-ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of not more than 5%; spatial and contrast resolution tests must comply with the tests obtained at commissioning, otherwise the machine requires service; the image noise test result must fall within 20% of the baseline value; slice thickness must meet manufacturer specifications; and patient table stability with longitudinal transfer of a loaded table must not differ by more than 2 mm vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of CT simulator functionality and its clinical effectiveness in radiation treatment planning. The legal requirement for the clinic is to set up its own QA programme with minimum testing, but it remains the user's decision whether additional testing, as recommended by international organizations, will be implemented, so as to improve the overall quality of the radiation treatment planning procedure, as the quality of the CT images used for treatment planning influences the delineation of a tumor and the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
Keywords: CT simulator, radiotherapy, quality control, QA programme
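A small sketch of how the tolerance limits quoted above could be checked automatically; the function and data layout are assumptions, but the numeric limits are those stated in the abstract:

```python
# QA tolerance checks using the limits stated in the abstract
CT_NUMBER_TOL_HU = 5        # +/- 5 HU vs. commissioning value
UNIFORMITY_TOL_HU = 10      # +/- 10 HU in selected ROIs
NOISE_TOL_FRACTION = 0.20   # within 20% of baseline
TABLE_TOL_MM = 2.0          # max vertical deviation of loaded table

def check(name, measured, baseline, tol, relative=False):
    dev = abs(measured - baseline)
    limit = tol * abs(baseline) if relative else tol
    status = "PASS" if dev <= limit else "FAIL - service/investigation required"
    print(f"{name}: measured={measured}, baseline={baseline} -> {status}")

check("CT number accuracy (water, HU)", 2.1, 0.0, CT_NUMBER_TOL_HU)
check("Field uniformity (ROI diff, HU)", 7.5, 0.0, UNIFORMITY_TOL_HU)
check("Image noise (SD of HU)", 6.0, 5.2, NOISE_TOL_FRACTION, relative=True)
check("Table vertical deviation (mm)", 1.4, 0.0, TABLE_TOL_MM)
```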
Procedia PDF Downloads 534
1499 Using Computer Vision and Machine Learning to Improve Facility Design for Healthcare Facility Worker Safety
Authors: Hengameh Hosseini
Abstract:
The design of large healthcare facilities (such as hospitals, multi-service line clinics, and nursing facilities) that can accommodate patients with wide-ranging disabilities is a challenging endeavor, and one that is poorly understood among healthcare facility managers, administrators, and executives. An even less-understood extension of this problem is the implications of weakly or insufficiently accommodative facility design for healthcare workers in physically intensive jobs who may also suffer from a range of disabilities and who are therefore at increased risk of workplace accident and injury. Combine this reality with the vast range of facility types, ages, and designs, and the problem of universal accommodation becomes even more daunting and complex. In this study, we focus on the implications of facility design for healthcare workers with low vision who also have physically active jobs. The points of difficulty are myriad: health service infrastructure, the equipment used in health facilities, and transport to and from appointments and other services can all pose a barrier to health care if they are inaccessible, less accessible, or even simply less comfortable for people with various disabilities. We conduct a series of surveys and interviews with employees and administrators of 7 facilities of a range of sizes and ownership models in the Northeastern United States, and combine that corpus with in-facility observations and data collection to identify five major points of failure common to all the facilities that, we concluded, could pose safety threats to employees with vision impairments, ranging from very minor to severe. We determine that lack of design empathy is a major commonality among facility management and ownership. We subsequently propose three methods for remedying this lack of empathy-informed design, and thereby the dangers posed to employees: the use of an existing open-sourced augmented reality application to simulate the low-vision experience for designers and managers; the use of a machine learning model we develop to automatically infer facility shortcomings from large datasets of recorded patient and employee reviews and feedback; and the use of a computer vision model fine-tuned on images of each facility to infer and predict facility features, locations, and workflows that could again pose meaningful dangers to visually impaired employees of each facility. After conducting a series of real-world comparative experiments with each of these approaches, we conclude that each is a viable solution under particular sets of conditions, and finally characterize the range of facility types, workforce composition profiles, and work conditions under which each of these methods would be most apt and successful.
Keywords: artificial intelligence, healthcare workers, facility design, disability, visually impaired, workplace safety
Procedia PDF Downloads 117
1498 Automatic Calibration of Agent-Based Models Using Deep Neural Networks
Authors: Sima Najafzadehkhoei, George Vega Yon
Abstract:
This paper presents an approach for calibrating Agent-Based Models (ABMs) efficiently, utilizing Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, which are a core framework in the study of epidemiology. Our method recovers parameter values from observed trajectory curves, enhancing the accuracy of predictions when compared to traditional calibration techniques. Through the use of simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one where the number of susceptible, infected, and recovered individuals is fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
Keywords: ABM, calibration, CNN, LSTM, epidemiology
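A reduced sketch of the second scenario described (only the infected curve is observed): simulated SIR trajectories train an LSTM to recover (beta, gamma). The network size, parameter ranges, and discrete-time SIR form are assumptions, not the authors' exact setup:

```python
import numpy as np
from tensorflow.keras import layers, models

def simulate_sir(beta, gamma, days=60, n=1000, i0=10):
    """Discrete-time SIR; returns the fraction infected on each day."""
    s, i = n - i0, i0
    traj = []
    for _ in range(days):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i = s - new_inf, i + new_inf - new_rec
        traj.append(i / n)
    return np.array(traj)

rng = np.random.default_rng(0)
params = rng.uniform([0.1, 0.05], [0.5, 0.2], size=(2000, 2))   # (beta, gamma) draws
X = np.stack([simulate_sir(b, g) for b, g in params])[..., None]

model = models.Sequential([
    layers.Input(shape=(60, 1)),
    layers.LSTM(64),
    layers.Dense(2),            # predicted (beta, gamma)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, params, epochs=5, batch_size=64, verbose=0)
# the trained network can then read parameters off an observed infected curve
```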
Procedia PDF Downloads 27
1497 Comparison of Tensile Strength and Folding Endurance of (FDM Process) 3D Printed ABS and PLA Materials
Authors: R. Devicharan
Abstract:
In a short span, 3D printing is expected to play a vital role in our lives. The possibilities for creativity and speed in manufacturing through various 3D printing processes are vast. This study is performed on the FDM (Fused Deposition Modelling) method of 3D printing, one of the predominant 3D printing technologies. It focuses on the physical properties of 3D printed objects, which determine their applications. The paper specifically studies the tensile strength and folding endurance of objects 3D printed by the FDM method using ABS (Acrylonitrile Butadiene Styrene) and PLA (Poly Lactic Acid) plastic materials. The study is performed in a controlled environment with specific machine settings. Appropriate tables and graphs are plotted, and research analysis techniques are used to analyse, verify, and validate the experimental results.
Keywords: FDM process, 3D printing, ABS for 3D printing, PLA for 3D printing, rapid prototyping
Procedia PDF Downloads 599
1496 Adhesion of Sputtered Copper Thin Films Deposited on Flexible Substrates
Authors: Rwei-Ching Chang, Bo-Yu Su
Abstract:
Adhesion of copper thin films deposited on polyethylene terephthalate substrate by direct current sputtering with different sputtering parameters is discussed in this work. The effects of plasma treatment of 0, 5, and 10 minutes on the thin film properties are investigated first. Various argon flow rates of 40, 50, and 60 standard cubic centimeters per minute (sccm), deposition powers of 30, 40, and 50 W, and film thicknesses of 100, 200, and 300 nm are also discussed. A 3-dimensional surface profilometer, a micro scratch machine, and an optical microscope are used to characterize the thin film properties. The results show that increasing the plasma treatment time on the polyethylene terephthalate surface affects the roughness and critical load of the films. The critical load increases as the plasma treatment time increases: when the plasma treatment time was adjusted from 5 minutes to 10 minutes, the adhesion increased from 8.20 mN to 13.67 mN. When the argon flow rate is decreased from 60 sccm to 40 sccm, the adhesion increases from 8.27 mN to 13.67 mN. The adhesion is also increased at higher power, rising from 13.67 mN to 25.07 mN as the power increases from 30 W to 50 W. The adhesion of the film increases from 13.67 mN to 21.41 mN as the film thickness increases from 100 nm to 300 nm. Comparing all the deposition parameters indicates that the changes in power and thickness give the greatest improvement in film adhesion.
Keywords: flexible substrate, sputtering, adhesion, copper thin film
Procedia PDF Downloads 131
1495 Predicting OpenStreetMap Coverage by Means of Remote Sensing: The Case of Haiti
Authors: Ran Goldblatt, Nicholas Jones, Jennifer Mannix, Brad Bottoms
Abstract:
Accurate, complete, and up-to-date geospatial information is the foundation of successful disaster management. When the 2010 Haiti Earthquake struck, accurate and timely information on the distribution of critical infrastructure was essential for the disaster response community for effective search and rescue operations. Existing geospatial datasets such as Google Maps did not have comprehensive coverage of these features. In the days following the earthquake, many organizations released high-resolution satellite imagery, catalyzing a worldwide effort to map Haiti and support the recovery operations. Of these organizations, OpenStreetMap (OSM), a collaborative project to create a free editable map of the world, used the imagery to support volunteers to digitize roads, buildings, and other features, creating the most detailed map of Haiti in existence in just a few weeks. However, large portions of the island are still not fully covered by OSM. There is an increasing need for a tool to automatically identify areas, in Haiti as well as in other countries vulnerable to disasters, that are not fully mapped. The objective of this project is to leverage different types of remote sensing measurements, together with machine learning approaches, in order to identify geographical areas where OSM coverage of building footprints is incomplete. Several remote sensing measures and derived products were assessed as potential predictors of OSM building footprint coverage, including: intensity of light emitted at night (based on VIIRS measurements), spectral indices derived from the Sentinel-2 satellite (normalized difference vegetation index (NDVI), normalized difference built-up index (NDBI), soil-adjusted vegetation index (SAVI), urban index (UI)), surface texture (based on Sentinel-1 SAR measurements), and elevation and slope. Additional remote sensing derived products, such as Hansen Global Forest Change, DLR's Global Urban Footprint (GUF), and World Settlement Footprint (WSF), were also evaluated as predictors, as well as the OSM street and road network (including junctions). Using a supervised classification with a random forest classifier resulted in the prediction of 89% of the variation of OSM building footprint area in a given cell. These predictions allowed for the identification of cells that are predicted to be covered but are actually not mapped yet. With these results, this methodology could be adapted to any location to assist with preparing for future disastrous events and to ensure that essential geospatial information is available to support response and recovery efforts during and following major disasters.
Keywords: disaster management, Haiti, machine learning, OpenStreetMap, remote sensing
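A compact sketch of the supervised random forest step described; the feature columns follow the predictors listed above, but the data is a randomly generated stand-in, not the study's:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# per-cell predictors: night lights, NDVI, NDBI, SAVI, UI, SAR texture, elevation, slope
X = rng.random((3000, 8))
y = 2.0 * X[:, 0] + X[:, 2] + rng.normal(0, 0.1, 3000)  # stand-in for OSM footprint area

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 (share of variation explained):", rf.score(X_te, y_te))
# cells with high predicted footprint but little mapped area flag likely OSM gaps
```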
Procedia PDF Downloads 125
1494 A Joinpoint Regression Analysis of Trends in Tuberculosis Notifications in Two Urban Regions in Namibia
Authors: Anna M. N. Shifotoka, Richard Walker, Katie Haighton, Richard McNally
Abstract:
An analysis of trends in Case Notification Rates (CNR) can be used to monitor the impact of Tuberculosis (TB) control interventions over time in order to inform the implementation of current and future TB interventions. A retrospective analysis of trends in TB CNR for two urban regions in Namibia, namely the Khomas and Erongo regions, was conducted. TB case notification data were obtained from annual TB reports of the national TB programme, Ministry of Health and Social Services, covering the period from 1997 to 2015. Joinpoint regression was used to analyse trends in CNR for different types of TB groups. A trend was considered statistically significant when the p-value was less than 0.05. During the period under review, the crude CNR for all forms of TB declined from 808 to 400 per 100 000 population in Khomas, and from 1051 to 611 per 100 000 population in Erongo. In both regions, significant change points in trends were observed for all types of TB groups examined. In the Khomas region, the trend for new smear positive pulmonary TB increased significantly by an annual rate of 4.1% (95% Confidence Interval (CI): 0.3% to 8.2%) during the period 1997 to 2004, and thereafter declined significantly by -6.2% (95%CI: -7.7% to -4.3%) per year until 2015. Similarly, the trend for smear negative pulmonary TB increased significantly by 23.7% (95%CI: 9.7 to 39.5) per year from 1997 to 2004 and thereafter declined significantly by an annual change of -26.4% (95%CI: -33.1% to -19.8%). The trend for all forms of TB CNR in the Khomas region increased significantly by 8.1% (95%CI: 3.7 to 12.7) per year from 1997 to 2004 and thereafter declined significantly at a rate of -8.7% (95%CI: -10.6 to -6.8). In the Erongo region, the trend for smear positive pulmonary TB increased at a rate of 1.2% (95%CI: -1.2% to 3.6%) annually during the earlier years (1997 to 2008), and thereafter declined significantly by -9.3% (95%CI: -13.3% to -5.0%) per year from 2008 to 2015. Also in Erongo, the trend for all forms of TB CNR increased significantly by an annual rate of 4.0% (95%CI: 1.4% to 6.6%) during the years 1997 to 2006 and thereafter declined significantly by -10.4% (95%CI: -12.7% to -8.0%) per year during 2006 to 2015. The trend for extra-pulmonary TB CNR declined but did not reach statistical significance in either region. In conclusion, CNRs declined for all types of TB examined in both regions. Further research is needed to study trends for other TB dimensions, such as treatment outcomes and notification of drug-resistant TB cases.
Keywords: epidemiology, Namibia, temporal trends, tuberculosis
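A minimal sketch of the joinpoint idea used above: fit log-linear segments around a candidate change point and report each segment's annual percent change (APC). The single-joinpoint grid search and NumPy formulation are simplifications, not the study's exact software:

```python
import numpy as np

def joinpoint_fit(years, rates):
    """Pick the single joinpoint minimizing total squared error of two
    log-linear fits; return (joinpoint year, APC before, APC after)."""
    y = np.log(rates)
    best = None
    for k in range(2, len(years) - 2):                    # candidate joinpoints
        sse, slopes = 0.0, []
        for seg in (slice(0, k + 1), slice(k, len(years))):
            A = np.vstack([years[seg], np.ones(len(years[seg]))]).T
            coef, res, *_ = np.linalg.lstsq(A, y[seg], rcond=None)
            slopes.append(coef[0])
            sse += res[0] if res.size else 0.0
        if best is None or sse < best[0]:
            best = (sse, years[k], slopes)
    _, jp, (b1, b2) = best
    apc = lambda b: (np.exp(b) - 1) * 100                 # annual percent change
    return jp, apc(b1), apc(b2)

# synthetic series shaped like the Khomas all-forms trend: rise, then decline
t = np.arange(19.0)
log_r = np.where(t <= 7, 0.08 * t, 0.56 - 0.09 * (t - 7))
print(joinpoint_fit(np.arange(1997, 2016), 800 * np.exp(log_r)))
# -> joinpoint near 2004, APC about +8.3% before and -8.6% after
```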
Procedia PDF Downloads 154
1493 Tracking the Mind's Mouth: Use of Smart Technology for Effective Teaching of Speaking to Pupils with Developmental Co-ordination Disorder
Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Ayah Al Yaari, Ayman Al Yaari, Montaha Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Fatehi Eissa
Abstract:
Developmental co-ordination disorder (DCD, also known as dyspraxia) causes a child to speak less well than expected in social conversations. We propose that smart speaking technology could help improve the sound production mechanism at both the phonetic and phonological levels, leading to better articulation of utterances. The participants are twelve privately taught beginner pupils, aged between 6 and 12 years old and diagnosed with DCD (apraxia), divided into two groups: an experimental group (n=6) and a control group (called the apraxic control group) (n=6). A total of fifty typically developing and achieving (TD) pupils participated as a second control group and were preassigned into two groups (27 pupils with the treatment group and 23 with the apraxic control group). Weekly quizzes were given to all participants for four continuous months, and the results were analyzed by psychoneurolinguists and a statistician. Although taught by the same speech-language therapist (SLT), the treatment group, along with its TD subgroup, took a full-time speaking course with sociolinguistic themes covering both phonetic and phonological properties. The course lasted a whole semester, during which smart speaking aids were dominant, while the apraxic control group and its TD subgroup did not use them. Compared with the apraxic control group and its TD subgroup, the results show obvious changes in the speaking behavioral mechanism of the DCD experimental group and its TD subgroup. Improvement can be seen in the scores: zero marks disappeared in the fourth week (the end of the first month of treatment). Good marks (5+/10) were seen starting from the eighth week, culminating in week 15 of treatment, when some participants scored full marks. This study concludes in support of the primacy of smart educational technology for speaking purposes and also shows that such aids can expand the range of academic performance categories. Further research is required to evaluate the current demonizing of smart educational aids and to weigh more reasonably the specific relationship that speaking aids can offer to other language skills, as well as their limitations.
Keywords: smart educational technology, speaking aids, pupils with DCD, apraxia
Procedia PDF Downloads 52
1492 Verification of Geophysical Investigation during Subsea Tunnelling in Qatar
Authors: Gary Peach, Furqan Hameed
Abstract:
Musaimeer outfall tunnel is one of the longest storm water tunnels in the world, with a total length of 10.15 km. The tunnel will accommodate surface and rain water received from the drainage networks of 270 km of urban areas in southern Doha, with a pumping capacity of 19.7 m³/sec. The tunnel is excavated by a Tunnel Boring Machine (TBM) through the Rus Formation, Midra Shales, and Simsima Limestone. Water inflows at high pressure, complex mixed ground, and weaker ground strata prone to karstification, with vertical and lateral fractures connected to the sea bed, were also encountered during mining. In addition to the pre-tender geotechnical investigations, the Contractor carried out a supplementary offshore geophysical investigation in order to fine-tune the existing results of the geophysical and geotechnical investigations. Electric resistivity tomography (ERT) and seismic reflection surveys were carried out. The offshore geophysical survey was performed, and interpretations of rock mass conditions were made, to provide an overall picture of underground conditions along the tunnel alignment. This allowed the critical tunnelling areas and cutter head interventions to be planned accordingly. Karstification was monitored with a non-intrusive radar system installed on the TBM. The Boring Electric Ahead Monitoring (BEAM) system was installed at the cutter head and was able to predict the rock mass up to 3 tunnel diameters ahead of the cutter head. The BEAM system was provided with an online facility for real-time monitoring of rock mass conditions, which were then correlated with the rock mass conditions predicted during the interpretation phase of the offshore geophysical surveys. Further correlation was carried out using samples of the rock mass taken from tunnel face inspections and excavated material produced by the TBM. The BEAM data was continuously monitored to check variations in the resistivity and percentage frequency effect (PFE) of the ground. This system provided information about rock mass conditions, potential karst risk, and potential water inflow. The BEAM system was found to be more than 50% accurate in picking up the difficult ground conditions and faults predicted in the geotechnical interpretative report before the start of tunnelling operations. Upon completion of the project, it was concluded that the combined use of different geophysical investigation results allows the execution stage to be carried out with more confidence and less geotechnical risk. The approach used for the prediction of rock mass conditions in the Geotechnical Interpretative Report (GIR) and in the seismic reflection and electric resistivity tomography (ERT) surveys was concluded to be reliable, as the same rock mass conditions were encountered during tunnelling operations.
Keywords: tunnel boring machine (TBM), subsea, karstification, seismic reflection survey
Procedia PDF Downloads 250
1491 Using Single Decision Tree to Assess the Impact of Cutting Conditions on Vibration
Authors: S. Ghorbani, N. I. Polushin
Abstract:
Vibration during the machining process is crucial since it affects the cutting tool, machine, and workpiece, leading to tool wear, tool breakage, and unacceptable surface roughness. This paper applies a nonparametric statistical method, the single decision tree (SDT), to identify factors affecting vibration in the machining process. Workpiece material (AISI 1045 steel, AA2024 aluminum alloy, A48-class30 gray cast iron), cutting tool (conventional, cutting tool with holes in the toolholder, cutting tool filled with epoxy-granite), tool overhang (41-65 mm), spindle speed (630-1000 rpm), feed rate (0.05-0.075 mm/rev) and depth of cut (0.05-0.15 mm) were used as input variables, while vibration was the output parameter. It is concluded that workpiece material is the most important parameter for natural frequency, followed by cutting tool and overhang.
Keywords: cutting condition, vibration, natural frequency, decision tree, CART algorithm
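A brief sketch of the single-decision-tree analysis described, using scikit-learn's CART implementation; the generated data is a placeholder for the machining measurements, with variable ranges taken from the abstract:

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 500
X = pd.DataFrame({
    "material": rng.integers(0, 3, n),     # 0=steel, 1=aluminum, 2=cast iron (encoded)
    "tool_type": rng.integers(0, 3, n),
    "overhang_mm": rng.uniform(41, 65, n),
    "speed_rpm": rng.uniform(630, 1000, n),
    "feed_mm_rev": rng.uniform(0.05, 0.075, n),
    "depth_mm": rng.uniform(0.05, 0.15, n),
})
# stand-in response: natural frequency dominated by material, tool, overhang
y = 100 * X["material"] + 20 * X["tool_type"] + X["overhang_mm"] + rng.normal(0, 5, n)

tree = DecisionTreeRegressor(max_depth=4).fit(X, y)   # CART algorithm
for name, imp in zip(X.columns, tree.feature_importances_):
    print(f"{name}: {imp:.2f}")   # relative importance of each cutting condition
```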
Procedia PDF Downloads 337
1490 Uplift Modeling Approach to Optimizing Content Quality in Social Q/A Platforms
Authors: Igor A. Podgorny
Abstract:
TurboTax AnswerXchange is a social Q/A system supporting users working on federal and state tax returns. Content quality and popularity in the AnswerXchange can be predicted with propensity models using attributes of the question and answer. Using uplift modeling, we identify features of questions and answers that can be modified during the question-asking and question-answering experience in order to optimize AnswerXchange content quality. We demonstrate that adding details to questions always results in increased question popularity, which can be used to promote good quality content. Responding to close-ended questions assertively improves content quality in the AnswerXchange in 90% of cases. Answering knowledge questions with web links increases the likelihood of receiving a negative vote from 60% of the askers. Our findings provide a rationale for employing the uplift modeling approach in AnswerXchange operations.
Keywords: customer relationship management, human-machine interaction, text mining, uplift modeling
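One common way to realize the uplift modeling described above is the two-model approach sketched here; the abstract does not specify the authors' exact method, and the features and data are placeholders:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 4000
X = rng.random((n, 3))                      # question/answer attributes
treated = rng.integers(0, 2, n)             # e.g., 1 = details were added to the question
p = 0.3 + 0.3 * treated * X[:, 0]           # treatment helps more for some questions
y = (rng.random(n) < p).astype(int)         # 1 = post became popular / good quality

m_t = LogisticRegression().fit(X[treated == 1], y[treated == 1])
m_c = LogisticRegression().fit(X[treated == 0], y[treated == 0])

# uplift = expected gain in quality from applying the treatment to this post
uplift = m_t.predict_proba(X)[:, 1] - m_c.predict_proba(X)[:, 1]
print("mean estimated uplift:", uplift.mean())
```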
Procedia PDF Downloads 244
1489 Peace through Language Policy as a Solution to the Ethnic Conflict in Sri Lanka
Authors: R. M. W. Rajapakshe
Abstract:
Sri Lanka, which is officially called the Democratic Socialist Republic of Sri Lanka, is an island nation situated near India. It is a multi-lingual, multi-religious and multi-ethnic country, where the Sinhalese form the majority and the Tamils form the largest ethnic minority. The composition of the population (on an ethnic basis) in Sri Lanka is as follows: Sinhalese: 74.5%, Tamil (Sri Lankan): 12.6%, Muslim: 7.5%, Tamil (Indian): 5.5%, Malay: 0.3%, Burgher: 0.3%, other: 0.2%. The Tamil people use the Tamil language as their mother tongue and the Sinhala people use the Sinhala language as their mother tongue. Very few people in either community use English as their mother tongue; however, a large number of people use English as a second language. The Sinhala language was declared the only official language in Sri Lanka in 1959. However, this was not acceptable to Tamil politicians or to the common Tamil people, and it was the beginning of a long-standing ethnic crisis which later became a military war in which a lot of blood was shed. As a solution to the above ethnic crisis, the thirteenth amendment to the constitution of Sri Lanka was introduced in 1987; according to it, both Sinhala and Tamil were declared official languages, with English as the link language in Sri Lanka. Thus, a new programme, namely the second language teaching programme, under which Sinhala is taught to Tamil students and Tamil to Sinhala students, was introduced at government schools. Language teaching includes knowledge of the culture of the target language. As all cultures are mixed and have common features, students reduced their enmity toward the other community and learned to respect the other culture. On the other hand, as all languages are mixed, students came to understand that there are no pure languages; thus, they learned to respect the other language. In the case of Sri Lanka, the Sinhala language is mixed with the Tamil language and vice versa. Thus, the development of second language teaching is a prominent way to solve the above ethnic problem, and this study clearly shows it. However, the programme suffers from a lack of trained second language teachers, poor infrastructure facilities, and insufficient funds, which can be considered the main obstacles to developing the second language teaching programme. Yet, there are no satisfactory answers to those problems. The data were collected from relevant books, articles and other documents based on research, and from forty-five recordings, each of one hour duration, of natural conversations covering all factions of the Sinhala community.
Keywords: ethnic crisis, official language, second language teaching, Sinhala, Tamil
Procedia PDF Downloads 346
1488 LED Lighting Interviews and Assessment in Forest Machines
Authors: Rauno Pääkkönen, Fabriziomaria Gobba, Leena Korpinen
Abstract:
The objective of the study is to assess the implementation of LED lighting in forest machine work in the dark. In addition, the paper covers a wide variety of important and relevant safety and health parameters. In modern, computerized work in the cab of forest machines, artificial illumination is a demanding task when performing duties such as the visual inspection of wood and computer calculations. We interviewed entrepreneurs and gathered the following as the most pertinent themes: (1) safety, (2) practical problems, and (3) work with LED lighting. The most important comments concerned the practical problems of LED lighting. We found indications of technical problems in implementing LED lighting, like snow and dirt on the surfaces of lamps that dim the emission of light. Moreover, service work in the dark forest is dangerous and increases the risk of on-site accidents. We also concluded that the amount of blue light reaching the eyes should be assessed, especially when drivers are working in a semi-dark cab.
Keywords: forest machines, health, LED, safety
Procedia PDF Downloads 432
1487 The Turkish Version of the Carer's Assessment of Satisfaction Index (CASI-TR): Its Cultural Adaptation, Validation, and Reliability
Authors: Cemile Kütmeç Yilmaz, Güler Duru Asiret, Gulcan Bagcivan
Abstract:
The aim of this study was to evaluate the reliability and validity of the Turkish version of the Carer's Assessment of Satisfaction Index (CASI-TR). The study was conducted between June 2016 and September 2017 at the Training and Research Hospital of Aksaray University with caregiving family members of inpatients with chronic diseases. The sample size was calculated as at least 10 individuals for each item (number of items (30) × 10 = 300). The study sample included 300 caregiving family members who had provided primary care for at least three months for a patient (who had at least one chronic disease and received inpatient treatment in general internal medicine and palliative care units). Data were collected using a demographic questionnaire and the CASI-TR. Descriptive statistics and psychometric tests were used for the data analysis. Of those caregivers, 76.7% were female, 86.3% were 65 years old or younger, 43.7% were primary school graduates, 87% were married, 86% were not working, 66.3% were housewives, and 60.3% defined their income status as having an income covering one's expenses. Care recipients often had problems with walking, sleep, balance, feeding, and urinary incontinence. The Cronbach alpha value calculated for the CASI-TR (30 items) was 0.949. Internal consistency coefficients calculated for the subscales were: 0.922 for the subscale of 'caregiver satisfaction related to care recipient', 0.875 for the subscale of 'caregiver satisfaction related to themselves', and 0.723 for the subscale of 'dynamics of interpersonal relations'. Factor analysis revealed that three factors accounted for 57.67% of the total variance, with eigenvalues > 1. Assessed in terms of significance, we saw that the items came together in a meaningful manner. The factor loads of the items were between 0.311 and 0.874. These results show that the CASI-TR is a valid and reliable scale. The Turkish adaptation of the CASI was found to be reliable and valid for assessing the satisfaction of caregivers. The CASI-TR can be used easily in clinics or house visits by nurses and other health professionals to assess caregiver satisfaction from caregiving.
Keywords: carer's assessment of satisfaction index, caregiver, validity, reliability
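A small sketch of the internal-consistency statistic reported above: Cronbach's alpha computed from a respondents-by-items score matrix (the demo data is random, not the study's):

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, n_items) matrix of item responses."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))                      # 300 respondents
items = latent + rng.normal(scale=0.7, size=(300, 30))  # 30 correlated items
print(f"alpha = {cronbach_alpha(items):.3f}")           # high alpha, as for CASI-TR (0.949)
```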
Procedia PDF Downloads 206
1486 Different Approaches to Teaching a Database Course to Undergraduate and Graduate Students
Authors: Samah Senbel
Abstract:
Database design is a fundamental part of the computer science and information technology curricula in any school, as well as in the study of management, business administration, and data analytics. In this study, we compare the performance of two groups of students studying the same database design and implementation course at Sacred Heart University in the fall of 2018. Both courses used the same textbook and were taught by the same professor, one for seven graduate students and one for 26 undergraduate students (juniors). The undergraduate students were around 20 years old with little work experience, while the graduate students averaged 35 years old and were all employed in computer-related or management-related jobs. The textbook used was 'Database Systems, Design, Implementation, and Management' by Coronel and Morris, and the course was designed to follow the textbook at roughly a chapter per week. The first 6 weeks covered the design aspect of a database, followed by a paper exam. The next 6 weeks covered the implementation aspect of the database using SQL, followed by a lab exam. Since the undergraduate students are on a 16-week semester, we spent the last three weeks of the course covering NoSQL. This part of the course was not included in this study. After the course was over, we analyzed the results of the two groups of students. An interesting discrepancy was observed: in the database design part of the course, the average grade of the graduate students was 92%, while that of the undergraduate students was 77% for the same exam. In the implementation part of the course, we observed the opposite: the average grade of the graduate students was 65%, while that of the undergraduate students was 73%. The overall grades were quite similar: the graduate average was 78% and the undergraduate average was 75%. Based on these results, we concluded that having both classes follow the same schedule was not beneficial, and an adjustment was needed. The graduates could spend less time on design, and the undergraduates would benefit from more design time. In the fall of 2019, 30 students registered for the undergraduate course and 15 students registered for the graduate course. To test our conclusion, the undergraduates spent about 67% of the time (eight classes) on the design part of the course and 33% (four classes) on the implementation part, using the exact same exams as the previous year. This resulted in an improvement in their average grade on the design part from 77% to 83%, and in their implementation average grade from 73% to 79%. In conclusion, we recommend using two separate schedules for teaching the database design course. For undergraduate students, it is important to spend more time on the design part rather than the implementation part of the course, while for the older graduate students, we recommend spending more time on the implementation part, as that seems to be the part they struggle with, even though they have a higher understanding of the design component of databases.
Keywords: computer science education, database design, graduate and undergraduate students, pedagogy
Procedia PDF Downloads 123
1485 Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real-world images within the classical image processing and machine vision framework have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has been adapted to the statistics of real-world images through the evolutionary process. There are two well-known, successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. Also, we investigate the response of the feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the phase selectivity of the feature detectors in both models is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
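A short sketch of the ICA stage described above, using scikit-learn's FastICA on flattened image patches; the patch size, component count, and random stand-in data are assumptions:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
# stand-in for 16x16 patches sampled from natural images (flattened, zero-mean)
patches = rng.standard_normal((5000, 256))
patches -= patches.mean(axis=0)

ica = FastICA(n_components=64, random_state=0, max_iter=500)
ica.fit(patches)
filters = ica.components_.reshape(64, 16, 16)
# on real natural-image patches these filters become localized, oriented,
# Gabor-like feature detectors whose phase and orientation tuning can then
# be probed with grating stimuli, as in the paper
```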
Procedia PDF Downloads 340
1484 Leaching of Copper from Copper Ore Using Sulphuric Acid in the Presence of Hydrogen Peroxide as an Oxidizing Agent: An Optimized Process
Authors: Hilary Rutto
Abstract:
Acids are the reagents most commonly used to leach copper ions from copper ores. It is important that the process conditions are optimized to improve the leaching efficiency. In the present study, the effects of pH, oxidizing agent (hydrogen peroxide), stirring speed, solid-to-liquid ratio, and acid concentration on the leaching of copper ions from the ore were investigated using a pH-stat apparatus. Copper ions were analyzed at the end of each experiment using atomic absorption spectroscopy (AAS). Results showed that leaching efficiency improved with an increase in acid concentration, stirring speed, oxidizing agent, and pH, and decreased with an increase in the solid-to-liquid ratio.
Keywords: leaching, copper, oxidizing agent, pH stat apparatus
Procedia PDF Downloads 378
1483 Analysis of Influencing Factors on Infield-Logistics: A Survey of Different Farm Types in Germany
Authors: Michael Mederle, Heinz Bernhardt
Abstract:
The management of machine fleets and autonomous vehicle control will considerably increase efficiency in future agricultural production. Especially entire process chains, e.g. harvesting complexes with several interacting combine harvesters, grain carts, and removal trucks, provide a lot of optimization potential. Organization and pre-planning make these efficiency reserves accessible. One way to achieve this is to optimize infield path planning. Particularly, autonomous machinery requires precise specifications about infield logistics to be navigated effectively and process-optimized in the fields, individually or in machine complexes. In the past, a lot of theoretical optimization has been done regarding infield logistics, mainly based on field geometry. However, there are reasons why farmers often do not apply the infield strategy suggested by mathematical route planning tools. To make computational optimization more useful for farmers, this study focuses on these influencing factors through expert interviews. As a result, practice-oriented navigation not only to the field but also within the field will become possible. The survey study is intended to cover the entire range of German agriculture: rural mixed farms with simple technology equipment are considered as well as large agricultural cooperatives that farm thousands of hectares using track guidance and various other electronic assistance systems. First results show that farm managers using guidance systems increasingly align their infield logistics with direction-giving obstacles such as power lines. In consequence, they can avoid inefficient boom flips when doing plant protection with the sprayer. Livestock farmers rather focus on the application of organic manure, with its specific requirements concerning road conditions, landscape terrain, or field access points. Cultivation of sugar beets makes great demands on infield patterns because of particularities such as the row crop system and high logistics demands. Furthermore, several machines working in the same field simultaneously influence each other, regardless of whether or not they are of the same type. Specific infield strategies are always based on the interaction of several different influences and decision criteria. Single working steps like tillage, seeding, plant protection, or harvest mostly cannot each be considered individually; the entire production process has to be taken into consideration to determine the right infield logistics. One long-term objective of this examination is to integrate the identified influences on infield strategies as decision criteria into an infield navigation tool. In this way, path planning will become more practical for farmers, which is a basic requirement for automatic vehicle control and increased process efficiency.
Keywords: autonomous vehicle control, infield logistics, path planning, process optimizing
Procedia PDF Downloads 233
1482 Assessment of Radiation Protection Measures in Diagnosis and Treatment: A Critical Review
Authors: Buhari Samaila, Buhari Maidamma
Abstract:
Background: The use of ionizing radiation in medical diagnostics and treatment is indispensable for accurate imaging and effective cancer therapies. However, radiation exposure carries inherent risks, necessitating strict protection measures to safeguard both patients and healthcare workers. This review critically examines the existing radiation protection measures in diagnostic radiology and radiotherapy, highlighting technological advancements, regulatory frameworks, and challenges. Objective: The objective of this review is to critically evaluate the effectiveness of current radiation protection measures in diagnostic and therapeutic radiology, focusing on minimizing patient and staff exposure to ionizing radiation while ensuring optimal clinical outcomes, and to propose future directions for improvement. Method: A comprehensive literature review was conducted, covering scientific studies, regulatory guidelines, and international standards on radiation protection in both diagnostic radiology and radiotherapy. Emphasis was placed on ALARA principles, dose optimization techniques, and protective measures for both patients and healthcare workers. Results: Radiation protection measures in diagnostic radiology include the use of shielding devices, minimizing exposure times, and employing advanced imaging technologies to reduce dose. In radiotherapy, accurate treatment planning and image-guided techniques enhance patient safety, while shielding and dose monitoring safeguard healthcare personnel. Challenges such as limited infrastructure in low-income settings and gaps in healthcare worker training persist, impacting the overall efficacy of protection strategies. Conclusion: While significant advancements have been made in radiation protection, challenges remain in optimizing safety, especially in resource-limited settings. Future efforts should focus on enhancing training, investing in advanced technologies, and strengthening regulatory compliance to ensure continuous improvement in radiation safety practices.
Keywords: radiation protection, diagnostic radiology, radiotherapy, ALARA, patient safety, healthcare worker safety
Procedia PDF Downloads 27
1481 A Simplified, Low-Cost Mechanical Design for an Automated Motorized Mechanism to Clean Large Diameter Pipes
Authors: Imad Khan, Imran Shafi, Sarmad Farooq
Abstract:
Large diameter pipes, barrels, tubes, and ducts are used in a variety of applications covering civil and defense-related technologies. These may include heating/cooling networks, sign poles, bracing, casing, and artillery and tank gun barrels. These large diameter assemblies require regular inspection and cleaning to increase their life and reduce replacement costs. This paper describes the design, development, and testing results of an efficient yet simplified, low-maintenance mechanical design, controlled with minimal essential electronics using an electric motor, for operation by non-technical staff. The proposed solution provides a simplified user interface and an automated cleaning mechanism that requires a single user to optimally clean pipes and barrels in the 105 mm to 203 mm caliber range. The proposed system employs linear motion of a specially designed brush along the barrel, using a chain of suitable strength and a pulley anchor attached to both ends of the barrel. A specially designed and manufactured gearbox is coupled with an AC motor to move the contact brush with the high torque needed for efficient cleaning. The suitably powered AC motor is fixed to the front adapter mounted on the muzzle side, whereas the rear adapter has a pulley-based anchor mounted towards the breech block in the case of a gun barrel. A large-surface brush with a mix of soft nylon and hard copper bristles is connected through a strong steel chain to the motor and the anchor pulley. The system is equipped with limit switches that automatically switch the direction of travel when either end is reached. The testing results, based on carefully established performance indicators, indicate the superiority of the proposed user-friendly cleaning mechanism vis-à-vis its life cycle cost.
Keywords: pipe cleaning mechanism, limiting switch, pipe cleaning robot, large pipes
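A toy simulation of the auto-reversing control logic described above (positions, step counts, and the polling loop are invented for illustration; no real motor-controller API is assumed):

```python
import itertools

def run_cleaning_cycle(length_steps=100, passes=4):
    """Simulated auto-reversing brush: limit switches sit at positions 0 and length_steps."""
    pos, direction, completed = 0, +1, 0
    for _ in itertools.count():
        pos += direction                          # the motor moves the brush one step
        if pos >= length_steps or pos <= 0:       # a limit switch closes at either end
            direction = -direction                # the controller reverses the motor
            completed += 1
            if completed >= passes:
                return completed

print(run_cleaning_cycle())   # -> 4 full passes along the barrel
```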
Procedia PDF Downloads 111
1480 Statistically Significant Differences of Carbon Dioxide and Carbon Monoxide Emission in Photocopying Process
Authors: Kiurski S. Jelena, Kecić S. Vesna, Oros B. Ivana
Abstract:
Experimental results confirmed the temporal variation of carbon dioxide and carbon monoxide concentrations during the working shift of the photocopying process in a small photocopying shop in Novi Sad, Serbia. Statistically significant differences in the target gases were examined with two-way analysis of variance without replication, followed by Scheffe's post hoc test. Statistically significant differences were found for carbon monoxide emission, as indicated by F-values (12.37 and 31.88) greater than Fcrit (6.94), in contrast to carbon dioxide emission (F-values of 1.23 and 3.12 were less than Fcrit). Scheffe's post hoc test indicated that sampling point A (near the photocopier machine) and the second time interval contribute the most to carbon monoxide emission.
Keywords: analysis of variance, carbon dioxide, carbon monoxide, photocopying indoor, Scheffe's test
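A NumPy sketch of the two-way ANOVA without replication used above, with sampling points as rows and time intervals as columns; the demo matrix is invented, and only the layout follows the abstract:

```python
import numpy as np

def two_way_anova_no_rep(table):
    """table: r x c matrix, one observation per (row factor, column factor) cell."""
    table = np.asarray(table, dtype=float)
    r, c = table.shape
    grand = table.mean()
    ss_rows = c * ((table.mean(axis=1) - grand) ** 2).sum()
    ss_cols = r * ((table.mean(axis=0) - grand) ** 2).sum()
    ss_err = ((table - grand) ** 2).sum() - ss_rows - ss_cols
    df_r, df_c, df_e = r - 1, c - 1, (r - 1) * (c - 1)
    f_rows = (ss_rows / df_r) / (ss_err / df_e)   # F for the row factor
    f_cols = (ss_cols / df_c) / (ss_err / df_e)   # F for the column factor
    return f_rows, f_cols

# rows: sampling points, columns: time intervals (one CO reading per cell)
co = np.array([[4.1, 5.0, 6.2, 5.8],
               [2.0, 2.4, 2.9, 2.7],
               [1.8, 2.1, 2.5, 2.2]])
print(two_way_anova_no_rep(co))   # compare each F against Fcrit for its (df, df_e)
```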
Procedia PDF Downloads 328
1479 Recursive Parametric Identification of a Doubly Fed Induction Generator-Based Wind Turbine
Authors: A. El Kachani, E. Chakir, A. Ait Laachir, A. Niaaniaa, J. Zerouaoui
Abstract:
This document presents an adaptive controller based on recursive parametric identification applied to a wind turbine based on the doubly fed induction generator (DFIG), to compensate for faults and guarantee efficient operation of the DFIG. The proposed adaptive controller is based on the recursive least squares algorithm, which takes as the best estimate of the parameter vector the vector x minimizing a quadratic criterion. Furthermore, this method can improve the rapidity and precision of a model-based controller. The proposed controller is validated via simulation on a 5.5 kW DFIG-based wind turbine. The results obtained seem to be good. In addition, they show the advantages of an adaptive controller based on the recursive least squares algorithm.
Keywords: adaptive controller, recursive least squares algorithm, wind turbine, doubly fed induction generator
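A self-contained sketch of the recursive least squares update at the heart of the identification scheme; the forgetting factor, regressor layout, and demo system are assumptions, not the paper's:

```python
import numpy as np

class RecursiveLeastSquares:
    """x is updated so it keeps minimizing a quadratic criterion over past data."""
    def __init__(self, n_params, lam=0.98, delta=100.0):
        self.x = np.zeros(n_params)          # parameter estimate
        self.P = delta * np.eye(n_params)    # inverse correlation matrix
        self.lam = lam                       # forgetting factor

    def update(self, phi, y):
        phi = np.asarray(phi, dtype=float)
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)   # gain vector
        self.x = self.x + k * (y - phi @ self.x)             # correction step
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return self.x

# identify a toy 2-parameter system y = 1.5*u1 - 0.7*u2 + noise
rng = np.random.default_rng(0)
rls = RecursiveLeastSquares(2)
for _ in range(500):
    phi = rng.standard_normal(2)
    y = phi @ np.array([1.5, -0.7]) + 0.01 * rng.standard_normal()
    rls.update(phi, y)
print(rls.x)   # close to [1.5, -0.7]
```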
Procedia PDF Downloads 292
1478 Conditions for Model Matching of Switched Asynchronous Sequential Machines with Output Feedback
Authors: Jung–Min Yang
Abstract:
Solvability of the model matching problem for input/output switched asynchronous sequential machines is discussed in this paper. The control objective is to determine the existence condition and design algorithm for a corrective controller that can match the stable-state behavior of the closed-loop system to that of a reference model. Switching operations and correction procedures are incorporated using output feedback so that the controlled switched machine can show the desired input/output behavior. A matrix expression is presented to address reachability of switched asynchronous sequential machines with output equivalence with respect to a model. The presented reachability condition for the controller design is validated in a simple example.
Keywords: asynchronous sequential machines, corrective control, model matching, input/output control
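The abstract mentions a matrix expression for reachability; purely as a generic illustration (not the paper's formulation), state-to-state reachability of a finite machine can be checked with powers of its adjacency matrix:

```python
import numpy as np

def reachability(adj, max_steps):
    """adj[i, j] = 1 if the machine can move from state i to state j in one transition.
    Returns R with R[i, j] = 1 if state j is reachable from i within max_steps steps."""
    n = len(adj)
    A = (np.asarray(adj) != 0).astype(int)
    R = np.eye(n, dtype=int)
    for _ in range(max_steps):
        R = ((R + R @ A) > 0).astype(int)
    return R

# 4-state example: 0 -> 1 -> 2 -> 3, plus a switch-like jump 1 -> 3
adj = np.array([[0, 1, 0, 0],
                [0, 0, 1, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 0]])
print(reachability(adj, 3)[0])   # states reachable from state 0
```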
Procedia PDF Downloads 344
1477 Sentiment Analysis of Consumers' Perceptions on Social Media about the Main Mobile Providers in Jamaica
Authors: Sherrene Bogle, Verlia Bogle, Tyrone Anderson
Abstract:
In recent years, organizations have become increasingly interested in analyzing social media as a means of gaining meaningful feedback about their products and services. An aspect-based sentiment analysis approach is used to predict sentiment for Twitter datasets for Digicel and Lime, the main mobile companies in Jamaica, using supervised learning classification techniques. The results indicate an average accuracy of 82.2 percent in classifying tweets when comparing three separate classification algorithms against the purported baseline of 70 percent, and an average root mean squared error of 0.31. These results indicate that analyzing sentiment on social media in order to gain customer feedback can be a viable solution for mobile companies looking to improve business performance.
Keywords: machine learning, sentiment analysis, social media, supervised learning
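A minimal supervised sentiment-classification sketch in the spirit of the study; the tiny labelled tweet set and the TF-IDF/linear-SVM pipeline are illustrative stand-ins, since the abstract does not name the three algorithms compared:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

tweets = ["great service from my provider",
          "dropped calls all day, terrible network",
          "fast data speeds, very happy",
          "worst customer support ever",
          "love the new plan",
          "network down again, so frustrating"]
labels = [1, 0, 1, 0, 1, 0]   # 1 = positive, 0 = negative (human-labelled)

clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(tweets, labels)
print(clf.predict(["my data plan is great", "signal is terrible today"]))
```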
Procedia PDF Downloads 446