Search results for: radiation processing
4440 Neutron Contamination in 18 MV Medical Linear Accelerator
Authors: Onur Karaman, A. Gunes Tanir
Abstract:
Photon radiation therapy is one of the most important methods used to treat cancer. However, the beam collimator materials in a linear accelerator (LINAC) head generally contain heavy elements, and the interaction of bremsstrahlung photons with such heavy nuclei can produce neutrons inside the treatment room. In radiation therapy, this neutron contamination contributes to the risk of secondary malignancies in patients and in the physicians working in this field. Since neutrons are more dangerous than photons, it is important to determine the neutron dose during radiotherapy treatment. This study analyses the effect of field size, distance from the axis, and depth on the amount of in-field and out-of-field neutron contamination for an Elekta Vmat accelerator with 18 MV nominal energy. The photon spectra at distances of 75, 150, 225, and 300 cm from the target and at the beam isocenter were scored for 5x5, 10x10, 20x20, 30x30, and 40x40 cm² fields. The results demonstrate that the neutron spectra and dose depend on field size and distance; beyond 225 cm from the isocenter, the dependence of the neutron dose on field size is minimal. It is concluded that as the open field increases, the determined neutron dose decreases. When treating with high-energy photons, the dose from contamination neutrons must be considered, since neutrons are far more biologically damaging than photons.
Keywords: radiotherapy, neutron contamination, linear accelerators, photon
Procedia PDF Downloads 348
4439 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces
Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet
Abstract:
In the present work, we developed an image processing algorithm to measure the characteristics of water droplets during dropwise condensation on pillared surfaces. The main difficulty in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides the droplets into four main groups based on their size and applies a corresponding algorithm to segment each group. These algorithms generate binary images of the droplets based on both their geometrical and intensity properties. The information on droplet evolution over time, including the mean radius and the number of drops per unit area, is then extracted from the binary images. The developed image processing algorithm is verified against manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.
Keywords: dropwise condensation, textured surface, image processing, watershed
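The two quantities extracted from the binary images (mean radius and drop count per unit area) can be sketched as follows, assuming the segmentation has already produced a list of droplet areas in pixels; the pixel size and field area are illustrative parameters, not values from the study:

```python
import math

def droplet_stats(areas_px, px_size_um, field_area_mm2):
    """Summarise segmented droplets.

    areas_px       -- list of droplet areas in pixels (one entry per droplet)
    px_size_um     -- side length of one pixel in micrometres (assumed square)
    field_area_mm2 -- imaged surface area in mm^2
    """
    # Equivalent radius of each droplet, assuming a circular footprint.
    radii_um = [px_size_um * math.sqrt(a / math.pi) for a in areas_px]
    mean_radius_um = sum(radii_um) / len(radii_um)
    # Drop count per unit area of the imaged surface.
    density_per_mm2 = len(areas_px) / field_area_mm2
    return mean_radius_um, density_per_mm2
```

Tracking these two numbers frame by frame gives the droplet-evolution curves the abstract refers to.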
Procedia PDF Downloads 224
4438 Image Processing on Geosynthetic Reinforced Layers to Evaluate Shear Strength and Variations of the Strain Profiles
Authors: S. K. Khosrowshahi, E. Güler
Abstract:
This study investigates the reinforcement function of geosynthetics on the shear strength and strain profile of sand. Conducting a series of simple shear tests, the shearing behavior of the samples under static and cyclic loads was evaluated. Three different types of geosynthetics, including a geotextile and geonets, were used as the reinforcement materials. An image processing analysis based on the optical flow method was performed to measure the lateral displacements and estimate the shear strains. It is shown that, besides improving the shear strength, the geosynthetic reinforcement leads to a remarkable reduction in the shear strains. The improved layer reduces the soil layer thickness required to resist shear stresses. Consequently, geosynthetic reinforcement can be considered a proper approach for sustainable designs, especially in projects with a large amount of geotechnical work, such as pavement subgrades, roadways, and railways.
Keywords: image processing, soil reinforcement, geosynthetics, simple shear test, shear strain profile
Procedia PDF Downloads 220
4437 Radon and Thoron Determination in Natural Ancient Mine Using Nuclear Track Detectors: Radiation Dose Assessment
Authors: L. Oufni, M. Amrane, R. Rabi
Abstract:
Radon (and thoron) is a naturally occurring radioactive noble gas with a variable distribution in the geological environment. The exposure of human beings to ionizing radiation from natural sources is a continuing and inescapable feature of life on earth, and radon, thoron, and their short-lived decay products in the atmosphere are the most important contributors to human exposure from natural sources. The aim of this study is to determine the alpha- and beta-activities per unit volume of air due to radon (222Rn), thoron (220Rn), and their progenies in the air of the ancient Aouli mine, in which there is no longer any mining activity, situated approximately 25 km north of the city of Midelt (Morocco), using LR-115 type II and CR-39 solid state nuclear track detectors (SSNTDs). Equilibrium factors between radon and its daughters and between thoron and its progeny were evaluated in the studied atmospheres. The committed equivalent doses due to the 218Po and 214Po radon short-lived progeny were evaluated in different tissues of the respiratory tract of visitors to the mine, who spend a considerable amount of time there. It is essential to inform the staff of these values and to take the steps needed to prevent any health complications.
Keywords: radon, thoron, concentration, exposure dose, SSNTD, mine
Procedia PDF Downloads 538
4436 The Effect of Irgafos 168 in the Thermostabilization of High Density Polyethylene
Authors: Mahdi Almaky
Abstract:
The thermostabilization of high density polyethylene (HDPE) is achieved through the action of primary antioxidants, such as phenolic antioxidants, and secondary antioxidants, such as aryl phosphites. The efficiency of two secondary antioxidants, commercially named Irgafos 168 and Weston 399, was investigated using different physical, mechanical, spectroscopic, and calorimetric methods. The effects of both antioxidants on the processing stability and long-term stability of HDPE produced at the Ras Lanuf oil and gas processing company were measured and compared. The combination of Irgafos 168 with Irganox 1010, used at a smaller concentration, results in a synergistic effect against thermo-oxidation and protects better than the combination of Weston 399 with Irganox 1010 against colour change at the processing temperature and during long-term oxidation.
Keywords: thermostabilization, high density polyethylene, primary antioxidant, phenolic antioxidant, Irgafos 168, Irganox 1010, Weston 399
Procedia PDF Downloads 356
4435 A Study on Thermal and Flow Characteristics by Solar Radiation for Single-Span Greenhouse by Computational Fluid Dynamics Simulation
Authors: Jonghyuk Yoon, Hyoungwoon Song
Abstract:
Recently, there has been a lot of interest in smart farming, which represents the application of modern Information and Communication Technologies (ICT) to agriculture, since it provides a methodology to optimize production efficiency by managing the growing conditions of crops automatically. In order to obtain high performance and stability for a smart greenhouse, it is important to identify the effect of various working parameters such as ventilation fan capacity, vent opening area, etc. In the present study, a 3-dimensional CFD (Computational Fluid Dynamics) simulation of a single-span greenhouse was conducted using the commercial program Ansys CFX 18.0 to determine the internal thermal and flow characteristics. In order to numerically model solar radiation, which is spread over a wide range of wavelengths, a multiband model that discretizes the spectrum into finite wavelength bands based on Wien's law is applied. In addition, an absorption coefficient of the vinyl cover that varies with the wavelength band is applied based on the Beer-Lambert law. To validate the numerical method, the numerical results for the temperature at specific monitoring points were compared with experimental data; average error rates of 12.2~14.2% were found, and the numerical temperature distributions are in good agreement with the experimental data. The results of the present study can provide useful information for the design of various greenhouses. This work was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, Forestry and Fisheries (IPET) through the Advanced Production Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) (315093-03).
Keywords: single-span greenhouse, CFD (computational fluid dynamics), solar radiation, multiband model, absorption coefficient
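The band-wise Beer-Lambert attenuation described above can be sketched as follows; the band weights and absorption coefficients here are purely illustrative placeholders (real values would come from the solar spectrum discretisation and measured vinyl properties), not figures from the study:

```python
import math

# Hypothetical band weights and absorption coefficients for a vinyl cover.
BANDS = [
    # (share of incident solar power, absorption coefficient in 1/m)
    (0.07, 40.0),   # ultraviolet
    (0.46, 5.0),    # visible
    (0.47, 120.0),  # near infrared
]

def transmitted_fraction(thickness_m, bands=BANDS):
    """Fraction of incident solar power transmitted through the cover.

    Each band attenuates independently following the Beer-Lambert law
    I/I0 = exp(-a * t); the bands are then weighted by their share of
    the incident spectrum, mirroring the multiband discretisation.
    """
    return sum(w * math.exp(-a * thickness_m) for w, a in bands)
```

A CFD solver applies the same per-band attenuation along each ray; this scalar version just shows why a single grey-body coefficient would misestimate the absorbed solar load.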
Procedia PDF Downloads 137
4434 Printed Thai Character Recognition Using Particle Swarm Optimization Algorithm
Authors: Phawin Sangsuvan, Chutimet Srinilta
Abstract:
This paper presents an application of the Particle Swarm Optimization (PSO) method to Thai optical character recognition (OCR). OCR consists of pre-processing, character recognition, and post-processing; before entering the recognition stage, each character must be prepared by the pre-processing step. PSO is an optimization method that belongs to the swarm intelligence family and is based on the imitation of the social behavior patterns of animals. The route of each particle is determined by its own data and that of its neighboring particles, and this interaction with neighbors is the advantage of particle swarms in determining the best solution. PSO has therefore attracted many researchers working on difficult problems, including character recognition. Following previous work, this research used a projection histogram to extract the features of printed digits and defined a simple fitness function for PSO. The results reveal that PSO achieves 67.73% on the testing dataset. Future work will explore improving the fitness function to achieve better performance.
Keywords: character recognition, histogram projection, particle swarm optimization, pattern recognition techniques
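The projection-histogram feature mentioned above is a standard, simple descriptor; a minimal sketch (the glyph representation is an assumption, not the paper's exact pipeline) is:

```python
def projection_histograms(glyph):
    """Row and column projection histograms of a binary glyph.

    glyph -- 2-D list of 0/1 pixels (1 = ink).
    Returns the concatenated horizontal and vertical projections,
    a common feature vector for printed-character recognition.
    """
    rows = [sum(r) for r in glyph]                                # ink per row
    cols = [sum(r[j] for r in glyph) for j in range(len(glyph[0]))]  # ink per column
    return rows + cols
```

A PSO fitness function can then score a candidate template by its distance to this feature vector.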
Procedia PDF Downloads 478
4433 Microdosimetry in Biological Cells: A Monte Carlo Method
Authors: Hamidreza Jabal Ameli, Anahita Movahedi
Abstract:
Purpose: In radionuclide therapy, radioactive atoms are coupled to monoclonal antibodies (mAbs) to treat cancerous tumors while limiting the radiation to healthy tissues. Tumoral and normal tissues are not equally sensitive to radiation: biological effects such as cellular repair processes or the presence of less radiosensitive cells, such as hypoxic cells, should be taken into account. For this reason, this paper calculates the biologically effective dose (BED) inside the tumoral area and in the healthy cells around tumors. Methods: The doses deposited by a radionuclide, gold-198, inside a cell lattice and the surrounding healthy tissue were calculated with a Monte Carlo method. The elemental compositions and densities of malignant and healthy tissues were obtained from ICRU Report 44. To approach realistic oxygen conditions, the necrotic and hypoxic areas inside the tumors were assessed. Results: Using the linear-quadratic expression defined in the Monte Carlo model, the results showed that a large amount of BED is deposited in the well-oxygenated part of the hypoxic area compared to the necrotic area. Moreover, there is a significant difference between the absorbed-dose curves with and without the BED correction.
Keywords: biological dose, Monte Carlo, hypoxia, radionuclide therapy
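The standard fractionated form of the linear-quadratic BED (which may differ in detail from the expression implemented in the authors' Monte Carlo code) can be sketched as:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose from the linear-quadratic model.

    BED = n * d * (1 + d / (alpha/beta)), where the alpha/beta ratio (Gy)
    is low for late-responding normal tissue (~3 Gy) and higher for
    tumours and well-oxygenated cells (~10 Gy).
    """
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)
```

For example, a conventional 30 x 2 Gy course at alpha/beta = 10 Gy gives a BED of 72 Gy, whereas the same 60 Gy physical dose produces a different BED in tissue with a lower alpha/beta, which is the kind of divergence the abstract's "with and without BED" curves illustrate.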
Procedia PDF Downloads 490
4432 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement
Authors: Hadi Ardiny, Amir Mohammad Beigzadeh
Abstract:
Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data sources. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of them carrying a radioactive source. An algorithm was developed to identify the contaminated robot through correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY-plane coordinate system, while the detector system records the gamma-ray counts. The positions of the robots and the corresponding counts from the moving source were modeled using the MCNPX simulation code, taking the experimental geometry into account. The results demonstrated a high level of accuracy in detecting and locating the target in both the simulation model and the experimental measurement. Such modeling techniques prove valuable for designing different scenarios and intelligent systems before initiating any experiments.
Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems
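One plausible form of the correlation step (a sketch of the general idea, not the authors' actual algorithm, which relied on MCNPX modeling of the real geometry) is to predict a 1/r² count rate from each robot's track and pick the robot whose prediction correlates best with the measured counts:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def identify_source(tracks, counts):
    """Pick the robot whose distance history best explains the counts.

    tracks -- {robot_id: list of (x, y) positions in the camera frame}
    counts -- gamma counts at the same time steps; the detector is
              assumed (for illustration) to sit at the frame origin.
    Expected counts fall off as 1/r^2, so the contaminated robot's
    predicted trace should correlate most strongly with the data.
    """
    def predicted(track):
        return [1.0 / (x * x + y * y) for x, y in track]
    return max(tracks, key=lambda rid: pearson(predicted(tracks[rid]), counts))
```

The simple point-source, no-shielding assumption is exactly what a full MCNPX model would replace with a realistic detector response.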
Procedia PDF Downloads 126
4431 Parallel Vector Processing Using Multi Level Orbital DATA
Authors: Nagi Mekhiel
Abstract:
Many applications use vector operations, applying a single instruction to multiple data that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, which reduces the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that processors need not access the memory: each location is forced to be available to all processors at a specific time. The data move in different orbits, becoming available to processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to the upper orbit one element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial-code limitations inherent in all parallel applications, interleaving it with lower-level vector operations.
Keywords: memory organization, parallel processors, serial code, vector processing
Procedia PDF Downloads 271
4430 Multi-Stage Classification for Lung Lesion Detection on CT Scan Images Applying Medical Image Processing Technique
Authors: Behnaz Sohani, Sahand Shahalinezhad, Amir Rahmani, Aliyu Aliyu
Abstract:
Recently, medical imaging, and specifically medical image processing, has become one of the most dynamically developing areas of medical science, leading to new approaches to the prevention, diagnosis, and treatment of various diseases. In the diagnosis of lung cancer, medical professionals rely on computed tomography (CT) scans, in which failure to correctly identify masses can lead to an incorrect diagnosis or to unnecessary sampling of lung tissue. The identification and demarcation of masses when detecting cancer within lung tissue are therefore critical challenges. In this work, a segmentation system based on image processing techniques has been applied for detection purposes. In particular, the use and validation of a novel lung cancer detection algorithm are presented through simulation, performed on CT images using multilevel thresholding. The proposed technique consists of segmentation, feature extraction, feature selection, and classification; after feature extraction, the features carrying useful information are selected. Eventually, the output image of the lung cancer is obtained with 96.3% accuracy and 87.25%. The purpose of feature extraction in the proposed approach is to transform the raw data into a more usable form for subsequent statistical processing. Future steps will involve employing the current feature extraction method to achieve more accurate resulting images, including further details available to machine vision systems to recognise objects in lung CT scan images.
Keywords: lung cancer detection, image segmentation, lung computed tomography (CT) images, medical image processing
Procedia PDF Downloads 101
4429 A Systematic Review of Sensory Processing Patterns of Children with Autism Spectrum Disorders
Authors: Ala’a F. Jaber, Bara’ah A. Bsharat, Noor T. Ismael
Abstract:
Background: Sensory processing is a fundamental skill needed for the successful performance of daily living activities. These skills are impaired as part of the neurodevelopmental issues of children with autism spectrum disorder (ASD). This systematic review aimed to summarize the evidence on the differences in sensory processing and motor characteristics between children with ASD and children with typical development (TD). Method: This systematic review followed the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses. The search terms included sensory-, motor-, condition-, and child-related terms or phrases. The electronic search utilized the Academic Search Ultimate, CINAHL Plus with Full Text, ERIC, MEDLINE, MEDLINE Complete, Psychology and Behavioral Sciences Collection, and SocINDEX with Full Text databases. The hand search included looking for potential studies in the references of related studies. The inclusion criteria were: studies published in English between 2009 and 2020 that included children aged 3-18 years with a confirmed ASD diagnosis according to the DSM-V criteria, included a control group of typical children, included outcome measures related to sensory processing and/or motor functions, and were available in full text. The review of included studies followed the Oxford Centre for Evidence-Based Medicine guidelines, the Guidelines for Critical Review Form of Quantitative Studies, and the guidelines for conducting systematic reviews of the American Occupational Therapy Association. Results: Eighty-eight full-text studies on the differences between children with ASD and children with TD in sensory processing and motor characteristics were reviewed, of which eighteen were included in the quantitative synthesis. The results reveal that children with ASD had more extreme sensory processing patterns than children with TD, such as hyper-responsiveness and hypo-responsiveness to sensory stimuli. Children with ASD also had limited gross and fine motor abilities and lower strength, endurance, balance, eye-hand coordination, movement velocity, cadence, and dexterity, with a higher rate of gait abnormalities than children with TD. Conclusion: This systematic review provides preliminary evidence suggesting that motor functioning should be addressed in the evaluation of and intervention for children with ASD, and that sensory processing should be supported among these children. Future research should investigate how performance and engagement in daily life activities are affected by sensory processing and motor skills.
Keywords: sensory processing, occupational therapy, children, motor skills
Procedia PDF Downloads 129
4428 Development of Multi-Leaf Collimator-Based Isocenter Verification Tool Using Electrical Portal Imaging Device for Stereotactic Radiosurgery
Authors: Panatda Intanin, Sangutid Thongsawad, Chirapha Tannanonta, Todsaporn Fuangrod
Abstract:
Stereotactic radiosurgery (SRS) is a high-precision delivery technique that requires comprehensive quality assurance (QA) tests prior to treatment delivery. The isocenter of the delivery beam plays a critical role in treatment accuracy. The uncertainty of the isocenter is traditionally assessed using circular cone equipment, a Winston-Lutz (WL) phantom, and film; this technique is time consuming and highly dependent on the observer. In this work, the development of a multileaf collimator (MLC)-based isocenter verification tool using an electronic portal imaging device (EPID) is proposed and evaluated. The conventional WL test was set up with a 5 mm diameter ball bearing for mechanical isocenter alignment and a 10 mm diameter circular cone fixed to the gantry head to define the radiation field. This conventional setup was compared to the proposed setup, in which an MLC-defined 10 x 10 mm field replaces the cone, representing a more realistic delivery field than circular cone equipment. Images were acquired with both the EPID and radiographic film in both experiments, at gantry angles of 0°, 90°, 180°, and 270°. A software tool was developed in-house using MATLAB/Simulink to determine the centroids of the radiation field and of the WL phantom shadow automatically, which provides higher accuracy than manual measurement. The deviation between the centroids of the cone-based and MLC-based WL tests was quantified. Comparing film and EPID images, the deviation over all gantry angles was 0.26±0.19 mm and 0.43±0.30 mm for the cone-based and MLC-based WL tests, respectively. The absolute deviation between the cone-based and MLC-based WL tests was 0.59±0.28 mm on EPID images and 0.14±0.13 mm on film images. The MLC-based isocenter verification using EPID therefore provides a highly sensitive tool for SRS QA.
Keywords: isocenter verification, quality assurance, EPID, SRS
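The core of the automated analysis (finding the field and ball-bearing centroids and their separation) can be sketched as below; this is a generic intensity-weighted centroid in Python rather than the authors' MATLAB/Simulink tool, and the segmented field/shadow masks are assumed to be provided:

```python
def centroid(image):
    """Intensity-weighted centroid (x, y) of a 2-D image (list of rows)."""
    total = sum(sum(row) for row in image)
    cx = sum(v * x for row in image for x, v in enumerate(row)) / total
    cy = sum(v * y for y, row in enumerate(image) for v in row) / total
    return cx, cy

def deviation_mm(field_img, bb_img, px_mm):
    """Distance in mm between the radiation-field centroid and the
    ball-bearing shadow centroid extracted from the same EPID or film image."""
    fx, fy = centroid(field_img)
    bx, by = centroid(bb_img)
    return px_mm * ((fx - bx) ** 2 + (fy - by) ** 2) ** 0.5
```

Repeating this at each gantry angle yields the per-angle deviations whose mean and spread the abstract reports.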
Procedia PDF Downloads 154
4427 Robust and Real-Time Traffic Counting System
Authors: Hossam M. Moftah, Aboul Ella Hassanien
Abstract:
In recent years, the importance of automatic traffic control has increased due to traffic jams, especially in big cities, for signal control and efficient traffic management. Traffic counting, as a kind of traffic control, is important for knowing the road traffic density in real time. This paper presents a fast and robust traffic counting system using different image processing techniques. The proposed system is composed of four fundamental building phases: image acquisition, pre-processing, object detection, and finally counting the connected objects. The object detection phase comprises five steps: subtracting the background, converting the image to binary, closing gaps and connecting nearby blobs, smoothing the image to remove noise and very small objects, and detecting the connected objects. Experimental results show the great success of the proposed approach.
Keywords: traffic counting, traffic management, image processing, object detection, computer vision
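The final counting phase can be sketched as a connected-component count over the binary image produced by the earlier steps; this dependency-free flood-fill version stands in for whatever labeling routine the system actually uses:

```python
from collections import deque

def count_objects(binary):
    """Count 4-connected foreground blobs in a binary image
    (list of rows of 0/1), one blob per detected vehicle."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                count += 1                      # new blob found
                q = deque([(y, x)])
                seen[y][x] = True
                while q:                        # flood-fill the whole blob
                    cy, cx = q.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count
```

In a production system, the prior smoothing/morphology steps matter precisely because they ensure each vehicle forms exactly one such blob.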
Procedia PDF Downloads 295
4426 Effects of Cell Phone Electromagnetic Radiation on the Brain System
Authors: A. Alao Olumuyiwa
Abstract:
Health hazards reported to be associated with exposure to electromagnetic radiation, which include brain tumors, genotoxic effects, neurological effects, immune system deregulation, allergic responses, and some cardiovascular effects, are discussed under a closed tabular model in this study. This review shows that there is strong and robust evidence, through strength, consistency, biological plausibility, and many dose-response relationships, that chronic exposure to electromagnetic frequencies across the spectrum may result in brain cancer and other carcinogenic disease symptoms. There is therefore no safe threshold, because of the genotoxic nature of the mechanism that may be involved. The study explains that the cell phone induces effects on blood-brain barrier permeability, and that exposing the cerebellum to continuous, long hours of RF radiation may result in a significant increase in albumin extravasation. A physical biomodeling approach is employed to review these health effects, using the Specific Absorption Rate (SAR) of different GSM handsets to critically examine symptoms such as decreased locomotor activity, increased grooming, and reduced memory functions in a variety of animal species in classified group and subgroup models.
Keywords: brain cancer, electromagnetic radiations, physical biomodeling, specific absorption rate (SAR)
Procedia PDF Downloads 348
4425 Measurement and Analysis of Radiation Doses to Radiosensitive Organs from CT Examination of the Cervical Spine Using Radiochromic Films and Monte Carlo Simulation Based Software
Authors: Khaled Soliman, Abdullah Alrushoud, Abdulrahman Alkhalifah, Raed Albathi, Salman Altymiat
Abstract:
The radiation dose received by patients undergoing computed tomography (CT) examination of the cervical spine was evaluated using Gafchromic XR-QA2 films and the CT-Expo software (ver. 2.3), in order to document our clinical dose values and to compare our results with benchmarks reported in the current literature. Radiochromic films have recently been used as a practical dosimetry tool that provides dose profile information not available from the standard ionisation chamber routinely used in CT dosimetry. We developed an in-house program that uses the films to calculate the entrance dose-length product (EDLP, in mGy.cm) and to relate the EDLP to the various organ doses calculated with the CT-Expo software. We also calculated the conversion factor (in mSv/mGy.cm) relating the EDLP to the effective dose (ED) from the examination. Variability among the different types of CT scanners and dose modulation methods is reported for at least three major CT brands available at our medical institution. The method can be used as an in-vivo dosimetry method, but this work reports only results obtained from adult female anthropomorphic phantom studies.
Keywords: CT dosimetry, gafchromic films, XR-QA2, CT-Expo software
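The DLP-to-effective-dose conversion mentioned above has the simple linear form E = k x DLP; the coefficient below is the commonly tabulated adult neck value and is illustrative only, since the study derived its own factors with CT-Expo:

```python
# Region-specific conversion coefficient; ~0.0059 mSv/(mGy.cm) is a
# commonly tabulated adult value for the neck -- treat it as illustrative,
# not as the factor derived in this study.
K_NECK = 0.0059

def effective_dose_msv(dlp_mgy_cm, k=K_NECK):
    """Estimate effective dose from a dose-length product via E = k * DLP."""
    return k * dlp_mgy_cm
```

A cervical-spine scan with a DLP of 300 mGy.cm would thus map to roughly 1.8 mSv under this illustrative coefficient.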
Procedia PDF Downloads 471
4424 Sulfate Radicals Applied to the Elimination of Selected Pollutants in Water Matrices
Authors: F. Javier Benitez, Juan L. Acero, Francisco J. Real, Elena Rodriguez
Abstract:
Five selected pollutants that are frequently present in waters and wastewaters have been degraded by the advanced oxidation process constituted by UV radiation combined with persulfate (UV/PS). These pollutants were 1H-benzotriazole (BZ), N,N-diethyl-m-toluamide or DEET (DT), chlorophene (CP), 3-methylindole (ML), and nortriptyline hydrochloride (NH). While UV radiation alone barely degraded these substances, the addition of PS generated the very reactive and oxidizing sulfate radical SO₄•⁻. The kinetic study provided the second-order rate constants for the reaction between this radical and each pollutant. An increasing dose of PS led to an increase in the degradation rate, with the best results obtained at near-neutral pH. Several water matrices were tested, and the presence of bicarbonate showed different effects: a decrease in the elimination of DT, BZ, and NH, and an increase in the oxidation of CP and ML. The additional presence of humic acids (HA) decreased the degradation because of two effects: light screening and radical scavenging. Although the presence of natural substances in waters (both inorganic and organic matter) usually diminishes the oxidation rates of organic pollutants, the combined UV/PS process seems to be an efficient solution for the removal of the selected contaminants from contaminated waters.
Keywords: water purification, UV activated persulfate, kinetic study, sulfate radicals
Procedia PDF Downloads 131
4423 Yoghurt Kepel Stelechocarpus burahol as an Effort of Functional Food Diversification from Region of Yogyakarta
Authors: Dian Nur Amalia, Rifqi Dhiemas Aji, Tri Septa Wahyuningsih, Endang Wahyuni
Abstract:
The kepel fruit (Stelechocarpus burahol) is a scarce fruit that serves as an emblem of Daerah Istimewa Yogyakarta. Kepel fruit can be used as an ingredient in beauty treatment products such as deodorant, is good for skin health, and also contains antioxidant compounds. Nevertheless, this fruit is scarcely cultivated because of its image as a palace fruit and because the flesh makes up only a small part of the fruit, giving it low economic value; the flesh accounts for about 49% of the whole fruit. This small proportion is precisely why the kepel fruit should be extracted and processed together with other products. Yoghurt is a milk product that also serves as a functional food; economically, the price of yoghurt is higher than that of whole milk or other processed milk products. Yoghurt is usually flavored or colored with plant extracts or chemical substances. Kepel fruit can serve as a flavor in yoghurt, so that, besides being a product that is good for digestion, yoghurt with kepel also functions as a "beauty" food. The writing method used is a literature study examining the potential of kepel as a local fruit of Yogyakarta and yoghurt as a processed milk product. The process is just like making common yoghurt, because the kepel fruit acts only as a flavoring substance and does not affect the other yoghurt processing steps. Food diversification can thus be pursued to increase the value of local resources so they can compete in the ASEAN Economic Community (AEC); one way is producing kepel yoghurt.
Keywords: kepel, yoghurt, Daerah Istimewa Yogyakarta, functional food
Procedia PDF Downloads 320
4422 Effect of Hydrogen Peroxide Concentration Produced by Cold Atmospheric Plasma on Inactivation of Escherichia Coli in Water
Authors: Zohreh Rashmei
Abstract:
Introduction: Plasma inactivation is one of the emerging technologies in the biomedical field and has been applied to the inactivation of microorganisms in water. The inactivation effect has been attributed to the presence of active plasma species, i.e., OH, O, O3, H2O2, UV, and electric fields, generated by the plasma discharge. Material and Method: To evaluate the germicidal effects of plasma, an electric spark discharge device was used. After plasma exposure, samples were collected for agar plate counts, and in addition to the biological experiments, the concentration of hydrogen peroxide was measured. Results: The results showed that plasma is able to inactivate a high concentration of E. coli: after a short period of plasma treatment of the water surface, the microbial load was reduced by log 8. From the start of the plasma treatment, the measurements show the production and accumulation of hydrogen peroxide in the water, so that by the end of the experiment the hydrogen peroxide concentration had increased to about 100 mg/L. Conclusion: The increase in hydrogen peroxide concentration is directly related to the reduction of the microbial load. The results of culturing E. coli in media containing specific concentrations of H2O2 showed that E. coli cannot grow in a medium containing more than 2.5 mg/L of H2O2. We can therefore conclude that the main bactericidal agent is the H2O2 molecule.
Keywords: plasma, hydrogen peroxide, disinfection, E. coli
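The "log 8" figure quoted above is the standard log-reduction metric; as a minimal sketch:

```python
import math

def log_reduction(n0, n):
    """Log10 reduction in viable count: 'log 8' means a 10^8-fold drop
    from the initial count n0 to the surviving count n."""
    return math.log10(n0 / n)
```

So reducing an initial load of 10^8 CFU/mL to a single surviving CFU/mL corresponds to a log 8 reduction.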
Procedia PDF Downloads 144
4421 Research on the Risks of Railroad Receiving and Dispatching Trains Operators: Natural Language Processing Risk Text Mining
Authors: Yangze Lan, Ruihua Xv, Feng Zhou, Yijia Shan, Longhao Zhang, Qinghui Xv
Abstract:
Receiving and dispatching trains is an important part of railroad organization, yet the risk evaluation of operating personnel is still reflected only by scores, with no further mining of wrong answers and operating accidents. Using natural language processing (NLP) technology, this study extracts the keywords and key phrases of 40 risk events related to receiving and dispatching trains and reclassifies the events into 8 categories, such as train approach and signal risks, dispatching command risks, and so on. Based on the historical risk data of the personnel, the K-Means clustering method is used to classify personnel by risk level. The results indicate that high-risk operating personnel need strengthened training in train receiving and dispatching operations for critical trains and abnormal situations.
Keywords: receiving and dispatching trains, natural language processing, risk evaluation, K-means clustering
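The K-Means step can be sketched on scalar risk scores as below; this tiny deterministic Lloyd's iteration stands in for whatever implementation and feature set the study used:

```python
def kmeans_1d(scores, k=3, iters=50):
    """Tiny Lloyd's k-means for scalar risk scores.

    Returns (centroids, labels); centroids are initialised from evenly
    spaced positions in the sorted data so the result is deterministic.
    """
    data = sorted(scores)
    cents = [data[int(i * (len(data) - 1) / (k - 1))] for i in range(k)]
    for _ in range(iters):
        # Assign each score to its nearest centroid.
        labels = [min(range(k), key=lambda j: abs(s - cents[j])) for s in scores]
        # Move each centroid to the mean of its members.
        for j in range(k):
            members = [s for s, l in zip(scores, labels) if l == j]
            if members:
                cents[j] = sum(members) / len(members)
    return cents, labels
```

With k = 3 the clusters map naturally onto low-, medium-, and high-risk personnel levels.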
Procedia PDF Downloads 93
4420 Development of Prediction Models of Day-Ahead Hourly Building Electricity Consumption and Peak Power Demand Using the Machine Learning Method
Authors: Dalin Si, Azizan Aziz, Bertrand Lasternas
Abstract:
To encourage building owners to purchase electricity at the wholesale market and reduce building peak demand, this study aims to develop models that predict day-ahead hourly electricity consumption and demand using an artificial neural network (ANN) and a support vector machine (SVM). All prediction models are built in Python using the Scikit-learn and PyBrain libraries. The input data for both consumption and demand prediction are the time stamp, outdoor dry bulb temperature, relative humidity, air handling unit (AHU) supply air temperature, and solar radiation. Solar radiation, which is unavailable a day ahead, is predicted first, and this estimate is then used as an input to predict consumption and demand. Separate SVM and ANN models are trained for consumption and demand, depending on the cooling or heating season and on weekdays or weekends. The results show that ANN is the better option for both consumption and demand prediction: it achieves a coefficient of variation of the root mean square error (CVRMSE) of 15.50% to 20.03% for consumption prediction and 22.89% to 32.42% for demand prediction, respectively. To conclude, the presented models have the potential to help building owners purchase electricity at the wholesale market, but they are not robust when used in demand response control.Keywords: building energy prediction, data mining, demand response, electricity market
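The CVRMSE metric quoted above is simple to state in code. A minimal sketch follows; the sample hourly values are illustrative, not the study's measurements.

```python
# CVRMSE: root-mean-square error normalized by the mean of the
# measured series, expressed in percent.
import math

def cvrmse(measured, predicted):
    """Coefficient of variation of the RMSE, in percent."""
    n = len(measured)
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    return 100.0 * rmse / (sum(measured) / n)

# Hypothetical hourly consumption (kWh) vs. a model's prediction.
measured = [100.0, 120.0, 110.0, 130.0]
predicted = [90.0, 125.0, 115.0, 120.0]
print(round(cvrmse(measured, predicted), 2))  # prints 6.87
```

Lower is better; the 15.50% to 20.03% range above would be computed exactly this way over the full test period.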
Procedia PDF Downloads 317
4419 Porosity and Ultraviolet Protection Ability of Woven Fabrics
Authors: Polona Dobnik Dubrovski, Abhijit Majumdar
Abstract:
The increasing awareness of the negative effects of ultraviolet radiation and of regular, effective protection are topical issues in many countries. Woven fabrics as clothing items can provide convenient personal protection; however, not all fabrics offer sufficient UV protection. The porous structure of the material has a great effect on the ultraviolet protection factor (UPF). The paper focuses on an overview of porosity in woven fabrics, including the determination of porosity parameters on the basis of an ideal geometrical model of the porous structure. Our experiment focused on 100% cotton woven fabrics in the grey state with the same yarn fineness (14 tex) and different thread densities (to achieve a relative fabric density between 59% and 87%) and different types of weave (plain, 4-end twill, 5-end satin). The results of the research dealing with the modelling of UPF and the influence of the volume and open porosity of the tested samples on UPF are presented. The results show that the open porosity of the tested samples should be lower than 12% to achieve good UV protection according to the AS/NZS standard. The results also indicate that there is no direct correlation between volume porosity and UPF; moreover, volume porosity depends on the type of weave and affects UPF as well. Plain fabrics did not offer any UV protection, while twill and satin fabrics offered good UV protection when volume porosity was less than 64% and 66%, respectively.Keywords: fabric engineering, UV radiation, porous materials, woven fabric construction, modelling
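The open-porosity figure from an ideal geometrical model can be sketched as the rectangular gap between yarns over one weave-repeat cell. The yarn diameter and thread densities below are hypothetical inputs, not the tested samples' values, and the formula is the common ideal-geometry estimate rather than the authors' exact model.

```python
# Ideal-geometry open porosity (open-area fraction) of a woven fabric:
# the inter-yarn gap area divided by the repeat-cell area.

def open_porosity(d_warp, d_weft, n_warp, n_weft):
    """Open-area fraction; diameters d in mm, densities n in threads/mm
    (so the thread spacing is p = 1/n)."""
    p_warp, p_weft = 1.0 / n_warp, 1.0 / n_weft
    return ((p_warp - d_warp) * (p_weft - d_weft)) / (p_warp * p_weft)

# E.g. 0.15 mm yarns at 4 threads/mm in both directions (p = 0.25 mm).
frac = open_porosity(0.15, 0.15, 4.0, 4.0)
print(round(100 * frac, 1))  # prints 16.0 (percent)
```

Under the 12% threshold reported above, this hypothetical fabric would not yet qualify as good UV protection.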
Procedia PDF Downloads 270
4418 Vibration Response of Soundboards of Classical Guitars
Authors: Meng Koon Lee, Mohammad Hosseini Fouladi, Satesh Narayana Namasivayam
Abstract:
This research focuses on the response of soundboards of classical guitars at frequencies up to 5 kHz, as the soundboard is a major contributor to acoustic radiation at high frequencies compared to the bridge and sound hole. A thin rectangular plate of variable thickness, simply supported on all sides, is used as the analytical model, since the guitar soundboard can be considered a modified form of a rectangular plate. The Homotopy Perturbation Method (HPM) is used to obtain an analytical solution of the fourth-order partial differential equation of motion of the rectangular plate of constant thickness, viewed as a linear problem. This procedure is generalized to the nonlinear problem of the rectangular plate with variable thickness, for which an analytical solution can also be obtained. Sound power is used as a parameter to investigate the acoustic radiation of soundboards made from spruce using various bracing patterns. The sound power of soundboards made from Malaysian softwoods such as damar minyak, sempilor or podo is investigated to determine the viability of replacing spruce as a future material for soundboards of classical guitars.Keywords: rectangular plates, analytical solution, homotopy perturbation, natural frequencies
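The abstract does not state the governing equation; presumably it is the classical Kirchhoff thin-plate equation, whose constant-thickness form (the linear problem solved first by HPM) can be sketched as:

```latex
% Assumed Kirchhoff plate model for the soundboard (a sketch, not the
% authors' stated equation): w is the transverse deflection, D the
% flexural rigidity, rho the density, h the thickness.
D\,\nabla^4 w + \rho h\,\frac{\partial^2 w}{\partial t^2} = 0,
\qquad D = \frac{E h^3}{12(1-\nu^2)},
\qquad w = 0,\ \ \nabla^2 w = 0 \ \text{on the simply supported edges}
```

In the variable-thickness case, $D = D(x,y)$ moves inside the differential operator and extra terms in the derivatives of $D$ appear, which is the nonlinear problem the abstract refers to.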
Procedia PDF Downloads 391
4417 Design of New Alloys from Al-Ti-Zn-Mg-Cu System by in situ Al3Ti Formation
Authors: Joao Paulo De Oliveira Paschoal, Andre Victor Rodrigues Dantas, Fernando Almeida Da Silva Fernandes, Eugenio Jose Zoqui
Abstract:
With the adoption of high-pressure die casting technologies for the production of automotive bodies through so-called giga castings, the technology of processing metal alloys in the semi-solid state (SSM) becomes attractive because it allows higher product quality, such as lower porosity and fewer shrinkage voids. However, the alloys currently processed are derived from the foundry industry and are based on the Al-Si-(Cu-Mg) system. High-strength alloys, such as those of the Al-Zn-Mg-Cu system, are not usually processed this way, but the benefits of this heat-treatable system can be combined with the advantages of semi-solid processing, opening new production routes and improving product performance. The current work proposes a new range of alloys to be processed in the semi-solid state through the modification of aluminum alloys of the Al-Zn-Mg-Cu system by the in-situ formation of the Al3Ti intermetallic. These alloys presented the thermodynamic stability required for semi-solid processing, with a sensitivity below 0.03 °C⁻¹ over a wide temperature range. Furthermore, they presented high hardness after aging heat treatment, reaching 190 HV. They are therefore excellent candidates for the manufacture of parts that require low defect levels and high mechanical strength.Keywords: aluminum alloys, semisolid metals processing, intermetallics, heat treatment, titanium aluminide
Procedia PDF Downloads 20
4416 TomoTherapy® System Repositioning Accuracy According to Treatment Localization
Authors: Veronica Sorgato, Jeremy Belhassen, Philippe Chartier, Roddy Sihanath, Nicolas Docquiere, Jean-Yves Giraud
Abstract:
We analyzed the image-guided radiotherapy method used by the TomoTherapy® System (Accuray Corp.) for patient repositioning in clinical routine. The TomoTherapy® System computes X, Y, Z and roll displacements to match the reference CT, on which the dosimetry has been performed, with the pre-treatment MV CT. The accuracy of the repositioning method has been studied according to the treatment localization. For this, a database of 18,774 treatment sessions performed during 2 consecutive years (the 2016-2017 period) was used. The database includes the X, Y, Z and roll displacements proposed by the TomoTherapy® System as well as the manual correction of these proposals applied by the radiation therapist. This manual correction aims to further improve the repositioning based on the clinical situation and depends on the structures surrounding the target tumor tissue. The statistical analysis performed on the database aims to define repositioning limits to be used as a safety and guidance tool for the manual adjustment applied by the radiation therapist. This tool will serve not only to flag potential repositioning errors but also to further improve patient positioning for optimal treatment.Keywords: accuracy, IGRT MVCT, image-guided radiotherapy megavoltage computed tomography, statistical analysis, tomotherapy, localization
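A statistical analysis of this kind might be sketched as follows: for each treatment localization, summarize the therapists' manual corrections and derive tolerance limits. The mean ± 2σ rule and the sample corrections below are hypothetical illustrations, not the study's data or its actual limit rule.

```python
# Per-localization summary of manual corrections (mm) and tolerance
# limits. Limit rule (mean +/- n_sigma * sd) and data are assumptions.
import statistics

def tolerance_limits(corrections, n_sigma=2.0):
    """Return (mean, sd, lower, upper) for a list of corrections in mm."""
    mean = statistics.mean(corrections)
    sd = statistics.stdev(corrections)  # sample standard deviation
    return mean, sd, mean - n_sigma * sd, mean + n_sigma * sd

# Hypothetical manual X-corrections (mm) for one localization.
pelvis_x = [0.5, -0.3, 0.1, 0.8, -0.2, 0.4, 0.0, 0.3]
mean, sd, lo, hi = tolerance_limits(pelvis_x)
```

Corrections falling outside `(lo, hi)` for their localization could then be flagged for review, which is the guiding role the abstract describes.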
Procedia PDF Downloads 226
4415 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing
Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah
Abstract:
The aim of this paper is to present a distributed implementation of the type-2 fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents, following the AVPE (Agent Virtual Processing Element) model, in order to provide the processing resources needed for big-data image segmentation. In this work, we focus on applying this algorithm to process a large MRI (Magnetic Resonance Imaging) image of size (n x m). The image is encapsulated in the mobile agent team leader, which splits it into its (n x m) pixels, one per AVPE. Each AVPE performs its part of the segmentation, exchanges results, and maintains asynchronous communication with the team leader until the algorithm converges. Some interesting experimental results are obtained in terms of accuracy and efficiency of the proposed implementation, thanks to the several capabilities that mobile agents bring to this distributed computational model.Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing
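The scatter-compute-gather pattern described above can be sketched in a single process. This is a hedged illustration only: it shows one update step of an ordinary (type-1) fuzzy C-means on grayscale values, with pixel chunks standing in for AVPEs; the type-2 extension and the mobile-agent transport are beyond this sketch.

```python
# SPMD sketch: the "team leader" scatters pixels into chunks (one per
# AVPE), each chunk computes fuzzy memberships locally, and the leader
# gathers them to update the global cluster centers.

def memberships(chunk, centers, m=2.0):
    """Fuzzy C-means membership of each pixel value to each center."""
    out = []
    for x in chunk:
        dists = [abs(x - c) or 1e-12 for c in centers]  # avoid div by 0
        row = [1.0 / sum((d_i / d_k) ** (2 / (m - 1)) for d_k in dists)
               for d_i in dists]
        out.append(row)
    return out

def update_centers(pixels, centers, n_agents=4, m=2.0):
    chunks = [pixels[i::n_agents] for i in range(n_agents)]  # scatter
    num = [0.0] * len(centers)
    den = [0.0] * len(centers)
    for chunk in chunks:  # each AVPE's local contribution
        for x, row in zip(chunk, memberships(chunk, centers, m)):
            for i, u in enumerate(row):
                num[i] += (u ** m) * x
                den[i] += u ** m
    return [n / d for n, d in zip(num, den)]  # leader gathers

# Hypothetical grayscale values: a dark region and a bright region.
pixels = [10, 12, 11, 200, 198, 205, 9, 202]
centers = update_centers(pixels, [0.0, 255.0])
```

In the paper's architecture, each chunk's partial sums would travel with a mobile agent and be combined asynchronously by the team leader until convergence.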
Procedia PDF Downloads 429
4414 Accuracy of Computed Tomography Dose Monitor Values: A Multicentric Study in India
Authors: Adhimoolam Saravana Kumar, K. N. Govindarajan, B. Devanand, R. Rajakumar
Abstract:
The quality of Computed Tomography (CT) procedures has improved in recent years due to technological developments and the increased diagnostic ability of CT scanners. Because CT doses are the highest among diagnostic radiology practices, it is of great significance to be aware of the patient's CT radiation dose whenever a CT examination is performed. The CT radiation dose delivered to patients, in the form of the volume CT dose index (CTDIvol), is displayed on the scanner monitor at the end of each examination, and it is important to ensure that this information is accurate. The objective of this study was to estimate CTDIvol values for a large number of patients during the most frequent CT examinations, to compare CT dose monitor values with measured ones, and to highlight the variation of CTDIvol values for the same CT examination across different centres and scanner models. The output CT dose index measurements were carried out on single- and multislice scanners for the available kV, 5 mm slice thickness, 100 mA and FOV combinations used. A total of 100 CT scanners were involved in this study. Data on 15,000 examinations of patients who underwent routine head, chest and abdomen CT were collected using a questionnaire sent to a large number of hospitals: 5,000 head, 5,000 chest and 5,000 abdominal CT examinations. Comprehensive quality assurance (QA) was performed for all the machines involved in this work. Following QA, CT phantom dose measurements were carried out in South India using the actual scanning parameters used clinically by the hospitals. From this study, the mean divergence between the measured and displayed CTDIvol values was 5.2, 8.4, and -5.7 for the selected head, chest and abdomen procedures with the protocols mentioned above, respectively. 
Thus, this investigation revealed an observable change in CT practices, with a much wider range of studies currently being performed in South India. This reflects the improved capacity of CT scanners to scan longer lengths and at finer resolutions, as permitted by helical and multislice technology. Also, some CT scanners used smaller slice thicknesses for routine CT procedures to achieve better resolution and image quality. This increases the patient radiation dose as well as the measured CTDIvol, so it is suggested that such CT scanners select appropriate slice thicknesses and scanning parameters in order to reduce the patient dose. If the routine scan parameters for head, chest and abdomen procedures are optimized, then the dose indices would be optimal, lowering the CT doses. In the South Indian region, all CT machines were routinely tested for QA once a year as per AERB requirements.Keywords: CT dose index, weighted CTDI, volumetric CTDI, radiation dose
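One plausible form of the displayed-vs-measured comparison is a percent deviation of the console value from the phantom measurement. The convention and the sample values below are assumptions for illustration, not the study's data.

```python
# Percent deviation of the displayed (console) CTDIvol from the
# phantom-measured value; positive = console overestimates the dose.

def ctdi_deviation(displayed_mgy, measured_mgy):
    """Percent deviation of displayed CTDIvol relative to measured."""
    return 100.0 * (displayed_mgy - measured_mgy) / measured_mgy

# E.g. console shows 42.0 mGy, phantom measurement gives 40.0 mGy.
print(round(ctdi_deviation(42.0, 40.0), 1))  # prints 5.0
```

Averaging such per-scanner deviations over each protocol would yield summary figures of the kind reported above for head, chest and abdomen.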
Procedia PDF Downloads 258
4413 Thermodynamic Analysis and Experimental Study of Agricultural Waste Plasma Processing
Authors: V. E. Messerle, A. B. Ustimenko, O. A. Lavrichshev
Abstract:
A large amount of manure and its irrational use negatively affect the environment. Compared with biomass fermentation, plasma processing of manure makes it possible to intensify the production of fuel gas, which consists mainly of synthesis gas (CO + H₂), and to increase plant productivity by 150–200 times. This is achieved due to the high temperature in the plasma reactor and a multifold reduction in waste processing time. This paper examines the plasma processing of biomass using the example of dried mixed animal manure (dung with a moisture content of 30%). Characteristic composition of dung, wt.%: Н₂О – 30, С – 29.07, Н – 4.06, О – 32.08, S – 0.26, N – 1.22, P₂O₅ – 0.61, K₂O – 1.47, СаО – 0.86, MgO – 0.37. The thermodynamic code TERRA was used to numerically analyze dung plasma gasification and pyrolysis. Plasma gasification and pyrolysis of dung were analyzed in the temperature range 300–3,000 K at a pressure of 0.1 MPa for the following thermodynamic systems: 100% dung + 25% air (plasma gasification) and 100% dung + 25% nitrogen (plasma pyrolysis). Calculations were conducted to determine the composition of the gas phase, the degree of carbon gasification, and the specific energy consumption of the processes. At an optimum temperature of 1,500 K, which provides both complete gasification of dung carbon and the maximum yield of combustible components (99.4 vol.% during dung gasification and 99.5 vol.% during pyrolysis), as well as decomposition of the toxic compounds furan, dioxin, and benzo(a)pyrene, the following composition of combustible gas was obtained, vol.%: СО – 29.6, Н₂ – 35.6, СО₂ – 5.7, N₂ – 10.6, H₂O – 17.9 (gasification) and СО – 30.2, Н₂ – 38.3, СО₂ – 4.1, N₂ – 13.3, H₂O – 13.6 (pyrolysis). The specific energy consumption of gasification and pyrolysis of dung at 1,500 K is 1.28 and 1.33 kWh/kg, respectively. 
An installation with a DC plasma torch with a rated power of 100 kW and a plasma reactor with a dung capacity of 50 kg/h was used for dung processing experiments. The dung was gasified in an air (or nitrogen during pyrolysis) plasma jet, which provided a mass-average temperature in the reactor volume of at least 1,600 K. The organic part of the dung was gasified, and the inorganic part of the waste was melted. For pyrolysis and gasification of dung, the specific energy consumption was 1.5 kWh/kg and 1.4 kWh/kg, respectively. The maximum temperature in the reactor reached 1,887 K. At the outlet of the reactor, a gas of the following composition was obtained, vol.%: СO – 25.9, H₂ – 32.9, СO₂ – 3.5, N₂ – 37.3 (pyrolysis in nitrogen plasma); СO – 32.6, H₂ – 24.1, СO₂ – 5.7, N₂ – 35.8 (air plasma gasification). The specific heat of combustion of the combustible gas formed during pyrolysis and plasma-air gasification of agricultural waste is 10,500 and 10,340 kJ/kg, respectively. Comparison of the integral indicators of dung plasma processing showed satisfactory agreement between the calculation and experiment.Keywords: agricultural waste, experiment, plasma gasification, thermodynamic calculation
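The energy content of such a gas can be sanity-checked from its composition. A hedged back-of-the-envelope sketch on a molar basis follows, using standard lower heating values of roughly 283 kJ/mol for CO and 242 kJ/mol for H₂ (assumed textbook figures); it is a rough estimate of the mixture's quality, not a reproduction of the paper's 10,340 kJ/kg value, which depends on the exact mass basis used.

```python
# Molar-basis lower heating value (LHV) of a syngas mixture; the
# inert components (CO2, N2, H2O) contribute zero. LHV_CO and LHV_H2
# are approximate standard values assumed here, in kJ/mol.

LHV_CO, LHV_H2 = 283.0, 242.0  # kJ/mol, approximate

def syngas_lhv_molar(x_co, x_h2):
    """LHV per mole of product gas, given CO and H2 mole fractions."""
    return x_co * LHV_CO + x_h2 * LHV_H2

# Reported air-gasification composition: CO 32.6 vol.%, H2 24.1 vol.%.
lhv = syngas_lhv_molar(0.326, 0.241)
print(round(lhv, 1))  # kJ per mole of product gas
```

Dividing such a molar figure by the mixture's mean molar mass would give a per-kilogram value comparable in kind to the specific heats of combustion quoted above.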
Procedia PDF Downloads 42
4412 Experimental Study of Hydrothermal Properties of Cool Pavements to Mitigate Urban Heat Islands
Authors: Youssef Wardeh, Elias Kinab, Pierre Rahme, Gilles Escadeillas, Stephane Ginestet
Abstract:
Urban heat islands designate a local phenomenon that appears in high-density cities: a rise of ambient temperature in the urban area compared to the neighboring rural area. Solar radiation plays an important role in this phenomenon, since it is partially absorbed by materials, especially roads and parking lots. Cool pavements constitute an innovative and promising technique to mitigate urban heat islands. The cool pavements studied in this work limit the increase of the surface temperature thanks to the evaporation of water conducted through capillary pores from the humidified base to the surface exposed to solar radiation. However, the performance, or cooling capacity, of a pavement has remained difficult to characterize. In this work, a new definition of the cooling capacity of a pavement is presented, and a correlation between it and the hydrothermal properties of cool pavements is revealed. Firstly, several porous concrete pavements were characterized through their hydrothermal properties, which can be related to the cooling effect, such as albedo, thermal conductivity, water absorption, etc. Secondly, these pavements, initially saturated and continuously supplied with water through their bases, were exposed to external solar radiation during three sunny summer days, and their surface temperatures were measured. For draining pavements, a strong second-degree polynomial correlation (R² = 0.945) was found between the cooling capacity and a term reflecting the interconnection of the capillary water to the surface. Moreover, it was noticed that the cooling capacity reaches its maximum for an optimal range of capillary pores for which the capillary rise is stronger than gravity. 
For non-draining pavements, a good negative linear correlation (R² = 0.828) was obtained between the cooling capacity and a term expressing the ability to heat the capillary water by the energy stored far from the surface and, therefore, the dominance of the diffusion-driven evaporation process. The latest tests showed that this process is, however, likely to be disturbed by the material's resistance to water vapor diffusion.Keywords: urban heat islands, cool pavement, cooling capacity, hydrothermal properties, evaporation
Procedia PDF Downloads 99
4411 Open-Source YOLO CV For Detection of Dust on Solar PV Surface
Authors: Jeewan Rai, Kinzang, Yeshi Jigme Choden
Abstract:
Accumulation of dust on solar panels reduces their overall efficiency and the amount of energy they produce. While various techniques exist for detecting dust in order to schedule cleaning, many of them rely on MATLAB image processing tools and other licensed software, which can be financially burdensome. This study investigates the efficiency of a free, open-source computer vision library using the YOLO algorithm. The proposed approach was tested on images of solar panels with varying dust levels through an experimental setup. The experimental findings illustrate the effectiveness of the YOLO-based image classification method and of the overall dust detection approach, with an accuracy of 90% in distinguishing between clean and dusty panels. This open-source solution provides a cost-effective and accessible alternative to commercial image processing tools, offering a way to optimize solar panel maintenance and enhance energy production.Keywords: YOLO, openCV, dust detection, solar panels, computer vision, image processing
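The decision step of such a pipeline can be sketched independently of the detector. In practice the detections would come from a YOLO model (for instance via the open-source `ultralytics` package, roughly `YOLO("dust.pt")(image)` — the model name and weights here are hypothetical, not the study's); the helper below only shows how raw detections might be reduced to a clean/dusty verdict.

```python
# Post-processing sketch: reduce a detector's (label, confidence)
# outputs for one panel image to a clean/dusty classification.
# The label names and the 0.5 threshold are illustrative assumptions.

def classify_panel(detections, conf_threshold=0.5):
    """detections: list of (label, confidence) pairs from the detector.
    Returns 'dusty' if any confident 'dust' detection exists."""
    for label, conf in detections:
        if label == "dust" and conf >= conf_threshold:
            return "dusty"
    return "clean"

# Hypothetical detector outputs for two panel images.
print(classify_panel([("panel", 0.93), ("dust", 0.81)]))  # prints dusty
print(classify_panel([("panel", 0.95), ("dust", 0.30)]))  # prints clean
```

Tuning the confidence threshold trades false alarms against missed dust, which is what the 90% accuracy figure above summarizes over the test images.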
Procedia PDF Downloads 36