5956 Application of Deep Learning Algorithms in Agriculture: Early Detection of Crop Diseases
Authors: Manaranjan Pradhan, Shailaja Grover, U. Dinesh Kumar
Abstract:
The farming community in India, as in other parts of the world, is highly stressed due to rising input costs (seeds, fertilizers, pesticides), droughts, and reduced revenue, in some cases leading to farmer suicides. The lack of an integrated farm advisory system in India adds to farmers' problems. Farmers need the right information during the early stages of a crop's lifecycle to prevent damage and loss of revenue. In this paper, we use deep learning techniques to develop an early warning system for the detection of crop diseases from images taken by farmers with their smartphones. The research leads to a smart assistant, built on analytics and big data, that could help farmers with early diagnosis of crop diseases and corrective actions. The classical approach to crop disease management has been to identify diseases at the crop level. Recently, convolutional neural networks (CNNs) trained on ImageNet have been used successfully to identify diseases at the individual plant level. Our model uses convolution filters, max pooling, dense layers and dropout (to avoid overfitting). Models are built for binary classification (healthy or not healthy) and multi-class classification (identifying which disease). Transfer learning is used to adapt weights learnt on the ImageNet dataset to crop diseases, which reduces the number of epochs needed to learn. One-shot learning is used to learn from very few images, while data augmentation techniques such as rotation, zoom, shift and blurring improve accuracy on images taken from farms. Models built using a combination of these techniques are more robust for real-world deployment. Our model is validated on the tomato crop. In India, tomato is affected by 10 different diseases. Our model achieves an accuracy of more than 95% in correctly classifying them.
The main contribution of our research is a personal assistant that helps farmers manage plant disease; although the model was validated on the tomato crop, it can easily be extended to other crops. Advances in computing and the availability of large datasets have made possible the success of deep learning in computer vision, natural language processing, image recognition, and related fields. With these robust models and high smartphone penetration, the feasibility of implementation is high, resulting in timely advice to farmers, increased farmer income and reduced input costs.
Keywords: analytics in agriculture, CNN, crop disease detection, data augmentation, image recognition, one-shot learning, transfer learning
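The augmentation step described in the abstract (rotation, shift and blurring of farm photos) can be sketched with plain NumPy. This is an illustrative sketch only, not the authors' pipeline; the function name `augment` and its parameters are hypothetical, and zoom is omitted for brevity:

```python
import numpy as np

def augment(image, rng):
    """Return simple augmented variants of an image array:
    a rotation, a small shift, and a blurred copy."""
    h, w = image.shape
    variants = []
    # Rotation: a quarter-turn of the image array.
    variants.append(np.rot90(image))
    # Shift: translate by a few random pixels (wrap-around for simplicity).
    dy, dx = rng.integers(-3, 4, size=2)
    variants.append(np.roll(image, (dy, dx), axis=(0, 1)))
    # Blur: 3x3 mean filter applied over an edge-padded copy.
    padded = np.pad(image, 1, mode="edge")
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    variants.append(blurred)
    return variants

rng = np.random.default_rng(0)
img = rng.random((8, 8))          # stand-in for a leaf photo
out = augment(img, rng)
```

In a real pipeline such variants would be generated on the fly during training so the network sees a different perturbation of each farm photo every epoch.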
Procedia PDF Downloads 120
5955 Methodology: A Review in Modelling and Predictability of Embankment in Soft Ground
Authors: Bhim Kumar Dahal
Abstract:
Transportation network development in developing countries is proceeding at a rapid pace. The majority of the network consists of railways and expressways, which pass through diverse topography, landforms and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper focuses on the advanced ground improvement techniques used to treat soft soil, the modelling approaches for embankment construction, and their predictability. Ground improvement techniques can be broadly classified into three groups, i.e., the densification group, the drainage and consolidation group, and the reinforcement group, which are discussed with case studies. Various methods have been used to model embankments, from simple one-dimensional to complex three-dimensional models using a variety of constitutive models. However, the reliability of the predictions is not found to improve systematically with the level of sophistication, and the predictions sometimes deviate by more than 60% from the monitored values even at the same level of sophistication. This deviation arises mainly from the choice of constitutive model, the assumptions made at different stages, errors in the selection of model parameters, and simplifications made when modelling the ground conditions. It can be reduced through optimization processes and tools and through sensitivity analysis of the model parameters, which guides the selection of appropriate parameter values.
Keywords: cement, improvement, physical properties, strength
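The parameter sensitivity analysis mentioned above can be sketched as a one-at-a-time (OAT) perturbation study. The toy settlement model, its parameter names (`q`, `H`, `E`) and values below are hypothetical stand-ins, not the constitutive models used in the paper:

```python
def settlement(params):
    # Toy stand-in for an embankment settlement model:
    # s = q * H / E (load q, layer thickness H, soil stiffness E).
    return params["q"] * params["H"] / params["E"]

def oat_sensitivity(model, base, delta=0.10):
    """One-at-a-time sensitivity analysis: perturb each parameter by
    +/- delta (10%) in turn and report the normalized central-difference
    change in model output; larger magnitude = more influential parameter."""
    s0 = model(base)
    result = {}
    for name, value in base.items():
        hi = dict(base, **{name: value * (1 + delta)})
        lo = dict(base, **{name: value * (1 - delta)})
        result[name] = (model(hi) - model(lo)) / (2 * delta * s0)
    return result

base = {"q": 100.0, "H": 5.0, "E": 2000.0}   # hypothetical parameter values
sens = oat_sensitivity(settlement, base)
```

For the linear toy model, `q` and `H` come out with sensitivity +1 and `E` with roughly -1; with a real finite-element model the same loop identifies which parameters dominate the prediction error.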
Procedia PDF Downloads 176
5954 Noise Barrier Technique as a Way to Improve the Sonic Urban Environment along Existing Roadways Assessment: El-Gish Road Street, Alexandria, Egypt
Authors: Nihal Atif Salim
Abstract:
To improve the quality of life in cities, a variety of interventions are used. Noise is a substantial and important form of pollution that has a negative impact on the urban environment and human health; according to a complaint survey conducted by the EEAA in 2019, it ranks second among environmental contamination complaints. The most significant source of noise in the city is traffic. Many physical techniques are applied to improve the sonic urban environment, and in the local area, noise barriers are considered one of the most appropriate physical techniques along existing traffic routes. Alexandria is Egypt's second-largest city after Cairo. It is located on the Mediterranean Sea, and El-Gish Road is one of the city's main arteries; the waterfront promenade that extends along the city is affected by a high level of traffic noise from it. The purpose of this paper is to clarify the design considerations for the most appropriate type of noise barrier along the promenade, with the goal of improving quality of life (QOL) and the sonic urban environment specifically. The proposed methodology first examines how noise affects human perception and the environment, then reviews the various physical noise control approaches, discusses sustainable design decision making, and finally considers the importance of incorporating sustainability into design decisions. The case study follows three stages. The first stage involves a site inspection in which a sound level meter is used to measure noise at many points along the promenade, with the findings shown on a noise map. The second stage surveys the site's users about their experience. The third stage investigates the various types of noise barriers and their effects on QOL along existing routes in order to select the most appropriate type.
The goal of this research is to identify a noise barrier design that satisfies environmental and social requirements while maintaining a balanced approach to the noise issue, in order to improve QOL along existing roadways in the local area.
Keywords: noise pollution, sonic urban environment, traffic noise, noise barrier, acoustic sustainability, noise reduction techniques
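Combining the sound level meter readings from the first stage into a single figure per site is typically done with the equivalent continuous level L_eq, which energy-averages decibel values rather than averaging them arithmetically. A minimal sketch (the readings below are made-up values, not survey data from El-Gish Road):

```python
import math

def leq(levels_db):
    """Energy-average a series of sound level readings (dB) into an
    equivalent continuous sound level: L_eq = 10*log10(mean(10^(L/10)))."""
    energies = [10 ** (level / 10) for level in levels_db]
    return 10 * math.log10(sum(energies) / len(energies))

# Hypothetical meter readings at points along the promenade, in dB(A).
readings = [68.0, 72.0, 75.0, 70.0]
site_leq = leq(readings)
```

Because the average is taken on the energy scale, the louder readings dominate: the result here is about 72 dB(A), noticeably above the arithmetic mean of 71.25.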
Procedia PDF Downloads 139
5953 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach
Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika
Abstract:
Both society and education teach good communication as a way to build interpersonal skills. Everyone has the capacity to understand something new, whether well or poorly; poor understanding produces language errors, for example when people interact for the first time and do not yet know each other because of the distance between them. The movie “The Space between Us” tells a love-adventure story between a boy from Mars and a girl from Earth, and many of their conversations miss each other because of their different climates and environments. Moviegoers must also rely on the subtitles to enjoy the movie, yet the Indonesian subtitles and the English dialogue still show overlapping understanding in the translation. Translation here involves a source language (SL, the English dialogue) and a target language (TL, the Indonesian subtitles). This research gap is formulated in the research questions of how language errors occur in the movie and how they affect translation quality, analyzed through a translation study with a discourse analysis approach. The research goal is to describe the language errors and their translation quality in order to create a better atmosphere in movie media. The study uses an embedded qualitative research design. The research locations comprise the setting, participants, and events as the focused boundary. The data sources are the movie itself and informants (translation quality raters). Sampling is criterion-based (purposive) sampling. Data collection techniques are content analysis and questionnaires; data validation applies data source and method triangulation; data analysis proceeds through domain, taxonomic, componential, and cultural theme analysis. The language errors found in the movie are referential, register, societal, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors.
Their effects on translation quality are discussed in terms of the translation techniques found in the data: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.
Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments
Procedia PDF Downloads 219
5952 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing
Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero
Abstract:
Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and their associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for works of art. The surplus value derived from the prestige of the author, the exclusivity of the product or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open source digital manufacturing technologies in small art workshops has favored their permanence by offering great advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. Pieces modeled on a computer and printed with FDM (Fused Deposition Modeling) 3D printers using PLA (polylactic acid) can be used in artistic casting procedures. However, PLA prints are limited to approximate minimum sizes of 3 cm, with an optimal layer height resolution of 0.1 mm; due to these limitations, FDM is not the most suitable technology for artistic casting of smaller pieces. One alternative that overcomes the size limitation is Selective Laser Sintering (SLS); another, in which a laser hardens metal powder layer by layer, is Direct Metal Laser Sintering (DMLS). However, due to their high cost, these technologies are difficult to introduce in small artistic foundries. Low-cost DLP (Digital Light Processing) printers can offer high resolution for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes) and can print models with castable resins, allowing subsequent direct artistic casting in precious metals or adaptation to processes such as electroforming.
In this work, the design of a DLP 3D printer using an LCD screen backlit with ultraviolet light is detailed. Its development is fully open source, and it is proposed as a kit made up of electronic components based on Arduino and mechanical components that are easy to find on the market; the CAD files for its parts can be manufactured on low-cost FDM 3D printers. The result costs less than 500 Euros and offers high resolution and an open design with free access that allows not only its manufacture but also its improvement. In future work, we intend to carry out comparative analyses to accurately estimate the print quality as well as the real cost of the artistic works made with it.
Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming
Procedia PDF Downloads 142
5951 Performance Evaluation and Economic Analysis of Minimum Quantity Lubrication with Pressurized/Non-Pressurized Air and Nanofluid Mixture
Authors: M. Amrita, R. R. Srikant, A. V. Sita Rama Raju
Abstract:
Water-miscible cutting fluids are conventionally used to lubricate and cool the machining zone, but issues related to health hazards, maintenance and disposal costs have limited their usage, leading to the application of Minimum Quantity Lubrication (MQL). To increase the effectiveness of MQL, nano cutting fluids are proposed. In the present work, water-miscible nanographite cutting fluids of varying concentration are applied at the cutting zone by two systems, A and B. System A utilizes high-pressure air and supplies cutting fluid at a flow rate of 1 ml/min. System B uses low-pressure air and supplies cutting fluid at a flow rate of 5 ml/min. Their machining performance is evaluated by measuring cutting temperatures, tool wear, cutting forces and surface roughness, and compared with dry machining and flood machining. Application of nano cutting fluid using both systems performed better than dry machining. Cutting temperatures and cutting forces obtained by both techniques are higher than in flood machining, but tool wear and surface roughness show improvement compared to flood machining. An economic analysis has been carried out in all cases to decide the applicability of the techniques.
Keywords: economic analysis, machining, minimum quantity lubrication, nanofluid
Procedia PDF Downloads 381
5950 Mapping of Alteration Zones in Mineral Rich Belt of South-East Rajasthan Using Remote Sensing Techniques
Authors: Mrinmoy Dhara, Vivek K. Sengar, Shovan L. Chattoraj, Soumiya Bhattacharjee
Abstract:
Remote sensing techniques have emerged as an asset for various geological studies. Satellite images obtained by different sensors contain plenty of information related to the terrain, and digital image processing further helps in customized ways of prospecting for minerals. In this study, an attempt has been made to map hydrothermally altered zones using multispectral and hyperspectral datasets of South East Rajasthan. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Hyperion (Level 1R) datasets have been processed to generate different Band Ratio Composites (BRCs). ASTER-derived BRCs were generated to delineate the alteration zones, gossans, abundant clays and host rocks. The ASTER and Hyperion images were further processed to extract mineral end members, and classified mineral maps were produced using the Spectral Angle Mapper (SAM) method. The results were validated against the geological map of the area, which shows positive agreement with the image processing outputs. This study thus concludes that band ratios and image processing in combination play a significant role in demarcating alteration zones, which may provide pathfinders for mineral prospecting studies.
Keywords: ASTER, Hyperion, band ratios, alteration zones, SAM
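The core of the SAM classification named above is a per-pixel angle between the observed spectrum and each end-member spectrum. A minimal sketch follows; the 3-band "clay" and "gossan" spectra are made-up stand-ins (real ASTER/Hyperion spectra have far more bands), and the threshold value is illustrative:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper metric: the angle (radians) between a pixel
    spectrum and a reference end-member spectrum; smaller = closer match.
    The angle is insensitive to overall brightness (illumination) changes."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify(pixel, endmembers, threshold=0.1):
    """Assign the pixel to the end member with the smallest angle,
    or None (unclassified) if no angle falls below the threshold."""
    angles = {name: spectral_angle(pixel, spec) for name, spec in endmembers.items()}
    best = min(angles, key=angles.get)
    return best if angles[best] <= threshold else None

# Hypothetical 3-band end-member spectra.
ems = {"clay": np.array([0.2, 0.5, 0.8]),
       "gossan": np.array([0.7, 0.4, 0.1])}
label = classify(np.array([0.21, 0.52, 0.79]), ems)
```

Applying `classify` over every pixel of the scene yields the kind of classified mineral map described in the abstract.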
Procedia PDF Downloads 280
5949 Vehicular Speed Detection Camera System Using Video Stream
Authors: C. A. Anser Pasha
Abstract:
In this paper, a new vehicular Speed Detection Camera System (SDCS) is presented as an alternative to traditional radars with the same or even better accuracy. Real-time measurement and analysis of various traffic parameters such as speed and number of vehicles are increasingly required in traffic control and management, and image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering. Various algorithms based on image processing have been applied to detect and track multiple vehicles. The SDCS process can be divided into three successive phases. The first phase is object detection, which uses a hybrid algorithm combining adaptive background subtraction with a three-frame differencing algorithm that rectifies the major drawback of using adaptive background subtraction alone. The second phase is object tracking, which consists of three successive operations: object segmentation, object labeling, and object center extraction. Tracking takes into consideration the different possible scenarios of a moving object: simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving as another enters. The third phase is speed calculation, where speed is computed from the number of frames the object takes to pass through the scene.
Keywords: radar, image processing, detection, tracking, segmentation
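Two of the phases above lend themselves to short sketches: the three-frame differencing used in detection, and the frames-to-speed conversion used in the final phase. This is an illustrative sketch, not the SDCS implementation; the function names, the threshold, and the scene-length and frame-rate values are hypothetical:

```python
import numpy as np

def three_frame_diff(f_prev, f_curr, f_next, thresh=25):
    """Three-frame differencing: a pixel counts as 'moving' only if it
    differs from BOTH the previous and the next frame, which suppresses
    the ghost regions left by background subtraction alone."""
    d1 = np.abs(f_curr.astype(int) - f_prev.astype(int)) > thresh
    d2 = np.abs(f_next.astype(int) - f_curr.astype(int)) > thresh
    return d1 & d2

def speed_kmh(frames_in_scene, scene_length_m, fps):
    """Speed from the number of frames a vehicle needs to cross a scene
    of known physical length: v = distance / (frames / fps), in km/h."""
    return scene_length_m / (frames_in_scene / fps) * 3.6

# A single bright pixel appears in the middle frame only -> detected.
f0 = np.zeros((5, 5), dtype=np.uint8)
f1 = f0.copy(); f1[2, 2] = 200
mask = three_frame_diff(f0, f1, f0)

# A vehicle taking 30 frames at 25 fps to cross a 20 m scene.
v = speed_kmh(frames_in_scene=30, scene_length_m=20.0, fps=25)
```

With these numbers the computed speed is 60 km/h; in the full system the moving mask would feed the segmentation, labeling and center-extraction steps of the tracking phase.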
Procedia PDF Downloads 468
5948 The Application of Sensory Integration Techniques in Science Teaching Students with Autism
Authors: Joanna Estkowska
Abstract:
The sensory integration method is aimed primarily at children with learning disabilities, and it can also be used as a complementary method in the treatment of children who are autistic, mentally handicapped, blind or deaf, or who have cerebral palsy. Autism is a holistic developmental disorder that manifests itself in the specific functioning of a child; its most characteristic features are disorders in communication, difficulties in social relations, rigid patterns of behavior and impaired sensory processing. In addition, abnormal intellectual development, attention deficit disorders, perceptual disorders and others may occur. This study focused on the application of sensory integration techniques in the science education of autistic students. The lack of proper sensory integration causes problems with complicated processes such as motor coordination, movement planning, visual or auditory perception, speech, writing, reading or counting. Good functioning and cooperation of the proprioceptive, tactile and vestibular senses affect the child's mastery of skills that require coordination of both sides of the body and synchronization of the cerebral hemispheres. These include, for example, all sports activities, precise manual skills such as writing, as well as reading and counting skills. All this takes place in stages, and achieving the skills of one stage determines the development of abilities at the next level; any deficit in the first three stages can affect the development of new skills. This is ultimately reflected in achievements at school and in later professional and personal life. After careful analysis, symptoms in the emotional and social spheres appear to be secondary to deficits of sensory integration. During our research, the students gained knowledge and skills through classroom experience, learning biology, chemistry and physics with the application of sensory integration techniques.
Sensory integration therapy aims to teach the child an adequate response to stimuli coming both from the outside world and from the body. Thanks to properly selected exercises, a child can improve perception and interpretation skills, motor skills, coordination of movements, attention and concentration, self-awareness, and social and emotional functioning.
Keywords: autism spectrum disorder, science education, sensory integration, special educational needs
Procedia PDF Downloads 186
5947 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data
Authors: Chen Chou, Feng-Tyan Lin
Abstract:
Big data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance and travel time of vehicles on the national freeway. This study takes advantage of ETC big data, combined with urban planning theory, to explore various phenomena of inter-city transportation activities. ETC data, part of the government's open data, is voluminous, complete and quickly updated. Living areas have traditionally been delimited by location, population, area and subjective consciousness, but these factors cannot appropriately reflect people's actual movement paths in daily life. In this study, the concept of a "living area" is replaced by an "influence range" to capture the dynamics and variation of movement with time and the purposes of activities. The study uses data mining with Python and Excel, and visualizes the number of trips with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss the living area as currently delimited. It sets up a dialogue between the concepts of central place theory and the living area, presents a new point of view, and integrates the application of big data, urban planning and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.
Keywords: big data, ITS, influence range, living area, central place theory, visualization
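The Python data mining step can be sketched as a simple trip aggregation: count how often each other city appears as the origin or destination of trips touching the focal city, and rank the links by volume. The trip records below are made-up examples, not ETC data, and `influence_range` is a hypothetical name:

```python
from collections import Counter

# Hypothetical ETC trip records: (origin city, destination city) pairs
# reconstructed from freeway gantry start/end points.
trips = [
    ("Tainan", "Kaohsiung"), ("Tainan", "Chiayi"), ("Tainan", "Kaohsiung"),
    ("Taichung", "Tainan"), ("Kaohsiung", "Tainan"), ("Tainan", "Kaohsiung"),
]

def influence_range(trips, city):
    """Count trips linking `city` to each other city, in either direction;
    the linked cities, weighted by trip volume, sketch the influence range."""
    counts = Counter()
    for origin, dest in trips:
        if origin == city:
            counts[dest] += 1
        elif dest == city:
            counts[origin] += 1
    return counts

ranked = influence_range(trips, "Tainan").most_common()
```

The ranked counts can then be joined to city coordinates and mapped in GIS, which is the visualization step described in the abstract.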
Procedia PDF Downloads 279
5946 A Semiotic Approach to Vulnerability in Conducting Gesture and Singing Posture
Authors: Johann Van Niekerk
Abstract:
The disciplines of conducting (instrumental or choral) and of singing presume a willingness toward an open posture and, in many cases, demand it for effective communication and technique. Yet this very openness, with the "spread-eagle" gesture as an extreme, is often counterintuitive for musicians and within the trajectory of human evolution. Conversely, it is in this very gesture of "taking up space" that confidence-building techniques such as the popular "power pose" are based. This paper consists primarily of a literature review exploring the topics of physical openness and vulnerability, considering the semiotics of the spread-eagle gesture and its accompanying letter X. A major finding of this research is the discrepancy between the evolutionary instinct toward physical self-protection and "folding in" and the discipline's demands of physical and gestural openness, expansiveness and vulnerability. A secondary finding is that encouraging confidence-building techniques may be more effective in obtaining the required results than insisting on vulnerability, which is shaped by various cultural contexts and socialization. Choral conductors and music educators constantly seek ways to promote engagement and healthy singing, and students glean much of this information and direction from conducting gestures and other pedagogies employed in rehearsal. The findings of this research provide another avenue toward sufficient and effective teaching and artistry on the part of instructors and students alike.
Keywords: conducting, gesture, music, pedagogy, posture, vulnerability
Procedia PDF Downloads 82
5945 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs
Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao
Abstract:
A compositional reservoir simulation model (CMG-GEM) was used for a cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production. The behavior of cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs is mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process, to distinguish the parameters with the maximum effect on oil recovery and to understand the behavior of cyclic CO2 injection in tight reservoirs. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry, because asphaltene plugs the porous medium and reduces oil productivity. In addition to asphaltene deposition, the solubility of CO2 in the aquifer is one of the safest and most permanent trapping mechanisms for CO2 storage in geological formations. However, the effects of these uncertain parameters on CO2 enhanced oil recovery have not been understood systematically, so it is necessary to study the most significant parameters that dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while accounting for asphaltene deposition and the solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emissions, optimize cyclic CO2 injection, and maximize oil production.
Keywords: tight reservoirs, cyclic CO₂ injection, asphaltene, solubility, reservoir simulation
Procedia PDF Downloads 387
5944 Narrative Family Therapy and the Treatment of Perinatal Mood and Anxiety Disorders
Authors: Jamie E. Banker
Abstract:
For many families, pregnancy and the postpartum period are filled with both anticipation and change. For some pregnant or postpartum women, this time is marked by the onset of a mood or anxiety disorder. Experiencing a mood or anxiety disorder at this time differs from depression or anxiety at other times of life, not only because of the physical changes occurring in the mother's body but also because of the mental and physical preparation necessary to redefine family roles and responsibilities and to develop new identities in the life transition. The presence of a mood or anxiety disorder can influence the way a mother defines herself and can complicate her understanding of her abilities and competencies as a mother. The complexity of experiencing a mood or anxiety disorder in the midst of these changes necessitates treatment interventions matched to both the symptomatology and the psychological adjustments. This study explores the use of narrative family therapy techniques in treating a mother experiencing postpartum depression. Externalization, a common narrative family therapy technique, helps clients separate their identity from the problems they are experiencing; this is crucial for a new mother who is in the middle of defining her identity during the transition to parenthood. The goal of this study is to examine how externalization techniques help postpartum women separate their mood and anxiety symptoms from their identity as mothers. An exploratory case study was conducted in a single setting, a private practice therapy office, to explore how a narrative family therapy approach can be used to treat perinatal mood and anxiety disorders. The therapy sessions were audio recorded and transcribed. Constructivism and narrative theory serve as the theoretical frameworks, and data from the therapy sessions and a follow-up survey were triangulated and analyzed.
During the course of treatment, the participant reported using the new externalizing labels for her symptoms. Within one month of treatment, she reported that she could stop herself from thinking the harmful thoughts faster, and within three months the harmful thoughts went away. The main themes in this study were building courage and reduced self-blame. This case highlights the role narrative family therapy can play in the treatment of perinatal mood and anxiety disorders and the importance of separating a woman's mood from her identity as a mother; this conceptual framework was beneficial to the postpartum mother in treating perinatal mood and anxiety disorder symptoms.
Keywords: externalizing techniques, narrative family therapy, perinatal mood and anxiety disorders, postpartum depression
Procedia PDF Downloads 275
5943 Dying and Sexuality − Controversial Motive in Contemporary Cinema
Authors: Małgorzata Jakubowska, Monika Michałowska
Abstract:
Since the beginning of the cinematographic industry, there has been visible interest in two leading themes: death and sexuality. One reason for the unfading popularity of these motifs is that death or sex employed as a leitmotif attracted great attention from viewers, which guaranteed financial success. Interestingly, the themes of death and sexuality/eroticism seem mutually exclusive in mainstream movies, to the extent that they almost never appear together on screen: as leitmotifs they describe opposite experiences of human life, one referring to the affirmation of life, the other pointing to atrophy and decay. This film paradigm is rarely challenged, and relatively little attention has so far been devoted to entwining dying and sexuality/eroticism in one movie. In this paper, we take a closer look at visualizations of dying with a focus on sexuality/eroticism. Our analysis concentrates on contemporary European and American cinema, especially recent productions that contribute to the cultural phenomenon of entwining these two realms of human life. We investigate the main clichés, plot and visual schemes, motifs and narrative techniques using the examples of Sweet November (2001), A Little Bit of Heaven (2011) and Now Is Good (2012). We also shed some light on recent productions that seem to provide a shift in portraying the realms of dying and sexuality, concentrating on The Garden of Earthly Delights (2003) as the most paradigmatic example.
Keywords: contemporary cinema, dying and sexuality, narrative techniques, plot and visual schemes
Procedia PDF Downloads 398
5942 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform
Authors: David Jurado, Carlos Ávila
Abstract:
Detection of early signs of breast cancer development is crucial to diagnosing the disease quickly and defining adequate treatment to increase the patient's survival probability. Computer-Aided Detection systems (CADs), along with modern data techniques such as machine learning (ML) and neural networks (NNs), have shown an overall improvement in digital mammography cancer diagnosis, reducing false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML- and NN-based algorithms rely on datasets that can introduce issues into segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. The algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications, which are highly correlated with breast cancer development in the early stages. Automatic segmentation of high-contrast objects is performed using edge extraction and the circle Hough transform, which provides the geometrical features needed for an automatic mask design that extracts statistical features of the regions of interest. The results of this study demonstrate the tool's potential for further diagnostics and classification of mammographic images, due to its low sensitivity to noisy images and low-contrast mammograms.
Keywords: breast cancer, segmentation, X-ray imaging, Hough transform, image analysis
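The circle Hough transform at the heart of the segmentation step can be sketched in a few lines for a single fixed radius: every edge point votes for all candidate centers at that radius, and the accumulator peak is the best-supported center. This is a minimal illustration, not the paper's algorithm (which would sweep radii and operate on real edge-extracted mammograms):

```python
import numpy as np

def circle_hough(edge_points, radius, shape):
    """Minimal circle Hough transform for a fixed radius: every edge
    point votes for all candidate centers at distance `radius` from it,
    and the accumulator peak is the best-supported circle center."""
    acc = np.zeros(shape, dtype=int)
    thetas = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    for y, x in edge_points:
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)   # unbuffered vote accumulation
    peak = np.unravel_index(acc.argmax(), shape)
    return tuple(int(c) for c in peak)

# Synthetic ring of edge points (a stand-in for the edge-extracted rim
# of a microcalcification) centered at (20, 25) with radius 6.
angles = np.linspace(0, 2 * np.pi, 40, endpoint=False)
ring = [(20 + 6 * np.sin(a), 25 + 6 * np.cos(a)) for a in angles]
center = circle_hough(ring, radius=6, shape=(40, 40))
```

The recovered center (and, with a radius sweep, the radius) supplies the geometrical features used for the automatic mask design described above.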
Procedia PDF Downloads 84
5941 Extended Literature Review on Sustainable Energy by Using Multi-Criteria Decision Making Techniques
Authors: Koray Altintas, Ozalp Vayvay
Abstract:
Increased global issues such as depletion of resources, environmental problems and social inequality have triggered public awareness of the need to find sustainable solutions that ensure the well-being of current as well as future generations. Since energy plays a significant role in improved social and economic well-being and is imperative for both industrial and commercial wealth creation, it is essential to develop a standardized set of metrics that makes it possible to indicate the present condition relative to conditions in the past and to develop any perspective required to frame actions for the future. This is not an easy task, considering the complexity of the issue, which requires integrating economic, environmental and social aspects of sustainable energy. Multi-criteria decision making (MCDM) can be considered a form of integrated sustainability evaluation and a decision support approach that can be used to solve complex problems featuring conflicting objectives, different forms of data and information, and multiple interests and perspectives. On that matter, MCDM methods are useful for providing solutions to complex energy management problems. The aim of this study is to review MCDM approaches that can be used for examining sustainable energy management. This study presents an insight into MCDM techniques and methods that can be useful for engineers, researchers and policy makers working in the energy sector.
Keywords: sustainable energy, sustainability criteria, multi-criteria decision making, sustainability dimensions
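One of the simplest MCDM methods covered by such reviews, Simple Additive Weighting (SAW), can be sketched in a few lines. The alternatives, criterion scores and weights below are hypothetical, chosen only to illustrate the mechanics (benefit criteria assumed, normalised by column maximum):

```python
def saw_rank(alternatives, weights):
    """Simple Additive Weighting: normalise each criterion column by its
    maximum (benefit criteria assumed), then rank alternatives by the
    weighted sum of normalised scores."""
    names = list(alternatives)
    n_criteria = len(weights)
    col_max = [max(alternatives[n][c] for n in names) for c in range(n_criteria)]
    totals = {n: sum(w * alternatives[n][c] / col_max[c]
                     for c, w in enumerate(weights)) for n in names}
    return sorted(names, key=totals.get, reverse=True)

# Hypothetical scores on (economic, environmental, social) criteria
options = {"solar": (6, 9, 8), "wind": (7, 8, 7), "gas": (9, 3, 5)}
ranking = saw_rank(options, weights=(0.4, 0.4, 0.2))
```

Methods such as TOPSIS or AHP refine this basic idea with distance-to-ideal measures or pairwise weight elicitation, but the weighted aggregation step is the same in spirit.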
Procedia PDF Downloads 333
5940 Patient Scheduling Improvement in a Cancer Treatment Clinic Using Optimization Techniques
Authors: Maryam Haghi, Ivan Contreras, Nadia Bhuiyan
Abstract:
Chemotherapy is one of the most popular and effective cancer treatments offered to patients in outpatient oncology centers. In such clinics, patients first consult with an oncologist, who may prescribe a chemotherapy treatment plan based on blood test results and an examination of the patient's health status. Once the plan is determined, a set of chemotherapy and consultation appointments must be scheduled for the patient. In this work, a comprehensive mathematical formulation for planning and scheduling different types of chemotherapy patients over a planning horizon, considering the blood test, consultation, pharmacy and treatment stages, is proposed. To be more realistic and to provide an applicable model, this study focuses on a case study of a major outpatient cancer treatment clinic in Montreal, Canada. Comparing the results of the proposed model with the clinic's current practice shows significant improvements in different performance measures. These major improvements in the patients' schedules reveal that using optimization techniques in planning and scheduling patients in such highly demanded cancer treatment clinics is an essential step toward good coordination between the different stages involved, which ultimately increases the efficiency of the entire system and improves staff and patient satisfaction.
Keywords: chemotherapy patient scheduling, integer programming, integrated scheduling, staff balancing
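The flavour of such a scheduling formulation can be conveyed with a toy stand-in: assigning treatment sessions to infusion chairs so that the makespan is minimised. Exhaustive search over orderings (with hypothetical durations and chair count) replaces the integer program, which is what scales to realistic clinic instances:

```python
from itertools import permutations

def best_schedule(durations, chairs=2):
    """Toy chair-scheduling problem: try every patient ordering, assign
    each session to the next free chair (list scheduling), and keep the
    ordering with the smallest makespan."""
    best_order, best_makespan = None, float("inf")
    for order in permutations(range(len(durations))):
        finish = [0] * chairs
        for p in order:
            c = finish.index(min(finish))  # next free chair
            finish[c] += durations[p]
        if max(finish) < best_makespan:
            best_makespan, best_order = max(finish), order
    return best_order, best_makespan

# Hypothetical treatment durations in minutes for five patients
order, makespan = best_schedule([120, 60, 90, 30, 45], chairs=2)
```

A real integer-programming model adds time windows, staff levels and the upstream blood test, consultation and pharmacy stages as constraints instead of enumerating orderings.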
Procedia PDF Downloads 175
5939 A Discourse on the Rhythmic Pattern Employed in Yoruba Sakara Music of Nigeria
Authors: Oludare Olupemi Ezekiel
Abstract:
This research examines the rhythmic structure of Sakara music by tracing its roots and analyzing the various rhythmic patterns of this neo-traditional genre, as well as the contributions of its major exponents and contemporary practitioners, using these as a model for understanding and establishing African rhythms. Biographies of the major exponents and contemporary practitioners, interviews and participant observation were used to elicit information. Samples of the genre, chosen at random, were transcribed, notated and analyzed for academic use and documentation. The research affirmed that rhythms such as the hemiola, cross-rhythm, clave or bell rhythm, and percussive, speech and melodic rhythm, together with other relevant rhythmic theories, are prevalent in and applicable to Sakara music, while making important contributions to musical scholarship through its analysis of the music. The analysis and discussion carried out in the research point toward the conclusion that Yoruba musicians are guided by preconceptions and sound musical considerations in shaping their rhythmic patterns, which are used as compositional techniques and are not mere incidental occurrences. These rhythmic patterns, with their socio-cultural connotations, enhance musical values and national identity in Nigeria. The study concludes by recommending that musicologists carry out more research into this and other neo-traditional genres in order to advance the globalisation of African music.
Keywords: compositional techniques, globalisation, identity, neo-traditional, rhythmic theory, Sakara music
Procedia PDF Downloads 446
5938 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis
Authors: A. J. S. P. Nileema, S. Kulatunga, S. H. Palihawadana
Abstract:
A computer aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis using digital chest X-rays, with MATLAB image processing techniques and a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases - Welisara, Sri Lanka. Pre-processing was done to remove identification details. Lung fields were segmented and then divided into four quadrants: right upper quadrant, left upper quadrant, right lower quadrant, and left lower quadrant, using the image processing techniques in MATLAB. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray level co-occurrence matrix method. Descriptive statistics and normal distribution analysis were performed using SPSS. Based on the radiologists' interpretation, chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were analyzed using an independent sample t-test for PTBP and PTBN chest radiographs. Among the six features tested, the contrast, correlation, energy, entropy, and maximum probability features showed a statistically significant difference between the two classes at the 95% confidence interval and therefore could be used in the classification of chest radiographs for PTB diagnosis.
Using the resulting value ranges of the five normally distributed texture features, a classification algorithm was then defined to recognize and classify the quadrant images. If the texture feature values of the quadrant image being tested fall within the defined region, it is identified as a PTBP (abnormal) quadrant: it is labeled 'Abnormal' in red and its border is highlighted in red. If the values fall outside the defined range, the quadrant is identified as PTBN (normal) and labeled 'Normal' in blue, with no changes to the image outline. The developed classification algorithm showed a high sensitivity of 92%, which makes it an efficient CAD system, with a modest specificity of 70%.
Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis
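Five of the six gray level co-occurrence matrix (GLCM) features extracted in this study (all but correlation) can be computed with a short, self-contained sketch. The 4-level quadrant image below is a toy example, not the study's MATLAB pipeline:

```python
import math

def glcm(img, dx=1, dy=0, levels=4):
    """Normalised gray-level co-occurrence matrix for one pixel offset."""
    counts = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    total = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                counts[img[y][x]][img[y2][x2]] += 1
                total += 1
    return [[c / total for c in row] for row in counts]

def texture_features(P):
    """Contrast, energy, homogeneity, entropy and maximum probability,
    accumulated over the co-occurrence probabilities P[i][j]."""
    feats = {"contrast": 0.0, "energy": 0.0, "homogeneity": 0.0,
             "entropy": 0.0, "max_prob": 0.0}
    for i, row in enumerate(P):
        for j, p in enumerate(row):
            feats["contrast"] += (i - j) ** 2 * p
            feats["energy"] += p * p
            feats["homogeneity"] += p / (1 + abs(i - j))
            if p > 0:
                feats["entropy"] -= p * math.log2(p)
            feats["max_prob"] = max(feats["max_prob"], p)
    return feats

# Toy 4x4 "quadrant" with four gray levels
quadrant = [[0, 0, 1, 1],
            [0, 0, 1, 1],
            [2, 2, 3, 3],
            [2, 2, 3, 3]]
f = texture_features(glcm(quadrant))
```

In practice the GLCM is averaged over several offsets and directions; the resulting feature values feed the statistical comparison between PTBP and PTBN quadrants.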
Procedia PDF Downloads 127
5937 The Superiority of 18F-Sodium Fluoride PET/CT for Detecting Bone Metastases in Comparison with Other Bone Diagnostic Imaging Modalities
Authors: Mojtaba Mirmontazemi, Habibollah Dadgar
Abstract:
Bone is the most common metastasis site in some advanced malignancies, such as prostate and breast cancer. Bone metastasis generally indicates a poorer prognosis in these patients. Different radiological and molecular imaging modalities are used for detecting bone lesions. Molecular imaging, including computed tomography, magnetic resonance imaging, planar bone scintigraphy, single-photon emission tomography, and positron emission tomography, as noninvasive visualization of biological processes, has the potential to enable exact examination, characterization, risk stratification and comprehension of human diseases. It can also directly visualize targets, clearly specify cellular pathways and provide precision medicine for molecular targeted therapies. These advantages help implement personalized treatment for each patient. Currently, NaF PET/CT has largely replaced standard bone scintigraphy for the detection of bone metastases. On one hand, 68Ga-PSMA PET/CT has gained high attention for accurate staging of primary prostate cancer and restaging after biochemical recurrence. On the other hand, FDG PET/CT is not commonly used for osseous metastases of prostate and breast cancer, and its usage is limited to staging patients with aggressive primary tumors or localizing the site of disease. In this article, we examine current studies on FDG, NaF, and PSMA PET/CT imaging of bone metastases, their diagnostic utility, and the assessment of treatment response in patients with breast and prostate cancer.
Keywords: skeletal metastases, fluorodeoxyglucose, sodium fluoride, molecular imaging, precision medicine, prostate cancer (68Ga-PSMA-11)
Procedia PDF Downloads 110
5936 Effectiveness and Efficiency of Unified Philippines Accident Reporting and Database System in Optimizing Road Crash Data Usage with Various Stakeholders
Authors: Farhad Arian Far, Anjanette Q. Eleazar, Francis Aldrine A. Uy, Mary Joyce Anne V. Uy
Abstract:
The Unified Philippine Accident Reporting and Database System (UPARDS) is a system newly developed by Dr. Francis Aldrine Uy of the Mapua Institute of Technology. Its main purpose is to provide an advanced road accident investigation tool and a record keeping and analysis system for stakeholders such as the Philippine National Police (PNP), Metro Manila Development Authority (MMDA), Department of Public Works and Highways (DPWH), Department of Health (DOH), and insurance companies. The system is composed of two components: a mobile application for road accident investigators that takes advantage of available technology to advance data gathering, and a web application that integrates all accident data for the use of all stakeholders. The researchers, with the cooperation of the PNP's Vehicle Traffic Investigation Sector of the City of Manila, conducted field testing of the application in fifteen (15) accident cases. Simultaneously, the researchers also distributed surveys to the PNP, Manila Doctors Hospital, and Charter Ping An Insurance Company to gather their insights regarding the web application. The survey was designed around the Technology Acceptance Model, an information systems theory. The results of the surveys revealed that the respondents were greatly satisfied with the visualization and functions of the applications, which proved to be effective and far more efficient than the conventional pen-and-paper method. In conclusion, the pilot study was able to address the need for improvement of the current system.
Keywords: accident, database, investigation, mobile application, pilot testing
Procedia PDF Downloads 443
5935 The Power of in situ Characterization Techniques in Heterogeneous Catalysis: A Case Study of Deacon Reaction
Authors: Ramzi Farra, Detre Teschner, Marc Willinger, Robert Schlögl
Abstract:
Introduction: The conventional approach of characterizing solid catalysts under static conditions, i.e., before and after reaction, does not provide sufficient knowledge of the physicochemical processes occurring under dynamic conditions at the molecular level. Hence, developing new in situ characterization techniques with the potential to be used under real catalytic reaction conditions is highly desirable. In situ Prompt Gamma Activation Analysis (PGAA) is a rapidly developing chemical analytical technique that enables us to experimentally assess the coverage of surface species under catalytic turnover and correlate these with the reactivity. The catalytic HCl oxidation (Deacon reaction) over bulk ceria serves as our example. Furthermore, in situ Transmission Electron Microscopy is a powerful technique that can contribute to the study of atmosphere- and temperature-induced morphological or compositional changes of a catalyst at atomic resolution. The application of such techniques (PGAA and TEM) will pave the way to a greater and deeper understanding of the dynamic nature of active catalysts. Experimental/Methodology: In situ Prompt Gamma Activation Analysis (PGAA) experiments were carried out to determine the Cl uptake and the degree of surface chlorination under reaction conditions by varying p(O2), p(HCl), p(Cl2), and the reaction temperature. The abundance and dynamic evolution of OH groups on the working catalyst under various steady-state conditions were studied by means of in situ FTIR with a specially designed homemade transmission cell. For real in situ TEM we use a commercial in situ holder with a home-built gas feeding system and gas analytics. Conclusions: Two complementary in situ techniques, namely in situ PGAA and in situ FTIR, were utilized to investigate the surface coverage of the two most abundant species (Cl and OH).
The OH density and Cl uptake were followed under multiple steady-state conditions as a function of p(O2), p(HCl), p(Cl2), and temperature. These experiments showed that the OH density correlates positively with the reactivity, whereas Cl coverage correlates negatively. The p(HCl) experiments give rise to increased activity accompanied by an increase in Cl coverage (the opposite trend to p(O2) and T). Cl2 strongly inhibits the reaction, but no measurable increase of the Cl uptake was found. Considering all of these observations, we conclude that only a minority of the available adsorption sites contribute to the reactivity. In addition, a mechanism of the catalysed reaction is proposed. The chlorine-oxygen competition for the available active sites renders re-oxidation the rate-determining step of the catalysed reaction. Further investigations using in situ TEM are planned and will be conducted in the near future. Such experiments allow us to monitor active catalysts at the atomic scale under the most realistic conditions of temperature and pressure. The talk will shed light on the potential and limitations of in situ PGAA and in situ TEM in the study of catalyst dynamics.
Keywords: CeO2, Deacon process, in situ PGAA, in situ TEM, in situ FTIR
Procedia PDF Downloads 292
5934 Study on Improvement the Performance of Construction Project Using Lean Principles
Authors: Sumaya Adina
Abstract:
The productivity of the construction industry has faced numerous challenges, rising costs, and scarce resources over the past forty years; one approach for improving the framework is therefore the use of lean techniques. The lean method stems from a new form of production control in manufacturing. At a time when sustainability and efficiency are essential, lean offers a clear path to make the construction industry fit for the future. Many construction professionals and experts have efficiently optimised development initiatives using lean construction (LC) techniques to reduce waste, maximise value creation, and focus on the processes that create real added value and continuous improvement, strengthening flexibility and adaptability. The present research studies the improvement in the performance of construction projects using lean principles. The study is divided into three stages. Initially, a questionnaire survey was conducted on visual management techniques to improve the performance of construction projects. The questionnaire was distributed to civil engineers, architects, site managers, project managers, and full-time executives, with nearly 100 questionnaires shared with respondents. A total of 83 responses were received; the reliability of the data was determined and analysis was done using SPSS software. In the second stage, the impact of value stream mapping on a real project is determined and its performance, in the form of time and cost reduction, is evaluated. The case study examines a bunker-building project located in Kabul, Afghanistan; the project was planned conventionally without considering lean concepts. To reduce all kinds of waste in the project, a plan was developed using the Vico Control software to visualize the value stream of the project.
Finally, the impact of value stream mapping on the project's total cash flow is evaluated and compared by plotting the total cash flow curve using Vico software. As a result, labour costs were reduced by 33% and the duration of the project was reduced by 17%. Reducing the duration also improved the cash flow of the entire project by 14%, raising it from negative 67% to negative 44%.
Keywords: lean construction, cost and time overrun, value stream mapping, construction efficiency
Procedia PDF Downloads 9
5933 Public Behavior When Encountered with a Road Traffic Accident
Authors: H. N. S. Silva, S. N. Silva
Abstract:
Introduction: The latest WHO data, published in 2014, state that Sri Lanka reaches 2,773 total deaths and over 14,000 individuals sustaining injuries due to RTAs each year. Previous studies noticed that policemen, three-wheeler drivers and pedestrians were the first to respond to RTAs, but victims' conditions were aggravated by the responders' unskilled attempts at managing the victims' wounds, moving and positioning the victims, and, above all, transporting them. Objective: To observe the practices of the urban public in Sri Lanka when encountering RTAs. Methods: A qualitative study was done to analyze public behavior seen in video recordings of accident scenes purposefully selected from social media, news websites, YouTube and Google. Results: The results showed that all individuals who tried to help during the RTAs were middle-aged men, mainly pedestrians, motorcyclists and policemen at that moment. The vast majority were very keen to help the victims get to hospital as soon as possible and actively participated in providing 'aid'. The main problem was that the first aid attempts were disorganized and uncoordinated. Even though all individuals knew how to control external bleeding, none of them was aware of spinal injury prevention techniques or the management of limb injuries. Most of the transportation methods and transfer techniques used were inappropriate and injury-prone. Conclusions: The public actively engages in providing aid despite their inappropriate first aid practices.
Keywords: encountered, pedestrians, road traffic accidents, urban public
Procedia PDF Downloads 287
5932 3D Geomechanical Model the Best Solution of the 21st Century for Perforation's Problems
Authors: Luis Guiliana, Andrea Osorio
Abstract:
The lack of comprehension of reservoir geomechanics conditions may cause operational problems that cost the industry billions of dollars per year. Drilling operations at the Ceuta Field, Area 2 South, Maracaibo Lake, have been very expensive due to drilling-related problems. The principal objective of this investigation is to develop a 3D geomechanical model of this area in order to optimize future drilling in the field. For this purpose, a 1D geomechanical model was built first, following the workflow of the MEM (Mechanical Earth Model), which consists of the following steps: 1) data auditing, 2) analysis of drilling events and the structural model, 3) mechanical stratigraphy, 4) overburden stress, 5) pore pressure, 6) rock mechanical properties, 7) horizontal stresses, 8) direction of the horizontal stresses, 9) wellbore stability. The 3D MEM was developed from the geostatistical model of the Eocene C-SUP VLG-3676 reservoir and the 1D MEM; with these data the geomechanical grid was populated. The analysis of the results showed that the problems in the wells examined were mainly due to wellbore stability issues. It was determined that the stress field changes as the stratigraphic column deepens: it is normal to strike-slip in the Middle Miocene and Lower Miocene, and strike-slip to reverse in the Eocene. Accordingly, at the level of the Eocene, the most advantageous direction to drill is parallel to the maximum horizontal stress (157º). The 3D MEM allowed a three-dimensional visualization of the variations in rock mechanical properties, stresses and operational windows (mud weight and pressures). This will facilitate the optimization of future drilling in the area, including those zones without any geomechanical information.
Keywords: geomechanics, MEM, drilling, stress
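Step 4 of the MEM workflow, the overburden stress, reduces to integrating rock density times gravity over depth. The sketch below illustrates the calculation on a hypothetical two-layer column (the thicknesses and densities are invented for illustration, not taken from the Ceuta Field data):

```python
G = 9.81  # gravitational acceleration, m/s^2

def overburden_stress(layers):
    """Integrate rho * g layer by layer down the column and return the
    vertical (overburden) stress at the bottom of each layer, in MPa."""
    stress_pa, profile_mpa = 0.0, []
    for thickness_m, density_kg_m3 in layers:
        stress_pa += density_kg_m3 * G * thickness_m
        profile_mpa.append(stress_pa / 1e6)
    return profile_mpa

# Hypothetical column: 500 m at 2100 kg/m3 over 1000 m at 2400 kg/m3
profile = overburden_stress([(500, 2100.0), (1000, 2400.0)])
```

In a real MEM the density log supplies a near-continuous profile and the integral is taken sample by sample rather than per layer.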
Procedia PDF Downloads 273
5931 Sensor and Sensor System Design, Selection and Data Fusion Using Non-Deterministic Multi-Attribute Tradespace Exploration
Authors: Matthew Yeager, Christopher Willy, John Bischoff
Abstract:
The conceptualization and design phases of a system lifecycle consume a significant amount of the lifecycle budget in the form of direct tasking and capital, as well as the implicit costs associated with unforeseeable design errors that are only realized during downstream phases. Ad hoc or iterative approaches to generating system requirements oftentimes fail to consider the full array of feasible systems or product designs for a variety of reasons, including, but not limited to: initial conceptualization that oftentimes incorporates a priori or legacy features; the inability to capture, communicate and accommodate stakeholder preferences; inadequate technical designs and/or feasibility studies; and locally-, but not globally-, optimized subsystems and components. These design pitfalls can beget unanticipated developmental or system alterations with added costs, risks and support activities, heightening the risk of suboptimal system performance, premature obsolescence or forgone development. Supported by rapid advances in learning algorithms and hardware technology, sensors and sensor systems have become commonplace in both commercial and industrial products. The evolving array of hardware components (i.e., sensors, CPUs, modular/auxiliary access, etc.) as well as recognition, data fusion and communication protocols have all become increasingly complex and critical for design engineers during both conceptualization and implementation. This work seeks to develop and utilize a non-deterministic approach for sensor system design within the multi-attribute tradespace exploration (MATE) paradigm, a technique that incorporates decision theory into model-based techniques in order to explore complex design environments and discover better system designs.
Developed to address the inherent design constraints in complex aerospace systems, MATE techniques enable project engineers to examine all viable system designs, assess attribute utility and system performance, and better align with stakeholder requirements. Whereas previous work has focused on aerospace systems and been conducted in a deterministic fashion, this study addresses a wider array of system design elements by incorporating both traditional tradespace elements (e.g., hardware components) as well as popular multi-sensor data fusion models and techniques. Furthermore, adding statistical performance features to this model-based MATE approach will enable non-deterministic techniques for various commercial systems that range in application, complexity and system behavior, demonstrating significant utility within the realm of formal systems decision-making.
Keywords: multi-attribute tradespace exploration, data fusion, sensors, systems engineering, system design
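A core operation in tradespace exploration is filtering candidate designs down to the non-dominated (Pareto) set. The sketch below illustrates this on hypothetical sensor-system designs scored on two attributes, utility and cost (cost negated so both attributes are maximised); it is an illustration of the filtering step, not the MATE methodology itself:

```python
def pareto_front(designs):
    """Keep the non-dominated designs: a design is dominated if some other
    design is at least as good on every attribute and not identical.
    All attributes are assumed to be maximised."""
    front = []
    for name, attrs in designs.items():
        dominated = any(
            all(o >= a for o, a in zip(other, attrs)) and other != attrs
            for other_name, other in designs.items() if other_name != name
        )
        if not dominated:
            front.append(name)
    return sorted(front)

# Hypothetical designs scored as (utility, -cost)
designs = {"A": (0.9, -10), "B": (0.7, -4), "C": (0.6, -6), "D": (0.8, -8)}
front = pareto_front(designs)
```

Design C is dominated by B (lower utility at higher cost) and drops out; the surviving designs form the utility-cost frontier a decision maker would examine against stakeholder preferences.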
Procedia PDF Downloads 187
5930 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop
Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen
Abstract:
Lenia is a system of cellular automata with continuous states, space and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL Interoperability. We demonstrate how CUDA as a low-level GPU programming paradigm allows optimizing performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids and over longer periods without annoying waiting times. Thereby, they enable the exploration and discovery of new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations. 
We developed a Lenia implementation for the GPU using the C++ and CUDA programming languages, with CUDA/OpenGL interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation against the existing ones in terms of speed, memory usage, configurability and scalability. In our comparison we focus on the most important Lenia implementations, selected for their prominence, accessibility and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy of single versus double precision floating point arithmetic. The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the ALife community for further development.
Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis
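The update rule being accelerated here can be stated compactly: convolve the grid with a ring-shaped kernel, map the result through a bell-shaped growth function, and integrate with a small time step. The pure-Python reference below is a minimal sketch of one such step (kernel shape and growth parameters are typical Lenia-style choices, not the benchmarked CUDA implementation):

```python
import math

def gaussian_bump(x, mu, sigma):
    """Unnormalised Gaussian bump, used both for the kernel and the growth map."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def lenia_step(grid, R=3, dt=0.1, mu=0.15, sigma=0.015):
    """One Lenia update: ring-kernel convolution (toroidal wrap), growth
    mapping, Euler integration with time step dt, then clipping to [0, 1]."""
    n = len(grid)
    # Ring kernel peaking at half the radius, normalised by its total weight
    kernel, ksum = {}, 0.0
    for dy in range(-R, R + 1):
        for dx in range(-R, R + 1):
            r = math.hypot(dx, dy) / R
            if 0 < r <= 1:
                kernel[(dy, dx)] = gaussian_bump(r, 0.5, 0.15)
                ksum += kernel[(dy, dx)]
    new = [[0.0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            u = sum(w * grid[(y + dy) % n][(x + dx) % n]
                    for (dy, dx), w in kernel.items()) / ksum
            growth = 2.0 * gaussian_bump(u, mu, sigma) - 1.0
            new[y][x] = min(1.0, max(0.0, grid[y][x] + dt * growth))
    return new

# A small deterministic starting state evolved one step under the continuous rule
state = [[((3 * i + 5 * j) % 7) / 7.0 for j in range(16)] for i in range(16)]
state = lenia_step(state)
```

The CUDA version parallelises the per-cell loop across GPU threads and, for large kernels, replaces the direct convolution with an FFT-based one, which is where the order-of-magnitude speedups come from.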
Procedia PDF Downloads 44
5929 Assessment of Air Pollutant Dispersion and Soil Contamination: The Critical Role of MATLAB Modeling in Evaluating Emissions from the Covanta Municipal Solid Waste Incineration Facility
Authors: Jadon Matthiasa, Cindy Donga, Ali Al Jibouria, Hsin Kuo
Abstract:
The environmental impact of emissions from the Covanta Waste-to-Energy facility in Burnaby, BC, was comprehensively evaluated, focusing on the dispersion of air pollutants and the subsequent assessment of heavy metal contamination in surrounding soils. A Gaussian Plume Model, implemented in MATLAB, was utilized to simulate the dispersion of key pollutants to understand their atmospheric behaviour and potential deposition patterns. The MATLAB code developed for this study enhanced the accuracy of pollutant concentration predictions and provided capabilities for visualizing pollutant dispersion in 3D plots. Furthermore, the code could predict the maximum concentration of pollutants at ground level, eliminating the need to use the Ranchoux model for predictions. Complementing the modelling approach, empirical soil sampling and analysis were conducted to evaluate heavy metal concentrations in the vicinity of the facility. This integrated methodology underscored the importance of computational modelling in air pollution assessment and highlighted the necessity of soil analysis to obtain a holistic understanding of environmental impacts. The findings emphasized the effectiveness of current emissions controls while advocating for ongoing monitoring to safeguard public health and environmental integrity.
Keywords: air emissions, Gaussian Plume Model, MATLAB, soil contamination, air pollution monitoring, waste-to-energy, pollutant dispersion visualization, heavy metal analysis, environmental impact assessment, emission control effectiveness
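The Gaussian Plume Model the study implements in MATLAB can be sketched in a few lines. The version below includes the standard ground-reflection (image source) term; the power-law dispersion coefficients are hypothetical stand-ins for a neutral stability class, not the values used in the study:

```python
import math

def sigma_pg(x, a=0.08, b=0.0001, c=0.06, d=0.0015):
    """Hypothetical Pasquill-Gifford-style dispersion coefficients (m)
    as a function of downwind distance x (m)."""
    sigma_y = a * x / math.sqrt(1 + b * x)
    sigma_z = c * x / math.sqrt(1 + d * x)
    return sigma_y, sigma_z

def gaussian_plume(Q, u, H, x, y, z):
    """Steady-state plume concentration (g/m^3) at (x, y, z).
    Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m).
    The second exponential is the image source enforcing ground reflection."""
    sy, sz = sigma_pg(x)
    lateral = math.exp(-y ** 2 / (2 * sy ** 2))
    vertical = (math.exp(-(z - H) ** 2 / (2 * sz ** 2))
                + math.exp(-(z + H) ** 2 / (2 * sz ** 2)))
    return Q / (2 * math.pi * u * sy * sz) * lateral * vertical

# Ground-level concentration 2 km downwind on the plume centerline
c = gaussian_plume(Q=50.0, u=4.0, H=60.0, x=2000.0, y=0.0, z=0.0)
```

Scanning `x` along the centerline at `z = 0` locates the ground-level maximum directly, which is the prediction the study's MATLAB code makes without resorting to the Ranchoux model.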
Procedia PDF Downloads 20
5928 New Variational Approach for Contrast Enhancement of Color Image
Authors: Wanhyun Cho, Seongchae Seo, Soonja Kang
Abstract:
In this work, we propose a variational technique for image contrast enhancement which utilizes global and local information around each pixel. The energy functional is defined by a weighted linear combination of three terms: a local contrast term, a global contrast term and a dispersion term. The first is a local contrast term that improves the contrast of an input image by increasing the grey-level differences between each pixel and its neighbors, thereby utilizing contextual information around each pixel. The second is a global contrast term, which enhances the contrast of the image by minimizing the difference between its empirical distribution function and a cumulative distribution function, so that the probability distribution of pixel values becomes symmetric about the median. The third is a dispersion term that controls the departure between the new pixel value and the pixel value of the original image, preserving the original image characteristics as far as possible. Second, we derive the Euler-Lagrange equation for the image that achieves the minimum of the proposed functional, using the fundamental lemma of the calculus of variations. We then solve this equation using a gradient descent method, one of the dynamic approximation techniques. Finally, through various experiments, we demonstrate that the proposed method can enhance the contrast of colour images better than existing techniques.
Keywords: color image, contrast enhancement technique, variational approach, Euler-Lagrange equation, dynamic approximation method, EME measure
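The gradient-descent solution of such a contrast energy can be illustrated in 1D with a deliberately simplified two-term energy (local contrast plus fidelity; the paper's global contrast term is omitted, and the local mean is treated as fixed within each step). This is a sketch of the optimization scheme, not the authors' functional:

```python
def enhance(signal, alpha=0.5, lam=1.0, step=0.05, iters=200):
    """Gradient descent on E(u) = -alpha * sum_i (u_i - m_i)^2
    + lam * sum_i (u_i - f_i)^2, where m_i is the mean over a 3-sample
    window around i (including i) and f is the input. The first term
    rewards local contrast, the second keeps u close to the input."""
    f = list(signal)
    u = list(signal)
    n = len(u)
    for _ in range(iters):
        grad = [0.0] * n
        for i in range(n):
            lo, hi = max(0, i - 1), min(n, i + 2)
            m = sum(u[lo:hi]) / (hi - lo)  # local mean, held fixed this step
            grad[i] = -2 * alpha * (u[i] - m) + 2 * lam * (u[i] - f[i])
        u = [min(1.0, max(0.0, ui - step * g)) for ui, g in zip(u, grad)]
    return u

low_contrast = [0.45, 0.45, 0.55, 0.55, 0.45, 0.55]
out = enhance(low_contrast)
```

Each step pushes pixels away from their local mean (raising contrast) while the fidelity term, like the paper's dispersion term, anchors the result to the original values; the clamp to [0, 1] keeps grey levels valid.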
Procedia PDF Downloads 450
5927 Contextual Toxicity Detection with Data Augmentation
Authors: Julia Ive, Lucia Specia
Abstract:
Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term for a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem that has been addressed by very few previous studies. These studies analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing the human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, here taken to be previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available does not provide sufficient evidence that context is indeed important (even for humans). This data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity containing swear words, slurs, etc.), so that context is not needed for a decision, or are ambiguous, vague or unclear even in the presence of context; in addition, the data contains labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious without context (i.e., covert cases), or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). Regarding the contextual detection models, we posit that their poor performance is due to limitations of both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on this, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our models against previous ones on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure.
Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing
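The hierarchical idea in the abstract — encode each conversational utterance separately, then combine context and target rather than flattening the thread into one token sequence — can be sketched without any deep-learning stack. The snippet below is a toy Python illustration under stated assumptions: the hashed bag-of-words encoder and mean pooling are stand-ins for the neural encoders the paper would actually use, and the function names are hypothetical:

```python
import numpy as np

def encode_utterance(tokens, dim=64):
    """Hashed bag-of-words utterance embedding: an illustrative stand-in
    for a learned sentence encoder. Returns an L2-normalized vector."""
    v = np.zeros(dim)
    for t in tokens:
        v[hash(t) % dim] += 1.0
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def hierarchical_features(context_utterances, target_utterance, dim=64):
    """Two-level representation: encode each utterance separately, pool the
    context utterances, and concatenate with the target encoding. A classifier
    on top then sees the conversation structure, not one flat token stream."""
    ctx = [encode_utterance(u.split(), dim) for u in context_utterances]
    ctx_vec = np.mean(ctx, axis=0) if ctx else np.zeros(dim)
    tgt_vec = encode_utterance(target_utterance.split(), dim)
    return np.concatenate([ctx_vec, tgt_vec])

feats = hierarchical_features(["earlier tweet in the thread"],
                              "the tweet to classify")
```

The key design point mirrored here is that the context and target occupy separate feature slots, so a downstream classifier can weight them differently instead of treating the concatenated text as a single utterance.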
Procedia PDF Downloads 171