Search results for: microscopic techniques

5896 Methodology: A Review in Modelling and Predictability of Embankment in Soft Ground

Authors: Bhim Kumar Dahal

Abstract:

Transportation network development in developing countries is proceeding at a rapid pace. The majority of these networks consist of railways and expressways that pass through diverse topography, landforms and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper focuses on the various advanced ground improvement techniques used to improve soft soil, the modelling approaches adopted, and their predictability for embankment construction. Ground improvement techniques can be broadly classified into three groups, i.e., the densification group, the drainage and consolidation group, and the reinforcement group, which are discussed with some case studies. Various methods have been used in modelling the embankments, from simple one-dimensional to complex three-dimensional models, using a variety of constitutive models. However, the reliability of the predictions is not found to improve systematically with the level of sophistication, and the predictions sometimes deviate by more than 60% from the monitored values even when the same level of sophistication is used. This deviation is mainly attributed to the selection of the constitutive model, the assumptions made at different stages, variation in the selection of model parameters, and simplification during physical modelling of the ground conditions. The deviation can be reduced by using optimization processes, optimization tools and sensitivity analysis of the model parameters, which guide the selection of appropriate model parameters.
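
To illustrate the simplest end of the modelling spectrum mentioned above, the sketch below estimates the primary consolidation settlement of a soft clay layer beneath an embankment using the classical one-dimensional (Terzaghi) approach; all soil parameters and the embankment load are hypothetical values chosen for illustration and are not data from this study.

```python
import math

# Hypothetical soft clay layer and embankment load (illustrative values only)
H = 6.0             # clay layer thickness (m)
e0 = 1.2            # initial void ratio
Cc = 0.45           # compression index
sigma0 = 40.0       # initial effective vertical stress at mid-layer (kPa)
gamma_fill = 18.0   # unit weight of embankment fill (kN/m^3)
h_embankment = 4.0  # embankment height (m)

# Stress increase from the embankment (1-D assumption: full load reaches mid-layer)
delta_sigma = gamma_fill * h_embankment

# Primary consolidation settlement of a normally consolidated clay
settlement = (Cc / (1.0 + e0)) * H * math.log10((sigma0 + delta_sigma) / sigma0)
print(f"Estimated primary consolidation settlement: {settlement:.2f} m")
```

Even this simple calculation shows how sensitive the prediction is to the chosen parameters (e.g., Cc and the initial stress), which is the motivation for the sensitivity analysis recommended above.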

Keywords: cement, improvement, physical properties, strength

Procedia PDF Downloads 174
5895 Noise Barrier Technique as a Way to Improve the Sonic Urban Environment along Existing Roadways Assessment: El-Gish Road Street, Alexandria, Egypt

Authors: Nihal Atif Salim

Abstract:

To improve the quality of life in cities, a variety of interventions are used. Noise is a substantial and important form of pollution that has a negative impact on the urban environment and human health. According to a complaint survey conducted by the EEAA in 2019, it ranks second among environmental contamination complaints. The most significant source of noise in the city is traffic noise. In order to improve the sonic urban environment, many physical techniques are applied. Locally, noise barriers are considered one of the most appropriate physical techniques along existing traffic routes. Alexandria is Egypt's second-largest city after Cairo. It is located along the Mediterranean Sea, and El-Gish Road is one of the city's main arteries; it exposes the waterfront promenade that extends along the city to a high level of traffic noise. The purpose of this paper is to clarify the design considerations for the most appropriate noise barrier type along the promenade, with the goal of improving quality of life (QOL) and the sonic urban environment specifically. The proposed methodology first focuses on how noise affects human perception and the environment. Then it delves into the various physical noise control approaches. After that, the paper discusses sustainable design decision making. Finally, it looks into the importance of incorporating sustainability into design decision making. Three stages are followed in the case study. The first stage involves a site inspection and the use of sound measurement equipment (a noise level meter) to measure the noise level at many points along the promenade; the findings are shown on a noise map. The second stage is to inquire about the site's user experience. The third stage is to investigate the various types of noise barriers and their effects on QOL along existing routes in order to select the most appropriate type. The goal of this research is to evaluate a noise barrier design that satisfies environmental and social perceptions while maintaining a balanced approach to the noise issue, in order to improve QOL along existing roadways in the local area.

Keywords: noise pollution, sonic urban environment, traffic noise, noise barrier, acoustic sustainability, noise reduction techniques

Procedia PDF Downloads 137
5894 Influence of Microstructure on Deformation Mechanisms and Mechanical Properties of Additively Manufactured Steel

Authors: Etienne Bonnaud, David Lindell

Abstract:

Correlations between microstructure, deformation mechanisms, and mechanical properties in additively manufactured 316L steel components have been investigated. Mechanical properties in the vertical direction (building direction) and in the horizontal direction (in-plane directions) are markedly different. Vertically built specimens show lower yield stress but higher elongation than their horizontally built counterparts. Microscopic observations by electron backscatter diffraction (EBSD) for both build orientations reveal a strong [110] fiber texture in the build direction but different grain morphologies. These microstructures are used as input in subsequent crystal plasticity numerical simulations to understand their influence on the deformation mechanisms and the mechanical properties. Mean-field simulations using a visco-plastic self-consistent (VPSC) model were carried out first but did not give results consistent with the tensile test experiments. A more detailed full-field model had to be used, based on the visco-plastic fast Fourier transform (VPFFT) method. A more accurate microstructure description was then input to the simulation model, where thin vertical regions of smaller grains were also taken into account. It turned out that these small grain clusters were responsible for the discrepancies in yield stress and hardening. Texture and morphology have a strong effect on mechanical properties. The different mechanical behaviors of vertically and horizontally printed specimens could be explained by means of numerical full-field crystal plasticity simulations, and the presence of thin clusters of smaller grains was shown to play a central role in the deformation mechanisms.

Keywords: additive manufacturing, crystal plasticity, full-field simulations, mean-field simulations, texture

Procedia PDF Downloads 69
5893 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach

Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika

Abstract:

Both society and education teach people to communicate well in order to build up interpersonal skills. Everyone has the capacity to understand something new, whether with good comprehension or poor understanding. Poor understanding produces language errors when interactions take place at a first meeting between people who did not know each other beforehand because of geographical distance. The movie "The Space Between Us" tells a love-adventure story between a boy from Mars and a girl from Earth. There are many missed conversations because of their different climates and environments, so the moviegoer must also focus on the subtitles in order to enjoy the movie fully. Furthermore, the Indonesian subtitles and the English dialogue in the movie still show overlapping understanding in the translation. Translation here consists of a source language, SL (the English dialogue), and a target language, TL (the Indonesian subtitles). This research gap is formulated in the research questions of how language errors occur in the movie and what their effects are on translation quality, analyzed in depth through a translation study using a discourse analysis approach. The research goal is to describe the language errors and their translation quality in order to create a good atmosphere in movie media. The study uses an embedded, qualitative research design. The research locations consist of setting, participants, and events as the determined boundaries. The data sources are "The Space Between Us" movie and informants (translation quality raters). The sampling is criterion-based (purposive) sampling. Data collection techniques use content analysis and questionnaires. Data validation applies data source and method triangulation. Data analysis covers domain, taxonomy, componential, and cultural theme analysis. The findings on the language errors occurring in the movie include referential, register, societal, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors. The discussion of their effects on translation quality focuses on the translation techniques identified in the findings: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.

Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments

Procedia PDF Downloads 218
5892 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing

Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero

Abstract:

Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and their associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for the work of art. The surplus value derived from the prestige of the author, the exclusivity of the product or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open source digital manufacturing technologies in small art workshops can favor their permanence by offering great advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. It is possible to use pieces modeled by computer and made with FDM (fused deposition modeling) 3D printers that use PLA (polylactic acid) in artistic casting procedures. Models printed in PLA are limited to an approximate minimum size of 3 cm, and the optimal layer height resolution is 0.1 mm. Due to these limitations, it is not the most suitable technology for artistic casting processes of smaller pieces. One alternative that overcomes the size limitation is selective laser sintering (SLS) printers; another possibility is direct metal laser sintering (DMLS), in which a laser hardens metal powder layer by layer. However, due to their high cost, these technologies are difficult to introduce in small artistic foundries. Low-cost DLP (digital light processing) printers can offer high resolution for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes), and can print models with castable resins that allow subsequent direct artistic casting in precious metals or adaptation to processes such as electroforming. In this work, the design of a DLP 3D printer is detailed, using backlit LCD screens with ultraviolet light. Its development is totally open source and it is proposed as a kit made up of electronic components, based on Arduino, and mechanical components that are easy to source on the market. The CAD files of its components can be manufactured on low-cost FDM 3D printers. The result is a printer that costs less than 500 euros, offers high resolution, and has an open design with free access that allows not only its manufacture but also its improvement. In future work, we intend to carry out different comparative analyses that allow us to accurately estimate the print quality, as well as the real cost of the artistic works made with it.

Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming

Procedia PDF Downloads 141
5891 Performance Evaluation and Economic Analysis of Minimum Quantity Lubrication with Pressurized/Non-Pressurized Air and Nanofluid Mixture

Authors: M. Amrita, R. R. Srikant, A. V. Sita Rama Raju

Abstract:

Water-miscible cutting fluids are conventionally used to lubricate and cool the machining zone. However, issues related to health hazards, maintenance and disposal costs have limited their usage, leading to the application of minimum quantity lubrication (MQL). To increase the effectiveness of MQL, nanocutting fluids are proposed. In the present work, water-miscible nanographite cutting fluids of varying concentration are applied at the cutting zone by two systems, A and B. System A utilizes high-pressure air and supplies cutting fluid at a flow rate of 1 ml/min. System B uses low-pressure air and supplies cutting fluid at a flow rate of 5 ml/min. Their performance in machining is evaluated by measuring cutting temperatures, tool wear, cutting forces and surface roughness, and is compared with dry machining and flood machining. Application of nanocutting fluid using both systems showed better performance than dry machining. Cutting temperatures and cutting forces obtained by both techniques are higher than in flood machining, but tool wear and surface roughness showed improvement compared to flood machining. An economic analysis has been carried out in all cases to decide the applicability of the techniques.

Keywords: economic analysis, machining, minimum quantity lubrication, nanofluid

Procedia PDF Downloads 379
5890 Mapping of Alteration Zones in Mineral Rich Belt of South-East Rajasthan Using Remote Sensing Techniques

Authors: Mrinmoy Dhara, Vivek K. Sengar, Shovan L. Chattoraj, Soumiya Bhattacharjee

Abstract:

Remote sensing techniques have emerged as an asset for various geological studies. Satellite images obtained by different sensors contain plenty of information related to the terrain. Digital image processing further helps in customized ways for the prospecting of minerals. In this study, an attempt has been made to map the hydrothermally altered zones using multispectral and hyperspectral datasets of South East Rajasthan. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Hyperion (Level 1R) datasets have been processed to generate different band ratio composites (BRCs). For this study, ASTER-derived BRCs were generated to delineate the alteration zones, gossans, abundant clays and host rocks. ASTER and Hyperion images were further processed to extract mineral end members, and classified mineral maps have been produced using the Spectral Angle Mapper (SAM) method. Results were validated with the geological map of the area, which shows positive agreement with the image processing outputs. Thus, this study concludes that band ratios and image processing in combination play a significant role in the demarcation of alteration zones, which may provide pathfinders for mineral prospecting studies.
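
As a minimal sketch of the Spectral Angle Mapper classification step described above, the snippet below computes the spectral angle between each pixel spectrum and a set of end-member spectra and assigns each pixel to the closest end member. The array shapes and end-member spectra are synthetic placeholders, not the ASTER/Hyperion data used in the study.

```python
import numpy as np

def spectral_angle_mapper(image, endmembers):
    """Classify each pixel by the smallest spectral angle to an end member.

    image:      (rows, cols, bands) reflectance cube
    endmembers: (n_endmembers, bands) reference spectra
    returns:    (rows, cols) index map of the best-matching end member
    """
    pixels = image.reshape(-1, image.shape[-1]).astype(float)        # (N, bands)
    # Normalize pixel and end-member spectra to unit length
    p_norm = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    e_norm = endmembers / np.linalg.norm(endmembers, axis=1, keepdims=True)
    # cos(angle) between every pixel and every end member, then the angle itself
    cosines = np.clip(p_norm @ e_norm.T, -1.0, 1.0)                  # (N, n_endmembers)
    angles = np.arccos(cosines)
    return angles.argmin(axis=1).reshape(image.shape[:2])

# Tiny synthetic example (2 end members, 4 bands)
cube = np.random.rand(50, 50, 4)
ems = np.array([[0.1, 0.3, 0.5, 0.7],
                [0.6, 0.4, 0.3, 0.1]])
class_map = spectral_angle_mapper(cube, ems)
```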

Keywords: ASTER, Hyperion, band ratios, alteration zones, SAM

Procedia PDF Downloads 278
5889 Vehicular Speed Detection Camera System Using Video Stream

Authors: C. A. Anser Pasha

Abstract:

In this paper, a new vehicular speed detection camera system (SDCS) is presented that is applicable as an alternative to traditional radars, with the same or even better accuracy. The real-time measurement and analysis of various traffic parameters such as speed and number of vehicles are increasingly required in traffic control and management. Image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering. Various algorithms based on image processing techniques have been applied to detect multiple vehicles and track them. The SDCS processing can be divided into three successive phases. The first phase is object detection, which uses a hybrid algorithm that combines an adaptive background subtraction technique with a three-frame differencing algorithm, rectifying the major drawback of using adaptive background subtraction alone. The second phase is object tracking, which consists of three successive operations: object segmentation, object labeling, and object center extraction. The tracking operation takes into consideration the different possible scenarios of a moving object, such as simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving the scene while another enters it. The third phase is speed calculation, in which speed is computed from the number of frames the object takes to pass through the scene.
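
The sketch below illustrates, under stated assumptions, how the detection and speed-calculation phases described above can be combined: an adaptive background subtractor and a three-frame differencing mask are fused, and speed is derived from the number of frames a vehicle spends in view. The file name, scene length, frame rate and thresholds are hypothetical calibration values, and the single-object "vehicle present" test is a deliberate simplification of the tracking phase.

```python
import cv2

# Hypothetical calibration: the camera's field of view spans 20 m of road,
# and the video runs at 30 frames per second (illustrative values only).
SCENE_LENGTH_M = 20.0
FPS = 30.0

cap = cv2.VideoCapture("traffic.mp4")   # hypothetical input clip
backsub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

prev_gray, prev_prev_gray = None, None
frames_in_scene = 0          # frames a (single) tracked vehicle stays in view

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Adaptive background subtraction mask
    fg_mask = backsub.apply(frame)

    # Three-frame differencing mask, combined with the background mask to
    # offset the weaknesses of background subtraction alone
    if prev_gray is not None and prev_prev_gray is not None:
        d1 = cv2.absdiff(gray, prev_gray)
        d2 = cv2.absdiff(prev_gray, prev_prev_gray)
        motion = cv2.bitwise_and(cv2.threshold(d1, 25, 255, cv2.THRESH_BINARY)[1],
                                 cv2.threshold(d2, 25, 255, cv2.THRESH_BINARY)[1])
        detection = cv2.bitwise_or(fg_mask, motion)
        if cv2.countNonZero(detection) > 500:   # crude "vehicle present" test
            frames_in_scene += 1

    prev_prev_gray, prev_gray = prev_gray, gray

cap.release()

# Speed from the number of frames the object needed to cross the scene
if frames_in_scene:
    speed_mps = SCENE_LENGTH_M / (frames_in_scene / FPS)
    print(f"Estimated speed: {speed_mps * 3.6:.1f} km/h")
```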

Keywords: radar, image processing, detection, tracking, segmentation

Procedia PDF Downloads 466
5888 The Application of Sensory Integration Techniques in Science Teaching Students with Autism

Authors: Joanna Estkowska

Abstract:

The sensory integration method is aimed primarily at children with learning disabilities. It can also be used as a complementary method in the treatment of children with cerebral palsy, autism, intellectual disability, blindness and deafness. Autism is a holistic developmental disorder that manifests itself in the specific functioning of a child. The most characteristic features are disorders in communication, difficulties in social relations, rigid patterns of behavior and impairment in sensory processing. In addition to these disorders, abnormal intellectual development, attention deficit disorders, perceptual disorders and others may occur. This study focused on the application of sensory integration techniques in the science education of autistic students. The lack of proper sensory integration causes problems with complicated processes such as motor coordination, movement planning, visual or auditory perception, speech, writing, reading or counting. Good functioning and cooperation of the proprioceptive, tactile and vestibular senses affect the child's mastery of skills that require coordination of both sides of the body and synchronization of the cerebral hemispheres. These include, for example, all sports activities, precise manual skills such as writing, as well as reading and counting skills. All this takes place in stages: achieving the skills of the first stage determines the development of abilities at the next level, and any deficit within the first three stages can affect the development of new skills. This ultimately reflects on achievements at school and in further professional and personal life. After careful analysis, symptoms from the emotional and social spheres appear to be secondary to deficits of sensory integration. During our research, the students gained knowledge and skills in the classroom through experience by learning biology, chemistry and physics with the application of sensory integration techniques. Sensory integration therapy aims to teach the child an adequate response to stimuli coming both from the outside world and from the body. Thanks to properly selected exercises, a child can improve perception and interpretation skills, motor skills, coordination of movements, attention and concentration or self-awareness, as well as social and emotional functioning.

Keywords: autism spectrum disorder, science education, sensory integration, special educational needs

Procedia PDF Downloads 184
5887 A Semiotic Approach to Vulnerability in Conducting Gesture and Singing Posture

Authors: Johann Van Niekerk

Abstract:

The disciplines of conducting (instrumental or choral) and of singing presume a willingness toward an open posture and, in many cases, demand it for effective communication and technique. Yet this very openness, with the "spread-eagle" gesture as an extreme, is oftentimes counterintuitive for musicians and within the trajectory of human evolution. Conversely, it is on this very gesture of "taking up space" that confidence-gaining techniques such as the popular "power pose" are based. This paper consists primarily of a literature review, exploring the topics of physical openness and vulnerability and considering the semiotics of the "spread-eagle" and its accompanying letter X. A major finding of this research is the discrepancy between the evolutionary instinct towards physical self-protection and "folding in" and the discipline's demands of physical and gestural openness, expansiveness and vulnerability. A secondary finding concerns ways in which the encouragement of confidence-gaining techniques may be more effective in obtaining the required results than an insistence on vulnerability, which is shaped by various cultural contexts and socialization. Choral conductors and music educators are constantly seeking ways to promote engagement and healthy singing. Much of the information and direction toward this goal is gleaned by students from conducting gestures and other pedagogies employed in the rehearsal. The findings of this research provide yet another avenue toward reaching the goals required for sufficient and effective teaching and artistry on the part of instructors and students alike.

Keywords: conducting, gesture, music, pedagogy, posture, vulnerability

Procedia PDF Downloads 79
5886 Sensitivity and Uncertainty Analysis of One Dimensional Shape Memory Alloy Constitutive Models

Authors: A. B. M. Rezaul Islam, Ernur Karadogan

Abstract:

Shape memory alloys (SMAs) are known for their shape memory effect and pseudoelastic behavior. Their thermomechanical behavior has been modeled by numerous researchers from both microscopic thermodynamic and macroscopic phenomenological points of view. The Tanaka, Liang-Rogers and Ivshin-Pence models are some of the most popular macroscopic phenomenological SMA constitutive models. They describe SMA behavior in terms of stress, strain and temperature. These models involve material parameters that carry associated uncertainty. At different operating temperatures, this uncertainty propagates to the output when the material is subjected to loading followed by unloading. The propagation of uncertainty when utilizing these models in real-life applications can result in performance discrepancies or failure at extreme conditions. To address this, we used a probabilistic approach to perform the sensitivity and uncertainty analysis of the Tanaka, Liang-Rogers, and Ivshin-Pence models. Sobol and extended Fourier Amplitude Sensitivity Testing (eFAST) methods have been used to perform the sensitivity analysis for simulated isothermal loading/unloading at various operating temperatures. The results show that the models vary with the operating temperature and loading condition. The average and stress-dependent sensitivity indices identify the most significant parameters at several temperatures. This work highlights the sensitivity and uncertainty analysis results and compares them across different temperatures and loading conditions for all three models. The analysis presented will aid in designing engineering applications by reducing the probability of model failure due to uncertainty in the input parameters. Thus, a proper understanding of the sensitive parameters and of the uncertainty propagation at several operating temperatures and loading conditions is recommended for the Tanaka, Liang-Rogers, and Ivshin-Pence models.
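
A minimal sketch of a Sobol sensitivity analysis of the kind described above is given below, using the open-source SALib package as an assumed tool (the abstract does not state which software the authors used). The parameter names, bounds and the stand-in model are hypothetical; a real study would evaluate the Tanaka, Liang-Rogers or Ivshin-Pence constitutive equations instead.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Hypothetical SMA material parameters and ranges (illustration only; not the
# calibrated Tanaka / Liang-Rogers / Ivshin-Pence parameter sets of the study)
problem = {
    "num_vars": 3,
    "names": ["E_austenite", "E_martensite", "C_M"],
    "bounds": [[60e9, 80e9], [20e9, 40e9], [6e6, 9e6]],
}

def model_output(x):
    # Stand-in for a 1-D SMA constitutive model evaluated at a fixed
    # temperature and strain; returns a scalar "stress-like" response.
    E_a, E_m, C_M = x
    return 0.5 * (E_a + E_m) * 1e-3 + 10.0 * C_M * 1e-6

# Saltelli sampling followed by Sobol index estimation
X = saltelli.sample(problem, 1024)
Y = np.array([model_output(x) for x in X])
Si = sobol.analyze(problem, Y)
print("First-order indices:", Si["S1"])
print("Total-order indices:", Si["ST"])
```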

Keywords: constitutive models, FAST sensitivity analysis, sensitivity analysis, Sobol, shape memory alloy, uncertainty analysis

Procedia PDF Downloads 143
5885 Simulation Study of Asphaltene Deposition and Solubility of CO2 in the Brine during Cyclic CO2 Injection Process in Unconventional Tight Reservoirs

Authors: Rashid S. Mohammad, Shicheng Zhang, Sun Lu, Syed Jamal-Ud-Din, Xinzhe Zhao

Abstract:

A compositional reservoir simulation model (CMG-GEM) was used to study the cyclic CO2 injection process in an unconventional tight reservoir. Cyclic CO2 injection is an enhanced oil recovery process consisting of injection, shut-in, and production. Cyclic CO2 injection and hydrocarbon recovery in ultra-low permeability reservoirs are mainly a function of rock, fluid, and operational parameters. CMG-GEM was used to study several design parameters of the cyclic CO2 injection process, to distinguish the parameters with the greatest effect on oil recovery and to understand the behavior of cyclic CO2 injection in tight reservoirs. On the other hand, permeability reduction induced by asphaltene precipitation is one of the major issues in the oil industry, because plugging of the porous media reduces oil productivity. In addition to asphaltene deposition, solubility of CO2 in the aquifer is one of the safest and most permanent trapping techniques when considering CO2 storage mechanisms in geological formations. However, the effects of these uncertain parameters on CO2-enhanced oil recovery have not been understood systematically; hence, it is necessary to study the most significant parameters that dominate the process. The main objective of this study is to improve techniques for designing the cyclic CO2 injection process while considering the effects of asphaltene deposition and the solubility of CO2 in the brine, in order to prevent asphaltene precipitation, minimize CO2 emission, optimize cyclic CO2 injection, and maximize oil production.

Keywords: tight reservoirs, cyclic CO₂ injection, asphaltene, solubility, reservoir simulation

Procedia PDF Downloads 384
5884 Narrative Family Therapy and the Treatment of Perinatal Mood and Anxiety Disorders

Authors: Jamie E. Banker

Abstract:

For many families, pregnancy and the postpartum period are filled with both anticipation and change. For some pregnant or postpartum women, this time is marked by the onset of a mood or anxiety disorder. Experiencing a mood or anxiety disorder during this time of life differs from depression or anxiety at other times, not only because of the physical changes occurring in the mother's body but also because of the mental and physical preparation necessary to redefine family roles and responsibilities and to develop new identities in the life transition. The presence of a mood or anxiety disorder can influence the way in which a mother defines herself and can complicate her understanding of her abilities and competencies as a mother. The complexity of experiencing a mood or anxiety disorder in the midst of these changes necessitates specific treatment interventions that match both the symptomatology and the psychological adjustments. This study explores the use of narrative family therapy techniques when treating a mother who is experiencing postpartum depression. Externalization is a common technique used in narrative family therapy that can help clients separate their identity from the problems they are experiencing. This is crucial for a new mom who is in the middle of defining her identity during her transition to parenthood. The goal of this study is to examine how the use of externalization techniques helps postpartum women separate their mood and anxiety symptoms from their identity as mothers. An exploratory case study design was conducted in a single setting, a private practice therapy office, and explored how a narrative family therapy approach can be used to treat perinatal mood and anxiety disorders. The therapy sessions were audio recorded and transcribed. Constructivism and narrative theory are used as theoretical frameworks, and data from the therapy sessions and a follow-up survey were triangulated and analyzed. During the course of the treatment, the participant reported using the new externalizing labels for her symptoms. Within one month of treatment, the participant reported that she could stop herself from thinking the harmful thoughts more quickly, and within three months, the harmful thoughts went away. The main themes in this study were building courage and less self-blame. This case highlights the role narrative family therapy can play in the treatment of perinatal mood and anxiety disorders and the importance of separating a woman's mood from her identity as a mother. This conceptual framework was beneficial to the postpartum mother when treating perinatal mood and anxiety disorder symptoms.

Keywords: externalizing techniques, narrative family therapy, perinatal mood and anxiety disorders, postpartum depression

Procedia PDF Downloads 273
5883 Fabrication of Ligand Coated Lipid-Based Nanoparticles for Synergistic Treatment of Autoimmune Disease

Authors: Asiya Mahtab, Sushama Talegaonkar

Abstract:

This research is aimed at developing targeted lipid-based nanocarrier systems of chondroitin sulfate (CS) to deliver an antirheumatic drug to the inflammatory site in the arthritic paw. Lipid-based nanoparticles (TEF-lipo) were prepared using a thin-film hydration method, and the prepared drug-loaded nanoparticles were coated by an ionic interaction mechanism. TEF-lipo and CS-coated lipid nanoparticles (CS-lipo) were characterized for mean droplet size, zeta potential, and surface morphology. TEF-lipo and CS-lipo were further subjected to in vitro cell line studies on RAW 264.7 murine macrophage, U937, and MG 63 cell lines. A pharmacodynamic study was performed to establish the effectiveness of the prepared conventional and targeted lipid-based nanoparticles in comparison with the pure drug. The droplet size and zeta potential of TEF-lipo were found to be 128.92 ± 5.42 nm and +12.6 ± 1.2 mV. After coating of TEF-lipo with CS, the particle size increased to 155.6 ± 2.12 nm and the zeta potential changed to -10.2 ± 1.4 mV. Transmission electron microscopic analysis revealed that the nanovesicles were uniformly dispersed and detached from each other. The formulations followed a sustained release pattern up to 24 h. Results of the cell line studies indicated that the CS-lipo formulation showed the highest cytotoxic potential, thereby proving its enhanced ability to kill RAW 264.7 murine macrophage and U937 cells when compared with the other formulations. It is clear from our in vivo pharmacodynamic results that the targeted nanocarriers had a higher inhibitory effect on arthritis progression than nontargeted nanocarriers or the free drug. The results demonstrate that this approach can provide effective treatment for rheumatoid arthritis and that CS served as a potential prophylactic against the advancement of cartilage degeneration.

Keywords: adjuvant induced arthritis, chondroitin sulfate, rheumatoid arthritis, teriflunomide

Procedia PDF Downloads 135
5882 Dying and Sexuality − Controversial Motive in Contemporary Cinema

Authors: Małgorzata Jakubowska, Monika Michałowska

Abstract:

Since the beginning of the cinematographic industry, there has been a visible interest in two leading themes: death and sexuality. One of the reasons for the unfading popularity of these motifs is that death or sex employed as a leitmotif attracted great attention from viewers, and this guaranteed financial success. What seems interesting is that the themes of death and sexuality/eroticism appear to be mutually exclusive in mainstream movies, to such an extent that they almost never appear together on the screen. As leitmotifs they describe opposite experiences of human life: one refers to the affirmation of life, the other points to atrophy and decay. This film paradigm is rarely challenged. Thus, relatively little attention has been devoted so far to the entwining of dying and sexuality/eroticism in one movie. In our paper, we wish to take a closer look at visualizations of dying, with a focus on the aspect of sexuality/eroticism. Our analysis concentrates on contemporary European and American cinema, especially the recent productions that contribute to the cultural phenomenon of entwining these two realms of human life. We investigate the main clichés, plot and visual schemes, motifs and narrative techniques using the examples of Sweet November (2001), A Little Bit of Heaven (2011) and Now Is Good (2012). We also shed some light on recent film productions that seem to mark a shift in portraying the realms of dying and sexuality, concentrating on The Garden of Earthly Delights (2003) as the most paradigmatic example.

Keywords: contemporary cinema, dying and sexuality, narrative techniques, plot and visual schemes

Procedia PDF Downloads 396
5881 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform

Authors: David Jurado, Carlos Ávila

Abstract:

Detection of early signs of breast cancer development is crucial for quickly diagnosing the disease and defining adequate treatment to increase the survival probability of the patient. Computer-aided detection systems (CADs), along with modern data techniques such as machine learning (ML) and neural networks (NN), have shown an overall improvement in digital mammography cancer diagnosis, reducing the false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications, which are highly correlated with breast cancer development in its early stages. Along with image processing, automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design that extracts statistical features of the regions of interest. The results shown in this study prove the potential of this tool for further diagnostics and classification of mammographic images due to its low sensitivity to noisy images and low-contrast mammographies.
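
A minimal sketch of the detection pipeline described above is given below, assuming OpenCV in Python: contrast enhancement and smoothing, the circle Hough transform tuned for small bright objects, and a mask from which simple region statistics are read. The file name and every parameter value are illustrative guesses, not the settings used in the study.

```python
import cv2
import numpy as np

# Hypothetical mammogram patch; in practice this would be a DICOM/PNG crop
img = cv2.imread("mammogram_patch.png", cv2.IMREAD_GRAYSCALE)
if img is None:
    raise SystemExit("mammogram_patch.png not found (placeholder file name)")

# Contrast stretch and smoothing before the circle Hough transform
img_eq = cv2.equalizeHist(img)
blurred = cv2.GaussianBlur(img_eq, (5, 5), 0)

# Circle Hough transform tuned for small, bright, roughly circular objects
# (microcalcification-like); all parameter values are illustrative guesses
circles = cv2.HoughCircles(
    blurred,
    cv2.HOUGH_GRADIENT,
    dp=1,
    minDist=10,
    param1=120,   # Canny high threshold used internally
    param2=12,    # accumulator threshold: lower -> more (possibly false) circles
    minRadius=1,
    maxRadius=8,
)

mask = np.zeros_like(img)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        cv2.circle(mask, (x, y), r, 255, thickness=-1)  # build the ROI mask

# Simple statistical features of the masked regions of interest
roi_values = img[mask > 0]
if roi_values.size:
    print("ROI mean:", roi_values.mean(), "ROI std:", roi_values.std())
```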

Keywords: breast cancer, segmentation, X-ray imaging, Hough transform, image analysis

Procedia PDF Downloads 82
5880 Extended Literature Review on Sustainable Energy by Using Multi-Criteria Decision Making Techniques

Authors: Koray Altintas, Ozalp Vayvay

Abstract:

Increased global issues such as depletion of resources, environmental problems and social inequality have triggered public awareness of the need for sustainable solutions that ensure the well-being of current as well as future generations. Since energy plays a significant role in improved social and economic well-being and is imperative for both industrial and commercial wealth creation, it is essential to develop a standardized set of metrics that makes it possible to indicate the present condition relative to conditions in the past and to develop whatever perspective is required to frame actions for the future. This is not an easy task, considering the complexity of the issue, which requires integrating the economic, environmental and social aspects of sustainable energy. Multi-criteria decision making (MCDM) can be considered a form of integrated sustainability evaluation and a decision support approach that can be used to solve complex problems featuring conflicting objectives, different forms of data and information, and multiple interests and perspectives. On that matter, MCDM methods are useful for providing solutions to complex energy management problems. The aim of this study is to review MCDM approaches that can be used for examining sustainable energy management. This study presents an insight into MCDM techniques and methods that can be useful for engineers, researchers and policy makers working in the energy sector.
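
As a concrete illustration of the kind of MCDM method reviewed above, the sketch below applies TOPSIS (one widely used MCDM technique, chosen here purely for illustration and not singled out in the abstract) to rank hypothetical energy alternatives against economic, environmental and social criteria. The alternatives, scores and weights are invented.

```python
import numpy as np

# Decision matrix: rows = hypothetical energy alternatives,
# columns = criteria (cost, CO2 emissions, jobs created) -- illustrative data
alternatives = ["Solar", "Wind", "Natural gas"]
X = np.array([
    [70.0,  45.0, 120.0],
    [60.0,  12.0,  90.0],
    [45.0, 490.0,  60.0],
])
weights = np.array([0.4, 0.4, 0.2])
benefit = np.array([False, False, True])  # cost/emissions: lower is better

# 1. Vector-normalize and weight the decision matrix
V = weights * X / np.linalg.norm(X, axis=0)

# 2. Ideal and anti-ideal solutions per criterion
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

# 3. Distances to the ideal/anti-ideal and the closeness coefficient
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(alternatives, closeness), key=lambda t: -t[1]):
    print(f"{name}: closeness = {c:.3f}")
```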

Keywords: sustainable energy, sustainability criteria, multi-criteria decision making, sustainability dimensions

Procedia PDF Downloads 329
5879 Patient Scheduling Improvement in a Cancer Treatment Clinic Using Optimization Techniques

Authors: Maryam Haghi, Ivan Contreras, Nadia Bhuiyan

Abstract:

Chemotherapy is one of the most popular and effective cancer treatments offered to patients in outpatient oncology centers. In such clinics, patients first consult with an oncologist, and the oncologist may prescribe a chemotherapy treatment plan for the patient based on blood test results and an examination of the patient's health status. Then, once the plan is determined, a set of chemotherapy and consultation appointments must be scheduled for the patient. In this work, a comprehensive mathematical formulation for planning and scheduling different types of chemotherapy patients over a planning horizon, considering the blood test, consultation, pharmacy and treatment stages, has been proposed. To be realistic and to provide an applicable model, this study focuses on a case study related to a major outpatient cancer treatment clinic in Montreal, Canada. Comparing the results of the proposed model with the current practice of the clinic under study shows significant improvements in different performance measures. These major improvements in the patients' schedules reveal that using optimization techniques in the planning and scheduling of patients in such highly demanded cancer treatment clinics is an essential step toward providing good coordination between the different stages involved, which ultimately increases the efficiency of the entire system and promotes staff and patient satisfaction.
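
The sketch below shows, in miniature, what an integer programming formulation of appointment scheduling can look like, using the open-source PuLP package as an assumed modelling tool. The patients, slots, capacities and costs are invented, and the model covers only a single assignment stage rather than the multi-stage formulation described above.

```python
import pulp

# Hypothetical instance: 4 patients, 3 daily treatment slots, 2 chairs per slot.
patients = ["P1", "P2", "P3", "P4"]
slots = ["09:00", "11:00", "14:00"]
chairs_per_slot = 2
# Illustrative "inconvenience" cost of assigning each patient to each slot
cost = {("P1", "09:00"): 1, ("P1", "11:00"): 2, ("P1", "14:00"): 3,
        ("P2", "09:00"): 2, ("P2", "11:00"): 1, ("P2", "14:00"): 3,
        ("P3", "09:00"): 3, ("P3", "11:00"): 1, ("P3", "14:00"): 2,
        ("P4", "09:00"): 2, ("P4", "11:00"): 3, ("P4", "14:00"): 1}

prob = pulp.LpProblem("chemo_scheduling", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", (patients, slots), cat="Binary")

# Objective: minimize the total inconvenience of the assignments
prob += pulp.lpSum(cost[p, s] * x[p][s] for p in patients for s in slots)

# Each patient gets exactly one appointment slot
for p in patients:
    prob += pulp.lpSum(x[p][s] for s in slots) == 1

# Chair capacity in every slot
for s in slots:
    prob += pulp.lpSum(x[p][s] for p in patients) <= chairs_per_slot

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for p in patients:
    for s in slots:
        if x[p][s].value() == 1:
            print(f"{p} -> {s}")
```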

Keywords: chemotherapy patients scheduling, integer programming, integrated scheduling, staff balancing

Procedia PDF Downloads 174
5878 A Discourse on the Rhythmic Pattern Employed in Yoruba Sakara Music of Nigeria

Authors: Oludare Olupemi Ezekiel

Abstract:

This research examines the rhythmic structure of Sakara music by tracing its roots and analyzing the various rhythmic patterns of this neo-traditional genre, as well as the contributions of its major exponents and contemporary practitioners, using these as a model for understanding and establishing African rhythms. Biographies of the major exponents and contemporary practitioners, interviews and participant observation were used to elicit information. Samples of the genre, chosen at random, were transcribed, notated and analyzed for academic use and documentation. The research affirmed that rhythms such as the hemiola, cross-rhythm, clave or bell rhythm, and percussive, speech and melodic rhythms, along with other relevant rhythmic theories, are prevalent in and applicable to Sakara music, and the analysis of the music thereby makes important contributions to musical scholarship. The analysis and discussion carried out in the research point to the conclusion that Yoruba musicians are guided by preconceptions and sound musical considerations in making their rhythmic patterns, which are used as compositional techniques and are not mere incidental occurrences. These rhythmic patterns, with their consequent socio-cultural connotations, enhance musical values and national identity in Nigeria. The study concludes by recommending that musicologists carry out more research into this and other neo-traditional genres in order to advance the globalisation of African music.

Keywords: compositional techniques, globalisation, identity, neo-traditional, rhythmic theory, Sakara music

Procedia PDF Downloads 441
5877 Classification of Digital Chest Radiographs Using Image Processing Techniques to Aid in Diagnosis of Pulmonary Tuberculosis

Authors: A. J. S. P. Nileema, S. Kulatunga, S. H. Palihawadana

Abstract:

A computer-aided detection (CAD) system was developed for the diagnosis of pulmonary tuberculosis (PTB) from digital chest X-rays using MATLAB image processing techniques and a statistical approach. The study comprised 200 digital chest radiographs collected from the National Hospital for Respiratory Diseases, Welisara, Sri Lanka. Pre-processing was done to remove identification details. Lung fields were segmented and then divided into four quadrants (right upper, left upper, right lower, and left lower) using image processing techniques in MATLAB. Contrast, correlation, homogeneity, energy, entropy, and maximum probability texture features were extracted using the gray-level co-occurrence matrix method. Descriptive statistics and normal distribution analysis were performed using SPSS. Based on the radiologists' interpretation, the chest radiographs were classified manually into PTB-positive (PTBP) and PTB-negative (PTBN) classes. Features with a standard normal distribution were analyzed using an independent-sample t-test for the PTBP and PTBN chest radiographs. Among the six features tested, the contrast, correlation, energy, entropy, and maximum probability features showed a statistically significant difference between the two classes at the 95% confidence level and could therefore be used in the classification of chest radiographs for PTB diagnosis. With the resulting value ranges of the five normally distributed texture features, a classification algorithm was then defined to recognize and classify the quadrant images: if the texture feature values of the quadrant image being tested fall within the defined region, the quadrant is identified as PTBP (abnormal) and labeled 'Abnormal' in red with its border highlighted in red, whereas if the texture feature values fall outside the defined value range, it is identified as PTBN (normal) and labeled 'Normal' in blue, with no changes to the image outline. The developed classification algorithm showed a high sensitivity of 92%, which makes it an efficient CAD system, with a modest specificity of 70%.
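
The study used MATLAB; purely as an illustration of the same gray-level co-occurrence matrix feature extraction, the sketch below re-implements the six texture features for one quadrant in Python with scikit-image (an assumption, not the authors' toolchain). The synthetic quadrant stands in for a segmented lung-field region.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def quadrant_texture_features(quadrant):
    """GLCM texture features for one lung-field quadrant (8-bit grayscale array).

    Returns contrast, correlation, homogeneity, energy, entropy and
    maximum probability, mirroring the feature set described above.
    """
    glcm = graycomatrix(quadrant, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                      # normalized co-occurrence matrix
    feats = {prop: graycoprops(glcm, prop)[0, 0]
             for prop in ("contrast", "correlation", "homogeneity", "energy")}
    feats["entropy"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    feats["max_probability"] = p.max()
    return feats

# Tiny synthetic quadrant standing in for a segmented chest-radiograph region
quadrant = (np.random.rand(64, 64) * 255).astype(np.uint8)
for name, value in quadrant_texture_features(quadrant).items():
    print(f"{name}: {value:.4f}")
```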

Keywords: chest radiographs, computer aided detection, image processing, pulmonary tuberculosis

Procedia PDF Downloads 125
5876 Advancements in Arthroscopic Surgery Techniques for Anterior Cruciate Ligament (ACL) Reconstruction

Authors: Islam Sherif, Ahmed Ashour, Ahmed Hassan, Hatem Osman

Abstract:

Anterior Cruciate Ligament (ACL) injuries are common among athletes and individuals participating in sports with sudden stops, pivots, and changes in direction. Arthroscopic surgery is the gold standard for ACL reconstruction, aiming to restore knee stability and function. Recent years have witnessed significant advancements in arthroscopic surgery techniques, graft materials, and technological innovations, revolutionizing the field of ACL reconstruction. This presentation delves into the latest advancements in arthroscopic surgery techniques for ACL reconstruction and their potential impact on patient outcomes. Traditionally, autografts from the patellar tendon, hamstring tendon, or quadriceps tendon have been commonly used for ACL reconstruction. However, recent studies have explored the use of allografts, synthetic scaffolds, and tissue-engineered grafts as viable alternatives. This abstract evaluates the benefits and potential drawbacks of each graft type, considering factors such as graft incorporation, strength, and risk of graft failure. Moreover, the application of augmented reality (AR) and virtual reality (VR) technologies in surgical planning and intraoperative navigation has gained traction. AR and VR platforms provide surgeons with detailed 3D anatomical reconstructions of the knee joint, enhancing preoperative visualization and aiding in graft tunnel placement during surgery. We discuss the integration of AR and VR in arthroscopic ACL reconstruction procedures, evaluating their accuracy, cost-effectiveness, and overall impact on surgical outcomes. Beyond graft selection and surgical navigation, patient-specific planning has gained attention in recent research. Advanced imaging techniques, such as MRI-based personalized planning, enable surgeons to tailor ACL reconstruction procedures to each patient's unique anatomy. By accounting for individual variations in the femoral and tibial insertion sites, this personalized approach aims to optimize graft placement and potentially improve postoperative knee kinematics and stability. Furthermore, rehabilitation and postoperative care play a crucial role in the success of ACL reconstruction. This abstract explores novel rehabilitation protocols, emphasizing early mobilization, neuromuscular training, and accelerated recovery strategies. Integrating technology, such as wearable sensors and mobile applications, into postoperative care can facilitate remote monitoring and timely intervention, contributing to enhanced rehabilitation outcomes. In conclusion, this presentation provides an overview of the cutting-edge advancements in arthroscopic surgery techniques for ACL reconstruction. By embracing innovative graft materials, augmented reality, patient-specific planning, and technology-driven rehabilitation, orthopedic surgeons and sports medicine specialists can achieve superior outcomes in ACL injury management. These developments hold great promise for improving the functional outcomes and long-term success rates of ACL reconstruction, benefitting athletes and patients alike.

Keywords: arthroscopic surgery, ACL, autograft, allograft, graft materials, ACL reconstruction, synthetic scaffolds, tissue-engineered graft, virtual reality, augmented reality, surgical planning, intra-operative navigation

Procedia PDF Downloads 91
5875 The Power of in situ Characterization Techniques in Heterogeneous Catalysis: A Case Study of Deacon Reaction

Authors: Ramzi Farra, Detre Teschner, Marc Willinger, Robert Schlögl

Abstract:

Introduction: The conventional approach of characterizing solid catalysts under static conditions, i.e., before and after reaction, does not provide sufficient knowledge of the physicochemical processes occurring under dynamic conditions at the molecular level. Hence, the development of new in situ characterization techniques with the potential to be used under real catalytic reaction conditions is highly desirable. In situ prompt gamma activation analysis (PGAA) is a rapidly developing chemical analytical technique that enables us to experimentally assess the coverage of surface species under catalytic turnover and to correlate these with the reactivity. Catalytic HCl oxidation (the Deacon reaction) over bulk ceria serves as our example. Furthermore, in situ transmission electron microscopy is a powerful technique that can contribute to the study of atmosphere- and temperature-induced morphological or compositional changes of a catalyst at atomic resolution. The application of such techniques (PGAA and TEM) will pave the way to a greater and deeper understanding of the dynamic nature of active catalysts. Experimental/Methodology: In situ PGAA experiments were carried out to determine the Cl uptake and the degree of surface chlorination under reaction conditions by varying p(O2), p(HCl), p(Cl2), and the reaction temperature. The abundance and dynamic evolution of OH groups on the working catalyst under various steady-state conditions were studied by means of in situ FTIR with a specially designed home-made transmission cell. For real in situ TEM we use a commercial in situ holder with a home-built gas feeding system and gas analytics. Conclusions: Two complementary in situ techniques, namely in situ PGAA and in situ FTIR, were utilized to investigate the surface coverage of the two most abundant species (Cl and OH). The OH density and Cl uptake were followed under multiple steady-state conditions as a function of p(O2), p(HCl), p(Cl2), and temperature. These experiments showed that the OH density correlates positively with the reactivity, whereas Cl correlates negatively. The p(HCl) experiments give rise to increased activity accompanied by an increase in Cl coverage (the opposite trend to p(O2) and T). Cl2 strongly inhibits the reaction, but no measurable increase of the Cl uptake was found. After considering all of these observations, we conclude that only a minority of the available adsorption sites contribute to the reactivity. In addition, a mechanism for the catalysed reaction is proposed: the chlorine-oxygen competition for the available active sites renders re-oxidation the rate-determining step of the catalysed reaction. Further investigations using in situ TEM are planned and will be conducted in the near future. Such experiments allow us to monitor active catalysts at the atomic scale under the most realistic conditions of temperature and pressure. The talk will shed light on the potential and limitations of in situ PGAA and in situ TEM in the study of catalyst dynamics.

Keywords: CeO2, deacon process, in situ PGAA, in situ TEM, in situ FTIR

Procedia PDF Downloads 288
5874 Public Behavior When Encountered with a Road Traffic Accident

Authors: H. N. S. Silva, S. N. Silva

Abstract:

Introduction: The latest WHO data, published in 2014, state that Sri Lanka records 2,773 deaths and over 14,000 injuries due to road traffic accidents (RTAs) each year. Previous studies noted that policemen, three-wheel drivers and pedestrians were the first to respond to RTAs, but that the victims' condition was aggravated by unskilled attempts made by the responders while managing the victims' wounds, moving and positioning the victims, and, above all, transporting the victims. Objective: To observe the practices of the urban public in Sri Lanka when encountering RTAs. Methods: A qualitative study was done to analyze public behavior seen in video recordings of accident scenes purposefully selected from social media, news websites, YouTube and Google. Results: The results showed that all individuals who tried to help during an RTA were middle-aged men, who were mainly pedestrians, motorcyclists and policemen at that moment. The vast majority were very keen to help the victims get to hospital as soon as possible and actively participated in providing 'aid'. The main problem was that the first aid attempts were disorganized and uncoordinated. Even though all individuals knew how to control external bleeding, none of them was aware of spinal injury prevention techniques or the management of limb injuries. Most of the transportation methods and transfer techniques used were inappropriate and prone to causing further injury. Conclusions: The public actively engages in providing aid despite inappropriate first aid practices.

Keywords: encountered, pedestrians, road traffic accidents, urban public

Procedia PDF Downloads 285
5873 Sensor and Sensor System Design, Selection and Data Fusion Using Non-Deterministic Multi-Attribute Tradespace Exploration

Authors: Matthew Yeager, Christopher Willy, John Bischoff

Abstract:

The conceptualization and design phases of a system lifecycle consume a significant amount of the lifecycle budget in the form of direct tasking and capital, as well as the implicit costs associated with unforeseeable design errors that are only realized during downstream phases. Ad hoc or iterative approaches to generating system requirements oftentimes fail to consider the full array of feasible systems or product designs for a variety of reasons, including, but not limited to: initial conceptualization that oftentimes incorporates a priori or legacy features; the inability to capture, communicate and accommodate stakeholder preferences; inadequate technical designs and/or feasibility studies; and locally-, but not globally-, optimized subsystems and components. These design pitfalls can beget unanticipated developmental or system alterations with added costs, risks and support activities, heightening the risk for suboptimal system performance, premature obsolescence or forgone development. Supported by rapid advances in learning algorithms and hardware technology, sensors and sensor systems have become commonplace in both commercial and industrial products. The evolving array of hardware components (i.e. sensors, CPUs, modular / auxiliary access, etc.) as well as recognition, data fusion and communication protocols have all become increasingly complex and critical for design engineers during both conceptualization and implementation. This work seeks to develop and utilize a non-deterministic approach for sensor system design within the multi-attribute tradespace exploration (MATE) paradigm, a technique that incorporates decision theory into model-based techniques in order to explore complex design environments and discover better system designs. Developed to address the inherent design constraints in complex aerospace systems, MATE techniques enable project engineers to examine all viable system designs, assess attribute utility and system performance, and better align with stakeholder requirements. Whereas such previous work has been focused on aerospace systems and conducted in a deterministic fashion, this study addresses a wider array of system design elements by incorporating both traditional tradespace elements (e.g. hardware components) as well as popular multi-sensor data fusion models and techniques. Furthermore, adding statistical performance features to this model-based MATE approach will enable non-deterministic techniques for various commercial systems that range in application, complexity and system behavior, demonstrating a significant utility within the realm of formal systems decision-making.
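
To make the tradespace exploration idea above concrete, the sketch below enumerates a tiny hypothetical sensor-system design space, scores each combination with a simple weighted multi-attribute utility, and keeps the non-dominated (Pareto) designs. All design variables, utilities, costs and weights are invented; the authors' non-deterministic formulation is not reproduced here.

```python
import itertools

# Hypothetical sensor-system design variables (illustrative only)
# Each option = (name, performance utility in [0, 1], cost in arbitrary units)
design_space = {
    "sensor": [("lidar", 0.9, 12.0), ("radar", 0.7, 6.0), ("camera", 0.5, 2.0)],
    "cpu":    [("edge", 0.6, 1.5), ("server", 0.9, 5.0)],
    "fusion": [("kalman", 0.7, 0.5), ("deep", 0.9, 2.0)],
}
weights = {"sensor": 0.5, "cpu": 0.2, "fusion": 0.3}   # stakeholder preferences
budget = 12.0

tradespace = []
for combo in itertools.product(*design_space.values()):
    names = [opt[0] for opt in combo]
    utility = sum(w * opt[1] for w, opt in zip(weights.values(), combo))
    cost = sum(opt[2] for opt in combo)
    if cost <= budget:
        tradespace.append((names, utility, cost))

# Non-dominated (Pareto) designs: no other design is both cheaper and more useful
pareto = [d for d in tradespace
          if not any(o[1] >= d[1] and o[2] <= d[2] and o != d for o in tradespace)]
for names, u, c in sorted(pareto, key=lambda d: -d[1]):
    print(names, f"utility={u:.2f}", f"cost={c:.1f}")
```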

Keywords: multi-attribute tradespace exploration, data fusion, sensors, systems engineering, system design

Procedia PDF Downloads 183
5872 Effects of Non-Motorized Vehicles on a Selected Intersection in Dhaka City for Non Lane Based Heterogeneous Traffic Using VISSIM 5.3

Authors: A. C. Dey, H. M. Ahsan

Abstract:

Heterogeneous traffic composed of both motorized and non-motorized vehicles is a common feature of urban Bangladeshi roads. Popular non-motorized vehicles (NMVs) include rickshaws, rickshaw vans, and bicycles. These modes play an important role in moving people and goods in the absence of a dependable mass transport system, and rickshaws in particular meet much of the demand for door-to-door public transport among city dwellers. However, there is no separate lane for non-motorized vehicles in this city: NMVs generally occupy the outermost or curb-side lanes, but at intersections they mix with the motorized vehicles. That is why conventional models fail to analyze the situation completely. The microscopic traffic simulation software VISSIM 5.3 is itself lane-based, but its default behavioral parameters (such as driving behavior, lateral distances, overtaking tendency, CC0 = 0.4 m, CC1 = 1.5 s) were modified to calibrate a model for analyzing the effects of non-motorized traffic at an intersection (Mirpur-10) under non-lane-based mixed traffic conditions. Field data show that NMVs account for an average of 20% of the total number of vehicles on almost all the link roads. Due to the large share of non-motorized vehicles, capacity drops significantly. Analysis of the raw simulation data shows significant variation: the average vehicular speed is reduced by 25% and the number of vehicles decreases by 30% solely because of the presence of NMVs, while lateral occupancy and queue delay time increase by 2.37% and 33.75%, respectively. These results clearly show the negative effects of non-motorized vehicles on capacity at an intersection. Therefore, special management techniques or the restriction of NMVs at major intersections may be an effective solution to improve the existing critical condition.

Keywords: lateral occupancy, non-lane-based intersection, NMV, queue delay time, VISSIM 5.3

Procedia PDF Downloads 155
5871 New Variational Approach for Contrast Enhancement of Color Image

Authors: Wanhyun Cho, Seongchae Seo, Soonja Kang

Abstract:

In this work, we propose a variational technique for image contrast enhancement that utilizes global and local information around each pixel. The energy functional is defined as a weighted linear combination of three terms: a local contrast term, a global contrast term and a dispersion term. The first is a local contrast term that improves the contrast of an input image by increasing the grey-level differences between each pixel and its neighbors, thereby utilizing the contextual information around each pixel. The second is a global contrast term, which enhances image contrast by minimizing the difference between the image's empirical distribution function and a target cumulative distribution function, so that the probability distribution of pixel values becomes symmetric about the median. The third is a dispersion term that controls the departure of the new pixel values from the pixel values of the original image, preserving the original image characteristics as far as possible. Second, we derive the Euler-Lagrange equation for the true image that achieves the minimum of the proposed functional, using the fundamental lemma of the calculus of variations. We then consider a procedure in which this equation is solved by a gradient descent method, one of the dynamic approximation techniques. Finally, various experiments demonstrate that the proposed method can enhance the contrast of colour images better than existing techniques.
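
Purely as a toy numerical illustration of the gradient-descent solution procedure, the sketch below iterates a descent step built from three simplified stand-in terms (local contrast, global contrast toward an equalized target, and dispersion/fidelity). It is not the authors' functional or their Euler-Lagrange derivation, and the weights, step size and grayscale test image are invented.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Toy grayscale image in [0, 1]; a real application would load a colour image
rng = np.random.default_rng(0)
original = np.clip(rng.normal(0.5, 0.1, (64, 64)), 0, 1)

# A rough global "target": a histogram-equalized version of the original
ranks = original.ravel().argsort().argsort().reshape(original.shape)
equalized = ranks / (original.size - 1)

u = original.copy()
alpha, beta, gamma = 1.0, 0.5, 0.5   # weights of local, global, dispersion terms
step = 0.1

for _ in range(100):
    local_mean = uniform_filter(u, size=5)
    # Approximate gradients of the three (quadratic) stand-in terms w.r.t. u:
    g_local = -(u - local_mean)   # push pixels away from their local mean
    g_global = (u - equalized)    # pull toward the equalized distribution
    g_disp = (u - original)       # stay close to the original image
    u = np.clip(u - step * (alpha * g_local + beta * g_global + gamma * g_disp), 0, 1)

print("original std:", original.std(), "-> enhanced std:", u.std())
```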

Keywords: color image, contrast enhancement technique, variational approach, Euler-Lagrange equation, dynamic approximation method, EME measure

Procedia PDF Downloads 447
5870 Contextual Toxicity Detection with Data Augmentation

Authors: Julia Ive, Lucia Specia

Abstract:

Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term for a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These studies analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques and, counterintuitively, reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better datasets. In our study, we start by further analysing human perception of toxicity in conversational data (i.e., tweets), in the absence versus the presence of context, here the previous tweets in the same conversational thread. We observe that the conclusions of previous work on human perception are mainly due to data issues: the available contextual data do not provide sufficient evidence that context is indeed important (even for humans). This data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity containing swear words, racist slurs, etc.), so that context is not needed for a decision, or are ambiguous, vague, or unclear even in the presence of context; in addition, the data contain labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious (i.e., covert cases) without context, or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). Regarding the contextual detection models, we posit that their poor performance is due to limitations of both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on this, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our models against previous ones on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual or contextual but agnostic of the conversation structure.
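
The abstract does not describe the architecture in detail; the sketch below is one plausible, self-contained way to "take the hierarchy of conversational utterances into account": a shared utterance encoder produces one vector per tweet, a conversation-level GRU runs over those vectors in thread order, and the final state classifies the target tweet. All module sizes and names are illustrative, not the authors' model.

```python
# Minimal sketch of a hierarchical classifier for contextual toxicity detection
# (illustrative architecture; not the model described in the paper).
import torch
import torch.nn as nn

class HierarchicalToxicityClassifier(nn.Module):
    def __init__(self, vocab_size=30000, emb_dim=128, hid_dim=256, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # word-level encoder: one vector per utterance
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # utterance-level encoder: runs over [context_1, ..., context_k, target]
        self.utt_rnn = nn.GRU(hid_dim, hid_dim, batch_first=True)
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, thread_tokens):
        # thread_tokens: (batch, n_utterances, max_len) token ids, target tweet last
        b, n, t = thread_tokens.shape
        words = self.embed(thread_tokens.view(b * n, t))        # (b*n, t, emb)
        _, utt_vec = self.word_rnn(words)                       # (1, b*n, hid)
        utt_vec = utt_vec.squeeze(0).view(b, n, -1)             # (b, n, hid)
        _, thread_vec = self.utt_rnn(utt_vec)                   # (1, b, hid)
        return self.classifier(thread_vec.squeeze(0))           # (b, n_classes)

# Example: a batch of 4 threads, each with 3 context tweets + 1 target, 32 tokens each
logits = HierarchicalToxicityClassifier()(torch.randint(1, 30000, (4, 4, 32)))
```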

Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing

Procedia PDF Downloads 169
5869 Vertebrate Model to Examine the Biological Effectiveness of Different Radiation Qualities

Authors: Rita Emília Szabó, Róbert Polanek, Tünde Tőkés, Zoltán Szabó, Szabolcs Czifrus, Katalin Hideghéty

Abstract:

Purpose: Several features of zebrafish make them amenable to investigations of therapeutic approaches such as ionizing radiation. Our investigation focuses on establishing a zebrafish model for comprehensive radiobiological research by comparing the radiation effect curves of neutron and photon irradiation. Our final aim is to develop an appropriate vertebrate model for investigating the relative biological effectiveness of laser-driven ionizing radiation. Methods and Materials: After careful dosimetry, series of viable zebrafish embryos were exposed at 24 hours post-fertilization (hpf) to single-fraction whole-body neutron irradiation (1.25, 1.875, 2, and 2.5 Gy) at the research reactor of the Technical University of Budapest and to a conventional 6 MeV photon beam. Survival and morphologic abnormalities (pericardial edema, spine curvature) of each embryo were assessed in each experiment at 24-hour intervals from the point of fertilization up to 168 hpf, defining the dose lethal for 50% of embryos (LD50). Results: In the zebrafish embryo model, an LD50 of 20 Gy was defined for the photon beam, and the same lethality was found at a 2 Gy dose from the reactor neutron beam, resulting in an RBE of 10. Dose-dependent organ perturbations were detected at the macroscopic level (shortening of body length, spine curvature, microcephaly, micro-ophthalmia, micrognathia, pericardial edema, and inhibition of yolk sac resorption) and at the microscopic level (marked cellular changes in the skin, cardiac, and gastrointestinal systems) with the same magnitude of dose difference. Conclusion: We found that the zebrafish embryo model can be used to investigate the effects of different types of ionizing radiation and that this system is a highly efficient vertebrate model for preclinical examinations.
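
The reported relative biological effectiveness follows directly from the two iso-effect doses quoted in the abstract (taking the 20 Gy LD50 as the photon reference, as the RBE of 10 implies); written out:

```latex
% Iso-effect dose ratio from the abstract: 20 Gy photon LD50 vs. 2 Gy neutron LD50
\mathrm{RBE} = \frac{D_{\mathrm{photon}}(\mathrm{LD}_{50})}{D_{\mathrm{neutron}}(\mathrm{LD}_{50})}
             = \frac{20\ \mathrm{Gy}}{2\ \mathrm{Gy}} = 10
```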

Keywords: ionizing radiation, LD50, relative biological effectiveness, zebrafish embryo

Procedia PDF Downloads 307
5868 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques

Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian

Abstract:

Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses, and material damage. Traditional studies of road traffic accidents in urban zones represent a very large investment of time and money, and their results are often not up to date. Nowadays, however, in many countries, crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studying road traffic accidents and the urban congestion they cause. In this article, we identify the zones, roads, and specific times in Mexico City (CDMX) in which the largest numbers of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network Waze. The methodology employed was Knowledge Discovery in Databases (KDD) for discovering patterns in the accident reports, using data mining techniques with the help of Weka. The selected algorithms were Expectation Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the QGIS Geographic Information System.
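
The study runs EM and k-means in Weka; a rough Python analogue of the same two-step idea — using EM (scikit-learn's GaussianMixture with a BIC criterion) to pick a cluster count, then grouping accident coordinates with k-means — might look like the sketch below. The file name and column names are hypothetical.

```python
# Sketch: choose the number of clusters with EM/BIC, then group accident
# locations with k-means (scikit-learn stand-in for the Weka workflow).
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

reports = pd.read_csv("waze_accidents_2016.csv")        # hypothetical export
X = reports[["latitude", "longitude"]].to_numpy()       # hypothetical columns

# EM: fit Gaussian mixtures of increasing size and keep the best (lowest) BIC
best_k, best_bic = None, float("inf")
for k in range(2, 15):
    bic = GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
    if bic < best_bic:
        best_k, best_bic = k, bic

# k-means grouping with the selected number of clusters
reports["cluster"] = KMeans(n_clusters=best_k, random_state=0, n_init=10).fit_predict(X)
# reports[["latitude", "longitude", "cluster"]] can then be loaded into QGIS as point data
```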

Keywords: data mining, k-means, road traffic accidents, Waze, Weka

Procedia PDF Downloads 415
5867 A Review: Detection and Classification Defects on Banana and Apples by Computer Vision

Authors: Zahow Muoftah

Abstract:

Traditional manual visual grading of fruits has been one of the agricultural industry’s major challenges due to its laborious nature and the inconsistency of the inspection and classification process. The main requirements for computer vision and visual processing are effective techniques for identifying defects and estimating defect areas. Automated defect detection using computer vision and machine learning has emerged as a promising area of research with a high and direct impact on the visual inspection domain. Grading, sorting, and disease detection are important factors in determining the quality of fruits after harvest, and many studies have used computer vision to evaluate fruit quality post-harvest. Many studies have also been conducted to identify diseases and pests that affect the fruits of agricultural crops; however, most previous work concentrated solely on the diagnosis of a single lesion or disease. This article provides a comprehensive review of computer vision approaches for detecting and classifying defects, pests, and diseases of apple and banana fruits, drawing together research from these domains. Finally, various pattern recognition techniques for detecting apple and banana defects are discussed.
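
The review itself contains no code, but the kind of defect-area estimation it discusses can be illustrated with a short OpenCV sketch: separate the fruit from the background, threshold dark lesion pixels inside the fruit region, and report the defect-area ratio. The image path and all colour thresholds are illustrative assumptions, not values taken from any of the surveyed studies.

```python
# Sketch: rough defect-area estimation for a banana image (illustrative thresholds).
import cv2
import numpy as np

img = cv2.imread("banana.jpg")                         # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Fruit mask: everything that is not the (assumed bright, low-saturation) background
background = cv2.inRange(hsv, (0, 0, 200), (180, 40, 255))
fruit = cv2.bitwise_not(background)

# Defect mask: dark, low-brightness pixels inside the fruit region
dark = cv2.inRange(hsv, (0, 0, 0), (180, 255, 80))
defects = cv2.bitwise_and(dark, fruit)

fruit_area = int(np.count_nonzero(fruit))
defect_area = int(np.count_nonzero(defects))
ratio = defect_area / fruit_area if fruit_area else 0.0
print(f"defect area ratio: {ratio:.2%}")               # could feed a grading rule
```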

Keywords: computer vision, banana, apple, detection, classification

Procedia PDF Downloads 104