Search results for: classification accuracies
464 An Analysis of Different Essential Components of Flight Plan Operations at Low Altitude
Authors: Apisit Nawapanpong, Natthapat Boonjerm
Abstract:
This project aims to analyze and identify flight plans for low-altitude aviation in Thailand and other countries. The development of UAV technology has driven innovation and revolution in the aviation industry, including new modes of passenger and freight transportation, and has also affected other industries widely. At present, this technology is being developed rapidly and tested all over the world to make it as efficient as possible, and it is likely to grow more extensively. However, no flight plan for low-altitude operation has been published by government organizations; compared with high-altitude aviation with manned aircraft, various unique factors differ, whether mission, operation, altitude range or airspace restrictions. Because the study of essential components of low-altitude operation measures faces major problems in becoming practical and tangible, the main consideration of this project is to analyze the components of low-altitude operations conducted up to altitudes of 400 ft (120 meters) above ground level referring to the terrain, for example, air traffic management, classification of aircraft, basic necessity and safety, and control area. This research focuses on confirming the theory through qualitative and quantitative research combined with theoretical modeling and a regulatory framework, gaining insights from various positions in the aviation industry, including aviation experts, government officials, air traffic controllers, pilots, and airline operators, to identify the critical essential components of low-altitude flight operation.
The project's analysis uses computer programs for scientific and statistical research to prove that the results are equivalent to the theory, to benefit the regulation of flight plans for low-altitude operation based on the essential components identified here, and to support future studies and research in the aviation industry.
Keywords: low-altitude aviation, UAV technology, flight plan, air traffic management, safety measures
Procedia PDF Downloads 68
463 Artificial Intelligence Based Online Monitoring System for Cardiac Patient
Authors: Syed Qasim Gilani, Muhammad Umair, Muhammad Noman, Syed Bilawal Shah, Aqib Abbasi, Muhammad Waheed
Abstract:
Cardiovascular diseases (CVDs) are the major cause of death in the world. The main reason for these deaths is the unavailability of first aid for heart failure; in many cases, patients die before reaching the hospital. In this paper, we present an innovative online health service for cardiac patients. The proposed online health system has two ends. Through a device developed by us, users can communicate with their doctor through a mobile application. This interface provides them with first aid, and by using this service they have an easy interface with their doctors for obtaining medical advice. For the proposed system, we developed a device called Cardiac Care, a portable device which patients can use at home for monitoring their heart condition. When a patient checks his/her heart condition, the electrocardiogram (ECG), blood pressure (BP), and temperature are sent to the central database. The severity of the patient's condition is checked at the database using an artificial intelligence algorithm. If the patient is suffering from a minor problem, the algorithm suggests a prescription; but if the patient's condition is severe, the patient's record is sent to a doctor through the mobile Android application. The doctor, after reviewing the patient's condition, suggests the next step. If the doctor identifies the patient's condition as critical, a message is sent to the central database to dispatch an ambulance, which starts moving towards the patient to bring him/her to hospital. We have implemented this model at the prototype level. This model will be life-saving for millions of people around the globe, as patients will be in contact with their doctors at all times.
Keywords: cardiovascular disease, classification, electrocardiogram, blood pressure
Procedia PDF Downloads 184
462 Facilitating Written Biology Assessment in Large-Enrollment Courses Using Machine Learning
Authors: Luanna B. Prevost, Kelli Carter, Margaurete Romero, Kirsti Martinez
Abstract:
Writing is an essential scientific practice, yet in several countries, increasing university science class sizes limit the use of written assessments. Written assessments allow students to demonstrate their learning in their own words and permit faculty to evaluate students' understanding. However, the time and resources required to grade written assessments prohibit their use in large-enrollment science courses. This study examined the use of machine learning algorithms to automatically analyze student writing and provide timely feedback to faculty about students' writing in biology. Written responses to questions about matter and energy transformation were collected from large-enrollment undergraduate introductory biology classrooms. Responses were analyzed using the LightSide text mining and classification software. Cohen's kappa was used to measure agreement between the LightSide models and human raters. Predictive models achieved agreement with human coding of 0.7 Cohen's kappa or greater. The models captured that, when writing about matter-energy transformation at the ecosystem level, students focused primarily on the concepts of heat loss, recycling of matter, and conservation of matter and energy. Models were also produced to capture writing about processes such as decomposition and biochemical cycling. The models created in this study can be used to provide automatic feedback about students' understanding of these concepts to biology faculty who wish to use formative written assessments in large-enrollment biology classes but do not have the time or personnel for manual grading.
Keywords: machine learning, written assessment, biology education, text mining
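Cohen's kappa, the agreement statistic used above, corrects raw rater agreement for the agreement expected by chance. A minimal sketch of the calculation; the label names and example codings below are illustrative, not data from the study:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa between two raters assigning categorical labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items labeled identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: expected overlap given each rater's label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical human codes vs. model codes for six student responses.
human = ["heat", "heat", "matter", "conservation", "heat", "matter"]
model = ["heat", "matter", "matter", "conservation", "heat", "matter"]
print(round(cohen_kappa(human, model), 3))  # ≈ 0.739
```

A kappa of 0.7 or greater, the threshold reported above, is conventionally read as substantial agreement.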
Procedia PDF Downloads 281
461 Close-Range Remote Sensing Techniques for Analyzing Rock Discontinuity Properties
Authors: Sina Fatolahzadeh, Sergio A. Sepúlveda
Abstract:
This paper presents advanced developments in close-range, terrestrial remote sensing techniques to enhance the characterization of rock masses. The study integrates two state-of-the-art laser-scanning technologies, the HandySCAN and GeoSLAM laser scanners, to extract high-resolution geospatial data for rock mass analysis. These instruments offer high accuracy, precision, low acquisition time, and high efficiency in capturing intricate geological features in small to medium-sized outcrops and slope cuts. Using the HandySCAN and GeoSLAM laser scanners facilitates real-time, three-dimensional mapping of rock surfaces, enabling comprehensive assessments of rock mass characteristics. The collected data provide valuable insights into structural complexities, surface roughness, and discontinuity patterns, which are essential for geological and geotechnical analyses. The synergy of these advanced remote sensing technologies contributes to a more precise and straightforward understanding of rock mass behavior. In this case, the main parameters of RQD, joint spacing, persistence, aperture, roughness, infill, weathering, water condition, and joint orientation in a slope cut along the Sea-to-Sky Highway, BC, were remotely analyzed to calculate and evaluate the Rock Mass Rating (RMR) and Geological Strength Index (GSI) classification systems. Automatic and manual analyses of the acquired data were then compared with field measurements. The results show the usefulness of the proposed remote sensing methods and their close conformity with the actual field data.
Keywords: remote sensing, rock mechanics, rock engineering, slope stability, discontinuity properties
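For orientation, RMR is an additive rating and GSI is often estimated from it. A hedged sketch: the structure follows Bieniawski's RMR89 scheme (sum of parameter ratings plus a negative joint-orientation adjustment) and Hoek's commonly cited GSI ≈ RMR89 − 5 relation for RMR89 > 23; the individual ratings in the example are invented, not values from this study:

```python
def rock_mass_rating(ucs, rqd, spacing, condition, groundwater, orientation_adj=0):
    """RMR89: plain sum of the five parameter ratings plus a (negative)
    adjustment for unfavourable joint orientation."""
    return ucs + rqd + spacing + condition + groundwater + orientation_adj

def gsi_from_rmr(rmr89):
    """Hoek's commonly cited correlation, valid for RMR89 > 23."""
    if rmr89 <= 23:
        raise ValueError("correlation not applicable for RMR89 <= 23")
    return rmr89 - 5

def rmr_class(rmr):
    """Bieniawski's five rock-mass classes."""
    if rmr > 80:
        return "I (very good)"
    if rmr > 60:
        return "II (good)"
    if rmr > 40:
        return "III (fair)"
    if rmr > 20:
        return "IV (poor)"
    return "V (very poor)"

# Invented example ratings for a slope cut.
rmr = rock_mass_rating(ucs=7, rqd=13, spacing=10, condition=20,
                       groundwater=10, orientation_adj=-5)  # 55, class III
```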
Procedia PDF Downloads 66
460 Electromagnetic Fields Characterization of an Urban Area in Lagos De Moreno Mexico and Its Correlation with Public Health Hazards
Authors: Marco Vinicio Félix Lerma, Efrain Rubio Rosas, Fernando Ricardez Rueda, Victor Manuel Castaño Meneses
Abstract:
This paper reports a spectral analysis of the exposure levels of radiofrequency electromagnetic fields originating from a wide variety of telecommunications sources present in an urban area of Lagos de Moreno, Jalisco, Mexico. The electromagnetic characterization of the urban zone under study was carried out by measurements at 118 sites. Measurements of TETRA, ISM434, LTE800, ISM868, GSM900, GSM1800, 3G UMTS, 4G UMTS, WLAN 2.4, LTE2.6, DECT, VHF television and FM radio signals were performed at distances ranging over 10 to 1000 m from 87 broadcasting towers concentrated in an urban area of about 3 hectares. The aim of these measurements is the evaluation of the electromagnetic field power levels generated by communication systems because of their interaction with the human body. We found that in certain regions the general-public exposure limits determined by the ICNIRP (International Commission on Non-Ionizing Radiation Protection) are exceeded by 5% up to 61% of the upper values, indicating an imminent public health hazard, whereas in other regions these limits are not exceeded. This work proposes an electromagnetic pollution classification for urban zones in accordance with ICNIRP standards. We conclude that the urban zone under study presents diverse levels of pollution and that in certain regions an electromagnetic shielding solution is needed in order to safeguard the health of the population living there. A practical solution in the form of paint coatings and fiber curtains for the buildings present in this zone is also proposed.
Keywords: electromagnetic field, telecommunication systems, electropollution, health hazards
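To illustrate how a measured power density is compared against a band-dependent limit, here is a hedged sketch assuming the ICNIRP 1998 general-public power-density reference levels (S = f/200 W/m² for 400–2000 MHz; 10 W/m² from 2 to 300 GHz); the measured reading in the example is invented:

```python
def icnirp_public_limit_wm2(freq_mhz):
    """General-public power-density reference level, ICNIRP 1998 sketch."""
    if 400 <= freq_mhz < 2000:
        return freq_mhz / 200.0        # frequency-dependent band
    if 2000 <= freq_mhz <= 300000:
        return 10.0                    # flat 10 W/m^2 from 2 to 300 GHz
    raise ValueError("band not covered by this sketch")

def exceedance_percent(measured_wm2, freq_mhz):
    """How far a measurement exceeds the limit, as a percentage (0 if below)."""
    limit = icnirp_public_limit_wm2(freq_mhz)
    return max(0.0, (measured_wm2 - limit) / limit * 100.0)
```

At 900 MHz the reference level works out to 4.5 W/m², so a hypothetical reading of 7.245 W/m² would exceed it by 61%, the upper bound reported above.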
Procedia PDF Downloads 393
459 Hands-off Parking: Deep Learning Gesture-based System for Individuals with Mobility Needs
Authors: Javier Romera, Alberto Justo, Ignacio Fidalgo, Joshue Perez, Javier Araluce
Abstract:
Nowadays, individuals with mobility needs face a significant challenge when docking vehicles. In many cases, after parking, they encounter insufficient space to exit, leading to two undesired outcomes: either avoiding parking in that spot or settling for an improperly placed vehicle. To address this issue, the following paper presents a parking control system employing gestural teleoperation. The system comprises three main phases: capturing body markers, interpreting gestures, and transmitting orders to the vehicle. The initial phase is centered around the MediaPipe framework, a versatile tool optimized for real-time gesture recognition. MediaPipe excels at detecting and tracing body markers, with a special emphasis on hand gestures; hand detection is done by generating 21 reference points for each hand. Subsequently, after data capture, the project employs a multilayer perceptron (MLP) for in-depth gesture classification. This tandem of MediaPipe's extraction prowess and the MLP's analytical capability ensures that human gestures are translated into actionable commands with high precision. Furthermore, the system has been trained and validated on a built-in dataset. To prove domain adaptation, a framework based on the Robot Operating System (ROS) as a communication backbone, alongside the CARLA simulator, is used. Following successful simulations, the system was transitioned to a real-world platform, marking a significant milestone in the project. This real-vehicle implementation verifies the practicality and efficiency of the system beyond theoretical constructs.
Keywords: gesture detection, MediaPipe, multilayer perceptron, robot operating system
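The pipeline described above feeds MediaPipe's 21 hand landmarks into an MLP classifier. A minimal forward-pass sketch in pure Python; the layer sizes, random weights, and the number of gesture classes are placeholders standing in for the paper's trained network:

```python
import math
import random

def mlp_forward(landmarks_xy, w_hidden, b_hidden, w_out, b_out):
    """One forward pass of a single-hidden-layer perceptron.

    landmarks_xy: 21 (x, y) hand-landmark pairs, the per-hand output format
    MediaPipe produces, flattened here to 42 input features.
    """
    x = [v for pt in landmarks_xy for v in pt]               # flatten to 42 inputs
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(w, x)) + b)
              for w, b in zip(w_hidden, b_hidden)]
    logits = [sum(wi * hi for wi, hi in zip(w, hidden)) + b
              for w, b in zip(w_out, b_out)]
    m = max(logits)                                          # stable softmax
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]                             # class probabilities

# Toy demo: random weights standing in for a trained network.
random.seed(0)
N_IN, N_HID, N_CLASSES = 42, 8, 4
w_hidden = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
b_hidden = [0.0] * N_HID
w_out = [[random.uniform(-1, 1) for _ in range(N_HID)] for _ in range(N_CLASSES)]
b_out = [0.0] * N_CLASSES
landmarks = [(random.random(), random.random()) for _ in range(21)]
probs = mlp_forward(landmarks, w_hidden, b_hidden, w_out, b_out)
```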
Procedia PDF Downloads 100
458 Modelling the Effect of Biomass Appropriation for Human Use on Global Biodiversity
Authors: Karina Reiter, Stefan Dullinger, Christoph Plutzar, Dietmar Moser
Abstract:
Due to population growth and changing patterns of production and consumption, the demand for natural resources and, as a result, the pressure on Earth's ecosystems are growing. Biodiversity mapping can be a useful tool for assessing species endangerment or detecting hotspots of extinction risk. This paper explores the benefits of using the change in trophic energy flows resulting from the human alteration of the biosphere in biodiversity mapping. To this end, multiple linear regression models were developed to explain species richness in areas with no human influence (i.e. wilderness) for three taxonomic groups (birds, mammals, amphibians). The models were then applied to predict (I) potential global species richness using potential natural vegetation (NPPpot) and (II) global 'actual' species richness after biomass appropriation using the NPP remaining in ecosystems after harvest (NPPeco). By calculating the difference between predicted potential and predicted actual species numbers, maps of estimated species richness loss were generated. Results show that biomass appropriation for human use can indeed be linked to biodiversity loss. Areas for which the models predicted high species loss coincide with areas where species endangerment and extinctions are recorded as particularly high by the International Union for Conservation of Nature and Natural Resources (IUCN). Furthermore, the analysis revealed that while the species distribution maps of the IUCN Red List of Threatened Species used for this research can determine hotspots of biodiversity loss in large parts of the world, the classification system for threatened and extinct species needs to be revised to better reflect local risks of extinction.
Keywords: biodiversity loss, biomass harvest, human appropriation of net primary production, species richness
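The core statistical step, regressing wilderness species richness on available energy and then predicting once with NPPpot and once with NPPeco, can be sketched with a one-predictor least-squares fit. The study used multiple predictors; the single-predictor form and all numbers below are purely illustrative:

```python
def fit_ols(x, y):
    """Closed-form simple linear regression; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

def predict(intercept, slope, x):
    return intercept + slope * x

# Illustrative wilderness calibration cells: richness rises with NPP.
npp = [200.0, 400.0, 600.0, 800.0]
richness = [20.0, 40.0, 60.0, 80.0]
intercept, slope = fit_ols(npp, richness)

# Estimated species loss in a cell where harvest lowers NPP from 700 to 300:
loss = predict(intercept, slope, 700.0) - predict(intercept, slope, 300.0)
```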
Procedia PDF Downloads 130
457 Analysing “The Direction of Artificial Intelligence Legislation from a Global Perspective” from the Perspective of “AIGC Copyright Protection” Content
Authors: Xiaochen Mu
Abstract:
Due to the diversity of stakeholders and the ambiguity of ownership boundaries, the current protection models for Artificial Intelligence Generated Content (AIGC) have many disadvantages. In response to this situation, there are three different protection models worldwide. The United States Copyright Office stipulates that works autonomously generated by artificial intelligence lack the element of human creation, and that non-human AI cannot create works. To protect and promote investment in the field of artificial intelligence, UK legislation, through Section 9(3) of the CDPA, designates the author of AI-generated works as 'the person by whom the arrangements necessary for the creation of the work are undertaken.' China neither simply excludes the work attributes of AI-generated content solely because a natural-person subject is lacking, nor does it hold generally that AIGC should or should not be protected. Instead, it considers the specific circumstances of each case and comprehensively evaluates the degree of originality of the AIGC and the contributions of natural persons to it. In China's first AI drawing case, the court determined that the image in question was the result of the plaintiff's design and selection through inputting prompt words and setting parameters, reflecting the plaintiff's intellectual investment and personalized expression, and should be recognized as a work in the sense of copyright law. Despite opposition, the ruling also established the feasibility of the AIGC copyright protection path. Recognition of the work attributes of AIGC will not lead to overprotection that hinders the overall development of the AI industry. Just as with the legislation and regulation of AI by various countries, there is a need for a balance between protection and development.
For example, the provisional agreement reached on the EU AI Act, based on a risk classification approach, seeks a dynamic balance between copyright protection and the development of the AI industry.
Keywords: generative artificial intelligence, originality, works, copyright
Procedia PDF Downloads 42
456 A Knowledge-Based Development of Risk Management Approaches for Construction Projects
Authors: Masoud Ghahvechi Pour
Abstract:
Risk management is a systematic and regular process of identifying, analyzing and responding to risks throughout the project's life cycle in order to achieve the optimal level of elimination, reduction or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management: unmanaged or untransferred risks can be a primary factor in the failure of a project. Effective risk management does not mean simply avoiding risk, even though that is apparently the cheapest option. The main problem with that option is economic: what is potentially profitable is by definition risky, while what poses no risk is rarely economically interesting and brings no tangible benefits. Therefore, for the implemented project, effective risk management means finding a 'middle ground': on the one hand, protection against risk through accurate identification and classification, which leads to a comprehensive analysis; on the other hand, management decisions, supported by all available mathematical and analytical tools, that are checked against their maximum benefits. A detailed analysis, taking into account all aspects of the company, including stakeholder analysis, allows effective risk management to add what will become tangible benefits for the project in the future. Identifying project risk is based on determining which types of risk may affect the project, and also refers to specific parameters and estimating the probability of their occurrence in the project.
These conditions can be divided into three groups: certainty, uncertainty, and risk, which in turn correspond to three attitudes toward investment, risk preference, risk neutrality and risk aversion, together with their measurement. The result of risk identification and project analysis is a list of events that indicates the cause and probability of each event, together with a final assessment of its impact on the environment.
Keywords: risk, management, knowledge, risk management
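The identification-and-analysis step described above is commonly operationalized as a probability × impact matrix. A hedged sketch of that convention; the 1–5 scales, class thresholds and the example risk register are illustrative choices, not taken from this paper:

```python
def risk_score(probability, impact):
    """Classic probability x impact product on 1-5 ordinal scales."""
    return probability * impact

def risk_class(score):
    """Illustrative thresholds for a 5x5 matrix."""
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

def rank_register(register):
    """register: list of (name, probability, impact); returns (name, score)
    pairs sorted so the most severe risks come first."""
    scored = [(name, risk_score(p, i)) for name, p, i in register]
    return sorted(scored, key=lambda r: r[1], reverse=True)

# Invented example register for a construction project.
ranked = rank_register([
    ("schedule delay", 4, 5),
    ("cost overrun", 2, 2),
    ("scope creep", 3, 3),
])
```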
Procedia PDF Downloads 66
455 Effect of Humor on Pain and Anxiety in Patients with Rheumatoid Arthritis: A Prospective, Randomized Controlled Study
Authors: Burcu Babadağ Savaş, Nihal Orlu, Güler Balcı Alparslan, Ertuğrul Çolak, Cengiz Korkmaz
Abstract:
Introduction/objectives: We aimed to investigate the effect of humor on pain and state anxiety in patients with rheumatoid arthritis (RA) receiving biologic intravenous (IV) infusion therapy. Method: The study sample consisted of 36 patients who met the classification criteria for RA and the inclusion criteria in a rheumatology outpatient clinic at a university hospital between September 2020 and November 2021. Two sample groups were formed: the intervention group (watching a comedy movie) (n=18) and the control group (n=18). In the intervention group, each patient watched a comedy movie of his/her choice, from an archive created by the researchers, during the biologic IV infusion therapy (approximately 90-120 minutes). The data collection instruments used before and after the intervention were a descriptive identification form, the visual analog scale (VAS), and the state anxiety scale. Results: The mean VAS scores of patients in the intervention group were 5.05 ± 2.01 in the pre-test and 2.61 ± 1.91 in the post-test. The mean state anxiety scores of patients in the intervention group were 45.94 ± 9.97 in the pre-test and 34.22 ± 6.57 in the post-test. Thus, patients who watched comedy movies during biologic IV infusion therapy in the infusion center had a greater reduction in pain scores than the control group, although the effect size was small. While there was a decrease in state anxiety scores in both groups, there was no significant difference between groups and the effect size was not relevant. Conclusions: During IV infusion therapy, watching comedy movies is recommended as a nursing care intervention for reducing pain in patients with RA, in cooperation with other health professionals.
Keywords: watching comedy movie, humor, pain, anxiety, nursing, care
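The between-group effect size referred to above is conventionally quantified with Cohen's d on the change scores. A sketch of the computation; the input scores are invented, not the trial's data:

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d between two independent groups, pooled-SD denominator."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    # Sample variances (n - 1 denominator), then the pooled standard deviation.
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd
```

By the usual convention, |d| near 0.2 is read as small, 0.5 medium, and 0.8 large.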
Procedia PDF Downloads 139
454 Functional Feeding Groups and Trophic Levels of Benthic Macroinvertebrates Assemblages in Albertine Rift Rivers and Streams in South Western Uganda
Authors: Peace Liz Sasha Musonge
Abstract:
Behavioral aspects of species nutrition, such as feeding methods and food type, are archetypal biological traits signifying how species have adapted to their environment. This concept of functional feeding group (FFG) analysis is currently used to ascertain the trophic levels of the aquatic food web in a specific microhabitat. However, in Eastern Africa, information about the FFG classification of benthic macroinvertebrates in highland rivers and streams is almost absent, and existing studies have fragmented datasets. For this reason, we carried out a robust study to determine the food type, trophic level and FFGs of 56 macroinvertebrate taxa (identified to family level) from Albertine rift valley streams. Our findings showed that all five major functional feeding groups were represented: gatherer-collectors (GC), predators (PR), shredders (SH), scrapers (SC), and filterer-collectors. The most dominant functional feeding group was the gatherer-collectors, which accounted for 53.5% of the total population. The most abundant GC families were Baetidae (7813 individuals), Chironomidae NTP (5628) and Caenidae (1848). The majority of the macroinvertebrate population feeds on fine particulate organic matter (FPOM) from the stream bottom. In terms of taxa richness, the predators had the highest value of 24 taxa, and the filterer-collectors group had the fewest taxa (3). The families with the highest numbers of predators were Corixidae (1024 individuals), Coenagrionidae (445) and Libellulidae (283). However, predators accounted for only 7.4% of the population. The findings highlight the functional feeding groups and habitat type of macroinvertebrate communities along an altitudinal gradient.
Keywords: trophic levels, functional feeding groups, macroinvertebrates, Albertine rift
Procedia PDF Downloads 235
453 Advances of Image Processing in Precision Agriculture: Using Deep Learning Convolution Neural Network for Soil Nutrient Classification
Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine
Abstract:
Agriculture is essential to the continuous existence of human life, as humans depend directly on it for the production of food. The exponential rise in population calls for a rapid increase in food production, with the application of technology to reduce laborious work and maximize production. Technology can aid and improve agriculture in several ways, through pre-planning and post-harvest, by the use of computer vision technology through image processing to determine the soil nutrient composition and the right amount, right time, and right place of application of farm input resources like fertilizers, herbicides and water, as well as weed detection and early detection of pests and diseases. This is precision agriculture, which is thought to be the solution required to achieve our goals. There has been significant improvement in the areas of image processing and data processing, which had been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of images from vegetation need to be extracted, classified, segmented and finally fed into the model. Different techniques have been applied to these processes, from the use of neural networks, support vector machines and fuzzy logic approaches to, most recently, the most effective approach generating excellent results: the deep learning approach of convolutional neural networks for image classification. A deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. The experimental results on the developed model yielded an average accuracy of 99.58%.
Keywords: convolution, feature extraction, image analysis, validation, precision agriculture
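The convolution-ReLU-pooling feature extraction at the heart of the CNN described above can be sketched in a few lines of pure Python. This is a toy single-channel example with one hand-written kernel; real soil-nutrient models stack many learned kernels and end in fully connected classification layers:

```python
def conv2d(image, kernel):
    """Valid-mode 2-D convolution (cross-correlation, as in CNN layers)."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(ow)] for i in range(oh)]

def relu(fmap):
    """Element-wise rectified linear activation."""
    return [[max(0.0, v) for v in row] for row in fmap]

def max_pool2(fmap):
    """2x2 max pooling with stride 2 (drops a trailing odd row/column)."""
    return [[max(fmap[i][j], fmap[i][j + 1], fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

# Toy 4x4 "image" and 2x2 averaging-style kernel.
image = [[1.0] * 4 for _ in range(4)]
kernel = [[1.0, 1.0], [1.0, 1.0]]
features = max_pool2(relu(conv2d(image, kernel)))
```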
Procedia PDF Downloads 315
452 Gender Estimation by Means of Quantitative Measurements of Foramen Magnum: An Analysis of CT Head Images
Authors: Thilini Hathurusinghe, Uthpalie Siriwardhana, W. M. Ediri Arachchi, Ranga Thudugala, Indeewari Herath, Gayani Senanayake
Abstract:
The foramen magnum is more likely to be preserved than other skeletal remains during high-impact and severely disruptive injuries. Therefore, it is worthwhile to explore whether its measurements can be used to determine human gender, which is vital in forensic and anthropological studies. The idea was to find out the ability to use quantitative measurements of the foramen magnum as an anatomical indicator for human gender estimation and to evaluate the gender-dependent variations of the foramen magnum using quantitative measurements. A randomly selected 113 subjects who underwent CT head scans at Sri Jayawardhanapura General Hospital of Sri Lanka within a period of six months were included in the study. The sample contained 58 males (48.76 ± 14.7 years old) and 55 females (47.04 ± 15.9 years old). The maximum length of the foramen magnum (LFM), maximum width of the foramen magnum (WFM), minimum distance between the occipital condyles (MnD) and maximum interior distance between the occipital condyles (MxID) were measured; further, AreaT and AreaR were also calculated. Gender was estimated using binomial logistic regression. The mean values of all explanatory variables (LFM, WFM, MnD, MxID, AreaT, and AreaR) were greater among males than females. All explanatory variables except MnD (p=0.669) were statistically significant (p < 0.05). Significant bivariate correlations were demonstrated by AreaT and AreaR with the explanatory variables. The results showed that WFM and MxID were the best measurements for predicting gender according to binomial logistic regression. The estimated model was: log(p/(1-p)) = 10.391 - 0.136×MxID - 0.231×WFM, where p is the probability of being female. The classification accuracy given by the above model was 65.5%. The quantitative measurements of the foramen magnum can be used as a reliable anatomical marker for human gender estimation in the Sri Lankan context.
Keywords: foramen magnum, forensic and anthropological studies, gender estimation, logistic regression
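The fitted equation above can be applied directly to new measurements. A sketch implementing the published model; the 0.5 decision threshold and the example measurements (taken here to be in millimeters) are assumptions for illustration, not from the paper:

```python
import math

def p_female(mxid, wfm):
    """Published model: log(p/(1-p)) = 10.391 - 0.136*MxID - 0.231*WFM,
    where p is the probability of being female."""
    logit = 10.391 - 0.136 * mxid - 0.231 * wfm
    return 1.0 / (1.0 + math.exp(-logit))

def predict_gender(mxid, wfm, threshold=0.5):
    """Classify using an assumed 0.5 cut on the predicted probability."""
    return "female" if p_female(mxid, wfm) >= threshold else "male"
```

As the negative coefficients imply, larger foramen magnum measurements push the prediction toward male.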
Procedia PDF Downloads 151
451 Assessment of Knowledge, Awareness about Hemorrhoids Causes and Stages among the General Public of Saudi Arabia
Authors: Asaiel Mubark Al Hadi
Abstract:
Background: Hemorrhoids, sometimes known as piles, are a frequent anorectal condition characterized by a weakening of the anal cushion and its supporting tissue as well as spasms of the internal sphincter. Hemorrhoids are most frequently identified by painless bright-red bleeding, prolapse of annoying grape-like tissue, itching, or a combination of symptoms; digital rectal examination (DRE) and anoscopy are used for diagnosis. Constipation, a low-fiber diet, a high body mass index (BMI), pregnancy, and reduced physical activity are among the factors typically thought to increase the risk of hemorrhoids. Goligher's scheme, the most commonly used hemorrhoid classification, comprises four degrees that grade the severity of the condition. The purpose of this study is to assess the level of knowledge and awareness of the causes and stages of hemorrhoids among the general public of Saudi Arabia. Method: This cross-sectional study was conducted in Saudi Arabia between October and December 2022. The target study group included at least 384 people aged above 18 years. The outcomes of this study were analyzed using the SPSS program with a pre-tested questionnaire. Results: The study included 1410 participants, 69.9% of them female and 30.1% male; 53.7% of participants were aged 20-30 years old. 17% of participants had hemorrhoids and 42% had a relative who had hemorrhoids. 42.8% of participants could identify stage 1 of hemorrhoids correctly, 44.7% identified stage 2 correctly, 46.7% identified stage 3 correctly and 58.1% identified stage 4 correctly. Only 28.9% of participants had a high level of knowledge about hemorrhoids, 62.7% had moderate knowledge and 8.4% had low knowledge. Conclusion: The Saudi general population has poor knowledge of hemorrhoids, their causes and their management approach.
There was a significant association between hemorrhoid knowledge scores and age, gender, residence area and employment.
Keywords: hemorrhoids, external hemorrhoid, internal hemorrhoid, anal fissure, hemorrhoid stages, prolapse, rectal bleeding
Procedia PDF Downloads 97
450 Stress and Rhythm in the Educated Nigerian Accent of English
Authors: Nkereke M. Essien
Abstract:
The intention of this paper is to examine stress in the Educated Nigerian Accent of English (ENAE), with the aim of analyzing the stress and rhythmic patterns of Nigerian English. Our aim is also to isolate differences and similarities in the stress patterns studied, to establish what forms the accent of these educated Nigerian English (ENE) speakers and marks them off from other groups or Englishes of the world, to ascertain and characterize it, and to provide documented evidence for its existence. Nigerian stress and rhythmic patterns are significantly different from British English stress and rhythmic patterns; consequently, educated Nigerian English features more stressed syllables than the native speakers' varieties. The excess of stressed syllables produces runs of contiguous stressed syllables in the rhythmic flow of ENE, and this brings about a 'jerky' rhythm which distorts communication. To ascertain this claim, ten (10) Nigerian speakers educated in the English language were selected by a stratified random sampling technique from two federal universities in Nigeria; these speakers belong to the educated class or standard variety. Their performance was compared to that of a Briton (the control). The metrical system of analysis was used. The respondents were made to read some words and utterances, which were recorded and analyzed perceptually, statistically and acoustically using a one-way analysis of variance (ANOVA), the Tukey-Kramer post hoc test, the Wilcoxon matched-pairs signed-ranks test, and the Praat analysis software. Our findings revealed that the educated Nigerian English speakers feature more stressed syllables in their productions, spending more time pronouncing stressed syllables and sometimes less time pronouncing the unstressed syllables; their overall tempo was faster.
The ENE speakers used tone to mark prominence, while the native speaker, typified by the control, used stress to mark prominence. We conclude that the stress pattern of the ENE speakers was significantly different from the native speaker's variety represented by the control's performance.
Keywords: accent, Nigerian English, rhythm, stress
Procedia PDF Downloads 240
449 Shift in the Rhizosphere Soil Fungal Community Associated with Root Rot Infection of Plukenetia Volubilis Linneo Caused by Fusarium and Rhizopus Species
Authors: Constantine Uwaremwe, Wenjie Bao, Bachir Goudia Daoura, Sandhya Mishra, Xianxian Zhang, Lingjie Shen, Shangwen Xia, Xiaodong Yang
Abstract:
Background: Plukenetia volubilis Linneo is an oleaginous plant belonging to the family Euphorbiaceae. Because its seeds contain a high content of edible oil and are rich in vitamins, P. volubilis is cultivated as an economic crop worldwide. However, the cultivation and growth of P. volubilis are challenged by phytopathogen invasion, leading to production loss. Methods: In the current study, we tested the pathogenicity of fungal pathogens isolated from root rot infected P. volubilis plant tissues by inoculating them into healthy P. volubilis seedlings. Metagenomic sequencing was used to assess the shift in the fungal community of P. volubilis rhizosphere soil after root rot infection. Results: Four Fusarium isolates and two Rhizopus isolates were found to be root rot causative agents of P. volubilis, as they induced typical root rot symptoms in healthy seedlings. The metagenomic sequencing data showed that root rot infection altered the rhizosphere fungal community. In root rot infected soil, the richness and diversity indices increased or decreased depending on the pathogen. The four most abundant phyla across all samples were Ascomycota, Glomeromycota, Basidiomycota, and Mortierellomycota. In infected soil, the relative abundance of each phylum increased or decreased depending on the pathogen and functional taxonomic classification. Conclusions: Based on our results, we concluded that Fusarium and Rhizopus species cause root rot infection of P. volubilis. In root rot infected P. volubilis, the shift in the rhizosphere fungal community was pathogen-dependent. These findings may serve as a key point for a future study on the biocontrol of root rot of P. volubilis.Keywords: fusarium spp., plukenetia volubilis l., rhizopus spp., rhizosphere fungal community, root rot
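The diversity indices mentioned above can be illustrated with the Shannon index computed from taxon abundance counts. The read counts below are invented for illustration and do not come from the study.

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxa with count > 0."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical read counts per fungal phylum (Ascomycota, Glomeromycota,
# Basidiomycota, Mortierellomycota) in healthy vs. infected soil
healthy = [500, 300, 150, 50]
infected = [800, 100, 80, 20]   # a community skewed toward one phylum

print(f"H' healthy:  {shannon_index(healthy):.3f}")
print(f"H' infected: {shannon_index(infected):.3f}")
```

A lower H' in the infected sample would indicate a community dominated by fewer taxa, one way a pathogen-dependent shift could manifest.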
Procedia PDF Downloads 41448 Characterization of Atmospheric Aerosols by Developing a Cascade Impactor
Authors: Sapan Bhatnagar
Abstract:
Micron-size particles emitted from different sources and produced by combustion have serious negative effects on human health and the environment. They can penetrate deep into our lungs through the respiratory system. Determination of the amount of particulates present per cubic meter of the atmosphere is necessary to monitor, regulate, and model atmospheric particulate levels. A cascade impactor is used to collect atmospheric particulates, and by gravimetric analysis their concentration in the atmosphere over different size ranges can be determined. Cascade impactors have been used for the classification of particles by aerodynamic size. They operate on the principle of inertial impaction. An impactor consists of a number of stages, each having an impaction plate and a nozzle. Collection plates are connected in series with smaller and smaller cutoff diameters. The air stream passes through the nozzles and over the plates. Particles in the stream having large enough inertia impact upon the plate, and smaller particles pass on to the next stage. By designing each successive stage with a higher air stream velocity in the nozzle, smaller-diameter particles are collected at each stage. Particles too small to be impacted on the last collection plate are collected on a backup filter. The impactor consists of four stages, each made of steel, with cut-off diameters of less than 10 microns. Each stage has a collection plate soaked with oil to prevent bounce, which allows the impactor to function at high mass concentrations. Even after the plate is coated with particles, an incoming particle will still meet a wet surface, which significantly reduces particle bounce. 
The particles that are too small to be impacted on the last collection plate are then collected on a backup filter (a microglass fiber filter); the fibers provide a larger surface area to which particles may adhere, and voids in the filter media aid in reducing particle re-entrainment.Keywords: aerodynamic diameter, cascade, environment, particulates, re-entrainment
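The cutoff diameter of each stage follows from the Stokes number criterion for inertial impaction. A minimal sketch, neglecting the Cunningham slip correction and assuming a typical Stk50 of about 0.24 for a round jet (both assumptions, not values from the paper), is:

```python
import math

def cutoff_diameter(nozzle_diameter_m, jet_velocity_ms,
                    particle_density=1000.0,   # kg/m^3, unit-density sphere
                    viscosity=1.81e-5,         # Pa*s, air at ~20 C
                    stk50=0.24):               # typical Stk50 for a round jet
    """d50 (um) from Stk50 = rho_p * d50^2 * U / (9 * mu * W), slip neglected."""
    d50 = math.sqrt(9 * viscosity * nozzle_diameter_m * stk50
                    / (particle_density * jet_velocity_ms))
    return d50 * 1e6  # micrometers

# Successive stages: smaller nozzles / faster jets -> smaller cutoff diameters
stages = [(4e-3, 5.0), (2e-3, 10.0), (1e-3, 20.0), (0.5e-3, 40.0)]
for W, U in stages:
    print(f"W = {W*1e3:.1f} mm, U = {U:4.1f} m/s -> d50 = {cutoff_diameter(W, U):.2f} um")
```

The hypothetical stage geometries show the design principle from the abstract: increasing jet velocity down the stack drives the cutoff diameter below 10 microns and smaller at each stage.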
Procedia PDF Downloads 320447 Quality Analysis of Vegetables Through Image Processing
Authors: Abdul Khalique Baloch, Ali Okatan
Abstract:
The quality analysis of food and vegetables from images is a hot topic nowadays, with researchers improving on previous findings through different techniques and methods. In this research we have reviewed the literature, identified gaps in it, suggested a better proposed approach, designed the algorithm, and developed software to measure quality from images, where the accuracy of the image analysis shows better results, which we compare with previous work done so far. The application uses an open-source dataset and the Python language with the TensorFlow Lite framework. In this research we focus on sorting food and vegetables from images: the application sorts and grades produce after processing the images, and it can produce fewer errors than human-based manual grading. Digital picture datasets were created, and the collected images were arranged by class. The classification accuracy of the system was about 94%. As fruits and vegetables play a main role in day-to-day life, their quality is essential in evaluating agricultural produce, and customers always want to buy good-quality fruits and vegetables. This document is about quality detection of fruits and vegetables using images. Many customers suffer due to unhealthy fruits and vegetables from suppliers, and no proper quality measurement level is followed by hotel managements. We have developed software to measure the quality of fruits and vegetables from images; it indicates whether the produce is fresh or rotten. Some approaches reviewed in this thesis include digital images, ResNet, VGG16, CNN, and transfer learning for grading and feature extraction. The application uses an open-source dataset of images and the Python language, and a system framework is designed.Keywords: deep learning, computer vision, image processing, rotten fruit detection, fruits quality criteria, vegetables quality criteria
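As a toy illustration of grading produce from pixel data (not the paper's model, which relies on CNN backbones such as ResNet or VGG16 with transfer learning), even a crude color-statistics rule separates obviously fresh from obviously rotten patches. The patches, threshold, and scoring rule below are all invented for illustration.

```python
import numpy as np

def freshness_score(rgb_patch):
    """Crude proxy: rotten produce tends toward dark, brownish pixels.
    Returns mean brightness of the patch in [0, 1]."""
    return float(rgb_patch.mean() / 255.0)

def grade(rgb_patch, threshold=0.45):
    """Binary grading rule on the brightness proxy."""
    return "fresh" if freshness_score(rgb_patch) >= threshold else "rotten"

rng = np.random.default_rng(0)
fresh_patch = rng.integers(140, 255, size=(32, 32, 3))   # bright pixels
rotten_patch = rng.integers(10, 90, size=(32, 32, 3))    # dark pixels

print(grade(fresh_patch))
print(grade(rotten_patch))
```

A learned model replaces this hand-set threshold with features extracted from labeled images, which is what lifts accuracy to the ~94% the abstract reports.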
Procedia PDF Downloads 70446 Role of Endotherapy vs Surgery in the Management of Traumatic Pancreatic Injury: A Tertiary Center Experience
Authors: Thinakar Mani Balusamy, Ratnakar S. Kini, Bharat Narasimhan, Venkateswaran A. R, Pugazhendi Thangavelu, Mohammed Ali, Prem Kumar K., Kani Sheikh M., Sibi Thooran Karmegam, Radhakrishnan N., Mohammed Noufal
Abstract:
Introduction: Pancreatic injury remains a complicated condition requiring an individualized, case-by-case approach to management. In this study, we aim to analyze the varied presentations and treatment outcomes of traumatic pancreatic injury in a tertiary care center. Methods: All consecutive patients hospitalized at our center with traumatic pancreatic injury between 2013 and 2017 were included. The American Association for Surgery of Trauma (AAST) classification was used to stratify patients into five grades of severity. Outcome parameters were then analyzed based on the treatment modality employed. Results: Of the 35 patients analyzed, 26 had an underlying blunt trauma, with the remaining nine presenting due to penetrating injury. Overall in-hospital mortality was 28%. Nineteen of these patients underwent exploratory laparotomy, with the remaining 16 managed nonoperatively. Nine patients had a severe injury (> grade 3), of which four underwent endotherapy: three had stents placed and one underwent an endoscopic pseudocyst drainage. Among those managed nonoperatively, three underwent a radiological drainage procedure. Conclusion: Mortality rates were clearly higher in patients managed operatively. This is likely a result of significantly higher degrees of major associated non-pancreatic injuries and not just a reflection of surgical morbidity. Despite this, surgical management remains the mainstay of therapy, especially in higher grades of pancreatic injury. However, we would like to emphasize that endoscopic intervention remains the preferred treatment modality when the clinical setting permits. This is especially applicable in cases of main pancreatic duct injury with ascites as well as pseudocysts.Keywords: endotherapy, non-operative management, surgery, traumatic pancreatic injury
Procedia PDF Downloads 207445 Land Use Dynamics of Ikere Forest Reserve, Nigeria Using Geographic Information System
Authors: Akintunde Alo
Abstract:
The incessant encroachments into the forest ecosystem by farmers and local contractors constitute a major threat to the conservation of genetic resources and biodiversity in Nigeria. To propose a viable monitoring system, this study employed Geographic Information System (GIS) technology to assess the changes that occurred over a period of five years (between 2011 and 2016) in Ikere forest reserve. Landsat imagery of the forest reserve was obtained. For the purpose of geo-referencing the acquired satellite imagery, ground-truth coordinates of some benchmark places within the forest reserve were relied on. Supervised classification, image processing, vectorization, and map production were realized using ArcGIS. The various land use systems within the forest ecosystem were digitized into polygons of different types and colours for 2011 and 2016, and roads were represented with lines of different thickness and colours. Of the six land-use classes delineated, grassland increased from 26.50% of the total land area in 2011 to 45.53% in 2016, a percentage change of 71.81%. Plantations of Gmelina arborea and Tectona grandis, on the other hand, reduced from 62.16% in 2011 to 27.41% in 2016. The farmland and degraded land recorded percentage changes of about 176.80% and 8.70%, respectively, from 2011 to 2016. Overall, the rate of deforestation in the study area is increasing and becoming severe. About 72.59% of the total land area has been converted to non-forestry uses, while the remaining 27.41% is occupied by plantations of Gmelina arborea and Tectona grandis. Interestingly, over 55% of the plantation area in 2011 had changed to grassland or been converted to farmland and degraded land by 2016. The rate of change over time was about 9.79% annually. 
Based on these results, rapid actions to prevail on the encroachers to stop deforestation and to encourage re-afforestation in the study area are recommended.Keywords: land use change, forest reserve, satellite imagery, geographical information system
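The percentage-change figures quoted above follow directly from the class shares at the two dates. As a check, the grassland and plantation changes can be recomputed from the reported shares:

```python
def percent_change(share_start, share_end):
    """Relative change (%) of a land-use class share between two dates."""
    return (share_end - share_start) / share_start * 100.0

# Shares of total land area reported in the abstract (2011 -> 2016)
grassland = percent_change(26.50, 45.53)
plantation = percent_change(62.16, 27.41)
print(f"Grassland change 2011-2016:  {grassland:.2f}%")
print(f"Plantation change 2011-2016: {plantation:.2f}%")
```

The grassland figure reproduces the 71.81% change stated in the abstract; the plantation share fell by roughly 56% of its 2011 value over the same period.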
Procedia PDF Downloads 356444 Comparative Correlation Investigation of Polynuclear Aromatic Hydrocarbons (PAHs) in Soils of Different Land Uses: Sources Evaluation Perspective
Authors: O. Onoriode Emoyan, E. Eyitemi Akporhonor, Charles Otobrise
Abstract:
Polycyclic Aromatic Hydrocarbons (PAHs) are formed mainly as a result of incomplete combustion of organic materials during industrial and domestic activities, or by natural occurrence. Their toxicity and their contamination of terrestrial and aquatic ecosystems have been established. Though of limited validity, previous research has focused on PAH isomer pair ratios of variable physicochemical properties for source identification. The objective of this investigation was to determine the empirical validity of the Pearson correlation coefficient (PCC) and cluster analysis (CA) in PAH source identification across soil samples of different land uses. Therefore, 16 PAHs classed as endocrine disruption substances (EDSs) were determined at 10 sample stations in top and sub soils seasonally. PAHs were determined using a Varian 300 gas chromatograph interfaced with a flame ionization detector. Instruments and reagents used were of standard and chromatographic grades, respectively. PCC and CA results showed that the classification of PAHs into those that are kinetically and thermodynamically favored and those derived directly from plant products through biologically mediated processes, as used in source signatures, reflects what the predominant PAHs are likely to be. The observed PAHs in the studied stations therefore contain trace quantities of the vast majority of the sixteen un-substituted PAHs, which may ultimately inhibit actual source signature authentication. Type and extent of bacterial metabolism, transformation products/substrates, and environmental factors such as salinity, pH, oxygen concentration, nutrients, light intensity, temperature, co-substrates, and environmental medium are hereby recommended as factors to be considered when evaluating possible sources of PAHs.Keywords: comparative correlation, kinetically and thermodynamically-favored PAHs, pearson correlation coefficient, cluster analysis, sources evaluation
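The PCC and CA workflow described above can be sketched with SciPy: pairwise Pearson correlations between PAH concentration profiles, followed by hierarchical clustering on a correlation distance. The concentrations, station count, and the idea that one compound comes from an unrelated source are all hypothetical illustration, not the study's data.

```python
import numpy as np
from scipy.stats import pearsonr
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical concentrations (ng/g) of four PAHs at six stations; three
# co-vary with a shared combustion source, one follows an unrelated source
rng = np.random.default_rng(42)
base = rng.uniform(10, 100, size=6)
data = {
    "naphthalene":    base * 1.0 + rng.normal(0, 2, 6),
    "phenanthrene":   base * 0.8 + rng.normal(0, 2, 6),
    "benzo[a]pyrene": rng.uniform(5, 20, size=6),   # unrelated source
    "pyrene":         base * 1.2 + rng.normal(0, 2, 6),
}

# Pairwise Pearson correlation: strongly co-varying PAHs hint at one source
r, p = pearsonr(data["naphthalene"], data["phenanthrene"])
print(f"naphthalene vs phenanthrene: r = {r:.2f}, p = {p:.3f}")

# Cluster the PAHs on their station profiles using a correlation distance
profiles = np.array(list(data.values()))
Z = linkage(profiles, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")
print(dict(zip(data.keys(), labels)))
```

With two clusters requested, the three co-varying compounds group together and the independent one separates, which is the grouping logic used for source evaluation.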
Procedia PDF Downloads 419443 Data Mining Spatial: Unsupervised Classification of Geographic Data
Authors: Chahrazed Zouaoui
Abstract:
In recent years, the volume of geospatial information has been increasing due to the evolution of communication and information technologies; this information is often presented by geographic information systems (GIS) and stored in spatial databases (SDB). Classical data mining has revealed a weakness in knowledge extraction from these enormous amounts of data due to the particularity of spatial entities, which are characterized by the interdependence between them (the first law of geography). This gave rise to spatial data mining. Spatial data mining is a process of analyzing geographic data which allows the extraction of knowledge and spatial relationships from geospatial data; among the methods of this process we distinguish the monothematic and the thematic. Geo-clustering is one of the main tasks of spatial data mining, and it belongs to the monothematic methods. It groups similar geo-spatial entities into the same class and assigns more dissimilar ones to different classes. In other words, it maximizes intra-class similarity and minimizes inter-class similarity, taking account of the particularity of geo-spatial data. Two approaches to geo-clustering exist: dynamic processing of the data, which involves applying algorithms designed for the direct treatment of spatial data, and an approach based on spatial data pre-processing, which consists of applying classic clustering algorithms to pre-processed data (with spatial relationships integrated). 
This approach (based on pre-treatment) is quite complex in various cases, so the search for approximate solutions involves the use of approximation algorithms, including the algorithms we are interested in: dedicated approaches (partitioning and density-based clustering methods) and bee-inspired approaches (biomimetic). Our study proposes a design highly relevant to this problem, using different algorithms for automatically detecting geo-spatial neighborhoods in order to implement geo-clustering by pre-treatment, and applying the bees algorithm to this problem for the first time in the geo-spatial field.Keywords: mining, GIS, geo-clustering, neighborhood
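The density-based family mentioned above can be illustrated with a minimal DBSCAN-style pass over 2-D point coordinates. This toy implementation, with hypothetical eps/min_pts values, is a sketch of the general technique, not the study's algorithm:

```python
import math

def region_query(points, i, eps):
    """Indices of all points within eps of point i (including i itself)."""
    return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

def dbscan(points, eps=1.5, min_pts=3):
    """Minimal density-based clustering; label -1 marks noise."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = region_query(points, i, eps)
        if len(neighbors) < min_pts:
            labels[i] = -1          # noise (may be claimed by a cluster later)
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:     # former noise becomes a border point
                labels[j] = cluster
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_neighbors = region_query(points, j, eps)
            if len(j_neighbors) >= min_pts:   # core point: keep expanding
                seeds.extend(j_neighbors)
    return labels

# Two dense spatial neighborhoods plus one isolated point
pts = [(0, 0), (0, 1), (1, 0), (1, 1),
       (10, 10), (10, 11), (11, 10),
       (50, 50)]
print(dbscan(pts))
```

The density criterion is what makes this family attractive for geo-spatial data: clusters of arbitrary shape emerge from the neighborhood structure, and sparse outliers are flagged as noise rather than forced into a class.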
Procedia PDF Downloads 375442 Gender Bias in Natural Language Processing: Machines Reflect Misogyny in Society
Authors: Irene Yi
Abstract:
Machine learning, natural language processing, and neural network models of language are becoming more and more prevalent in the fields of technology and linguistics today. Training data for machines are, at best, large corpora of human literature and, at worst, a reflection of the ugliness in society. Machines have been trained on millions of human books, only to find that over the course of human history, derogatory and sexist adjectives are used significantly more frequently when describing females in history and literature than when describing males. This is extremely problematic, both as training data and as the outcome of natural language processing. As machines start to handle more responsibilities, it is crucial to ensure that they do not carry with them historical sexist and misogynistic notions. This paper gathers data and algorithms from neural network models of language dealing with syntax, semantics, sociolinguistics, and text classification. Results are significant in showing the existing intentional and unintentional misogynistic notions used to train machines, as well as in developing better technologies that take into account the semantics and syntax of text to be more mindful and reflect gender equality. Further, this paper deals with the idea of non-binary gender pronouns and how machines can process these pronouns correctly given their semantic and syntactic context. This paper also delves into the implications of gendered grammar and its effect, cross-linguistically, on natural language processing. Languages such as French or Spanish not only have rigid gendered grammar rules but also historically patriarchal societies. The progression of society comes hand in hand not only with its language, but with how machines process those natural languages. 
These ideas are all extremely vital to the development of natural language models in technology, and they must be taken into account immediately.Keywords: gendered grammar, misogynistic language, natural language processing, neural networks
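One crude way to surface the adjective asymmetry described above is to count which descriptive adjectives co-occur with gendered words in a corpus. The mini-corpus, word lists, and matching rule below are all invented for illustration; the paper's analysis uses neural models, not this sketch.

```python
from collections import Counter

FEMALE = {"she", "her", "woman", "girl"}
MALE = {"he", "his", "man", "boy"}
ADJECTIVES = {"brilliant", "hysterical", "shrill", "bossy", "logical", "strong"}

corpus = [
    "she was hysterical and shrill during the meeting",
    "he was brilliant and logical in the debate",
    "the woman seemed bossy to her colleagues",
    "the man was strong and brilliant",
]

def adjective_counts(sentences, gender_words):
    """Count target adjectives in sentences containing a gendered word."""
    counts = Counter()
    for sentence in sentences:
        tokens = sentence.split()
        if any(t in gender_words for t in tokens):
            counts.update(t for t in tokens if t in ADJECTIVES)
    return counts

print("female-context:", adjective_counts(corpus, FEMALE))
print("male-context:  ", adjective_counts(corpus, MALE))
```

On real corpora of millions of books, skews in counts like these are the raw signal that biased training data passes on to downstream models.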
Procedia PDF Downloads 120441 Visualization Tool for EEG Signal Segmentation
Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh
Abstract:
This work is about developing a tool for the visualization and segmentation of electroencephalograph (EEG) signals based on frequency domain features. Changes in the frequency domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm provides a way to represent the change in mental states using the different frequency band powers in the form of a segmented EEG signal. Many segmentation algorithms with applications in brain-computer interfaces, epilepsy, and cognition studies have been suggested in the literature and used for data classification, but the proposed method focuses mainly on the better presentation of the signal, which is why it could be a good visualization tool for clinicians. The algorithm performs basic filtering using band pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet transform based de-noising method. Frequency domain features are used for segmentation, considering the fact that the spectral power of the different frequency bands describes the mental state of the subject. Two sliding windows are further used for segmentation; one provides the time scale and the other assigns the segmentation rule. The segmented data are displayed second by second successively with different color codes, and the segment length can be selected as per the needs of the objective. The proposed algorithm has been tested on an EEG data set obtained from the University of California, San Diego's online data repository. The proposed tool gives a better visualization of the signal in the form of segmented epochs of desired length representing the power spectrum variation in the data. The algorithm is designed in such a way that it takes the data points with respect to the sampling frequency for each time frame, so it can be extended for real-time visualization with a desired epoch length.Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation
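The core of such a pipeline, band-pass filtering followed by per-window band powers, can be sketched with SciPy. The sampling rate, band definitions, and synthetic alpha-dominant signal below are assumptions for illustration, not the tool's actual parameters or data.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 256                       # sampling rate, Hz (assumed)
t = np.arange(0, 4, 1 / fs)    # one 4-second window
# Synthetic EEG-like signal: dominant 10 Hz (alpha) component plus noise
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

# Basic filtering, as in the tool: band-pass 0.1-45 Hz
b, a = butter(4, [0.1, 45], btype="bandpass", fs=fs)
x_filt = filtfilt(b, a, x)

# Band powers for this window from the Welch power spectrum
f, psd = welch(x_filt, fs=fs, nperseg=fs)
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
power = {name: psd[(f >= lo) & (f < hi)].sum() for name, (lo, hi) in bands.items()}
dominant = max(power, key=power.get)
print(power, "->", dominant)
```

Sliding this computation over successive windows and color-coding each one by its dominant band reproduces the segmented display the abstract describes.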
Procedia PDF Downloads 397440 White Wine Discrimination Based on Deconvoluted Surface Enhanced Raman Spectroscopy Signals
Authors: Dana Alina Magdas, Nicoleta Simona Vedeanu, Ioana Feher, Rares Stiufiuc
Abstract:
Food and beverage authentication using rapid and inexpensive analytical tools represents an important challenge nowadays. In this regard, the potential of vibrational techniques in food authentication has gained increased attention during the last years. For wine discrimination, Raman spectroscopy appears more feasible than IR (infrared) spectroscopy because of the relatively weak water bending mode in the vibrational spectroscopy fingerprint range. Despite this, the use of the Raman technique in wine discrimination is at an early stage. Taking this into consideration, the wine discrimination potential of the surface-enhanced Raman scattering (SERS) technique is reported in the present work. The novelty of this study, compared with previously reported studies applying vibrational techniques to wine discrimination, consists in the fact that the present work differentiates wines based on the individual signals obtained from deconvoluted spectra. In order to achieve wine classification with respect to variety, geographical origin, and vintage, the peak intensities obtained after spectra deconvolution were compared using supervised chemometric methods like Linear Discriminant Analysis (LDA). For this purpose, a set of 20 white Romanian wines of four varieties from different Romanian viticultural regions was considered. Chemometric methods applied directly to raw SERS experimental spectra proved their efficiency, but the identification of discrimination markers was found to be very difficult due to overlapped signals as well as band shifts. By using the present approach, a better general view of the differences that appear among the wines in terms of composition could be reached.Keywords: chemometry, SERS, variety, wines discrimination
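The supervised step, LDA on deconvoluted peak intensities, can be sketched with scikit-learn. The peak intensities, two-class setup, and class means below are synthetic stand-ins; the study classifies 20 wines of four varieties from real fitted peaks.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical intensities of three deconvoluted SERS peaks for 20 wines
# of two varieties; real inputs would be the fitted peak intensities
rng = np.random.default_rng(7)
variety_a = rng.normal([1.0, 0.4, 0.8], 0.1, size=(10, 3))
variety_b = rng.normal([0.6, 0.9, 0.5], 0.1, size=(10, 3))
X = np.vstack([variety_a, variety_b])
y = np.array([0] * 10 + [1] * 10)

# Supervised discrimination with cross-validation on the small sample
lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

Working from individual deconvoluted peaks, rather than raw spectra, is what lets the LDA loadings point back to specific bands as candidate discrimination markers.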
Procedia PDF Downloads 160439 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters
Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev
Abstract:
Humanity is faced more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain possible early signals about events which are occurring or may occur and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the global Internet are developed. Information in Romanian is of special interest to us. In order to obtain the mentioned tools, we follow several steps, divided into a preparatory stage and a processing stage. Throughout the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters, constituting more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the process of classification and identification of texts related to the field of social disasters. To solve the second problem, the formalism of Petri nets has been used. We deal with the problem of inhabitants' evacuation in useful time. Analysis methods such as the reachability or coverability tree and the invariants technique will be used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE such as Generalized Stochastic Petri Net (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis have been used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system's dynamics.Keywords: lexicon of disasters, modelling, Petri nets, text annotation, social disasters
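The controlled-vocabulary classification step can be sketched as keyword matching against per-category word lists. The mini-vocabulary and sentences below are invented English stand-ins; the actual vocabulary has 300+ keywords in Romanian across 10 disaster categories.

```python
# Hypothetical mini-vocabulary: category -> keywords
VOCABULARY = {
    "fire": {"fire", "smoke", "burn", "flames"},
    "flood": {"flood", "water", "overflow", "rain"},
    "accident": {"collision", "crash", "injured", "vehicle"},
}

def classify(text):
    """Score each category by keyword hits; return the best match or None."""
    tokens = set(text.lower().split())
    scores = {cat: len(tokens & kws) for cat, kws in VOCABULARY.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(classify("Heavy rain caused the river to overflow and flood the village"))
print(classify("A vehicle crash left three people injured"))
```

Texts flagged this way become the early signals that feed the second stage, the Petri-net evacuation scenarios.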
Procedia PDF Downloads 197438 A Machine Learning Approach for Detecting and Locating Hardware Trojans
Authors: Kaiwen Zheng, Wanting Zhou, Nan Tang, Lei Li, Yuanhang He
Abstract:
The integrated circuit industry has become a cornerstone of the information society, finding widespread application in areas such as industry, communication, medicine, and aerospace. However, with the increasing complexity of integrated circuits, hardware Trojans (HTs) implanted by attackers have become a significant threat to their security. In this paper, we propose a hardware trojan detection method for large-scale circuits. As HTs introduce physical characteristic changes, such as in structure, area, and power consumption, as additional redundant circuits, the method is a machine-learning-based hardware trojan detection approach built on the physical characteristics of gate-level netlists. It transforms the hardware trojan detection problem into a machine-learning binary classification problem based on physical characteristics, greatly improving detection speed. To address the problem of imbalanced data, where the number of pure circuit samples is far greater than that of HT circuit samples, we used the SMOTETomek algorithm to expand the dataset and further improve the performance of the classifier. We used three machine learning algorithms, K-Nearest Neighbors, Random Forest, and Support Vector Machine, to train and validate benchmark circuits on Trust-Hub, and all achieved good results. In our case studies based on AES encryption circuits provided by Trust-Hub, the test results showed the effectiveness of the proposed method. To further validate the method's effectiveness for detecting variant HTs, we designed variant HTs using open-source HTs. The proposed method guarantees robust detection accuracy with millisecond-level detection times for IC and FPGA design flows, and has good detection performance for library variant HTs.Keywords: hardware trojans, physical properties, machine learning, hardware security
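The imbalanced-classification setup can be sketched with scikit-learn. The paper uses SMOTETomek (from the imbalanced-learn library); here a naive random-oversampling stand-in is used instead, and the "physical features" and class means are synthetic assumptions, not gate-level netlist data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(3)
# Synthetic physical features (e.g., fan-in, switching power, area) for
# 950 benign gates and 50 trojan gates: a heavily imbalanced dataset
X_benign = rng.normal(0.0, 1.0, size=(950, 3))
X_trojan = rng.normal(2.5, 1.0, size=(50, 3))
X = np.vstack([X_benign, X_trojan])
y = np.array([0] * 950 + [1] * 50)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Naive random oversampling of the minority class (stand-in for SMOTETomek)
minority = np.flatnonzero(y_tr == 1)
extra = rng.choice(minority, size=(y_tr == 0).sum() - minority.size)
X_bal = np.vstack([X_tr, X_tr[extra]])
y_bal = np.concatenate([y_tr, y_tr[extra]])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_bal, y_bal)
recall = recall_score(y_te, clf.predict(X_te))
print(f"trojan recall: {recall:.2f}")
```

Recall on the minority (trojan) class is the metric that matters here: without rebalancing, a classifier can score high accuracy while missing most trojan gates.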
Procedia PDF Downloads 147437 Investigation of Different Machine Learning Algorithms in Large-Scale Land Cover Mapping within the Google Earth Engine
Authors: Amin Naboureh, Ainong Li, Jinhu Bian, Guangbin Lei, Hamid Ebrahimy
Abstract:
Large-scale land cover mapping has become a new challenge in the land change and remote sensing fields because it involves a big volume of data. Moreover, selecting the right classification method is quite difficult, especially when there are different types of landscapes in the study area. This paper is an attempt to compare the performance of different machine learning (ML) algorithms for generating a land cover map of the China-Central Asia-West Asia Corridor, which is considered one of the main parts of the Belt and Road Initiative (BRI) project. The cloud-based Google Earth Engine (GEE) platform was used to generate a land cover map for the study area from Landsat-8 images (2017) by applying three frequently used ML algorithms: random forest (RF), support vector machine (SVM), and artificial neural network (ANN). The selected ML algorithms (RF, SVM, and ANN) were trained and tested using reference data obtained from the MODIS yearly land cover product and very high-resolution satellite images. The findings of the study illustrated that among the three frequently used ML algorithms, RF, with 91% overall accuracy, had the best result in producing a land cover map for the China-Central Asia-West Asia Corridor, whereas ANN showed the worst result, with 85% overall accuracy. The great performance of GEE in applying different ML algorithms and handling huge volumes of remotely sensed data in the present study showed that it could also help researchers to generate reliable long-term land cover change maps. The findings of this research have great importance for decision-makers and the BRI's authorities in strategic land use planning.Keywords: land cover, google earth engine, machine learning, remote sensing
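The three-way comparison can be sketched offline with scikit-learn on synthetic spectral samples. The class means, band count, and sample sizes below are invented for illustration; the study runs RF, SVM, and ANN inside GEE on Landsat-8 reflectances with MODIS-derived reference data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic "spectral" samples: 3 land-cover classes x 6 Landsat-like bands
rng = np.random.default_rng(5)
means = np.array([[0.10, 0.10, 0.10, 0.60, 0.30, 0.20],    # vegetation
                  [0.30, 0.30, 0.30, 0.30, 0.40, 0.40],    # bare soil
                  [0.05, 0.05, 0.05, 0.02, 0.01, 0.01]])   # water
X = np.vstack([rng.normal(m, 0.05, size=(200, 6)) for m in means])
y = np.repeat([0, 1, 2], 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf"),
    "ANN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
}
accuracy = {name: accuracy_score(y_te, m.fit(X_tr, y_tr).predict(X_te))
            for name, m in models.items()}
print(accuracy)
```

On this easy synthetic problem all three classifiers score highly; the study's 91% vs. 85% gap emerges on real, mixed landscapes where class spectra overlap.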
Procedia PDF Downloads 113436 The Predictors of Head and Neck Cancer-Head and Neck Cancer-Related Lymphedema in Patients with Resected Advanced Head and Neck Cancer
Authors: Shu-Ching Chen, Li-Yun Lee
Abstract:
The purpose of the study was to identify the factors associated with head and neck cancer-related lymphedema (HNCRL)-related symptoms, body image, and HNCRL-related functional outcomes among patients with resected advanced head and neck cancer. A cross-sectional correlational design was used to examine the predictors of HNCRL-related functional outcomes in patients with resected advanced head and neck cancer. Eligible patients were recruited from a single medical center in northern Taiwan. Consecutive patients were approached and recruited from the Radiation Head and Neck Outpatient Department of this medical center. Eligible subjects were assessed with the Symptom Distress Scale-Modified for Head and Neck Cancer (SDS-mhnc), the Brief International Classification of Functioning, Disability and Health (ICF) Core Set for Head and Neck Cancer (BCSQ-H&N), the Body Image Scale-Modified (BIS-m), the MD Anderson Head and Neck Lymphedema Rating Scale (MDAHNLRS), Foldi's Stages of Lymphedema (Foldi's Scale), Patterson's Scale, the UCLA Shoulder Rating Scale (UCLA SRS), and Karnofsky's Performance Status Index (KPS). The results showed that the worst problems were with body HNCRL functional outcomes. Patients' HNCRL symptom distress and performance status are robust predictors of overall HNCRL functional outcomes, problems with body HNCRL functional outcomes, and activity and social functioning HNCRL functional outcomes. Based on the results of this research program, we will develop a Cancer Rehabilitation and Lymphedema Care Program (CRLCP) for use in the care of patients with resected advanced head and neck cancer.Keywords: head and neck cancer, resected, lymphedema, symptom, body image, functional outcome
Procedia PDF Downloads 258435 A Critical Study on Unprecedented Employment Discrimination and Growth of Contractual Labour Engaged by Rail Industry in India
Authors: Munmunlisa Mohanty, K. D. Raju
Abstract:
The rail industry, one of the model employers in India, has separate national legislation (the Railways Act, 1989) to regulate its vast employment structure, functioning across the country. Indian Railways is not only the premier transport industry of the country; it is also Asia's most extensive rail network organisation and the world's second-largest industry functioning under one management. With the growing globalization of industrial products, the scope of anti-employment discrimination is no longer confined to the gender aspect only; instead, it extends to the unregularized classification of the labour force applicable in various industrial establishments in India. The Indian rail industry has inadvertently enhanced such discriminatory employment trends by engaging contractual labour in an unprecedented manner. The engagement of contractual labour by the rail industry has dissolved the core "employer-employee" relationship between rail management and contractual labourers, who are employed through contractors. This employment trend reduces the cost of production and supervision, discourages contractual labourers from forming unions, and reduces their collective bargaining capacity. So, the primary intention of this paper is to highlight the increasing scope of employment discrimination against contractual labour engaged by Indian Railways. This paper critically analyses the diminishing anti-employment-discrimination protection practiced by Indian Railways towards contractual labour and demands an urgent outlook on the probable scope of anti-employment discrimination against contractual labour engaged by Indian Railways. The researcher used a doctrinal methodology in which primary materials (the Railways Act, the Contract Labour Act, and the Occupational Safety, Health and Working Conditions Code, 2020) and secondary data (the CAG Report 2018, Railway Employment Regulation Rules, ILO Reports, etc.) are used.Keywords: anti-employment, CAG Report, contractual labour, discrimination, Indian Railway, principal employer
Procedia PDF Downloads 170