Search results for: processing programs
4398 Comparison of Tribological and Mechanical Properties of White Metal Produced by Laser Cladding and Conventional Methods
Authors: Jae-Il Jeong, Hoon-Jae Park, Jung-Woo Cho, Yang-Gon Kim, Jin-Young Park, Joo-Young Oh, Si-Geun Choi, Seock-Sam Kim, Young Tae Cho, Chan Gyu Kim, Jong-Hyoung Kim
Abstract:
Bearing components are strongly required to exhibit reduced vibration and wear in order to achieve high durability and a long lifetime. In industry, bearing durability is improved by treating the bearing surface using centrifugal casting or gravity casting production methods. However, these manufacturing methods cause problems such as long processing times, high defect rates, and harmful health effects. Laser cladding deposition addresses these problems by providing fast processing and good adhesion. Therefore, the optimum conditions for white metal laser deposition should be studied to minimize bearing contact-axis wear using laser cladding techniques. In this study, we deposit a soft white metal layer on SCM440, which is mainly used for shafts and bolts. In the laser deposition process, the laser power, powder feed rate, and laser head speed are controlled to find the optimal conditions. We also measure hardness using a micro Vickers tester and perform FE-SEM (Field Emission Scanning Electron Microscopy) and EDS (Energy Dispersive Spectroscopy) analyses to study the mechanical properties and surface characteristics as these parameters vary. Furthermore, this paper suggests the optimum conditions for laser cladding deposition for application in industrial fields. This work was supported by the Industrial Innovation Project of the Korea Evaluation Institute of Industrial Technology (KEIT) with financial resources granted by the Ministry of Trade, Industry & Energy, Republic of Korea (Research no. 10051653).
Keywords: laser deposition, bearing, white metal, mechanical properties
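The micro Vickers measurements mentioned above follow the standard relation HV = 1.8544·F/d², where F is the load in kgf and d is the mean indentation diagonal in mm. A minimal sketch (the 0.3 kgf load and 60 µm diagonal are illustrative values, not measurements from the study):

```python
def vickers_hardness(load_kgf, d1_mm, d2_mm):
    """Vickers hardness number from the indenter load (kgf) and the
    two measured indentation diagonals (mm), standard HV formula."""
    d = (d1_mm + d2_mm) / 2.0  # mean diagonal, mm
    return 1.8544 * load_kgf / d ** 2

# Illustrative: 0.3 kgf micro-Vickers load, 60 um mean diagonal
print(round(vickers_hardness(0.3, 0.060, 0.060), 1))  # -> 154.5
```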
Procedia PDF Downloads 264
4397 Establishment of Precision System for Underground Facilities Based on 3D Absolute Positioning Technology
Authors: Yonggu Jang, Jisong Ryu, Woosik Lee
Abstract:
The study addresses the limitations of existing underground facility exploration equipment in terms of exploration depth range, relative (rather than absolute) depth measurement, data processing time, and human-centered interpretation of ground penetrating radar (GPR) images. To do so, it establishes a precision underground facility exploration system based on 3D absolute positioning technology that can accurately survey to a depth of 5 m and measure the 3D absolute location of underground facilities. The software technologies developed include absolute positioning, synchronization of the GPR equipment's ground-surface location, AI interpretation of GPR exploration images, and composite data processing based on integrated underground space maps. The hardware comprises a vehicle-type exploration system and a cart-type exploration system. Data were collected using the developed system, the GPR exploration images were analyzed with AI, and the three-dimensional location information of the detected underground facilities was compared against the integrated underground space map.
The system's software builds a precise 3D DEM, synchronizes the GPR sensor's ground-surface 3D location coordinates, automatically detects underground facility information in GPR exploration images, and improves accuracy through comparative analysis of the three-dimensional location information. These findings and technological advancements are essential for underground safety management in Korea: the proposed system establishes precise location information for underground facilities and improves the accuracy and efficiency of exploration to a depth of 5 m.
Keywords: 3D absolute positioning, AI interpretation of GPR exploration images, complex data processing, integrated underground space maps, precision exploration system for underground facilities
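The depth of a reflector in a GPR survey follows from the two-way travel time and the wave speed in the ground, v = c/√εr. A minimal sketch (the 100 ns travel time and relative permittivity of 9 are illustrative assumptions, not values from the study):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def gpr_depth(two_way_time_ns, rel_permittivity):
    """Reflector depth from a GPR two-way travel time: the wave speed
    in the ground is c / sqrt(eps_r), and the pulse travels the
    distance twice (down and back)."""
    v = C / math.sqrt(rel_permittivity)        # wave speed in medium, m/s
    return v * (two_way_time_ns * 1e-9) / 2.0  # depth in metres

# Illustrative: 100 ns two-way time in moist soil (eps_r ~ 9)
print(round(gpr_depth(100, 9), 2))  # roughly the system's 5 m rated depth
```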
Procedia PDF Downloads 62
4396 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g. TA-Lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
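The outlier-removal step described above, identifying prediction points that fall outside the parameter space of the training window, can be sketched with a simple per-feature z-score cut (illustration only; FreqAI's actual outlier tools are more sophisticated and its real feature sets are far larger):

```python
from statistics import mean, stdev

def filter_outliers(train_rows, new_rows, n_std=3.0):
    """Drop prediction rows that fall outside the training data's
    parameter space, judged feature-by-feature with a z-score cut."""
    n_feats = len(train_rows[0])
    mu = [mean(r[i] for r in train_rows) for i in range(n_feats)]
    sd = [stdev(r[i] for r in train_rows) or 1.0 for i in range(n_feats)]
    kept = []
    for row in new_rows:
        if all(abs(x - m) <= n_std * s for x, m, s in zip(row, mu, sd)):
            kept.append(row)
    return kept

# Tiny hypothetical feature vectors (e.g. two indicator values per row)
train = [(1.0, 10.0), (1.2, 11.0), (0.9, 9.5), (1.1, 10.4)]
candidates = [(1.05, 10.2), (8.0, 50.0)]  # second point is far outside
print(filter_outliers(train, candidates))  # keeps only the first point
```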
Procedia PDF Downloads 89
4395 Arabic Light Word Analyser: Roles with Deep Learning Approach
Authors: Mohammed Abu Shquier
Abstract:
This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of web morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also addresses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which together justify updating this system. Most Arabic word analysis systems are based on phonotactic morpho-syntactic analysis of a word using lexical rules, as mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. It is therefore necessary to have an automatic analysis tool that takes into account the word sense, not only the morpho-syntactic category. Moreover, such systems are also based on statistical/stochastic models; these models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc.
As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage has been very low in Dialectal Arabic, it is especially important to investigate how dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that accounting for dialectal variability can improve analysis.
Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN
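The binary segmentation hypotheses mentioned above can be encoded as per-character labels, which is the kind of target a BP-LSTM-CRF segmenter is trained to predict. A minimal sketch (the transliterated word and its morpheme boundaries are hypothetical, for illustration only):

```python
def segmentation_labels(word, boundaries):
    """Turn a word plus known morpheme-start positions into binary
    per-character tags (1 = a new segment starts at this character),
    the target sequence for a neural segmentation model."""
    starts = {0} | set(boundaries)
    return [1 if i in starts else 0 for i in range(len(word))]

# Hypothetical clitic-rich word "wktbhm" segmented as w + ktb + hm,
# i.e. segments starting at character indices 0, 1 and 4
print(segmentation_labels("wktbhm", [1, 4]))  # -> [1, 1, 0, 0, 1, 0]
```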
Procedia PDF Downloads 42
4394 Prevalence of Gastrointestinal Nematodes of Farm Animals by Copro-Culture
Authors: Mosaab A. Omar, Mohammad Saleh Al-Aboody
Abstract:
In the present study, 442 fecal samples were examined: 171 from cattle, 128 from buffaloes, and 143 from sheep. During the period from May 2014 to April 2015, fecal examination showed infection rates with abomasal nematodes of 30% in cattle, 22.6% in buffaloes, and 31.4% in sheep. Fecal culture gave rates of 47.5%, 30%, and 50.3% in cattle, buffaloes, and sheep, respectively. Seasonal infection with abomasal nematodes, as shown by fecal culture in cattle, was highest in summer (55.9%), followed by spring (54.1%), autumn (50%), and winter (33.3%). Cooperia spp. was the most prevalent larva in both cattle and buffaloes; Strongyloides papillosus was the most predominant in sheep. Here we introduce the first study of abomasal worm infection in ruminants in Qena, Egypt. The prevalence was found to be so high among all examined animals that we recommend the authorities apply suitable control programs.
Keywords: haemonchus, ostertagia, seasonal dynamics, floatation
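The infection rates above are apparent prevalences, i.e. positives over examined samples. A minimal sketch (the positive count of 51 is a hypothetical figure consistent with the reported ~30% cattle rate, not a number from the study):

```python
def prevalence(positives, examined):
    """Apparent prevalence as a percentage of examined samples."""
    return 100.0 * positives / examined

# Hypothetical: 51 of the 171 cattle samples positive -> ~30%
print(round(prevalence(51, 171), 1))  # -> 29.8
```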
Procedia PDF Downloads 461
4393 The Differences in Skill Performance Between Online and Conventional Learning Among Nursing Students
Authors: Nurul Nadrah
Abstract:
As a result of the COVID-19 pandemic, a movement control order was implemented, leading to the adoption of online learning as a substitute for conventional classroom instruction. This study aims to determine the differences in skill performance between online learning and conventional methods among nursing students. We employed a quasi-experimental design with purposive sampling, involving 59 nursing students, with online learning as the intervention. The study found a significant difference in student skill performance between online learning and conventional methods. In conclusion, in times of hardship it is necessary to implement alternative pedagogical approaches, especially in critical fields like nursing, to ensure the uninterrupted progression of educational programs. This study suggests that online learning can be effectively employed to impart knowledge to nursing students during their training.
Keywords: nursing education, online learning, skill performance, conventional learning method
Procedia PDF Downloads 47
4392 Understanding Cyber Terrorism from Motivational Perspectives: A Qualitative Data Analysis
Authors: Yunos Zahri, Ariffin Aswami
Abstract:
Cyber terrorism represents the convergence of two worlds: the virtual and the physical. The virtual world is a place in which computer programs function and data move, whereas the physical world is where people live and function. The merging of these two domains is the interface targeted in incidents of cyber terrorism. To better understand why acts of cyber terrorism are committed, this study presents the context of cyber terrorism from motivational perspectives. The motivational forces behind cyber terrorism can be social, political, ideological, and economic. In this research, data are analyzed using a qualitative method; semi-structured interviews with purposive sampling were used for data collection. With the growing interconnectedness between critical infrastructures and Information & Communication Technology (ICT), selecting targets that facilitate maximum disruption can significantly influence terrorists. This work provides a baseline for defining the concept of cyber terrorism from motivational perspectives.
Keywords: cyber terrorism, terrorism, motivation, qualitative analysis
Procedia PDF Downloads 422
4391 Intelligent Process and Model Applied for E-Learning Systems
Authors: Mafawez Alharbi, Mahdi Jemmali
Abstract:
E-learning is a developing area, especially in education, and can provide several benefits to learners. An intelligent system that collects all the components satisfying user preferences is therefore important. This research presents an approach capable of personalizing e-information and meeting users' needs according to their preferences. The proposal builds knowledge from successive evaluations made by the user and can also learn from the user's habits. Finally, we show a walk-through to demonstrate how the intelligent process works.
Keywords: artificial intelligence, architecture, e-learning, software engineering, processing
Procedia PDF Downloads 191
4390 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis
Authors: Mehrnaz Mostafavi
Abstract:
The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. 
The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and its potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans
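The sentence-feature categorization described above can be illustrated with a tiny rule-based classifier. The keyword patterns below are invented for illustration; the study's SQL/NLP pipeline and trained machine-learning classifiers are not reproduced here:

```python
import re

# Hypothetical keyword rules for radiology-report sentences
NODULE_PAT = re.compile(r"\b(nodule|nodular opacity)\b", re.I)
CONCERN_PAT = re.compile(r"\b(enlarg\w*|spiculat\w*|growth|increased)\b", re.I)

def classify_sentence(sentence):
    """Label a report sentence: does it mention a lung nodule, and
    does it also carry a concerning change feature?"""
    has_nodule = bool(NODULE_PAT.search(sentence))
    concerning = has_nodule and bool(CONCERN_PAT.search(sentence))
    return {"nodule": has_nodule, "concerning": concerning}

print(classify_sentence("Stable 4 mm nodule in the right upper lobe."))
print(classify_sentence("The previously seen nodule has increased in size."))
```

In practice such rules would only seed the dataset; the abstract's point is that trained classifiers then predict concerning features more robustly than keywords alone.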
Procedia PDF Downloads 101
4389 Multiscale Process Modeling Analysis for the Prediction of Composite Strength Allowables
Authors: Marianna Maiaru, Gregory M. Odegard
Abstract:
During the processing of high-performance thermoset polymer matrix composites, chemical reactions occur during elevated pressure and temperature cycles, causing the constituent monomers to crosslink and form a molecular network that can gradually sustain stress. As the crosslinking process progresses, the material naturally experiences a gradual shrinkage due to the increase in covalent bonds in the network. Once the cured composite completes the cure cycle and is brought to room temperature, the thermal expansion mismatch between the fibers and matrix causes additional residual stresses to form. These compounded residual stresses can compromise the reliability of the composite material and affect the composite strength. Composite process modeling is greatly complicated by the multiscale nature of the composite architecture. At the molecular level, the degree of cure controls the local shrinkage and thermal-mechanical properties of the thermoset. At the microscopic level, the local fiber architecture and packing affect the magnitudes and locations of residual stress concentrations. At the macroscopic level, the layup sequence controls the nature of crack initiation and propagation due to residual stresses. The goal of this research is to use molecular dynamics (MD) and finite element analysis (FEA) to predict the residual stresses in composite laminates and the corresponding effect on composite failure. MD is used to predict the polymer shrinkage and thermomechanical properties as a function of degree of cure. This information is used as input into FEA to predict the residual stresses at the microscopic level resulting from the complete cure process. Virtual testing is subsequently conducted to predict strength allowables. Experimental characterization is used to validate the modeling.
Keywords: molecular dynamics, finite element analysis, process modeling, multiscale modeling
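The thermal contribution to residual stress described above can be approximated at first order from the fiber/matrix CTE mismatch on cool-down. A minimal sketch using a constrained-matrix simplification with typical epoxy/carbon values (illustrative assumptions only, not the study's MD/FEA workflow, which also accounts for cure shrinkage and fiber packing):

```python
def thermal_residual_stress(E_matrix_gpa, alpha_matrix, alpha_fiber, delta_T):
    """First-order matrix residual stress (MPa) from the fiber/matrix
    thermal-expansion mismatch on cool-down:
    sigma ~ E_m * (alpha_m - alpha_f) * dT
    (fully constrained matrix; cure shrinkage neglected)."""
    return E_matrix_gpa * 1e3 * (alpha_matrix - alpha_fiber) * delta_T

# Typical values: E_m = 3 GPa, alpha_m = 55e-6/K, alpha_f ~ 0 along
# the fiber direction, cooled 160 K from the cure temperature
print(round(thermal_residual_stress(3.0, 55e-6, 0.0, 160), 1))  # -> 26.4 MPa
```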
Procedia PDF Downloads 92
4388 Design and Implementation of a Control System for a Walking Robot with Color Sensing and Line following Using PIC and ATMEL Microcontrollers
Authors: Ibraheem K. Ibraheem
Abstract:
The aim of this research is to design and implement a line-tracking mobile robot. The robot must follow a line of a given color drawn on the floor, avoid hitting moving objects such as another moving robot or walking people, and perform color sensing. The control system reacts by driving each of the motors to keep the tracking sensor over the middle of the line. Proximity sensors are used to avoid hitting moving objects that may pass in front of the robot. The programs were written using micro C instructions and then compiled for the PIC16F887 and ATmega48/88/168 microcontroller counterparts. Practical simulations show that the walking robot accurately achieves the line-following action, exactly recognizes the colors, and avoids any obstacle in front of it.
Keywords: color sensing, H-bridge, line following, mobile robot, PIC microcontroller, obstacle avoidance, phototransistor
Procedia PDF Downloads 398
4387 Implications of Internationalization for Management and Practice in Higher Education
Authors: Naziema Begum Jappie
Abstract:
The internationalization of higher education has become a focal point for academic institutions worldwide, including those in South Africa. This paper explores the multifaceted implications of internationalization on management and practice within the South African higher education landscape. Universities all over the world are increasingly recognizing the challenges of globalization and the pressures towards internationalization. Internationalization in higher education encompasses a range of activities, including academic exchange programs, research collaborations, joint degree programs, and the recruitment of international students and faculty. In South Africa, this process is driven by various factors, including the quest for global competitiveness, the pursuit of academic excellence, and the promotion of cultural diversity. However, while internationalization presents numerous opportunities, it also brings forth significant challenges that require careful consideration by management and practitioners in higher education institutions. Furthermore, the internationalization of higher education in South Africa has significant implications for teaching and learning practices. With an increasingly diverse student body, educators must employ innovative pedagogical approaches that cater to the needs and preferences of a multicultural cohort. This may involve the integration of global perspectives into the curriculum, the use of technology-enhanced learning platforms, and the promotion of intercultural competence among students and faculty. Additionally, the exchange of knowledge and ideas with international partners can enrich research activities and contribute to the advancement of knowledge in various fields. The internationalization of higher education in South Africa has profound implications for management and practice within academic institutions. 
While it offers opportunities for enhancing academic quality, promoting cultural exchange, and advancing research agendas, it also presents challenges that require strategic planning, resource allocation, and stakeholder engagement. By addressing these challenges proactively and leveraging the opportunities presented by internationalization, South African universities can position themselves as global leaders in higher education while contributing to the socio-economic development of the country and the continent at large. This paper draws together international experience in South Africa to explore the emerging patterns of strategy and practice in internationalizing higher education. It highlights critical notions of how the concepts of internationalization and globalization in the context of higher education are understood by those who lead universities, and what new challenges are being created as universities seek to become more international. Institutions cannot simply have bullet points in the strategic plan for the recruitment of international students; there has to be a complete commitment to a national strategy of inclusivity. The paper further examines the leadership styles that ensure transformation together with the goals set out for internationalization, discusses adding an international relations dimension to the curriculum, and addresses issues relevant to the cross-border delivery of higher education.
Keywords: challenges, higher education, internationalization, strategic focus
Procedia PDF Downloads 55
4386 Bactericidal Efficacy of Quaternary Ammonium Compound on Carriers with Food Additive Grade Calcium Hydroxide against Salmonella Infantis and Escherichia coli
Authors: M. Shahin Alam, Satoru Takahashi, Mariko Itoh, Miyuki Komura, Mayuko Suzuki, Natthanan Sangsriratanakul, Kazuaki Takehara
Abstract:
Cleaning and disinfection are key components of routine biosecurity in livestock farming and the food processing industry. The use of suitable disinfectants at their proper concentrations is an important factor in a successful biosecurity program. Disinfectants have optimum bactericidal and virucidal efficacies at temperatures above 20°C, but very few studies on the application and effectiveness of disinfectants at low temperatures have been done. In the present study, the bactericidal efficacies of food additive grade calcium hydroxide (FdCa(OH)₂), a quaternary ammonium compound (QAC), and their mixture were investigated under different conditions, including contact time, organic material (fetal bovine serum: FBS), and temperature, in both suspension and carrier tests. Salmonella Infantis and Escherichia coli, which are the most prevalent gram-negative bacteria in commercial poultry housing and the food processing industry, were used in this study. First, we evaluated these disinfectants at two temperatures (4°C and room temperature (RT, 25°C ± 2°C)) and seven contact times (0, 5, and 30 sec; 1, 3, 20, and 30 min) with suspension tests, either in the presence or absence of 5% FBS. Second, we investigated the bactericidal efficacies of these disinfectants by carrier tests (rubber, stainless steel, and plastic) at the same temperatures and four contact times (30 sec; 1, 3, and 5 min). We then compared the bactericidal efficacies of each disinfectant with that of their mixture, as follows. When QAC was diluted with redistilled water (dW2) at 1:500 (QACx500) to obtain a final didecyl-dimethylammonium chloride (DDAC) concentration of 200 ppm, it could inactivate Salmonella Infantis within 5 sec at RT, either with or without 5% FBS, in the suspension test; at 4°C, however, it required 30 min in the presence of 5% FBS. FdCa(OH)₂ solution alone could inactivate the bacteria within 1 min at both RT and 4°C, even with 5% FBS.
When FdCa(OH)₂ powder was added at a final concentration of 0.2% to QACx500 (Mix500), the mixture could inactivate the bacteria within 30 sec and 5 sec, respectively, with or without 5% FBS at 4°C. The findings from the suspension test indicated that low temperature inhibited the bactericidal efficacy of QAC, whereas Mix500 was effective regardless of short contact time and low temperature, even with 5% FBS. In the carrier test, a single disinfectant required somewhat more time to inactivate the bacteria on rubber and plastic surfaces than on stainless steel. However, Mix500 could inactivate S. Infantis on rubber, stainless steel, and plastic surfaces within 30 sec and 1 min, respectively, at RT and 4°C; for E. coli, it required only 30 sec at both temperatures. Thus, synergistic effects were observed on different carriers at both temperatures. For a successful enhancement of biosecurity during winter, disinfectants should be selected that combine short contact times with optimum efficacy against the target pathogen. The present findings help farmers develop proper strategies for applying disinfectants in livestock farming and the food processing industry.
Keywords: carrier, food additive grade calcium hydroxide (FdCa(OH)₂), quaternary ammonium compound, synergistic effects
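Bactericidal efficacy in suspension and carrier tests of this kind is conventionally reported as the log10 reduction in viable count. A minimal sketch (the CFU counts are hypothetical, not data from the study):

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """Log10 reduction in viable count, the usual measure of
    bactericidal efficacy in suspension and carrier tests."""
    return math.log10(cfu_before / cfu_after)

# Hypothetical: 10^7 CFU/mL reduced to 10^2 CFU/mL after treatment,
# a 5-log kill (commonly taken as effective disinfection)
print(log10_reduction(1e7, 1e2))  # -> 5.0
```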
Procedia PDF Downloads 294
4385 Genetic and Non-Genetic Evaluation of Milk Yield and Litter Size of Awassi Sheep in Drylands
Authors: Khaled Al-Najjar, Ahmad Q. Al-Momani, Ahmed Elnahas, Reda Elsaid
Abstract:
The research was carried out using records of Awassi sheep bred in drylands at Al-Fjaj Station, Jordan. It aimed to study the non-genetic factors affecting milk yield (MK) and litter size at birth (LZB) and to estimate heritability, repeatability, and the genetic and phenotypic correlations using the SAS and MTDFREML programs. The results were as follows: the averages of MK and LZB were 92.84 kg and 1.16, respectively. MK was highly significantly affected by parity, ewe age, lambing year, and lactation period, while only lambing year had a significant effect on LZB. Heritability and repeatability were 0.07 and 0.10 for MK, and 0.05 and 0.25 for LZB. The genetic and phenotypic correlations between MK and LZB were 0.17 and 0.02, respectively. The research concluded that the herd is genetically homogeneous and therefore needs increased genetic variance, by introducing LZB-improved rams and selecting females from dams that achieved at least four parities, to increase returns in drylands.
Keywords: Awassi sheep, genetic parameters, litter size, milk yield
Procedia PDF Downloads 121
4384 Income Inequality and Its Effects on Household Livelihoods in Parker Paint Community, Liberia
Authors: Robertson Freeman
Abstract:
The prime objective of this research is to examine income inequality and its effects on household livelihoods in Parker Paint. Much previous research failed to address the potential threat of income inequality to diverse household livelihood indicators, including health, food, housing, and transport, generalizing the effects of income differentials from a single indicator of livelihood security. This research fills that gap by examining how income inequality affects household livelihoods across several indicators, including health, food security, and transport. The researcher employed a mixed research method to analyze the distribution of income and solicit the opinions of household heads on the effects of their monthly income on their livelihoods. Age and sex structure, household composition, type of employment, and educational status influence income inequality. The level of income, the Lorenz curve, and the Gini coefficient were jointly employed to calculate and determine the level of income inequality. One hundred eighty-two household heads (96%) are employed, while 8 (4%) are unemployed. Of the 182 employed, 27 (14%) work in the formal private sector, while 110 (58%) work in the informal private sector. Monthly average income, savings, investments, and unexpected circumstances affect household livelihoods. Infrastructural development and well-being should be pursued by reducing expenditure earmarked for other sectors and channeling the funds towards the provision of household needs. One potent tool for consolidating household livelihoods is to initiate livelihood empowerment programs.
Government and private sector agencies should establish more health insurance schemes; provide mosquito nets, immunization services, and public transport; and embark on feeding programs, especially in the remote areas of Parker Paint. To summarize the research findings: self-employment, entrepreneurship, and private sector employment in general are a double-edged sword. Employment in the private sector raises the likelihood of increasing one's income, but it also widens the income gap between rich and poor, since many people are exploited by the affluent, relegating the poor down the wealth hierarchy. Age and sex structure, as well as type of employment, should not be overlooked, since they all play fundamental roles in influencing income inequality. Savings and investments might be expected to reduce income inequality; in this research, however, they affect livelihoods negatively. People should strive to earn sufficient income and embrace measures that retain their financial strength. In so doing, they will be able to provide basic household needs, reduce unemployment and dependence, and ensure sustainable livelihoods.
Keywords: income, inequality, livelihood, Parker Paint
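The Gini coefficient used above summarizes the Lorenz curve as a single number between 0 (perfect equality) and 1 (extreme inequality). A minimal sketch of its computation from raw incomes (the income vectors are illustrative, not survey data):

```python
def gini(incomes):
    """Gini coefficient from individual incomes, computed via the
    sorted-income (Lorenz curve) formulation:
    G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, i = 1..n."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

print(round(gini([100, 100, 100, 100]), 3))  # perfect equality -> 0.0
print(round(gini([0, 0, 0, 400]), 3))        # one earner holds all -> 0.75
```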
Procedia PDF Downloads 124
4383 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, this paper establishes a specific three-level scheme of data rights. The paper analyzes the cases of Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN and Imerman v Tchenquiz, and concludes that recognizing property rights over personal data and protecting data within the framework of intellectual property will be beneficial in establishing the tort of misuse of personal information.
Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 39
4382 Building Atmospheric Moisture Diagnostics: Environmental Monitoring and Data Collection
Authors: Paula Lopez-Arce, Hector Altamirano, Dimitrios Rovas, James Berry, Bryan Hindle, Steven Hodgson
Abstract:
Efficient mould remediation and accurate diagnosis of the moisture conditions leading to condensation and mould growth in dwellings remain largely untapped. A number of factors contribute to the rising trend of excessive moisture in homes, mainly linked with modern living: increased levels of occupation and rising fuel costs, as well as making homes more energy efficient. Environmental monitoring by means of data collection through logger sensors and survey forms has been performed in a range of buildings from different UK regions. Air and surface temperature and relative humidity values of residential areas affected by condensation and/or mould issues were recorded. Additional measurements were taken through different trials, changing the type, location, and position of the loggers. In some instances, IR thermal images and ventilation rates have also been acquired. Results have been interpreted together with key environmental parameters by processing and connecting data from loggers and survey questionnaires, both in buildings with and without moisture issues. Monitoring exercises carried out during winter and spring show the importance of developing and following accurate guidance protocols in order to obtain consistent, repeatable and comparable results and to improve the performance of environmental monitoring. A model and a protocol are being developed to build a diagnostic tool with the goal of performing a simple but precise residential atmospheric moisture diagnosis that distinguishes the cause of condensation and mould generation, i.e., a ventilation, insulation or heating system issue. This research shows the relevance of monitoring and processing environmental data to assign moisture risk levels and determine the origin of condensation or mould when dealing with excess atmospheric moisture in a building.
Keywords: environmental monitoring, atmospheric moisture, protocols, mould
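The core of such a diagnostic is relating logged air temperature and relative humidity to surface temperature. As a minimal sketch (not the project's actual model), the Magnus approximation of the dew point can flag logger readings where a surface sits at or below the air's dew point and condensation becomes likely; the constants and thresholds here are standard textbook values, not values from the study:

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (deg C) via the Magnus formula."""
    b, c = 17.62, 243.12  # standard Magnus constants for water over -45..60 C
    gamma = math.log(rh_percent / 100.0) + (b * temp_c) / (c + temp_c)
    return (c * gamma) / (b - gamma)

def condensation_risk(surface_temp_c: float, air_temp_c: float, rh_percent: float) -> bool:
    """Flag a reading when the surface is at or below the air's dew point."""
    return surface_temp_c <= dew_point_c(air_temp_c, rh_percent)

# Example logger reading: 20 C air at 70% RH against a 12 C wall surface.
print(round(dew_point_c(20.0, 70.0), 1))    # ~14.4 C
print(condensation_risk(12.0, 20.0, 70.0))  # True: surface below dew point
```

A real diagnostic tool would combine many such readings over time with ventilation and insulation data, but this is the basic per-reading check.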
Procedia PDF Downloads 139
4381 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review
Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha
Abstract:
Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is well established, yet proper classification of this textual information in a given context remains very difficult. As a result, we conducted a systematic review of previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly classify social media text of a given context as hate speech or inverted compliments with a high level of accuracy. We evaluated over 250 articles from digital sources like ScienceDirect, ACM, Google Scholar, and IEEE Xplore and whittled the number down to 31 studies. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources like Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-based library functionalities. Based on some of the important findings from this study, we make recommendations for future research.
Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text
Procedia PDF Downloads 115
4380 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception
Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu
Abstract:
Opinion mining (OM) is one of the natural language processing (NLP) problems to determine the polarity of opinions, mostly represented on a positive-neutral-negative axis. The data for OM is usually collected from various social media platforms. In an era where social media has considerable control over companies’ futures, it’s worth understanding social media and taking actions accordingly. OM comes to the fore here as the scale of the discussion about companies increases, and it becomes unfeasible to gauge opinion on individual levels. Thus, the companies opt to automize this process by applying machine learning (ML) approaches to their data. For the last two decades, OM or sentiment analysis (SA) has been mainly performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to a bag of n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. Transfer learning paradigm that has been commonly used in computer vision (CV) problems started to shape NLP approaches and language models (LM) lately. This gave a sudden rise to the usage of the pretrained language model (PTM), which contains language representations that are obtained by training it on the large datasets using self-supervised learning objectives. The PTMs are further fine-tuned by a specialized downstream task dataset to produce efficient models for various NLP tasks such as OM, NER (Named-Entity Recognition), Question Answering (QA), and so forth. In this study, the traditional and modern NLP approaches have been evaluated for OM by using a sizable corpus belonging to a large private company containing about 76,000 comments in Turkish: SVM with a bag of n-grams, and two chosen pre-trained models, multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT). 
The MUSE model is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. BERT, a monolingual model in our case, is based on transformer neural networks; it uses a masked language model and a next-sentence prediction task that allow bidirectional training of the transformers. During the training phase, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since experiments showed that their contribution to model performance was insignificant even though Turkish is a highly agglutinative and inflective language. The results show that deep learning methods with pre-trained models and fine-tuning achieve about an 11% improvement over SVM for OM. The BERT model achieved around 94% prediction accuracy, while the MUSE model achieved around 88% and SVM around 83%. The MUSE multilingual model shows better results than SVM, but it still performs worse than the monolingual BERT model.
Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish
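The SVM baseline above rests on a bag of n-gram representation of each comment. A minimal pure-Python sketch of that featurization step (illustrative only; the study's actual tokenization and feature pipeline are not described at this level of detail):

```python
from collections import Counter

def bag_of_ngrams(text: str, n_values=(1, 2)) -> Counter:
    """Count word n-grams (here unigrams and bigrams) in a whitespace-tokenized text."""
    tokens = text.lower().split()
    counts = Counter()
    for n in n_values:
        for i in range(len(tokens) - n + 1):
            counts[" ".join(tokens[i:i + n])] += 1
    return counts

features = bag_of_ngrams("the service was not good not good at all")
print(features["not good"])  # the bigram "not good" occurs twice
```

Each comment's Counter would then be mapped into a sparse vector (one dimension per n-gram in the vocabulary) before being fed to the SVM.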
Procedia PDF Downloads 146
4379 Validation of Escherichia coli O157:H7 Inactivation on Apple-Carrot Juice Treated with Manothermosonication by Kinetic Models
Authors: Ozan Kahraman, Hao Feng
Abstract:
Several models, such as the Weibull, Modified Gompertz, Biphasic linear, and Log-logistic models, have been proposed to describe non-linear inactivation kinetics and have been used to fit non-linear inactivation data for several microorganisms inactivated by heat, high pressure processing or pulsed electric fields. By contrast, most ultrasonic inactivation studies have employed first-order kinetic parameters (D-values and z-values) to describe the reduction in microbial survival counts under non-thermal processing methods such as ultrasound. This study was conducted to analyze E. coli O157:H7 inactivation data using five microbial survival models (First-order, Weibull, Modified Gompertz, Biphasic linear and Log-logistic). These kinetic models were fitted to inactivation curves of Escherichia coli O157:H7, and the residual sum of squares and the total sum of squares criteria were used to evaluate the models. The statistical indices of the kinetic models were used to assess the fits to inactivation data for E. coli O157:H7 treated by MTS at three temperatures (40, 50, and 60 °C) and three pressures (100, 200, and 300 kPa). Based on the statistical indices and visual observations, the Weibull and Biphasic models fitted the MTS treatment data best, as shown by their high R² values. The non-linear kinetic models, including the Modified Gompertz, First-order, and Log-logistic models, did not provide a better fit to the MTS data than the Weibull and Biphasic models. The data found in this study did not follow first-order kinetics, possibly because cells sensitive to the ultrasound treatment were inactivated first, resulting in a fast initial inactivation period, while those resistant to ultrasound were killed slowly. 
The Weibull and Biphasic models were thus found to be more flexible in describing the survival curves of E. coli O157:H7 treated by MTS in apple-carrot juice.
Keywords: Weibull, Biphasic, MTS, kinetic models, E. coli O157:H7
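The Weibull survival model used above, log10(N/N0) = -(t/δ)^p, can be fitted by linearization: with r = -log10(N/N0), taking logs gives ln(r) = p·ln(t) - p·ln(δ), a straight line in ln(t). A minimal sketch with synthetic data (not the study's measurements):

```python
import math

def fit_weibull(times, log10_reductions):
    """Fit the Weibull survival model log10(N/N0) = -(t/delta)^p by
    ordinary least squares on the linearized form
    ln(r) = p*ln(t) - p*ln(delta), where r = -log10(N/N0) > 0."""
    xs = [math.log(t) for t in times]
    ys = [math.log(r) for r in log10_reductions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    p = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    delta = math.exp(mx - my / p)  # intercept = -p * ln(delta)
    return p, delta

# Synthetic survival data generated with p = 1.5, delta = 2 (arbitrary time units)
times = [1.0, 2.0, 4.0, 8.0]
reductions = [(t / 2.0) ** 1.5 for t in times]
p, delta = fit_weibull(times, reductions)
print(round(p, 3), round(delta, 3))  # recovers 1.5 and 2.0
```

In practice the fit would be evaluated with the residual and total sums of squares mentioned in the abstract; for noisy data a non-linear least squares routine would normally replace the linearized fit.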
Procedia PDF Downloads 366
4378 Digi-Buddy: A Smart Cane with Artificial Intelligence and Real-Time Assistance
Authors: Amaladhithyan Krishnamoorthy, Ruvaitha Banu
Abstract:
Vision is considered the most important sense in humans, without which leading a normal life can often be difficult. There are many existing smart canes for the visually impaired with obstacle detection using an ultrasonic transducer to help them navigate. Though the basic smart cane increases the safety of its users, it does not help in filling the void of visual loss. This paper introduces the concept of Digi-Buddy, an evolved smart cane for the visually impaired. The cane consists of several modules. Apart from the basic obstacle detection features, Digi-Buddy assists the user by capturing video/images with a wide-angled camera and streaming them to a server, which detects the objects using a deep convolutional neural network. In addition to determining what the particular image/object is, the distance to the object is assessed by the ultrasonic transducer. A sound generation application, modelled with the help of natural language processing, is used to convert the processed images/objects into audio. The detected object is signified by its name, which is transmitted to the user through Bluetooth earphones. Object detection is extended to facial recognition, which maps the faces of the people the user meets against a database of face images and alerts the user about the person. Another crucial function is an automatic intimation alarm, which is triggered when the user is in an emergency. If the user recovers within a set time, a button provisioned in the cane stops the alarm; otherwise, an automatic intimation with the whereabouts of the user, obtained via GPS, is sent to friends and family. In addition to the safety and security offered by existing smart canes, the proposed concept, to be implemented as a prototype, helps the visually impaired visualize their surroundings through audio in a more amicable way.
Keywords: artificial intelligence, facial recognition, natural language processing, internet of things
Procedia PDF Downloads 355
4377 Paramedic Strength and Flexibility: Findings of a 6-Month Workplace Exercise Randomised Controlled Trial
Authors: Jayden R. Hunter, Alexander J. MacQuarrie, Samantha C. Sheridan, Richard High, Carolyn Waite
Abstract:
Workplace exercise programs have been recommended to improve the musculoskeletal fitness of paramedics with the aim of reducing injury rates, and while they have shown efficacy in other occupations, to the best of our knowledge they have not been delivered and evaluated in Australian paramedics. This study investigated the effectiveness of a 6-month workplace exercise program (MedicFit; MF) in improving paramedic fitness with or without health coach (HC) support. A group of regional Australian paramedics (n=76; 43 male; mean ± SD 36.5 ± 9.1 years; BMI 28.0 ± 5.4 kg/m²) were randomised at the station level to exercise with remote health coach support (MFHC; n=30), exercise without health coach support (MF; n=23), or no-exercise control (CON; n=23) groups. MFHC and MF participants received a 6-month, low-moderate intensity resistance and flexibility exercise program to be performed on station without direct supervision. Available exercise equipment included dumbbells, resistance bands, Swiss balls, medicine balls, kettlebells, BOSU balls, yoga mats, and foam rollers. MFHC and MF participants were also provided with a comprehensive exercise manual including sample exercise sessions aimed at improving musculoskeletal strength and flexibility, with exercise prescription (i.e. sets, reps, duration, load). Changes to upper-body (push-ups), lower-body (wall squat) and core (plank hold) strength and flexibility (back scratch and sit-and-reach tests) after the 6-month intervention were analysed using repeated measures ANOVA to compare changes between groups and over time. Upper-body (+20.6%; p < 0.01; partial eta squared = 0.34 [large effect]) and lower-body (+40.8%; p < 0.05; partial eta squared = 0.08 [moderate effect]) strength increased significantly, with no interaction or group effects. 
Changes to core strength (+1.4%; p=0.17) and both upper-body (+19.5%; p=0.56) and lower-body (+3.3%; p=0.15) flexibility were non-significant with no interaction or group effects observed. While upper- and lower-body strength improved over the course of the intervention, providing a 6-month workplace exercise program with or without health coach support did not confer any greater strength or flexibility benefits than exercise testing alone (CON). Although exercise adherence was not measured, it is possible that participants require additional methods of support such as face-to-face exercise instruction and guidance and individually-tailored exercise programs to achieve adequate participation and improvements in musculoskeletal fitness. This presents challenges for more remote paramedic stations without regular face-to-face access to suitably qualified exercise professionals, and future research should investigate the effectiveness of other forms of exercise delivery and guidance for these paramedic officers such as remotely-facilitated digital exercise prescription and monitoring.Keywords: workplace exercise, paramedic health, strength training, flexibility training
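For reference, the partial eta squared statistic reported above expresses an effect's sum of squares as a share of the effect-plus-error variance. A one-line sketch with illustrative sums of squares (the values are made up for illustration, not taken from the trial):

```python
def partial_eta_squared(ss_effect: float, ss_error: float) -> float:
    """Partial eta squared: SS_effect / (SS_effect + SS_error),
    the effect's share of effect-plus-error variance in an ANOVA."""
    return ss_effect / (ss_effect + ss_error)

# Illustrative: an effect accounting for 34 of 100 effect-plus-error sums of squares
print(partial_eta_squared(34.0, 66.0))  # 0.34, a "large" effect by common convention
```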
Procedia PDF Downloads 139
4376 How to Ensure Environmental Sustainability and Food Security through the Use of Payments for Environmental Services in Developing Countries
Authors: Carlos Alves
Abstract:
This research paper demonstrates how payments for environmental services (PES) can be an effective mechanism to combat food insecurity and reduce environmental degradation in developing countries. The paper begins by discussing how environmental services affect each of the pillars of food security: availability, access, and utilization of food. However, due to numerous global environmental challenges, a new pillar of food security based on environmental sustainability is proposed and discussed. An argument is then made that PES can usefully combat food insecurity: it can provide extra income to those who take on environmental services and help them gain better access to food. In order to successfully address food insecurity, PES schemes should target the poor and redress issues that can prevent their effectiveness. Finally, the research presents a case study that discusses how several developing countries addressed these problems and successfully developed PES programs.
Keywords: environmental sustainability, food security, nutrition, payments for environmental services
Procedia PDF Downloads 392
4375 Audio-Visual Co-Data Processing Pipeline
Authors: Rita Chattopadhyay, Vivek Anand Thoutam
Abstract:
Speech is the most acceptable means of communication, through which we can quickly exchange our feelings and thoughts. Quite often, people can communicate orally but cannot interact or work with computers or devices. It is easier and quicker to give speech commands than to type commands to computers; in the same way, it is easier to listen to audio played from a device than to extract output from computers or devices. Especially with robotics being an emerging market with applications in warehouses, the hospitality industry, consumer electronics, assistive technology, etc., speech-based human-machine interaction is emerging as a lucrative feature for robot manufacturers. Considering this factor, the objective of this paper is to design an “Audio-Visual Co-Data Processing Pipeline.” This pipeline is an integrated version of automatic speech recognition, a natural language model for text understanding, object detection, and text-to-speech modules. There are many deep learning models for each of the modules mentioned above, but OpenVINO Model Zoo models are used here because the OpenVINO toolkit covers both computer vision and non-computer vision workloads across Intel hardware, maximizes performance, and accelerates application development. A speech command is given as input that carries information about the target objects to be detected and the start and end times for extracting the required interval from the video. Speech is converted to text using the QuartzNet automatic speech recognition model. A summary is extracted from the text using the natural language model Generative Pre-Trained Transformer-3 (GPT-3). Based on the summary, essential frames are extracted from the video, and the You Only Look Once (YOLO) object detection model detects the target objects in these extracted frames. Frame numbers that contain target objects (the objects specified in the speech command) are saved as text. 
Finally, this text (the frame numbers) is converted to speech using a text-to-speech model and played from the device. This project is developed for 80 YOLO labels, and the user can extract frames based on one or two target labels. The pipeline can easily be extended to more than two target labels by making appropriate changes in the object detection module. The project supports four different speech command formats by including sample examples in the prompt used by the GPT-3 model; based on user preference, one can add a new speech command format by including examples of the respective format in that prompt. This pipeline can be used in many projects, such as human-machine interfaces, human-robot interaction, and surveillance through speech commands. Any object detection project can be upgraded using this pipeline so that one can give speech commands and have the output played from the device.
Keywords: OpenVINO, automatic speech recognition, natural language processing, object detection, text to speech
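The control flow of such a pipeline can be sketched as below. Every stage function here is a hypothetical placeholder standing in for the real model (QuartzNet ASR, GPT-3 summarization, YOLO detection, then TTS); none of those models are actually invoked, and the function names and the command-dict shape are assumptions for illustration:

```python
def transcribe(audio):        # ASR stage (e.g. QuartzNet) - stubbed here
    return "find the frames with a dog between second 2 and second 5"

def summarize(transcript):    # language-model stage (e.g. GPT-3 prompt) - stubbed
    return {"targets": ["dog"], "start_s": 2, "end_s": 5}

def detect(frame, targets):   # object-detection stage (e.g. YOLO) - stubbed
    return frame["label"] in targets

def run_pipeline(audio, frames):
    """Glue logic: command -> time window -> detection -> frame numbers."""
    command = summarize(transcribe(audio))
    return [f["index"] for f in frames
            if command["start_s"] <= f["time_s"] <= command["end_s"]
            and detect(f, command["targets"])]  # these would go to the TTS stage

# One frame per second, with a stubbed per-frame detection label
frames = [{"index": i, "time_s": i, "label": lbl}
          for i, lbl in enumerate(["cat", "dog", "dog", "car", "dog", "dog"])]
print(run_pipeline(None, frames))  # frames 2, 4 and 5 fall in [2, 5] and show a dog
```

The real system would replace each stub with an OpenVINO model call, but the orchestration (filter by time window, then by detected label, then hand frame numbers to TTS) follows the description above.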
Procedia PDF Downloads 80
4374 Some Yield Parameters of Wheat Genotypes
Authors: Shatha A. Yousif, Hatem Jasim, Ali R. Abas, Dheya P. Yousef
Abstract:
To study the effect of cross direction in bread wheat, three hybrid combinations (Babyle 113 × Iratome), (Sawa × Tamose2) and (Al Hashymya × Al Iraq) were tested for plant height, number of tillers/m, number of grains per spike, weight of grains per spike, 1000-grain weight and grain yield. The results revealed that the direction of the cross had a significant effect on the number of grains/spike, tillers/m and grain yield. Grain yield was positively and significantly correlated with 1000-grain weight, number of grains per spike and tillers. Based on the results for heritability and genetic advance, it is suggested that 1000-grain weight, number of grains per spike and tillers should be given emphasis in future wheat yield improvement programs.
Keywords: correlation, genetic advance, heritability, wheat, yield traits
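Heritability and genetic advance, the selection criteria invoked above, are commonly estimated as h² = Vg/Vp (broad-sense heritability) and GA = k·h²·σp, with k ≈ 2.06 at 5% selection intensity. A sketch with illustrative variance components (the numbers are not from this paper):

```python
def broad_sense_heritability(var_genotypic: float, var_phenotypic: float) -> float:
    """h^2 = Vg / Vp: the genotypic share of phenotypic variance."""
    return var_genotypic / var_phenotypic

def genetic_advance(var_genotypic: float, var_phenotypic: float, k: float = 2.06) -> float:
    """GA = k * h^2 * sigma_p; k = 2.06 corresponds to 5% selection intensity."""
    h2 = broad_sense_heritability(var_genotypic, var_phenotypic)
    return k * h2 * var_phenotypic ** 0.5

# Illustrative values, e.g. for 1000-grain weight: Vg = 16, Vp = 25
print(round(broad_sense_heritability(16.0, 25.0), 2))  # 0.64
print(round(genetic_advance(16.0, 25.0), 2))           # 2.06 * 0.64 * 5 = 6.59
```

Traits combining high h² with high expected genetic advance, as reported here for 1000-grain weight, grains per spike and tillers, are the ones worth emphasizing in selection.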
Procedia PDF Downloads 429
4373 Friction Stir Processing of the AA7075T7352 Aluminum Alloy: Microstructures, Mechanical Properties and Texture Characteristics
Authors: Roopchand Tandon, Zaheer Khan Yusufzai, R. Manna, R. K. Mandal
Abstract:
The present work describes the microstructures, mechanical properties, and texture characteristics of friction stir processed AA7075T7352 aluminum alloy. Phases were analyzed with the help of an X-ray diffractometer (XRD) and a transmission electron microscope (TEM), along with a differential scanning calorimeter (DSC). Depth-wise microstructures and dislocation characteristics from the nugget zone of the friction stir processed specimens were studied using bright-field (BF) and weak-beam dark-field (WBDF) TEM micrographs; variation in the microstructures as well as in the dislocation characteristics were the noteworthy features found. XRD analysis displays changes in the chemistry as well as the size of the phases in the nugget and heat affected zones (nugget and HAZ), whereas the base metal (BM) microstructures remain unaffected. High-density dislocations were noticed in the nugget regions of the processed specimen, along with the formation of dislocation contours and tangles. The η′ and η phases, along with the GP zones, were completely dissolved and trapped by the dislocations. Such observations are corroborated by the improved mechanical as well as stress corrosion cracking (SCC) performance. Bulk texture and residual stress measurements were done with a Panalytical Empyrean MRD system using Co-Kα radiation. The nugget zone (NZ) displays compressive residual stress compared to the thermo-mechanically affected (TMAZ) and heat affected zones (HAZ). Typical f.c.c. deformation texture components (e.g. Copper, Brass, and Goss) were seen. Such a phenomenon is attributed to the enhanced hardening as well as other aspects of the mechanical performance of the alloy. Mechanical characterizations were done using tensile tests and an Anton Paar instrumented micro hardness tester. 
An enhancement in the yield strength from 89 MPa to 170 MPa is reported; furthermore, the highest hardness value was recorded in the nugget zone of the processed specimens.
Keywords: aluminum alloy, mechanical characterization, texture characteristics, friction stir processing
Procedia PDF Downloads 107
4372 Professional Learning, Professional Development and Academic Identity of Sessional Teachers: Underpinning Theoretical Frameworks
Authors: Aparna Datey
Abstract:
This paper explores the theoretical frameworks underpinning professional learning, professional development, and academic identity. The focus is on sessional teachers (also called tutors or adjuncts) in architectural design studios, who may be practitioners, masters or doctoral students and academics hired ‘as needed’. Drawing from Schön’s work on reflective practice, learning and developmental theories of Vygotsky (social constructionism and zones of proximal development), informal and workplace learning, this research proposes that sessional teachers not only develop their teaching skills but also shape their identities through their 'everyday' work. Continuing academic staff develop their teaching through a combination of active teaching, self-reflection on teaching, as well as learning to teach from others via formalised programs and informally in the workplace. They are provided professional development and recognised for their teaching efforts through promotion, student citations, and awards for teaching excellence. The teaching experiences of sessional staff, by comparison, may be discontinuous and they generally have fewer opportunities and incentives for teaching development. In the absence of access to formalised programs, sessional teachers develop their teaching informally in workplace settings that may be supportive or unhelpful. Their learning as teachers is embedded in everyday practice applying problem-solving skills in ambiguous and uncertain settings. Depending on their level of expertise, they understand how to teach a subject such that students are stimulated to learn. Adult learning theories posit that adults have different motivations for learning and fall into a matrix of readiness, that an adult’s ability to make sense of their learning is shaped by their values, expectations, beliefs, feelings, attitudes, and judgements, and they are self-directed. 
The level of expertise of sessional teachers depends on their individual attributes and motivations, as well as on their work environment, the good practices they acquire and enhance through their practice, career training and development, the clarity of their role in the delivery of teaching, and other factors. The architectural design studio is ideal for study due to the historical persistence of the vocational learning or apprenticeship model (learning under the guidance of experts) and a pedagogical format using two key approaches: project-based problem solving and collaborative learning. Hence, investigating the theoretical frameworks underlying academic roles and informal professional learning in the workplace would deepen understanding of their professional development and how they shape their academic identities. This qualitative research is ongoing at a major university in Australia, but the growing trend towards hiring sessional staff to teach core courses in many disciplines is a global one. This research will contribute to including transient sessional teachers in the discourse on institutional quality, effectiveness, and student learning.Keywords: academic identity, architectural design learning, pedagogy, teaching and learning, sessional teachers
Procedia PDF Downloads 124
4371 Innovative Business Education Pedagogy: A Case Study of Action Learning at NITIE, Mumbai
Authors: Sudheer Dhume, T. Prasad
Abstract:
There are distinct signs of business education losing its sheen, more so in developing countries. One of the reasons is that the value added at the end of a 2-year MBA program does not match the requirements of present times and the expectations of the students. Against this backdrop, pedagogy innovation has become a prerequisite for making our MBA programs relevant and useful. This paper is a description and analysis of the innovative action learning pedagogical approach adopted by a group of faculty members at NITIE Mumbai. It not only promotes multidisciplinary research but also enhances the integration of functional-area skill sets in the students. The paper discusses the theoretical bases of this pedagogy and evaluates its effectiveness vis-à-vis conventional pedagogical tools. The evaluation research, using the framework of Bloom's taxonomy, showed that this blended method of business education is much superior to conventional pedagogy.
Keywords: action learning, Bloom's taxonomy, business education, innovation, pedagogy
Procedia PDF Downloads 270
4370 Quality of Life Responses of Students with Intellectual Disabilities Entering an Inclusive, Residential Post-Secondary Program
Authors: Mary A. Lindell
Abstract:
Adults with intellectual disabilities (ID) are increasingly attending postsecondary institutions, including inclusive residential programs at four-year universities. The legislation, national organizations, and researchers support developing postsecondary education (PSE) options for this historically underserved population. Simultaneously, researchers are assessing the quality of life indicators (QOL) for people with ID. This study explores the quality of life characteristics for individuals with ID entering a two-year PSE program. A survey aligned with the PSE program was developed and administered to participants before they began their college program (in future studies, the same survey will be administered 6 months and 1 year after graduating). Employment, income, and housing are frequently cited QOL measures. People with disabilities, and especially people with ID, are more likely to experience unemployment and low wages than people without disabilities. PSE improves adult outcomes (e.g., employment, income, housing) for people with and without disabilities. Similarly, adults with ID who attend PSE are more likely to be employed than their peers who do not attend PSE; however, adults with ID are least likely among their typical peers and other students with disabilities to attend PSE. There is increased attention to providing individuals with ID access to PSE and more research is needed regarding the characteristics of students attending PSE. This study focuses on the participants of a fully residential two-year program for individuals with ID. Students earn an Applied Skills Certificate while focusing on five benchmarks: self-care, home care, relationships, academics, and employment. To create a QOL measure, the goals of the PSE program were identified, and possible assessment items were initially selected from the National Core Indicators (NCI) and the National Transition Longitudinal Survey 2 (NTLS2) that aligned with the five program goals. 
Program staff and advisory committee members offered input on potential item alignment with program goals and expected value to students with ID in the program. National experts in researching QOL outcomes of people with ID were consulted and concurred that the selected items would be useful in measuring the outcomes of postsecondary students with ID. The measure was piloted, modified, and administered to incoming students with ID. Research questions: (1) In what ways are students with ID entering a two-year PSE program similar to individuals with ID who completed the NCI and NTLS2 surveys? (2) In what ways are students with ID entering a two-year PSE program different from individuals with ID who completed the NCI and NTLS2 surveys? The process of developing a QOL measure specific to a PSE program for individuals with ID revealed that many of the items in comprehensive national QOL measures are not relevant to stakeholders of this two-year residential inclusive PSE program. Specific responses of students with ID entering an inclusive PSE program will be presented, as well as a comparison to similar items on national QOL measures. This study explores the characteristics of students with ID entering a residential, inclusive PSE program. This information is valuable for researchers, educators, and policymakers as PSE programs become more accessible to individuals with ID.
Keywords: intellectual disabilities, inclusion, post-secondary education, quality of life
Procedia PDF Downloads 99
4369 Detecting Hate Speech and Cyberbullying Using Natural Language Processing
Authors: Nádia Pereira, Paula Ferreira, Sofia Francisco, Sofia Oliveira, Sidclay Souza, Paula Paulino, Ana Margarida Veiga Simão
Abstract:
Social media has become a platform for hate speech among its users, and thus there is an increasing need to develop automatic classifiers that detect offense and conflict and help decrease the prevalence of such incidents. Online communication can be used to intentionally harm someone, which is why such classifiers could be essential in social networks. A possible application of these classifiers is the automatic detection of cyberbullying. Even though identifying the aggressive language used in online interactions could be important to build cyberbullying datasets, there are other criteria that must be considered. Being able to capture language that indicates intent to harm others in a specific context of online interaction is fundamental. Offense and hate speech may be the foundation of online conflicts, which have become common in social media and are an emergent research focus in machine learning and natural language processing. This study presents two Portuguese-language offense-related datasets, which serve as examples for future research and extend the study of the topic. The first, entitled the Aggressiveness dataset, is similar to other offense-detection datasets. The second, entitled the Conflicts/Attacks dataset, is novel in its use of the history of the interaction between users. Both datasets were developed in different phases. Firstly, we performed a content analysis of verbal aggression witnessed by adolescents in situations of cyberbullying. Secondly, we computed frequency analyses from the previous phase to gather lexical and linguistic cues used to identify potentially aggressive conflicts and attacks posted on Twitter. Thirdly, thorough annotation of real tweets was performed by independent postgraduate educational psychologists with experience in cyberbullying research.
Lastly, we benchmarked these datasets with several machine learning classifiers.
Keywords: aggression, classifiers, cyberbullying, datasets, hate speech, machine learning
Procedia PDF Downloads 228
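The benchmarking step described in the abstract above can be illustrated with a minimal sketch: a from-scratch multinomial Naive Bayes baseline over bag-of-words features, the kind of simple classifier often used as a first benchmark on offense-annotated text. This is an illustrative example, not the authors' pipeline; the texts, labels, and function names below are invented toy stand-ins for annotated tweets.

```python
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(docs, labels):
    """Count tokens per class; return the counts needed for prediction."""
    word_counts = {lab: Counter() for lab in set(labels)}
    class_counts = Counter(labels)
    vocab = set()
    for doc, lab in zip(docs, labels):
        tokens = tokenize(doc)
        word_counts[lab].update(tokens)
        vocab.update(tokens)
    return word_counts, class_counts, vocab

def predict(model, doc):
    """Pick the class with the highest add-one-smoothed log-probability."""
    word_counts, class_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_logp = None, float("-inf")
    for lab in class_counts:
        logp = math.log(class_counts[lab] / total_docs)  # class prior
        denom = sum(word_counts[lab].values()) + len(vocab)  # smoothing
        for tok in tokenize(doc):
            logp += math.log((word_counts[lab][tok] + 1) / denom)
        if logp > best_logp:
            best_label, best_logp = lab, logp
    return best_label

# Invented toy examples standing in for annotated posts.
docs = [
    "you are stupid and worthless",
    "shut up idiot",
    "nobody likes you loser",
    "have a great day",
    "see you at school tomorrow",
    "thanks for the help",
]
labels = ["aggressive"] * 3 + ["neutral"] * 3

model = train(docs, labels)
print(predict(model, "you stupid idiot"))         # aggressive
print(predict(model, "thanks see you tomorrow"))  # neutral
```

In a real benchmark the toy lists would be replaced by the annotated datasets, with held-out test splits and standard metrics (precision, recall, F1) rather than spot predictions.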