3400 Influence of Processing Parameters in Selective Laser Melting on the Microstructure and Mechanical Properties of Ti/TiN Composites With in-situ and ex-situ Reinforcement
Authors: C. Sánchez de Rojas Candela, A. Riquelme, P. Rodrigo, M. D. Escalera-Rodríguez, B. Torres, J. Rams
Abstract:
Selective laser melting (SLM) is one of the most commonly used additive manufacturing (AM) techniques. In it, a thin layer of metallic powder is deposited, and a laser is used to melt selected zones. The accumulation of layers, each one molten in the preselected zones, gives rise to a 3D sample with a nearly arbitrary design. To ensure that the properties of the final parts match those of the powder, the whole process is carried out in an inert atmosphere, preferentially Ar, although this gas can be substituted. The Ti6Al4V alloy is widely used in multiple industrial applications, such as aerospace, maritime transport, and biomedicine, due to its properties. However, the demanding requirements of these applications call for greater hardness and wear resistance, together with better machinability, which currently limits its commercialization. To improve these properties, in this study, SLM is used to manufacture Ti/TiN metal matrix composites with in-situ and ex-situ titanium nitride reinforcement, where the scanning speed is modified (from 28.5 up to 65 mm/s) to study the influence of the processing parameters in SLM. A one-step method of nitriding the Ti6Al4V alloy in a reactive atmosphere is carried out to create in-situ TiN reinforcement, and it is compared with ex-situ composites manufactured by prior mixing of the titanium alloy powder and the ceramic reinforcement particles. The microstructure and mechanical properties of the different Ti/TiN composite materials have been analyzed. As a result, the existence of a similar matrix has been confirmed in in-situ and ex-situ fabrications, and the growth mechanisms of the nitrides have been studied. An increase in the mechanical properties with respect to the initial alloy has been observed in both cases and related to changes in their microstructure.
Specifically, a greater improvement (around 30.65%) has been identified in the composites manufactured by the in-situ method at low speeds, although other properties, such as porosity, must be improved for future industrial applicability.
Keywords: in-situ reinforcement, nitriding reaction, selective laser melting, titanium nitride
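The scanning-speed range reported above lends itself to a quick back-of-the-envelope check via the volumetric energy density E = P/(v·h·t), a lumped parameter commonly used to relate SLM scan speed to melt quality. The laser power, hatch spacing, and layer thickness below are illustrative assumptions, not values reported in the abstract:

```python
# Volumetric energy density (VED) for SLM: E = P / (v * h * t).
# Power (W), hatch spacing (mm) and layer thickness (mm) are assumed
# illustrative values; only the scan speeds come from the abstract.

def energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
    """Return volumetric energy density in J/mm^3."""
    return power_w / (speed_mm_s * hatch_mm * layer_mm)

for v in (28.5, 40.0, 65.0):  # scan speeds in the studied range (mm/s)
    e = energy_density(power_w=100.0, speed_mm_s=v, hatch_mm=0.1, layer_mm=0.03)
    print(f"v = {v:5.1f} mm/s  ->  E = {e:7.1f} J/mm^3")
```

The calculation makes concrete why low scan speeds deposit more energy per unit volume, consistent with the abstract's observation that the in-situ nitriding reaction is strongest at low speeds.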
Procedia PDF Downloads 79
3399 Examining the Influence of Firm Internal Level Factors on Performance Variations among Micro and Small Enterprises: Evidence from Tanzanian Agri-Food Processing Firms
Authors: Pulkeria Pascoe, Hawa P. Tundui, Marcia Dutra de Barcellos, Hans de Steur, Xavier Gellynck
Abstract:
A majority of Micro and Small Enterprises (MSEs) experience low or no growth. Understanding of their performance remains incomplete and disjointed, as there is no consensus on the factors influencing it, especially in developing countries. Using the Resource-Based View (RBV) as the theoretical background, this cross-sectional study employed four regression models to examine the influence of firm-level factors (firm-specific characteristics, firm resources, manager socio-demographic characteristics, and selected management practices) on the overall performance variations among 442 Tanzanian micro and small agri-food processing firms. Study results confirmed the RBV argument that intangible resources make a larger contribution to overall performance variations among firms than tangible resources do. Firms' tangible and intangible resources together explained 34.5% of overall performance variations (intangible resources explained 19.4% of the overall performance variability, compared to 15.1% for tangible resources), ranking first in explaining the overall performance variance. Firm-specific characteristics ranked second, influencing variations in overall performance by 29.0%. Selected management practices ranked third (6.3%), while the manager's socio-demographic factors were last on the list, influencing the overall performance variability among firms by only 5.1%. The study also found that firms that focus on proper utilization of tangible resources (financial and physical), set targets, and undertake better working capital management practices performed higher than their counterparts (low and average performers).
Furthermore, accumulation and proper utilization of intangible resources (relational, organizational, and reputational), undertaking performance monitoring practices, the age of the manager, and the choice of firm location and activity were the dominant significant factors influencing the variations among average and high performers, relative to low performers. Entrepreneurial background was a significant factor influencing variations among average and low-performing firms, indicating that entrepreneurial skills are crucial to achieving average levels of performance. Firm age, size, legal status, source of start-up capital, and the gender, education level, and total business experience of the manager were not statistically significant variables influencing the overall performance variations among the agri-food processors under study. The study has identified both significant and non-significant factors influencing performance variations among low-, average-, and high-performing micro and small agri-food processing firms in Tanzania. Results from this study will therefore help managers, policymakers, and researchers identify areas where more attention should be placed in order to improve the overall performance of MSEs in the agri-food industry.
Keywords: firm-level factors, micro and small enterprises, performance, regression analysis, resource-based view
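The kind of variance decomposition reported above can be illustrated with a toy calculation: fit simple one-predictor regressions on synthetic data in which intangible resources carry a larger weight, and compare the resulting R-squared values. All numbers, weights, and variable names below are fabricated for illustration and are not the study's data:

```python
import random

def r_squared(x, y):
    # For a simple (one-predictor) OLS fit, R^2 equals the squared
    # Pearson correlation between predictor and outcome.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

random.seed(42)
n = 442  # sample size reported in the abstract
intangible = [random.gauss(0, 1) for _ in range(n)]
tangible = [random.gauss(0, 1) for _ in range(n)]
# Toy performance index in which intangible resources carry more weight,
# mirroring only the direction (not the magnitudes) of the findings.
performance = [0.6 * i + 0.4 * t + random.gauss(0, 0.8)
               for i, t in zip(intangible, tangible)]

print("R^2 intangible:", round(r_squared(intangible, performance), 3))
print("R^2 tangible:  ", round(r_squared(tangible, performance), 3))
```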
Procedia PDF Downloads 86
3398 A Mixed Method Design to Study the Effects of Lean Production on Job Satisfaction and Health at Work in a French Context
Authors: Gregor Bouville, Celine Schmidt
Abstract:
This article presents a French case study on lean production, drawing on a mixed method design that has received little attention in French management research, especially in French human resources research. The purpose is to show that using a mixed method approach in this particular case oversteps the limitations of previous lean production studies. The authors use the embedded design, a special articulation of mixed methods, to analyse and understand the effects of three organizational practices on job satisfaction and workers' health. Results show that low scheduled autonomy, quality management, and time constraints have deleterious effects on job satisfaction. Furthermore, these three practices have ambivalent effects on workers' health. Interest in mixed methods has been growing among French health researchers and practitioners, and recently also among French management researchers. This study reinforces and refines how mixed methods may offer interesting perspectives in an integrated framework spanning the human resources, management, and health fields. Finally, potential benefits and limits of such interdisciplinary research programs are discussed.
Keywords: lean production, mixed method, work organization practices, job satisfaction
Procedia PDF Downloads 359
3397 Factors Affecting Transportation Services in Addis Ababa City
Authors: Yared Yitagesu Tilahun
Abstract:
Every nation, developed or developing, relies on transportation, but Addis Ababa City's transportation service is affected by a number of variables. The current study's objectives are to determine the factors that influence transportation and to gauge customer satisfaction with such services in Addis Ababa. Customers and employees of Addis Ababa's transportation service authority were the study's target group: the population comprises approximately 310,000 clients of the service and 40 workers of the authority. Using a simple random selection technique, the researcher chose only 99 customers and 28 staff from this large group, due to the considerable cost and time involved. Data gathering and analysis drew on both quantitative and qualitative approaches. The results of this survey show that young people between the ages of 18 and 25 make up the majority of respondents (51.6%). The majority of employees and customers indicated that they are not satisfied with Addis Ababa's overall transportation system. The Addis Ababa Transportation Authority should prioritize client satisfaction by providing fair service. The authority should have a system in place for managing time, resources, and people effectively, and should also give employees the opportunity to contribute to client handling policies.
Keywords: transportation, customer satisfaction, services, determinants
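The 99-customer sample drawn above is close to what a common sample-size rule gives for a very large population. The abstract does not state which formula was used, so the following is only a plausible sketch using Slovin's formula, n = N / (1 + N·e²), with an assumed 10% margin of error:

```python
def slovin(population, margin_of_error):
    """Slovin's sample-size formula: n = N / (1 + N * e^2)."""
    return population / (1 + population * margin_of_error ** 2)

# ~310,000 clients with a 10% margin of error gives a sample of about 100,
# close to the 99 customers actually surveyed.
print(round(slovin(310_000, 0.10)))  # -> 100
```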
Procedia PDF Downloads 124
3396 Processing Big Data: An Approach Using Feature Selection
Authors: Nikat Parveen, M. Ananthi
Abstract:
Big data is one of the emerging technologies; it collects data from various sensors, and those data are used in many fields. Data retrieval is one of the major issues, as the exact data must be extracted as needed. In this paper, a large data set is processed using feature selection. Feature selection helps to choose the data that are actually needed to process and execute the task. The key value is the one that points out the exact data available in the storage space. Here, the available data is streamed, and R-Center is proposed to achieve this task.
Keywords: big data, key value, feature selection, retrieval, performance
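A minimal filter-style feature selection, in the spirit of the approach above: score each feature by its absolute correlation with the target and keep the top k. The paper's "R-Center" method is not described in detail, so this only sketches the general idea of pruning a large feature space; the column names, data, and value of k are all illustrative:

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x) ** 0.5
    syy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sxx * syy)

def select_features(columns, target, k):
    """Rank feature columns by |corr(feature, target)| and keep the top k names."""
    scores = {name: abs(pearson(col, target)) for name, col in columns.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

random.seed(0)
n = 500
signal = [random.gauss(0, 1) for _ in range(n)]
columns = {
    "sensor_a": signal,                                  # informative feature
    "sensor_b": [random.gauss(0, 1) for _ in range(n)],  # pure noise
    "sensor_c": [random.gauss(0, 1) for _ in range(n)],  # pure noise
}
target = [s + random.gauss(0, 0.5) for s in signal]
print(select_features(columns, target, k=1))  # -> ['sensor_a']
```

Only the informative column survives the filter, which is the point: downstream processing then operates on the data "actually needed to execute the task."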
Procedia PDF Downloads 341
3395 Human Rights Violation in Modern Society
Authors: Shenouda Salib Hosni Rofail
Abstract:
The interface between development and human rights has long been the subject of scholarly debate. As a result, a set of principles, ranging from the right to development to a human rights-based approach to development, has been adopted to understand the dynamics between the two concepts. Despite these attempts, the exact link between development and human rights is not yet fully understood. However, the inevitable interdependence between the two concepts, and the idea that development efforts must be made while respecting human rights, have gained prominence in recent years. On the other hand, the emergence of sustainable development as a widely accepted approach to development goals and policies further complicates this unresolved convergence. The place of sustainable development in the human rights discourse, and its role in ensuring the sustainability of development programs, require systematic research. The aim of this article is, therefore, to examine the relationship between development and human rights, with a particular focus on the place of the principles of sustainable development in international human rights law, and to examine whether that law recognizes a right to sustainable development. The article argues that the principles of sustainable development are recognized, directly or implicitly, in various human rights instruments, which is an affirmative answer to the question posed above. Accordingly, it scrutinizes international and regional human rights instruments, as well as the case law and interpretations of human rights bodies, to support this hypothesis.
Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers' rights, justice, security
Procedia PDF Downloads 50
3394 Self-Determination Theory at the Workplace: Associations between Need Satisfaction and Employment Outcomes
Authors: Wendy I. E. Wesseling
Abstract:
The unemployment rate has been on the rise since the outbreak of the global financial crisis in 2008, and labor market entrants in particular suffer from the economic downturn. Despite the abundance of programs and agencies that help to reintegrate unemployed youth, considerably less research attention has been paid to the 'fit' between these programs and their participants that ensures a durable labor market transition. According to Self-Determination Theory, need satisfaction is associated with better (mental) adjustment. As such, three hypotheses were formulated: when workers' needs for competence (H1), relatedness (H2), and autonomy (H3) are satisfied in the workplace, they are more likely to remain employed at the same employer. To test these assumptions, a sample of approximately 800 young people enrolled in a youth unemployment policy participated in a longitudinal study. The unemployment policy was aimed at the development of generic and vocational competences and had a maximum duration of six months. Need satisfaction during the program was measured, as well as employment outcomes up to 12 months after completion of the policy. All hypotheses were (partly) supported. Some limitations should be noted. First, since the sample consisted primarily of highly educated white graduates, it remains to be tested whether the results generalize to other groups of unemployed youth. Moreover, because of the lack of a control group, it cannot be concluded whether the results are due to the intervention, the participants (a selection effect), or both.
Keywords: need satisfaction, person-job fit, self-determination theory, youth unemployment policy
Procedia PDF Downloads 255
3393 Verbal Working Memory in Sequential and Simultaneous Bilinguals: An Exploratory Study
Authors: Archana Rao R., Deepak P., Chayashree P. D., Darshan H. S.
Abstract:
Cognitive abilities in bilinguals have been widely studied over the last few decades. Bilingualism has been found to extensively facilitate the ability to store and manipulate information in Working Memory (WM). The mechanism of WM includes primary memory, attentional control, and secondary memory, each of which makes a contribution to WM. Much research has attempted to measure WM capabilities through both verbal (phonological) and nonverbal (visuospatial) tasks. Since there is much speculation regarding the relationship between WM and bilingualism, further investigation is required to understand the nature of WM in bilinguals, i.e., with respect to sequential and simultaneous bilinguals. Hence, the present study aimed to highlight the verbal working memory abilities of sequential and simultaneous bilinguals with respect to the processing and recall of nouns and verbs. Two groups of bilinguals aged between 18-30 years were considered for the study. Group 1 consisted of 20 (10 males and 10 females) sequential bilinguals who had acquired L1 (Kannada) before the age of 3 and had exposure to L2 (English) for a period of 8-10 years. Group 2 consisted of 20 (10 males and 10 females) simultaneous bilinguals who had acquired both L1 and L2 before the age of 3. Working memory abilities were assessed using two tasks with a set of stimuli presented in a gradation of complexity; the stimuli included frequent and infrequent nouns and verbs. The tasks required the participants to judge the correctness of each sentence while simultaneously remembering its last word, and the participants were instructed to recall the words at the end of each set. The results indicated no significant difference between sequential and simultaneous bilinguals in processing nouns and verbs, which could be attributed to the proficiency level of the participants in L1 and the similar cognitive abilities of the two groups.
Recall of nouns was better than that of verbs, possibly because of the more complex argument structure involved in verbs. The authors also found that the frequency of occurrence of nouns and verbs had an effect on WM abilities. Differences were also found across gradation levels, due to the load imposed on the central executive function and the phonological loop.
Keywords: bilinguals, nouns, verbs, working memory
Procedia PDF Downloads 129
3392 Comparison of Tribological and Mechanical Properties of White Metal Produced by Laser Cladding and Conventional Methods
Authors: Jae-Il Jeong, Hoon-Jae Park, Jung-Woo Cho, Yang-Gon Kim, Jin-Young Park, Joo-Young Oh, Si-Geun Choi, Seock-Sam Kim, Young Tae Cho, Chan Gyu Kim, Jong-Hyoung Kim
Abstract:
Bearing components are strongly required to show low vibration and wear in order to achieve high durability and a long lifetime. In industry, bearing durability is improved by surface treatment of the bearing surface using centrifugal casting or gravity casting production methods. However, these manufacturing methods cause problems such as long processing times, high defect rates, and harmful health effects. To solve these problems, laser cladding deposition treatment is available, which provides fast processing and good adhesion. Therefore, optimum conditions for white metal laser deposition should be studied to minimize bearing contact axis wear using laser cladding techniques. In this study, we deposit a soft white metal layer on SCM440, which is mainly used for shafts and bolts. In the laser deposition process, the laser power, powder feed rate, and laser head speed are controlled to find the optimal conditions. We also measure hardness using a micro Vickers tester and perform FE-SEM (Field Emission Scanning Electron Microscopy) and EDS (Energy Dispersive Spectroscopy) analyses to study the mechanical properties and surface characteristics under various parameter changes. Furthermore, this paper suggests optimum laser cladding deposition conditions for application in industrial fields. This work was supported by the Industrial Innovation Project of the Korea Evaluation Institute of Industrial Technology (KEIT), granted financial resources from the Ministry of Trade, Industry & Energy, Republic of Korea (Research no. 10051653).
Keywords: laser deposition, bearing, white metal, mechanical properties
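The micro Vickers hardness mentioned above is computed from the indentation load and the mean diagonal of the indent; the standard relation is HV = 1.8544·F/d², with load F in kgf and mean diagonal d in mm. The load and diagonal values below are illustrative, not measurements from the study:

```python
# Vickers hardness from indentation diagonals: HV = 1.8544 * F / d^2,
# with load F in kgf and mean diagonal d in mm (standard micro-Vickers relation).

def vickers_hv(load_kgf, d1_mm, d2_mm):
    """Return Vickers hardness from load and the two measured diagonals."""
    d = (d1_mm + d2_mm) / 2.0
    return 1.8544 * load_kgf / d ** 2

# e.g. an HV0.3 indent (0.3 kgf) with 0.05 mm diagonals:
print(round(vickers_hv(0.3, 0.05, 0.05), 1))  # -> 222.5
```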
Procedia PDF Downloads 264
3391 Establishment of Precision System for Underground Facilities Based on 3D Absolute Positioning Technology
Authors: Yonggu Jang, Jisong Ryu, Woosik Lee
Abstract:
The study aims to address the limitations of existing underground facility exploration equipment in terms of exploration depth range, relative depth measurement, data processing time, and human-centered ground penetrating radar (GPR) image interpretation. It proposes the use of 3D absolute positioning technology to develop a precision underground facility exploration system that can accurately survey up to a depth of 5 m and measure the 3D absolute location of underground facilities. The study developed both software and hardware technologies to build the precision exploration system. The software technologies developed include absolute positioning technology, ground surface location synchronization technology for GPR exploration equipment, AI-based GPR exploration image interpretation technology, and integrated underground space map-based composite data processing technology. The hardware systems developed include a vehicle-type exploration system and a cart-type exploration system. Data were collected using the developed exploration system, which employs 3D absolute positioning technology. The GPR exploration images were analyzed using AI technology, and the three-dimensional location information of the explored underground facilities was compared to the integrated underground space map. The study successfully developed a precision underground facility exploration system based on 3D absolute positioning technology that accurately surveys up to a depth of 5 m.
The system comprises software technologies that build a precise 3D DEM, synchronize the GPR sensor's ground surface 3D location coordinates, automatically analyze and detect underground facility information in GPR exploration images, and improve accuracy through comparative analysis of the three-dimensional location information, as well as hardware systems, including a vehicle-type exploration system and a cart-type exploration system. The study's findings and technological advancements are essential for underground safety management in Korea. The proposed precision exploration system contributes significantly to establishing precise location information for underground facilities, which is crucial for underground safety management, and it improves the accuracy and efficiency of exploration. In summary, the study addressed the limitations of existing equipment for exploring underground facilities, proposed a precision exploration system based on 3D absolute positioning technology, developed the software and hardware for that system, and contributed to underground safety management by providing precise location information up to a depth of 5 m.
Keywords: 3D absolute positioning, AI interpretation of GPR exploration images, complex data processing, integrated underground space maps, precision exploration system for underground facilities
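GPR depth estimation, at the heart of the exploration system described above, converts the two-way travel time of the radar pulse to depth through the soil's relative permittivity: d = (c/√εr)·t/2. The permittivity value below is an illustrative assumption (moist soil is often taken around εr ≈ 9), not a parameter from the study:

```python
# Depth from GPR two-way travel time: d = v * t / 2, with wave speed
# v = c / sqrt(eps_r) in the soil.
C = 0.2998  # speed of light in vacuum, m/ns

def gpr_depth_m(two_way_time_ns, eps_r):
    """Return target depth in meters from two-way travel time (ns)."""
    v = C / eps_r ** 0.5  # wave speed in the medium, m/ns
    return v * two_way_time_ns / 2.0

# With eps_r = 9 (an assumed value for moist soil), a 100 ns two-way
# travel time corresponds to roughly the system's stated 5 m depth range.
print(round(gpr_depth_m(100.0, 9.0), 2))  # -> 5.0
```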
Procedia PDF Downloads 62
3390 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides a road map for future development in FreqAI.
Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
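The outlier-removal idea described above, using the training data's parameter space to reject anomalous prediction points, can be sketched with a simple per-feature z-score filter. FreqAI's actual implementation offers richer options, so everything below is an illustrative simplification with made-up data, assuming the training features are non-constant:

```python
import statistics

def fit_scaler(train_rows):
    """Per-column (mean, population stdev) statistics of the training data."""
    cols = list(zip(*train_rows))
    return [(statistics.mean(c), statistics.pstdev(c)) for c in cols]

def is_outlier(point, scaler, z_max=3.0):
    """True if any standardized coordinate lies outside +/- z_max.

    Assumes every training column has non-zero spread (no constant features).
    """
    return any(abs((x - m) / s) > z_max for x, (m, s) in zip(point, scaler))

train = [(i * 0.1, 10 + i * 0.05) for i in range(100)]  # dummy training features
scaler = fit_scaler(train)
print(is_outlier((5.0, 12.0), scaler))   # inside the training distribution -> False
print(is_outlier((50.0, 12.0), scaler))  # far outside the parameter space -> True
```

Prediction points flagged this way are exactly the ones for which the model is extrapolating beyond the parameter space it was trained on, which is why discarding them tends to reduce losses during regime changes.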
Procedia PDF Downloads 89
3389 Arabic Light Word Analyser: Roles with Deep Learning Approach
Authors: Mohammed Abu Shquier
Abstract:
This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of morphological analysis tools for the web is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also addresses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which together justify updating this system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. Therefore, it is necessary to have an automatic analysis tool that takes into account the word sense and not only the morpho-syntactic category. Moreover, such systems are also based on statistical/stochastic models. These stochastic models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc.
As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage was very low for dialectal Arabic, it is important to investigate in depth how dialect data influence the accuracy of these approaches, by developing dialectal morphological processing tools, and to show that accounting for dialectal variability can help improve analysis.
Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN
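The "trigram continuation probabilities" mentioned above can be illustrated with a maximum-likelihood estimate from raw counts: P(w3 | w1, w2) = count(w1 w2 w3) / count(w1 w2). The toy corpus is in English purely for readability; nothing here reflects the paper's actual model:

```python
from collections import Counter

def trigram_probs(tokens):
    """Build MLE continuation probabilities P(w3 | w1, w2) from raw counts."""
    tri = Counter(zip(tokens, tokens[1:], tokens[2:]))
    bi = Counter(zip(tokens, tokens[1:]))

    def p(w1, w2, w3):
        # Unseen history -> probability 0 (no smoothing in this sketch).
        return tri[(w1, w2, w3)] / bi[(w1, w2)] if bi[(w1, w2)] else 0.0

    return p

toks = "the cat sat on the mat the cat ran".split()
p = trigram_probs(toks)
# "the cat" occurs twice and continues once with "sat", once with "ran":
print(p("the", "cat", "sat"))  # -> 0.5
```

Real systems add smoothing (or, as the paper proposes, replace counts with an RNN language model) precisely because raw MLE assigns zero probability to unseen continuations.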
Procedia PDF Downloads 42
3388 Intelligent Process and Model Applied for E-Learning Systems
Authors: Mafawez Alharbi, Mahdi Jemmali
Abstract:
E-learning is a developing area, especially in education, and can provide several benefits to learners. An intelligent system that collects all the components satisfying user preferences is therefore important. This research presents an approach capable of personalizing e-information and meeting users' needs according to their preferences. The proposed system can build knowledge from repeated evaluations made by the user; in addition, it can learn from the user's habits. Finally, we show a walk-through to demonstrate how the intelligent process works.
Keywords: artificial intelligence, architecture, e-learning, software engineering, processing
Procedia PDF Downloads 191
3387 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis
Authors: Mehrnaz Mostafavi
Abstract:
The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. 
The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans
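The sentence-level identification step described above can be caricatured with a rule-based filter: flag sentences that mention a nodule unless the mention is negated. This is a deliberately crude sketch (robust clinical NLP needs much stronger negation handling, e.g. NegEx-style scoping), and the patterns and example sentences are invented for illustration:

```python
import re

# Mentions of a pulmonary nodule (singular or plural).
NODULE_PAT = re.compile(r"\bnodules?\b", re.IGNORECASE)
# A negation cue followed, within the same sentence, by a nodule mention.
NEGATION_PAT = re.compile(
    r"\b(no|without|negative for)\b[^.]*\bnodules?\b", re.IGNORECASE
)

def flags_nodule(sentence):
    """Crude sentence-level rule: a nodule mention that is not negated."""
    return bool(NODULE_PAT.search(sentence)) and not NEGATION_PAT.search(sentence)

print(flags_nodule("There is a 6 mm nodule in the right upper lobe."))  # -> True
print(flags_nodule("No suspicious pulmonary nodule is identified."))    # -> False
```

A rule set like this is typically used only to pre-filter candidate sentences; the machine-learning classifiers described in the abstract then operate on the sentence features to predict concerning nodule characteristics.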
Procedia PDF Downloads 101
3386 Multiscale Process Modeling Analysis for the Prediction of Composite Strength Allowables
Authors: Marianna Maiaru, Gregory M. Odegard
Abstract:
During the processing of high-performance thermoset polymer matrix composites, chemical reactions occur under elevated pressure and temperature cycles, causing the constituent monomers to crosslink and form a molecular network that can gradually sustain stress. As the crosslinking process progresses, the material naturally experiences a gradual shrinkage due to the increase in covalent bonds in the network. Once the cured composite completes the cure cycle and is brought to room temperature, the thermal expansion mismatch of the fibers and matrix causes additional residual stresses to form. These compounded residual stresses can compromise the reliability of the composite material and affect the composite strength. Composite process modeling is greatly complicated by the multiscale nature of the composite architecture. At the molecular level, the degree of cure controls the local shrinkage and thermal-mechanical properties of the thermoset. At the microscopic level, the local fiber architecture and packing affect the magnitudes and locations of residual stress concentrations. At the macroscopic level, the layup sequence controls the nature of crack initiation and propagation due to residual stresses. The goal of this research is to use molecular dynamics (MD) and finite element analysis (FEA) to predict the residual stresses in composite laminates and the corresponding effect on composite failure. MD is used to predict the polymer shrinkage and thermomechanical properties as a function of the degree of cure. This information is used as input into FEA to predict the residual stresses on the microscopic level resulting from the complete cure process. Virtual testing is subsequently conducted to predict strength allowables. Experimental characterization is used to validate the modeling.
Keywords: molecular dynamics, finite element analysis, processing modeling, multiscale modeling
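The degree of cure that drives the shrinkage described above is commonly modeled with nth-order cure kinetics, d(alpha)/dt = k*(1 - alpha)^n; the sketch below integrates such a model with forward Euler under isothermal conditions. The rate constant and exponent are illustrative, not fitted to any resin in the study:

```python
# Degree-of-cure evolution under an nth-order kinetics model,
# d(alpha)/dt = k * (1 - alpha)**n, integrated with forward Euler.
# k (1/s) and n are assumed illustrative values for an isothermal hold.

def cure_profile(k=0.01, n=1.5, dt=1.0, t_end=600.0):
    """Return a list of (time_s, alpha) samples, starting from alpha = 0."""
    alpha, t, out = 0.0, 0.0, []
    while t <= t_end:
        out.append((t, alpha))
        alpha += dt * k * (1.0 - alpha) ** n
        t += dt
    return out

profile = cure_profile()
t_final, a_final = profile[-1]
print(f"alpha({t_final:.0f} s) = {a_final:.3f}")
```

In a multiscale workflow of the kind described, an alpha(t) profile like this is what ties the MD-predicted shrinkage and properties (functions of the degree of cure) to the FEA residual-stress prediction at each step of the cure cycle.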
Procedia PDF Downloads 92
3385 Human Development Strengthening against Terrorism in ASEAN East Asia and Pacific: An Econometric Analysis
Authors: Tismazammi Mustafa, Jaharudin Padli
Abstract:
The frequency of terrorism has increased over the years, resulting in loss of life, damage to property, and destruction of the environment. Terrorism is not confined to one particular country but has spread and scattered across others, causing an increase in the number of terrorism cases. Thus, this paper aims to investigate the influence of human development factors on terrorism in East Asia and Pacific countries. The study uses a panel ARDL model, which captures both the long-run and short-run relationships among the variables of interest. A logit model for binary data is also used to represent attributes of the dependent variable. The study focuses on several human development variables, namely GDP per capita, population, human capital, land area, and technology. The empirical findings reveal that GDP per capita, population, human capital, land area, and technology are positive and statistically significant in influencing terrorism. Thus, the findings of this study will serve as grounds to preserve human rights and develop public awareness, and will offer guidelines to policy makers, emergency managers, first responders, public health workers, physicians, and other researchers. Keywords: terrorism, East Asia and Pacific, human development, econometric analysis
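The logit specification for a binary outcome mentioned above can be sketched with a minimal, dependency-light example. The data are synthetic stand-ins for the study's human-development variables (not its panel), and the fitting is plain gradient ascent on the log-likelihood rather than the packaged routines an applied ARDL/logit analysis would normally use.

```python
import numpy as np

# Minimal logit (binary-response) sketch: a single synthetic regressor
# stands in for a standardized human-development variable, and the
# binary outcome stands in for a terrorism-incident indicator.

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                        # synthetic regressor
p_true = 1 / (1 + np.exp(-(0.5 + 1.5 * x)))   # true logit relationship
y = rng.binomial(1, p_true)                   # synthetic binary outcome

X = np.column_stack([np.ones(n), x])          # add an intercept column
beta = np.zeros(2)
for _ in range(2000):                         # gradient ascent on log-likelihood
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / n

print("estimated intercept and slope:", beta)
```

A positive estimated slope corresponds to the paper's finding of a positive, significant association; in practice one would use a statistics package to obtain standard errors and significance tests.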
Procedia PDF Downloads 414
3384 Antibacterial Studies on Cellulolytic Bacteria for Termite Control
Authors: Essam A. Makky, Chan Cai Wen, Muna Jalal, Mashitah M. Yusoff
Abstract:
Termites are considered important pests that cause severe wood damage and economic losses in the urban, agricultural, and forest areas of Malaysia. The ability of termites to degrade cellulose depends on an association with gut cellulolytic microflora, better known as mutual symbionts. By disrupting this mutual symbiotic association, better pest control practices can be attained. This study aimed to isolate cellulolytic bacteria from the gut of termites and carry out antibacterial studies for termite control. Cellulase activity was confirmed by qualitative and quantitative methods. The impacts of antibiotics and their combinations, as well as heavy metals and disinfectants, were assessed using the disc diffusion method. Effective antibacterial agents were then applied in termite treatments to study their effectiveness as termiticides. Twenty-four cellulolytic bacteria were isolated, purified, and screened from the gut of termites. All isolates were identified as Gram-negative and either rod- or coccus-shaped. In the antibacterial studies, isolates were found to be 100% sensitive to 4 antibiotics (rifampicin, tetracycline, gentamycin, and neomycin), 2 heavy metals (cadmium and mercury), and 3 disinfectants (lactic acid, formalin, and hydrogen peroxide). 22 out of 36 antibiotic combinations showed a synergistic effect, while 15 showed an antagonistic effect on the isolates. The 2 heavy metals and 3 disinfectants that showed 100% effectiveness, as well as the 22 antibiotic combinations that showed a synergistic effect, were used for termite control. Among the 27 selected antibacterial agents, 12 were found to be effective, killing all the termites within 1 to 6 days. Mercury, lactic acid, formalin, and hydrogen peroxide were the most effective termiticides, killing all termites within 1 day. These effective antibacterial agents show great potential as a new approach to controlling termite pest species in the future. Keywords: antibacterial, cellulase, termiticide, termites
Procedia PDF Downloads 467
3383 Long Distance Aspirating Smoke Detection for Large Radioactive Areas
Authors: Michael Dole, Pierre Ninin, Denis Raffourt
Abstract:
Most of CERN's facilities hosting particle accelerators are large, underground, and radioactive areas. All fire detection systems installed in such areas must be carefully studied to cope with the particularities of this stringent environment. The detection equipment usually chosen by CERN to secure these underground facilities is based on air sampling technology. The electronic equipment is located in non-radioactive areas, whereas the air sampling networks are deployed in the radioactive areas where fire detection is required. The air sampling technology provides very good detection performance and prevents "radiation-to-electronics" effects. In addition, it reduces maintenance workers' exposure to radiation and is permanently available during accelerator operation. In order to protect the Super Proton Synchrotron and its 7 km of tunnels, a specific long-distance aspirating smoke detector has been developed to detect smoke at up to 700 meters between the electronic equipment and the last air sampling hole. This paper describes the architecture, performance, and return of experience of the long-distance fire detection system developed and installed to secure the CERN Super Proton Synchrotron tunnels. Keywords: air sampling, fire detection, long distance, radioactive areas
Procedia PDF Downloads 161
3382 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. 
However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, granting data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, it establishes a specific three-level structure of data rights. The paper analyzes the cases Google v Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello! Ltd, Campbell v MGN, and Imerman v Tchenguiz, and concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property would help establish the tort of misuse of personal information. Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 39
3381 Simulating Studies on Phosphate Removal from Laundry Wastewater Using Biochar: Dubinin Approach
Authors: Eric York, James Tadio, Silas Owusu Antwi
Abstract:
Laundry wastewater contains a diverse range of chemical pollutants that can have detrimental effects on human health and the environment. In this study, simulations were conducted in Spyder (Python v3.2) to assess the efficacy of biochar in removing PO₄³⁻ from wastewater. Through modeling and simulation, the mechanisms involved in the adsorption of phosphate by biochar were studied by altering variables specific to phosphate from common laundry detergents, such as the aqueous solubility, initial concentration, and temperature, using the Dubinin approach (DA). Results showed that the concentration equilibrated near the highest concentrations for sugar beet (120 mg L⁻¹), tailing (85 mg L⁻¹), CaO-rich (50 mg L⁻¹), eggshell and rice straw (48 mg L⁻¹), Undaria pinnatifida roots (190 mg L⁻¹), Ca-alginate granular beads (240 mg L⁻¹), Laminaria japonica powder (900 mg L⁻¹), pine sawdust (57 mg L⁻¹), rice hull (190 mg L⁻¹), sesame straw (470 mg L⁻¹), sugar bagasse (380 mg L⁻¹), Miscanthus giganteus (240 mg L⁻¹), wood biochar (130 mg L⁻¹), pine (25 mg L⁻¹), sawdust (6.8 mg L⁻¹), sewage sludge (-), rice husk (12 mg L⁻¹), corncob (117 mg L⁻¹), and maize straw (1800 mg L⁻¹), while peanut, Eucalyptus polybractea, and crawfish equilibrated near that concentration. CO₂-activated Thalia, sewage sludge biochar, and Broussonetia papyrifera leaves equilibrated just at the lower concentration. Only soybean stover exhibited a sharp rise-and-fall peak at mid-concentration (2 mg L⁻¹). The modeling results were consistent with experimental findings from the literature, supporting the accuracy, repeatability, and reliability of the simulation study. The simulation study provided insights into the adsorption of PO₄³⁻ from wastewater by biochar in terms of the concentration per volume that can be adsorbed ideally under the given conditions. The studies showed that applying the principle experimentally in real wastewater, with all its complexity, is warranted and not far-fetched. Keywords: simulation studies, phosphate removal, biochar, adsorption, wastewater treatment
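The role of aqueous solubility, concentration, and temperature in the Dubinin approach can be sketched with the Dubinin-Astakhov isotherm adapted to solution-phase adsorption, W = W0 * exp(-(A/E)^n) with adsorption potential A = RT ln(Cs/Ce). The capacity W0, characteristic energy E, exponent n, and solubility Cs below are illustrative assumptions, not parameters fitted in the study.

```python
import math

# Dubinin-Astakhov (DA) isotherm sketch for solution-phase phosphate
# adsorption. Parameter values are illustrative, not from the study.

R = 8.314  # gas constant, J/(mol K)

def da_uptake(Ce, Cs, T=298.15, W0=120.0, E=12_000.0, n=2):
    """Equilibrium uptake (same units as W0) from the DA equation.
    Ce: equilibrium concentration; Cs: aqueous solubility (same units)."""
    A = R * T * math.log(Cs / Ce)        # adsorption potential, J/mol
    return W0 * math.exp(-(A / E) ** n)

Cs = 500.0  # assumed phosphate solubility, mg/L
for Ce in (5, 50, 200, 450):
    print(f"Ce = {Ce:3d} mg/L -> uptake {da_uptake(Ce, Cs):7.2f}")
```

As Ce approaches the solubility Cs, the adsorption potential A tends to zero and the uptake approaches the limiting capacity W0, which mirrors the plateau behavior the abstract describes.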
Procedia PDF Downloads 138
3380 Building Atmospheric Moisture Diagnostics: Environmental Monitoring and Data Collection
Authors: Paula Lopez-Arce, Hector Altamirano, Dimitrios Rovas, James Berry, Bryan Hindle, Steven Hodgson
Abstract:
Efficient mould remediation and accurate diagnostics of the moisture that leads to condensation and mould growth in dwellings are largely untapped. A number of factors are contributing to the rising trend of excessive moisture in homes, mainly linked with modern living, increased levels of occupation, and rising fuel costs, as well as with making homes more energy efficient. Environmental monitoring, by means of data collection through logger sensors and survey forms, has been performed in a range of buildings from different UK regions. Air and surface temperature and relative humidity values of residential areas affected by condensation and/or mould issues were recorded. Additional measurements were taken through different trials, changing the type, location, and position of the loggers. In some instances, IR thermal images and ventilation rates were also acquired. Results were interpreted together with key environmental parameters by processing and connecting data from loggers and survey questionnaires, both in buildings with and without moisture issues. Monitoring exercises carried out during winter and spring show the importance of developing and following accurate protocols for guidance to obtain consistent, repeatable, and comparable results and to improve the performance of environmental monitoring. A model and a protocol are being developed to build a diagnostic tool with the goal of performing simple but precise residential atmospheric moisture diagnostics to distinguish the cause of condensation and mould generation, i.e., a ventilation, insulation, or heating system issue. This research shows the relevance of monitoring and processing environmental data to assign moisture risk levels and determine the origin of condensation or mould when dealing with excess atmospheric moisture in a building. Keywords: environmental monitoring, atmospheric moisture, protocols, mould
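One core step in turning logger readings (air temperature, relative humidity, surface temperature) into a condensation-risk flag can be sketched with the standard Magnus dew-point approximation. The Magnus coefficients are widely published values; the risk margin is an illustrative assumption and not the project's actual protocol or model.

```python
import math

# Condensation-risk sketch from logger-style readings using the Magnus
# dew-point approximation. The margin threshold is an assumption.

def dew_point(t_air_c, rh_percent, a=17.62, b=243.12):
    """Magnus dew-point approximation, degrees C."""
    gamma = math.log(rh_percent / 100.0) + a * t_air_c / (b + t_air_c)
    return b * gamma / (a - gamma)

def condensation_risk(t_surface_c, t_air_c, rh_percent, margin=1.0):
    """Flag risk when the surface is within `margin` C of the dew point."""
    return t_surface_c <= dew_point(t_air_c, rh_percent) + margin

# Example readings: 20 C air at 65% RH gives a dew point near 13 C,
# so a 12 C surface (e.g. a cold bridge) is at condensation risk.
print(f"dew point: {dew_point(20.0, 65.0):.1f} C")
print("risk on 12 C surface:", condensation_risk(12.0, 20.0, 65.0))
```

A diagnostic tool would apply a check like this to each logger time series and combine the flags with the survey data to assign a moisture risk level.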
Procedia PDF Downloads 139
3379 AI-Based Techniques for Online Social Media Network Sentiment Analysis: A Methodical Review
Authors: A. M. John-Otumu, M. M. Rahman, O. C. Nwokonkwo, M. C. Onuoha
Abstract:
Online social media networks have long served as a primary arena for group conversations, gossip, and text-based information sharing and distribution. The use of natural language processing techniques for text classification and unbiased decision-making is not far-fetched; however, proper classification of this textual information in a given context has been very difficult. As a result, we conducted a systematic review of previous literature on sentiment classification and the AI-based techniques that have been used, in order to gain a better understanding of how to design and develop a robust and more accurate sentiment classifier that can correctly classify social media text in a given context as hate speech or inverted compliments with a high level of accuracy. We evaluated over 250 articles from digital sources such as ScienceDirect, ACM, Google Scholar, and IEEE Xplore and narrowed them down to 31 studies. Findings revealed that deep learning approaches such as CNN, RNN, BERT, and LSTM outperformed various machine learning techniques in terms of accuracy. A large dataset is also necessary for developing a robust sentiment classifier and can be obtained from sources like Twitter, movie reviews, Kaggle, SST, and SemEval Task 4. Hybrid deep learning techniques like CNN+LSTM, CNN+GRU, and CNN+BERT outperformed both single deep learning techniques and machine learning techniques. The Python programming language outperformed Java for sentiment analyzer development due to its simplicity and AI-based library support. Based on some of the important findings from this study, we make recommendations for future research. Keywords: artificial intelligence, natural language processing, sentiment analysis, social network, text
Procedia PDF Downloads 115
3378 Coping in Your Profession: An Exploratory Analysis of Healthcare Students' Perceptions of Burnout
Authors: Heather Clark, Jon Kelly
Abstract:
Burnout among healthcare professionals has been elevated to a high level of concern. Descriptions of the healthcare workplace often include language such as stressful, long hours, rotating shifts, weekends and holidays, and exhausting. New graduate healthcare professionals are being sent into the workplace with little to no coping skills, knowledge of the signs and symptoms of burnout, or awareness of the resources that are available. The authors of this study created a university course entitled 'Coping in Your Profession' that enrolled registered nurses, licensed practical nurses, EMTs, nurse assistants, and medical assistants. The course addresses burnout, self-analysis, incivility, coping mechanisms, and organizational responsibilities for employee well-being. The students were surveyed using QualtricsXM, with a pre-course and post-course analysis. Pre-course results showed high levels of individual experience with burnout and limited knowledge of resources to combat it. Post-course results indicated personal growth and the perception that burnout can be prevented at both the individual and organizational levels. Students also indicated that few to no resources to combat burnout existed at their places of employment. Addressing burnout at the educational level helps prepare graduates with the knowledge and tools to combat burnout at the individual and organizational levels. Keywords: burnout, coping, healthcare workers, incivility, resilience
Procedia PDF Downloads 136
3377 Political Discourse Used in the TV Talk Shows of Pakistani Media
Authors: Hafiz Sajjad Hussain, Asad Razzaq
Abstract:
The study aims to explore the relationship between the speech and discourse used by political workers and their leaders to maintain an authoritative approach and dialogic power, and the representation of the relationship between ideology and language in the analysis of discourse and spoken text, following Van Dijk's socio-cognitive model. Media and political leaders are two pillars of a state, and their role is important for development and for their effects on society. Media has become an industry in recent years around the globe, and the private sector in particular has developed greatly in the last decade in Pakistan. Media is the easiest way of communicating with a large community in a short time, and it uses discourse independently. Prime-time news channels in Pakistan present political programs on the most prominent story or incident of the day. The program analyzed here was broadcast by the private channel ARY News on July 6, 2014 and covered the top story of the day: Arslan Iftikhar, son of the former Chief Justice, moved an application to the Election Commission of Pakistan concerning the daughter of the popular political leader and PTI chairman Imran Khan. This move changed the whole scenario for the political parties, and the media had a hot issue for discussion. This study also shows that the ideologies and meanings presented by TV channels are not always obvious to viewers. Keywords: electronic media, political discourse, ideology of media, power, authoritative approach
Procedia PDF Downloads 529
3376 Advanced Separation Process of Hazardous Plastics and Metals from End-Of-Life Vehicles Shredder Residue by Nanoparticle Froth Flotation
Authors: Srinivasa Reddy Mallampati, Min Hee Park, Soo Mim Cho, Sung Hyeon Yoon
Abstract:
One of the issues in promoting End-of-Life Vehicle (ELV) recycling is the technology for the appropriate treatment of automotive shredder residue (ASR). Owing to its high heterogeneity and variable composition (plastic (23–41%), rubber/elastomers (9–21%), metals (6–13%), glass (10–20%), dust (soil/sand), etc.), ASR can be classified as 'hazardous waste' on the basis of the presence of heavy metals (HMs), PCBs, BFRs, mineral oils, etc. Considering their relevant concentrations, these metals and plastics should be properly recovered for recycling purposes before ASR residues are disposed of. Brominated flame retardant additives in ABS/HIPS and PVC may generate dioxins and furans at elevated temperatures. Moreover, these BFR additives present in plastic materials may leach into the environment during landfilling operations. Thermal processing of ASR removes some of the organic material but concentrates the heavy metals and POPs present in the residues. In the present study, Fe/Ca/CaO nanoparticle-assisted ozone treatment was found to selectively hydrophilize the surfaces of ABS/HIPS and PVC plastics, enhancing their wettability and thereby promoting their separation from the other ASR plastics by froth flotation. The water contact angles of ABS/HIPS and PVC in ASR decreased by about 18.7°, 18.3°, and 17.9°, respectively. Under froth flotation at 50 rpm, about 99.5% of the ABS and HIPS in the ASR samples sank, resulting in purities of 98% and 99%, and at 150 rpm, 100% of the PVC separated into the settled fraction with 98% purity. The total recovery of non-ABS/HIPS and non-PVC plastics reached nearly 100% in the floating fraction. This process improved the quality of the recycled ASR plastics by removing surface contaminants and impurities. Furthermore, a hybrid process of ball milling with Fe/Ca/CaO nanoparticles followed by froth flotation was established for the recovery of HMs from ASR. After ball milling with Fe/Ca/CaO nanoparticle additives, the flotation efficiency increased to about 55 wt%, and HM recovery also increased to about 90% for the 0.25 mm size fraction of ASR. Coating with Fe/Ca/CaO nanoparticles, combined with subsequent microbubble froth flotation, allowed the air bubbles to attach firmly to the HMs. SEM-EDS maps showed that significant amounts of HMs were present on the surface of the floating ASR fraction. This result, along with the low HM concentration in the settled fraction, was confirmed by elemental spectra and semi-quantitative SEM-EDS analysis. The developed hybrid process for the preferential separation of hazardous plastics and metals from ASR is simple, highly efficient, and sustainable. Keywords: end-of-life vehicle shredder residue, hazardous plastics, nanoparticle froth flotation, separation process
Procedia PDF Downloads 277
3375 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception
Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu
Abstract:
Opinion mining (OM) is one of the natural language processing (NLP) problems: determining the polarity of opinions, mostly represented on a positive-neutral-negative axis. The data for OM are usually collected from various social media platforms. In an era where social media has considerable influence over companies' futures, it is worth understanding social media and acting accordingly. OM comes to the fore here as the scale of discussion about companies increases and it becomes unfeasible to gauge opinion at the individual level, so companies opt to automate this process by applying machine learning (ML) approaches to their data. For the last two decades, OM, or sentiment analysis (SA), has mainly been performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to bag-of-n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. The transfer learning paradigm, commonly used in computer vision (CV) problems, has lately come to shape NLP approaches and language models (LMs). This gave a sudden rise to the use of pretrained language models (PTMs), which contain language representations obtained by training on large datasets with self-supervised learning objectives. PTMs are further fine-tuned on a specialized downstream task dataset to produce efficient models for NLP tasks such as OM, named-entity recognition (NER), question answering (QA), and so forth. In this study, traditional and modern NLP approaches were evaluated for OM using a sizable corpus belonging to a large private company, containing about 76,000 comments in Turkish: SVM with a bag of n-grams, and two chosen pretrained models, the multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT). The MUSE model is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. BERT, a monolingual model in our case, is based on transformer neural networks; it uses masked language modeling and next-sentence prediction tasks that allow bidirectional training of the transformers. During training, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since experiments showed their contribution to model performance to be insignificant even though Turkish is a highly agglutinative and inflective language. The results show that deep learning methods with pretrained models and fine-tuning achieve an improvement of about 11 percentage points over SVM for OM: the BERT model achieved around 94% prediction accuracy, while the MUSE model achieved around 88% and SVM around 83%. The multilingual MUSE model performs better than SVM but still worse than the monolingual BERT model. Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish
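The "bag of n-grams plus linear classifier" baseline can be sketched without any external libraries. The study used an SVM trained with standard tooling; here a simple perceptron stands in for the linear classifier so the example stays dependency-free, and a few English stand-in sentences replace the 76,000-comment Turkish corpus.

```python
from collections import Counter

# Dependency-free sketch of a bag-of-n-grams linear classifier.
# A perceptron stands in for the SVM used in the study; the tiny
# English dataset stands in for the Turkish corpus.

def ngrams(text, n=2):
    """Word unigrams plus word n-grams as string features."""
    toks = text.lower().split()
    return [" ".join(toks[i:i + n]) for i in range(len(toks) - n + 1)] + toks

train = [("great product works well", 1), ("love this service", 1),
         ("terrible support very slow", 0), ("awful experience never again", 0),
         ("works well love it", 1), ("very slow and awful", 0)]

weights = Counter()
for _ in range(10):                           # perceptron epochs
    for text, label in train:
        feats = Counter(ngrams(text))
        pred = 1 if sum(weights[f] * c for f, c in feats.items()) > 0 else 0
        if pred != label:                     # mistake-driven update
            for f, c in feats.items():
                weights[f] += (label - pred) * c

def predict(text):
    score = sum(weights[f] * c for f, c in Counter(ngrams(text)).items())
    return 1 if score > 0 else 0

print(predict("works well"), predict("very slow"))
```

A real baseline would swap the perceptron for a max-margin SVM and TF-IDF weighting, but the feature representation (counts over word n-grams) is the same idea.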
Procedia PDF Downloads 146
3374 Validation of Escherichia coli O157:H7 Inactivation on Apple-Carrot Juice Treated with Manothermosonication by Kinetic Models
Authors: Ozan Kahraman, Hao Feng
Abstract:
Several models, such as the Weibull, modified Gompertz, biphasic linear, and log-logistic models, have been proposed to describe non-linear inactivation kinetics and have been used to fit non-linear inactivation data of several microorganisms for inactivation by heat, high-pressure processing, or pulsed electric fields. First-order kinetic parameters (D-values and z-values) have often been used to describe microbial inactivation by non-thermal processing methods such as ultrasound, and most ultrasonic inactivation studies have reported the reduction in microbial survival counts in first-order terms. This study was conducted to analyze E. coli O157:H7 inactivation data using five microbial survival models: first-order, Weibull, modified Gompertz, biphasic linear, and log-logistic. The residual sum of squares and the total sum of squares criteria were used to evaluate the models. The statistical indices of the kinetic models were used to fit inactivation data for E. coli O157:H7 treated by MTS at three temperatures (40, 50, and 60 °C) and three pressures (100, 200, and 300 kPa). Based on the statistical indices and visual observations, the Weibull and biphasic models fit the MTS data best, as shown by high R² values. The other non-linear models, including the modified Gompertz and log-logistic models, as well as the first-order model, did not fit the MTS data better than the Weibull and biphasic models. The data in this study did not follow first-order kinetics, possibly because cells sensitive to ultrasound treatment were inactivated first, resulting in a fast initial inactivation period, while those resistant to ultrasound were killed slowly. The Weibull and biphasic models were found to be more flexible for describing the survival curves of E. coli O157:H7 treated by MTS in apple-carrot juice. Keywords: Weibull, biphasic, MTS, kinetic models, E. coli O157:H7
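The Weibull survival model, log10(N/N0) = -(t/delta)^p, can be fitted by log-log linearization, since ln(-log10 S) = p*ln(t) - p*ln(delta) is linear in ln(t). The sketch below uses synthetic, noiseless data with assumed parameters (not the study's MTS counts) purely to show the fitting mechanics; real data would need nonlinear least squares and goodness-of-fit statistics.

```python
import numpy as np

# Fit the Weibull survival model log10(N/N0) = -(t/delta)^p by
# log-log linearization. Synthetic data with known parameters stand
# in for the MTS inactivation counts.

delta_true, p_true = 2.0, 0.6            # assumed scale (min) and shape
t = np.array([0.5, 1, 2, 4, 6, 8, 10])   # treatment times, min
log_reduction = (t / delta_true) ** p_true   # -log10(N/N0), noiseless

# ln(-log10 S) = p * ln(t) - p * ln(delta)  ->  straight line in ln(t)
slope, intercept = np.polyfit(np.log(t), np.log(log_reduction), 1)
p_hat = slope
delta_hat = np.exp(-intercept / slope)
print(f"recovered p = {p_hat:.3f}, delta = {delta_hat:.3f} min")
```

A shape parameter p < 1 gives the concave-upward survival curve (fast initial kill, resistant tail) that the abstract attributes to a mixed-sensitivity population.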
Procedia PDF Downloads 366
3373 Digi-Buddy: A Smart Cane with Artificial Intelligence and Real-Time Assistance
Authors: Amaladhithyan Krishnamoorthy, Ruvaitha Banu
Abstract:
Vision is considered the most important sense in humans, without which leading a normal life can often be difficult. There are many existing smart canes for the visually impaired with obstacle detection using ultrasonic transducers to help them navigate. Though the basic smart cane increases the safety of its users, it does not help fill the void of visual loss. This paper introduces the concept of Digi-Buddy, an evolved smart cane for the visually impaired. The cane consists of several modules. Apart from the basic obstacle detection features, Digi-Buddy assists the user by capturing video/images with a wide-angle camera and streaming them to a server, which detects the objects using a deep convolutional neural network. In addition to determining what a particular image/object is, the distance to the object is assessed by the ultrasonic transducer. A sound generation application, modeled with the help of natural language processing, is used to convert the processed images/objects into audio: the detected object is signified by its name, which is transmitted to the user through Bluetooth earphones. Object detection is extended to facial recognition, which maps the faces of people the user meets against a database of face images and alerts the user about the person. Another crucial function is an automatic intimation alarm, which is triggered when the user is in an emergency. If the user recovers within a set time, a button on the cane stops the alarm; otherwise, an automatic intimation with the user's whereabouts is sent to friends and family using GPS. Going beyond the safety and security offered by existing smart canes, the proposed concept, to be implemented as a prototype, helps the visually impaired visualize their surroundings through audio in a more amicable way. Keywords: artificial intelligence, facial recognition, natural language processing, internet of things
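The ultrasonic distance assessment mentioned above rests on a simple time-of-flight relation: the transducer reports the echo round-trip time, and distance is speed_of_sound * t / 2. The sketch below illustrates that calculation; the alert threshold is an illustrative assumption, not a specification of the Digi-Buddy prototype.

```python
# Time-of-flight distance estimate for an ultrasonic obstacle sensor.
# The 1.5 m alert threshold is an assumed value for illustration.

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 C

def echo_distance_m(round_trip_s):
    """Obstacle distance from the echo round-trip time (out and back)."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def obstacle_alert(round_trip_s, threshold_m=1.5):
    """True when the obstacle is closer than the alert threshold."""
    return echo_distance_m(round_trip_s) < threshold_m

print(f"10 ms echo -> {echo_distance_m(0.01):.3f} m")   # 1.715 m
print("alert at 5 ms echo:", obstacle_alert(0.005))     # 0.858 m, alert
```

On real hardware this distance would be fused with the server-side object label before the audio message is generated for the user.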
Procedia PDF Downloads 355
3372 Enforcement against Illegal Logging: Issues and Challenges
Authors: Muhammad Nur Haniff Mohd Noor, Rokiah Kadir, Suriyani Muhamad
Abstract:
Sustainable forest management and forest protection can be hampered by illegal logging, which is not uncommon in many wood-producing countries. Hence, law enforcement, especially in timber-producing countries, is crucial in ensuring compliance with forestry-related regulations and confirming that all parties obey the rules and regulations prescribed by the authorities. However, enforcement officers encounter various challenges and difficulties that have undermined enforcement capacity and efficiency. Appropriate policy responses to these issues are important for resolving the problems in the long term and for empowering enforcement capacity to meet the future challenges of forest law enforcement. This paper is based on an extensive review of articles and publications by the International Criminal Police Organization (INTERPOL), the International Tropical Timber Organization (ITTO), Chatham House, and the Food and Agriculture Organization of the United Nations (FAO). Various books and journal articles were also reviewed to gain further insight into enforcement issues and challenges. The paper identifies several issues: (1) insufficient enforcement capacity and resources, (2) lack of coordination between the various enforcement agencies, (3) corruption in the government and private sectors, and (4) unclear legal frameworks related to the forestry sector. It then discusses appropriate policy responses to each enforcement challenge according to various publications, including specific reports on forest law enforcement published by international forestry-related organizations. Lack of resources, inadequate coordination between agencies, corruption, and legal issues present challenges to enforcement officers in their daily routines, and recommendations regarding proper policy responses to overcome these issues are of great importance in assisting forest authorities in prioritizing their resources appropriately. Keywords: corruption, enforcement challenges, enforcement capacity, forest law enforcement, insufficient agency coordination, legislative ambiguity
Procedia PDF Downloads 187
3371 Impact of Job Crafting on Work Engagement and Well-Being among Indian Working Professionals
Authors: Arjita Jhingran
Abstract:
The pandemic was a turning point for flexible employment. In today's market, employees prefer companies that are flexible and provide the autonomy to change their work environment. Having worked from home for a long time, post-pandemic employees have become accustomed to modifying, re-designing, and re-aligning their work environment, their tasks, and the way they interact with co-workers based on their preferences. In this scenario, the concept of job crafting has come to the forefront, and research on the subject has expanded, particularly during COVID-19. Managers who provide opportunities to craft the job are driving enhanced engagement and well-being. The current study will examine the impact of job crafting on work engagement and psychological well-being among 385 working professionals aged 21 to 39 years (M age = 30 years). The study will also draw comparisons between freelancers and full-time employees, as freelancers are considered to have more autonomy over their jobs. A comparison between employees of MNCs and startups will also be studied, as autonomy is a primary motivator in the majority of startups. Moreover, differences based on level of experience will be examined, which will add to the body of knowledge. Data will be collected through the Job Crafting Questionnaire, the Utrecht Work Engagement Scale, and the Psychological Well-Being Scale. Correlation analysis will be used to study the relationships among variables, and a three-way ANOVA will be used to draw comparisons. Keywords: job crafting, work engagement, well-being, freelancers, start-ups
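The planned correlation analysis can be sketched as a Pearson correlation between the job-crafting and work-engagement scale scores. The data below are synthetic stand-ins with a built-in positive association (not the study's questionnaire responses), used only to show the computation for a sample of 385.

```python
import numpy as np

# Pearson correlation sketch for the planned analysis. The scores are
# synthetic stand-ins for the Job Crafting Questionnaire and Utrecht
# Work Engagement Scale, with a positive link built in.

rng = np.random.default_rng(1)
n = 385                                            # planned sample size
crafting = rng.normal(3.5, 0.6, n)                 # e.g. 1-5 scale scores
engagement = 0.7 * crafting + rng.normal(0, 0.4, n)

r = np.corrcoef(crafting, engagement)[0, 1]
print(f"Pearson r (n = {n}): {r:.3f}")
```

The subsequent three-way ANOVA (employment type x organization type x experience level) would typically be run with a statistics package that reports F-statistics and interaction effects.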
Procedia PDF Downloads 105