Search results for: human error
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10142

9182 A Two-Stage Adaptation towards Automatic Speech Recognition System for Malay-Speaking Children

Authors: Mumtaz Begum Mustafa, Siti Salwah Salim, Feizal Dani Rahman

Abstract:

Automatic Speech Recognition (ASR) systems have recently been used to assist children in language acquisition, as they can detect and decode human speech signals. Despite the benefits offered by ASR, there is a lack of ASR systems for Malay-speaking children, partly because no continuous speech database exists for these target users. Although cross-lingual adaptation is a common solution for developing ASR systems for under-resourced languages, it is not viable for children, as very few children's speech databases are available to serve as a source model. In this research, we propose a two-stage adaptation for developing an ASR system for Malay-speaking children using a very limited database. The two-stage adaptation comprises cross-lingual adaptation (first stage) and cross-age adaptation (second stage). In the first stage, a well-known, phonetically rich and balanced speech database is adapted to a medium-sized database of Malay adults using supervised MLLR. The second stage uses the acoustic model generated by the first adaptation, with a small database of the target users as the target data. We measured the performance of the proposed technique using word error rate (WER) and compared it with a conventional benchmark adaptation. The two-stage adaptation proposed in this research achieves better recognition accuracy than the benchmark adaptation in recognizing children's speech.
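The evaluation metric named above, word error rate, has a standard edit-distance definition. A minimal reference implementation of that definition (not the authors' code; the example sentences are invented) might look like:

```python
def wer(reference, hypothesis):
    """Word error rate: word-level Levenshtein distance / reference length."""
    r, h = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between r[:i] and h[:j]
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = dp[i - 1][j - 1] + (r[i - 1] != h[j - 1])  # substitution (or match)
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)  # deletion, insertion
    return dp[len(r)][len(h)] / max(len(r), 1)

# One word deleted out of a four-word reference gives WER 0.25.
print(wer("saya suka makan nasi", "saya suka makan"))  # 0.25
```

Lower WER on a held-out children's test set is what "better recognition accuracy" means here.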

Keywords: Automatic Speech Recognition System, children speech, adaptation, Malay

Procedia PDF Downloads 403
9181 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE

Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao

Abstract:

For impact monitoring of distributed structures, the traditional positioning methods are based on time differences and include the four-point arc positioning method and the triangulation positioning method; in actual operation, however, both methods carry errors. In this paper, the Multi-Agent Blackboard Coordination Principle is used to combine the two methods. The fusion proceeds in four steps: (1) the four-point arc locating agent calculates the initial point and records it in the blackboard module; (2) the triangulation agent obtains its initial parameters by reading that initial point; (3) the triangulation agent repeatedly accesses the blackboard module to update its parameters, and also logs each calculated point back to the blackboard; (4) when a subsequent calculated point and the initial calculated point agree within the allowable error, the coordination fusion process is finished. This paper presents a multi-agent collaboration method whose agent framework is JADE. A JADE platform consists of several agent containers, with agents running in each container; thanks to JADE's management and debugging tools, it is convenient to handle the complex data arising in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion reduces the error of the two individual methods.
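The blackboard loop described in the four steps can be sketched in plain Python. This is a toy illustration only: the agent stubs, coordinates, and convergence target are invented, and the real system runs the agents as JADE containers rather than function calls.

```python
# Toy blackboard-coordination sketch (invented values, not the JADE system).
def arc_agent():
    """Stand-in for the four-point arc method: a rough initial impact point."""
    return (10.0, 5.0)

def triangulation_agent(seed):
    """Stand-in for triangulation refinement: nudges the estimate toward (9.0, 4.5)."""
    target = (9.0, 4.5)
    return tuple(s + 0.5 * (t - s) for s, t in zip(seed, target))

blackboard = {"point": arc_agent()}   # step 1: arc agent posts the initial point
tolerance = 0.01
while True:
    refined = triangulation_agent(blackboard["point"])  # steps 2-3: read and refine
    err = max(abs(a - b) for a, b in zip(refined, blackboard["point"]))
    blackboard["point"] = refined     # log the refined point back to the blackboard
    if err < tolerance:               # step 4: successive points agree within tolerance
        break

print(blackboard["point"])
```

The loop converges geometrically toward the triangulation solution while the blackboard records every intermediate estimate.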

Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE

Procedia PDF Downloads 181
9180 Work Engagement Reducing Employee Turnover Intentions in Telecommunication Sector: The Moderator Role of Human Resource Development Climate between Work Engagement and Turnover Intentions

Authors: Pirzada Sami Ullah Sabri

Abstract:

The present study examines the relationship between work engagement (WE) and employee turnover intentions (TI) in the telecommunication sector, using human resource development climate (HRDC) as a moderator. Based on a sample of 538 telecommunication-sector employees, hierarchical regression analysis is employed to examine the influence of HRDC on the relationship between work engagement and turnover intentions. The results indicate a negative correlation between work engagement and turnover intentions; HRD climate acts as a powerful moderator that strengthens work engagement and lessens turnover intentions. The study shows the importance of a favorable and supportive HRD climate, which fosters employee work engagement in the organization. Understanding the roles of HRD climate and work engagement in reducing turnover intentions can help increase the productivity and performance of the organization.
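Moderation in a regression framework is tested by adding an interaction term and checking its coefficient. The sketch below uses synthetic, noise-free data with invented coefficients (not the survey data), just to show the mechanics of recovering an interaction effect:

```python
import numpy as np

# Hypothetical moderation sketch: regress turnover intention (TI) on work
# engagement (WE), HRD climate (HRDC), and their product. The coefficient on
# WE*HRDC is the moderating effect. Data and coefficients are invented.
rng = np.random.default_rng(0)
n = 538
we = rng.uniform(1, 5, n)
hrdc = rng.uniform(1, 5, n)
ti = 6.0 - 0.4 * we - 0.2 * hrdc - 0.3 * we * hrdc   # known generating model

X = np.column_stack([np.ones(n), we, hrdc, we * hrdc])
beta, *_ = np.linalg.lstsq(X, ti, rcond=None)
print(beta)  # recovers [6.0, -0.4, -0.2, -0.3] on noise-free data
```

In the hierarchical version, the interaction column is entered in a final step and the change in explained variance is tested.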

Keywords: turnover intentions, work engagement, human resource development climate, hierarchical regression analysis, telecommunication sector

Procedia PDF Downloads 436
9179 A Conceptualization of the Relationship between Frontline Service Robots and Humans in Service Encounters and the Effect on Well-Being

Authors: D. Berg, N. Hartley, L. Nasr

Abstract:

This paper presents a conceptual model of human-robot interaction within service encounters and its effect on the well-being of both consumers and service providers, where service providers are understood as the employees who work alongside frontline service robots. The significance of this paper lies in outlining how frontline service robots can be effectively utilized in service encounters for the benefit of organizations and society as a whole. As the paper is conceptual in nature, the main methodologies employed are theoretical, namely problematization and theory building. The work is motivated by the shift of service robots from manufacturing plants and factory floors to consumer-facing service environments. This shift places robots in direct contact with frontline employees and consumers, creating a hybrid workplace where humans work alongside service robots. The change from back-end to front-end roles may have implications not only for the physical environment, servicescape, design, and strategy of service offerings and encounters, but also for the human parties of the service encounter itself. Questions such as 'How are frontline service robots impacting and changing the service encounter?' and 'What effect are such changes having on the well-being of the human actors in a service encounter?' form the research questions of this paper. To truly understand social service robots, an interdisciplinary perspective is required: beyond the function, system, design, or mechanics of a service robot, it is also necessary to understand human-robot interaction, and in particular what happens when such robots are placed in commercial settings and human-robot interaction becomes consumer-robot interaction and employee-robot interaction. A service robot in this paper is characterized by two main factors: its social characteristics and the consumer-facing environment within which it operates. The conceptual framework presented here contributes to interdisciplinary discussions surrounding social robotics, service, and technology's impact on consumer and service provider well-being, in the hope that such knowledge will help improve services as well as the prosperity and well-being of society.

Keywords: frontline service robots, human-robot interaction, service encounters, well-being

Procedia PDF Downloads 214
9178 Relationship between Electricity Consumption and Economic Growth: Evidence from Nigeria (1971-2012)

Authors: N. E Okoligwe, Okezie A. Ihugba

Abstract:

Few scholars disagree that electricity consumption is an important supporting factor for economic growth. However, according to previous studies, the relationship between electricity consumption and economic growth manifests differently in different countries. This paper examines the causal relationship between electricity consumption and economic growth in Nigeria. To do so, it tests the validity of the modernization (or dependency) hypothesis by employing econometric tools such as the Augmented Dickey-Fuller (ADF) test, the Johansen co-integration test, the Error Correction Mechanism (ECM), and the Granger causality test on time series data from 1971 to 2012. Granger causality is found to run neither from electricity consumption to real GDP nor from GDP to electricity consumption during the period of study: the null hypothesis of no causality is accepted at the 5 per cent level of significance, since the probability values (0.2251 and 0.8251) exceed that level. Both variables are probably determined by other factors, such as growth in the urban population, the unemployment rate, and the number of Nigerians who benefit from increases in GDP; likewise, the increase in electricity demand is not determined by the increase in GDP (income) over the period of study, because electricity demand has always exceeded consumption. Consequently, policy makers in Nigeria should, in the early stages of reconstruction, give priority to building capacity additions and infrastructure development in the electric power sector, as this would support sustainable economic growth in Nigeria.
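The intuition behind the Granger test is that x "Granger-causes" y if lagged values of x improve the prediction of y beyond y's own lags. A compact illustration on synthetic series (not the 1971-2012 Nigerian data; the coefficients are invented) compares restricted and unrestricted lagged regressions:

```python
import numpy as np

# Illustrative Granger-style check: does adding lagged x reduce the residual
# sum of squares (RSS) when predicting y? Synthetic data, one lag only.
def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

rng = np.random.default_rng(1)
T = 200
x = rng.normal(size=T)
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()   # x drives y with one lag

Y = y[1:]
restricted = np.column_stack([np.ones(T - 1), y[:-1]])    # y's own lag only
unrestricted = np.column_stack([restricted, x[:-1]])      # add lagged x
reduction = 1 - rss(unrestricted, Y) / rss(restricted, Y)
print(round(reduction, 3))  # near 1: lagged x explains most of y
```

In a full test this RSS comparison feeds an F-statistic whose p-value is what the abstract's 0.2251 and 0.8251 refer to; here the p-values were large, so no causal direction was supported.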

Keywords: economic growth, electricity consumption, error correction mechanism, Granger causality test

Procedia PDF Downloads 314
9177 Research on Pilot Sequence Design Method of Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing System Based on High Power Joint Criterion

Authors: Linyu Wang, Jiahui Ma, Jianhong Xiang, Hanyu Jiang

Abstract:

For pilot design under the sparse channel estimation model in Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing (MIMO-OFDM) systems, the observation matrices constructed according to the matrix cross-correlation criterion, the total correlation criterion, and other optimization criteria are not optimal, resulting in inaccurate channel estimation and a high bit error rate at the receiver. This paper proposes a pilot design method combining high-power-sum and high-power-variance criteria, which estimates the channel more accurately. First, the pilot insertion positions are designed according to the high-power-variance criterion under an equal-power constraint. Then, according to the high-power-sum criterion, pilot power allocation is cast as a cone programming problem and solved. Finally, the optimal pilot is determined by calculating a weighted sum of the high-power-sum and high-power-variance criteria. Compared with traditional pilots under the same conditions, the constructed MIMO-OFDM system using the optimal pilot for channel estimation obtains a gain of 6-7 dB in communication bit error rate performance.
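The final selection step, scoring candidates by a weighted sum of a power-sum term and a power-variance term, can be sketched abstractly. The candidate patterns, the equal weights, and the assumption that a larger weighted score is preferred are all illustrative guesses, not the paper's formulation:

```python
# Hedged sketch of weighted-sum pilot selection (invented candidates/weights;
# whether the criterion is maximized or minimized is an assumption here).
def mean(xs):
    return sum(xs) / len(xs)

def score(powers, w_sum=0.5, w_var=0.5):
    mu = mean(powers)
    var = mean([(p - mu) ** 2 for p in powers])
    return w_sum * sum(powers) + w_var * var   # weighted sum of the two criteria

candidates = {
    "uniform":  [1.0, 1.0, 1.0, 1.0],
    "skewed":   [2.5, 0.5, 0.5, 0.5],
    "moderate": [1.5, 1.0, 1.0, 0.5],
}
best = max(candidates, key=lambda k: score(candidates[k]))
print(best)  # "skewed": equal total power, highest variance term
```

In the paper the power values themselves come from the cone-programming allocation step rather than being fixed candidates.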

Keywords: MIMO-OFDM, pilot optimization, compressed sensing, channel estimation

Procedia PDF Downloads 155
9176 Multimodal Deep Learning for Human Activity Recognition

Authors: Ons Slimene, Aroua Taamallah, Maha Khemaja

Abstract:

In recent years, human activity recognition (HAR) has been a key area of research due to its diverse applications, and it has garnered increasing attention in the field of computer vision. HAR plays an important role in people's daily lives, as it makes it possible to learn advanced knowledge about human activities from data. In HAR, activities are usually represented by exploiting different types of sensors, such as embedded sensors or visual sensors. However, these sensors have limitations, such as local obstacles, image-related obstacles, sensor unreliability, and consumer concerns. Recently, several deep learning-based approaches have been proposed for HAR; these approaches fall into two categories based on the type of data used: vision-based approaches and sensor-based approaches. This paper highlights the importance of multimodal data fusion, combining skeleton data obtained from videos with data generated by embedded sensors, using deep neural networks to achieve HAR. We propose a deep multimodal fusion network based on a two-stream architecture. The two streams each use a Convolutional Neural Network combined with a Bidirectional LSTM (CNN-BiLSTM) to process the skeleton data and the embedded-sensor data, respectively, and fusion is performed at the feature level. The proposed model was evaluated on the public OPPORTUNITY++ dataset and achieved an accuracy of 96.77%.
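"Fusion at the feature level" means each stream is encoded into a feature vector and the vectors are concatenated before the classifier head. The toy below uses trivial averaging encoders as stand-ins for the two CNN-BiLSTM streams, only to make the data flow concrete (all shapes and values are invented):

```python
# Feature-level fusion sketch: two stream encoders -> concatenation.
# The encoders are stand-ins for the CNN-BiLSTM streams, not the real model.
def skeleton_encoder(frames):
    """Stand-in skeleton-stream encoder: mean over frames per joint coordinate."""
    n = len(frames)
    return [sum(f[i] for f in frames) / n for i in range(len(frames[0]))]

def sensor_encoder(readings):
    """Stand-in sensor-stream encoder: mean over samples per sensor channel."""
    n = len(readings)
    return [sum(r[i] for r in readings) / n for i in range(len(readings[0]))]

skeleton = [[0.0, 0.5], [1.0, 0.5]]            # 2 frames x 2 joint coordinates
sensors = [[1.0, 0.0, 2.0], [0.0, 2.0, 2.0]]   # 2 samples x 3 sensor channels

fused = skeleton_encoder(skeleton) + sensor_encoder(sensors)  # concatenate features
print(fused)  # [0.5, 0.5, 0.5, 1.0, 2.0]
```

In the actual network the fused vector would feed a dense softmax layer that predicts the activity class.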

Keywords: human activity recognition, action recognition, sensors, vision, human-centric sensing, deep learning, context-awareness

Procedia PDF Downloads 107
9175 The Simultaneous Application of Chemical and Biological Markers to Identify Reliable Indicators of Untreated Human Waste and Fecal Pollution in Urban Philadelphia Source Waters

Authors: Stafford Stewart, Hui Yu, Rominder Suri

Abstract:

This paper presents the results of the first known study conducted in urban Philadelphia waterways that simultaneously utilized anthropogenic chemical and biological markers to identify suitable indicators of untreated human waste and fecal pollution. A total of 13 outfall samples, 30 surface water samples, and 2 groundwater samples were analyzed for fecal contamination and untreated human waste using a suite of 25 chemical markers and 5 bio-markers. Pearson rank correlation tests were conducted to establish associations between the abundances of bio-markers and the concentrations of chemical markers. Results show that the 16S rRNA gene of human-associated Bacteroidales (BacH) was very strongly correlated (0.76 - 0.97, p < 0.05) with the labile chemical markers acetaminophen, cotinine, estriol, and urobilin. Likewise, human-specific F-RNA coliphages (F-RNA-II) and the labile chemical markers urobilin, ibuprofen, cotinine, and estriol were significantly correlated (0.77 - 0.95, p < 0.05). Similarly, a strong positive correlation (0.67 - 0.91, p < 0.05) was evident between the abundances of the bio-markers BacH and F-RNA-II and the concentrations of the conservative markers trimethoprim, meprobamate, diltiazem, triclocarban, metformin, sucralose, gemfibrozil, sulfamethoxazole, and carbamazepine. Human mitochondrial DNA (MitoH) correlated moderately with the labile markers nicotine and salicylic acid as well as with the conservative markers metformin and triclocarban (0.31 - 0.47, p < 0.05). This study showed that by associating chemical and biological markers, a robust technique was developed for fingerprinting source-specific untreated waste and fecal contamination in source waters.
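The correlation coefficients reported above come from a standard formula. A minimal Pearson correlation on invented marker vectors (not the study's measurements) shows the computation behind each reported r value:

```python
import math

# Minimal Pearson correlation; the marker vectors are hypothetical.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

bach = [2.1, 3.4, 5.0, 6.2, 8.1]       # hypothetical BacH abundances
urobilin = [0.9, 1.6, 2.4, 3.1, 4.0]   # hypothetical urobilin concentrations
print(round(pearson_r(bach, urobilin), 3))  # strongly positive, near 1.0
```

Values near +1, like the 0.76-0.97 range reported for BacH, indicate that bio-marker abundance rises almost linearly with chemical marker concentration.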

Keywords: anthropogenic markers, bacteroidales, fecal pollution, source waters, wastewater

Procedia PDF Downloads 19
9174 Factors Affecting Employee Decision Making in an AI Environment

Authors: Yogesh C. Sharma, A. Seetharaman

Abstract:

Human decision making is a complex process influenced by a variety of intrinsic and extrinsic factors, and individual decisions have a ripple effect on subsequent decisions. In this study, the scope of human decision making is limited to employees: in an organisation, a person makes a variety of decisions from the time they are hired to the time they retire. The goal of this research is to identify the various elements that influence decision making; in addition, the environment in which a decision is made is a significant aspect of the decision-making process. Employees in today's workplace use artificial intelligence (AI) systems for automation and decision augmentation, and the impact of AI systems on the decision-making process is examined in this study. The research is designed as a systematic literature review; from gaps in the literature, limitations and the scope of future research have been identified. Based on these findings, a research framework has been designed to identify the factors affecting employee decision making: technological advancement, data-driven culture, human trust, decision automation-augmentation, and workplace motivation. Hybrid human-AI systems require the development of new skill sets and new organisational designs, and employee psychological safety and supportive leadership influence overall job satisfaction.

Keywords: employee decision making, artificial intelligence (AI) environment, human trust, technology innovation, psychological safety

Procedia PDF Downloads 113
9173 Using the Point Analysis Algorithm (SANN) in Drought Analysis

Authors: Khosro Shafie Motlaghi, Amir Reza Salemian

Abstract:

In arid and semi-arid regions such as Iran, evapotranspiration accounts for the greatest portion of the water balance; knowledge of its changes, and of other climate parameters, therefore plays an important role in the planning, development, and management of water resources. In this study, the long-term trends of reference evapotranspiration (ET0), average temperature, and monthly rainfall were tested. To do so, all synoptic stations in Iran were classified by climate using the De Martonne climate index. The present research was conducted in the semi-arid climate zone of Iran, in which 14 synoptic stations with 30-year statistical records were investigated using three methods: least squares error, Mann-Kendall, and Wald-Wolfowitz. Evapotranspiration was calculated using the FAO Penman method. The results show that over the statistical period the evapotranspiration trend was positive at 24 per cent of stations, negative at 2 per cent, and without any trend at 47 per cent. Similarly, the temperature trend was positive at 22 per cent of stations, negative at 19 per cent, and without any trend at 64 per cent. The rainfall results show that in most stations the amount of rainfall exhibited no meaningful trend. The Mann-Kendall results were similar to those of the least squares error method. Given these results, it can be expected that in future years some regions will face increases in temperature and evapotranspiration.
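The Mann-Kendall test used here is non-parametric: its S statistic counts the signs of all pairwise differences, with S > 0 suggesting an increasing trend and S < 0 a decreasing one. A compact sketch on an invented annual ET0 series (not the 30-year station records):

```python
# Compact Mann-Kendall S statistic; the ET0 series is hypothetical.
def mann_kendall_s(series):
    s = 0
    for i in range(len(series) - 1):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)   # +1 rising pair, -1 falling pair
    return s

et0 = [3.1, 3.3, 3.2, 3.6, 3.8, 4.0]   # hypothetical annual ET0 values
print(mann_kendall_s(et0))  # 13 of 15 pair comparisons net positive
```

In practice S is normalized by its variance to a Z score and compared against a significance threshold before a station is classified as trending.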

Keywords: analysis, algorithm, SANN, ET0

Procedia PDF Downloads 302
9172 Study of Indian and Southeast Asian Literature to Trace Evolution of Hanuman

Authors: Subramanian Chidambaran

Abstract:

Right from the Vedic period, we have instances of human heroes being deified and later even assimilated into other deities. Many scholars opine that Indra is one such Vedic deity, who rose from a 'human leader' to the position of devata. We also see the assimilation of the Vedic deity Rudra into Śiva in the post-Vedic period. Thus, the current deities and gods worshipped in the polytheistic Hindu system are the result of many such deifications and assimilations. Hanumān is one such character in Indian culture, who changed from a valiant hero of the Rāmāyaṇa into a prominent present-day deity. There are also many arguments over whether Hanumān was truly a monkey or a human, as the term 'vānara' can be interpreted as 'vā narah', i.e., 'or a human'. Despite the popularity of this deity, there is very little academic research on his genesis and evolution. Many questions arise: Does Hanumān find any mention (in any form) in literature or archaeological evidence prior to the Vālmῑki Rāmāyaṇa? What is the character of Hanumān in the Vālmῑki Rāmāyaṇa? How has this evolved in later Indian literature, and where do we see the deification process beginning? What is the character of Hanumān in literature beyond Indian shores, such as Southeast Asian literature, and how does it compare with that in Indian literature? This paper attempts to answer these questions and to trace the evolution of the character of Hanumān from the Vālmῑki Rāmāyaṇa through other Indian literature as well as Southeast Asian literature.

Keywords: Hanumān, Indian, Rāmāyaṇa, Southeast Asia

Procedia PDF Downloads 285
9171 Fossil Health: Causes and Consequences of Hegemonic Health Paradigms

Authors: Laila Vivas

Abstract:

Fossil Health is proposed as a value-concept to describe the hegemonic health paradigms that underpin health enactment. Such a representation is justified by Foucauldian and related ideas on biopower and biosocialities, calling for the politicization of health and signalling the importance of narratives. This approach, hence, enables contemplating health paradigms as reflexive of, or co-constitutive with, health itself, or, in other words, conceiving health as a verb. Fossil Health is a symbolic representation, influenced by Andreas Malm's concept of fossil capitalism, that integrates environment and health as non-dichotomous areas. Fossil Health holds that current notions of human and non-human health revolve around fossil fuel dependencies; moreover, addressing disequilibria from established health ideals involves fossil fixes. Fossil Health, therefore, represents the causes and consequences of a health conception that has the agency to contribute to the functioning of a particular structural eco-social model. Within current capitalist relations, Fossil Health expands its meaning to cover not only fossil implications but also other dominant paradigms of the capitalist system that are (re)produced through health paradigms, such as the burgeoning of technoscience and biomedicalization, the privatization of health, the expertization of health, and the imposition of standards of uniformity. Overall, Fossil Health is a comprehensive approach to environment and health, in which understanding hegemonic health paradigms means understanding our (human and non-human) nature paradigms and the structuring effect these narratives convey.

Keywords: fossil health, environment, paradigm, capitalism

Procedia PDF Downloads 127
9170 The Law of Donation and Transplantation of Human Body Organs in the Kurdistan Region of Iraq

Authors: Rebaz Sdiq Ismail

Abstract:

Organ donation and transplantation is one of the most debated topics in modern jurisprudence. It is a surgical procedure that aims to prolong the life of a person suffering from damaged or missing organs, carried out by removing an organ from a donor and transplanting it into the body of the recipient. As human life is of high value in Islamic Sharia, the donor and recipient should undergo an intensive medical examination to remove any health risk associated with the organ and the transplantation procedure. Thus, in carrying out the organ donation process, any violation of a Sharia decree that might cause harm to the human body is strictly prohibited. The researcher concludes that the classical scholars of Islamic Sharia, along with some contemporary scholars, reject the entire concept of organ donation and transplantation; the majority of contemporary scholars, however, support organ donation.

Keywords: law, donation, organ, Kurdistan, sharia

Procedia PDF Downloads 36
9169 Patients’ Rights: An Enquiry into the Activities of Local Psychiatric Centers Managed by Muslims in South-West Nigeria

Authors: Shaykh-Luqman Jimoh

Abstract:

In Nigeria, aside from the eight government hospitals designated as psychiatric hospitals, there are also many local psychiatric centers managed by Muslim and non-Muslim individuals. These centers have been heavily criticized for human rights abuses, and this study is an inquiry into the truth or otherwise of that criticism. The study focuses on the activities of local centers managed by Muslims in South-West Nigeria, with a view to determining the extent to which they uphold or violate their patients' fundamental human rights as guaranteed by Islam. Information about the activities of the centers was collected through oral interviews, and both descriptive and analytical methods were used. The study revealed that while some activities of the local centers managed by Muslims in the study area could be regarded as outright violations of patients' fundamental human rights, others, in view of the rationale behind them, may not necessarily constitute such violations as hitherto painted, except where excesses are committed. Using an Islamic paradigm, the study therefore suggests general measures that could be taken to improve the activities of the centers.

Keywords: local psychiatric centers, Muslim exorcists, patients' rights, South-West Nigeria

Procedia PDF Downloads 506
9168 Prevalence and Antimicrobial Susceptibility of Thermophilic Campylobacter Strains Isolated from Humans and Poultry in Batna

Authors: Baali Mohamed

Abstract:

Campylobacter species are among the most common causes of human bacterial gastroenteritis in many countries, and poultry meat is considered a major source of human campylobacteriosis. This study was conducted, on the one hand, to determine the prevalence of infection with thermotolerant Campylobacter both in broiler flocks and in humans and to study their antibiotic sensitivity, and on the other hand, to compare two methods of isolating thermotolerant Campylobacter: the passive filtration technique and selective isolation on Karmali medium. The study examined 310 samples, 260 of avian origin and 50 of human origin, during the period from June 2011 to March 2012. Detection of thermotolerant Campylobacter was conducted according to the ISO 10272 standard. The results show that 66% (95% CI: 60-72%) of avian samples were contaminated with thermotolerant Campylobacter (172/260). The antibiotic susceptibility study revealed that all strains (100%) were resistant to ampicillin and amoxicillin/clavulanic acid, 90% to erythromycin, 66.3% to tetracycline, 53.3% to chloramphenicol, and 46.7% to enrofloxacin; however, no resistance to gentamicin was noted. In the human samples, three strains of thermotolerant Campylobacter were detected, a contamination rate of 6%. Statistical analysis using the chi-square (χ2) test showed that Campylobacter infection exhibited seasonal variation with a summer peak (p < 0.05) and was not influenced by flock size.
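The chi-square comparison behind the seasonal finding can be illustrated on a small contingency table. The counts below are invented for demonstration (season by infection status), not the study's data:

```python
# Chi-square statistic for a contingency table; counts are hypothetical.
def chi_square(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            exp = rows[i] * cols[j] / total   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# rows: summer, winter; columns: positive, negative flocks (invented counts)
table = [[60, 20], [40, 40]]
print(round(chi_square(table), 3))  # ≈ 10.667, well above the 1-df 5% cutoff of 3.841
```

A statistic this far above the critical value is what would justify reporting a significant summer peak at p < 0.05.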

Keywords: thermotolerant Campylobacter, broiler, humans, Karmali medium

Procedia PDF Downloads 401
9167 Error Analysis of Pronunciation of French by Sinhala Speaking Learners

Authors: Chandeera Gunawardena

Abstract:

The present research analyzes the pronunciation errors encountered by thirty Sinhala-speaking learners of French, on the assumption that such errors are systematic and reflect the interference of the learners' native language. The thirty participants were selected using a random sampling method. At the time of the study, the subjects were studying French as a foreign language for their Bachelor of Arts degree at the University of Kelaniya, Sri Lanka. The participants came from a homogeneous linguistic background: all speak the same native language (Sinhala), completed their secondary education in the Sinhala medium, and learnt French as a foreign language during that period. A battery-operated audio tape recorder and 120-minute blank cassettes were used for recording. A list of 60 words representing all French phonemes was used to diagnose pronunciation difficulties. Before recording commenced, the subjects were asked to familiarize themselves with the words by reading them several times. Recording was conducted individually in a quiet classroom, with each session taking approximately fifteen minutes, and each subject was required to read at normal speed. After recording, the material was replayed to identify common errors, which were immediately transcribed using the International Phonetic Alphabet. The results show that Sinhala-speaking learners have problems with French nasal vowels and French initial consonant clusters. The learners also exhibit errors arising from the interference of their second language (English).

Keywords: error analysis, pronunciation difficulties, pronunciation errors, Sinhala-speaking learners of French

Procedia PDF Downloads 214
9166 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches to uncertainty representation and propagation exist in the literature; different representation approaches yield different outputs, and some approaches may estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed a set of challenge problems about uncertainty quantification. This study addresses Subproblem A, the uncertainty characterization subproblem: the challenge is to gather knowledge about unknown model inputs, which carry inherent aleatory and epistemic uncertainties, from the responses (outputs) of a given computational model. We approach the problem with two methodologies: first, sampling-based uncertainty propagation with first-order error analysis, and second, Percentile-Based Optimization (PBO). Subproblem A is constructed so that both aleatory and epistemic uncertainties must be managed, and it classifies each uncertain parameter as one of three types: (i) an aleatory uncertainty modeled as a random variable with a fixed functional form and known coefficients; this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter that is aleatory but for which sufficient data are not available to model it adequately as a single random variable; for example, the parameters of a normal variable, such as the mean and standard deviation, may not be precisely known but may be assumed to lie within some intervals. This third case results in a distributional p-box: the physical parameter carries aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainty, each being an unknown element of a known interval; this uncertainty is reducible. From the study, it is observed that, owing to practical limitations and computational expense, the sampling is not exhaustive in the sampling-based methodology, which therefore has a high probability of underestimating the output bounds. An optimization-based strategy for converting uncertainty described by interval data into a probabilistic framework is thus necessary; in this study, that is achieved using PBO.
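The underestimation risk of sampling can be seen on a toy model. Below, y = a + b has an aleatory input a (known normal) and an epistemic input b (interval only); because the model is monotone in b, evaluating at the interval endpoints bounds whatever the random sampling produces. The model, interval, and sample sizes are invented; PBO itself is a more general optimization than this endpoint shortcut:

```python
import random

# Toy contrast: sampling-based propagation vs. optimizing over the epistemic
# interval, for the monotone model y = a + b. Values are illustrative only.
random.seed(42)

def model(a, b):
    return a + b

b_lo, b_hi = 2.0, 3.0                                   # epistemic interval for b
a_draws = [random.gauss(0.0, 1.0) for _ in range(2000)]  # aleatory samples of a

# Sampling-based propagation: also sample b uniformly inside its interval.
samples = [model(a, random.uniform(b_lo, b_hi)) for a in a_draws]
samp_lo, samp_hi = min(samples), max(samples)

# Optimization-style propagation: the interval endpoints of b bound the output.
opt_lo = min(model(a, b_lo) for a in a_draws)
opt_hi = max(model(a, b_hi) for a in a_draws)

print(opt_lo <= samp_lo and samp_hi <= opt_hi)  # True: sampling understates the bounds
```

The sampled bounds always sit inside the optimized bounds here, which is the abstract's point: non-exhaustive sampling tends to understate output ranges, motivating the optimization-based treatment of interval data.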

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 244
9165 Expectation for Professionalism Affects Reality Shock: A Qualitative and Quantitative Study of Reality Shock among New Human Service Professionals

Authors: Hiromi Takafuji

Abstract:

It is a well-known fact that health care and welfare are the foundation of human activities, and human service professionals such as nurses and child care workers support these activities; the COVID-19 pandemic has made the severity of their working environments even more widely known. It is therefore high time to discuss the work of human service professionals for the sustainable development of the human environment. Early turnover has long been recognized as an issue in these fields: in Japan, the attrition rate within three years of graduation for these occupations has remained at about 40% for more than 20 years. One contributing factor is reality shock (RS), the stress caused by the gap between pre-employment expectations and the post-employment reality experienced by new workers. The purpose of this study was to elucidate the mechanism of RS among human service professionals and to contribute to countermeasures against it. First, to explore the structure of the relationship between professionalism and workers' RS, an exploratory interview survey was conducted and analyzed by text mining and content analysis; the results showed that expectations of professionalism influence RS as pre-employment job expectations. Next, those expectations were quantified and categorized, and the responses of a total of 282 human service professionals (nurses, child care workers, and caregivers) were retained for data analysis. The data were analyzed using exploratory factor analysis, confirmatory factor analysis, multiple regression analysis, and structural equation modeling. The results revealed that a self-control orientation and an orientation toward authority by qualification had direct, positive, significant impacts on RS, whereas an interpersonal helping orientation and an altruistic orientation had direct negative and indirect positive significant impacts on RS. We were thus able to clarify the structure of work expectations that affect the RS of welfare professionals, which previous studies had not clarified. We also discuss the limitations, practical implications, and directions for future research.

Keywords: human service professional, new hire turnover, SEM, reality shock

Procedia PDF Downloads 102
9164 Simulation IDM for Schedule Generation of Slip-Form Operations

Authors: Hesham A. Khalek, Shafik S. Khoury, Remon F. Aziz, Mohamed A. Hakam

Abstract:

Slipforming operations are linear, which is a source of planning complications, and the operation can be subjected to bottlenecks at any point, so careful planning is required for success. Discrete-event simulation (DES) concepts can be applied to simulate and analyze construction operations and to efficiently support construction scheduling. Nevertheless, preparing input data for construction simulation is challenging, time-consuming, and an error-prone, human-dependent step. Therefore, to enhance the benefits of using DES in construction scheduling, this study proposes an integrated module that establishes a framework for automating schedule generation and decision support for slip-form construction projects, particularly during the project feasibility study phase, by exchanging data between project data stored in an intermediate database, the DES engine, and scheduling software. Using the stored information, the proposed system creates construction task attributes (e.g., activity durations, material quantities, and resource amounts), and the DES engine then uses this information to generate a proposed construction schedule automatically. This research demonstrates a flexible approach to slip-form project modeling, rapid scenario-based planning, and schedule generation that may interest both practitioners and researchers.
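The scheduling core of such a DES module can be illustrated with a minimal, hypothetical sketch. Activity names and durations below are invented; the paper's actual module reads these attributes from the intermediate database and runs in EZStrobe, whereas this is only a toy future-event-list simulation of sequential slip-form lifts:

```python
import heapq

def simulate_slipform(tasks):
    """Minimal discrete-event simulation of sequential slip-form lifts.

    tasks: list of (name, duration_hours). The slip-form rig is a single
    resource, so each lift starts only when the previous one finishes
    (the linearity constraint of slip-forming).
    Returns (schedule, makespan) with schedule = [(start, finish, name)].
    """
    fel = []                  # future event list: (time, seq, name)
    clock, schedule = 0.0, []
    for seq, (name, duration) in enumerate(tasks):
        heapq.heappush(fel, (clock + duration, seq, name))
        finish, _, done = heapq.heappop(fel)   # advance to next event
        schedule.append((clock, finish, done))
        clock = finish
    return schedule, clock

# Hypothetical activity durations (hours), for illustration only
schedule, makespan = simulate_slipform(
    [("setup", 4.0), ("lift-1", 6.0), ("lift-2", 6.5)])
```

A real model would add parallel crews, material-delivery bottlenecks, and stochastic durations; the event-list mechanism stays the same.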

Keywords: discrete-event simulation, modeling, construction planning, data exchange, scheduling generation, EZstrobe

Procedia PDF Downloads 382
9163 Survey of the Role of Contextualism in the Designing of Cultural Constructions Based on Rapoport Views

Authors: E. Zarei, M. Bazaei, A. Seifi, A. Keshavarzi

Abstract:

Amos Rapoport, on the basis of his anthropological approach, held that space originates from the human body and in turn influences it. As a holistic approach in architecture, contextualism describes a family of views in philosophy that emphasize the context in which an action, utterance, or expression occurs and argue that, in some important respect, the action, utterance, or expression can only be understood relative to that context. The main goal of this research, studying the role of the cultural component in shaping contextualist construction on the basis of Rapoport's anthropological approach, is pursued through a descriptive-analytic method. The results indicate that in contextualist design, attending to cultural aspects is as necessary as attending to the physical dimensions of a building. Rapoport believed that the shape of a building is influenced by cultural aspects, and he proposed a mutual interaction between human and environment that should be considered in housing. The main goal of contextual architecture is to establish an interaction among environment, human, and culture, and a desirable design should be in harmony with this approach.

Keywords: Amos Rapoport, anthropology, contextual architecture, culture

Procedia PDF Downloads 403
9162 In-Flight Aircraft Performance Model Enhancement Using Adaptive Lookup Tables

Authors: Georges Ghazi, Magali Gelhaye, Ruxandra Botez

Abstract:

Over the years, the Flight Management System (FMS) has continuously improved its many features, to the point of becoming the pilot's primary interface for flight planning on the airplane. With the assistance of the FMS, the concepts of distance and time have been revolutionized, providing crew members with an optimized route (or flight plan) from the departure airport to the arrival airport. To accomplish this function, the FMS needs an accurate Aircraft Performance Model (APM) of the aircraft. In general, the APMs that equip most modern FMSs are established before an individual aircraft enters service and result from the combination of a set of ordinary differential equations and a set of performance databases. Unfortunately, an aircraft in service is constantly exposed to dynamic loads that degrade its flight characteristics. These degradations have two main origins: airframe deterioration (control surface rigging, seals missing or damaged, etc.) and engine performance degradation (increased fuel consumption for a given thrust). Thus, after several years of service, the performance databases and the APM associated with a specific aircraft are no longer representative enough of the actual aircraft performance. It is important to monitor the trend of performance deterioration and correct the uncertainties of the aircraft model in order to improve the accuracy of the flight management system's predictions. The basis of this research lies in the new ability to continuously update an APM during flight using an adaptive lookup table technique. This methodology was developed and applied to the well-known Cessna Citation X business aircraft. For the purpose of this study, a Level D Research Aircraft Flight Simulator (RAFS) was used as the test aircraft. According to the Federal Aviation Administration, Level D is the highest certification level for flight dynamics modeling. Using data available in the Flight Crew Operating Manual (FCOM), a first APM describing the variation of engine fan speed and aircraft fuel flow with respect to flight conditions was derived. This model was then improved using the proposed methodology. To do so, several cruise flights were performed using the RAFS. An algorithm was developed to frequently sample the aircraft sensor measurements during flight and compare the model predictions with the actual measurements. Based on these comparisons, a correction was applied to the current APM in order to minimize the error between the predicted and measured data. In this way, as the aircraft flies, the APM is continuously enhanced, making the FMS more and more precise and the prediction of trajectories more realistic and more reliable. The results obtained are very encouraging. Indeed, using tables initialized with the FCOM data, only a few iterations were needed to reduce the fuel flow prediction error from an average relative error of 12% to 0.3%. Similarly, the maximum error deviation of the FCOM prediction of engine fan speed was reduced from 5.0% to 0.2% after only ten flights.
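The adaptive lookup-table idea can be sketched in a few lines: a table cell (indexed here by hypothetical altitude and Mach bins, with invented fuel-flow values) is nudged toward each in-flight measurement by a gain factor, so repeated samples at the same flight condition converge on the observed performance. The actual paper's table structure, interpolation, and update law are richer than this:

```python
import numpy as np

# Hypothetical fuel-flow table (kg/h) indexed by altitude and Mach bins
alt_bins = np.array([30000.0, 35000.0, 40000.0])   # ft
mach_bins = np.array([0.70, 0.75, 0.80])
table = np.array([[1500.0, 1550.0, 1620.0],
                  [1400.0, 1450.0, 1510.0],
                  [1320.0, 1360.0, 1410.0]])

def update_cell(table, alt, mach, measured, gain=0.3):
    """Nudge the nearest table cell toward the measured fuel flow."""
    i = int(np.abs(alt_bins - alt).argmin())
    j = int(np.abs(mach_bins - mach).argmin())
    error = measured - table[i, j]
    table[i, j] += gain * error          # proportional correction
    return table[i, j]

# Repeated samples at one flight condition converge on the measurement
for _ in range(20):
    update_cell(table, 35000.0, 0.75, measured=1480.0)
```

After these updates the (35000 ft, Mach 0.75) cell sits essentially at the measured 1480 kg/h while untouched cells keep their FCOM-derived values, which is the essence of in-flight model enhancement.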

Keywords: aircraft performance, cruise, trajectory optimization, adaptive lookup tables, Cessna Citation X

Procedia PDF Downloads 267
9161 Towards Creative Movie Title Generation Using Deep Neural Models

Authors: Simon Espigolé, Igor Shalyminov, Helen Hastie

Abstract:

Deep machine learning techniques, including deep neural networks (DNNs), have been used to model language and dialogue for conversational agents performing tasks such as giving technical support, as well as general chit-chat. They have been shown to be capable of generating long, diverse, and coherent sentences in end-to-end dialogue systems and natural language generation. However, these systems tend to imitate their training data and will only generate concepts and language within the scope of what they have been trained on. This work explores how deep neural networks can be used for a task that would normally require human creativity: reading a movie description and/or watching the movie and coming up with a compelling, interesting title. This task differs from simple summarization in that the movie title may not be derivable from the content or semantics of the movie description. Here, we train a type of DNN called a sequence-to-sequence model (seq2seq) that takes as input a short textual movie description and some additional information, e.g. the genre of the movie, and learns to output a movie title. The idea is that the DNN will learn techniques and approaches that the human movie titler may deploy but that may not be immediately obvious to the human eye. To give an example of a generated movie title, for the synopsis 'A hitman concludes his legacy with one more job, only to discover he may be the one getting hit', the original, true title is 'The Driver' and the one generated by the model is 'The Masquerade'. A human evaluation was conducted in which the DNN output was compared to the true human-generated title, as well as a number of baselines, on three 5-point Likert scales: 'creativity', 'naturalness', and 'suitability'. Subjects were also asked which of the two systems they preferred. The scores of the DNN model were comparable to those of the human-generated movie titles, with means of m=3.11 and m=3.12, respectively. There is room for improvement in these models, as they were rated significantly less 'natural' and 'suitable' than the human titles. In addition, the human-generated title was preferred overall 58% of the time when pitted against the DNN model. These results are nevertheless encouraging given the comparison with well-crafted, carefully considered human-generated movie titles. Movie titles go through a rigorous process of assessment by experts and focus groups who have watched the movie; this process is in place because of the large amount of money at stake and the importance of creating an effective title that captures the audience's attention. Our work shows progress towards automating this process, which in turn may lead to a better understanding of creativity itself.
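The generation side of a seq2seq model can be sketched independently of training: at each step the decoder emits a distribution over the vocabulary, and greedy decoding picks the most probable token until an end-of-sequence marker. In the sketch below the trained network is replaced by a hard-coded stub distribution, and the six-word vocabulary and all probabilities are invented purely for illustration:

```python
VOCAB = ["<eos>", "the", "masquerade", "driver", "last", "job"]

def stub_decoder_step(prefix):
    """Stand-in for the trained decoder. A real seq2seq model would
    condition on the encoded description and the tokens emitted so far;
    here the 'distribution' over VOCAB is hard-coded for illustration."""
    table = {(): [0.0, 0.8, 0.1, 0.05, 0.03, 0.02],
             ("the",): [0.1, 0.0, 0.6, 0.3, 0.0, 0.0]}
    # For any other prefix the stub strongly favours ending the title
    return table.get(tuple(prefix), [0.9, 0.02, 0.02, 0.02, 0.02, 0.02])

def greedy_decode(max_len=6):
    """Greedy decoding: repeatedly pick the highest-probability token."""
    prefix = []
    for _ in range(max_len):
        probs = stub_decoder_step(prefix)
        token = VOCAB[max(range(len(VOCAB)), key=probs.__getitem__)]
        if token == "<eos>":
            break
        prefix.append(token)
    return " ".join(prefix)

title = greedy_decode()   # "the masquerade" with this stub
```

Swapping greedy selection for beam search or sampling is the usual way to trade off fluency against the diversity that a creative-titling task needs.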

Keywords: creativity, deep machine learning, natural language generation, movies

Procedia PDF Downloads 330
9160 An Application of Vector Error Correction Model to Assess Financial Innovation Impact on Economic Growth of Bangladesh

Authors: Md. Qamruzzaman, Wei Jianguo

Abstract:

Over recent decades, it has been observed that financial development, through financial innovation, not only accelerates the development of an efficient and effective financial system but also acts as a catalyst in the economic development process. In this study, we explore how financial innovation drives economic growth in Bangladesh by using a Vector Error Correction Model (VECM) for the period 1990-2014. The cointegration test confirms the existence of a long-run association between financial innovation and economic growth. To investigate directional causality, we apply the Granger causality test; the estimation shows that long-run growth is affected by capital flow from non-bank financial institutions and by inflation in the economy, whereas changes in the growth rate do not have any long-run impact on capital flow or the level of inflation. Growth and market capitalization, as well as market capitalization and capital flow, confirm the feedback hypothesis. Variance decomposition suggests that any innovation in the financial sector can cause fluctuations in GDP in both the long run and the short run. Financial innovation, by promoting efficiency and reducing the cost of financial transactions, can boost the economic development process. The study proposes two policy recommendations. First, an innovation-friendly financial policy should be formulated to encourage the adoption and diffusion of financial innovation in the financial system. Second, the operation of the financial and capital markets should be regulated through rules and regulations that create a conducive environment.
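The error-correction mechanics behind a VECM can be illustrated in its simplest two-variable form (the Engle-Granger two-step procedure) on synthetic cointegrated series; the paper itself estimates a full multivariate VECM, so this is only a sketch of the underlying idea, with all data simulated:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500

# Synthetic cointegrated pair: x is a random walk, y tracks 2*x
x = np.cumsum(rng.normal(size=T))
y = 2.0 * x + rng.normal(scale=0.5, size=T)

# Step 1 (Engle-Granger): long-run regression y_t = beta * x_t + u_t
beta = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
ect = y - beta * x                      # error-correction term

# Step 2: short-run dynamics with the lagged error-correction term
#   dy_t = alpha * ect_{t-1} + gamma * dx_t + noise
dy, dx = np.diff(y), np.diff(x)
coef = np.linalg.lstsq(np.column_stack([ect[:-1], dx]), dy, rcond=None)[0]
alpha, gamma = coef                     # alpha < 0: adjustment to equilibrium
```

A negative, significant `alpha` is what signals the long-run association: deviations from the cointegrating relation are corrected over time, exactly the property the VECM exploits for the 1990-2014 series.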

Keywords: financial innovation, economic growth, GDP, financial institution, VECM

Procedia PDF Downloads 275
9159 The Use of Bleomycin and Analogues to Probe the Chromatin Structure of Human Genes

Authors: Vincent Murray

Abstract:

The chromatin structure at the transcription start sites (TSSs) of genes is very important in the control of gene expression. For gene expression to occur, the chromatin structure at the TSS has to be altered so that the transcriptional machinery can be assembled and RNA transcripts can be produced. In particular, the nucleosome structure and positioning around the TSS have to change. Bleomycin is utilized as an anti-tumor agent to treat Hodgkin's lymphoma, squamous cell carcinoma, and testicular cancer. Bleomycin produces DNA damage in human cells, and DNA strand breaks, especially double-strand breaks, are thought to be responsible for its cancer chemotherapeutic activity. Bleomycin is a large glycopeptide with a molecular weight of approximately 1500 Daltons, and hence its DNA strand cleavage activity can be utilized as a probe of chromatin structure. In this project, Illumina next-generation DNA sequencing technology was used to determine the positions of DNA double-strand breaks at the TSSs of genes in intact cells. In this genome-wide study, it was found that bleomycin cleavage occurred preferentially at the TSSs of actively transcribed human genes in comparison with non-transcribed genes. There was a correlation between the level of enhanced bleomycin cleavage at TSSs and the degree of transcriptional activity. In addition, bleomycin was able to determine the positions of nucleosomes at the TSSs of human genes. Bleomycin analogues were also utilized as probes of chromatin structure at the TSSs of human genes. In a similar manner to bleomycin, the analogues 6′-deoxy-BLM Z and zorbamycin cleaved preferentially at the TSSs of human genes. Interestingly, this degree of enhanced TSS cleavage inversely correlated with the cytotoxicity (IC50 values) of the analogues. This indicated that the degree of cleavage by bleomycin analogues at the TSSs of human genes is very important for the cytotoxicity of bleomycin and its analogues. It also provided a deeper insight into the mechanism of action of this cancer chemotherapeutic agent, since actively transcribed genes were preferentially targeted.
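The central counting step in such an analysis, tallying how many mapped cleavage positions fall within a fixed window of each TSS, can be sketched as follows. The coordinates below are invented toy values; a real pipeline would work per chromosome and strand from aligned sequencing reads:

```python
import bisect

def breaks_near_tss(break_positions, tss_list, window=1000):
    """Count double-strand-break positions within +/-window of each TSS.

    break_positions: sorted genomic coordinates of mapped cleavage sites
    tss_list: transcription start site coordinates
    Returns {tss: count}.
    """
    counts = {}
    for tss in tss_list:
        lo = bisect.bisect_left(break_positions, tss - window)
        hi = bisect.bisect_right(break_positions, tss + window)
        counts[tss] = hi - lo
    return counts

# Invented toy coordinates: an "active" TSS with clustered breaks
# near 10 kb, and a "silent" TSS at 50 kb with none
breaks = sorted([10_050, 10_120, 10_400, 10_900, 55_000, 90_300])
counts = breaks_near_tss(breaks, tss_list=[10_000, 50_000])
```

Comparing such per-TSS counts between actively transcribed and non-transcribed gene sets, normalised for sequencing depth, is what underlies the enhanced-cleavage correlation reported above.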

Keywords: anti-cancer activity, chromatin structure, cytotoxicity, gene expression, next-generation DNA sequencing

Procedia PDF Downloads 120
9158 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece

Authors: Panagiotis Karadimos, Leonidas Anthopoulos

Abstract:

Predicting the actual cost and duration of construction projects is a continuing problem for the construction sector. This paper addresses the problem with modern methods and data available from past public construction projects. Thirty-nine bridge projects constructed in Greece, with a similar type of available data, were examined. Considering each project's attributes together with the actual cost and actual duration, correlation analysis was performed and the most appropriate predictive project variables were defined. Additionally, the most efficient subgroup of variables was selected with the WEKA application, through its attribute selection function. The selected variables were then used as input neurons for neural network models, which were constructed with the FANN Tool application. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was also based on the budgeted cost and the quantity of deck concrete.
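The kind of small feed-forward model used here, two inputs mapped to one continuous output, can be sketched in NumPy on synthetic data. The two inputs below merely stand in for normalised budgeted cost and deck-concrete quantity; the paper itself trained its networks with the FANN Tool, so this is only an illustration of the training loop, not a reproduction of its models:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in data: two normalised inputs -> cost-like target
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = 0.7 * X[:, 0] + 0.3 * X[:, 1] + 0.05 * rng.normal(size=200)

# One hidden layer of 4 tanh units, trained by batch gradient descent
W1, b1 = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(scale=0.5, size=4), 0.0

lr = 0.1
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2
    err = pred - y
    mse = float(np.mean(err ** 2))
    # Backpropagation of the mean-squared-error gradient
    gpred = 2.0 * err / len(y)
    gh = np.outer(gpred, W2) * (1.0 - h ** 2)
    W2 -= lr * (h.T @ gpred); b2 -= lr * gpred.sum()
    W1 -= lr * (X.T @ gh);    b1 -= lr * gh.sum(axis=0)
```

With clean inputs the loop drives `mse` toward the noise floor, mirroring the very small errors reported for the bridge models once the two informative inputs are chosen.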

Keywords: actual cost and duration, attribute selection, bridge construction, neural networks, predicting models, FANN TOOL, WEKA

Procedia PDF Downloads 138
9157 Human Trafficking and Terrorism: A Study on the Security Challenges Imposed upon Countries in Conflict

Authors: Christopher Holroyd

Abstract:

With the various terrorist organizations and drug cartels currently active, countries around the world face a myriad of security concerns. Organizations that attack others through terror, such as the Islamic State of Iraq and the Levant (ISIS), recognize no boundaries when it comes to doing what is needed to fulfill their intent. For countries such as Iraq, which has been trying to rebuild since the fall of the Saddam Hussein regime, organizations such as Al-Qaeda and ISIS have impeded efforts toward peace and stability. One method utilized by terrorist organizations around the world is human trafficking. Modern slavery is still practiced by those who have no concern for human decency and morality; their only concern is to achieve their goals by any means. Understandably, some people may never have heard of 'modern slavery', or may not believe it is an issue in today's world. Organizations such as ISIS are not the only ones that seek to benefit from the immoral trading of humans. Various drug cartels, such as those in Mexico and Central America, have recently begun to take part in the trade, moving humans from state to state or country to country to fuel their overall operations. This makes the possibility of human trafficking more real for those in the United States because of the cartels' proximity to the country's southern border. An issue that at one time might have been seen only as a distant threat is now close to home. These two examples show why human trafficking is utilized by various organizations around the world. This trade in human beings and violation of basic human rights is a plague that affects the entire world, not just those in a country other than one's own. The security issues that stem from the trade include the movement and recruitment of members of these organizations. With individuals smuggled from one location to another in secrecy, those trying to combat the trade are at a disadvantage: it becomes difficult to estimate accurately the number of potential recruits, combatants, and other individuals working against the host nation and for the mission of the cartel or terrorist organization they are part of. An uphill battle is created, and the goals of peace and stability become harder to reach. Aside from the security aspects, it cannot be forgotten that those being traded and forced into slavery are there against their will. Families are separated; children are trained to be fighters, or worse. This makes the goal of eradicating human trafficking all the more dire and important.

Keywords: human trafficking, reconstruction, security, terrorism

Procedia PDF Downloads 137
9156 A Comparative Study of Optimization Techniques and Models to Forecasting Dengue Fever

Authors: Sudha T., Naveen C.

Abstract:

Dengue is a serious public health issue that imposes significant annual economic and welfare burdens on nations. Enhanced optimization techniques and quantitative modeling approaches, however, can predict the incidence of dengue, and by advocating a data-driven approach, public health officials can make informed decisions, thereby improving the overall effectiveness of outbreak control efforts. This study uses environmental data from two U.S. federal agencies, the National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention. Based on environmental data describing changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step involves preparing the data, including handling outliers and missing values, so that the data are ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. In the third phase, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, model performance is evaluated using Mean Squared Error (MSE), Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE) as metrics. The goal of optimization is to select the strategy with the fewest errors, lowest cost, greatest productivity, or best potential results; optimization is widely employed in industries including engineering, science, management, mathematics, finance, and medicine. An effective optimization method based on Harmony Search and an integrated Genetic Algorithm is introduced for input feature selection, and it yields an important improvement in predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
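The genetic-algorithm side of the feature-selection phase can be sketched as a search over binary feature masks scored by hold-out error. Everything below is synthetic and simplified (no Harmony Search component, invented data and hyperparameters); it only shows the select/crossover/mutate loop the abstract refers to:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 300, 8
X = rng.normal(size=(n, p))
# Only features 0 and 3 actually drive the (synthetic) target
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(scale=0.1, size=n)

def fitness(mask):
    """Hold-out MSE of a least-squares fit on the selected features,
    plus a small penalty per selected feature."""
    if not mask.any():
        return np.inf
    Xs = X[:, mask]
    w = np.linalg.lstsq(Xs[:200], y[:200], rcond=None)[0]
    return np.mean((y[200:] - Xs[200:] @ w) ** 2) + 0.01 * mask.sum()

pop = rng.integers(0, 2, size=(20, p)).astype(bool)
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[:10]]          # truncation selection
    kids = []
    for _ in range(10):
        a, b = parents[rng.integers(10, size=2)]
        cut = int(rng.integers(1, p))
        child = np.concatenate([a[:cut], b[cut:]])  # one-point crossover
        kids.append(child ^ (rng.random(p) < 0.1))  # bit-flip mutation
    pop = np.vstack([parents] + kids)

best = pop[int(np.argmin([fitness(m) for m in pop]))]
```

The best surviving mask keeps the two informative features; in the study the same loop would score masks by the error of the downstream regressor (e.g. the Huber Regressor) instead of plain least squares.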

Keywords: deep learning model, dengue fever, prediction, optimization

Procedia PDF Downloads 70
9155 The Quranic Case for Resurrection

Authors: Maira Farooq Maneka

Abstract:

Death has increasingly led humans to investigate its reality and what, if anything, lies after it, with personal conviction and concern. To date it remains a matter of speculation. When justifying claims about life after death (LAD), major world religions rarely offer arguments other than 'faith', as it is an unseen phenomenon. This paper analyses the Islamic idea of resurrection after death and its justification, which is distinct from faith and instead contemplative in nature. To do this, a legal lens was adopted, which allowed selected Quranic arguments to be categorised under the headings of direct evidence, indirect evidence, and intuitive reasoning. Four kinds of direct evidence are discussed, under the themes of sleep, droughts, predictions, and the Quranic challenge. The section on indirect evidence narrows its scope to two of the many broad signs that point towards the reality of resurrection: the signs found in nature, such as the sun and water, and the signs one finds within the human body, such as the creation and function of human fingertips. The final section examines the Quran's appeal to human rationality, which facilitates the reader in accepting the possibility of resurrection and hence a final Day of Judgement; this includes the notions of accountability, pleasure, pain, and human agency.

Keywords: Islam, life after death, Quran, resurrection

Procedia PDF Downloads 99
9154 Prediction of Boundary Shear Stress with Flood Plains Enlargements

Authors: Spandan Sahu, Amiya Kumar Pati, Kishanjit Kumar Khatua

Abstract:

Rivers, our main source of water, are a form of open-channel flow, and flow in an open channel presents many complex phenomena that need to be tackled, such as critical flow conditions, boundary shear stress, and depth-averaged velocity. The development of society depends, more or less, on the flow of rivers, which are major sources of sediments and specific ingredients essential for human beings. During floods, part of the flow is carried by the main channel and the rest by the flood plains. For such compound asymmetric channels, the flow structure becomes complicated due to momentum exchange between the main channel and the adjoining flood plains. The distribution of boundary shear among subsections captures this momentum transfer across the interface between the main channel and the flood plains. Obtaining accurate experimental data is difficult because of the complexity of the problem. Hence, the CES software has been used to determine the shear stresses at different sections of an open channel having asymmetric flood plains on both sides of the main channel, and the results are compared with those for symmetric flood plains for various geometrical shapes and flow conditions. Error analysis is also performed to establish the degree of accuracy of the model implemented.
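A first-order feel for subsection boundary shear comes from the uniform-flow relation tau = rho * g * R * S applied separately to the main channel and a flood plain in a divided-channel view; the invented geometry below is purely illustrative, and CES computes far more than this (depth-averaged velocities, lateral shear, and interface momentum exchange):

```python
RHO, G = 1000.0, 9.81    # water density (kg/m^3), gravity (m/s^2)

def subsection_shear(area, wetted_perimeter, slope):
    """Mean boundary shear stress tau = rho * g * R * S (Pa),
    with hydraulic radius R = A / P for one subsection."""
    R = area / wetted_perimeter
    return RHO * G * R * slope

# Hypothetical compound section with bed slope S = 0.001:
# main channel 5 m wide x 2 m deep, one flood plain 8 m wide x 0.5 m deep
tau_main = subsection_shear(area=5 * 2.0,
                            wetted_perimeter=5 + 2 * 2.0, slope=1e-3)
tau_fp = subsection_shear(area=8 * 0.5,
                          wetted_perimeter=8 + 0.5, slope=1e-3)
```

The deeper main channel carries the higher mean shear, and the gap between the two values is what the interface momentum-exchange terms in models like CES redistribute.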

Keywords: depth average velocity, non prismatic compound channel, relative flow depth, velocity distribution

Procedia PDF Downloads 180
9153 Human Health Risk Assessment from Metals Present in a Soil Contaminated by Crude Oil

Authors: M. A. Stoian, D. M. Cocarta, A. Badea

Abstract:

The main sources of soil pollution by petroleum contaminants are industrial processes involving crude oil. Soil polluted with crude oil is toxic to plants, animals, and humans. Human exposure to contaminated soil occurs through different exposure pathways: soil ingestion, diet, inhalation, and dermal contact. The present study focuses on soil contamination with heavy metals as a consequence of soil pollution with petroleum products. The human exposure pathways considered are accidental ingestion of contaminated soil and dermal contact. The purpose of the paper is to identify the human health (carcinogenic) risk from soil contaminated with heavy metals. Human exposure and risk were evaluated for five of the eleven contaminants of concern identified in the soil. Two soil samples were collected from a bioremediation platform in the Muntenia Region of Romania; the soil deposited on the platform had been contaminated through oil extraction and processing. For the research work, two average soil samples from two different plots were analyzed: the first was slightly contaminated with petroleum products (Total Petroleum Hydrocarbons (TPH) in soil was 1420 mg/kg d.w.), while the second was highly contaminated (TPH in soil was 24306 mg/kg d.w.). In order to evaluate the risks posed by heavy metals due to soil pollution with petroleum products, five metals known to be carcinogenic were investigated: arsenic (As), cadmium (Cd), chromium VI (Cr VI), nickel (Ni), and lead (Pb). Results of the chemical analysis performed on the contaminated soil samples show heavy-metal contamination as follows: As in Site 1 = 6.96 mg/kg d.w.; As in Site 2 = 11.62 mg/kg d.w.; Cd in Site 1 = 0.9 mg/kg d.w.; Cd in Site 2 = 1 mg/kg d.w.; Cr VI was 0.1 mg/kg d.w. for both sites; Ni in Site 1 = 37.00 mg/kg d.w.; Ni in Site 2 = 42.46 mg/kg d.w.; Pb in Site 1 = 34.67 mg/kg d.w.; Pb in Site 2 = 120.44 mg/kg d.w. These concentrations exceed the normal values established in Romanian regulation but are smaller than the alert level for a less sensitive (industrial) use of soil. Although the concentrations do not exceed the thresholds, the next step was to assess the human health risk posed by soil contamination with these heavy metals. The results were compared with the acceptable risk level (10⁻⁶, according to the World Health Organization). As expected, the highest risk was identified for the soil with the higher degree of contamination: the Individual Risk (IR) was 1.11×10⁻⁵, compared with 8.61×10⁻⁶ for the less contaminated soil.
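Risk figures of this kind typically come from the standard US EPA-style soil-ingestion equation, ILCR = CDI × SF. The sketch below uses commonly cited default exposure parameters and the widely used oral slope factor for arsenic (1.5 (mg/kg-day)⁻¹) purely for illustration; the paper's own parameter choices may differ:

```python
def ingestion_risk(c_soil, slope_factor, ing_rate=100.0, ef=350.0,
                   ed=30.0, bw=70.0, at=70.0 * 365.0):
    """Incremental lifetime cancer risk from accidental soil ingestion.

    c_soil:       contaminant concentration in soil (mg/kg)
    slope_factor: oral cancer slope factor ((mg/kg-day)^-1)
    ing_rate:     soil ingestion rate (mg/day)
    ef, ed:       exposure frequency (days/yr) and duration (yr)
    bw, at:       body weight (kg) and averaging time (days)
    """
    # Chronic daily intake (mg/kg-day); 1e-6 converts mg soil to kg
    cdi = (c_soil * ing_rate * 1e-6 * ef * ed) / (bw * at)
    return cdi * slope_factor

# Arsenic at Site 2 (11.62 mg/kg) with the assumed oral slope factor
risk_as = ingestion_risk(11.62, slope_factor=1.5)
```

With these default parameters the arsenic-ingestion risk lands in the 10⁻⁵ range, the same order as the Individual Risk reported above and above the 10⁻⁶ acceptability threshold.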

Keywords: carcinogenic risk, heavy metals, human health risk assessment, soil pollution

Procedia PDF Downloads 429