Search results for: tasks
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1471

331 The Effect of Homework on Raising Educational Attainment in Mathematics

Authors: Yousef M. Abd Algani Mr.

Abstract:

Since the mid-1950s, students have been required to do homework. The literature shows the importance of homework to teachers, parents, and students on one hand; on the other, it exposes the emotional, social, and family problems caused by unintentionally large quantities of homework, difficult tasks, a lack of explanation from the teacher, and the type of parental involvement (Coutts, 2004). The present study is motivated by the importance of mathematics homework for student achievement in this field. One of the main goals of education systems across OECD countries is to develop independent learners who are able to direct themselves, an aim expressed mainly in homework preparation. Doing homework independently is a skill required of students throughout their years of study (Hong, Millgram and Rowell, 2001). This study aims to expose and examine students' perceptions of homework in mathematics in junior high schools (7th-10th grades) in the Arab population of northern Israel, and the impact of homework on raising student achievement in mathematics. To address the problem of homework in the study of mathematics, we pose two main questions: (1) What are the attitudes of Arab middle school students in Israel towards the use of homework in mathematics? (2) What is the effect of accompanying home exercises on raising educational attainment in mathematics in Arab schools in northern Israel? The study population comprises: (1) 500 students chosen from junior high schools in northern Israel, to examine the attitudes of Arab middle school students towards the use of homework in mathematics, and (2) 180 students, to examine the effect of accompanying homework on raising educational attainment at the lower levels of thinking in Bloom's taxonomy (knowledge, comprehension, and application) in mathematics in Arab schools in northern Israel.
(a) The researcher used a quantitative approach to examine the attitudes of Arab middle school students in Israel towards the use of homework in mathematics. (b) The researcher used an experimental approach with a pre- and post-test semi-experimental design for two experimental groups (Campbell, 1963) to examine the effect of accompanying homework on raising educational attainment in mathematics in Arab schools in northern Israel.

Keywords: attitude, educational attainment, homework, mathematics

Procedia PDF Downloads 130
330 A Comparative Study of Specific Assessment Criteria Related to Commercial Vehicle Drivers

Authors: Nur Syahidatul Idany Abdul Ghani, Rahizar Ramli, Jamilah Mohamad, Ahmad Saifizul, Mohamed Rehan Karim

Abstract:

The increase in road accident fatalities in Malaysia over the last 10 years is alarming. According to the Malaysian Institute of Road Safety Research (MIROS) study 'Predicting Malaysian Road Fatalities for Year 2020', road fatalities in Malaysia are predicted to reach 8,780 in 2015 and 10,716 in 2020, with 30 percent of fatalities caused by accidents involving commercial vehicles. The government, related agencies, and NGOs have worked continuously and persistently to reduce these figures through enforcement, public education, driver training, road safety campaigns, advertisements, etc. However, the casualty trend shows no encouraging pattern and instead grows steadily. This comparative study therefore reviews the literature on the measurement methods used to evaluate commercial drivers' competency. Across studies, driving competency has been assessed with different instruments, depending on the licensing procedures and requirements of each country's regulations. The assessment criteria established for commercial drivers generally focus on driving tasks and assessments such as theory tests, medical tests, and road assessments, rather than on driving competency or physical tests. Recognizing the importance of specific competency tests for drivers, this study reviews the most frequently discussed literature on competency assessment methods covering (1) judgement and reaction, (2) driver skill, and (3) experience and fatigue. The concluding analysis of this paper is a comparative table of assessment methodologies for evaluating drivers' competency.
The comparative discussion of past literature provides an overview of existing assessment tests and identifies potential subject matters for further study, raising awareness among drivers, passengers, and the authorities of the importance of competent drivers for improving the safety of commercial vehicles.

Keywords: commercial vehicles, driver’s competency, specific assessment

Procedia PDF Downloads 431
329 On Cloud Computing: A Review of the Features

Authors: Assem Abdel Hamed Mousa

Abstract:

The Internet of Things probably already influences your life, and if it doesn't, it soon will, say computer scientists. Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by many people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives. Alan Kay of Apple calls this "Third Paradigm" computing. Ubiquitous computing is essentially the term for human interaction with computers in virtually everything, and it is roughly the opposite of virtual reality. Where virtual reality puts people inside a computer-generated world, ubiquitous computing forces the computer to live out here in the world with people. Virtual reality is primarily a horsepower problem; ubiquitous computing is a very difficult integration of human factors, computer science, engineering, and social sciences. The approach: activate the world. Provide hundreds of wireless computing devices per person per office, of all scales (from 1" displays to wall-sized). This has required new work in operating systems, user interfaces, networks, wireless, displays, and many other areas. We call this work "ubiquitous computing". It is different from PDAs, dynabooks, or information at your fingertips: it is invisible, everywhere computing that does not live on a personal device of any sort but is in the woodwork everywhere. The initial incarnation of ubiquitous computing was in the form of "tabs", "pads", and "boards" built at Xerox PARC, 1988-1994. Several papers describe this work, and there are web pages for the Tabs and for the Boards (which are now a commercial product). Ubiquitous computing will drastically reduce the cost of digital devices and tasks for the average consumer.
With labor-intensive components such as processors and hard drives housed in the remote data centers powering the cloud, and with pooled resources giving individual consumers the benefits of economies of scale, consumers may pay monthly fees, similar to a cable bill, for services that feed into their phones.

Keywords: internet, cloud computing, ubiquitous computing, big data

Procedia PDF Downloads 371
328 The Role of Situational Factors in User Experience during Human-Robot Interaction

Authors: Da Tao, Tieyan Wang, Mingfu Qin

Abstract:

While social robots have been increasingly developed and rapidly applied in our daily life, how robots should interact with humans remains an urgent problem to explore. Appropriate interactive behavior is likely to create a good user experience in human-robot interaction situations, which in turn can improve people's acceptance of robots. This paper systematically and quantitatively examined the effects of several important situational factors (i.e., interaction distance, interaction posture, and feedback style) on user experience during human-robot interaction. A three-factor mixed-design experiment was adopted, in which subjects interacted with a social robot under combinations of varied interaction distance, interaction posture, and feedback style. Data on users' behavioral performance, subjective perceptions, and eye movement measures were collected and analyzed by repeated measures analysis of variance. The results showed that the three situational factors had no effect on behavioral performance in tasks during human-robot interaction. Interaction distance and feedback style yielded significant main effects and interaction effects on the proportion of fixation times. The proportion of fixation times on the robot was higher for negative feedback than for positive feedback. While the proportion of fixation times on the robot generally decreased as interaction distance increased, it decreased more under the positive feedback style than under the negative feedback style. In addition, there were significant interaction effects on pupil diameter between interaction distance and posture: as interaction distance increased, mean pupil diameter became smaller in side interaction but larger in frontal interaction. Moreover, the three situational factors had significant interaction effects on user acceptance of the interaction mode.
The findings help clarify the underlying mechanisms of user experience in human-robot interaction situations and provide important implications for the design of robot behavioral expression and for optimal strategies to improve user experience during human-robot interaction.

Keywords: social robots, human-robot interaction, interaction posture, interaction distance, feedback style, user experience

Procedia PDF Downloads 113
327 Algorithm for Improved Tree Counting and Detection through Adaptive Machine Learning Approach with the Integration of Watershed Transformation and Local Maxima Analysis

Authors: Jigg Pelayo, Ricardo Villar

Abstract:

The Philippines has long been considered a valuable producer of high-value crops globally. The country's employment and economy depend on agriculture, increasing the demand for efficient agricultural mechanization. Remote sensing and geographic information technology have proven effective for precision agriculture through image-processing techniques, given the development of aerial scanning technology in the country. Accurate information on the spatial correlation within the field is very important, especially for precision farming of high-value crops. The availability of height information and high-spatial-resolution images obtained from aerial scanning, together with the development of new image analysis methods, is influencing precision agriculture techniques and applications. In this study, an algorithm was developed and implemented to detect and count high-value crops simultaneously through adaptive scaling of a support vector machine (SVM) algorithm subjected to an object-oriented approach combining watershed transformation and a local maxima filter to enhance tree counting and detection. The methodology is compared to cutting-edge template matching procedures to demonstrate its effectiveness on a demanding tree counting, recognition, and delineation problem. Since common data and image processing techniques are utilized, the algorithm can be easily implemented in production processes covering large agricultural areas. It is tested on high-value crops such as palm, mango, and coconut located in Misamis Oriental, Philippines, showing good performance, particularly for young adult and adult trees, with accuracy significantly above 90%. The outputs support inventory or database updating, allowing for the reduction of field work and manual interpretation tasks.
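The combination of local maxima detection and watershed delineation described above can be illustrated with a minimal sketch (this is not the authors' adaptive-SVM pipeline; the canopy height model input, the smoothing, the 5x5 peak window, and the 2 m height threshold are illustrative assumptions):

```python
import numpy as np
from scipy import ndimage

def count_trees(chm, min_height=2.0, window=5):
    """Count tree tops in a canopy height model (CHM) via local maxima,
    then delineate crowns by watershed flooding from those tops."""
    smoothed = ndimage.gaussian_filter(chm, sigma=1)
    # A pixel is a candidate tree top if it equals the local maximum
    # in a window x window neighbourhood and is tall enough.
    local_max = smoothed == ndimage.maximum_filter(smoothed, size=window)
    peaks = local_max & (smoothed > min_height)
    markers, n_trees = ndimage.label(peaks)
    # Invert heights so tree tops become basins for the watershed.
    inverted = ((smoothed.max() - smoothed) * 100).astype(np.uint16)
    crowns = ndimage.watershed_ift(inverted, markers)
    return n_trees, crowns
```

On a synthetic CHM with two well-separated Gaussian canopies, the function reports two trees and labels every pixel with a crown.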

Keywords: high value crop, LiDAR, OBIA, precision agriculture

Procedia PDF Downloads 390
326 Memory Retrieval and Implicit Prosody during Reading: Anaphora Resolution by L1 and L2 Speakers of English

Authors: Duong Thuy Nguyen, Giulia Bencini

Abstract:

The present study examined structural and prosodic factors in the computation of antecedent-reflexive relationships and sentence comprehension by native English speakers (L1) and Vietnamese-English bilinguals (L2). Participants read sentences presented on a computer screen in one of three presentation formats aimed at manipulating prosodic parsing: word-by-word (RSVP), phrase-segment (self-paced), or whole-sentence (self-paced), then completed a grammaticality rating and a comprehension task (following Pratt & Fernandez, 2016). The design crossed three factors: syntactic structure (simple; complex), grammaticality (target-match; target-mismatch), and presentation format. An example item is provided in (1): (1) The actress that (Mary/John) interviewed at the awards ceremony (about two years ago/organized outside the theater) described (herself/himself) as an extreme workaholic. Results showed that, overall, both L1 and L2 speakers made use of a good-enough processing strategy at the expense of more detailed syntactic analyses. L1 and L2 speakers' comprehension and grammaticality judgements were negatively affected by the most prosodically disruptive condition (word-by-word). However, the two groups differed in their performance in the other two reading conditions. For L1 speakers, the whole-sentence and phrase-segment formats were both facilitative in the grammaticality rating and comprehension tasks; for L2 speakers, compared with the whole-sentence condition, the phrase-segment paradigm did not significantly improve accuracy or comprehension. These findings are consistent with those of Pratt & Fernandez (2016), who found a similar pattern of results in the processing of subject-verb agreement relations using the same experimental paradigm and prosodic manipulation with L1 English and L2 English-Spanish speakers.
The results provide further support for a Good-Enough cue model of sentence processing that integrates cue-based retrieval and implicit prosodic parsing (Pratt & Fernandez, 2016) and highlights similarities and differences between L1 and L2 sentence processing and comprehension.

Keywords: anaphora resolution, bilingualism, implicit prosody, sentence processing

Procedia PDF Downloads 140
325 The Relationship between Functional Movement Screening Test and Prevalence of Musculoskeletal Disorders in Emergency Nurse and Emergency Medical Services Staff Shiraz, Iran, 2017

Authors: Akram Sadat Jafari Roodbandi, Alireza Choobineh, Nazanin Hosseini, Vafa Feyzi

Abstract:

Introduction: Physical fitness and optimal functional movement are essential for performing job tasks efficiently, without fatigue and injury. Functional Movement Screening (FMS) tests are used to screen athletes and military forces. Due to the nature of their jobs, nurses and emergency medical staff must perform many physical activities, such as transporting patients and CPR. This study aimed to assess the relationship between FMS test score and the prevalence of musculoskeletal disorders (MSDs) in emergency nurses and emergency medical services (EMS) staff. Methods: 134 male and female emergency nurses and EMS technicians participated in this cross-sectional, descriptive-analytical study. After a video tutorial and practical training on how to perform the FMS test, the participants carried out the test wearing comfortable clothes. The final FMS score ranges from 0 to 21; under the FMS protocol, a score of 14 or less indicates weak functional movement. In addition to a demographic data questionnaire, the Nordic musculoskeletal questionnaire was completed for each participant. SPSS software was used for statistical analysis with a significance level of 0.05. Results: In total, 49.3% (n=66) of the subjects were female. The mean age and work experience of the subjects were 35.3 ± 8.7 and 11.4 ± 7.7 years, respectively. The highest prevalence of MSDs was observed at the knee (32.8%, n=44) and lower back (23.1%, n=31). 26 (19.4%) health workers had an FMS test score of 14 or less. A Spearman correlation test showed that the FMS test score was significantly associated with MSDs (r=-0.419, p < 0.0001); that is, MSDs increased as the FMS test score decreased. Age, sex, and MSDs remained significant factors in a linear regression model with FMS test score as the dependent variable.
Conclusion: FMS test seems to be a usable screening tool in pre-employment and periodic medical tests for occupations that require physical fitness and optimum functional movements.

Keywords: functional movement, musculoskeletal disorders, health care worker, screening test

Procedia PDF Downloads 118
324 Good Corporate Governance and Accountability in Microfinance Institutions

Authors: A. R. Nor Azlina, H. Salwana, I. Zuraeda, A. R. Rashidah, O. Normah

Abstract:

The transition towards globalization in the business environment has driven essential changes: competition, business strategy, technological innovation, and the effect of societal trends on the adoption of corporate governance are seen as drivers of the future. These transformations in the business environment have a significant impact on organizations' performance. Many organizations demand more proactive entrepreneurs with dynamic teams who can run and steer their businesses to success. Changes in strategy, roles, tasks, and entrepreneurial skills, together with the implementation of corporate governance in relationship development, are important for enhancing an organization's performance, making it more cost-efficient and subsequently increasing its efficiency. Small and medium enterprises (SMEs) in most developing countries contribute to the economic growth of a nation. However, the potential of microfinance institutions (MFIs) to contribute to SME development is often overlooked, and the adoption of corporate governance and accountability in MFIs as a driving force for these SMEs is not incorporated in measurements of organizational performance. This paper attempts to address some of the governance issues associated with dimensions of accountability in improving the performance of microfinance institutions. A qualitative approach was adopted to analyze the collected data; this approach contributes to understanding and critiquing accountability processes, as well as to addressing the concerns of practitioners and policymakers. Close researcher engagement with the field, attention to process, embrace of situational complexity, and critical, reflective understanding of organizational phenomena remain hallmarks of the tradition.
It is concluded that the understanding of managerial behavior, organizational factors, and macro-economic relationships in SME firms needs to be improved; this is also the case in MFIs. A framework is developed to explore the linkage of corporate governance and accountability issues related to entrepreneurship as factors affecting MFI performance in the face of the ongoing transformation of organizational performance within Malaysian SME industries.

Keywords: accountability, corporate governance, microfinance, organization performance

Procedia PDF Downloads 376
323 Internal Audit Function Contributions to the External Audit

Authors: Douglas F. Prawitt, Nathan Y. Sharp, David A. Wood

Abstract:

Consistent with prior experimental and survey studies, we find that internal audit functions (IAFs) that spend more time directly assisting the external auditor are associated with lower external audit fees. Interestingly, we do not find evidence that external auditors reduce fees based on work previously performed by the IAF. We also find that time spent assisting the external auditor has a greater negative effect on external audit fees than time spent performing tasks upon which the auditor may rely but that are not performed as direct assistance to the external audit. Our results also show that proxies previously used to measure this relation are either not associated with, or negatively associated with, our direct measures of how the IAF can contribute to the external audit, and are highly positively associated with the size and complexity of the organization. Thus, we conclude that the disparate experimental and archival results may be attributable to issues surrounding the construct validity of measures used in previous archival studies, and that when measures similar to those used in experimental studies are employed in archival tests, the archival results are consistent with experimental findings. Our research makes four primary contributions to the literature. First, we provide evidence that internal auditing contributes to a reduction in external audit fees. Second, we replicate and explain why previous archival studies find that internal auditing has either no association with external audit fees or is associated with an increase in those fees: prior studies generally use proxies of internal audit contribution that do not adequately capture the intended construct. Third, our research expands on survey-based research (e.g., Oil Libya sh.co.) by separately examining the fee impact of the internal auditors' work directly assisting external auditors and of the internal auditors' prior work upon which external auditors can rely.
Finally, we extend prior research by using a new, independent data source to validate and extend prior studies. This data set also allows for a larger sample for examining the impact of internal auditing on the external audit fee, and for a more comprehensive external audit fee model that better controls for the determinants of the external audit fee.

Keywords: internal audit, contribution, external audit, function

Procedia PDF Downloads 107
322 Multi-Agent System Based Solution for Operating Agile and Customizable Micro Manufacturing Systems

Authors: Dylan Santos De Pinho, Arnaud Gay De Combes, Matthieu Steuhlet, Claude Jeannerat, Nabil Ouerhani

Abstract:

The Industry 4.0 initiative addresses major challenges related to ever-smaller batch sizes. End-user demand for highly customized products requires highly adaptive production systems in order to maintain the efficiency of shop floors. Most classical software solutions that operate manufacturing processes on a shop floor are based on rigid Manufacturing Execution Systems (MES), which are not capable of adapting the production order on the fly to changing demands and/or conditions. In this paper, we present a highly modular and flexible solution to orchestrate a set of production systems composed of a micro-milling machine tool, a polishing station, a cleaning station, a part inspection station, and a rough material store. The stations are installed in a novel matrix configuration on a 3x3 vertical shelf. The cells of the shelf are connected by horizontal and vertical rails on which a set of shuttles circulates to transport the machined parts from one station to another. Our software solution for orchestrating the tasks of each station is based on a multi-agent system: each station and each shuttle is operated by an autonomous agent, and all agents communicate with a central agent that holds all the information about the manufacturing order. The core innovation of this paper lies in the path planning of the shuttles, with two major objectives: 1) reduce the waiting time of stations, and thus the cycle time of the entire part, and 2) reduce disturbances such as shuttle-generated vibration, which strongly impact the manufacturing process and thus the quality of the final part. Simulation results show that part cycle time is reduced by up to 50% compared with MES-operated linear production lines, while disturbance is systematically avoided for critical stations such as the milling machine tool.
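The kind of shuttle routing such an orchestration layer needs can be sketched as breadth-first search on the shelf's rail grid, with cells near a vibration-sensitive station treated as blocked. This is a hypothetical illustration, not the authors' planner; the grid coordinates and blocked set are assumptions:

```python
from collections import deque

def plan_route(grid_size, start, goal, blocked=frozenset()):
    """Shortest shuttle route on a rail grid via BFS, avoiding blocked
    cells (e.g. around a vibration-sensitive milling station)."""
    rows, cols = grid_size
    frontier = deque([(start, (start,))])
    seen = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return list(path)
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if (0 <= nr < rows and 0 <= nc < cols
                    and step not in seen and step not in blocked):
                seen.add(step)
                frontier.append((step, path + (step,)))
    return None  # goal unreachable with the current blocked set
```

For example, routing a shuttle across a 3x3 shelf while keeping it away from the central cell yields the shortest detour around that cell.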

Keywords: multi-agent systems, micro-manufacturing, flexible manufacturing, transfer systems

Procedia PDF Downloads 120
321 The Fragility of Sense: The Twofold Temporality of Embodiment and Its Role for Depression

Authors: Laura Bickel

Abstract:

This paper investigates to what extent Merleau-Ponty's philosophy of body memory serves as a viable resource for the enactive approach to cognitive science and its first-person, experience-based research on 'recurrent depressive disorder', coded F33 in ICD-10. In pursuit of this goal, the analysis begins by revisiting the neuroreductive paradigm, which biological psychiatry uses to explain the condition of vital contact in terms of underlying neurophysiological mechanisms. It is demonstrated that the neuroreductive model cannot sufficiently account for the depressed person's episodic withdrawal in causal terms: the analysis of the irregular loss of vital resonance requires integrating the body as the subject of experience and its phenomenological time. It is then shown that the enactive approach to depression as disordered sense-making is a promising alternative. The enactive model of perception implies that living beings do not register pre-existing meaning 'out there' but unfold 'sense' in their action-oriented response to the world. For the enactive approach, Husserl's passive synthesis of inner time consciousness is fundamental for what becomes perceptually present for action. It seems intuitive to bring the enactive approach to depression together with the long-standing view in phenomenological psychopathology that explains the loss of vital contact by appealing to the disruption of the temporal structure of consciousness. However, this paper argues that the disruption of the temporal structure is not conceptually justified. Instead, one may integrate Merleau-Ponty's concept of the past as the unconscious into the enactive approach to depression. From this perspective, the living being's experiential and biological past inserts itself in the form of habit and bodily skills and ensures action-oriented responses to the environment.
Finally, it is concluded that the depressed person’s withdrawal indicates the impairment of this application process. The person suffering from F33 cannot actualize sedimented meaning to respond to the valences and tasks of a given situation.

Keywords: depression, enactivism, neuroreductionism, phenomenology, temporality

Procedia PDF Downloads 121
320 Constructing a Semi-Supervised Model for Network Intrusion Detection

Authors: Tigabu Dagne Akal

Abstract:

While advances in computer and communications technology have made networks ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks, or intrusions, start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases process model, which starts with the selection of the datasets. The dataset used in this study was taken from the Massachusetts Institute of Technology Lincoln Laboratory. The data were then pre-processed; the major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating labelled and unlabelled datasets, dimensionality reduction, size reduction, and data transformation tasks such as discretization. A total of 21,533 intrusion records were used for training the models, and a separate 3,397 records were used as a testing set to validate the performance of the selected model. To build a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation with the J48 decision tree algorithm and default parameter values showed the best classification accuracy, with a prediction accuracy of 96.11% on the training dataset and 93.2% on the test dataset for classifying new instances into the normal, DOS, U2R, R2L, and probe classes.
The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are proposed towards developing an applicable system in the area of study.
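The classification step can be illustrated with a minimal Gaussian Naïve Bayes written from scratch. This is a sketch of the Naïve Bayes idea only, not the study's J48 or Naïve Bayes implementations or its Lincoln Laboratory data; the feature values below are synthetic:

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian Naive Bayes: fit per-class feature means and
    variances, then predict the class with the highest log posterior."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.theta_, self.var_, self.prior_ = [], [], []
        for c in self.classes_:
            Xc = X[y == c]
            self.theta_.append(Xc.mean(axis=0))
            self.var_.append(Xc.var(axis=0) + 1e-9)  # variance smoothing
            self.prior_.append(len(Xc) / len(X))
        return self

    def predict(self, X):
        scores = []
        for m, v, p in zip(self.theta_, self.var_, self.prior_):
            # Log-likelihood of each sample under this class's Gaussians,
            # assuming feature independence, plus the class log prior.
            ll = -0.5 * np.sum(np.log(2 * np.pi * v) + (X - m) ** 2 / v, axis=1)
            scores.append(ll + np.log(p))
        return self.classes_[np.argmax(scores, axis=0)]
```

Trained on two well-separated synthetic clusters labelled "normal" and "dos", the classifier recovers the labels almost perfectly.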

Keywords: intrusion detection, data mining, computer science

Procedia PDF Downloads 287
319 Alternative Approach to the Machine Vision System Operating for Solving Industrial Control Issue

Authors: M. S. Nikitenko, S. A. Kizilov, D. Y. Khudonogov

Abstract:

The paper considers an approach to a machine vision operating system combined with a grid of light markers. The approach is used to solve several scientific and technical problems, such as measuring the capability of an apron feeder delivering coal from a longwall return port to a conveyor while mining thick coal seams discharged onto a conveyor, and prototyping an obstacle detection system for an autonomous vehicle. Primary verification of a method for calculating bulk material volume using three-dimensional modeling was carried out, with validation in laboratory conditions and calculation of relative errors. A method is proposed for calculating the capability of an apron feeder based on a machine vision system, together with a simplified three-dimensional model of the examined measuring area. The proposed method makes it possible to measure the volume of rock mass moved by an apron feeder using machine vision, and thus to control the volume of coal produced by the feeder, with accuracy suitable for practical application, while extracting thick coal with longwall complexes discharging to a conveyor. The developed mathematical apparatus for measuring feeder productivity in kg/s uses only basic mathematical functions: addition, subtraction, multiplication, and division. This simplifies software development and expands the variety of microcontrollers and microcomputers suitable for calculating feeder capability. A feature of the obstacle detection problem is that obstacles distort the laser grid, which simplifies their detection. The paper presents algorithms for video camera image processing and for controlling an autonomous vehicle model based on an obstacle detection machine vision system. A sample fragment of obstacle detection at the moment of laser grid distortion is demonstrated.
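The "basic functions only" claim can be illustrated with a hypothetical mass-flow calculation from a vision-derived height grid. The cell geometry, feeder speed, and bulk density parameters below are assumptions for illustration, not the paper's calibrated apparatus:

```python
def feeder_capability_kg_per_s(height_grid_m, cell_area_m2,
                               speed_m_per_s, section_length_m,
                               density_kg_per_m3):
    """Mass flow on an apron feeder: sum cell heights into a volume over
    the measured section, then scale by feeder speed and bulk density.
    Uses only addition, multiplication, and division, so it runs on
    modest microcontrollers."""
    volume_m3 = 0.0
    for row in height_grid_m:
        for h in row:
            volume_m3 = volume_m3 + h * cell_area_m2
    return density_kg_per_m3 * volume_m3 / section_length_m * speed_m_per_s
```

A 2x2 grid of 0.5 m heights over 0.25 m² cells gives 0.5 m³ on a 1 m section; at 0.2 m/s and 1000 kg/m³ that is 100 kg/s.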

Keywords: machine vision, machine vision operating system, light markers, measuring capability, obstacle detection system, autonomous transport

Procedia PDF Downloads 101
318 Assessment Literacy Levels of Mathematics Teachers to Implement Classroom Assessment in Ghanaian High Schools

Authors: Peter Akayuure

Abstract:

One key determinant of the quality of mathematics learning is the teacher’s ability to assess students adequately and effectively and make assessment an integral part of the instructional practices. If the mathematics teacher lacks the required literacy to perform classroom assessment roles, the true trajectory of learning success and attainment of curriculum expectations might be indeterminate. It is therefore important that educators and policymakers understand and seek ways to improve the literacy level of mathematics teachers to implement classroom assessments that would meet curriculum demands. This study employed a descriptive survey design to explore perceived levels of assessment literacy of mathematics teachers to implement classroom assessment with the school based assessment framework in Ghana. A 25-item classroom assessment inventory on teachers’ assessment scenarios was adopted, modified, and administered to a purposive sample of 48 mathematics teachers from eleven Senior High Schools. Seven other items were included to further collect data on their self-efficacy towards assessment literacy. Data were analyzed using descriptive and bivariate correlation statistics. The result shows that, on average, 48.6% of the mathematics teachers attained standard levels of assessment literacy. Specifically, 50.0% met standard one in choosing appropriate assessment methods, 68.3% reached standard two in developing appropriate assessment tasks, 36.6% reached standard three in administering, scoring, and interpreting assessment results, 58.3% reached standard four in making appropriate assessment decisions, 41.7% reached standard five in developing valid grading procedures, 45.8% reached standard six in communicating assessment results, and 36.2 % reached standard seven by identifying unethical, illegal and inappropriate use of assessment results. 
Participants rated their self-efficacy in performing assessments as high; nevertheless, the relationship between participants’ assessment literacy scores and self-efficacy scores was weak and statistically non-significant. The study recommends that institutions training mathematics teachers or providing professional development should accentuate assessment literacy development to ensure standard assessment practices and quality instruction in mathematics education at senior high schools.
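The bivariate correlation analysis mentioned above can be illustrated with a minimal sketch. The paired scores below are invented for illustration only (they are not the study's data); the sketch simply shows how a weak Pearson correlation between literacy and self-efficacy scores would be computed and tested.

```python
import numpy as np

# Hypothetical paired scores for a handful of teachers: assessment literacy
# (out of 25 inventory items) and self-efficacy (uniformly high self-ratings).
literacy = np.array([14, 18, 11, 20, 9, 16, 13, 17])
efficacy = np.array([6, 5, 6, 7, 6, 5, 7, 6])

# Pearson correlation coefficient via the off-diagonal of the correlation matrix.
r = np.corrcoef(literacy, efficacy)[0, 1]

# A t-statistic for H0: rho = 0, on n - 2 degrees of freedom.
n = len(literacy)
t = r * np.sqrt((n - 2) / (1 - r**2))
print(f"r = {r:.3f}, t = {t:.3f}")
```

With high, near-ceiling self-ratings the correlation tends toward zero, which mirrors the weak, non-significant relationship the study reports.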

Keywords: assessment literacy, mathematics teacher, senior high schools, Ghana

Procedia PDF Downloads 123
317 On or Off-Line: Dilemmas in Using Online Teaching-Learning in In-Service Teacher Education

Authors: Orly Sela

Abstract:

The lecture discusses a Language Teaching program in a Teacher Education College in northern Israel. An online course was added to the program in order to keep on-campus attendance to a minimum, thus allowing the students to keep their full-time jobs in school. In addition, the use of educational technology to allow students to study anytime, anywhere, in keeping with 21st-century innovative teaching-learning practices, was also a consideration, as was the wish for this course to serve as a model that the students could then possibly use in their K-12 teaching. On the other hand, there were strong considerations against including an online course in the program. The students in the program were mostly Israeli-Arab married women with young children, living in a traditional society which places a strong emphasis on the role of the woman as wife, mother, and homemaker. In addition, as teachers, they spent much of their free time on school-related tasks. Having careers while studying was ground-breaking for these women, and using their time at home for studying rather than taking care of their families may have been simply too much to ask of them. At the end of the course, feedback was collected through an online questionnaire including both open and closed questions. The data collected show that the students believed in online teaching-learning in principle but had trouble implementing it in practice. This evidence raised the question of whether such a course should be included in a graduate program for mature, professional students, particularly women with families living in a traditional society. This issue is relevant not only to Israel but to academic institutions worldwide that serve similar populations. The lecture discusses this issue, sharing the researcher’s conclusions with the audience. Based on the evidence offered, it is the researcher’s conclusion that online education should, indeed, be offered to such audiences.
However, the courses should be designed with the students’ special needs in mind, with emphasis placed on initial planning and course organization based on acknowledgment of the teaching context; modeling of online teaching/learning suited for in-service teacher education, and special attention paid to social-constructivist aspects of learning.

Keywords: course design, in-service teacher-education, mature students, online teaching/learning

Procedia PDF Downloads 222
316 Uranoplasty Using Tongue Flap for Bilateral Clefts

Authors: Saidasanov Saidazal Shokhmurodovich, Topolnickiy Orest Zinovyevich, Afaunova Olga Arturovna

Abstract:

Relevance: Bilateral congenital cleft is one of the most complex forms of all clefts, which makes the choice of a surgical method of treatment difficult. During primary operations to close the hard and soft palate, there is a shortage of soft tissue for standard uranoplasty, and this factor complicates the rehabilitation of patients. Materials and methods: The results of surgical treatment of children with bilateral cleft who underwent uranoplasty using a tongue flap were analyzed. Clinical and statistical methods were used, allowing the study tasks to be addressed in line with the principles of evidence-based medicine. Results and discussion: Fifteen patients (9 boys and 6 girls, aged 2.5 to 6 years) who underwent two-stage surgical treatment were studied. The first stage was veloplasty; the second stage was uranoplasty using a tongue flap. In all patients, the width of the cleft ranged from 1.6 to 2.8 cm. All patients in this group were prepared orthodontically. Using this method, the surgeon can achieve the following results: maximum narrowing of the palatopharyngeal ring, a long soft palate, and complete closure of the hard palate and alveolar process; the mucous membrane of the nasal cavity is also sutured, which creates good conditions for the subsequent stage of osteoplastic surgery. On this basis, patients show positive results when working with a speech therapist. In all patients, the dynamics were positive, without complications. Conclusions: Based on our observations, tongue flap uranoplasty is an effective technique for patients with wide clefts of the hard and soft palate.
The use of a tongue flap makes it possible to reduce the number of reoperations and to improve the social adaptation of this group of patients, which is an important stage of rehabilitation. Upon completion of the rehabilitation stages, all patients showed maximum improvement in functional, anatomical, and social indicators.

Keywords: congenital cleft lips and palate, bilateral cleft, child surgery, maxillofacial surgery

Procedia PDF Downloads 109
315 Contextualization and Localization: Acceptability of the Developed Activity Sheets in Science 5 Integrating Climate Change Adaptation

Authors: Kim Alvin De Lara

Abstract:

The research aimed to assess the level of acceptability of the developed activity sheets in Science 5 integrating climate change adaptation, as rated by Grade 5 science teachers in the District of Pililla during school year 2016-2017. In this research, participants were able to recognize and understand the importance of environmental education in improving basic education and of integrating it into lessons through localization and contextualization. The researcher conducted the study to develop a material for use by Grade 5 Science teachers; it also serves as a self-learning resource for students. The respondents of the study were the thirteen Grade 5 teachers teaching Science 5 in the District of Pililla, selected purposively and identified by the researcher. A descriptive research method was utilized. The main instrument was a checklist covering the objectives, content, tasks, contextualization, and localization of the developed activity sheets. The researcher developed a two-week lesson in Science 5 for the fourth quarter based on the curriculum guide, with integration of climate change adaptation. The findings revealed that the majority of respondents are female, 31 years old and above, have taught science for more than ten years, and have units in a master’s degree. With regard to the level of acceptability, the study revealed that the developed activity sheets in Science 5 are very much acceptable. In view of the findings, lessons in Science 5 must be contextualized and localized so that the curriculum responds, conforms, reflects, and is flexible to the needs of learners, especially 21st-century learners, who need to be developed holistically and skillfully. As the findings revealed, it is more acceptable to localize and contextualize the learning materials for pupils. Policy formation and re-organization of the lessons and competencies in Science must be reviewed and re-evaluated.
Lessons in science must also be integrated with climate change adaptation since, nowadays, people are experiencing changes in climate due to global warming and other factors. Through the developed activity sheets, the researcher strongly supports environmental education and believes they serve as a way to instill environmental literacy in students.

Keywords: activity sheets, climate change adaptation, contextualization, localization

Procedia PDF Downloads 313
314 Developing Manufacturing Process for the Graphene Sensors

Authors: Abdullah Faqihi, John Hedley

Abstract:

Biosensors play a significant role in the healthcare sector and in scientific and technological progress. Developing electrodes that are easy to manufacture and deliver better electrochemical performance is advantageous for diagnostics and biosensing. They can be implemented extensively in various analytical tasks such as drug discovery, food safety, medical diagnostics, process control, security and defence, and environmental monitoring. A biosensor is a device that inspects the biological and chemical reactions generated by a biological sample: it carries out biological detection via a linked transducer and converts the biological response into an electrical signal. Stability, selectivity, and sensitivity are the dynamic and static characteristics that dictate the quality and performance of biosensors. In this research, an experimental study of the laser scribing technique for processing graphene oxide (GO) inside a vacuum chamber is presented. The effect of laser scribing on the reduction of GO was investigated under two conditions: atmospheric and vacuum. GO solvent was coated onto a LightScribe DVD, and the laser scribing technique was applied to reduce the GO layers and generate reduced graphene oxide (rGO). The micro-details of the morphological structures of rGO and GO were examined using scanning electron microscopy (SEM) and Raman spectroscopy. The first electrode was a traditional graphene-based electrode model made under normal atmospheric conditions, whereas the second was a graphene electrode fabricated under vacuum using a vacuum chamber. The purpose was to control the vacuum conditions, such as the air pressure and the temperature, during the fabrication process.
The parameters assessed include the layer thickness and the fabrication environment. The results presented show high accuracy and repeatability, achieved at low production cost.

Keywords: laser scribing, lightscribe DVD, graphene oxide, scanning electron microscopy

Procedia PDF Downloads 105
313 Identification of Architectural Design Error Risk Factors in Construction Projects Using IDEF0 Technique

Authors: Sahar Tabarroki, Ahad Nazari

Abstract:

The design process is one of the key processes in construction projects. Although architects have the responsibility to produce complete, accurate, and coordinated documents, architectural design is accompanied by many errors. A design error occurs when the constraints and requirements of the design are not satisfied. Errors are potentially costly and time-consuming to correct if not caught early in the design phase, and they become expensive once they reach the construction documents or the construction phase. The aim of this research is to identify the risk factors of architectural design errors. First, a literature review of the design process was conducted, and a questionnaire was then designed to identify the risks and risk factors. The questionnaire items were based on the “similar service description of study and supervision of architectural works” published by the “Vice Presidency of Strategic Planning & Supervision of I.R. Iran” as the basis of architects’ tasks. Second, the top 10 risks of architectural activities were identified. To determine the positions of possible causes of risks with respect to architectural activities, these activities were located in a design process modeled with the IDEF0 technique. The research was carried out on a case study by checking the design drawings, interviewing its architect and client, and completing a checklist in order to identify concrete examples of architectural design errors. The results revealed that activities such as “defining the current and future requirements of the project”, “studies and space planning,” and “time and cost estimation of the suggested solution” have a higher error risk than others. Moreover, the most important causes include “unclear goals of the client”, “time pressure from the client”, and “architects’ lack of knowledge about the requirements of end-users”.
In detecting errors in the case study, the lack of standards and design criteria, and the lack of coordination among them, was a barrier; nevertheless, “lack of coordination between the architectural design and the electrical and mechanical facilities”, “violation of standard dimensions and sizes in space design”, and “design omissions” were identified as the most important design errors.

Keywords: architectural design, design error, risk management, risk factor

Procedia PDF Downloads 119
312 Three Issues for Integrating Artificial Intelligence into Legal Reasoning

Authors: Fausto Morais

Abstract:

Artificial intelligence has been widely used in law. Programs are able to classify suits, identify decision-making patterns, predict outcomes, and formalize legal arguments. In Brazil, the artificial intelligence system Victor has been classifying cases according to the Supreme Court’s standards. When such programs perform these tasks, they simulate a kind of legal decision and legal argument, raising doubts about how artificial intelligence can be integrated into legal reasoning. Taking this into account, the following three issues are identified: the problem of hypernormatization, the argument of legal anthropocentrism, and artificial legal principles. Hypernormatization can be seen in the Brazilian legal context in the Supreme Court’s use of the Victor program. The program has generated efficiency and consistency; on the other hand, there is a real risk of over-standardizing factual and normative legal features. Legal clerks and programmers should therefore work together to develop an adequate way to model legal language in computational code. If this is possible, intelligent programs may enact legal decisions in easy cases automatically, and here the legal anthropocentrism argument takes place. This argument holds that only human beings should enact legal decisions, because human beings have a conscience, free will, and self-unity. In spite of that, it is possible to argue against the anthropocentrism argument and to show how intelligent programs may work around human shortcomings such as misleading cognition, emotions, and lack of memory. In this way, intelligent machines could pass legal decisions automatically by classification, as Victor does in Brazil, because they are bound by legal patterns and should not deviate from them. Notwithstanding, artificial intelligence programs can be helpful beyond easy cases.
In hard cases, they are able to identify legal standards and legal arguments by using machine learning. For that, a dataset of legal decisions regarding a particular matter must be available, which is a reality in the Brazilian judiciary. By doing so, artificial intelligence programs can support a human decision in hard cases, providing legal standards and arguments based on empirical evidence. These legal features carry argumentative weight in legal reasoning and should serve as references for judges when they must decide whether to maintain or overturn a legal standard.

Keywords: artificial intelligence, artificial legal principles, hypernormatization, legal anthropocentrism argument, legal reasoning

Procedia PDF Downloads 133
311 ESL Material Evaluation: The Missing Link in Nigerian Classrooms

Authors: Abdulkabir Abdullahi

Abstract:

The paper is a pre-use evaluation of grammar activities in three primary English course books (two international primary English course books and one popular Nigerian primary English course book): Cambridge Global English, Collins International Primary English, and Nigeria Primary English – Primary English. Grammar points and grammar activities in the three course books were identified, grouped, and evaluated. The grammar activity most common across the course books, the simple past tense, was chosen for evaluation, and the units presenting simple past tense activities were selected to evaluate the extent to which each course book’s treatment of the simple past tense helps young Nigerian learners of English as a second language, aged 8-11 at level A1 to A2, who lack basic grammatical knowledge, to learn grammar and communicate effectively. A bespoke checklist was devised, through the modification of existing checklists, to evaluate the extent to which the grammar activities promote the communicative effectiveness of Nigerian learners of English as a second language. The results of the evaluation and the analysis of the data reveal that the treatment of grammar, especially of the simple past tense, is evidently insufficient. While Cambridge Global English’s and Collins International Primary English’s treatment of the simple past tense is underpinned by state-of-the-art learning theories, language learning theories, second language learning principles, second language curriculum and syllabus design principles, and grammar learning and teaching theories, the grammar load is low, and the grammar tasks do not promote creative grammar practice sufficiently. Nigeria Primary English – Primary English, on the other hand, treats grammar, including the simple past tense, in the old-fashioned direct way.
The book does not favour the communicative language teaching approach, gives learners no opportunity to notice and discover grammar rules for themselves, and fails to promote creative grammar practice. The research and its findings therefore underscore the need, in EFL and ESL material design and development, to improve grammar content and to increase the grammar activity types that engage learners effectively and promote sufficient creative grammar practice.

Keywords: evaluation, activity, second language, activity-types, creative grammar practice

Procedia PDF Downloads 67
310 Revolutionizing Legal Drafting: Leveraging Artificial Intelligence for Efficient Legal Work

Authors: Shreya Poddar

Abstract:

Legal drafting and revising are recognized as highly demanding tasks for legal professionals. This paper introduces an approach to automate and refine these processes through the use of advanced Artificial Intelligence (AI). The method employs Large Language Models (LLMs), with a specific focus on 'Chain of Thoughts' (CoT) and knowledge injection via prompt engineering. This approach differs from conventional methods that depend on comprehensive training or fine-tuning of models with extensive legal knowledge bases, which are often expensive and time-consuming. The proposed method incorporates knowledge injection directly into prompts, thereby enabling the AI to generate more accurate and contextually appropriate legal texts. This approach substantially decreases the necessity for thorough model training while preserving high accuracy and relevance in drafting. Additionally, the concept of guardrails is introduced. These are predefined parameters or rules established within the AI system to ensure that the generated content adheres to legal standards and ethical guidelines. The practical implications of this method for legal work are considerable. It has the potential to markedly lessen the time lawyers allocate to document drafting and revision, freeing them to concentrate on more intricate and strategic facets of legal work. Furthermore, this method makes high-quality legal drafting more accessible, possibly reducing costs and expanding the availability of legal services. This paper will elucidate the methodology, providing specific examples and case studies to demonstrate the effectiveness of 'Chain of Thoughts' and knowledge injection in legal drafting. The potential challenges and limitations of this approach will also be discussed, along with future prospects and enhancements that could further advance legal work. The impact of this research on the legal industry is substantial. 
The adoption of AI-driven methods by legal professionals can lead to enhanced efficiency, precision, and consistency in legal drafting, thereby altering the landscape of legal work. This research adds to the expanding field of AI in law, introducing a method that could significantly alter the nature of legal drafting and practice.
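A minimal sketch of the prompt-engineering idea described in this abstract: legal knowledge injected directly into a chain-of-thought prompt, with a simple guardrail check on the generated draft. The clause library, guardrail terms, and function names below are invented for illustration; the paper's actual prompts, guardrails, and LLM are not specified here, and a real system would send the assembled prompt to a language model.

```python
# Sketch: knowledge injection + chain-of-thought prompt assembly, plus a
# guardrail filter. All names and rules are hypothetical stand-ins.

KNOWLEDGE_BASE = {
    "confidentiality": "Each party shall keep Confidential Information secret.",
    "termination": "Either party may terminate on 30 days' written notice.",
}

GUARDRAILS = ["shall", "party"]  # required terms, standing in for legal checks


def build_prompt(task: str, topics: list[str]) -> str:
    """Assemble a CoT prompt with relevant clauses injected as context."""
    injected = "\n".join(f"- {KNOWLEDGE_BASE[t]}" for t in topics)
    return (
        f"Relevant clauses:\n{injected}\n\n"
        f"Task: {task}\n"
        "Think step by step: identify the governing clause, adapt it to the "
        "task, then draft the final text."
    )


def passes_guardrails(draft: str) -> bool:
    """Reject drafts missing the required legal terms."""
    return all(term in draft.lower() for term in GUARDRAILS)


prompt = build_prompt("Draft a termination clause.", ["termination"])
draft = "Either party shall give 30 days' written notice to terminate."
print(passes_guardrails(draft))
```

The point of the sketch is that the relevant knowledge travels inside the prompt rather than inside fine-tuned weights, and that guardrails operate as a cheap post-hoc filter on the output.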

Keywords: AI-driven legal drafting, legal automation, future of legal work, large language models

Procedia PDF Downloads 45
309 Interruption Overload in an Office Environment: Hungarian Survey Focusing on the Factors that Affect Job Satisfaction and Work Efficiency

Authors: Fruzsina Pataki-Bittó, Edit Németh

Abstract:

On the one hand, new technologies and communication tools improve employee productivity and accelerate information and knowledge transfer; on the other hand, information overload and continuous interruptions make it ever harder to concentrate at work. It is a great challenge for companies to find the right balance, while there is also an ongoing demand to recruit and retain talented employees who are able to adopt the modern work style and use modern communication tools effectively. For this reason, this research does not focus on objective measures of office interruptions but aims to find the disruption factors that influence the comfort and job satisfaction of employees and the way they feel at work in general. The focus of this research is on how employees feel about the different types of interruptions: which ones they themselves identify as hindering factors, and which they experience as stress factors. By identifying and then reducing these destructive factors, job satisfaction can reach a higher level and employee turnover can be reduced. During the research, we collected information from in-depth interviews and questionnaires asking about the work environment, the communication channels used in the workplace, individual communication preferences, factors considered disruptive, and individual steps taken to avoid interruptions. The questionnaire was completed by 141 office workers from several types of workplaces based in Hungary. Even though 66 respondents work at Hungarian offices of multinational companies, the research concerns the characteristics of the Hungarian labor force. The most important result of the research shows that while more than one third of the respondents consider office noise a disturbing factor, personal inquiries are welcome and considered useful, even if in such cases the work environment is not convenient for solving tasks that require concentration.
When office sizes are analyzed, the share of respondents who consider office noise a disturbing factor is, surprisingly, lower in open-space environments than in smaller office rooms. Opinions are more diverse regarding information and communication technologies. In addition to the interruption factors affecting employees' job satisfaction, the research also examines the role of offices in the 21st century.

Keywords: information overload, interruption, job satisfaction, office environment, work efficiency

Procedia PDF Downloads 221
308 Exploring Pre-Trained Automatic Speech Recognition Model HuBERT for Early Alzheimer’s Disease and Mild Cognitive Impairment Detection in Speech

Authors: Monica Gonzalez Machorro

Abstract:

Dementia is hard to diagnose because of the lack of early physical symptoms, and early recognition is key to improving patients’ living conditions. Speech technology is considered a valuable biomarker for this challenge. Recent works have utilized conventional acoustic features and machine learning methods to detect dementia in speech, and BERT-like classifiers have reported the most promising performance. One constraint, nonetheless, is that these studies are based either on human transcripts or on transcripts produced by automatic speech recognition (ASR) systems. This research’s contribution is to explore a method that does not require transcriptions to detect early Alzheimer’s disease (AD) and mild cognitive impairment (MCI). This is achieved by fine-tuning a pre-trained ASR model for the downstream early AD and MCI detection tasks. To do so, a subset of the thoroughly studied Pitt Corpus is customized; the subset is balanced for class, age, and gender, and data processing also involves cropping the samples into 10-second segments. For comparison purposes, a baseline model is defined by training and testing a Random Forest with 20 acoustic features extracted using the librosa library in Python: zero-crossing rate, MFCCs, spectral bandwidth, spectral centroid, root mean square, and short-time Fourier transform. The baseline model achieved 58% accuracy. To fine-tune HuBERT as a classifier, an average pooling strategy is employed to merge the 3D representations of the audio into 2D representations, and a linear layer is added. The pre-trained model used is ‘hubert-large-ls960-ft’. Empirically, the number of epochs selected is 5, and the batch size is 1. Experiments show that the proposed method reaches a 69% balanced accuracy. This suggests that the linguistic and speech information encoded in the self-supervised ASR-based model is able to capture acoustic cues of AD and MCI.
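The pooling-plus-linear-head step described above can be sketched in plain NumPy. The shapes are illustrative (HuBERT-large's hidden size is 1024; the frame count stands in for roughly 10 s of audio), and the weights are random stand-ins for the fine-tuned classification layer, not the paper's trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for HuBERT's output on one batch: (batch, time_frames, hidden).
batch, frames, hidden = 4, 499, 1024
features = rng.standard_normal((batch, frames, hidden))

# Average pooling over the time axis: 3D (batch, time, hidden) -> 2D (batch, hidden).
pooled = features.mean(axis=1)

# Added linear head for the binary AD/MCI-vs-control decision.
W = rng.standard_normal((hidden, 2)) * 0.01
b = np.zeros(2)
logits = pooled @ W + b

# Softmax over the two classes gives per-segment class probabilities.
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
print(pooled.shape, probs.shape)
```

Averaging over time discards frame ordering but yields a fixed-size vector per 10-second segment, which is what makes a single linear layer sufficient as the classifier on top of the frozen or fine-tuned encoder.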

Keywords: automatic speech recognition, early Alzheimer’s recognition, mild cognitive impairment, speech impairment

Procedia PDF Downloads 115
307 Arginase Enzyme Activity in Human Serum as a Marker of Cognitive Function: The Role of Inositol in Combination with Arginine Silicate

Authors: Katie Emerson, Sara Perez-Ojalvo, Jim Komorowski, Danielle Greenberg

Abstract:

The purpose of this study was to evaluate arginase activity levels in response to combinations of an inositol-stabilized arginine silicate (ASI; Nitrosigine®), L-arginine, and inositol. Arginine acts as a vasodilator that promotes increased blood flow, resulting in enhanced delivery of oxygen and nutrients to the brain and other tissues; ASI alone has been shown to improve performance on cognitive tasks. Arginase, found in human serum, catalyzes the conversion of arginine to ornithine and urea, completing the last step in the urea cycle. Decreasing arginase levels preserves arginine and results in increased nitric oxide production. This study aimed to determine the most effective combination of ASI, L-arginine, and inositol for minimizing arginase levels and thereby maximizing ASI’s effect on cognition. Serum was taken from untreated healthy donors by separation from clotted factors. The arginase activity of serum in the presence or absence of the test products was determined (QuantiChrom™ DARG-100, Bioassay Systems, Hayward, CA). The remaining ultra-filtrated serum units were harvested and used as the source of the arginase enzyme. ASI alone or combined with varied levels of inositol was tested as follows: ASI + inositol at 0.25 g, 0.5 g, 0.75 g, or 1.00 g; L-arginine was also tested as a positive control. All tests elicited changes in arginase activity, demonstrating the efficacy of the method used. Adding L-arginine to serum from untreated subjects, with or without inositol, had only a mild effect. Adding inositol at all levels reduced arginase activity, and adding 0.5 g to the standardized amount of ASI led to the lowest arginase activity compared with the 0.25 g, 0.75 g, or 1.00 g doses of inositol or with L-arginine alone. The outcome of this study demonstrates an interaction between inositol and ASI in their effect on the activity of the enzyme arginase.
We found that neither the maximum nor the minimum amount of inositol tested in this study led to maximal arginase inhibition. Since inhibition of arginase activity is desirable for product formulations seeking to maintain arginine levels, the moderate amount of inositol that inhibited arginase most effectively was deemed preferable. Subsequent studies suggest that this moderate level of inositol in combination with ASI leads to cognitive improvements, including in reaction time, executive function, and concentration.

Keywords: arginine, inositol, arginase, cognitive benefits

Procedia PDF Downloads 99
306 The Oral Production of University EFL Students: An Analysis of Tasks, Format, and Quality in Foreign Language Development

Authors: Vera Lucia Teixeira da Silva, Sandra Regina Buttros Gattolin de Paula

Abstract:

The present study focuses on academic literacy and addresses the impact of semantic-discursive resources on the constitution of the genres produced in that context, considering the development of writing, in Portuguese, in the academic setting. Research that addresses academic literacy and the characteristics of the texts produced in this context is rare, especially with a focus on the development of writing across three variables: the constitution of the writer, the perception of the reader/interlocutor, and the organization of the informational flow of the text. The research aims to map the semantic-discursive resources of the written register in texts of several genres produced by students in the first semester of an undergraduate course in Letters. The hypothesis raised is that writing in the academic environment is not a recurrent literacy practice for these learners, which can be explained by the ontogenetic and phylogenetic nature of language development. Qualitative in nature, the present research takes as empirical data texts produced in a semester-long course of Reading and Textual Production; these data result from four different writing proposals, totalling 600 texts. The corpus is analyzed on the basis of semantic-discursive resources, seeking to contemplate relevant aspects of language (grammar, discourse, and social context) that reveal the choices made in the reader/writer interrelationship and the organizational flow of the text. The analysis covers three such resources: (a) appraisal and negotiation, to understand the attitudes negotiated (the roles of the participants in the discourse and their relationship with the other); (b) ideation, to explain the construction of experience (activities performed and participants); and (c) periodicity, to outline the flow of information in the organization of the text according to the genre it instantiates.
The results indicate organizational difficulties in the informational flow of the texts. This mapping contributes to understanding the way writers use language in an effort to present themselves, evaluate someone else’s work, and communicate with readers.

Keywords: academic writing, Portuguese mother tongue, semantic-discursive resources, academic context

Procedia PDF Downloads 110
305 Disentangling the Sources and Context of Daily Work Stress: Study Protocol of a Comprehensive Real-Time Modelling Study Using Portable Devices

Authors: Larissa Bolliger, Junoš Lukan, Mitja Lustrek, Dirk De Bacquer, Els Clays

Abstract:

Introduction and Aim: Chronic workplace stress and its health-related consequences, such as mental and cardiovascular diseases, have been widely investigated. This project focuses on the sources and context of psychosocial daily workplace stress in a real-world setting. The main objective is to analyze and model real-time relationships between (1) psychosocial stress experiences within the natural work environment, (2) micro-level work activities and events, and (3) physiological signals and behaviors in office workers. Methods: An Ecological Momentary Assessment (EMA) protocol has been developed, partly building on machine learning techniques. Empatica® wristbands will be used for real-life detection of stress from physiological signals; micro-level activities and events at work will be captured from smartphone registrations, further processed by an automated computer algorithm. A field study including 100 office-based workers with high-level problem-solving tasks, such as managers and researchers, will be implemented in Slovenia and Belgium (50 in each country). Data mining and state-of-the-art statistical methods – mainly multilevel statistical modelling for repeated data – will be used. Expected Results and Impact: The project findings will provide novel contributions to the field of occupational health research. While traditional assessments provide information about the globally perceived state of chronic stress exposure, the EMA approach is expected to bring new insights into daily fluctuating work stress experiences, especially the micro-level events and activities at work that induce acute physiological stress responses. The project is therefore likely to generate further evidence on relevant stressors in a real-time working environment and hence make it possible to advise on workplace procedures and policies for reducing stress.
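Why multilevel modelling is needed for such repeated EMA data can be illustrated with a small simulation. The numbers below are invented, not the project's data: each worker contributes many momentary stress ratings, so ratings cluster within persons, and a one-way-ANOVA intraclass correlation (ICC) quantifies the share of variance lying between persons.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate EMA-style repeated data: 100 workers, 20 momentary stress ratings
# each. Person-level means differ (between-person variance), and momentary
# ratings fluctuate around them (within-person variance).
n_workers, n_obs = 100, 20
person_mean = rng.normal(5.0, 1.0, size=n_workers)               # between-person
ratings = person_mean[:, None] + rng.normal(0, 1.5, (n_workers, n_obs))

# One-way ANOVA ICC(1): mean squares between and within persons.
msb = n_obs * ratings.mean(axis=1).var(ddof=1)
msw = ratings.var(axis=1, ddof=1).mean()
icc = (msb - msw) / (msb + (n_obs - 1) * msw)
print(f"ICC = {icc:.3f}")
```

A sizeable ICC means the observations are not independent, which is exactly what makes ordinary regression inappropriate and multilevel models for repeated data the natural choice here.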

Keywords: ecological momentary assessment, real-time, stress, work

Procedia PDF Downloads 142
304 Building a Blockchain-based Internet of Things

Authors: Rob van den Dam

Abstract:

Today’s Internet of Things (IoT) comprises more than a billion intelligent devices, connected via wired/wireless communications. The expected proliferation of hundreds of billions more places us at the threshold of a transformation sweeping across the communications industry. Yet, we found that the IoT architecture and solutions that currently work for billions of devices won’t necessarily scale to tomorrow’s hundreds of billions of devices because of high cost, lack of privacy, lack of future-proofing, lack of functional value, and broken business models. As the IoT scales exponentially, decentralized networks have the potential to reduce infrastructure and maintenance costs for manufacturers. Decentralization also promises increased robustness by removing single points of failure that could exist in traditional centralized networks. By shifting the power in the network from the center to the edges, devices gain greater autonomy and can become points of transactions and economic value creation for owners and users. To validate the underlying technology vision, IBM jointly developed with Samsung Electronics an autonomous, decentralized peer-to-peer proof-of-concept (PoC). The primary objective of this PoC was to establish a foundation on which to demonstrate several capabilities that are fundamental to building a decentralized IoT. Though many commercial systems in the future will exist as hybrid centralized-decentralized models, the PoC demonstrated a fully distributed proof of concept. The PoC (a) validated the future vision for decentralized systems to extensively augment today’s centralized solutions, (b) demonstrated foundational IoT tasks without the use of centralized control, and (c) proved that empowered devices can engage autonomously in marketplace transactions.
The PoC opens the door for the communications and electronics industry to further explore the challenges and opportunities of potential hybrid models that can address the complexity and variety of requirements posed by the internet that continues to scale. Contents: (a) The new approach for an IoT that will be secure and scalable, (b) The three foundational technologies that are key for the future IoT, (c) The related business models and user experiences, (d) How such an IoT will create an 'Economy of Things', (e) The role of users, devices, and industries in the IoT future, (f) The winners in the IoT economy.
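As a toy illustration of the hash-chained ledger idea behind such a decentralized IoT: each device appends blocks whose hashes link to the previous block, so any peer can verify the shared transaction history without a central coordinator. This is a simplified sketch, not the actual IBM/Samsung PoC; the device IDs and payloads are invented.

```python
# Toy hash-chained ledger of device transactions.
import hashlib
import json
import time

def make_block(prev_hash, device_id, payload):
    """Build a block whose hash covers its contents and the previous hash."""
    block = {"prev": prev_hash, "device": device_id,
             "payload": payload, "ts": time.time()}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def chain_valid(chain):
    """Any peer can recompute hashes and check the links, with no central party."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != block["hash"]:
            return False  # block contents were tampered with
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False  # link to the previous block is broken
    return True

# A device autonomously recording marketplace transactions:
chain = [make_block("0" * 64, "washer-01", {"order": "detergent"})]
chain.append(make_block(chain[-1]["hash"], "washer-01", {"order": "softener"}))
print(chain_valid(chain))  # True
```

Altering any recorded payload invalidates that block's hash and, transitively, every later link, which is what lets untrusting peers agree on a shared history.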

Keywords: IoT, internet, wired, wireless

Procedia PDF Downloads 325
303 Turning Points in the Development of Translator Training in the West from the 1980s to the Present

Authors: B. Sayaheen

Abstract:

The translator’s competence is one of the topics that has received a great deal of research attention in the field of translation studies because such competencies are still debated and not yet agreed upon. Moreover, scholars approach this topic from different points of view. Approaches to teaching these competencies have undergone several developments. This paper aims at investigating these developments, exploring the major turning points and shifts in teaching methods in translator training. The significance of these turning points and their external or internal causes will also be discussed. Based on the past and present status of teaching approaches in translator training, this paper tries to predict the future of these approaches. This paper is mainly concerned with developments in teaching approaches in the West from the 1980s to the present. The reason behind choosing this specific period is not that translator training started in the 1980s but that most criticism of the teacher-centered approach started at that time. The implications of this research stem from the fact that it identifies the turning points and the causes that led teachers to adopt student-centered approaches rather than teacher-centered approaches and then to incorporate technology and the Internet in translator training. These causes were classified as external or internal. Translation programs in the West and in other cultures can benefit from this study. Translation programs in the West can notice that teaching translation is geared toward incorporating more technologies. If these programs already use technology and the Internet to teach translation, they might benefit from the assumed future direction of teaching translation. On the other hand, some non-Western countries, or to be specific, some professors, still apply the teacher-centered approach.
Moreover, these programs should include technology and the Internet in their teaching approaches to meet the drastic changes in the translation process, which seems to rely more on software and technologies to accomplish the translator’s tasks. Finally, translator training has borrowed many of its approaches from other disciplines, mainly language teaching. The teaching approaches in translator training have gone through some developments, from teacher-centered to student-centered and then toward the integration of technologies and the Internet. Both internal and external causes have played a crucial role in these developments. These borrowed approaches should be comprehensively evaluated in order to see if they achieve the goals of translator training. Such evaluation may lead us to come up with new teaching approaches developed specifically for translator training. While considering these methods and designing new approaches, we need to keep an eye on the future needs of the market.

Keywords: turning points, developments, translator training, market, the West

Procedia PDF Downloads 104
302 Experience of Inpatient Life in Korean Complex Regional Pain Syndrome: A Phenomenological Study

Authors: Se-Hwa Park, En-Kyung Han, Jae-Young Lim, Hye-Jung Ahn

Abstract:

Purpose: The objective of this study is to provide basic data for understanding the substance of inpatient life with CRPS (Complex Regional Pain Syndrome) and for developing efficient and effective nursing interventions. Methods: From September to November 2018, we interviewed 10 CRPS patients about their inpatient experiences. To understand the meaning and intrinsic structure of inpatient life experiences with CRPS, we used the guiding question: 'What has your inpatient experience with CRPS been like?' For data analysis, the phenomenological method suggested by Colaizzi was applied. Results: According to the analysis, the participants' inpatient life was structured into six categories: (a) experiences of breakthrough pain, (b) the limitations of pain treatment, (c) factors worsening pain during the inpatient period, (d) methods of treating pain, (e) positive experiences during the inpatient period, and (f) requirements of the medical team, family, and other people in the hospital room. Conclusion: Inpatients with CRPS experienced breakthrough pain. They expected immediate treatment for breakthrough pain, but they suffered severe pain because immediate treatment was not implemented. The pain-worsening factors reported by patients with CRPS were as follows: personal factors arising from negative emotions such as insomnia, stress, and a sensitive disposition; touch or vibration stimuli to the painful part while in bed; physical factors such as crossing high thresholds or moving at rapid speed during transfers; conflict with other people; climate factors such as humidity or low temperature; and noise, smells, and lack of space because of many visitors. Patients actively managed their pain by committing to other tasks or seeking diversion; they also managed it passively by simply suppressing it or giving up. They thought positively about rehabilitation treatment, and they asked for understanding and sympathy from other people, as well as emotional support and immediate intervention from the medical team.
Based on the results of this study, we propose a guideline for systematic breakthrough pain management to relieve sudden pain, including notices cautioning against touch or vibration. Non-pharmacological pain-management nursing interventions also need to be developed.

Keywords: breakthrough pain, CRPS, complex regional pain syndrome, inpatient life experiences, phenomenological method

Procedia PDF Downloads 118