Search results for: language issues
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8322

1092 The Risk of Occupational Health in the Shipbuilding Industry in Bangladesh

Authors: Md. Rashel Sheikh

Abstract:

The shipbuilding industry in Bangladesh has become a fast-growing industry in recent years, since it began to export newly built ships. The various activities of shipbuilding, carried out in limited, confined spaces, expose workers to chemicals, dusts, and metal fumes. The aim of this literature review is to identify the potential sources of occupational health hazards in shipyards and to promote the regulation of appropriate personal protective equipment (PPE) for the workers. In shipyards, workers are involved in various activities, such as the manufacture, repair, maintenance, and dismantling of boats and ships, and the building of small ocean-going vessels and ferries. Workers in the shipbuilding industry suffer from a number of hazardous conditions, such as asthma, dermatitis, hearing deficits, and musculoskeletal disorders. The use of modern technologies, such as underwater plasma welding, electron beam welding, friction stir welding, and laser cutting and welding, together with appropriate PPE (i.e., long-sleeved shirts and long pants, shoes plus socks, safety masks, chemical-resistant gloves, eyewear, face shields, and respirators), can help reduce occupational exposure to the environmental hazards created by different activities in shipyards. However, most shipyards in Bangladesh use traditional methods, e.g., flame cutting and arc welding, that add hazardous waste and pollutants to the environment in and around the shipyard. The safety and security of workers in the shipyard workplace are very important, and ensuring them is the primary responsibility of employers. Employers must adopt advanced technologies and supply adequate and appropriate PPE for the workers. A number of accidents and illnesses occur daily in the shipyard industries in Bangladesh due to negligence and the lack of adequate technologies and appropriate PPE.
In addition, there are no specific regulations governing PPE use, and no mechanisms to implement them. It is essential to introduce PPE regulations and enforce them strictly in the shipbuilding industries in Bangladesh. Along with the adoption of PPE and regular health examinations, health education for workers regarding occupational hazards and lifestyle diseases is also important and requires reinforcement. Monitoring health and safety hazards in shipyards is essential to enhance worker protection, ensure worker safety, and mitigate workplace injuries and illnesses.

Keywords: shipbuilding industries, health education, occupational health hazards, personal protective equipment, shipyard workers, occupational workers, shipyards

Procedia PDF Downloads 153
1091 The Relationship Between Weight Gain, Cyclicality of Diabetologic Education and the Experienced Stress: A Study Involving Pregnant Women

Authors: Agnieszka Rolinska, Marta Makara-Studzinska

Abstract:

Introduction: In recent years, there has been an intensive development of research into the physiological relationships between the experienced stress and obesity. Moreover, strong chronic stress leads to the disorganization of a person’s activeness on various levels of functioning, including the behavioral and cognitive sphere (also in one’s diet). Aim: The present work addresses the following research questions: Is there a relationship between an increase in stress related to the disease and the need for the cyclicality of diabetologic education in gestational diabetes? Are there any differences in terms of the experienced stress during the last three months of pregnancy in women with gestational diabetes and in normal pregnancy between the patients with normal weight gains and those with abnormal weight gains? Are there any differences in terms of stress coping styles in women with gestational diabetes and in normal pregnancy between the patients with normal weight gains and those with abnormal weight gains? Method: The study involved pregnant women with gestational diabetes (treated with diet, without insulin therapy) and in normal pregnancy – 206 women in total. The following psychometric tools were employed: Perceived Stress Scale (PSS; Cohen, Kamarck, Mermelstein), Coping Inventory for Stressful Situations (CISS; Endler, Parker) and authors’ own questionnaire. Gestational diabetes mellitus was diagnosed on the basis of the results of fasting oral glucose tolerance test (75 g OGTT). Body weight measurements were confirmed in a diagnostic interview, taking into account medical data. Regularities in weight gains in pregnancy were determined according to the recommendations of the Polish Gynecological Society and American norms determined by the Institute of Medicine (IOM). Conclusions: An increase in stress related to the disease varies in patients with differing requirements for the cyclical nature of diabetologic education (i.e. 
education which is systematically repeated). There are no differences in terms of recently experienced stress and stress coping styles between women with gestational diabetes and those in normal pregnancy. There is a relationship between weight gains in pregnancy and the stress experienced in life, as well as stress coping styles, both in pregnancy complicated by diabetes and in physiological pregnancy. In the discussion of the obtained results, the authors refer to scientific reports from international English-language journals.

Keywords: diabetologic education, gestational diabetes, stress, weight gain in pregnancy

Procedia PDF Downloads 299
1090 Adversarial Attacks and Defenses on Deep Neural Networks

Authors: Jonathan Sohn

Abstract:

Deep neural networks (DNNs) have shown state-of-the-art performance in many applications, including computer vision, natural language processing, and speech recognition. Recently, adversarial attacks have been studied in the context of deep neural networks; these attacks aim to alter the results of DNNs by modifying the inputs slightly. For example, an adversarial attack on a DNN used for object detection can cause the DNN to miss certain objects. As a result, the reliability of DNNs is undermined by their lack of robustness against adversarial attacks, raising concerns about their use in safety-critical applications such as autonomous driving. In this paper, we focus on studying adversarial attacks and defenses on DNNs for image classification. Two types of adversarial attacks are studied: the fast gradient sign method (FGSM) attack and the projected gradient descent (PGD) attack. A DNN forms decision boundaries that separate the input images into different categories. An adversarial attack slightly alters the image to move it over a decision boundary, causing the DNN to misclassify the image. The FGSM attack obtains the gradient of the loss with respect to the image and updates the image once, based on that gradient, to cross the decision boundary. The PGD attack, instead of taking one big step, repeatedly modifies the input image with multiple small steps. There is also another type of attack, called the targeted attack, which is designed to make the model classify an image as a class chosen by the attacker. We can defend against adversarial attacks by incorporating adversarial examples in training: instead of training the neural network only with clean examples, we explicitly let it learn from adversarial examples. In our experiments, the digit recognition accuracy on the MNIST dataset drops from 97.81% to 39.50% and 34.01% when the DNN is attacked by FGSM and PGD attacks, respectively.
If we utilize FGSM training as a defense method, the classification accuracy greatly improves, from 39.50% to 92.31% under FGSM attacks and from 34.01% to 75.63% under PGD attacks. To further improve the classification accuracy under adversarial attacks, we can also use the stronger PGD training method. PGD training improves the accuracy by 2.7% under FGSM attacks and by 18.4% under PGD attacks over FGSM training. It is worth mentioning that neither FGSM nor PGD training affects the accuracy on clean images. In summary, we find that PGD attacks can greatly degrade the performance of DNNs, and PGD training is a very effective way to defend against such attacks. Overall, PGD attacks and defenses are significantly more effective than their FGSM counterparts.
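The single-step FGSM update and the iterative PGD update described above can be sketched as follows. This is a minimal illustration on a toy linear classifier, not the paper's actual MNIST setup; the weights, inputs, and epsilon values are assumptions chosen to make the boundary-crossing visible.

```python
import numpy as np

def fgsm_attack(x, grad, epsilon):
    # FGSM: one big step in the direction of the sign of the loss gradient.
    return x + epsilon * np.sign(grad)

def pgd_attack(x, grad_fn, epsilon, alpha, steps):
    # PGD: many small FGSM-style steps, each projected back into the
    # L-infinity ball of radius epsilon around the clean input.
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(grad_fn(x_adv))
        x_adv = np.clip(x_adv, x - epsilon, x + epsilon)
    return x_adv

# Toy linear classifier: predict class +1 when w @ x > 0.
w = np.array([1.0, -2.0, 0.5])
x = np.array([0.3, -0.2, 0.1])   # clean input; score 0.75, correctly classified
y = 1

# For a linear score, the loss gradient w.r.t. x is proportional to -y * w.
grad_fn = lambda x_adv: -y * w

x_fgsm = fgsm_attack(x, grad_fn(x), epsilon=0.5)
x_pgd = pgd_attack(x, grad_fn, epsilon=0.5, alpha=0.1, steps=10)

print(w @ x)       # 0.75: clean input on the correct side of the boundary
print(w @ x_fgsm)  # -1.0: the perturbed input crosses the decision boundary
```

Adversarial (FGSM or PGD) training, as described above, would then feed `x_fgsm`-style examples back into the training loop instead of only clean inputs.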

Keywords: deep neural network, adversarial attack, adversarial defense, adversarial machine learning

Procedia PDF Downloads 177
1089 Library Support for the Intellectually Disabled: Book Clubs and Universal Design

Authors: Matthew Conner, Leah Plocharczyk

Abstract:

This study examines the role of academic libraries in supporting the intellectually disabled (ID) in post-secondary education. With growing public awareness of the ID, there has been recognition of their need for post-secondary educational opportunities. This was an unforeseen result for a population that has been associated with elementary levels of education, yet the reasons are compelling. After aging out of the school system, the ID need and deserve educational and social support as much as anyone. Moreover, the commitment to diversity in higher education rings hollow if this group is excluded. Yet challenges remain to integrating the ID into a college curriculum. This presentation focuses on the role of academic libraries. This vital resource for supporting the ID cannot be neglected, yet the library's contribution has not been clearly defined. Library collections presume reading ability, and libraries already struggle to meet their traditional goals with the resources available. This presentation examines how academic libraries can support the post-secondary ID. For context, the presentation first examines the state of post-secondary education for the ID with an analysis of data on the United States compiled by the ThinkCollege! Project. Geographic Information Systems (GIS) and statistical analysis will show regional and methodological trends in post-secondary support of the ID, which currently shows little significant involvement by college libraries. The presentation then analyzes a case study of a book club at the Florida Atlantic University (FAU) libraries that has run for several years. Issues such as the selection of books, effective pedagogies, and evaluation procedures will be examined. The study has found that the instruction pedagogies used by libraries can be extended through concepts of Universal Learning Design (ULD) to effectively engage the ID.
In particular, student-centered, participatory methodologies that accommodate different learning styles have proven to be especially useful. The choice of text is complex and is determined not only by reading ability but also by familiarity with the subject and by features of the ID's developmental trajectory. The selection of text is not only a necessity but also promises to give insight into the ID. Assessment remains a complex and unresolved subject, but the voluntary, sustained, and enthusiastic attendance of the ID is an undeniable indicator. The study finds that, through the traditional library vehicle of the book club, academic libraries can support ID students through training in both reading and socialization, two major goals of their post-secondary education.

Keywords: academic libraries, intellectual disability, literacy, post-secondary education

Procedia PDF Downloads 143
1088 Implementing Online Blogging in Specific Context Using Process-Genre Writing Approach in Saudi EFL Writing Class to Improve Writing Learning and Teaching Quality

Authors: Sultan Samah A. Alenezi

Abstract:

Many EFL teachers are eager to find the best way to meet the needs of their students in EFL writing courses. Numerous studies suggest that online blogging may present a social interaction opportunity for EFL writing students. Additionally, it can foster peer collaboration and social support in the form of scaffolding, which, when viewed from the perspective of socio-cultural theory, can boost social support and foster the development of students' writing abilities. This idea is based on Vygotsky's theories, which emphasize how collaboration and social interaction facilitate effective learning. In Saudi Arabia, students are taught to write using conventional methods that are totally under the teacher's control. Without any peer contact or cooperation, students are spoon-fed in a passive environment. This study incorporated the cognitive processes of the process-genre approach into the EFL writing classroom to facilitate the use of internet blogging in EFL writing education. Thirty second-year undergraduate students from the Department of Languages and Translation at a Saudi college participated in this study. The study employed an action research design that blended qualitative and quantitative methodologies to understand Saudi students' perceptions of and experiences with internet blogging in an EFL process-genre writing classroom. It also examined the advantages and challenges the students faced when blogging. Data were collected through a survey, interviews, and students' blog posts. The intervention's outcomes showed that merging process-genre procedures with blogging was a successful tactic, and the Saudi students' perceptions of this method of online blogging for EFL writing were quite positive. The socio-cultural theory constructs that Vygotsky advocates, such as scaffolding, collaboration, and social interaction, were also enhanced by blogging.
These elements demonstrated the improvement in the students' writing, reading, social, and collaborative thinking skills, as well as their positive attitudes toward English-language writing. However, the students encountered a variety of problems that made blogging difficult for them, ranging from technological issues, such as sluggish internet connections, to learner inadequacies, such as a lack of computer skills and ineffective time management.

Keywords: blogging, process-genre approach, Saudi learners, writing quality

Procedia PDF Downloads 97
1087 Validation of an Impedance-Based Flow Cytometry Technique for High-Throughput Nanotoxicity Screening

Authors: Melanie Ostermann, Eivind Birkeland, Ying Xue, Alexander Sauter, Mihaela R. Cimpan

Abstract:

Background: New reliable and robust techniques to assess the biological effects of nanomaterials (NMs) in vitro are needed to speed up safety analysis and to identify the key physicochemical parameters of NMs that are responsible for their acute cytotoxicity. The central aim of this study was to validate and evaluate the applicability and reliability of an impedance-based flow cytometry (IFC) technique for the high-throughput screening of NMs. Methods: Eight inorganic NMs from the European Commission Joint Research Centre Repository were used: NM-302 and NM-300k (Ag: 200 nm rods and 16.7 nm spheres, respectively), NM-200 and NM-203 (SiO₂: 18.3 nm and 24.7 nm amorphous, respectively), NM-100 and NM-101 (TiO₂: 100 nm and 6 nm anatase, respectively), and NM-110 and NM-111 (ZnO: 147 nm and 141 nm, respectively). The aim was to assess the biological effects of these materials on human monoblastoid (U937) cells. Dispersions of NMs were prepared as described in the NANOGENOTOX dispersion protocol, and cells were exposed to NMs at relevant concentrations (2, 10, 20, 50, and 100 µg/mL) for 24 hrs. The change in electrical impedance was measured at 0.5, 2, 6, and 12 MHz using the IFC AmphaZ30 (Amphasys AG, Switzerland). A traditional toxicity assay, the Trypan Blue Dye Exclusion assay, and dark-field microscopy were used to validate the IFC method. Results: Spherical Ag particles (NM-300k) showed the highest toxic effect on U937 cells, followed by ZnO (NM-111 ≥ NM-110) particles. Silica particles were moderately toxic to non-toxic at all used concentrations under these conditions. A higher toxic effect was seen with the smaller-sized TiO₂ particles (NM-101) compared to their larger analogues (NM-100). No interferences between the IFC and the used NMs were seen. Uptake and internalization of NMs were observed after 24 hours of exposure, confirming actual NM-cell interactions.
Conclusion: The results collected with the IFC demonstrate the applicability of this method for rapid nanotoxicity assessment, and it proved to be less prone to nano-related interference issues than some traditional toxicity assays. Furthermore, this label-free and novel technique shows good potential for up-scaling toward automated high-throughput screening and for future NM toxicity assessment. This work was supported by the EC FP7 NANoREG (Grant Agreement NMP4-LA-2013-310584), the Research Council of Norway, project NorNANoREG (239199/O70), the EuroNanoMed II 'GEMN' project (246672), and the UH-Nett Vest project.

Keywords: cytotoxicity, high-throughput, impedance, nanomaterials

Procedia PDF Downloads 345
1086 Cartography through Picasso’s Eyes

Authors: Desiree Di Marco

Abstract:

The aim of this work is to show through the lens of art, first, what kind of reality was represented in fascist maps, and, second, to study the impact of the fascist regime's cartography (FRC) on the observer's eye. In this study, it is assumed that the FRC's representation of reality was simplified, timeless, and even a-spatial, because it underrated the concept of territoriality. Cubism and Picasso's paintings will be used as counter-examples to demystify fascist cartography's ideological assumptions. The difference between the gaze of an observer looking at the surface of a fascist map and the gaze of someone observing a Picasso painting is striking. Because there is always something dark and hidden behind and inside a map, the world of fascist maps was a world built from the observation of a "window" that distorted reality and trapped the eyes of observers. Moving across the map, they seem as if hypnotized. Cartohypnosis is the state in which the observer finds himself enslaved by the attractive force of the map, which uses a sort of "magic" geography: a geography that, by means of symbolic language, never has as its primary objective the attempt to show us reality in its complexity, but rather that of performing for its audience. Magical geography and hypnotic cartography blended together in fascism, creating an almost mystical, magical relationship that demystified reality in order to reduce the world to a conquerable space. This reduction offered the observer the possibility of conceiving new dimensions: of the limit and of the boundary, elements with which the subject felt fully involved and in which the aesthetic force of the images demonstrated all its strength. But in the early 20th century, the combination of art and cartography gave rise to new possibilities. Cubism, which more than any other artistic current showed how observing reality from a single point of view follows a dangerous logic, is an example.
Cubism was an artistic movement that brought about a profound transformation in pictorial culture. It was not only a revolution of pictorial space but a revolution of our conception of pictorial space. Up until that time, men and women were more inclined to believe in the power of images and their representations. Cubist painters rebelled against this blindness by claiming that art must always offer an alternative. Indeed, the contribution of this work is precisely to show how art can provide alternatives to even the most horrible regimes and the most atrocious human misfortunes. It also enriches the field of cartography by showing how much cartography can gain when other disciplines engage with it. Only in this way can researchers increase the chances of a wider diffusion of cartography at the academic level.

Keywords: cartography, Picasso, fascism, culture

Procedia PDF Downloads 50
1085 The Role of Artificial Intelligence in Creating Personalized Health Content for Elderly People: A Systematic Review Study

Authors: Mahnaz Khalafehnilsaz, Rozina Rahnama

Abstract:

Introduction: The elderly population is growing rapidly, and with this growth comes an increased demand for healthcare services. Artificial intelligence (AI) has the potential to revolutionize the delivery of healthcare services to the elderly population. In this study, the various ways in which AI is used to create health content for elderly people and its transformative impact on the healthcare industry will be explored. Method: A systematic review of the literature was conducted to identify studies that have investigated the role of AI in creating health content specifically for elderly people. Several databases, including PubMed, Scopus, and Web of Science, were searched for relevant articles published between 2000 and 2022. The search strategy employed a combination of keywords related to AI, personalized health content, and the elderly. Studies that utilized AI to create health content for elderly individuals were included, while those that did not meet the inclusion criteria were excluded. A total of 20 articles that met the inclusion criteria were identified. Findings: The findings of this review highlight the diverse applications of AI in creating health content for elderly people. One significant application is the use of natural language processing (NLP), which involves the creation of chatbots and virtual assistants capable of providing personalized health information and advice to elderly patients. AI is also utilized in the field of medical imaging, where algorithms analyze medical images such as X-rays, CT scans, and MRIs to detect diseases and abnormalities. Additionally, AI enables the development of personalized health content for elderly patients by analyzing large amounts of patient data to identify patterns and trends that can inform healthcare providers in developing tailored treatment plans.
Conclusion: AI is transforming the healthcare industry by providing a wide range of applications that can improve patient outcomes and reduce healthcare costs. From creating chatbots and virtual assistants to analyzing medical images and developing personalized treatment plans, AI is revolutionizing the way healthcare is delivered to elderly patients. Continued investment in this field is essential to ensure that elderly patients receive the best possible care.

Keywords: artificial intelligence, health content, older adult, healthcare

Procedia PDF Downloads 48
1084 To Be a Nurse in Turkey: A Comparison Based on International Labour Organization's Nursing Personnel Recommendation

Authors: Arzu K. Harmanci Seren, Feride Eskin Bacaksiz

Abstract:

The shortage of nursing personnel has been considered one of the most important labour force issues in the health sector of developed countries since the early 1970s. The International Labour Organization (ILO) developed standards for the working conditions of nurses in collaboration with the World Health Organization, with the aim of helping to solve the nursing shortage problem all over the world. As a result of this collaboration, the ILO Nursing Personnel Convention (C. 149) and the accompanying Recommendation (R. 157) were adopted in 1977. Turkey, a country with a serious nurse shortage problem, has been a member of the ILO since 1932 but has not yet signed this convention. This study was planned to compare some of the working standards in the Convention with the present working conditions of nurses in Turkey. The data for this cross-sectional study were collected by an online survey between 19 January and 16 February 2015. Participants were reached through social network accounts in collaboration with nursing associations. In total, 828 nurses from 57 provinces of Turkey participated in the study. The survey consisted of 14 open-ended questions related to the working conditions of nurses and 34 Likert statements related to the nursing policies of the facilities in which they work. The data were analysed using the IBM SPSS 21.0 software (licensed to Istanbul University). Descriptive and comparative statistics were performed. Most of the participants (81.5%) were staff nurses and 18.5% were manager nurses. Most had a baccalaureate (57.9%) or master's (27.4%) degree in nursing. Of the participants, 18.5% were working in private hospitals, 34.9% in university hospitals, and 46.6% in Ministry of Health hospitals. It was found that monthly working schedules were mostly announced 7 days in advance (18%), that daily working time was at least 8 hours (41.5%) and at most 24 hours (22.8%), and that nurses had on average 25.18 (SD=16.66) minutes for lunch or dinner and 21.02 (SD=29.25) minutes for resting.
On the other hand, it was determined that 316 (43.2%) nurses did not have time for lunch and 61 (7.9%) could not find time to eat anything. It was also found that they were working 15-96 hours a week (mean=48.28, SD=8.89 hours) and 4-29 days a month (mean=19.29, SD=5.03 days), and that 597 (72%) nurses had worked overtime ranging from 1 to 150 hours (mean=32.80, SD=23.42 hours) in the month before the surveys were filled in. Most of the participants did not take leave due to sickness (47.5%) even when they felt sick; most also did not take leave for personal reasons (67.2%) or education (57.3%). This study is significant because nurses from different provinces participated, and it provides brief information about the working conditions of nurses nationwide. It shows that nurses in Turkey are working under worse conditions than those recommended by the International Labour Organization.

Keywords: nurse, international labour organization, recommendations for nurses, working conditions

Procedia PDF Downloads 235
1083 Conflicts of Interest in the Private Sector and the Significance of the Public Interest Test

Authors: Opemiposi Adegbulu

Abstract:

Conflicts of interest form an elusive, diverse, and engaging subject, a cross-cutting problem of governance at all levels, ranging from local to global, and from the public to the corporate and financial sectors. In all these areas, their mismanagement could lead to the distortion of decision-making processes, the corrosion of trust, and the weakening of administration. According to Professor Peters, an expert in the area, conflict of interest, a problem at the root of many scandals, has "become a pervasive ethical concern in our professional, organisational, and political life". Conflicts of interest corrode trust, and, as in the public sector, trust is mandatory for the market, consumers/clients, shareholders, and other stakeholders in the private sector. However, conflicts of interest in the private sector are distinct and must be treated as such when regulatory efforts are made to address them. The research looks at identifying conflicts of interest in the private sector and differentiating them from those in the public sector. The public interest is submitted as a criterion which allows for such differentiation. This is significant because it would allow for the use of tailor-made or sector-specific approaches to addressing this complex issue. This is conducted through an extensive review of literature and theories on the definition of conflicts of interest. The study employs theoretical, doctrinal, and comparative methods. The nature of conflicts of interest in the private sector will be explored through an analysis of the public sector, where the notion of conflicts of interest appears more clearly identified; reasons why they are of business ethics concern will be advanced; and then, looking once again at public sector solutions and other solutions, the study will identify ways of mitigating and managing conflicts in the private sector.
An exploration of public sector conflicts of interest and their solutions will be carried out because the typologies of conflicts of interest in both sectors appear very similar at the core, and thus lessons can be learnt with regard to the management of these issues in the private sector. This research will then focus on some specific challenges to understanding and identifying conflicts of interest in the private sector: their origin, diverging theories, the psychological barrier to definition, and similarities with public sector conflicts of interest due to the notions of the corrosion of trust, 'being in a particular kind of situation,' etc. The notion of public interest will be submitted as a key element at the heart of the distinction between public sector and private sector conflicts of interest. It will then be proposed that the appreciation of the notion of conflicts of interest differs by sector and from country to country, based on the public interest test, using the United Kingdom (UK), the United States of America (US), France, and the Philippines as illustrations.

Keywords: conflicts of interest, corporate governance, global governance, public interest

Procedia PDF Downloads 377
1082 AI Predictive Modeling of Excited State Dynamics in OPV Materials

Authors: Pranav Gunhal, Krish Jhurani

Abstract:

This study tackles the significant computational challenge of predicting excited state dynamics in organic photovoltaic (OPV) materials—a pivotal factor in the performance of solar energy solutions. Time-dependent density functional theory (TDDFT), though effective, is computationally prohibitive for larger and more complex molecules. As a solution, the research explores the application of transformer neural networks, a type of artificial intelligence (AI) model known for its superior performance in natural language processing, to predict excited state dynamics in OPV materials. The methodology involves a two-fold process. First, the transformer model is trained on an extensive dataset comprising over 10,000 TDDFT calculations of excited state dynamics from a diverse set of OPV materials. Each training example includes a molecular structure and the corresponding TDDFT-calculated excited state lifetimes and key electronic transitions. Second, the trained model is tested on a separate set of molecules, and its predictions are rigorously compared to independent TDDFT calculations. The results indicate a remarkable degree of predictive accuracy. Specifically, for a test set of 1,000 OPV materials, the transformer model predicted excited state lifetimes with a mean absolute error of 0.15 picoseconds, a negligible deviation from TDDFT-calculated values. The model also correctly identified key electronic transitions contributing to the excited state dynamics in 92% of the test cases, signifying a substantial concordance with the results obtained via conventional quantum chemistry calculations. The practical integration of the transformer model with existing quantum chemistry software was also realized, demonstrating its potential as a powerful tool in the arsenal of materials scientists and chemists. 
The implementation of this AI model is estimated to reduce the computational cost of predicting excited state dynamics by two orders of magnitude compared to conventional TDDFT calculations. The successful utilization of transformer neural networks to accurately predict excited state dynamics provides an efficient computational pathway for the accelerated discovery and design of new OPV materials, potentially catalyzing advancements in the realm of sustainable energy solutions.
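As an illustrative sketch only (the abstract does not disclose the authors' architecture), the core building block such a property-prediction transformer rests on, single-head self-attention over a molecule's token embeddings followed by pooling and a scalar regression head, can be written as below. All dimensions, weights, and the 5-token "molecular encoding" are hypothetical placeholders.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (n_tokens, d) embeddings of a molecule's atoms/fragments.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[1]))  # attention weights; rows sum to 1
    return A @ V

def predict_lifetime(X, params):
    # One attention layer, mean-pool over tokens, then a linear readout
    # to a single scalar (e.g. an excited-state lifetime in picoseconds).
    H = self_attention(X, params["Wq"], params["Wk"], params["Wv"])
    return float(H.mean(axis=0) @ params["w_out"] + params["b_out"])

rng = np.random.default_rng(0)
d = 8
params = {
    "Wq": rng.normal(size=(d, d)), "Wk": rng.normal(size=(d, d)),
    "Wv": rng.normal(size=(d, d)), "w_out": rng.normal(size=d), "b_out": 0.0,
}
X = rng.normal(size=(5, d))  # a hypothetical 5-token molecular encoding
print(predict_lifetime(X, params))
```

A trained model of this kind would learn the weight matrices from TDDFT-labelled examples; the speedup cited above comes from replacing a full TDDFT calculation with one such forward pass.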

Keywords: transformer neural networks, organic photovoltaic materials, excited state dynamics, time-dependent density functional theory, predictive modeling

Procedia PDF Downloads 98
1081 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been based on dictionaries. Undoubtedly, dictionaries are one of the basic data sources, but only at the initial stages of research: the author usually starts with an analysis of the lexicographic data and then formulates a hypothesis. In the research conducted, three polysemantic synonyms, true, loyal, and faithful, have been analyzed in terms of the differences and similarities in their semantic structure. A corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, two corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purposes of this study, there were no special requirements regarding the genre, mode, or time of the texts included in the corpora. Of the range of possibilities offered by corpus-analysis software (e.g., word lists, word-frequency statistics, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas (e.g., true, true to) and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study.
Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like 'An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders' or ''True,' said Phoebe, 'but I'd probably get to be a Union Official immediately'' were left out: in the first example, the faithful is a substantivized adjective, and in the second, true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable, and convenient tool for obtaining data for further semantic study.
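The co-occurrence extraction step described above can be sketched in a few lines. The toy corpus and window size below are invented for illustration; the actual study queried the BNC and COCA through their own interfaces rather than raw text.

```python
# A minimal illustration of extracting co-occurrences (collocates) for a
# search word within a fixed window. The sentences are invented; real
# corpus queries would also lemmatize and tag parts of speech.
from collections import Counter

corpus = [
    "he remained loyal to the king",
    "a faithful friend is a true friend",
    "she stayed true to her word",
    "the dog was faithful to its master",
]

def collocates(corpus, lemma, window=2):
    """Count words occurring within `window` tokens of the search lemma."""
    counts = Counter()
    for sentence in corpus:
        tokens = sentence.split()
        for i, tok in enumerate(tokens):
            if tok == lemma:
                lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
                counts.update(t for t in tokens[lo:hi] if t != lemma)
    return counts

print(collocates(corpus, "faithful").most_common(3))
```

Grouping the resulting lists by lemma, as the paper describes, then yields the distribution groups that feed the semantic analysis.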

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 302
1080 Quantum Cum Synaptic-Neuronal Paradigm and Schema for Human Speech Output and Autism

Authors: Gobinathan Devathasan, Kezia Devathasan

Abstract:

Objective: To improve the current modified Broca-Wernicke-Lichtheim-Kussmaul speech schema and provide insight into autism. Methods: We reviewed the pertinent literature. Current findings, involving Brodmann areas 22, 46, 9, 44, 45, 6, and 4, are based on neuropathology and functional MRI studies. However, in primary autism there is no lucid explanation, and the changes described, whether in neuropathology or functional MRI, appear consequential. Findings: We put forward an enhanced model which may explain the enigma related to autism. Vowel output is subcortical and does not need cortical representation, whereas consonant speech is cortical in origin. Left lateralization is needed to commence the circuitry spin, as life has evolved with L-amino acids and the left spin of electrons. A fundamental species difference is that we are capable of three-consonant, bi-syllable expression, whereas cetaceans and songbirds are confined to single or dual consonants. The four key sites for speech are the superior auditory cortex, Broca's two areas, and the supplementary motor cortex. Using the Argand diagram and the Riemann projection, we theorize that the Euclidean three-dimensional synaptic-neuronal circuits of speech are quantized to coherent waves, and that decoherence then takes place at area 6 (spherical representation). In this quantum state, complex three-consonant languages are instantaneously integrated, and multiple languages can be learned, verbalized, and differentiated. Conclusion: We postulate that evolutionary human speech, unlike that of cetaceans and birds, is elevated to quantum interaction to achieve three-consonant/bi-syllable speech. In classical primary autism, the sudden switching off and on of speech noted in several cases could now be explained not by any anatomical lesion but by a failure of coherence. Area 6 projects directly into the prefrontal saccadic area (8); this further explains the second primary feature of autism: lack of eye contact.
The third feature, repetitive finger gestures, located adjacent to the speech/motor areas, reflects actual attempts to communicate, akin to sign language for the deaf.

Keywords: quantum neuronal paradigm, cetaceans and human speech, autism and rapid magnetic stimulation, coherence and decoherence of speech

Procedia PDF Downloads 175
1079 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life-science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even simulating an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data.
It illustrates the indispensability of HPC in meeting the scientific and engineering challenges of the twenty-first century, and shows how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful instances. The article also indicates solutions to optimization problems and discusses the benefits for Big Data and computational biology, illustrating the current state of the art and future generations of HPC computing with Big Data.
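The "all-to-all comparison" pattern mentioned above is worth making concrete: the per-pair work is cheap, but the number of pairs grows quadratically with the data set, which is exactly where HPC parallelism becomes necessary. The sketch below uses a toy shared-k-mer count as the similarity measure; the sequences and the measure itself are invented for illustration, not a real alignment algorithm.

```python
# Sketch of the quadratic all-to-all comparison structure: n sequences
# yield n*(n-1)/2 pairwise comparisons. Fine for 3 sequences, prohibitive
# for millions without parallel hardware.
from itertools import combinations

def kmers(seq, k=3):
    """Set of all length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def shared_kmers(a, b, k=3):
    """Toy similarity: number of k-mers two sequences have in common."""
    return len(kmers(a, k) & kmers(b, k))

sequences = {"s1": "ACGTACGT", "s2": "ACGTTTTT", "s3": "GGGGACGT"}

scores = {(x, y): shared_kmers(sequences[x], sequences[y])
          for x, y in combinations(sorted(sequences), 2)}
print(scores)
```

On an HPC system, each pair in the combinations loop is an independent task, so the comparison matrix parallelizes trivially across nodes.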

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 350
1078 An Exploratory Study of Vocational High School Students’ Needs in Learning English

Authors: Yi-Hsuan Gloria Lo

Abstract:

The educational objective of vocational high schools (VHSs) is to equip VHS students with practical skills and knowledge that can be applied in the job market. However, with the increasing number of technological universities over the past two decades, the majority of VHS students have chosen to pursue higher education rather than enter the job market. VHS English education has thus been confronting a dilemma: should an English for specific purposes (ESP) approach, which aligns with the educational goal of VHS education, be taken, or should an English for general purposes (EGP) approach, which prepares VHS students for advanced studies in universities, be followed? While ESP theorists have proposed that ESP can be taught to secondary learners, little is known about VHS students' perspective on this ESP-versus-EGP dilemma. Scant research has investigated the different facets of students' needs (necessities, wants, and lacks) for both ESP and EGP in terms of the four language skills, or the factors that contribute to any differences. To address this gap in the literature, 100 VHS students responded to statements related to their necessities, wants, and lacks in learning ESP and EGP on a 6-point Likert scale. Six VHS students were interviewed to tap into the reasons behind the different facets of their needs in learning EGP and ESP. The statistical analysis indicates that, at this stage of learning English, the VHS subjects believed that EGP was more necessary than ESP and that EGP was more desirable than ESP. However, they reported being more lacking in ESP than in EGP learning. Regarding EGP, the results show that the VHS subjects rated speaking as their most necessary skill, speaking as the most desirable skill, and writing as the most lacking skill. A significant difference was found between perceived learning necessities and lacks and between perceived wants and lacks. No statistical difference was found between necessities and wants.
In the aspect of ESP, the results indicate that the VHS subjects marked reading as their most necessary skill, speaking as the most desirable skill, and writing as the most lacking skill. A significant difference exists between their perceived necessities and lacks and between their wants and lacks. However, there is no statistically significant difference between their perceived lacks and wants. Despite the lack of a significant difference between learning necessities and wants, the qualitative interview data reveal that the reasons for their perceived necessities and wants were different. The findings of the study confirm previous research that demonstrates that ‘needs’ is a multiple and conflicting construct. What VHS students felt most lacking was not necessarily what they believed they should learn or would like to learn. Although no statistical difference was found, different reasons were attributed to their perceived necessities and wants. Both theoretical and practical implications have been drawn and discussed for ESP research in general and teaching ESP in VHSs in particular.

Keywords: vocational high schools (VHSs), English for General Purposes (EGP), English for Specific Purposes (ESP), needs analysis

Procedia PDF Downloads 156
1077 Hidro-IA: An Artificial Intelligent Tool Applied to Optimize the Operation Planning of Hydrothermal Systems with Historical Streamflow

Authors: Thiago Ribeiro de Alencar, Jacyro Gramulia Junior, Patricia Teixeira Leite

Abstract:

The area of the electricity sector that coordinates hydroelectric plants to meet energy needs is called Operation Planning of Hydrothermal Power Systems (OPHPS). Its purpose is to find an operating policy that provides electrical power to the system in a given period, with reliability and minimal cost. Therefore, it is necessary to determine an optimal generation schedule for each hydroelectric plant at each stage, so that the system meets demand reliably, avoiding rationing in years of severe drought, and minimizes the expected cost of operation over the planning horizon, defining an appropriate strategy for thermal complementation. Several optimization algorithms specifically applied to this problem have been developed and are in use. Although they provide solutions to various problems encountered, these algorithms have some weaknesses, such as difficulties in convergence or simplification of the original formulation of the problem, owing to the complexity of the objective function. An alternative to these challenges is the development of more sophisticated and reliable simulation-optimization techniques that can assist the planning of the operation. Thus, this paper presents the development of a computational tool, namely Hydro-IA, for solving the optimization problem identified, while providing the user with easy handling. The intelligent optimization technique adopted is the genetic algorithm (GA), and the programming language is Java. First, the chromosomes were modeled; then, the problem's fitness function and the genetic operators involved were implemented; finally, the graphical interfaces for user access were drafted. The results obtained with the genetic algorithm were compared with those of a nonlinear programming (NLP) optimization technique. Tests were conducted with seven hydraulically interconnected hydroelectric plants, using historical streamflow from 1953 to 1955.
The comparison between the GA and NLP techniques shows that the operating cost obtained with the GA becomes increasingly smaller than that of NLP as the number of interconnected hydroelectric plants increases. The program achieved coherent performance in solving the problem without simplifying the calculations, together with ease of manipulating the simulation parameters and visualizing the output results.
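The chromosome modeling, fitness evaluation, and genetic operators described above can be sketched in miniature. The sketch below is an assumption-laden toy, not Hydro-IA itself (which is written in Java and models seven interconnected plants with historical streamflow): here a chromosome is a vector of per-stage hydro generation fractions, and the fitness penalizes the quadratic cost of the thermal generation needed to cover the remaining demand.

```python
# Toy GA for hydrothermal operation planning. All numbers are invented;
# the quadratic thermal-cost model is a common textbook simplification.
import random

DEMAND = [1.0, 1.0, 1.0]          # per-stage demand (arbitrary units)
HYDRO_CAP = [0.8, 0.6, 0.9]       # per-stage hydro availability (streamflow)

def thermal_cost(chromosome):
    """Thermal complementation cost: thermal fills what hydro leaves uncovered."""
    cost = 0.0
    for frac, cap, dem in zip(chromosome, HYDRO_CAP, DEMAND):
        thermal = max(0.0, dem - frac * cap)
        cost += thermal ** 2
    return cost

def evolve(pop_size=30, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in DEMAND] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=thermal_cost)                     # fitness evaluation
        survivors = pop[: pop_size // 2]               # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(DEMAND))        # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(len(child))              # point mutation
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = survivors + children
    return min(pop, key=thermal_cost)

best = evolve()
print(best, thermal_cost(best))
```

The real problem adds hydraulic coupling between reservoirs and multi-year streamflow scenarios, which is precisely what makes GA-style search attractive against NLP as the number of plants grows.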

Keywords: energy, optimization, hydrothermal power systems, artificial intelligence and genetic algorithms

Procedia PDF Downloads 406
1076 Conjugated Linoleic Acid Effect on Body Weight and Body Composition in Women: Systematic Review and Meta-Analysis

Authors: Hanady Hamdallah, H. Elyse Ireland, John H. H. Williams

Abstract:

Conjugated linoleic acid (CLA) is a food supplement that is reported to have multiple beneficial health effects, including being anti-carcinogenic, anti-inflammatory, and anti-obesity. Animal studies have shown a significant anti-obesity effect of CLA, but results in humans have been inconsistent: some studies found an anti-obesity effect, while others failed to find any decline in obesity markers after CLA supplementation. This meta-analysis aimed to determine whether oral CLA supplementation has been shown to reduce obesity-related markers in women. PubMed, the Cochrane Library, and Google Scholar were used to identify the eligible trials, using two main search strategies: the first was to search for eligible trials using the keywords 'conjugated linoleic acid', 'CLA', and 'women'; the second was to extract eligible trials from previously published systematic reviews and meta-analyses. The eligible trials were placebo-controlled trials in which women were supplemented with a CLA mixture in the form of oral capsules for 6 months or less. These trials also provided information about body composition expressed as body weight (BW), body mass index (BMI), total body fat (TBF), percentage body fat (BF%), and/or lean body mass (LBM). The quality of each included study was assessed using both the JADAD scale and an adapted CONSORT checklist. A meta-analysis of 8 eligible trials showed that CLA supplementation was significantly associated with reduced BW (mean ± SD, 1.2 ± 0.26 kg, p < 0.001), BMI (0.6 ± 0.13 kg/m², p < 0.001), and TBF (0.76 ± 0.26 kg, p = 0.003) in women when supplemented over 6-16 weeks. A subgroup meta-analysis demonstrated a significant reduction in BW (1.29 ± 0.31 kg, p < 0.001), BMI (0.60 ± 0.14 kg/m², p < 0.001), and TBF (0.82 ± 0.28 kg, p = 0.003) in the trials that had recruited overweight-obese women.
The second subgroup meta-analysis, which considered the menopausal status of the participants, found that CLA was significantly associated with reduced BW (1.35 ± 0.37 kg, p < 0.001; 1.05 ± 0.36 kg, p = 0.003) and BMI (0.50 ± 0.17 kg/m², p = 0.003; 0.75 ± 0.2 kg/m², p < 0.001) in pre- and post-menopausal women, respectively. A reduction in TBF (1.09 ± 0.37 kg, p = 0.003) was only significant in post-menopausal women. Interestingly, CLA supplementation was associated with a significant reduction in BW (1.05 ± 0.35 kg, p < 0.003), BMI (0.73 ± 0.2 kg/m², p < 0.001), and TBF (1.07 ± 0.36 kg, p = 0.003) in the trials without lifestyle monitoring or interventions. No significant effect of CLA on LBM was detected in this meta-analysis. This meta-analysis suggests a moderate anti-obesity effect of CLA on BW, BMI, and TBF in women when supplemented over 6-16 weeks, particularly in overweight-obese and post-menopausal women. However, this finding requires careful interpretation due to several issues in the designs of the available CLA supplementation trials. More well-designed trials are required to confirm the results of this meta-analysis.
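Pooled estimates like those above (e.g., BW 1.2 ± 0.26 kg) are typically produced by inverse-variance weighting of the per-trial effects. The sketch below shows the standard fixed-effect version of that calculation; the per-trial numbers are invented, not the study's data, and the actual analysis may have used a random-effects model instead.

```python
# Hedged sketch of fixed-effect inverse-variance pooling, the calculation
# underlying pooled meta-analytic estimates. Trial effects and standard
# errors below are illustrative only.
import math

def pooled_effect(effects, standard_errors):
    """Inverse-variance weighted mean of trial effects and its standard error."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean, se

# Invented per-trial body-weight reductions (kg) and their standard errors:
effects = [1.5, 0.9, 1.2]
ses = [0.5, 0.4, 0.6]
mean, se = pooled_effect(effects, ses)
print(round(mean, 3), round(se, 3))
```

Precise trials (small standard errors) dominate the pooled mean, which is why a few large, well-controlled trials can outweigh several small ones.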

Keywords: body composition, body mass index, body weight, conjugated linoleic acid

Procedia PDF Downloads 272
1075 Hand Movements and the Effect of Using Smart Teaching Aids: Quality of Writing Styles Outcomes of Pupils with Dysgraphia

Authors: Sadeq Al Yaari, Muhammad Alkhunayn, Sajedah Al Yaari, Adham Al Yaari, Ayman Al Yaari, Montaha Al Yaari, Ayah Al Yaari, Fatehi Eissa

Abstract:

Dysgraphia is a neurological disorder of written expression that impairs writing ability and fine motor skills, resulting primarily in problems relating not only to handwriting but also to writing coherence and cohesion. We investigate the properties of smart writing technology to highlight some unique features of its effects on the academic performance of pupils with dysgraphia. In Amis, pupils with dysgraphia experience problems expressing their ideas when using ordinary writing aids, the default strategy. The Amis data suggest a possible connection between the available writing aids and pupils' writing improvement, and therefore between those aids and texts' expression and comprehension. A group of thirteen pupils with dysgraphia was placed in a regular primary-school classroom, with twenty-one pupils recruited as a control group. To ensure the validity, reliability, and accountability of the research, both groups studied writing courses for two semesters, of which the first was equipped with smart writing aids while the second took place in an ordinary classroom. Two pre-tests were administered at the beginning of the first two semesters, and two post-tests were administered at the end of both semesters. The tests examined pupils' ability to write coherent, cohesive, and expressive texts. The dysgraphia group, which received the treatment of a writing course with smart technology in the first semester, produced significantly greater increases in writing expression than in an ordinary classroom, and its performance was better than that of the control group in the second semester. The current study concludes that using smart teaching aids is a 'must' for both teaching and learning with dysgraphia. Furthermore, it demonstrates that for young pupils with dysgraphia, expressive tasks are more challenging than coherence and cohesion tasks.
The study therefore supports the literature suggesting a role for smart educational aids in writing, and indicates that smart writing techniques may be an efficient addition to regular educational practices, notably in special educational institutions and speech-language therapy facilities. However, further research is needed in which adults with dysgraphia are prompted more often than older adults without dysgraphia, in order to have them complete the other productive and/or written-skills tasks.

Keywords: smart technology, writing aids, pupils with dysgraphia, hands’ movement

Procedia PDF Downloads 18
1074 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses

Authors: Matthew Baucum

Abstract:

With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g., “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word-counts.
An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
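The core operation described above — representing a voxel as the normalized vector sum of study words and scoring it against a term by cosine similarity — can be sketched in a few lines. The tiny 3-dimensional "semantic space" and word list below are invented for illustration; in the actual method, the embeddings are learned from the corpus of study texts.

```python
# Minimal sketch of the voxel-as-vector technique: voxel vector = normalized
# sum of word vectors from studies activating that voxel; relevance to a
# term = cosine similarity. Embeddings here are hypothetical toy values.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return sum(x * y for x, y in zip(normalize(a), normalize(b)))

embeddings = {                      # invented 3-d word vectors
    "reward": [1.0, 0.1, 0.0],
    "value":  [0.9, 0.2, 0.1],
    "vision": [0.0, 1.0, 0.1],
    "social": [0.1, 0.0, 1.0],
}

def voxel_vector(study_words):
    """Normalized vector sum of all words from studies activating a voxel."""
    total = [0.0, 0.0, 0.0]
    for w in study_words:
        for i, x in enumerate(embeddings[w]):
            total[i] += x
    return normalize(total)

vox = voxel_vector(["reward", "value", "reward"])
print(cosine(vox, embeddings["reward"]) > cosine(vox, embeddings["vision"]))
```

A voxel activated mostly by "reward"-related studies thus scores higher against "reward" than against "vision", which is exactly the continuous voxel-term metric the paper proposes.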

Keywords: FMRI, machine learning, meta-analysis, text analysis

Procedia PDF Downloads 432
1073 A Unified Approach for Digital Forensics Analysis

Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles

Abstract:

Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the digital footprints that result, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT), databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and many data sources have little to no specialist forensic tooling. Increasingly, it also becomes essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely fashion without having to trawl through it and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling cases to be investigated more efficiently. The paper presents a systematic analysis of major crime categories and identifies which forensic analyses could be used.
For example, in a child abduction, an investigation team might have evidence from a range of sources, including computing devices (mobile phone, PC), CCTV (potentially a large number of cameras), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.
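The kind of cross-source correlation U-FAT aims to automate can be illustrated with a minimal sketch: normalize heterogeneous records to a common schema, then group them on a shared time window. The sources, field names, timestamps, and the five-minute window below are all invented for illustration, not part of U-FAT's design.

```python
# Illustrative-only sketch: fusing records from two hypothetical evidence
# sources with different timestamp formats onto one timeline, then testing
# whether two records fall within the same correlation window.
from datetime import datetime, timedelta

phone_log = [{"time": "2023-05-01T10:02:00", "event": "outgoing call"}]
cctv_log = [{"timestamp": "2023-05-01 10:03:30", "event": "subject exits car park"}]

def normalize(record, time_key, fmt, source):
    """Map a source-specific record onto a common schema."""
    return {"source": source,
            "time": datetime.strptime(record[time_key], fmt),
            "event": record["event"]}

timeline = sorted(
    [normalize(r, "time", "%Y-%m-%dT%H:%M:%S", "phone") for r in phone_log] +
    [normalize(r, "timestamp", "%Y-%m-%d %H:%M:%S", "cctv") for r in cctv_log],
    key=lambda r: r["time"])

def correlated(a, b, window=timedelta(minutes=5)):
    """Two records 'correlate' if their times fall within the same window."""
    return abs(a["time"] - b["time"]) <= window

print(correlated(timeline[0], timeline[1]))
```

The common-schema step is the "common language for electronic information" the paper calls for; once records share a schema, correlation rules like the one above become source-independent.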

Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool

Procedia PDF Downloads 177
1072 Sensory Ethnography and Interaction Design in Immersive Higher Education

Authors: Anna-Kaisa Sjolund

Abstract:

The doctoral thesis examines interaction design and sensory ethnography as tools to create immersive education environments. In recent years, there has been increasing interest and discussion among researchers and educators about immersive education, such as augmented reality tools and virtual glasses, and about the possibilities of utilizing them in education at all levels. By using virtual devices as learning environments, it is possible to create multisensory learning environments. Sensory ethnography in this study refers to considering the senses' impact on information dynamics in immersive learning environments. The past decade has seen the rapid development of virtual world research and virtual ethnography. Christine Hine's Virtual Ethnography offers an anthropological explanation of online behavior and communication change. Despite her groundbreaking work, time has changed users' communication styles and brought new ways of doing ethnographic research. Virtual reality, with all its new potential, has come to the fore, engaging all the senses. Film and image have played an important role in cultural research for centuries; only the focus has changed at different times and in different fields of research. According to Karin Becker, the role of the image in our society is information flow, and she identified two meanings of what research on visual culture is; images and pictures are the artifacts of visual culture. Images can be viewed as a symbolic language that allows digital storytelling. By combining the sense of sight with the other senses, such as hearing, touch, taste, smell, and balance, the use of a virtual learning environment offers students a way to absorb large amounts of information more easily. It also offers teachers different ways to produce study material. This article approaches the core question using sensory ethnography as a research tool.
Sensory ethnography is used to describe information dynamics in immersive environments through interaction design. An immersive education environment is understood here as a three-dimensional, interactive learning environment in which the audiovisual aspects are central but all senses can be taken into consideration. When designing learning environments, or any digital service, interaction design is always needed. The question 'What is interaction design?' is justified, because there is no simple or consistent idea of what interaction design is, how it can be used as a research method, or whether it is only a description of practical actions. When discussing immersive learning environments or their construction, consideration should be given to interaction design and sensory ethnography.

Keywords: immersive education, sensory ethnography, interaction design, information dynamics

Procedia PDF Downloads 118
1071 Posterior Cortical Atrophy Phenotype of Alzheimer’s Dementia: A Case Report

Authors: Joana Beyer

Abstract:

Background: Alzheimer’s disease (AD) is the predominant cause of dementia, characterized by progressive cognitive decline. Posterior cortical atrophy (PCA) is a less common variant of AD, primarily affecting younger individuals and presenting with visual, visuospatial, and visuoperceptual deficits, often leading to delayed diagnosis due to its atypical presentation. Case Presentation: We report the case of a 58-year-old woman referred to psychiatric services with a two-year history of progressive visuospatial decline, mild memory difficulties, and language impairments, notably anomia. Despite undergoing cataract and squint surgeries, her visual symptoms persisted, impacting her professional life as a music educator. The neuropsychological evaluation revealed profound visuoperceptual and visuospatial disturbances, with neuroimaging supporting a diagnosis of PCA. Treatment with donepezil produced symptom improvement, highlighting the challenges and the importance of early intervention in managing this atypical form of AD. Methods: The diagnostic process involved comprehensive physical and neuropsychological assessments and neuroimaging, including MRI and F-18 FDG PET-CT, which demonstrated severe bilateral posterior cortical involvement. The case underscores the utility of these modalities in diagnosing PCA. Results: The initiation of donepezil, an acetylcholinesterase inhibitor, resulted in symptom improvement, emphasizing the potential for AD treatments to benefit PCA patients. However, challenges in management, including treatment side effects and the necessity of multidisciplinary care, are discussed. Conclusion: This case highlights PCA's diagnostic challenges due to its atypical presentation and the broader implications for managing younger patients with early-onset dementia.
It underscores the necessity for early recognition, comprehensive assessment, and tailored management strategies, including both pharmacological and non-pharmacological interventions, to improve patients' quality of life. Additionally, the case illustrates the need for expanding community memory services to accommodate younger patients with atypical forms of dementia, advocating for a more inclusive approach to dementia care.

Keywords: Alzheimer’s disease, posterior cortical atrophy, dementia, diagnosis, management, donepezil, early-onset dementia

Procedia PDF Downloads 42
1070 Chinese Students’ Use of Corpus Tools in an English for Academic Purposes Writing Course: Influence on Learning Behaviour, Performance Outcomes and Perceptions

Authors: Jingwen Ou

Abstract:

Writing for academic purposes in a second or foreign language poses a significant challenge for non-native speakers, particularly at the tertiary level, where L2 students' English academic writing is often hindered by difficulties with academic discourse, including vocabulary, academic register, and organization. The past two decades have witnessed the rising popularity of the data-driven learning (DDL) approach in EAP writing instruction. In light of this trend, this study aims to enhance the integration of DDL into English for academic purposes (EAP) writing classrooms by investigating Chinese college students' perceptions of the use of corpus tools for improving EAP writing. Additionally, the research explores their corpus consultation behaviours during training to provide insights into corpus-assisted EAP instruction for DDL practitioners. The study investigates Chinese university students' use of corpus tools with three main foci: 1) the influence of corpus tools on learning behaviours, 2) the influence of corpus tools on students' academic writing performance outcomes, and 3) students' perceptions of, and potential perceptional changes towards, the use of such tools. Three corpus tools, CQPWeb, Sketch Engine, and LancsBox X, were selected for investigation due to the scarcity of empirical research on patterns of learners' engagement with a combination of multiple corpora. The research adopts a pre-test/post-test design for the evaluation of students' academic writing performance before and after the intervention. Twenty participants will be divided into two groups: an intervention and a non-intervention group. Three corpus training workshops will be delivered at the beginning, middle, and end of a semester.
An online survey and three separate focus group interviews are designed to investigate students' perceptions of the use of corpus tools for improving academic writing skills, particularly the rhetorical functions in different essay sections. Insights from students' consultation sessions indicated difficulties with DDL practice, including insufficient time to complete all tasks, struggles with the technical set-up, unfamiliarity with the DDL approach, and difficulty with some advanced corpus functions. Findings from the main study aim to provide pedagogical insights and training resources for EAP practitioners and learners.
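The pre-test/post-test design with an intervention and a non-intervention group lends itself to a simple gain-score comparison. The Python sketch below illustrates one way such writing scores might be analysed; all scores, group sizes, and variable names are hypothetical and purely illustrative, not data from the study.

```python
import statistics

def gain_scores(pre, post):
    """Per-participant writing-score gains (post-test minus pre-test)."""
    return [b - a for a, b in zip(pre, post)]

# Hypothetical scores for the two groups of ten described in the design.
intervention_pre  = [62, 58, 70, 65, 59, 61, 67, 63, 60, 66]
intervention_post = [70, 66, 75, 71, 65, 68, 74, 69, 66, 73]
control_pre       = [61, 59, 68, 64, 60, 62, 66, 63, 58, 65]
control_post      = [63, 60, 70, 65, 61, 63, 68, 64, 59, 66]

iv_gain = gain_scores(intervention_pre, intervention_post)
ctl_gain = gain_scores(control_pre, control_post)

# Compare mean gains: a larger mean gain in the intervention group would be
# consistent with a positive effect of the corpus training workshops.
print(statistics.mean(iv_gain))   # intervention mean gain
print(statistics.mean(ctl_gain))  # control mean gain
```

In practice such a comparison would be followed by an inferential test appropriate to the sample size, but the gain-score structure itself is the point of the sketch.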

Keywords: corpus linguistics, data-driven learning, English for academic purposes, tertiary education in China

Procedia PDF Downloads 38
1069 Public Procurement and Innovation: A Municipal Approach

Authors: M. Moso-Diez, J. L. Moragues-Oregi, K. Simon-Elorz

Abstract:

Innovation procurement, which steers the development of solutions towards concrete public sector needs and acts as a driver for innovation from the demand side (in public services as well as in market opportunities for companies), is emerging horizontally as a new policy instrument. In 2014, the new EU public procurement directives 2014/24/EC and 2014/25/EC reinforced the support for Public Procurement for Innovation, dedicating funding instruments that can be used across all areas supported by Horizon 2020 and targeting potential buyers of innovative solutions: groups of public procurers with similar needs. Under this programme, new policy adopters and networks emerge, aiming to embed innovation criteria into new procurement processes. As these initiatives are still in progress, related research is scarce. We argue that Innovation Public Procurement can arise as an innovative policy instrument for public procurement in different policy domains, in spite of existing institutional and cultural barriers (legal guarantee versus innovation). The presentation combines insights from public procurement and supply chain management in a sustainability and innovation policy arena, as a means of providing an understanding of: (1) the circumstances that emerge; (2) the relationship between public and private actors; and (3) the emerging capacities in the definition of the agenda. The policy adopters are the contracting authorities, mainly at the municipal level, where they interact with the supply chain through the Competitive Dialogue procedure, interconnecting sustainability and climate measures with other policy priorities such as innovation and urban planning. We found that geography and territory affect both the level of the municipal budget (due to municipal income per capita) and its institutional competencies (due to demographic reasons).
In spite of the relevance of institutional determinants for public procurement, other factors such as human factors, public policy, and private intervention also play an important role. The experience examined is a 'city project' (Bilbao) in the field of brownfield decontamination. Brownfield sites typically refer to abandoned or underused industrial and commercial properties (such as old process plants, mining sites, and landfills) that are available but contain low levels of environmental contaminants that may complicate reuse or redevelopment of the land. This article concludes that Innovation Public Procurement in sustainability and climate issues should be further developed, both as a policy instrument and as a line of policy research, as it could enable further relevant changes in public procurement as well as in climate innovation.

Keywords: innovation, city projects, public policy, public procurement

Procedia PDF Downloads 292
1068 Changing Behaviour in the Digital Era: A Concrete Use Case from the Domain of Health

Authors: Francesca Spagnoli, Shenja van der Graaf, Pieter Ballon

Abstract:

Humans do not behave rationally. We are emotional and easily influenced by others, as well as by our context. The study of human behaviour has become a central endeavour within many academic disciplines, including economics, sociology, and clinical and social psychology. Understanding what motivates humans, what triggers them to perform certain activities, and what it takes to change their behaviour is central for researchers and companies, as well as for policy makers seeking to implement efficient public policies. While numerous theoretical approaches have been developed for diverse domains such as health, retail, and the environment, the methodological models guiding the evaluation of such research had long reached their limits. Within this context, digitisation, information and communication technologies (ICT), wearables, the Internet of Things (IoT) connecting networks of devices, and new possibilities to collect and analyse massive amounts of data have made it possible to study behaviour from a realistic perspective, as never before. Digital technologies make it possible to (1) capture data in real-life settings, (2) regain control over data by capturing the context of behaviour, and (3) analyse huge sets of information through continuous measurement. Within this complex context, this paper describes a new framework for initiating behavioural change, capitalising on digital developments in applied research projects and applicable to academia, enterprises, and policy makers alike. By applying this model, behavioural research can be conducted to address issues in different domains, such as mobility, environment, health, or media. The Modular Behavioural Analysis Approach (MBAA) is described here and validated for the first time through a concrete use case within the domain of health.
The results gathered have shown that disclosing information about health in connection with the use of digital health apps can be a lever for changing behaviour, but it is only a first component, requiring further follow-up actions. To this end, a clear definition of different 'behavioural profiles', towards which several typologies of intervention can be addressed, is essential to effectively enable behavioural change. In the refined version of the MBAA, a strong focus will be placed on defining a methodology for shaping 'behavioural profiles' and related interventions, as well as on evaluating side-effects on the creation of new business models and sustainability plans.

Keywords: behavioural change, framework, health, nudging, sustainability

Procedia PDF Downloads 208
1067 Interactive Garments: Flexible Technologies for Textile Integration

Authors: Anupam Bhatia

Abstract:

Upon reviewing the literature and the pragmatic work done in the field of e-textiles, it is observed that the applications of wearable technologies have found steady growth in the military, medical, industrial, and sports fields, whereas fashion is at a loss as to how to treat this technology and bring it to market. The purpose of this paper is to understand the practical issues of integrating electronics in garments: cutting patterns for mass production, maintaining the basic properties of textiles, and the daily maintenance of garments, all of which hinder the wide adoption of interactive fabric technology within fashion and leisure wear. To understand these practical hindrances, an experimental and laboratory approach is taken. 'Techno Meets Fashion' has been an interactive fashion project in which sensor technologies have been embedded in textiles, resulting in a set of ensembles such as light-emitting garments, sound-sensing garments, proximity garments, and shape-memory garments. Smart textiles, especially in the form of textile interfaces, are drastically underused in fashion and other lifestyle product design. Clothing and some other textile products must be washable, which subjects the interactive elements to water and chemical immersion, physical stress, and extreme temperatures. The current state of the art tends to be too fragile for this treatment. The process used for mass producing traditional textiles also becomes difficult with interactive textiles, as cutting patterns from larger rolls of cloth and sewing them together to make garments breaks and reforms electronic connections in an uncontrolled manner. Because of this, interactive fabric elements are integrated by hand into textiles produced by standard methods. The Arduino has surely made embedding electronics into textiles much easier than before; even so, electronics are not yet integral to daily-wear garments.
Soft and flexible interfaces of MEMS (micro-sensors and micro-actuators) can be an option to make this possible by blending electronics within e-textiles in a way that is seamless and still retains the functions of the circuits as well as of the garment. Smart clothes, which offer simultaneously a challenging design and utility value, can only be mass produced if the demands of the body are taken care of, i.e., protection, anthropometry, ergonomics of human movement, and thermo-physiological regulation.

Keywords: ambient intelligence, proximity sensors, shape memory materials, sound sensing garments, wearable technology

Procedia PDF Downloads 376
1066 Techno Commercial Aspects of Using LPG as an Alternative Energy Solution for Transport and Industrial Sector in Bangladesh: Case Studies in Industrial Sector

Authors: Mahadehe Hassan

Abstract:

The transport system and industries, which are the main basis of the industrial and socio-economic development of any country, are mainly dependent on fossil fuels. Bangladesh has fossil fuel reserves of 9.51 TCF as of July 2023; if no new gas fields are discovered in the next 7-9 years and the existing gas consumption rate continues, these reserves will be exhausted. The demand for petroleum products in Bangladesh is increasing steadily, with 63% imported by BPC and 37% by private companies. 61.61% of BPC-imported products are used in the transport sector and 5.49% in the industrial sector, which is expensive and harmful to the environment. Liquefied Petroleum Gas (LPG) should be considered as an alternative energy source for Bangladesh based on the Sustainable Development Goals (SDGs) criteria for sustainable, clean, and affordable energy. This would not only lead to the much-desired mitigation of the country's energy famine but also contribute favourably to its macroeconomic indicators. Considering environmental and economic issues, the government has promoted CNG (compressed natural gas) as the fuel carrier since 2000, but currently, due to declining gas reserves, the government of Bangladesh is considering new energy sources for the transport and industrial sectors that will be sustainable, environmentally friendly, and economically viable. Liquefied Petroleum Gas (LPG) is the best choice for fueling the transport and industrial sectors in Bangladesh. At present, a total of 1.54 million metric tons of LPG is marketed in Bangladesh by the public and private sectors: 83% of it is used by households, 12% by industry and commerce, and 5% by transportation. Industrial and transport sector consumption is negligible compared to household consumption.
The purpose of this research is therefore to identify the challenges of LPG market development in the transport and industrial sectors in Bangladesh and to make recommendations for reducing them. An insecure supply chain, inadequate infrastructure, insufficient investment, and a lack of government monitoring and consumer awareness in the transport and industrial sectors are major challenges for LPG market development in Bangladesh. The Bangladesh government as well as private owners should come forward in developing the LPG industry to reduce these challenges and secure the energy sector for sustainable development. Furthermore, ensuring an adequate LPG supply in Bangladesh requires government regulations, infrastructure improvements in port areas, awareness raising, and, most importantly, proper pricing of LPG to address the energy crisis in Bangladesh.

Keywords: transport and industrial fuel, LPG consumption, challenges, economic sustainability

Procedia PDF Downloads 68
1065 CRISPR/Cas9 Based Gene Stacking in Plants for Virus Resistance Using Site-Specific Recombinases

Authors: Sabin Aslam, Sultan Habibullah Khan, James G. Thomson, Abhaya M. Dandekar

Abstract:

Losses due to viral diseases pose a serious threat to crop production. The quick breakdown of resistance to viruses such as Cotton Leaf Curl Virus (CLCuV) demands the application of a proficient technology to engineer durable resistance. Gene stacking has recently emerged as a potential approach for integrating multiple genes into crop plants. In the present study, recombinase technology has been used for site-specific gene stacking. A target vector (pG-Rec) was designed for engineering a predetermined specific site in the plant genome at which genes can be stacked repeatedly. Using Agrobacterium-mediated transformation, pG-Rec was transformed into Coker-312 as well as Nicotiana tabacum L. cv. Xanthi and Nicotiana benthamiana. Transgene analysis of the target lines was conducted through junction PCR. The transgene-positive target lines were used for further transformations to site-specifically stack two genes of interest using the Bxb1 and PhiC31 recombinases. In the first instance, Cas9 driven by multiplex gRNAs (targeting the Rep gene of CLCuV) was site-specifically integrated into the target lines, as determined by junction PCR and real-time PCR. The resulting plants were subsequently used to stack the second gene of interest (the AVP3 gene from Arabidopsis, for enhancing cotton plant growth). Gene addition is achieved simultaneously with the removal of marker genes, which are recycled in the next round of gene stacking. Consequently, transgenic marker-free plants were produced with two genes stacked at the specific site. These transgenic plants can be potential germplasm for introducing resistance against various strains of cotton leaf curl virus (CLCuV) and against abiotic stresses. The results of the research demonstrate gene stacking in crop plants, a technology that can be used to introduce multiple genes sequentially at predefined genomic sites.
The current climate change scenario highlights the value of such technologies, allowing major environmental issues to be tackled through several traits in a single step. After the resulting plants are evaluated for virus resistance, the lines can serve as a starting point for stacking further genes in cotton for other traits, as well as for molecular breeding with elite cotton lines.

Keywords: cotton, CRISPR/Cas9, gene stacking, genome editing, recombinases

Procedia PDF Downloads 129
1064 On the Semantics and Pragmatics of 'Be Able To': Modality and Actualisation

Authors: Benoît Leclercq, Ilse Depraetere

Abstract:

The goal of this presentation is to shed new light on the semantics and pragmatics of be able to. It presents the results of a corpus analysis based on data from the British National Corpus (BNC) and discusses these results in light of a specific stance on the semantics-pragmatics interface that takes recent developments into account. Be able to is often discussed in relation to can and could, all of which can be used to express ability. Such an onomasiological approach often results in the identification of usage constraints for each expression. In the case of be able to, it is the formal properties of the modal expression (unlike can and could, be able to has non-finite forms) that are in the foreground, and the modal expression is described as the verb that conveys future ability. Be able to is also argued to express actualised ability in the past (I was able to/could open the door). This presentation aims to provide a more accurate semantic-pragmatic profile of be able to, based on extensive data analysis and embedded in a very explicit view of the semantics-pragmatics interface. A random sample of 3000 examples (1000 for each modal verb) extracted from the BNC was analysed to address the following issues. First, the challenge is to identify the exact semantic range of be able to. The results show that, contrary to general assumption, be able to does not only express ability but shares most of the root meanings usually associated with the possibility modals can and could. The data reveal that what is called opportunity is, in fact, the most frequent meaning of be able to. Second, attention will be given to the notion of actualisation. It is commonly argued that be able to is the preferred form when the residue actualises: (1) The only reason he was able to do that was because of the restriction (BNC, spoken) (2) It is only through my imaginative shuffling of the aces that we are able to stay ahead of the pack.
(BNC, written) Although this notion has been studied in detail within formal semantic approaches, empirical data are crucially lacking, and it is unclear whether actualisation constitutes a conventional (and distinguishing) property of be able to. The empirical analysis provides solid evidence that actualisation is indeed a conventional feature of the modal. Furthermore, the dataset reveals that be able to expresses actualised 'opportunities' and not actualised 'abilities'. In the final part of this paper, attention will be given to the theoretical implications of the empirical findings, and in particular to the following paradox: how can the same expression encode both modal meaning (non-factual) and actualisation (factual)? It will be argued that this largely depends on one's conception of the semantics-pragmatics interface, and that it need not be an issue when actualisation (unlike modality) is analysed as a generalised conversational implicature and thus considered part of the conventional pragmatic layer of be able to.
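The frequency claims above rest on hand-annotating a random sample of concordance lines with meaning labels and tallying the results. The Python sketch below illustrates such a tally on simulated data; the meaning labels echo the categories discussed, but the weights, seed, and sample are purely illustrative assumptions, not figures from the study.

```python
import random
from collections import Counter

# Hypothetical label set: each concordance line of "be able to" is annotated
# with one root meaning, mirroring the categories discussed above.
MEANINGS = ["opportunity", "ability", "permission", "general possibility"]

def meaning_distribution(annotations):
    """Relative frequency of each meaning label in an annotated sample."""
    counts = Counter(annotations)
    total = sum(counts.values())
    return {meaning: n / total for meaning, n in counts.items()}

# Simulate annotations for a 1000-line sample (the weights are illustrative
# only; the abstract reports that 'opportunity' is the most frequent reading).
random.seed(42)
sample = random.choices(MEANINGS, weights=[55, 25, 5, 15], k=1000)

dist = meaning_distribution(sample)
most_frequent = max(dist, key=dist.get)
print(most_frequent)
```

On real data, `sample` would be replaced by the annotators' labels for the 1000 BNC concordance lines per verb, with the same tallying step.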

Keywords: actualisation, modality, pragmatics, semantics

Procedia PDF Downloads 111
1063 Transdisciplinary Pedagogy: An Arts-Integrated Approach to Promote Authentic Science, Technology, Engineering, Arts, and Mathematics Education in Initial Teacher Education

Authors: Anne Marie Morrin

Abstract:

This paper will focus on the design, delivery, and assessment of a transdisciplinary STEAM (Science, Technology, Engineering, Arts, and Mathematics) education initiative in a college of education in Ireland. The project explores a transdisciplinary approach to supporting STEAM education in which the concepts, methodologies, and assessments employed derive from visual art sessions within initial teacher education. The research will demonstrate that the STEAM education approach is effective when visual art concepts and methods are placed at the core of the teaching and learning experience. Within this study, emphasis is placed on authentic collaboration and transdisciplinary pedagogical approaches to the STEAM subjects. The partners included a combination of teaching expertise in STEM and visual arts education, artists, in-service and pre-service teachers, and children. The inclusion of all these stakeholders moves towards a more authentic approach in which transdisciplinary practice is at the core of the teaching and learning. Qualitative data were collected using a combination of questionnaires (focused and open-ended questions) and focus groups. In addition, data were collected through video diaries in which students reflected on their visual journals and transdisciplinary practice, giving rich insight into participants' experiences and opinions on their learning. It was found that an effective programme of STEAM education integration was informed by co-teaching (continuous professional development), which involved a commitment to adaptable and flexible approaches to teaching, learning, and assessment, as well as continuous reflection-in-action by all participants. The transdisciplinary model of STEAM education was devised to reconceptualise how individual subject areas can develop essential skills and tackle critical issues (such as self-care and climate change) through data visualisation and technology.
The success of the project can be attributed to the collaboration, which was inclusive and flexible, and to the willingness of the various stakeholders to be involved in the design and implementation of the project from conception to completion. The case study approach taken is particularistic (focusing on the STEAM-ED project), descriptive (providing in-depth descriptions from varied and multiple perspectives), and heuristic (interpreting the participants' experiences and the meaning they attributed to them).

Keywords: collaboration, transdisciplinary, STEAM, visual arts education

Procedia PDF Downloads 33