Search results for: first language acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4466

2156 System for Electromyography Signal Emulation Through the Use of Embedded Systems

Authors: Valentina Narvaez Gaitan, Laura Valentina Rodriguez Leguizamon, Ruben Dario Hernandez B.

Abstract:

This work describes a physiological signal emulation system based on electromyography (EMG) signals initially obtained from muscle sensors. Characteristics are extracted from these signals to model and emulate specific arm movements. The main objective is to develop a new biomedical software system capable of generating physiological signals with embedded systems by establishing the characteristics of the acquired signals. The acquisition system used was Biosignals, with two EMG electrodes placed on the extensor and flexor muscles of the forearm. Processing algorithms were implemented to classify the signals generated by the arm muscles when performing specific movements such as wrist flexion-extension, palmar grip, and wrist pronation-supination. Matlab software was used to condition and preprocess the signals for subsequent classification. Each signal is then modeled mathematically so that it can be generated by the embedded system, and the accuracy of the obtained signal is validated using the cross-correlation percentage, reaching a precision of 96%. The equations are then discretized for emulation in the embedded system, yielding a system capable of generating physiological signals that match the characteristics required for medical analysis.
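
A minimal sketch of how a cross-correlation percentage between an acquired EMG signal and its modeled counterpart could be computed; the synthetic signals and the zero-lag normalized correlation used here are illustrative assumptions, not the authors' actual validation code.

```python
import numpy as np

def cross_correlation_percentage(acquired, emulated):
    """Normalized zero-lag cross-correlation between two signals, as a percentage."""
    a = np.asarray(acquired, dtype=float)
    e = np.asarray(emulated, dtype=float)
    # Remove the mean so the measure reflects waveform shape, not DC offset
    a = a - a.mean()
    e = e - e.mean()
    corr = np.dot(a, e) / (np.linalg.norm(a) * np.linalg.norm(e))
    return 100.0 * corr

# Illustrative example: a synthetic EMG-like burst and a slightly noisy model of it
t = np.linspace(0, 1, 1000)
acquired = np.sin(40 * np.pi * t) * np.exp(-((t - 0.5) ** 2) / 0.02)
emulated = acquired + 0.05 * np.random.default_rng(0).standard_normal(t.size)
print(f"cross-correlation: {cross_correlation_percentage(acquired, emulated):.1f}%")
```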

Keywords: classification, electromyography, embedded system, emulation, physiological signals

Procedia PDF Downloads 83
2155 Adopting Flocks of Birds Approach to Predator for Anomalies Detection on Industrial Control Systems

Authors: M. Okeke, A. Blyth

Abstract:

Industrial Control Systems (ICS) such as Supervisory Control And Data Acquisition (SCADA) can be seen in many different critical infrastructures, from nuclear management to utilities, medical equipment, power, waste, and engine management on ships and planes. The role SCADA plays in critical infrastructure has resulted in calls to secure these systems. Many lives depend on them for daily activities, and the attack vectors are becoming more sophisticated; hence, the security of ICS is vital, as a malfunction might result in huge risk. This paper describes how the application of the Prey Predator (PP) approach seen in flocks of birds could enhance the detection of malicious activities on ICS. The PP approach explains how these animals in groups or flocks detect predators by following some simple rules. They are not necessarily very intelligent animals, but their approach to solving complex issues such as detection through cooperation, coordination and communication is worth emulating. This paper emulates the flocking behavior seen in birds to detect predators. The PP approach adopts a six-nearest-bird rule for detecting any predator. Local and global bests are based on individual detection as well as group detection. The PP algorithm was designed following a MapReduce methodology based on a Split Detection Convergence (SDC) approach.
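
A minimal sketch of the six-nearest-bird idea: each monitoring agent ("bird") compares its own reading with its six nearest neighbours and raises a local alarm when it deviates too far from the flock. The features, threshold, and deviation rule below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def six_nearest_alarm(readings, positions, k=6, threshold=3.0):
    """Flag agents whose reading deviates strongly from their k nearest neighbours.

    readings : (n,) array of per-agent measurements (e.g., SCADA traffic features)
    positions: (n, d) array of agent coordinates defining neighbourhoods
    Returns a boolean array: True where an agent sights a "predator" (anomaly).
    """
    readings = np.asarray(readings, float)
    positions = np.asarray(positions, float)
    n = len(readings)
    alarms = np.zeros(n, dtype=bool)
    for i in range(n):
        dists = np.linalg.norm(positions - positions[i], axis=1)
        neighbours = np.argsort(dists)[1:k + 1]          # six nearest birds
        local_mean = readings[neighbours].mean()
        local_std = readings[neighbours].std() + 1e-9
        alarms[i] = abs(readings[i] - local_mean) > threshold * local_std
    return alarms

# Illustrative example: 50 agents, one carrying an anomalous reading
rng = np.random.default_rng(0)
pos = rng.random((50, 2))
vals = rng.normal(1.0, 0.1, 50)
vals[7] = 5.0                                            # injected anomaly
print(np.where(six_nearest_alarm(vals, pos))[0])
```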

Keywords: artificial life, industrial control system (ICS), IDS, prey predator (PP), SCADA, SDC

Procedia PDF Downloads 286
2154 Learners as Consultants: Knowledge Acquisition and Client Organisations-A Student as Producer Case Study

Authors: Barry Ardley, Abi Hunt, Nick Taylor

Abstract:

As a theoretical and practical framework, this study uses the student-as-producer approach to learning in higher education, as adopted by the Lincoln International Business School, University of Lincoln, UK. Student as producer positions learners as skilled and capable agents, able to participate as partners with tutors in live research projects. To illuminate the nature of this approach to learning and to highlight its critical issues, the authors report on two guided student consultancy projects. These were set up with the assistance of two local organisations in the city of Lincoln, UK. Using the student-as-producer model to deliver the projects enabled learners to acquire and develop a range of key skills and knowledge not easily accessible in more traditional educational settings. This paper presents a systematic case study analysis of the eight organising principles of the student-as-producer model, as adopted by university tutors. The experience of tutors implementing student as producer suggests that the model can be widely applied to benefit not only the learning and teaching experiences of higher education students and staff but also a university's research programme and its community partners.

Keywords: consultancy, learning, student as producer, research

Procedia PDF Downloads 63
2153 Refining Scheme Using Amphibious Epistemologies

Authors: David Blaine, George Raschbaum

Abstract:

The evaluation of DHCP has synthesized SCSI disks, and current trends suggest that the exploration of e-business that would allow for further study into robots will soon emerge. Given the current status of embedded algorithms, hackers worldwide obviously desire the exploration of replication, which embodies the confusing principles of programming languages. In our research we concentrate our efforts on arguing that erasure coding can be made "fuzzy", encrypted, and game-theoretic.

Keywords: SCSI disks, robot, algorithm, hacking, programming language

Procedia PDF Downloads 405
2152 A Soft System Approach to Explore Ill-Defined Issues in Distance Education System - A Case of Saudi Arabia

Authors: Sulafah Basahel

Abstract:

Nowadays, Higher Education Institutions (HEIs) around the world are attempting to utilize Information and Communication Technologies (ICTs) to enhance the learning process and strategies of knowledge delivery for students through Distance Education (DE) systems. Stakeholders in a DE system face a complex situation of different ill-defined and related issues that influence the decision-making process. In this study, systems thinking as a body of knowledge is used to explore the emergent properties that are produced from the connections between these issues and that could have either positive or negative outcomes for DE development. Checkland's Soft Systems Methodology (SSM) Mode 2 is employed in the cultural context of Saudi Arabia for knowledge acquisition among multiple stakeholders in DE, rather than for problem solving, in order to achieve an overall development of the DE system. This paper discusses some political and cultural issues, and the connections between them, that impact the effectiveness of stakeholders' activities and relations. This study will significantly contribute to both the systems thinking and education fields by leading decision makers in DE to reconsider future plans, strategies and right actions for more successful educational practices.

Keywords: distance education, higher education institutions, ill-defined issues, soft system methodology-Mode 2

Procedia PDF Downloads 256
2151 Preserving Urban Cultural Heritage with Deep Learning: Color Planning for Japanese Merchant Towns

Authors: Dongqi Li, Yunjia Huang, Tomo Inoue, Kohei Inoue

Abstract:

With urbanization, urban cultural heritage is facing the impact and destruction of modernization. Many historical areas are losing their historical information and regional cultural characteristics, so it is necessary to carry out systematic color planning for historical areas in conservation. Japan focused on urban color planning early and has developed a systematic approach to it. Hence, this paper selects five merchant towns from the category of important traditional building preservation areas in Japan as the subject of this study to explore the color structure and emotion of this type of historic area. First, an image semantic segmentation method identifies the buildings, roads, and landscape environments. Their color data were extracted for color composition and emotion analysis to summarize their common features. Second, Internet evaluations of the areas were processed with natural language processing for keyword extraction. The correlation analysis of the color structure and keywords provides a valuable reference for conservation decisions for these historic town areas. This paper also combines the color structure and Internet evaluation results with generative adversarial networks to generate predicted images of color structure improvements and color improvement schemes. The methods and conclusions of this paper can provide new ideas for the digital management of environmental colors in historic districts and a valuable reference for the inheritance of local traditional culture.
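
A minimal sketch of the colour-extraction step described above: given a semantic segmentation mask, the pixel colours of each class (buildings, roads, landscape) are clustered to obtain dominant colours per class. The class labels, clustering settings, and synthetic image are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

def dominant_colors_by_class(image_rgb, seg_mask, class_names, n_colors=3):
    """Cluster the pixel colours of each semantic class in a segmented image.

    image_rgb : (H, W, 3) uint8 image
    seg_mask  : (H, W) integer mask, one class label per pixel
    Returns {class_name: (n_colors, 3) array of dominant RGB colours}.
    """
    result = {}
    for label, name in enumerate(class_names):
        pixels = image_rgb[seg_mask == label].astype(float)
        if len(pixels) < n_colors:
            continue
        km = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels)
        result[name] = km.cluster_centers_.astype(int)
    return result

# Illustrative example: a synthetic two-class "street scene" with slight colour noise
rng = np.random.default_rng(0)
img = np.zeros((64, 64, 3), dtype=float)
img[:, :32] = (180, 150, 120)                     # "building" half
img[:, 32:] = (90, 90, 95)                        # "road" half
img = np.clip(img + rng.normal(0, 5, img.shape), 0, 255).astype(np.uint8)
mask = np.zeros((64, 64), dtype=int)
mask[:, 32:] = 1
print(dominant_colors_by_class(img, mask, ["building", "road"]))
```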

Keywords: historic districts, color planning, semantic segmentation, natural language processing

Procedia PDF Downloads 64
2150 Understanding Algerian International Student Mental Health Experiences in UK (United Kingdom) Universities: Difficulties of Disclosure, Help-Seeking and Coping Strategies

Authors: Nesrine Boussaoui

Abstract:

Background: International students often encounter challenges while studying in the UK, including communication and language barriers, lack of social networks, and socio-cultural differences that adversely impact their mental health. For Algerian international students (AISs), these challenges may be heightened, as English is not their first language and the culture of their homeland is substantially different from British culture, yet research has yet to incorporate their experiences and perspectives. Aim: The current study aimed to explore AISs' 1) understandings of mental health; 2) issues of disclosure for mental health difficulties; and 3) mental health help-seeking and coping strategies. Method: In-depth, audio-recorded semi-structured interviews (n = 20) with AISs in UK universities were conducted. An inductive, reflective thematic analysis approach was used. Findings: The following themes and associated sub-themes were developed: (1) Algerian cultural influences on mental health understanding (socio-cultural comparisons); (2) the paradox of the family (pressure vs. support); (3) stigma and fear of disclosure; (4) barriers to formal help-seeking (informal disclosure as a first step to seeking help); (5) communication barriers (resorting to the mother tongue to disclose); (6) self-reliance and religious coping. Conclusion: Recognising and understanding the challenges faced by AISs in terms of disclosure and mental health help-seeking is essential to reduce barriers to formal help-seeking. Informal disclosure among peers is often the first step to seeking help. Enhancing practitioners' cultural competences and awareness of diverse understandings of mental health and the role of religious coping among AISs may have transferable benefits to a wider international student population.

Keywords: mental health, stigma, coping, disclosure

Procedia PDF Downloads 117
2149 Enhancing Experiential Learning in a Smart Flipped Classroom: A Case Study

Authors: Fahri Benli, Sitalakshmi Venkartraman, Ye Wei, Fiona Wahr

Abstract:

A flipped classroom, which is a form of blended learning, shifts the focus from a teacher-centered approach to a learner-centered approach. However, not all learners are ready to take the active role of knowledge and skill acquisition through a flipped classroom, and they continue to dwell in a passive mode of learning. This challenges educators in designing, scaffolding and facilitating in-class activities for students to have active learning experiences in a flipped classroom environment. Experiential learning theories have been employed by educators in the past in physical classrooms, based on the principle that knowledge can be actively developed through direct experience. However, with more online teaching witnessed recently, there are inherent limitations in designing and simulating an experiential learning activity for an online environment. In this paper, we explore enhancing experiential learning using smart digital tools that could be employed in a flipped classroom within a higher education setting. As a case study, we present the use of smart collaborative tools online to enhance an experiential learning activity for teaching the higher-order cognitive concepts of business process modelling.

Keywords: experiential learning, flipped classroom, smart software tools, online learning, higher-order learning attributes

Procedia PDF Downloads 173
2148 Intensive Crosstalk between Autophagy and Intracellular Signaling Regulates Osteosarcoma Cell Survival Response under Cisplatin Stress

Authors: Jyothi Nagraj, Sudeshna Mukherjee, Rajdeep Chowdhury

Abstract:

Autophagy has recently been linked with cancer cell survival after drug insult, contributing to the acquisition of resistance. However, the molecular signaling governing the autophagic survival response is poorly explored. In our study, cisplatin shock was found to activate both MAPK and autophagy signaling in osteosarcoma (OS) cells. Activation of JNK and autophagy acted as a pro-survival strategy, while ERK1/2 triggered apoptotic signals upon cisplatin stress. An increased sensitivity of the cells to cisplatin was obtained with simultaneous inhibition of both autophagy and the JNK pathway. Furthermore, we observed that the autophagic stimulation upon drug stress regulates other developmentally active signaling pathways, such as the Hippo pathway, in OS cells. Cisplatin-resistant cells were thereafter developed by repetitive drug exposure followed by clonal selection. Basal levels of autophagy were found to be high in resistant cells too. However, the signaling mechanism leading to autophagic up-regulation and its regulatory effect differed in OS cells upon attaining drug resistance. Our results provide valuable clues to the regulatory dynamics of autophagy that can be considered for the development of improved therapeutic strategies against resistant cancers.

Keywords: JNK, autophagy, drug resistance, cancer

Procedia PDF Downloads 275
2147 Building an Opinion Dynamics Model from Experimental Data

Authors: Dino Carpentras, Paul J. Maher, Caoimhe O'Reilly, Michael Quayle

Abstract:

Opinion dynamics is a sub-field of agent-based modeling that focuses on people’s opinions and their evolution over time. Despite the rapid increase in the number of publications in this field, it is still not clear how to apply these models to real-world scenarios. Indeed, there is no agreement on how people update their opinion while interacting. Furthermore, it is not clear whether different topics will show the same dynamics (e.g., more polarized topics may behave differently). These problems are mostly due to the lack of experimental validation of the models. Some previous studies started bridging this gap in the literature by directly measuring people’s opinions before and after an interaction. However, these experiments force people to express their opinion as a number instead of using natural language (and then, eventually, encoding it as numbers). This is not the way people normally interact, and it may strongly alter the measured dynamics. Another limitation of these studies is that they usually average all the topics together, without checking whether different topics show different dynamics. In our work, we collected data from 200 participants on 5 unpolarized topics. Participants expressed their opinions in natural language (“agree” or “disagree”). We also measured the certainty of their answer, expressed as a number between 1 and 10. However, this value was not shown to other participants to keep the interaction based on natural language. We then showed the opinion (and not the certainty) of another participant and, after a distraction task, we repeated the measurement. To make the data compatible with opinion dynamics models, we multiplied opinion and certainty to obtain a new parameter (here called “continuous opinion”) ranging from -10 to +10 (using agree=1 and disagree=-1). We first checked the 5 topics individually, finding that all of them behaved in a similar way despite having different initial opinion distributions. This suggested that the same model could be applied to different unpolarized topics. We also observed that people tend to maintain similar levels of certainty, even when they change their opinion. This is a strong violation of what is suggested by common models, where people starting at, for example, +8 will first move towards 0 instead of directly jumping to -8. We also observed social influence, meaning that people exposed to “agree” were more likely to move to higher levels of continuous opinion, while people exposed to “disagree” were more likely to move to lower levels. However, we also observed that the effect of influence was smaller than the effect of random fluctuations. This configuration also differs from standard models, where noise, when present, is usually much smaller than the effect of social influence. Starting from this, we built an opinion dynamics model that explains more than 80% of the data variance. This model was also able to show the natural emergence of polarization from unpolarized initial states. This experimental approach offers a new way to build models grounded on experimental data. Furthermore, the model offers new insight into the fundamental terms of opinion dynamics models.
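
A minimal sketch of the data encoding described above (continuous opinion = opinion × certainty) together with one illustrative update step that combines a weak social-influence pull with stronger random fluctuations; the coefficients are assumptions for illustration, not the fitted parameters of the authors' model.

```python
import numpy as np

def continuous_opinion(agree: bool, certainty: int) -> int:
    """Encode agree/disagree (natural language) and certainty (1-10) on a -10..+10 scale."""
    return (1 if agree else -1) * certainty

def interact(opinion, shown_agree, influence=0.5, noise_sd=2.0, rng=None):
    """One interaction: a small shift toward the shown opinion plus larger random noise.

    The relative sizes of `influence` and `noise_sd` mirror the finding that random
    fluctuations outweighed social influence; both values are illustrative.
    """
    if rng is None:
        rng = np.random.default_rng()
    shift = influence if shown_agree else -influence
    new_opinion = opinion + shift + rng.normal(0.0, noise_sd)
    return float(np.clip(new_opinion, -10, 10))

# Illustrative example: a participant who disagrees with certainty 7 is shown "agree"
before = continuous_opinion(agree=False, certainty=7)   # -7
after = interact(before, shown_agree=True, rng=np.random.default_rng(1))
print(before, "->", round(after, 2))
```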

Keywords: experimental validation, micro-dynamics rule, opinion dynamics, update rule

Procedia PDF Downloads 96
2146 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language

Authors: Wenjun Hou, Marek Perkowski

Abstract:

The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of the least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, no quantum circuit implementation of these algorithms has been created, to the best of our knowledge. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measures based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed, evaluating the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time allows the decision problem to be transformed into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equiprobabilistic superposition by applying the Hadamard gate on each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, node index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including their special forms, the Feynman and Pauli X gates. The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the calculated nodes from the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
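
A minimal back-of-the-envelope sketch, in Python rather than Q#, of the resource counts implied by the encoding above: N log₂ K qubits for the search register and roughly (π/4)·√(2ⁿ/M) Grover iterations for M marked states. The graph size and assumed number of solutions are illustrative, not values from the paper.

```python
import math

def grover_resources(n_nodes: int, degree_bound: int, n_solutions: int = 1):
    """Qubit and iteration estimates for the bounded-degree TSP encoding sketched above.

    n_nodes      : N, number of graph nodes
    degree_bound : K, bound on node degree (each node stores log2 K edge choices)
    n_solutions  : assumed number of marked states M for the current cost threshold
    """
    qubits = n_nodes * math.ceil(math.log2(degree_bound))   # N * log2(K) search qubits
    search_space = 2 ** qubits
    iterations = math.floor((math.pi / 4) * math.sqrt(search_space / n_solutions))
    return qubits, iterations

# Illustrative example: 8 nodes, degree bound 4, one optimal cycle assumed
q, it = grover_resources(n_nodes=8, degree_bound=4, n_solutions=1)
print(f"search register: {q} qubits, ~{it} Grover iterations")
```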

Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language

Procedia PDF Downloads 171
2145 Valuation of Entrepreneurship Education (EE) Curriculum and Self-Employment Generation among Graduates of Tertiary Institutions in Edo State, Nigeria

Authors: Angela Obose Oriazowanlan

Abstract:

Despite the introduction of entrepreneurship education into the Nigerian university curriculum to prepare graduates for self-employment roles in order to abate employment challenges, the graduate unemployment rate still soars. The study therefore examined the relevance of the curriculum contents and its delivery mechanism for equipping graduates with appropriate entrepreneurial skills prior to graduation. Four research questions and two hypotheses guided the study. The survey research design was adopted. An infinite population of graduates from a five-year period was assumed, from which a sample of 200 representatives was drawn using the simple random sampling technique. A 45-item structured questionnaire was used for data gathering. The gathered data were analysed using the descriptive statistics of mean and standard deviation, while the formulated hypotheses were tested with the Z-score at the 0.05 level of significance. The findings revealed, among others, that graduates' acquisition of appropriate entrepreneurial skills for self-employment generation is low due to curriculum deficiencies, insufficient time allotment, and the delivery mechanism. It was recommended, among others, that the curriculum should be reviewed to improve its relevance and that sufficient time should be allotted to enable an adequate teaching and learning process.

Keywords: evaluation of entrepreneurship education (EE) curriculum, self-employment generation, graduates of tertiary institutions, Edo state, Nigeria

Procedia PDF Downloads 85
2144 Voices and Pictures from an Online Course and a Face to Face Course

Authors: Eti Gilad, Shosh Millet

Abstract:

In light of technological development and its introduction into the field of education, an online course was designed in parallel to the 'conventional' course for teaching 'Qualitative Research Methods'. This study aimed to characterize learning-teaching processes in a 'Qualitative Research Methods' course studied in two different frameworks. Moreover, its objective was to explore the difference between the culture of a physical learning environment and that of online learning. The research monitored four learner groups, a total of 72 students, for two years, with two groups from the two course frameworks each year. The courses were obligatory for M.Ed. students at an academic college of education and were given by one female lecturer. The research was conducted using a qualitative case study method in order to attain insights about occurrences in the actual contexts and sites in which they transpire. The research tools were an open-ended questionnaire and reflections in the form of vignettes (meaningful short pictures) administered to all students, as well as an interview with the lecturer. The tools facilitated not only triangulation but also the collection of data consisting of voices and pictures of teaching and learning. The most prominent findings are differences between the two courses in the features of the learning environment culture for the acquisition of contents and qualitative research tools; these were manifested in teaching methods, illustration aids, the lecturer's profile and the students' profile.

Keywords: face to face course, online course, qualitative research, vignettes

Procedia PDF Downloads 403
2143 Comprehensive Expert and Social Assessment of the Urban Environment of Almaty in the Process of Training Master's and Doctoral Students on Architecture and Urban Planning

Authors: Alexey Abilov

Abstract:

The article highlights the experience of training master's and doctoral students at Satbayev University through the preparation of their coursework for the disciplines "Principles of Sustainable Architecture", "Energy Efficiency in Urban Planning", "Urban Planning Analysis", and "Social Foundations of Architecture". The purpose of these works is the acquisition by students of practical skills necessary in their future professional activities, which is achieved through a comprehensive assessment of individual sections of the Almaty urban environment. The methodology of the student research carried out under the guidance of the author of this publication is based on an expert assessment of the territory through a full-scale survey, analysis of project documents and statistical data, as well as on a social assessment of the territory based on the results of a questionnaire survey of residents. A comprehensive qualitative and quantitative assessment of the selected sites according to criteria of the quality of the living environment also allows specific recommendations to be formulated for designers who carry out a pre-project analysis of the city territory in the process of preparing draft master plans and detailed planning projects.

Keywords: urban environment, expert/social assessment of the territory, questionnaire survey, comprehensive approach

Procedia PDF Downloads 57
2142 Role of Geomatics in Architectural and Cultural Conservation

Authors: Shweta Lall

Abstract:

The intent of this paper is to demonstrate the role of this computerized auxiliary science in advancing the desired and necessary alliance of historians, surveyors, topographers, and analysts in architectural conservation and management. The digital-era practice of recording architectural and cultural heritage in view of its preservation, dissemination, and planning developments is discussed in this paper. Geomatics includes practices such as remote sensing, photogrammetry, surveying, Geographic Information Systems (GIS), laser scanning technology, etc. All these resources help in architectural and conservation applications, which are identified through various case studies analysed in this paper. The standardised outcomes and methodologies are listed and described using relevant case studies. The main components of the geomatics methodology adopted in conservation are data acquisition, processing, and presentation. Geomatics is used in a wide range of activities involved in architectural and cultural heritage: damage and risk assessment analysis, documentation, 3-D model construction, virtual reconstruction, spatial and structural decision-making analysis, and monitoring. This paper summarises the capabilities and limitations of the geomatics field in architectural and cultural conservation. Policy-makers, urban planners, architects, and conservationists not only need answers to these questions but also need to practise them in a predictable, transparent, spatially explicit and inexpensive manner.

Keywords: architectural and cultural conservation, geomatics, GIS, remote sensing

Procedia PDF Downloads 127
2141 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy

Authors: Amir Tosson, Mohammad Reza, Christian Gutt

Abstract:

Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
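
A minimal sketch of what a "dynamic" metadata record might look like for an XPCS measurement: a fixed core that every record must carry plus an extensible, experiment-specific part merged and checked at run time. All field names here are illustrative assumptions, not a published schema.

```python
import json

# Fixed core fields that every record is expected to carry (illustrative names)
CORE_FIELDS = {"sample_id", "facility", "technique", "timestamp"}

def make_record(core: dict, dynamic: dict) -> dict:
    """Merge mandatory core metadata with experiment-specific dynamic fields."""
    missing = CORE_FIELDS - core.keys()
    if missing:
        raise ValueError(f"missing core metadata: {sorted(missing)}")
    return {**core, "dynamic": dynamic}

record = make_record(
    core={"sample_id": "S-042", "facility": "synchrotron-X", "technique": "XPCS",
          "timestamp": "2023-05-01T10:00:00Z"},
    dynamic={"q_range_nm^-1": [0.01, 0.1], "exposure_s": 0.005, "detector": "area-det-1"},
)
print(json.dumps(record, indent=2))
```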

Keywords: metadata, FAIR, data analysis, XPCS, IoT

Procedia PDF Downloads 46
2140 Neural Network based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children

Authors: Budhvin T. Withana, Sulochana Rupasinghe

Abstract:

The educational system faces a significant concern with regard to Dyslexia and Dysgraphia, which are learning disabilities impacting reading and writing abilities. This is particularly challenging for children who speak the Sinhala language due to its complexity and uniqueness. Commonly used methods to detect the risk of Dyslexia and Dysgraphia rely on subjective assessments, leading to limited coverage and time-consuming processes. Consequently, delays in diagnoses and missed opportunities for early intervention can occur. To address this issue, the project developed a hybrid model that incorporates various deep learning techniques to detect the risk of Dyslexia and Dysgraphia. Specifically, ResNet50, VGG16, and YOLOv8 models were integrated to identify handwriting issues. The outputs of these models were then combined with other input data and fed into an MLP model. Hyperparameters of the MLP model were fine-tuned using Grid Search CV, enabling the identification of optimal values for the model. This approach proved to be highly effective in accurately predicting the risk of Dyslexia and Dysgraphia, providing a valuable tool for early detection and intervention. The ResNet50 model exhibited a training accuracy of 0.9804 and a validation accuracy of 0.9653. The VGG16 model achieved a training accuracy of 0.9991 and a validation accuracy of 0.9891. The MLP model demonstrated impressive results with a training accuracy of 0.99918, a testing accuracy of 0.99223, and a loss of 0.01371. These outcomes showcase the high accuracy achieved by the proposed hybrid model in predicting the risk of Dyslexia and Dysgraphia.
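
A minimal sketch of the fusion stage described above: CNN-derived scores are concatenated with other input features and passed to an MLP whose hyperparameters are tuned with Grid Search CV. The feature dimensions, synthetic data, and parameter grid are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Illustrative synthetic data: 3 CNN risk scores + 5 other assessment features per child
rng = np.random.default_rng(0)
cnn_scores = rng.random((300, 3))            # e.g., ResNet50 / VGG16 / YOLOv8 outputs
other_feats = rng.random((300, 5))           # e.g., reading/writing test measures
X = np.hstack([cnn_scores, other_feats])
y = (cnn_scores.mean(axis=1) + 0.3 * other_feats[:, 0] > 0.65).astype(int)  # toy labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

pipe = make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000, random_state=0))
param_grid = {
    "mlpclassifier__hidden_layer_sizes": [(32,), (64, 32)],
    "mlpclassifier__alpha": [1e-4, 1e-3],
}
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X_tr, y_tr)
print("best params:", search.best_params_)
print("test accuracy:", search.score(X_te, y_te))
```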

Keywords: neural networks, risk detection system, dyslexia, dysgraphia, deep learning, learning disabilities, data science

Procedia PDF Downloads 46
2139 Collaborative Online International Learning with Different Learning Goals: A Second Language Curriculum Perspective

Authors: Andrew Nowlan

Abstract:

During the Coronavirus pandemic, collaborative online international learning (COIL) emerged as an alternative to overseas sojourns. However, now that face-to-face classes have resumed and students are studying abroad, the rationale for doing COIL is not always clear amongst educators and students. Also, the logistics of COIL become increasingly complicated when participants involved in a potential collaboration have different second language (L2) learning goals. In this paper, the researcher will report on a study involving two bilingual, cross-cultural COIL courses between students at a university in Japan and those studying in North America, from April to December, 2022. The students in Japan were enrolled in an intercultural communication class in their L2 of English, while the students in Canada and the United States were studying intermediate Japanese as their L2. Based on a qualitative survey and journaling data received from 31 students in Japan, and employing a transcendental phenomenological research design, the researcher will highlight the students’ essence of experience during COIL. Essentially, students benefited from the experience through improved communicative competences and increased knowledge of the target culture, even when the L2 learning goals between institutions differed. Students also reported that the COIL experience was effective in preparation for actual study abroad, as opposed to a replacement for it, which challenges the existing literature. Both educators and administrators will be exposed to the perceptions of Japanese university students towards COIL, which could be generalized to other higher education contexts, including those in Southeast Asia. Readers will also be exposed to ideas for developing more effective pre-departure study abroad programs and domestic intercultural curriculum through COIL, even when L2 learning goals may differ between participants.

Keywords: collaborative online international learning, study abroad, phenomenology, EdTech, intercultural communication

Procedia PDF Downloads 68
2138 Microbiota Effect with Cytokine in HL and NHL Patient Group

Authors: Ekin Ece Gürer, Tarık Onur Tiryaki, Sevgi Kalayoğlu Beşışık, Fatma Savran Oğuz, Uğur Sezerman, Fatma Erdem, Gülşen Günel, Dürdane Serap Kuruca, Zerrin Aktaş, Oral Öncül

Abstract:

Aim: Chemotherapy treatment in Hodgkin Lymphoma (HL) and Non-Hodgkin Lymphoma (NHL) causes gastrointestinal epithelial damage, disrupts the intestinal microbiota balance and causes dysbiosis. Our study aimed to show the effect of the damage caused by chemotherapy on the microbiota and the effect of the changing microbiota flora on the course of the disease. Materials and Methods: Seven adult HL and seven adult NHL patients scheduled to be treated with chemotherapy were included in the study. Stool samples were taken twice, before chemotherapy treatment and after the 3rd course of treatment. Samples were sequenced using the Next Generation Sequencing (NGS) method after nucleic acid isolation. OTU tables were prepared using NCBI blastn version 2.0.12 according to the NCBI general 16S bacterial taxonomy reference dated 10.08.2021. Alpha diversity was calculated from the generated OTU tables with the R statistical computing language version 4.0.4 (readr, phyloseq, microbiome, vegan, descr and ggplot2 packages), and the corresponding graphics were created. Statistical analyses were also performed using R version 4.0.4 and the RStudio IDE 1.4 (tidyverse, readr, xlsx and ggplot2 packages). Expression of IL-12 and IL-17 cytokines was measured by rtPCR twice, before and after treatment. Results: In HL patients, a significant decrease was observed in the flora of the genus Ruminococcaceae_UCG-014 (p:0.036) and of an undefined Ruminococcaceae_UCG-014 species (p:0.036) compared to pre-treatment. When the post-treatment microbiota of HL patients was compared with healthy controls, a significant decrease was found in the genus Prevotella_7 (p:0.049) and in Butyricimonas (p:0.006). In NHL patients, a significant decrease was observed in the flora of the genus Coprococcus_3 (p:0.015) and of an undefined Ruminoclostridium_5 species (p:0.046) compared to pre-treatment. When the post-treatment microbiota of NHL patients was compared with healthy controls, a significant abundance of the class Bacilli (p:0.029) and a significant decrease in an undefined Alistipes species (p:0.047) were observed. While a decrease in IL-12 cytokine expression was observed relative to pre-treatment levels, an increase in IL-17 cytokine expression was detected. Discussion: Monitoring the intestinal flora after chemotherapy treatment shows that it can be a guide in the treatment of the disease. It is thought that increasing the diversity of commensal bacteria can also positively affect the prognosis of the disease.

Keywords: hodgkin lymphoma, non-hodgkin lymphoma, microbiota, cytokines

Procedia PDF Downloads 89
2137 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods

Authors: Ali Berkan Ural

Abstract:

This paper presents the analysis, evaluation, and pre-diagnosis of early-stage breast-based diagnostic problems (breast cancer, nodules or lumps) by a Computer Aided Diagnosing (CAD) system from mammogram radiological images. According to the statistics, the time factor is crucial to discovering the disease in the patient (especially in women) as early and as fast as possible. In the study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with more accuracy. This system first works with image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.) and segments the area of interest of the breast, and then analyzes these partly obtained areas for cancer detection/lumps in order to diagnose the disease. After segmentation, using the spectrogram images, 5 different deep learning based methods (specified Convolutional Neural Network (CNN) based AlexNet, ResNet50, VGG16, DenseNet, Xception) are applied to classify the breast-based problems.
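
A minimal Keras sketch of the final classification step: a CNN backbone applied to ROI images produced by the segmentation pipeline. The input shape, binary class setup, synthetic data, and training settings are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

# Illustrative: binary classification (problem vs. normal) of 224x224 ROI images
base = ResNet50(weights=None, include_top=False, input_shape=(224, 224, 3), pooling="avg")
model = models.Sequential([
    base,
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic stand-in for segmented ROI images and their labels
x = np.random.rand(16, 224, 224, 3).astype("float32")
y = np.random.randint(0, 2, size=(16, 1))
model.fit(x, y, epochs=1, batch_size=4, verbose=0)
print(model.predict(x[:2], verbose=0))
```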

Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning

Procedia PDF Downloads 75
2136 Deep Learning-Based Approach to Automatic Abstractive Summarization of Patent Documents

Authors: Sakshi V. Tantak, Vishap K. Malik, Neelanjney Pilarisetty

Abstract:

A patent is an exclusive right granted for an invention. It can be a product or a process that provides an innovative method of doing something, or offers a new technical perspective or solution to a problem. A patent can be obtained by making the technical information and details about the invention publicly available. The patent owner has exclusive rights to prevent or stop anyone from using the patented invention for commercial uses. Any commercial usage, distribution, import or export of a patented invention or product requires the patent owner’s consent. It has been observed that the central and important parts of patents are scripted in idiosyncratic and complex linguistic structures that can be difficult to read, comprehend or interpret for the masses. The abstracts of these patents tend to obfuscate the precise nature of the patent instead of clarifying it via direct and simple linguistic constructs. This makes it necessary to have an efficient access to this knowledge via concise and transparent summaries. However, as mentioned above, due to complex and repetitive linguistic constructs and extremely long sentences, common extraction-oriented automatic text summarization methods should not be expected to show a remarkable performance when applied to patent documents. Other, more content-oriented or abstractive summarization techniques are able to perform much better and generate more concise summaries. This paper proposes an efficient summarization system for patents using artificial intelligence, natural language processing and deep learning techniques to condense the knowledge and essential information from a patent document into a single summary that is easier to understand without any redundant formatting and difficult jargon.
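
A minimal sketch of an abstractive summarization step using a pretrained sequence-to-sequence model through the Hugging Face transformers pipeline; the model choice, length limits, and sample patent text are illustrative assumptions and do not represent the system described above.

```python
from transformers import pipeline

# Illustrative: a generic pretrained abstractive summarizer (not the paper's own model)
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

patent_text = (
    "A method and apparatus for controlling the thermal output of a device, "
    "wherein a sensor measures the operating temperature and a controller "
    "adjusts a cooling element in response to the measured temperature, "
    "thereby maintaining the device within a predetermined temperature range."
)

summary = summarizer(patent_text, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```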

Keywords: abstractive summarization, deep learning, natural language processing, patent document

Procedia PDF Downloads 110
2135 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing and inversion of electrical resistivity data, based on a compilation of academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method has also been presented.
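
A minimal sketch of a single damped linearized least-squares (Gauss-Newton/Marquardt-style) update of the kind used in 2-D resistivity inversion, m_{k+1} = m_k + (JᵀJ + λI)⁻¹ Jᵀ(d − f(m_k)); the forward model below is a toy linear operator standing in for a real resistivity forward solver.

```python
import numpy as np

def gauss_newton_step(m, d_obs, forward, jacobian, damping=0.1):
    """One damped linearized least-squares update of the model vector m."""
    J = jacobian(m)                       # sensitivity matrix (n_data x n_model)
    r = d_obs - forward(m)                # data residual
    lhs = J.T @ J + damping * np.eye(m.size)
    return m + np.linalg.solve(lhs, J.T @ r)

# Toy stand-in for a resistivity forward problem: d = G m with a smoothing kernel
rng = np.random.default_rng(0)
G = np.exp(-np.abs(np.subtract.outer(np.arange(20), np.arange(10))) / 3.0)
m_true = np.ones(10)
m_true[4:7] = 3.0                                     # a buried anomaly
d_obs = G @ m_true + 0.01 * rng.standard_normal(20)

m = np.ones(10)                                       # homogeneous starting model
for _ in range(10):
    m = gauss_newton_step(m, d_obs, lambda x: G @ x, lambda x: G, damping=0.05)
print(np.round(m, 2))
```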

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 345
2134 Redefining “Minor”: An Empirical Research on Two Biennials in Contemporary China

Authors: Mengwei Li

Abstract:

Since the 1990s, biennials, large-scale transnational art exhibitions, have proliferated exponentially across the globe, particularly in Asia, Africa, and Latin America. This has spurred debates regarding the inclusion of "new art cultures" and the deconstruction of the mechanism of exclusion embedded in the Western monopoly on art. Hans Belting introduced the concept of "global art" in 2013 to denounce the West's privileged canons in art by emphasising the inclusion of art practices from alleged non-Western regions. Arguably, the rise of new biennial networks developed by these locations has contributed to the asserted "inclusion of new art worlds." However, phrases such as "non-Western" and "beyond Euro-American" attached to these discussions raise the question of non- or beyond- in relation to whom. In this narrative, to become "integrated" and "equal" implies entry into the "core," a universal system in which preexisting authoritative voices define "newcomers" by what they are not. Possibly, if there is a global biennial system that symbolises a "universal language" of the contemporary art world, it is centered on the inherently dynamic yet asymmetrical interaction and negotiation between the "core" and the "periphery" of the rest of the world. Engaging with the theory of "minor literature" developed by Deleuze and Guattari, this research proposes an epistemological framework to comprehend the global biennial discourse since the 1990s. Using this framework, this research looks at two biennial models in China: the 13th Shanghai Biennale, which was organised in the country's metropolitan art centre, and the 2nd Yinchuan Biennale, which was inaugurated in a city that is geographically and economically marginalised compared to domestic centres. By analysing how these two biennials from different locations in China positioned themselves and conveyed their local profiles through the universal language of the biennial, this research identifies a potential "minor" positionality within the global biennial discourse from China's perspective.

Keywords: biennials, China, contemporary, global art, minor literature

Procedia PDF Downloads 73
2133 Using Satellite Images Datasets for Road Intersection Detection in Route Planning

Authors: Fatma El-Zahraa El-Taher, Ayman Taha, Jane Courtney, Susan Mckeever

Abstract:

Understanding road networks plays an important role in navigation applications such as self-driving vehicles and route planning for individual journeys. Intersections of roads are essential components of road networks. Understanding the features of an intersection, from a simple T-junction to larger multi-road junctions, is critical to decisions such as crossing roads or selecting the safest routes. The identification and profiling of intersections from satellite images is a challenging task. While deep learning approaches offer the state of the art in image classification and detection, the availability of training datasets is a bottleneck for this approach. In this paper, a labelled satellite image dataset for the intersection recognition problem is presented. It consists of 14,692 satellite images of Washington DC, USA. To support other users of the dataset, an automated download and labelling script is provided for dataset replication. The challenges of constructing and fine-grained feature labelling of a satellite image dataset are examined, including the issue of how to address features that are spread across multiple images. Finally, the accuracy of the detection of intersections in satellite images is evaluated.

Keywords: satellite images, remote sensing images, data acquisition, autonomous vehicles

Procedia PDF Downloads 123
2132 Neural Network-based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children

Authors: Budhvin T. Withana, Sulochana Rupasinghe

Abstract:

The problem of Dyslexia and Dysgraphia, two learning disabilities that affect reading and writing abilities, respectively, is a major concern for the educational system. Due to the complexity and uniqueness of the Sinhala language, these conditions are especially difficult for children who speak it. The traditional risk detection methods for Dyslexia and Dysgraphia frequently rely on subjective assessments, which makes wide coverage difficult and the process time-consuming. As a result, diagnoses may be delayed and opportunities for early intervention may be lost. The project was approached by developing a hybrid model that utilizes various deep learning techniques for detecting the risk of Dyslexia and Dysgraphia. Specifically, ResNet50, VGG16 and YOLOv8 were integrated to detect handwriting issues, and their outputs were fed into an MLP model along with several other input data. The hyperparameters of the MLP model were fine-tuned using Grid Search CV, which allowed the optimal values to be identified for the model. This approach proved to be effective in accurately predicting the risk of Dyslexia and Dysgraphia, providing a valuable tool for early detection and intervention for these conditions. The ResNet50 model achieved an accuracy of 0.9804 on the training data and 0.9653 on the validation data. The VGG16 model achieved an accuracy of 0.9991 on the training data and 0.9891 on the validation data. The MLP model achieved an impressive training accuracy of 0.99918 and a testing accuracy of 0.99223, with a loss of 0.01371. These results demonstrate that the proposed hybrid model achieved a high level of accuracy in predicting the risk of Dyslexia and Dysgraphia.

Keywords: neural networks, risk detection system, Dyslexia, Dysgraphia, deep learning, learning disabilities, data science

Procedia PDF Downloads 79
2131 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis

Authors: Wenbo Du, Xiaomei Ma

Abstract:

With the empowerment of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to dig out more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnostic purposes, with very few studies concerned with learners' group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, in that it might be more practical in a classroom setting. Additionally, the group-level diagnostic information obtained via current CDA always results in a "flat pattern", that is, the mastery/non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome does not bring much more benefit than the original total score. To address these issues, the present study attempts to apply cluster analysis for group classification and quantile regression analysis to pinpoint learners' performance at different proficiency levels (beginner, intermediate and advanced), thus enhancing the interpretation of the CDA results extracted from a group of EFL learners' reading performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA, and quantile regression analysis pictures more insightful characteristics of learners with different reading proficiencies. The findings are helpful and practical for instructors to refine the EFL reading curriculum and tailor instructional plans based on the group classification results and quantile regression analysis. Meanwhile, these innovative statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
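
A minimal sketch combining EM-based clustering (a Gaussian mixture) for group classification with quantile regression at several proficiency-related quantiles; the synthetic attribute scores and chosen quantiles are illustrative assumptions, not the PELDiaG data or model specification.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n = 300
skill_mastery = rng.random((n, 4))                    # attribute mastery probabilities
total_score = 20 * skill_mastery @ np.array([0.4, 0.3, 0.2, 0.1]) + rng.normal(0, 1, n)

# EM clustering of learners into three latent groups based on attribute profiles
groups = GaussianMixture(n_components=3, random_state=0).fit_predict(skill_mastery)

# Quantile regression of total score on one attribute at three proficiency quantiles
X = sm.add_constant(skill_mastery[:, 0])
for q in (0.25, 0.5, 0.75):
    fit = sm.QuantReg(total_score, X).fit(q=q)
    print(f"q={q}: slope for attribute 1 = {fit.params[1]:.2f}")
print("group sizes:", np.bincount(groups))
```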

Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression

Procedia PDF Downloads 135
2130 Investigating Introvert and Extrovert University Students’ Perception of the Use of Interactive Digital Tools in a Face-To-Face ESP Class

Authors: Eunice Tang

Abstract:

The main focus of this study is investigating introvert and extrovert university students' perception of the use of interactive digital tools (such as Padlet and Mentimeter) in a face-to-face English for Specific Purposes (ESP) class after all classes in the university had been switched to online mode for three semesters. The subjects of the study were business students from three ESP classes at The Hong Kong University of Science and Technology. The basic tool for data collection was an anonymous online survey, which included 3 required multiple-choice questions and 3 open questions (2 required; 1 optional) about the effects of interactive digital tools on the students' amount of contribution to class discussions, their perception of the role of interactive digital tools in the sharing of ideas, and whether the students considered themselves introvert or extrovert. The online survey will be emailed to all 54 students in the three ESP classes and subjected to a three-week data collection period. The survey results will then be analyzed qualitatively, focusing in particular on the effect the use of interactive digital tools had on the amount of contribution to class among introvert and extrovert students, their perception of a language class with and without digital tools and, most importantly, the implications for educators about how interactive digital tools can be used (or not) to cater to the needs of introvert and extrovert students. The pandemic has given educators various opportunities to use interactive digital tools in class, especially in an online environment. It is interesting for educators to explore the potential of such tools when classes are back face-to-face. This research thus offers the students' perspective on using interactive digital tools in a face-to-face classroom. While a lot has been said about introverted students responding positively to digital learning online, the students' perceptions of their own personality collected in the survey and the impact digital tools have on their contribution to class may shed some light on the potential of interactive digital tools in a post-pandemic era.

Keywords: psychology for language learning, interactive digital tools, personality-based investigation, ESP

Procedia PDF Downloads 166
2129 Information Extraction for Short-Answer Question for the University of the Cordilleras

Authors: Thelma Palaoag, Melanie Basa, Jezreel Mark Panilo

Abstract:

Checking short-answer questions and essays, whether in paper or electronic form, is a tiring and tedious task for teachers. Evaluating a student's output requires knowledge across a wide array of domains, and scoring the work is often a critical task. Several attempts have been made in the past few years to create automated writing assessment software, but these have received negative responses from teachers and students alike due to unreliable scoring, lack of feedback, and other issues. The study aims to create an application that will be able to check short-answer questions by incorporating information extraction. Information extraction is a subfield of Natural Language Processing (NLP) where a chunk of text (technically known as unstructured text) is broken down to gather necessary bits of data and/or keywords (structured text) to be further analyzed or utilized by query tools. The proposed system shall be able to extract keywords or phrases from an individual's answers and match them against a corpus of words (as defined by the instructor), which shall be the basis for evaluating the individual's answer. The proposed system shall also enable the teacher to provide feedback and re-evaluate the output of the student for writing elements which the computer cannot fully evaluate, such as creativity and logic. Teachers can formulate, design, and check short-answer questions efficiently by defining keywords or phrases as parameters and assigning weights for checking answers. With the proposed system, the teacher's time spent checking and evaluating student output shall be lessened, making the teacher more productive and the task easier.
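
A minimal sketch of the matching idea: instructor-defined keywords or phrases carry weights, the student's answer is searched for them, and the weights of matched items are accumulated into a score. The tokenization, rubric, and scoring scheme are illustrative assumptions, not the proposed system itself.

```python
import re

def score_answer(answer: str, keyword_weights: dict) -> float:
    """Sum the weights of instructor-defined keywords/phrases found in the answer."""
    text = answer.lower()
    score = 0.0
    for phrase, weight in keyword_weights.items():
        # whole-word/phrase match, case-insensitive
        if re.search(r"\b" + re.escape(phrase.lower()) + r"\b", text):
            score += weight
    return score

# Illustrative rubric for a short-answer question about photosynthesis
rubric = {"chlorophyll": 1.0, "sunlight": 1.0, "carbon dioxide": 1.5, "glucose": 1.5}
answer = "Plants use sunlight and carbon dioxide, and chlorophyll absorbs the light."
print(score_answer(answer, rubric), "out of", sum(rubric.values()))
```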

Keywords: information extraction, short-answer question, natural language processing, application

Procedia PDF Downloads 413
2128 Facility Data Model as Integration and Interoperability Platform

Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes

Abstract:

Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this entails increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to provide such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process, starting from input data acquisition, through ontology concept definition, and finally ontology concept population, is described. At the beginning, the core facility ontology was developed, representing the generic facility infrastructure comprised of the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, extension and then population of the core facility ontology were performed. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system is analyzed as well.
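
A minimal sketch of the core-ontology idea using rdflib: a few generic facility concepts are defined, then extended and populated for a specific site. The class names, namespace, and the airport-terminal instance are illustrative assumptions, not the actual Malpensa/Fiumicino models.

```python
from rdflib import Graph, Literal, Namespace, OWL, RDF, RDFS

FAC = Namespace("http://example.org/facility#")   # illustrative namespace
g = Graph()
g.bind("fac", FAC)

# Core facility ontology: generic concepts relevant to facility management
for cls in ("Facility", "Zone", "Equipment", "Sensor"):
    g.add((FAC[cls], RDF.type, OWL.Class))
g.add((FAC.Zone, RDFS.subClassOf, FAC.Facility))
g.add((FAC.hasSensor, RDF.type, OWL.ObjectProperty))

# Extension for a specific infrastructure (an airport terminal), then population
g.add((FAC.Terminal, RDFS.subClassOf, FAC.Zone))
g.add((FAC.terminal1, RDF.type, FAC.Terminal))
g.add((FAC.tempSensor42, RDF.type, FAC.Sensor))
g.add((FAC.terminal1, FAC.hasSensor, FAC.tempSensor42))
g.add((FAC.tempSensor42, RDFS.label, Literal("Terminal 1 temperature sensor")))

print(g.serialize(format="turtle"))
```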

Keywords: airport ontology, energy management, facility data model, ontology modeling

Procedia PDF Downloads 426
2127 Validating the Micro-Dynamic Rule in Opinion Dynamics Models

Authors: Dino Carpentras, Paul Maher, Caoimhe O'Reilly, Michael Quayle

Abstract:

Opinion dynamics is dedicated to modeling the dynamic evolution of people's opinions. Models in this field are based on a micro-dynamic rule, which determines how people update their opinion when interacting. Despite the high number of new models (many of them based on new rules), little research has been dedicated to experimentally validating the rule. A few studies started bridging this literature gap by experimentally testing the rule. However, in these studies, participants are forced to express their opinion as a number instead of using natural language. Furthermore, some of these studies average data from experimental questions without testing whether differences exist between them. Indeed, it is possible that different topics show different dynamics. For example, people may be more prone to accepting someone else's opinion regarding less polarized topics. In this work, we collected data from 200 participants on 5 unpolarized topics. Participants expressed their opinions using natural language ('agree' or 'disagree') and the certainty of their answer, expressed as a number between 1 and 10. To keep the interaction based on natural language, certainty was not shown to other participants. We then showed the participant someone else's opinion on the same topic and, after a distraction task, repeated the measurement. To produce data compatible with standard opinion dynamics models, we multiplied the opinion (encoded as agree=1 and disagree=-1) by the certainty to obtain a single 'continuous opinion' ranging from -10 to 10. By analyzing the topics independently, we observed that each one shows a different initial distribution. However, the dynamics (i.e., the properties of the opinion change) appear to be similar across all topics. This suggests that the same micro-dynamic rule could be applied to unpolarized topics. Another important result is that participants who change opinion tend to maintain similar levels of certainty. This is in contrast with typical micro-dynamic rules, where agents move to an average point instead of directly jumping to the opposite continuous opinion. As expected, we also observed the effect of social influence in the data. This means that exposure to 'agree' or 'disagree' influenced participants towards respectively higher or lower values of the continuous opinion. However, we also observed random variations whose effect was stronger than that of social influence. We even observed cases of people who changed from 'agree' to 'disagree', even though they were exposed to 'agree'. This phenomenon is surprising, as, in the standard literature, the strength of the noise is usually smaller than the strength of social influence. Finally, we also built an opinion dynamics model from the data. The model was able to explain more than 80% of the data variance. Furthermore, by iterating the model, we were able to produce polarized states even when starting from an unpolarized population. This experimental approach offers a way to test the micro-dynamic rule. This also allows us to build models which are directly grounded on experimental results.

Keywords: experimental validation, micro-dynamic rule, opinion dynamics, update rule

Procedia PDF Downloads 141