Search results for: teaching and learning
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8016

2586 Using Textual Pre-Processing and Text Mining to Create Semantic Links

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and, thus, generate new concepts and new semantic links. Even when working with the highly specific vocabulary of the oil domain, our approach achieved satisfactory results, suggesting that the proposal can be applied to other domains and languages with only minor adjustments.
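The abstract does not describe its implementation; purely as an illustration (Python with rdflib and scikit-learn is our assumption, as are the toy documents and the co-occurrence threshold), the sketch below shows one way local co-occurrence patterns in pre-processed text could be turned into SKOS skos:related links.

```python
# Hypothetical sketch: mine term co-occurrence from pre-processed texts and
# emit SKOS "related" links. Libraries, documents, and thresholds are
# assumptions, not the authors' implementation.
from itertools import combinations

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import SKOS
from sklearn.feature_extraction.text import CountVectorizer

documents = [
    "well logging and reservoir pressure analysis",
    "reservoir pressure decline during production",
    "well logging tools for exploration",
]

# Term-document matrix after basic textual pre-processing (lowercasing, stop words).
vectorizer = CountVectorizer(stop_words="english")
X = (vectorizer.fit_transform(documents) > 0).astype(int)
terms = vectorizer.get_feature_names_out()

# Co-occurrence counts between terms across documents.
cooc = (X.T @ X).toarray()

EP = Namespace("http://example.org/ep/")  # placeholder namespace
g = Graph()
g.bind("skos", SKOS)

for i, j in combinations(range(len(terms)), 2):
    if cooc[i, j] >= 2:  # local pattern threshold (assumed)
        a, b = EP[terms[i]], EP[terms[j]]
        g.add((a, SKOS.prefLabel, Literal(terms[i])))
        g.add((b, SKOS.prefLabel, Literal(terms[j])))
        g.add((a, SKOS.related, b))  # new semantic link

print(g.serialize(format="turtle"))
```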

Keywords: semantic links, data mining, linked data, SKOS

Procedia PDF Downloads 148
2585 Physics-Informed Neural Network for Predicting Strain Demand in Inelastic Pipes under Ground Movement with Geometric and Soil Resistance Nonlinearities

Authors: Pouya Taraghi, Yong Li, Nader Yoosef-Ghodsi, Muntaseer Kainat, Samer Adeeb

Abstract:

Buried pipelines play a crucial role in the transportation of energy products such as oil, gas, and various chemical fluids, ensuring their efficient and safe distribution. However, these pipelines are often susceptible to ground movements caused by geohazards such as landslides, fault movements, and lateral spreading. Such ground movements can lead to strain-induced failures, resulting in leaks or explosions that may cause fires, financial losses, environmental contamination, and even loss of human life. It is therefore essential to study how buried pipelines respond when traversing geohazard-prone areas in order to assess the potential impact of ground movement on pipeline design. This study introduces a Physics-Informed Neural Network (PINN) approach to predict the strain demand in inelastic pipes subjected to permanent ground displacement (PGD). The method uses a deep learning framework that requires no training data, which makes it feasible to adopt more realistic assumptions about the governing nonlinearities; it leverages the underlying physics, described by differential equations, to approximate the solution. The study analyzes various scenarios involving different geohazard types, PGD values, and crossing angles, comparing the predictions with results obtained from finite element methods. The findings demonstrate good agreement between the proposed method and the finite element method, highlighting its potential as a simulation-free, data-free, and meshless alternative. This study paves the way for further advancements, such as the simulation-free reliability assessment of pipes subjected to PGD, as part of ongoing research that leverages the proposed method.
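The abstract gives no implementation details; the sketch below is a minimal, generic PINN in PyTorch (an assumed toolkit) for a toy one-dimensional pipe-soil equation, intended only to illustrate how a physics residual replaces training data, not to reproduce the authors' inelastic formulation.

```python
# Minimal PINN sketch (assumed: PyTorch). Toy 1-D axial pipe-soil model
# EA*u''(x) - k*u(x) = 0 with u(0)=d0 (imposed ground displacement), u(L)=0.
# All parameters are illustrative, not the authors' values.
import torch

torch.manual_seed(0)
EA, k, L, d0 = 1.0e3, 50.0, 10.0, 0.05  # assumed stiffness, soil spring, length, PGD

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, L, 200).reshape(-1, 1).requires_grad_(True)
x_bc = torch.tensor([[0.0], [L]])
u_bc = torch.tensor([[d0], [0.0]])

for step in range(5000):
    opt.zero_grad()
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = EA * d2u - k * u            # physics residual (no training data needed)
    loss_pde = (residual ** 2).mean()
    loss_bc = ((net(x_bc) - u_bc) ** 2).mean()
    loss = loss_pde + 100.0 * loss_bc      # weighted boundary enforcement
    loss.backward()
    opt.step()

u_final = net(x)
strain = torch.autograd.grad(u_final, x, torch.ones_like(u_final))[0]  # du/dx, i.e. axial strain
print(strain.abs().max().item())
```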

Keywords: strain demand, inelastic pipe, permanent ground displacement, machine learning, physics-informed neural network

Procedia PDF Downloads 38
2584 The Reflection on Pre-Service Teacher Training Program in Science Education

Authors: Sumalee Tientongdee

Abstract:

The pre-service teacher training program at Suan Sunandha Rajabhat University, Bangkok, Thailand, has been offered to undergraduate students for more than 80 years; the university was established as the first teacher college in the country. The pre-service teacher program in science education is one of the newer training programs, preparing pre-service teachers to teach science at the secondary school level, and assessment of the program is strongly needed. Therefore, this study was conducted to gather opinions and recommendations from principals, in-service teachers, and mentoring teachers from the partnership schools of Bangkok. The 120 invited participants attended the annual meeting held in May 2017. Focus group discussions and questionnaires were used to collect data during the reflection session, and content analysis was used to analyze the qualitative data. The results showed that the pre-service teacher training program in science education should improve students' creative thinking skills, service mind, personality, and attitudes toward a science teaching career. In addition, future science teachers must be able to teach in English in order to have more opportunities to teach science in Southeast Asian countries.

Keywords: pre-service teacher training program, reflection, science education, Suan Sunandha Rajabhat University

Procedia PDF Downloads 188
2583 The Developing of Teaching Materials Online for Students in Thailand

Authors: Pitimanus Bunlue

Abstract:

The objectives of this study were to identify the unique characteristics of Salaya Old Market, Phutthamonthon, Nakhon Pathom, and to develop effective video media to promote homeland awareness among local people. The characteristic features of the community were summarized from historical data, community observation, and interviews with residents, and the acquired data were used to develop a video describing the community's prominent features. The quality of the video was then assessed by interviewing local people in the old market in terms of content accuracy, picture and narration quality, and sense of homeland awareness after watching. The result was a 6-minute video containing historical data and outstanding features of the community. Based on the interviews, the content accuracy was good, and the picture quality and narration were very good. Most people also developed a sense of homeland awareness after watching the video.

Keywords: audio-visual, creating homeland awareness, Phutthamonthon Nakhon Pathom, research and development

Procedia PDF Downloads 272
2582 Teaching in the Post Truth Era: A Narrative Analysis of Modern Anti-Scientific Discourses in the Classroom

Authors: Jason T. Hilton

Abstract:

The ‘post-truth era’ is marked by a shift toward a period in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief. By applying narrative analysis techniques to current public discourses in education that run counter to scientific findings, it becomes possible to identify weaknesses in modern pedagogy and suggest ways to counter false narratives in the classroom. Results of this study indicate that a failure to engage with popular narratives lessens teachers' ability to be convincing in the classroom, even when presenting information supported by scientific evidence. This study seeks to empower teachers by illustrating the influence of story within the post-truth era and the ways in which narrative and rhetorical elements take hold in social media contexts. Equipped with this knowledge, teachers can create a shift in pedagogy, away from transmission of knowledge toward the crafting of powerful narratives, built upon evidence, and connected to the lives of modern learners.

Keywords: 21st century learner, critical pedagogy, culture, narrative, post-truth era, social media

Procedia PDF Downloads 237
2581 Assessment Literacy Levels of Mathematics Teachers to Implement Classroom Assessment in Ghanaian High Schools

Authors: Peter Akayuure

Abstract:

One key determinant of the quality of mathematics learning is the teacher's ability to assess students adequately and effectively and to make assessment an integral part of instructional practice. If the mathematics teacher lacks the required literacy to perform classroom assessment roles, the true trajectory of learning success and attainment of curriculum expectations might be indeterminate. It is therefore important that educators and policymakers understand and seek ways to improve the literacy level of mathematics teachers to implement classroom assessments that meet curriculum demands. This study employed a descriptive survey design to explore the perceived levels of assessment literacy of mathematics teachers to implement classroom assessment within the school-based assessment framework in Ghana. A 25-item classroom assessment inventory on teachers' assessment scenarios was adopted, modified, and administered to a purposive sample of 48 mathematics teachers from eleven senior high schools. Seven further items were included to collect data on their self-efficacy towards assessment literacy. Data were analyzed using descriptive and bivariate correlation statistics. The results show that, on average, 48.6% of the mathematics teachers attained standard levels of assessment literacy. Specifically, 50.0% met standard one in choosing appropriate assessment methods, 68.3% reached standard two in developing appropriate assessment tasks, 36.6% reached standard three in administering, scoring, and interpreting assessment results, 58.3% reached standard four in making appropriate assessment decisions, 41.7% reached standard five in developing valid grading procedures, 45.8% reached standard six in communicating assessment results, and 36.2% reached standard seven in identifying unethical, illegal, and inappropriate use of assessment results. Participants rated their self-efficacy in performing assessments as high, yet the relationship between participants' assessment literacy scores and self-efficacy scores was weak and statistically insignificant. The study recommends that institutions training mathematics teachers or providing professional development should accentuate assessment literacy development to ensure standard assessment practices and quality instruction in mathematics education at senior high schools.
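As an illustration of the bivariate correlation analysis mentioned above (the library choice and the scores below are ours, not the study's data), the computation looks like this:

```python
# Hypothetical illustration of the bivariate correlation analysis described
# above; the scores here are invented, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
literacy = rng.uniform(30, 90, size=48)          # assessment literacy scores (%)
self_efficacy = rng.uniform(3.5, 5.0, size=48)   # self-efficacy ratings (1-5 scale)

r, p_value = pearsonr(literacy, self_efficacy)
print(f"r = {r:.2f}, p = {p_value:.3f}")         # weak r with p > 0.05 => not significant
```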

Keywords: assessment literacy, mathematics teacher, senior high schools, Ghana

Procedia PDF Downloads 107
2580 An Attempt to Improve Student's Understanding on Thermal Conductivity Using Thermal Cameras

Authors: Mariana Faria Brito Francisquini

Abstract:

Many thermal phenomena are present and play a substantial role in our daily lives. This presence makes the study of this area, at both high school and university levels, a widely explored topic in the literature. However, many concepts that are important for a meaningful understanding of the world are neglected in favor of a traditional approach built on rote algebraic problems. In this work, we intend to show how the introduction of new technologies in the classroom, namely thermal cameras, can work in our favor to build a clearer understanding of many of these concepts, such as thermal conductivity. The use of thermal cameras in the classroom tends to diminish the persistent abstractness of thermal phenomena, as they enable us to visualize something that happens right before our eyes yet cannot be seen. In our study, we provide the same amount of heat to metallic cylindrical rods of the same length but of different materials in order to study the thermal conductivity of each one. The thermal camera allows us to visualize the increase in temperature along each rod in real time, enabling us to infer how heat is being transferred from one part of the rod to another. Therefore, we intend to show how this approach can contribute to exposing students to more enriching, intellectually prolific scenarios than those provided by traditional approaches.
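For reference, the quantity the rod experiment is designed to reveal is governed by Fourier's law of heat conduction, a standard relation not spelled out in the abstract itself:

```latex
% Fourier's law for steady one-dimensional heat conduction along a rod of
% cross-sectional area A and thermal conductivity k:
\[
  \dot{Q} = -k A \frac{dT}{dx}
\]
% For the same heat input, a rod with a larger k sustains a smaller temperature
% gradient along its length, which is what the thermal camera makes visible.
```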

Keywords: teaching physics, thermal cameras, thermal conductivity, thermal physics

Procedia PDF Downloads 254
2579 An E-coaching Methodology for Higher Education in Saudi Arabia

Authors: Essam Almuhsin, Ben Soh, Alice Li, Azmat Ullah

Abstract:

It is widely accepted that university students must acquire new knowledge, skills, awareness, and understanding to increase opportunities for professional and personal growth. The study reveals a significant increase in users engaging in e-coaching activities and a growing need for it during the COVID-19 pandemic. The paper proposes an e-coaching methodology for higher education in Saudi Arabia to address the need for effective coaching in the current online learning environment.

Keywords: role of e-coaching, e-coaching in higher education, Saudi higher education environment, e-coaching methodology, the importance of e-coaching

Procedia PDF Downloads 82
2578 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery

Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong

Abstract:

The machine learning techniques based on convolutional neural networks (CNNs) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation, and object recognition. Classical visual information processing, ranging from low-level tasks to high-level ones, has been widely reworked within the deep learning framework. It is generally considered a challenging problem to derive visual interpretation from high-dimensional imagery data. A CNN is a class of feed-forward artificial neural network that usually consists of deep layers whose connections are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation-invariance characteristics. However, it is often computationally intractable to optimize the network, in particular one with a large number of convolution layers, because of the large number of unknowns that must be optimized with respect to a training set which, in turn, must generally be large enough for the model under consideration to generalize effectively. It is also necessary to limit the size of the convolution kernels because of the computational expense, despite recent developments in effective parallel processing machinery, which leads to the use of uniformly small convolution kernels throughout the deep CNN architecture. However, it is often desirable to consider different scales in the analysis of visual features at different layers of the network. Thus, we propose a CNN model in which convolution kernels of different sizes are applied at each layer based on random projection. We apply random filters of varying sizes and associate the filter responses with scalar weights that correspond to the standard deviation of the random filters. This allows us to use a large number of random filters at the cost of one scalar unknown per filter. The computational cost of the back-propagation procedure does not increase with larger filters, even though additional computation is required for the convolutions in the feed-forward pass. The use of random kernels of varying sizes allows image features to be analyzed effectively at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments that quantitatively compare well-known CNN architectures with our models, which simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with fewer unknown weights. The proposed algorithm has high potential for application to a variety of visual tasks within the CNN framework. Acknowledgement: This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by IITP, and by NRF-2014R1A2A1A11051941 and NRF2017R1A2B4006023.
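A minimal sketch of the core idea as we read it from the abstract (frozen random convolution kernels of several sizes, each contributing through a single learnable scalar weight) is given below in PyTorch; the layer widths and the way responses are concatenated are our assumptions, not the authors' exact architecture.

```python
# Sketch (assumed PyTorch): fixed random kernels of several sizes, each filter
# contributing through a single learnable scalar weight.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RandomKernelConv(nn.Module):
    def __init__(self, in_ch, filters_per_size=8, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.kernels = nn.ParameterList()
        self.scales = nn.ParameterList()
        for k in kernel_sizes:
            w = torch.randn(filters_per_size, in_ch, k, k) / (k * (in_ch ** 0.5))
            # Random filters are frozen: requires_grad=False.
            self.kernels.append(nn.Parameter(w, requires_grad=False))
            # One learnable scalar per random filter (the only unknowns here).
            self.scales.append(nn.Parameter(torch.ones(filters_per_size)))

    def forward(self, x):
        outs = []
        for w, s in zip(self.kernels, self.scales):
            k = w.shape[-1]
            y = F.conv2d(x, w, padding=k // 2)       # fixed random convolution
            outs.append(y * s.view(1, -1, 1, 1))     # scale each response map
        return torch.cat(outs, dim=1)

# Usage: drop-in replacement for a standard convolution layer.
layer = RandomKernelConv(in_ch=3)
feats = layer(torch.randn(2, 3, 32, 32))
print(feats.shape)  # torch.Size([2, 24, 32, 32])
```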

Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition

Procedia PDF Downloads 262
2577 Conducting Computational Physics Laboratory Course Using Cloud Storage Space

Authors: Ajay Wadhwa

Abstract:

A laboratory course on computational physics differs from conventional lab courses on other topics of physics, such as mechanics, heat, and optics, because it involves active participation of the teacher as well as one-to-one interaction between teacher and student. The course content requires the teacher to teach a programming language as well as numerical methods along with their applications in physics. The task becomes more daunting when about 90% of the students in the class have no previous experience of any programming language. In the presented work, we describe a methodology for conducting the computational physics course using the Google Drive and Dropitto.me cloud storage services. We evaluated the performance in a class of sixty students by dividing them equally into four groups. One of the groups was made the peer group on whom the presented methodology was tested, while the other groups were taught by the conventional method of classroom lectures. In order to assess our methodology, we analyzed the performance of the students in four class tests. A study of statistical parameters such as the mean, the standard deviation, and a Z-test of hypothesis revealed that the cyber methodology based on cloud storage is more efficient than the conventional method of teaching.
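As an illustration of the statistical comparison described (a two-sample Z-test on class test means, with invented scores rather than the author's data), the computation can be sketched as follows:

```python
# Hypothetical two-sample Z-test comparing mean test scores of the cloud-based
# group with a conventionally taught group; the score arrays are invented.
import numpy as np
from scipy.stats import norm

cloud_group = np.array([72, 78, 81, 69, 75, 80, 77, 74, 83, 79, 71, 76, 82, 70, 78])
lecture_group = np.array([65, 70, 68, 72, 66, 69, 71, 64, 67, 73, 68, 70, 66, 69, 71])

m1, m2 = cloud_group.mean(), lecture_group.mean()
se = np.sqrt(cloud_group.var(ddof=1) / len(cloud_group)
             + lecture_group.var(ddof=1) / len(lecture_group))
z = (m1 - m2) / se
p = 2 * norm.sf(abs(z))             # two-tailed p-value

print(f"z = {z:.2f}, p = {p:.4f}")  # small p rejects "no difference in means"
```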

Keywords: computational physics, Z-test hypothesis, cloud storage, Google Drive

Procedia PDF Downloads 279
2576 Achieving Maximum Performance through the Practice of Entrepreneurial Ethics: Evidence from SMEs in Nigeria

Authors: S. B. Tende, H. L. Abubakar

Abstract:

It is acknowledged that small and medium enterprises (SMEs) may encounter different ethical issues and pressures that could affect the way in which they strategize or make decisions concerning the outcome of their business. Therefore, this research aimed to assess entrepreneurial ethics in the business of SMEs in Nigeria. Secondary data were adopted as the source corpus for the analysis. The findings conclude that a sound entrepreneurial ethics system has a significant effect on the level of performance of SMEs in Nigeria. The Nigerian Government needs to provide both guiding and physical structures, as well as learning systems that could inculcate these entrepreneurial ethics.

Keywords: culture, entrepreneurial ethics, performance, SME

Procedia PDF Downloads 354
2575 Innovation Management: A Comparative Analysis among Organizations from United Arab Emirates, Saudi Arabia, Brazil and China

Authors: Asmaa Abazaid, Maram Al-Ostah, Nadeen Abu-Zahra, Ruba Bawab, Refaat Abdel-Razek

Abstract:

An innovation audit is defined as a tool that can be used to reflect on how innovation is managed in an organization. The aim of this study is to audit innovation in the second-ranked engineering firm in the world and in one of the small and medium enterprise (SME) companies operating in the United Arab Emirates (UAE). The results are then compared with those of international companies from Saudi Arabia, Brazil, and China. The Diamond model was used to audit innovation in the two UAE companies, to evaluate their innovation management, and to identify each company's strengths and weaknesses from an innovation perspective. The comparison between the two companies (Jacobs and Hyper General Contracting) revealed that Jacobs has support for innovation, its innovation processes are well managed, the company is committed to the development of its employees worldwide, and its innovation system is flexible. Jacobs performed best in all innovation management dimensions: strategy, process, organization, linkages, and learning, while Hyper General Contracting did not score as highly as Jacobs in any of the dimensions. Furthermore, the audit results of both companies were compared with international companies to examine how well the two construction companies in the UAE manage innovation relative to SABIC (a Saudi company), Poly Easy and Arnious (Brazilian companies), and Huagong Tools and Guizohou Yibai (Chinese companies). The results revealed that Jacobs performed best in the learning and organization dimensions, while Poly Easy and Jacobs were equal in the linkage dimension. Huagong Tools scored highest in the process dimension among all the compared companies, whereas the highest score in the strategy dimension went to Poly Easy. On the other hand, Hyper General Contracting scored lowest in all of the innovation management dimensions. It needs to improve its management of all the dimensions, with special attention given to strategy, process, and linkage, as they scored below 4 out of 7 compared with the other dimensions. Jacobs scored highest in three innovation management dimensions among the six companies; however, its strategy dimension is considered low, and special attention is needed there.

Keywords: Brazil, China, innovation audit, innovation evaluation, innovation management, Saudi Arabia, United Arab Emirates

Procedia PDF Downloads 263
2574 Game “EZZRA” as an Innovative Solution

Authors: Mane Varosyan, Diana Tumanyan, Agnesa Martirosyan

Abstract:

There are many catastrophic events that end with dire consequences, and to avoid them, people should be well armed with the necessary information about these situations. In recent years, serious games have increasingly gained popularity for training people for different types of emergencies. The major problem discussed here is the use of gamification in education. Moreover, it is essential to understand how, and what kinds of, gamified e-learning modules promote engagement. As the theme is emergencies, we also studied people's behavior in order to shape the final approach. Our proposed solution is an educational video game, "EZZRA".

Keywords: gamification, education, emergency, serious games, game design, virtual reality, digitalisation

Procedia PDF Downloads 53
2573 Smartphone-Based Human Activity Recognition by Machine Learning Methods

Authors: Yanting Cao, Kazumitsu Nawata

Abstract:

As smartphones are upgraded, their software and hardware become smarter, so smartphone-based human activity recognition can be made more refined, complex, and detailed. In this context, we analyzed a set of experimental data obtained by observing and measuring 30 volunteers performing six activities of daily living (ADL). Due to the large sample size, and especially the 561-feature vector of time- and frequency-domain variables, cleaning these intractable features and training a proper model becomes extremely challenging. After a series of feature selection and parameter adjustment steps, a well-performing SVM classifier was trained.
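A minimal sketch of such a pipeline, feature selection followed by an SVM with a small parameter grid search, is shown below; scikit-learn, the parameter values, and the placeholder feature matrix are our assumptions rather than details given in the abstract.

```python
# Sketch (assumed scikit-learn): feature selection + SVM with parameter tuning,
# on a placeholder dataset standing in for the 561-feature HAR data.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 561))          # placeholder for the 561-dim feature vectors
y = rng.integers(0, 6, size=600)         # six activity-of-daily-living labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=100)),   # keep the 100 most informative features
    ("svm", SVC()),
])
grid = GridSearchCV(pipe, {"svm__C": [1, 10], "svm__gamma": ["scale", 0.01]}, cv=3)
grid.fit(X_tr, y_tr)
print(grid.best_params_, grid.score(X_te, y_te))
```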

Keywords: smart sensors, human activity recognition, artificial intelligence, SVM

Procedia PDF Downloads 122
2572 Making the Right Call for Falls: Evaluating the Efficacy of a Multi-Faceted Trust Wide Approach to Improving Patient Safety Post Falls

Authors: Jawaad Saleem, Hannah Wright, Peter Sommerville, Adrian Hopper

Abstract:

Introduction: Inpatient falls are the most commonly reported patient safety incidents and carry a significant burden in terms of resources, morbidity, and mortality. Ensuring adequate post-fall management of patients by staff is therefore paramount to maintaining patient safety, especially in out-of-hours and resource-stretched settings. Aims: This quality improvement project aims to improve the current practice of falls management at Guy's and St Thomas' Hospital, London, as compared with our 2016 quality improvement project findings. Furthermore, it looks to increase junior doctors' confidence in managing falls and their use of the new guidance protocols. Methods: The multifaceted interventions implemented included the development of new trust-wide guidelines, available on the intranet, detailing management pathways for patients after falls, and the production of 2000 lanyard cards summarising these guidelines, distributed amongst junior doctors and staff. Additionally, a 'safety signal' email was sent from the Trust chief medical officer to all staff raising awareness of falls and the guidelines, and formal falls teaching was implemented for new doctors at induction. Using an established incident database, 189 consecutive falls in 2017 were retrospectively analysed electronically and compared with the variables measured in 2016, post intervention. A separate serious incident database was used to analyse 50 falls from May 2015 to March 2018 to ascertain the statistical significance of the impact of our interventions on serious incidents. A questionnaire similar to the 2016 one was administered to the 2017 cohort of foundation year one (FY1) doctors and the results compared. Results: Questionnaire data demonstrated improved awareness and use of the guidelines, increased confidence, and an increase in training. 97% of FY1 trainees felt that the interventions had increased their awareness of the impact of falls on patients in the trust. Data from the incident database showed that the time to review patients post fall had decreased from an average of 130 to 86 minutes. Improvement was also demonstrated in the reduced time to order and schedule X-ray and CT imaging, 3 and 5 hours respectively. Data from the serious incident database show that 'the time from fall until harm was detected' was statistically significantly lower (P = 0.044) post intervention. We also showed that the incidence of significant delays in detecting harm (>10 hours) was reduced post intervention. Conclusions: Our interventions have helped to significantly reduce the average time to assess patients and to order and schedule appropriate imaging after falls. Delays of over ten hours in detecting serious injuries after falls were commonplace; since the intervention, their frequency has markedly reduced. We suggest this will lead to patient harm being identified sooner, fewer clinical incidents relating to falls, and thus improved overall patient safety. Our interventions have also helped increase clinical staff confidence, management, and awareness of falls in the trust. Next steps include expanding teaching sessions and improving multidisciplinary team involvement to sustain this improvement.

Keywords: patient safety, quality improvement, serious incidents, falls, clinical care

Procedia PDF Downloads 105
2571 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models

Authors: V. Mantey, N. Findlay, I. Maddox

Abstract:

The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to examine manually. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery, with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in densely populated regions. The primary challenge with detecting small buildings in densely populated regions is both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials will be difficult to separate due to a similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that models trained until they overfit the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires less input data. The model developed for this study has also been fine-tuned using existing, open-source building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span. Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher-quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality-check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
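The keywords name Mask R-CNN; the sketch below (PyTorch/torchvision assumed, with a synthetic one-image sample and illustrative hyper-parameters) shows the general shape of deliberately fine-tuning a pretrained detector on a small regional sample with no early stopping, which is the overfitting strategy the abstract describes.

```python
# Sketch (assumed torchvision Mask R-CNN): fine-tune on a tiny, region-specific
# building sample and deliberately keep training with no early stopping.
# The single synthetic image/target below stands in for a real regional dataset.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 2  # background + building

model = maskrcnn_resnet50_fpn(weights="DEFAULT")
in_feats = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, NUM_CLASSES)
mask_feats = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(mask_feats, 256, NUM_CLASSES)

# Placeholder sample; in practice this would come from open-source building vectors.
image = torch.rand(3, 256, 256)
mask = torch.zeros(1, 256, 256, dtype=torch.uint8)
mask[0, 40:110, 30:90] = 1
target = {"boxes": torch.tensor([[30.0, 40.0, 90.0, 110.0]]),
          "labels": torch.tensor([1]),
          "masks": mask}

optimizer = torch.optim.SGD(model.parameters(), lr=5e-3, momentum=0.9)
model.train()
for step in range(200):                   # intentionally long: no early stopping
    losses = model([image], [target])     # dict of detection/mask losses
    loss = sum(losses.values())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```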

Keywords: building detection, disaster relief, mask-RCNN, satellite mapping

Procedia PDF Downloads 151
2570 Economics in Primary Schools – Positive Education and Well-being

Authors: Judit Nagy

Abstract:

Many scientific studies claim that financial education should start as early as possible. Children are much more capable of and willing to absorb new concepts than adults. If we introduce children to financial knowledge early, their behaviour and attitudes toward the subject will change, increasing later success in this area of life. Poor financial decisions, however, may entail severe consequences, not only for individuals but also for wider society, whereas good financial decisions and economic attitudes may contribute to economic growth and well-being. Whilst education about financial awareness and fundamentals is available in several countries, the understanding and acquisition of complex economic knowledge and the development of children's independent problem-solving skills are still lacking. The results suggest that teaching economic and financial knowledge through accounting, and making lectures interactive using the special tools of positive education, is critical to stimulating children's interest. Eighty percent of the students in the study liked the combined, interactive lecture. Introducing this kind of knowledge to individuals is a relevant objective, even at the societal level.

Keywords: positive psychology, education innovation, primary school, gender, economics, accounting, finance, personal finance, mathematics, economic growth, well-being, sustainability

Procedia PDF Downloads 72
2569 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal

Abstract:

Datasets or collections are becoming important assets in themselves, and they can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, gathered from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of these data is a notoriously tedious, time-consuming process; in addition, it requires experts in the area, who are mostly not available. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors, and Back-Propagation Multi-Label Learning). The techniques were compared to each other using well-known measures including accuracy, Hamming loss, micro-F, and macro-F. The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods tested, while the Classifier Chains method showed the worst performance. To recap, the benchmark achieved promising results by utilizing the preliminary exploratory data analysis performed on the collection, proposing new trends for research, and providing a baseline for future studies.
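As an illustration only (scikit-learn assumed, with a synthetic stand-in for the objectives-to-outcomes collection), the sketch below runs two of the listed techniques, Binary Relevance (as one-vs-rest) and Classifier Chains, and scores them with Hamming loss and micro/macro F1.

```python
# Sketch (assumed scikit-learn): Binary Relevance vs. Classifier Chains on a
# synthetic multi-label problem standing in for objectives mapped to outcomes.
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.multioutput import ClassifierChain

X, Y = make_multilabel_classification(n_samples=500, n_features=50,
                                       n_classes=7, random_state=0)  # e.g. 7 outcome labels
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)

models = {
    "binary_relevance": OneVsRestClassifier(LogisticRegression(max_iter=1000)),
    "classifier_chain": ClassifierChain(LogisticRegression(max_iter=1000), random_state=0),
}
for name, model in models.items():
    Y_pred = model.fit(X_tr, Y_tr).predict(X_te)
    print(name,
          "hamming:", round(hamming_loss(Y_te, Y_pred), 3),
          "micro-F1:", round(f1_score(Y_te, Y_pred, average="micro"), 3),
          "macro-F1:", round(f1_score(Y_te, Y_pred, average="macro"), 3))
```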

Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-class classification, text mining

Procedia PDF Downloads 147
2568 Acrochordons and Diabetes Mellitus: A Case Control Study

Authors: Pratistha Shrestha

Abstract:

Background: Acrochordons (skin tags) are common benign skin tumors usually occurring on the neck and major flexures of older people. They range in size from 1 mm to 1 cm in diameter and are skin-colored or brownish. A possible association with diabetes mellitus has been suggested in previous studies, but the results are not conclusive. Objective: The aim of this study was to determine the association of diabetes mellitus with acrochordons. Material and Methods: One hundred and two patients were selected for the study. Among them, 51 patients with acrochordons (23 males and 28 females) were taken as cases, and 51 patients with other dermatologic diseases, matched for age and sex, were taken as controls. The patients were selected from the OPD of the Department of Dermatology and Venereology at Universal College of Medical Sciences–Teaching Hospital (UCMS-TH). Blood glucose levels, including both fasting plasma glucose and 2-hour post-glucose load, were determined for both cases and controls and compared. Results: Patients with acrochordons had a significantly higher frequency of diabetes than the control group (p < 0.001). A total of 48.5% and 40% of patients with acrochordons and diabetes were obese and overweight, respectively. Conclusion: There is an increased risk of diabetes mellitus in patients with acrochordons. Given the importance of early diagnosis of diabetes, a high level of suspicion for diabetes mellitus is recommended in patients with acrochordons.

Keywords: acrochordons, diabetes mellitus, obesity, skin tags

Procedia PDF Downloads 129
2567 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields, and these artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management, and it enables historical disaster information to be gathered using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from occurrence through progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc., is mainly managed in structured and unstructured report form and exists as handouts or hard copies. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data as disaster information. In this paper, an Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images, and reports, whether printed or produced by scanners, into electronic documents. Following that, the converted disaster data are organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information from unstructured data based on OCR is an important element of smart disaster management. In this paper, a character recognition rate of over 90% was achieved for Korean characters by using an upgraded OCR engine; the recognition rate depends on the fonts, sizes, and special symbols of the characters, and we improved it through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system, which allows the structured information to be stored and retrieved across the entire disaster cycle, covering historical disaster progress, damages, response, and recovery. The expected outcome of this research is that it can be applied to smart disaster management and decision making by combining artificial intelligence technologies with historical big data.
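The paper does not name its OCR engine; purely as an illustration, the snippet below uses the open-source Tesseract engine via pytesseract (our assumption) to extract Korean text from a scanned report image and tag it with a made-up disaster code mapping.

```python
# Hypothetical OCR step (pytesseract/Tesseract assumed): turn a scanned disaster
# report into text, then tag it with a placeholder disaster code.
from PIL import Image
import pytesseract

# Requires the Tesseract binary and its Korean language pack to be installed.
image = Image.open("scanned_disaster_report.png")   # placeholder file name
text = pytesseract.image_to_string(image, lang="kor")

# Minimal, made-up keyword-to-code mapping standing in for the disaster code system.
DISASTER_CODES = {"홍수": "NAT-FLD", "지진": "NAT-EQK", "화재": "SOC-FIR"}
codes = sorted({code for kw, code in DISASTER_CODES.items() if kw in text})

record = {"source": "scanned_disaster_report.png", "codes": codes, "body": text}
print(record["codes"])
```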

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 101
2566 Quality Assurance in Higher Education: Doha Institute for Graduate Studies as a Case Study

Authors: Ahmed Makhoukh

Abstract:

Quality assurance (QA) has recently become a common practice, endorsed by most Higher Education (HE) institutions worldwide due to the pressure of internal and external forces. One of the aims of this quality movement is to make the contribution of university education to socio-economic development highly significant. This entails that graduates are currently required to have a high-quality profile, i.e., to be competent and to master the 21st-century skills needed in the labor market. This wave of change, mostly imposed by globalization, means that university education should be learner-centered in order to satisfy the different needs of students and meet the expectations of other stakeholders. Such a shift of focus toward student learning outcomes has led HE institutions to reconsider their strategic planning, their mission, the curriculum, and the pedagogical competence of the academic staff, among other elements. To ensure that overall institutional performance is on the right track, a QA system should be established to check regularly the extent to which the set evaluation standards are strictly respected. This QA operation has the advantage of demonstrating the accountability of the institution, gaining the trust of the public through transparency, and enjoying international recognition. This is the case of Doha Institute (DI) for Graduate Studies, in Qatar, the object of the present study. The significance of this contribution is to show that the conception of quality has changed in this digital age, and that a department responsible for QA needs to be integrated in every HE institution to ensure educational quality, enhance learners, and achieve academic leadership. Thus, to address the issue of QA in DI for Graduate Studies, an elite university (in the academic sense) that focuses on a small and selected number of students, a qualitative method will be adopted in the description and analysis of the data (document analysis). In an attempt to investigate the extent to which QA is achieved in Doha Institute for Graduate Studies, three broad indicators will be evaluated (input, process, and learning outcomes). This investigation will be carried out in line with the UK Quality Code for Higher Education, represented by the Quality Assurance Agency (QAA).

Keywords: accreditation, higher education, quality, quality assurance, standards

Procedia PDF Downloads 127
2565 Comparison between Approaches Used in Two Walk About Projects

Authors: Derek O Reilly, Piotr Milczarski, Shane Dowdall, Artur Hłobaż, Krzysztof Podlaski, Hiram Bollaert

Abstract:

Learning through the creation of contextual games is a very promising approach for interdisciplinary and international group projects. During 2013 and 2014, we took part in and organized two intensive student projects under different conditions. The projects enrolled 68 students and 12 mentors from 5 countries. In this paper, we want to share our experience of how to strengthen the chances of success in short (12-15 day) student projects. In our case, almost all teams prepared a working prototype, and the results were highly appreciated by external experts.

Keywords: contextual games, mobile games, GGULIVRR, walkabout, Erasmus intensive programme

Procedia PDF Downloads 476
2564 Teacher Culture Inquiry of Classroom Observation at an Elementary School in Taiwan

Authors: Tsai-Hsiu Lin

Abstract:

Three dimensions of teacher culture hinder educational improvement: individualism, conservatism and presentism. To promote the professional development of teachers, these three aspects in teacher culture should be eliminated. Classroom observation may be a useful method of eliminating individualism. The Ministry of Education in Taiwan has attempted to reduce the isolation of teachers to promote their professional growth. Because classroom observation discourse varies, teachers are generally unwilling to allow their teaching to be observed. However, classroom observations take place in the country in the form of school evaluations. The main purpose of this study was to explore the differences in teachers’ conservatism, individualism and presentism after classroom observations had been conducted at an elementary school in Taiwan. The research method was a qualitative case study involving interviews with the school principal, the director of academic affairs, and two classroom teachers. The following conclusions were drawn: (1) Educators in different positions viewed classroom observations differently; (2) The classroom teachers did not highly value classroom observation; (3) There was little change in the teachers’ conservatism, individualism and presentism after classroom observation.

Keywords: classroom observation, Lortie’s Trinity, teacher culture, teacher professional development

Procedia PDF Downloads 283
2563 Towards Learning Query Expansion

Authors: Ahlem Bouziri, Chiraz Latiri, Eric Gaussier

Abstract:

The steady growth in the size of textual document collections is a key progress driver for modern information retrieval techniques, whose effectiveness and efficiency are constantly challenged. Given a user query, the number of retrieved documents can be overwhelmingly large, hampering their efficient exploitation by the user. In addition, retaining only relevant documents in a query answer is of paramount importance for effectively meeting the user's needs. In this situation, the query expansion technique offers an interesting solution for obtaining a complete answer while preserving the quality of retained documents. This mainly relies on an accurate choice of the terms added to the initial query. Interestingly enough, query expansion takes advantage of large text volumes by extracting statistical information about index term co-occurrences and using it to make user queries better fit the real information needs. In this respect, a promising track consists in the application of data mining methods to extract dependencies between terms, namely a generic basis of association rules between terms. The key feature of our approach is a better trade-off between the size of the mining result and the conveyed knowledge. Thus, faced with the huge number of derived association rules, and in order to select the optimal combination of query terms from the generic basis, we propose to model the problem as a classification problem and solve it using a supervised learning algorithm such as SVM or k-means. For this purpose, we first generate a training set using a genetic-algorithm-based approach that explores the association rule space in order to find an optimal set of expansion terms, improving the MAP of the search results. The experiments were performed on the SDA 95 collection, a data collection for information retrieval. It was found that the results were better in terms of both MAP and NDCG. The main observation is that the intelligent hybridization of text mining techniques and query expansion allows us to incorporate the good features of both. As this is a preliminary attempt in this direction, there is large scope for enhancing the proposed method.
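A small sketch of the term association rule mining step is given below; mlxtend, the toy document set, and the thresholds are our assumptions rather than the paper's setup, and selecting among the resulting rules is where the authors' supervised learning step would come in.

```python
# Sketch (assumed mlxtend): mine association rules between index terms from a toy
# term-document matrix, then list candidate expansion terms for a query term.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "machine learning for information retrieval",
    "query expansion improves information retrieval",
    "association rules for query expansion",
    "machine learning and association rules",
]
vec = CountVectorizer(stop_words="english").fit(docs)
dtm = (vec.transform(docs) > 0).toarray().astype(bool)
df = pd.DataFrame(dtm, columns=vec.get_feature_names_out())

itemsets = apriori(df, min_support=0.5, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.6)

# Candidate expansion terms for the query term "retrieval".
candidates = rules[rules["antecedents"].apply(lambda a: "retrieval" in a)]
print(candidates[["consequents", "confidence"]])
```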

Keywords: supervised learning, classification, query expansion, association rules

Procedia PDF Downloads 302
2562 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best-performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
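The abstract does not include code; the sketch below (scikit-learn assumed, with toy sequences and labels in place of the MTB genomes) shows how a k-mer representation can be built with a character n-gram vectorizer and fed to a classifier.

```python
# Sketch (assumed scikit-learn): k-mer counts via character n-grams, then a
# simple classifier. Sequences and phenotype labels are toy stand-ins for the
# 104 MTB whole genome sequences.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

K = 5  # k-mer size; the study sweeps this up to 10

sequences = [
    "ATGCGTACGTTAGCATCGATCGTACGATCG",
    "ATGCGTACGTTAGCATCGATCGTACGTTAA",
    "TTGACCGGTAACGTTAGGCCATCGGATCCA",
    "TTGACCGGTAACGTTAGGCCATCGGTTCCA",
]
phenotypes = ["resistant", "resistant", "susceptible", "susceptible"]

model = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(K, K), lowercase=False),
    LogisticRegression(max_iter=1000),
)
model.fit(sequences, phenotypes)
print(model.predict(["ATGCGTACGTTAGCATCGATCGTACGATCA"]))  # toy query genome
```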

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 139
2561 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment, and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from whole genome sequence data of a given bacterial isolate, and (iv) demonstrate computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best-performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists amongst accuracy, computing resources, and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 127
2560 Framework for Explicit Social Justice Nursing Education and Practice: A Constructivist Grounded Theory Research

Authors: Victor Abu

Abstract:

Background: Social justice ideals are considered the foundation of nursing practice. These ideals are not always clearly integrated into nursing professional standards or curricula. This hinders concerted global nursing agendas for becoming aware of social injustice or engaging in action for social justice to improve the health of individuals and groups. Aim and objectives: The aim was to create an educational framework for empowering nursing students for social justice awareness and action. This purpose was attained by understanding the meaning of social justice, the effect of social injustice, the visibility of social justice learning, and ways of integrating social justice in nursing education and practice. Methods: Critical interpretive methodologies and constructivist grounded theory research designs guided the processes of recruiting nursing students (n = 11) and nurse educators (n = 11) at a London nursing university to participate in interviews and focus groups, which were analysed by coding systems. Findings: Firstly, social justice was described as ethical practices that enable individuals and groups to have good access to health resources. Secondly, social injustice was understood as unfair practices that caused minimal access to resources, social deprivation, and poor health. Thirdly, social justice learning was considered to be invisible in nursing education due to a lack of explicit modules, educator knowledge, and organisational support. Lastly, explicit modules, educating educators, and attracting leaders' support were suggested as approaches for the visible integration of social justice in nursing education and practice. Discussion: This research proposes approaches for nursing awareness and action through the development of critical, active nurse-learners, critically conscious nurse-educators, and servant nurse leaders. The framework on Awareness for Social Justice Action (ASJA) created in this research is an approach for empowering nursing students for social justice practices. Conclusion: This research contributes to and advocates for greater nursing scholarship to raise the spotlight on social justice in the profession.

Keywords: social justice, nursing practice, nursing education, nursing curriculum, social justice awareness, social justice action, constructivist grounded theory

Procedia PDF Downloads 22
2559 CIPP Evaluation of Online Broadcasting of Suan Dusit Rajabhat University

Authors: Somkiat Korbuakaew, Winai Mankhatitham, Anchan Chongcharoen, Wichar Kunkum

Abstract:

This research’s objective is to evaluate the online broadcasting of Suan Dusit Rajabhat University using the CIPP model. The evaluation was separated into 4 parts: the context factor, input factor, process factor, and product factor. The sample group in this research comprised 399 participants, who were university executives, staff, and students. Questionnaires and interviews were the research tools. Data were analyzed by computer program. The statistics used were percentage, mean, and standard deviation. Findings are as follows: 1. Context factor: the context in this research was the university's executives, staff, and students, who would like online broadcasting to be used as an educational tool and for IT development. 2. Input factor: the input was modern IT equipment for creating interesting teaching materials and developing education in general. 3. Process factor: the process concerned the publicity of the program, which should be promoted more among students and should be more objective. 4. Product factor: the product was that the program expands the educational channels available to students.

Keywords: evaluation, project, internet, online broadcasting

Procedia PDF Downloads 496
2558 Teaching Pragmatic Coherence in Literary Text: Analysis of Chimamanda Adichie’s Americanah

Authors: Joy Aworo-Okoroh

Abstract:

Literary texts are mirrors of real-life situations. Thus, authors choose the linguistic items that best encode their intended meanings and messages. However, words mean more than they seem: the meaning of words is not static; rather, it is dynamic, as words constantly enter into relationships within a context. Literary texts can only be meaningful if all pragmatic cues are identified and interpreted. Drawing upon Teun van Dijk's theory of local pragmatic coherence, it is established that words enter into relations in a text and that these relations account for sequential speech acts in the text. Comprehension of the text is dependent on the interpretation of these relations. To show the relevance of pragmatic coherence in literary text analysis, ten conversations were selected from Americanah in order to give a clear idea of the pragmatic relations used. The conversations were analysed to identify the speech act and epistemic relations inherent in them, and a careful analysis of the structure of the conversations was also carried out. It was discovered that justification is the most commonly used relation and that the meaning of the text depends on the interpretation of these instances of pragmatic coherence. The study concludes that, to teach literature in English effectively, pragmatic coherence should be incorporated, since words mean more than they say.

Keywords: pragmatic coherence, epistemic coherence, speech act, Americanah

Procedia PDF Downloads 112
2557 Extended Knowledge Exchange with Industrial Partners: A Case Study

Authors: C. Fortin, D. Tokmeninova, O. Ushakova

Abstract:

Among 500 Russian universities, Skolkovo Institute of Science and Technology (Skoltech) is one of the youngest (established in 2011), quite small, and highly international, with 20 percent international students and 70 percent of faculty having significant academic experience at top-100 universities (QS, THE). The institute emerged from close collaboration with MIT and leading Russian universities, and Skoltech is an entirely English-speaking environment. The curriculum plans of Skoltech's ten Master's programs are based on the CDIO learning outcomes model. However, despite the Institute's unique focus on industrial innovations and startups, one of the main challenges is that nearly half of MSc graduates enter PhD programs at Skoltech or other universities rather than industry or entrepreneurship. In order to increase the share of students joining the industrial sector after graduation, Skoltech started implementing a number of unique practices, with employers' expectations incorporated into the curriculum redesign. In this sense, extended knowledge exchange with industrial partners via collaboration in learning activities, industrial projects, and assessments became essential for students' headway into industrial and entrepreneurship pathways. The current academic curriculum includes the following components based on extended knowledge exchange with industrial partners: the innovation workshop, industrial immersion, special industrial tracks, and MSc defenses. The innovation workshop is a 4-week, full-time dive into Skoltech's vibrant ecosystem; it is designed to foster innovators, focuses on teamwork and group projects, and sparks entrepreneurial instincts from the very first days of study. From 2019, the number of mentors from industry and startups was significantly increased to guide students through these sectors' demands. Industrial immersion is a distinctive part of the Skoltech curriculum in which students, after the first year of study, spend 8 weeks in an industrial company carrying out an individual or team project, guided jointly by Skoltech and company supervisors. The aim of the industrial immersion is to familiarize students with the relevant needs of Russian industry and to prepare graduates for job placement; during the immersion, the company plays the role of a challenge provider for students. Skoltech has also started a special industrial track comprising deep collaboration with IPG Photonics, a leading R&D company and manufacturer of high-performance fiber lasers and amplifiers for diverse applications. The track aims to train a new cohort of engineers and includes a variety of activities for students within the “Photonics” MSc program. It is expected to be a success story and to serve as an example for similar initiatives with other Russian high-tech companies. One of the pathways of extended knowledge exchange with industrial partners is the active involvement of potential employers in MSc defense committees to review and assess MSc thesis projects and to participate in defense procedures. The paper evaluates the effects and results of the measures described above.

Keywords: curriculum redesign, knowledge exchange model, learning outcomes framework, stakeholder engagement

Procedia PDF Downloads 54