Search results for: deep learning network
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12008

7418 Study of ANFIS and ARIMA Model for Weather Forecasting

Authors: Bandreddy Anand Babu, Srinivasa Rao Mandadi, C. Pradeep Reddy, N. Ramesh Babu

Abstract:

This paper briefly presents a comparative study of the Auto-Regressive Integrated Moving Average (ARIMA) and Adaptive Network-Based Fuzzy Inference System (ANFIS) models for weather forecasting. The weather data, obtained from the University of Waterloo, comprise relative humidity, ambient air temperature, barometric pressure, and wind direction. The performance of the ARIMA and ANFIS models is assessed by comparing the sum of average errors. ANFIS modelling is carried out with the Fuzzy Logic Toolbox in MATLAB, while ARIMA modelling is performed using XLSTAT.
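To make the ARIMA side of the comparison concrete, the sketch below fits an ARIMA model to an assumed hourly air-temperature series with statsmodels and produces a short forecast. The (2, 1, 2) order, file name, and column names are illustrative assumptions rather than the authors' settings, and the ANFIS/XLSTAT side is not reproduced.

```python
# Minimal ARIMA forecasting sketch (assumed data and model order, not the
# authors' exact configuration).
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# 'temperature' is assumed to be a pandas Series of hourly ambient air
# temperature readings indexed by timestamp (hypothetical file and columns).
temperature = pd.read_csv("waterloo_weather.csv",
                          index_col="timestamp",
                          parse_dates=True)["air_temperature"]

# Fit an ARIMA(p, d, q) model; the order here is illustrative only.
model = ARIMA(temperature, order=(2, 1, 2))
fitted = model.fit()

# Forecast the next 24 hours and report the in-sample mean absolute error,
# the kind of average-error comparison described in the abstract.
forecast = fitted.forecast(steps=24)
mae = fitted.resid.abs().mean()
print(forecast.head())
print(f"In-sample MAE: {mae:.3f}")
```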

Keywords: ARIMA, ANFIS, fuzzy inference toolbox, weather forecasting, MATLAB

Procedia PDF Downloads 414
7417 Project-Based Learning (PBL) Taken to Extremes: Full-Year/Full-Time PBL Replacement of Core Curriculum

Authors: Stephen Grant Atkins

Abstract:

Radical use of project-based learning (PBL) in a small New Zealand business school provides an opportunity to examine its effects longitudinally over a decade of pre-Covid data. Prior to this business school's implementation of PBL, starting in 2012, the business pedagogy literature presented just one example of PBL replacing an entire core set of courses. In that instance, a British business school merged four of its 'degree Year 3' accounting courses into one PBL semester. As radical as that would have seemed to students aged 20-to-22, the PBL experiment conducted in this New Zealand business school was notably more extreme: 41 nationally approved Learning Outcomes (L.O.s), deriving from 8 separate core courses, were aggregated into one grand set of L.O.s and then treated as a single full-year, full-time course. The 8 courses in question were all components of this business school's compulsory 'degree Year 1' curriculum. Thus, the students involved were notably younger (ages 17-to-19), and no part-time enrolments were allowed. Of interest are this PBL experiment's effects on subsequent performance outcomes in 'degree Years 2 & 3', which continued to operate in their traditional ways. Of special interest is the quality of 'group project' outcomes, because traditionally 'degree Year 1' course assessments are only minimally based on group work. This PBL experiment altered that practice radically, such that PBL 'degree Year 1' alumni entered their remaining two years of business coursework with far more 'project group' experience. Of interest here, therefore, are 'degree Year 2' performance outcomes for 2010-2012 and 2016-2018, and likewise 'degree Year 3' outcomes for 2011-2013 and 2017-2019. Those years provide a pre-and-post comparative baseline for performance outcomes in students never exposed to this school's radical PBL experiment. That baseline is then compared to PBL alumni outcomes (2013-2016, including 'Student Evaluation of Course Quality' outcomes) to clarify the effects of 'radical PBL'.

Keywords: project-based learning, longitudinal mixed-methods, student criticism, effects-on-learning

Procedia PDF Downloads 90
7416 Development and Validation of First Derivative Method and Artificial Neural Network for Simultaneous Spectrophotometric Determination of Two Closely Related Antioxidant Nutraceuticals in Their Binary Mixture

Authors: Mohamed Korany, Azza Gazy, Essam Khamis, Marwa Adel, Miranda Fawzy

Abstract:

Background: Two new, simple, and specific methods were developed and validated in accordance with ICH guidelines: first, a zero-crossing first-derivative technique, and second, a chemometric-assisted spectrophotometric artificial neural network (ANN). Both methods were used for the simultaneous estimation of two closely related antioxidant nutraceuticals, Coenzyme Q10 (Q), also known as ubidecarenone or ubiquinone-10, and Vitamin E (E), alpha-tocopherol acetate, in their pharmaceutical binary mixture. Results: In the first method, applying the first derivative allowed Q and E to be determined alternately, each at the zero-crossing of the other. The D1 amplitudes of Q and E, at 285 nm and 235 nm respectively, were recorded and correlated to their concentrations. The calibration curves are linear over the concentration ranges of 10-60 and 5.6-70 μg mL-1 for Q and E, respectively. In the second method, an ANN (as a multivariate calibration method) was developed and applied for the simultaneous determination of both analytes. A training (concentration) set of 90 different synthetic mixtures containing Q and E, in wide concentration ranges of 0-100 µg/mL and 0-556 µg/mL respectively, was prepared in ethanol. The absorption spectra of the training set were recorded in the spectral region of 230-300 nm. A gradient-descent backpropagation ANN chemometric calibration was computed by relating the concentration set (x-block) to the corresponding absorption data (y-block). Another set of 45 synthetic mixtures of the two drugs, in a defined range, was used to validate the proposed network. Neither chemical separation, a preparation stage, nor mathematical graphical treatment was required. Conclusions: The proposed methods were successfully applied to the assay of Q and E in laboratory-prepared mixtures and a combined pharmaceutical tablet, with excellent recoveries. The ANN method was superior to the derivative technique, as the former determined both drugs under non-linear experimental conditions. It also offers rapidity, high accuracy, and savings in effort and cost, and requires no specialist analyst for its application. Although the ANN technique needed a large training set, it is the method of choice for the routine analysis of Q and E tablets. No interference was observed from common pharmaceutical additives. The results of the two methods were compared.
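The zero-crossing first-derivative step can be illustrated with a short NumPy sketch: compute the D1 spectrum, read the amplitudes at 285 nm and 235 nm, and apply a linear calibration. The wavelength grid, file names, and standard concentrations below are assumptions for illustration only.

```python
# Sketch of the zero-crossing first-derivative measurement (illustrative data
# handling only; the recorded spectra and exact smoothing are not given in
# the abstract).
import numpy as np

# Assumed: wavelengths in nm and absorbance values for one mixture,
# recorded over the 230-300 nm region.
wavelengths = np.arange(230.0, 300.5, 0.5)
absorbance = np.loadtxt("mixture_spectrum.txt")      # hypothetical file

# First-derivative (D1) spectrum.
d1 = np.gradient(absorbance, wavelengths)

# Read the D1 amplitudes at the zero-crossing wavelengths of the other
# component: 285 nm for coenzyme Q10 and 235 nm for vitamin E.
amp_Q = d1[np.argmin(np.abs(wavelengths - 285.0))]
amp_E = d1[np.argmin(np.abs(wavelengths - 235.0))]

# Linear calibration from standards (assumed concentration levels in ug/mL
# versus their D1 amplitudes, read from a hypothetical file).
conc_std = np.array([10, 20, 30, 40, 50, 60], dtype=float)
amp_std = np.loadtxt("q10_standards_d1.txt")
slope, intercept = np.polyfit(conc_std, amp_std, 1)
conc_Q = (amp_Q - intercept) / slope
print(f"Estimated Q10 concentration: {conc_Q:.1f} ug/mL")
```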

Keywords: coenzyme Q10, vitamin E, chemometry, quantitative analysis, first derivative spectrophotometry, artificial neural network

Procedia PDF Downloads 441
7415 Analysis of Transformer Reactive Power Fluctuations during Adverse Space Weather

Authors: Patience Muchini, Electdom Matandiroya, Emmanuel Mashonjowa

Abstract:

A ground-level manifestation of space weather phenomena is known as geomagnetically induced currents (GICs). During significant geomagnetic storms, GICs flow along the electric power transmission cables connecting transformers and between the grounding points of power transformers. Geomagnetically induced currents have been studied in other regions and have been noted to affect the power grid network. In Zimbabwe, grid failures have been experienced, but it is yet to be proven whether these failures were due to GICs. The purpose of this paper is to characterize geomagnetically induced currents within a power grid network. The paper analyses geomagnetic data, including the Kp index, Dst index, and G-scale from geomagnetic storms, together with power grid data, including reactive power, relay tripping, and alarms from high-voltage substations, and then correlates the two. The analysis was first carried out theoretically by studying geomagnetic parameters and then verified experimentally, with MATLAB used as the basic software to correlate the data. The latitudes of the substations were also scrutinised to assess whether location has an impact, since low-latitude areas such as most parts of Zimbabwe experience less severe geomagnetic variations. Based on theoretical and graphical analysis, a slight relationship between power system failures and GICs has been shown. Further analyses can be done by installing measuring instruments to record any currents in the grounding of high-voltage transformers when geomagnetic storms occur. Mitigation measures can then be developed to minimize the susceptibility of the power network to GICs.
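A minimal sketch of the correlation step is given below, assuming hourly CSV exports of the geomagnetic indices and substation reactive power with the column names shown; the actual data formats used in the study are not specified in the abstract.

```python
# Sketch of correlating geomagnetic indices with substation reactive power
# (file and column names are assumed for illustration).
import pandas as pd

geo = pd.read_csv("geomagnetic_indices.csv", parse_dates=["time"])       # Kp, Dst
grid = pd.read_csv("substation_reactive_power.csv", parse_dates=["time"])

# Align both series in time before correlating.
merged = pd.merge_asof(grid.sort_values("time"),
                       geo.sort_values("time"),
                       on="time", direction="nearest")

# Pearson correlation between storm indices and reactive power fluctuations.
print(merged[["kp_index", "dst_index", "reactive_power_mvar"]].corr())
```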

Keywords: adverse space weather, Dst index, geomagnetically induced currents, Kp index, reactive power

Procedia PDF Downloads 109
7414 A Double Differential Chaos Shift Keying Scheme for Ultra-Wideband Chaotic Communication Technology Applied in Low-Rate Wireless Personal Area Network

Authors: Ghobad Gorji, Hasan Golabi

Abstract:

The goal of this paper is to describe the design of an ultra-wideband (UWB) system optimized for low-rate wireless personal area network applications. To this end, we propose a system based on direct chaotic communication (DCC) technology. In this system, a 2-GHz-wide chaotic signal is generated directly in the lower band of the UWB spectrum, i.e., 3.1-5.1 GHz. Two simple modulation schemes, namely chaotic on-off keying (COOK) and differential chaos shift keying (DCSK), have previously been studied for this system and their performance evaluated. We propose a modulation scheme, namely double DCSK, to improve the performance of UWB DCC. The characteristics of these systems are compared using Monte Carlo simulations based on the additive white Gaussian noise (AWGN) and IEEE 802.15.4a standard channel models.
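For readers unfamiliar with chaos-based keying, the following is a minimal baseband Monte Carlo sketch of plain DCSK over AWGN using a logistic-map chip generator. It is not the paper's double DCSK scheme, UWB front end, or IEEE 802.15.4a channel model; the spreading factor and energy scaling are illustrative assumptions.

```python
# Baseband DCSK Monte Carlo over AWGN (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def logistic_chaos(n, x0=0.37, r=3.99):
    """Zero-mean, unit-variance chaotic chip sequence from the logistic map."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    x -= x.mean()
    return x / x.std()

def dcsk_ber(ebn0_db, n_bits=5000, spreading=64):
    """Monte Carlo bit error rate of plain DCSK over AWGN (rough scaling)."""
    eb = 2.0 * spreading                 # two slots of unit-variance chips
    sigma = np.sqrt(eb / 10 ** (ebn0_db / 10.0) / 2.0)
    errors = 0
    for _ in range(n_bits):
        bit = int(rng.integers(0, 2))
        ref = logistic_chaos(spreading, x0=rng.uniform(0.1, 0.9))
        data = ref if bit == 1 else -ref         # data slot: +/- the reference
        rx = np.concatenate([ref, data]) + rng.normal(0.0, sigma, 2 * spreading)
        # Correlate the two slots; the sign of the correlation decides the bit.
        decision = int(np.dot(rx[:spreading], rx[spreading:]) > 0)
        errors += decision != bit
    return errors / n_bits

for ebn0 in (0, 5, 10, 15):
    print(f"Eb/N0 = {ebn0} dB -> BER ~ {dcsk_ber(ebn0):.4f}")
```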

Keywords: UWB, DCC, IEEE 802.15.4a, COOK, DCSK

Procedia PDF Downloads 69
7413 Theology of Science and Technology as a Tool for Peace Education

Authors: Jonas Chikelue Ogbuefi

Abstract:

Science and technology have a major impact on societal peace; they offer support to teaching and learning, cut costs, and offer solutions to the current agitations and militancy in Nigeria today. Christianity, for instance, has not only changed and formed the western world over the past 2022 years but still has a substantial role to play in society through liquid ecclesiology. This paper interrogates the impact of the theology of science and technology as a tool for peace sustainability through peace education in Nigeria. The method adopted is a historical and descriptive method of analysis. It was discovered that a large number of Nigerian citizens lack almost all the basic requirements for a decent standard of living, such as shelter, meaningful employment, and clothing, which is the root cause of the agitations in Nigeria. Based on these findings, the paper contends that the government alone cannot restore peace in Nigeria; its inability to do so calls for all religious actors to be involved. The main thrust and recommendation of this paper are to challenge religious actors to implement the theology of science and technology as a tool for peace restoration, and to network with both government and the private sector to make funds available to budding and existing entrepreneurs, using science and technology as a tool for peace and economic sustainability. This paper views the theology of science and technology as a tool for peace and economic sustainability in Nigeria.

Keywords: theology, science, technology, peace education

Procedia PDF Downloads 79
7412 Designing an Editorialization Environment for Repeatable Self-Correcting Exercises

Authors: M. Kobylanski, D. Buskulic, P.-H. Duron, D. Revuz, F. Ruggieri, E. Sandier, C. Tijus

Abstract:

In order to design a cooperative e-learning platform, we observed teams consisting of a teacher [T], a computer scientist [CS], and an exercise programmer-designer [ED] cooperating on the conception of a self-correcting exercise, but without the use of such a device, in order to capture the kinds of interactions a useful platform might provide. To do so, we first ran a task analysis of how T, CS, and ED should cooperate in order to achieve, at best, the task of creating and implementing self-directed, self-paced, repeatable self-correcting exercises (RSE) in the context of open educational resources. The formalization of the whole process was based on the "objectives, activities and evaluations" theory of educational task analysis. Second, using the resulting frame as a "how-to-do-it" guide, we ran a series of three contrasted hackathons of RSE production to collect data about the cooperative process that could later be used to design the collaborative e-learning platform. Third, we used two complementary methods to collect, code, and analyze the survey data: the directional flow of interaction among T-CS-ED experts holding a functional role, and Means-End Problem Solving analysis. Fourth, we listed the set of derived recommendations useful for the design of the exerciser as a cooperative e-learning platform. The final recommendations underline the necessity of building (i) an ecosystem that can sustain teams of T-CS-ED experts, (ii) a data-safe platform that nevertheless offers accessibility and open discussion about the production of exercises and their resources, and (iii) a good architecture allowing the inheritance of parts of the code of any exercise already in the database, as well as fast implementation of new kinds of exercises along with their associated learning activities.

Keywords: editorialization, open educational resources, pedagogical alignment, produsage, repeatable self-correcting exercises, team roles

Procedia PDF Downloads 117
7411 An Intelligent Prediction Method for Annular Pressure Driven by Mechanism and Data

Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li, Shuo Zhu, Shiming Duan, Xuezhe Yao

Abstract:

Accurate calculation of wellbore pressure is of great significance for preventing wellbore risk during drilling. The traditional mechanism model requires many iterative solving procedures, which reduces calculation efficiency and makes it difficult to meet the demand for dynamic control of wellbore pressure. In recent years, many scholars have introduced artificial intelligence algorithms into wellbore pressure calculation, significantly improving calculation efficiency and accuracy. However, owing to the 'black box' nature of intelligent algorithms, existing intelligent wellbore pressure models struggle to perform outside the scope of their training data and overreact to data noise, often producing abnormal results. In this study, the multi-phase flow mechanism is embedded into the objective function of the neural network model as a constraint, and an intelligent prediction model of wellbore pressure under this constraint is established based on more than 400,000 sets of pressure measurement while drilling (MPD) data. The multi-phase flow constraint makes the predictions of the neural network model more consistent with the distribution law of wellbore pressure, overcoming the black-box attribute of the model to some extent. The main improvements are that accuracy on the independent test set is further increased and abnormal calculated values essentially disappear. This method is driven jointly by MPD data and the multi-phase flow mechanism, and it points the way toward accurate and efficient wellbore pressure prediction in the future.
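The core idea, embedding the mechanism model in the training objective, can be sketched as a soft-penalty loss in which the multi-phase flow model's pressure estimate constrains the network's prediction. The network architecture, penalty weight, and synthetic tensors below are assumptions; the authors' actual flow-mechanism residual is not given in the abstract.

```python
# Sketch of a mechanism-constrained training objective (a generic soft-penalty
# formulation, not the authors' exact multi-phase flow residual).
import torch
import torch.nn as nn

class PressureNet(nn.Module):
    def __init__(self, n_features):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                 nn.Linear(64, 64), nn.ReLU(),
                                 nn.Linear(64, 1))
    def forward(self, x):
        return self.net(x)

def constrained_loss(pred, measured, mechanism_pressure, weight=0.1):
    """Data misfit plus a penalty pulling predictions toward the
    mechanism (multi-phase flow) model's pressure estimate."""
    data_loss = nn.functional.mse_loss(pred, measured)
    mechanism_loss = nn.functional.mse_loss(pred, mechanism_pressure)
    return data_loss + weight * mechanism_loss

# Assumed tensors: drilling features X, MPD measurements y, and the
# mechanism model's pressure estimate p_mech for the same samples.
model = PressureNet(n_features=12)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
X = torch.randn(256, 12)
y = torch.randn(256, 1)
p_mech = y + 0.05 * torch.randn(256, 1)

for _ in range(100):
    optimizer.zero_grad()
    loss = constrained_loss(model(X), y, p_mech)
    loss.backward()
    optimizer.step()
```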

Keywords: multiphase flow mechanism, pressure while drilling data, wellbore pressure, mechanism constraints, combined drive

Procedia PDF Downloads 167
7410 Peer Instruction, Technology, Education for Textile and Fashion Students

Authors: Jimmy K. C. Lam, Carrie Wong

Abstract:

One of the key goals for learning and teaching documented in the University strategic plan 2012/13-2017/18 is to encourage active learning, the use of innovative teaching approaches and technology, and the adoption of flexible and varied teaching delivery methods. This research reports on a recent visit to Prof. Eric Mazur at Harvard University concerning Peer Instruction, collaborative learning in large classes, and the innovative use of technology to enable new modes of learning. Peer Instruction is a research-based, interactive teaching method developed by Prof. Mazur at Harvard University in the 1990s. It has been adopted across disciplines, institutional types, and throughout the world. One problem with conventional teaching lies in the presentation of the material. Frequently, it comes straight out of textbooks or lecture notes, giving students little incentive to attend class. This traditional presentation is almost always delivered as a monologue in front of a passive audience, and only exceptional lecturers are capable of holding students' attention for an entire lecture period. Consequently, lectures simply reinforce students' feeling that the most important step in mastering the material is memorizing a zoo of unrelated examples. In order to address these misconceptions about learning, Prof. Mazur's team developed Peer Instruction, a method which involves students in their own learning during lectures and focuses their attention on underlying concepts. Lectures are interspersed with conceptual questions called Concept Tests, designed to expose common difficulties in understanding the material. The students are given one or two minutes to think about the question and formulate their own answers; they then spend two or three minutes discussing their answers in groups of three or four, attempting to reach consensus on the correct answer. This process forces the students to think through the arguments being developed and enables them to assess their understanding of the concepts before they leave the classroom. The findings on Peer Instruction and the innovative use of technology in teaching at Harvard University were applied to first-year Textiles and Fashion students in Hong Kong. A survey of 100 students showed that over 80% of students enjoyed the flexibility of Peer Instruction and 70% enjoyed the instant feedback from the clicker system (the student response system used at Harvard University). Further work will continue to explore the possibility of extending Peer Instruction to art and fashion students.

Keywords: peer instruction, education, technology, fashion

Procedia PDF Downloads 312
7409 Replicating Brain’s Resting State Functional Connectivity Network Using a Multi-Factor Hub-Based Model

Authors: B. L. Ho, L. Shi, D. F. Wang, V. C. T. Mok

Abstract:

The brain's functional connectivity, while temporally non-stationary, expresses consistency at a macro spatial level. The study of stable resting-state connectivity patterns hence provides opportunities for the identification of diseases if such stability is severely perturbed. A mathematical model replicating the brain's spatial connections is useful for understanding the brain's representative geometry and complements the empirical model where it falls short. Empirical computations tend to involve large matrices and become infeasible with fine parcellation; the proposed analytical model has no such computational problems. To improve replicability, data from 92 subjects were obtained from two open sources. The proposed methodology, inspired by financial theory, uses multivariate regression to find the relationships of every cortical region of interest (ROI) with a set of pre-identified hubs. These hubs act as representatives for the entire cortical surface. A variance-covariance framework over all ROIs is then built from these relationships to link up all the ROIs. The result is a high level of agreement between model and empirical correlations, in the range of 0.59 to 0.66 after adjusting for sample size, an increase of almost forty percent. More significantly, the model framework provides an intuitive way to delineate systemic drivers from idiosyncratic noise while reducing dimensionality by more than 30-fold, hence providing a way to conduct attribution analysis. Due to its analytical nature and simple structure, the model is useful as a standalone toolkit for network dependency analysis or as a module for other mathematical models.
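A minimal NumPy sketch of the hub-based construction: regress every ROI time series on the hub time series, then reassemble a model covariance from the fitted loadings, the hub covariance, and the residual variances. The synthetic data, the number of hubs, and the diagonal-noise assumption are illustrative choices, not the authors' exact specification.

```python
# Sketch of the hub-based covariance construction (hub choice and data are
# assumed; only the regression-and-reassembly logic is illustrated).
import numpy as np

# Assumed: roi_ts is (T, N) time series for N cortical ROIs, hub_ts is (T, H)
# time series for H pre-identified hub regions.
rng = np.random.default_rng(1)
T, N, H = 200, 90, 8
hub_ts = rng.standard_normal((T, H))
roi_ts = hub_ts @ rng.standard_normal((H, N)) + 0.5 * rng.standard_normal((T, N))

# Multivariate regression of every ROI on the hubs.
beta, *_ = np.linalg.lstsq(hub_ts, roi_ts, rcond=None)       # (H, N) loadings
residuals = roi_ts - hub_ts @ beta
resid_var = residuals.var(axis=0)

# Model covariance: systematic part driven by the hubs plus idiosyncratic noise.
hub_cov = np.cov(hub_ts, rowvar=False)
model_cov = beta.T @ hub_cov @ beta + np.diag(resid_var)
model_corr = model_cov / np.sqrt(np.outer(np.diag(model_cov), np.diag(model_cov)))

# Compare with the empirical ROI correlation matrix.
empirical_corr = np.corrcoef(roi_ts, rowvar=False)
print("mean |difference|:", np.abs(model_corr - empirical_corr).mean())
```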

Keywords: functional magnetic resonance imaging, multivariate regression, network hubs, resting state functional connectivity

Procedia PDF Downloads 148
7408 Features Vector Selection for the Recognition of the Fragmented Handwritten Numeric Chains

Authors: Salim Ouchtati, Aissa Belmeguenai, Mouldi Bedda

Abstract:

In this study, we propose an offline system for the recognition of fragmented handwritten numeric chains. First, we built a recognition system for isolated handwritten digits; in this part, the study is based mainly on evaluating the performance of a neural network trained with the gradient backpropagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the isolated handwritten digits by several methods: the distribution sequence, the application of sondes, Barr features, and the centred moments of the different projections and profiles. Second, the study is extended to the reading of fragmented handwritten numeric chains consisting of a variable number of digits. The vertical projection is used to segment the numeric chain into isolated digits, and every digit (or segment) is presented separately to the input of the system built in the first part (the recognition system for isolated handwritten digits).
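The segmentation stage can be sketched as follows: the vertical projection (column-wise ink count) of the binarised chain is scanned, and runs of non-empty columns are cut out as candidate digits. The minimum-width threshold is an assumed detail, not the paper's exact setting.

```python
# Sketch of vertical-projection segmentation of a numeric chain into digits
# (plain NumPy illustration; feature extraction is not reproduced here).
import numpy as np

def segment_digits(binary_image, min_width=3):
    """Split a binary image (foreground = 1) into digit sub-images using
    columns whose vertical projection is zero as separators."""
    projection = binary_image.sum(axis=0)          # ink count per column
    in_digit = projection > 0
    segments, start = [], None
    for col, ink in enumerate(in_digit):
        if ink and start is None:
            start = col
        elif not ink and start is not None:
            if col - start >= min_width:
                segments.append(binary_image[:, start:col])
            start = None
    if start is not None:
        segments.append(binary_image[:, start:])
    return segments

# Each segment would then be normalised and fed to the isolated-digit
# neural network trained with gradient backpropagation.
```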

Keywords: features extraction, handwritten numeric chains, image processing, neural networks

Procedia PDF Downloads 262
7407 Why Is the Recurrence Rate of Residual or Recurrent Disease Following Endoscopic Mucosal Resection (EMR) of Oesophageal Dysplasias and T1 Tumours Higher in the Greater Midlands Cancer Network?

Authors: Harshadkumar Rajgor, Jeff Butterworth

Abstract:

Background: Barrett's oesophagus increases the risk of developing oesophageal adenocarcinoma. Over the last 40 years, there has been a 6-fold increase in the incidence of oesophageal adenocarcinoma in the western world, and incidence rates are increasing at a greater rate than those of cancers of the colon, breast, and lung. Endoscopic mucosal resection (EMR) is a relatively new technique being used by two centres in the Greater Midlands Cancer Network. EMR can be used for curative or staging purposes for high-grade dysplasias and T1 tumours of the oesophagus, and it is also suitable for those deemed high risk for oesophagectomy. EMR has a recurrence rate of 21% according to the Wiesbaden data. Method: A retrospective study of prospectively collected data was carried out involving 24 patients who had EMR for curative or staging purposes. Complications of residual or recurrent disease following EMR that required further treatment were investigated. Results: In 54% of cases, residual or recurrent disease was suspected. 96% of patients were given clear and concise information regarding their diagnosis of high-grade dysplasia or T1 tumours. All 24 patients consulted the same specialist healthcare team. Conclusion: EMR is a safe and effective treatment for patients with high-grade dysplasia and T1N0 tumours. In 54% of cases, residual or recurrent disease was suspected. Initially, only single resections were undertaken; multiple resections are now being carried out to reduce the risk of recurrence. Complications from EMR remain low in this series and consisted of a single episode of post-procedural bleeding.

Keywords: endoscopic mucosal resection, oesophageal dysplasia, T1 tumours, cancer network

Procedia PDF Downloads 313
7406 Scientific Recommender Systems Based on Neural Topic Model

Authors: Smail Boussaadi, Hassina Aliane

Abstract:

With the rapid growth of scientific literature, it is becoming increasingly challenging for researchers to keep up with the latest findings in their fields. Academic and professional networks play an essential role in connecting researchers and disseminating knowledge. To improve the user experience within these networks, we need effective article recommendation systems that provide personalized content. Current recommendation systems often rely on collaborative filtering or content-based techniques. However, these methods have limitations, such as the cold-start problem and difficulty in capturing semantic relationships between articles. To overcome these challenges, we propose a new approach that combines BERTopic, a state-of-the-art topic modeling technique built on Bidirectional Encoder Representations from Transformers (BERT), with community detection algorithms in an academic and professional network. Experiments confirm our performance expectations by showing good relevance and objectivity in the results.
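A minimal sketch of how the two building blocks might be wired together is shown below, using BERTopic's documented fit_transform interface and NetworkX's greedy modularity community detection; the network construction, corpus, and recommendation rule are assumptions, since the abstract does not detail them.

```python
# Sketch of a topic-plus-community pipeline (assumed wiring; only the library
# calls to BERTopic and NetworkX follow their documented interfaces).
from bertopic import BERTopic
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def build_topics_and_communities(abstracts, coauthor_edges):
    """abstracts: list of article texts (a few hundred or more);
    coauthor_edges: iterable of (researcher_a, researcher_b) pairs."""
    # Neural topic model over the article corpus.
    topic_model = BERTopic()
    topics, probs = topic_model.fit_transform(abstracts)

    # Community detection over the academic network.
    graph = nx.Graph()
    graph.add_edges_from(coauthor_edges)
    communities = list(greedy_modularity_communities(graph))
    return topic_model, topics, communities

# One possible recommendation rule (an assumption, not the authors' exact
# method): rank unseen articles whose dominant topic is most common among
# the articles written by members of a researcher's own community.
```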

Keywords: scientific articles, community detection, academic social network, recommender systems, neural topic model

Procedia PDF Downloads 93
7405 Efficacy of Learning: Digital Sources versus Print

Authors: Rahimah Akbar, Abdullah Al-Hashemi, Hanan Taqi, Taiba Sadeq

Abstract:

As technology continues to develop, teaching curricula in both schools and universities have begun adopting a more computer/digital-based approach to the transmission of knowledge and information, as opposed to the more old-fashioned use of textbooks. This gives rise to the question: are there any differences between learning from a digital source and learning from a printed source, such as a textbook? More specifically, which medium of information results in better long-term retention? A review of the confounding factors implicated in understanding the relationship between learning from the two different media was conducted. Alongside this, a 4-week cohort study involving 76 first-year female English Language students was performed, in which the participants were divided into two groups. Group A studied material from a paper source (referred to as the Print Medium), and Group B studied material from a digital source (Digital Medium). The dependent variables were memory recall, indexed by a 4-point grading system, and the total frequency of item repetition. The study was facilitated by the computer software SuperMemo. Results showed that, contrary to prevailing evidence, the Digital Medium group showed no statistically significant differences in the shift from Remember (episodic) to Know (semantic) when all confounding factors were accounted for. The shift from Random Guess and Familiar to Remember occurred faster in the Digital Medium than in the Print Medium.

Keywords: digital medium, print medium, long-term memory recall, episodic memory, semantic memory, super memo, forgetting index, frequency of repetitions, total time spent

Procedia PDF Downloads 287
7404 The Influence of Cognitive Load in the Acquisition of Words through Sentence or Essay Writing

Authors: Breno Barrreto Silva, Agnieszka Otwinowska, Katarzyna Kutylowska

Abstract:

Research comparing lexical learning following the writing of sentences and longer texts with keywords is limited and contradictory. One possibility is that the recursivity of writing may enhance processing and increase lexical learning; another is that the higher cognitive load of complex-text writing (e.g., essays), at least when timed, may hinder the learning of words. In our study, we selected two sets of 10 academic keywords matched for part of speech, length (number of characters), frequency (SUBTLEXus), and concreteness, and we asked 90 L1-Polish advanced-level English majors to use the keywords when writing sentences, timed essays (60 minutes), or untimed essays. First, all participants wrote a timed control essay (60 minutes) without keywords. Then different groups produced timed essays (60 minutes; n=33), untimed essays (n=24), or sentences (n=33) using the two sets of glossed keywords (counterbalanced). The comparability of the participants in the three groups was ensured by matching them for proficiency in English (LexTALE) and for a few measures derived from the control essay: VocD (productive lexical diversity), normed errors (productive accuracy), words per minute (productive written fluency), and holistic scores (overall quality of production). We measured lexical learning (depth and breadth) via an adapted Vocabulary Knowledge Scale (VKS) and a free association test. Cognitive load was measured in the three essays (control, timed, untimed) using the normed number of errors and holistic scores (TOEFL criteria). The number of errors and essay scores were obtained from two raters (interrater reliability Pearson's r=.78-.91). Generalized linear mixed models showed no difference in the breadth and depth of keyword knowledge after writing sentences, timed essays, and untimed essays. The task-based measurements found that control and timed essays had similar holistic scores, but that untimed essays were of better quality than timed essays. The untimed essay condition was also the most accurate, and the timed essay condition the most error-prone. In conclusion, using keywords in timed, but not untimed, essays increased cognitive load, leading to more errors and lower quality. Still, writing sentences and essays yielded similar lexical learning, and differences in cognitive load between timed and untimed essays did not affect lexical acquisition.
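As an illustration of the reported mixed-effects analysis, the sketch below fits a linear mixed model with statsmodels as a stand-in for the generalized models used in the study; the file, column names, and formula are assumed, not the authors' actual specification.

```python
# Sketch of a mixed-effects comparison of keyword knowledge across writing
# conditions (assumed long-format data: one row per participant x keyword).
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("vks_scores.csv")    # hypothetical file with assumed columns

# Random intercept per participant; condition and proficiency as fixed effects.
model = smf.mixedlm("vks_score ~ condition + lextale",
                    data, groups=data["participant"], re_formula="1")
result = model.fit()
print(result.summary())
```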

Keywords: learning academic words, writing essays, cognitive load, English as an L2

Procedia PDF Downloads 65
7403 A Collaborative Action Research by Using the Children’s School Success Plus Curriculum Framework to Support Early Childhood Education/Early Childhood Special Education Teachers to Build a Professional Learning Community

Authors: Chiou-Shiue Ko, Pei-Fang Wu, Shu-hsien Tseng

Abstract:

The researchers adopted a two-year action research design to investigate the professional collaborative process and the development of learning communities of early childhood and early childhood special education teachers implementing the children's school success curriculum framework. Participating teachers were recruited from three preschool sites. Research data were collected using multiple methods in order to ensure data quality and validity. The results showed that the participating educators achieved professional growth and became more aware of their teaching intentions and curriculum preparation. Teachers in this research became more child-focused in their teaching and created opportunities for children to participate in classroom activities and routines. The researchers also found that teachers' participation levels were driven by individual personality; during professional growth, some teachers were more proactive and reflective, and some were not. Based on the research findings, suggestions for future studies and practices are provided.

Keywords: children’s school success curriculum framework, early childhood special education, preschool education, professional learning community

Procedia PDF Downloads 133
7402 The Desire to Know: Arnold’s Contribution to a Psychological Conceptualization of Academic Motivation

Authors: F. Ruiz-Fuster

Abstract:

Arnold's redefinition of human motives can sustain a psychology of education that emphasizes the beauty of knowledge and the exercise of intellectual functions. Thus, education, instead of focusing on skills and learning by doing, would be centred on 'the widest reaches of the human spirit'. One way to attain this is by developing children's inherent interest. Arnold takes into account the fact that the desire to know is the inherent interest that leads students to explore and learn. She also emphasizes the need to exercise human functions such as thinking, judging, and reasoning. According to Arnold, the influence of psychological theories of motivation in education has resulted in the view that all learning and school tasks should derive from children's needs and impulses. The desire to know and curiosity have not been considered as basic and active as any instinctive drive or basic need, so there has been an attempt to justify and understand how biological drives guide students' learning. However, understanding motives and motivation not as a drive, an instinct, or an impulse guided by our basic needs, but as a want that leads to action, can help us understand, from a psychological perspective, how teachers can motivate students to learn, strengthening their desire and interest to reason and to discover the whole new world of knowledge.

Keywords: academic motivation, interests, desire to know, educational psychology, intellectual functions

Procedia PDF Downloads 148
7401 Leveraging Engineering Education and Industrial Training: Learning from a Case Study

Authors: Li Wang

Abstract:

The explosion of technological advances has opened up many career options for engineering graduates. How relevant their university learning is therefore depends very much on their actual jobs. Bridging the gap between education and industrial practice is important, but it also becomes evident that engineering education and industrial training can be leveraged at the same time, striking a balance between what students should grasp at university and what they can be continuously trained on in the working environment. Through a case study of developing a commercial product, this paper presents the required depth of technical knowledge and skills for some typical engineering jobs (in mechanical/materials engineering). It highlights the collaboration necessary among industry, universities, and accreditation bodies to nurture the next generation of engineers.

Keywords: leverage, collaboration, career, industry, engineering education

Procedia PDF Downloads 91
7400 Challenge Based Learning Approach for a Craft Mezcal Kiln Energetic Redesign

Authors: Jonathan A. Sánchez Muñoz, Gustavo Flores Eraña, Juan M. Silva

Abstract:

The Mexican mezcal industry has attracted attention during the last decade because mezcal has become a popular beverage in North American and European markets, gaining popularity for its craft character. Despite this wide demand, production processes are still carried out with rudimentary equipment, and there is a lack of evidence on how to improve kiln energy efficiency. Tec21 is a challenge-based learning curricular model implemented by Tecnológico de Monterrey since 2019, in which each formation unit requires an industrial partner. 'Problem processes solution' is a formation unit designed for mechatronics engineers, in which students apply the knowledge they have acquired in thermofluids and applied electronics. Over five weeks, students are immersed in an industrial problem in order to reach the level of competencies defined by the formation unit designers. This work evaluates the competencies acquired by the students through a qualitative research methodology. Several evaluation instruments (report, essay, and poster) were selected to assess ethical argumentation, principles of sustainability, implemented actions, process modelling, and redesign feasibility.

Keywords: applied electronic, challenge based learning, competencies, mezcal industry, thermofluids

Procedia PDF Downloads 116
7399 Intrusion Detection Using Dual Artificial Techniques

Authors: Rana I. Abdulghani, Amera I. Melhum

Abstract:

With the abnormal growth in the use of computers over networks, and given the view of most computer security experts that the goal of building a perfectly secure system is never achieved in practice, intrusion detection systems (IDS) have become a necessary line of defence. This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of swarm intelligence; here, the algorithm is enhanced to obtain the minimum error rate by amending the cluster centres whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second technique uses a backpropagation (BP) neural network. Finally, the results of the two methods are compared using the NSL-KDD data sets for the construction and evaluation of intrusion detection systems. This research is only concerned with clustering the given connection records into two categories (Normal and Abnormal). Practical experiments resulted in an intrusion detection rate of 99.183818% for the enhanced PSO (EPSO) and 69.446416% for the BP neural network.
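A compact sketch of the enhanced-PSO idea follows: particles carry a pair of cluster centres (Normal/Abnormal), the fitness is the misclassification rate, and personal/global bests amend the centres whenever a better fitness is found. The synthetic data, swarm parameters, and velocity update below are illustrative assumptions; the NSL-KDD preprocessing is not reproduced.

```python
# Sketch of PSO-adjusted cluster centres for two-class intrusion detection
# (feature matrix, labels, and PSO parameters are assumed).
import numpy as np

rng = np.random.default_rng(0)

def error_rate(centers, X, y):
    """Assign each record to its nearest centre; return the misclassification
    rate (the fitness to be minimised)."""
    d0 = np.linalg.norm(X - centers[0], axis=1)
    d1 = np.linalg.norm(X - centers[1], axis=1)
    predicted = (d1 < d0).astype(int)
    return np.mean(predicted != y)

def pso_cluster(X, y, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    dim = X.shape[1]
    pos = rng.uniform(X.min(0), X.max(0), (n_particles, 2, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_fit = pos.copy(), np.array([error_rate(p, X, y) for p in pos])
    gbest, gbest_fit = pbest[pbest_fit.argmin()].copy(), pbest_fit.min()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        fit = np.array([error_rate(p, X, y) for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        if fit.min() < gbest_fit:                 # amend the best centres
            gbest, gbest_fit = pos[fit.argmin()].copy(), fit.min()
    return gbest, 1.0 - gbest_fit                 # centres, detection accuracy

# Synthetic stand-in data; normalised NSL-KDD features would be used in practice.
X = np.vstack([rng.normal(0, 1, (200, 5)), rng.normal(3, 1, (200, 5))])
y = np.array([0] * 200 + [1] * 200)
centers, rate = pso_cluster(X, y)
print("detection accuracy:", rate)
```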

Keywords: IDS, SI, BP, NSL_KDD, PSO

Procedia PDF Downloads 378
7398 Empowering Business Students with Intercultural Communicative Competence through Multicultural Literature

Authors: Dorsaf Ben Malek

Abstract:

The function of culture in language teaching has changed because of globalization and the latest technologies. English has become a lingua franca, which has altered teaching objectives; the re-evaluation of cultural awareness is one of them. Business English teaching has also been subject to all these changes. It would therefore be a mistake to consider it merely the transmission of an unlimited listing of lexis, diagrams, charts, and statistics. In fact, business students' future careers will require business terminology together with intercultural communicative competence (ICC) to handle multicultural encounters and contribute to the international community. The first part of this paper is dedicated to the necessity of empowering business students with intercultural communicative competence, and the second turns to the potential of multicultural literature for implementing ICC in business English teaching. This was demonstrated through qualitative action research conducted with a group of Tunisian MA business students. It was an opportunity to discover the potential of multicultural literature, together with inquiry-based learning, for enhancing business students' intercultural communicative competence. Data were collected through classroom observations, journals, and semi-structured interviews. The results were in favour of using multicultural literature to enhance business students' ICC. In addition, the short story may be a motivating tool for reading literature, and inquiry-based learning can be an effective approach to teaching literature.

Keywords: intercultural communicative competence, multicultural literature, short stories, inquiry-based learning

Procedia PDF Downloads 329
7397 The Use of Emerging Technologies in Higher Education Institutions: A Case of Nelson Mandela University, South Africa

Authors: Ayanda P. Deliwe, Storm B. Watson

Abstract:

The COVID-19 pandemic has disrupted the established practices of higher education institutions (HEIs). Most higher education institutions worldwide had to shift from traditional face-to-face to online learning. The online environment and new online tools are disrupting the way in which higher education is presented, and the structures of higher education institutions have been affected by rapid advancements in information and communication technologies. Emerging technologies should not be viewed in a negative light because, as opposed to the traditional curriculum that worked to create productive and efficient researchers, emerging technologies encourage creativity and innovation; using technology together with traditional means will therefore enhance teaching and learning. Emerging technologies in higher education not only change the experience of students, lecturers, and content, but also influence the attraction and retention of students. Higher education institutions are under immense pressure because they are competing not only locally and nationally; emerging technologies also expand the competition internationally. Emerging technologies have eliminated border barriers, allowing students to study in the country of their choice regardless of where they are in the world. Higher education institutions are becoming indifferent as technology finds its way into the lecture room day by day, and academics need to utilise the technology at their disposal if they want to get through to their students. Academics now compete for students' attention with social media platforms such as WhatsApp, Snapchat, Instagram, Facebook, TikTok, and others, which poses a significant challenge to higher education institutions. It is therefore critical to pay attention to emerging technologies in order to see how they can be incorporated into the classroom to improve educational quality while remaining relevant to the work industry. This study aims to understand how emerging technologies have been utilised at Nelson Mandela University in presenting teaching and learning activities since April 2020. The primary objective is to analyse how academics are incorporating emerging technologies into their teaching and learning activities. This objective was addressed by conducting a literature review clarifying and conceptualising the emerging technologies used by higher education institutions and reviewing and analysing their use, and it will be investigated further through an empirical analysis of the use of emerging technologies at Nelson Mandela University. Findings from the literature review revealed that emerging technology affects several key areas in higher education institutions, such as the attraction and retention of students, the enhancement of teaching and learning, increased global competition, the elimination of border barriers, and the digital divide. The literature review further identified learning management systems, open educational resources, learning analytics, and artificial intelligence as the most prevalent emerging technologies used in higher education institutions. These emerging technologies will be analysed further through an empirical analysis of how they are being utilised at Nelson Mandela University.

Keywords: artificial intelligence, emerging technologies, learning analytics, learner management systems, open educational resources

Procedia PDF Downloads 67
7396 Trauma: Constructivist Theoretical Framework

Authors: Wendi Dunham, Kimberly Floyd

Abstract:

The constructivist approach to learning is a theoretical orientation that posits that individuals create their own understanding and knowledge of the world through their experiences and interactions. This approach emphasizes that learning is an active process and that individuals are not passive recipients when constructing their understanding of their world. When used concurrently with trauma-informed practices, a constructivist approach can inform the development of a framework for students and teachers that supports their social, emotional, and mental health in addition to enabling academic success. This framework can be applied to teachers and students. When applied to teachers, it can be used to achieve purposeful coping mechanisms through restorative justice and dispositional mindfulness. When applied to students, the framework can implement proactive, student-based practices such as Response to Intervention (RtI) and the 4 Rs to connect resiliency and intervention to academic learning. Using a constructivist, trauma-informed framework can provide students with a greater sense of control and agency over their trauma experiences and impart confidence in achieving school success.

Keywords: trauma, trauma informed practices in education, constructivist theory framework, school responses to trauma, trauma informed supports for teachers, trauma informed strategies for students, restorative justice, mindfulness, response to intervention, the 4 R's, resiliency

Procedia PDF Downloads 39
7395 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies - it will kill an estimated 7 million people every year, costing world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e. season, weekday/weekend), future weather forecasts, as well as past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to eliminate inaccuracies, weaknesses, and biases from any one individual model. Over time, the proposed framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall 92% model accuracy. The combined model is able to predict more accurately than any of the individual models, and it is able to reliably forecast season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
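The averaging step of the framework can be sketched with scikit-learn estimators standing in for the three learners named in the abstract; the hyperparameters and helper function are illustrative assumptions, and the self-adjustment and feature engineering of the real framework are not reproduced.

```python
# Sketch of the three-model averaging step (a minimal stand-in, not the
# study's actual framework).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

def averaged_air_quality_forecast(X_train, y_train, X_future):
    """Fit the three learners and average their predicted probabilities of
    each air-quality class, as described in the abstract."""
    models = [LogisticRegression(max_iter=1000),
              RandomForestClassifier(n_estimators=200, random_state=0),
              MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500,
                            random_state=0)]
    probas = []
    for model in models:
        model.fit(X_train, y_train)
        probas.append(model.predict_proba(X_future))
    mean_proba = np.mean(probas, axis=0)
    return mean_proba.argmax(axis=1)      # averaged class prediction
```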

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 120
7394 Aire-Dependent Transcripts have Shortened 3’UTRs and Show Greater Stability by Evading Microrna-Mediated Repression

Authors: Clotilde Guyon, Nada Jmari, Yen-Chin Li, Jean Denoyel, Noriyuki Fujikado, Christophe Blanchet, David Root, Matthieu Giraud

Abstract:

Aire induces the ectopic expression of a large repertoire of tissue-specific antigen (TSA) genes in thymic medullary epithelial cells (MECs), driving immunological self-tolerance in maturing T cells. Although important mechanisms of Aire-induced transcription have recently been disclosed through the identification and study of Aire's partners, the fine transcriptional functions underlain by a number of them and conferred on Aire are still unknown. Alternative cleavage and polyadenylation (APA) is an essential mRNA processing step regulated by a termination complex consisting of 85 proteins, 10 of which have been related to Aire. We evaluated APA in MECs in vivo by microarray analysis with mRNA-spanning probes and RNA deep sequencing. We uncovered the preference of Aire-dependent transcripts for short-3'UTR isoforms and for proximal poly(A) site selection, marked by increased binding of the cleavage factor Cstf-64. RNA interference of the 10 Aire-related proteins revealed that Clp1, a member of the core termination complex, exerts a profound effect on short-3'UTR isoform preference. Clp1 is also significantly upregulated in MECs compared to 25 mouse tissues, in which we found that TSA expression is associated with longer 3'UTR isoforms. Aire-dependent transcripts escape a global 3'UTR lengthening associated with MEC differentiation, thereby potentiating the repressive effect of microRNAs that are globally upregulated in mature MECs. Consistent with these findings, RNA deep sequencing of actinomycin D-treated MECs revealed the increased stability of short-3'UTR Aire-induced transcripts, resulting in the accumulation of TSA transcripts and contributing to their enrichment in MECs.

Keywords: Aire, central tolerance, miRNAs, transcription termination

Procedia PDF Downloads 378
7393 Enhancing Teaching of Engineering Mathematics

Authors: Tajinder Pal Singh

Abstract:

The teaching of mathematics to engineering students is an open-ended problem in education. The main goal of mathematics learning for engineering students is the ability to apply a wide range of mathematical techniques and skills in their engineering classes and later in their professional work. Many undergraduate engineering students and faculty members feel that no real attempt is made to demonstrate the applicability of the various mathematics topics that are taught, which makes mathematics unpopular with some engineering faculty and their students. A lack of understanding of concepts in engineering mathematics may hinder the understanding of other concepts or even whole subjects. Yet for most undergraduate engineering students, mathematics is one of the most difficult courses in their field of study. Many engineering students never understood mathematics, or never liked it, because it was too abstract for them and they could never relate to it. Only the right balance of application-based and concept-based teaching can fulfil the objectives of teaching mathematics to engineering students, and it will surely improve and enhance their problem-solving and creative-thinking skills. In this paper, some practical (informal) ways of making mathematics teaching application-based for engineering students are discussed. An attempt is made to understand the present state of mathematics teaching in engineering colleges. The weaknesses and strengths of the current teaching approach are elaborated, some of the causes of the unpopularity of the subject are analyzed, and a few pragmatic suggestions are made. Faculty in mathematics courses should spend more time discussing applications as well as conceptual underpinnings rather than focusing solely on strategies and techniques to solve problems. They should also introduce more 'word' problems, as these are commonly encountered in engineering courses. Overspecialization in engineering education should not occur at the expense of (or by diluting) mathematics and the basic sciences. The role of engineering education is to provide fundamental (basic) knowledge and to teach students a simple methodology of self-learning and self-development. All these issues would be better addressed if mathematics and engineering faculty joined hands to plan and design the learning experiences of the students who take their classes. When faculty stop competing against each other and start competing against the situation, they will perform better. Without creating any administrative hassles, these suggestions can be used by any young, inexperienced mathematics faculty member to inspire engineering students to learn engineering mathematics effectively.

Keywords: application based learning, conceptual learning, engineering mathematics, word problem

Procedia PDF Downloads 228
7392 Human-Centric Sensor Networks for Comfort and Productivity in Offices: Integrating Environmental, Body Area Network, and Participatory Sensing

Authors: Chenlu Zhang, Wanni Zhang, Florian Schaule

Abstract:

The indoor environment in office buildings directly affects the comfort, productivity, health, and well-being of building occupants. Wireless environmental sensor networks have been deployed in many modern offices to monitor and control indoor environments. However, indoor environmental variables are not strong predictors of the comfort and productivity levels of every occupant, owing to personal differences that are both physiological and psychological. This study proposes human-centric sensor networks that integrate wireless environmental sensors, body area network sensors, and participatory sensing technologies to collect data from both the environment and the occupants and to support building operations. The sensor networks were tested in one small and one medium-sized office room with 22 participants over five months. Indoor environmental data (e.g., air temperature and relative humidity), physiological data (e.g., skin temperature and galvanic skin response), and psychological responses (e.g., comfort and self-reported productivity levels) were obtained from each participant and his/her workplace. The results show that: (1) participants have different physiological and psychological responses under the same environmental conditions; (2) physiological variables are more effective predictors of comfort and productivity levels than environmental variables. These results indicate that human-centric sensor networks can support human-centric building control and improve comfort and productivity in offices.
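
The claim that physiological variables outperform environmental ones as predictors can be checked with a simple model comparison. The sketch below simulates toy sensor readings and compares cross-validated R² for the two feature sets; the feature names, simulated values, and linear model are assumptions for illustration only, not the study’s actual analysis.

```python
# Minimal sketch (assumed features, simulated data): comparing how well
# environmental vs. physiological features predict self-reported comfort.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # stand-in for per-participant observations

# Simulated sensor readings (placeholders for real network data).
air_temp = rng.normal(24, 1.5, n)    # deg C, environmental sensor
humidity = rng.normal(45, 8, n)      # %RH, environmental sensor
skin_temp = rng.normal(33, 0.8, n)   # deg C, body area network sensor
gsr = rng.normal(5, 1.2, n)          # microsiemens, galvanic skin response

# Comfort votes driven more strongly by physiological state in this toy data.
comfort = 0.1 * air_temp - 0.6 * skin_temp + 0.4 * gsr + rng.normal(0, 0.5, n)

env_features = np.column_stack([air_temp, humidity])
phys_features = np.column_stack([skin_temp, gsr])

model = LinearRegression()
for name, X in [("environmental", env_features), ("physiological", phys_features)]:
    r2 = cross_val_score(model, X, comfort, cv=5, scoring="r2").mean()
    print(f"{name:>13} features: mean cross-validated R^2 = {r2:.2f}")
```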

Keywords: body area network, comfort and productivity, human-centric sensors, internet of things, participatory sensing

Procedia PDF Downloads 135
7391 Mobile Phones in Saudi Arabian EFL Classrooms

Authors: Srinivasa Rao Idapalapati, Manssour Habbash

Abstract:

As mobile connectedness continues to sweep across the landscape, the value of deploying mobile technology in the service of learning and teaching appears to be both self-evident and unavoidable. To this end, this study explores the reasons for the reluctance of teachers in Saudi Arabia to use mobile phones in EFL (English as a Foreign Language) classes for teaching and learning purposes. The main objective of the study is a qualitative analysis of the views of teachers at a university in Saudi Arabia about the use of mobile phones in classrooms for educational purposes. Driven by the hypothesis that teachers in Saudi Arabian universities are not well enough prepared to use mobile phones in classrooms for educational purposes, the study examines data obtained through a questionnaire administered to about one hundred teachers working at a university in Saudi Arabia, selected by a convenience sampling method. The responses were analyzed using a qualitative interpretive method, which found that teachers and students are uncertain about whether to use mobile phones and need training sessions on their use in classrooms for educational purposes. The outcome of the analysis is discussed in light of the concerns-based adoption model, and the inferences are presented in a descriptive mode.

Keywords: mobile assisted language learning, technology adoption, classroom instruction, concerns based adoption model

Procedia PDF Downloads 361
7390 A Comparative Study on the Development of Webquest and Online Treasure Hunt as Instructional Materials in Teaching Motion in One Dimension for Grade VII Students

Authors: Mark Anthony Burdeos, Kara Ella Catoto, Alraine Pauyon, Elesar Malicoban

Abstract:

This study sought to develop, validate, and implement WebQuest and Online Treasure Hunt instructional materials for teaching Motion in One Dimension to Grade 7 students and to determine their effects on the students’ conceptual learning, performance, and attitude towards Physics. The development stage involved several steps, such as planning and developing the WebQuest and Online Treasure Hunt and preparing the lesson plan and achievement test. The content and ICT (Information and Communications Technology) aspects of the developed instructional materials were evaluated by content and ICT experts using adapted evaluation forms. During implementation, a pretest and posttest were administered to determine students’ performance, along with pre-attitude and post-attitude tests to investigate students’ attitudes towards Physics before and after the WebQuest and Online Treasure Hunt activities. The developed WebQuest and Online Treasure Hunt passed the validation of the content and ICT experts. Students acquired more knowledge of Motion in One Dimension and gained a more positive attitude towards Physics after using the WebQuest and Online Treasure Hunt, as evidenced by significantly higher posttest scores compared to pretest scores and higher post-attitude than pre-attitude ratings. The developed WebQuest and Online Treasure Hunt proved to be of good quality and effective materials for teaching Motion in One Dimension and for developing a positive attitude towards Physics. Students, however, performed better on the pretest and posttest and rated higher on the pre-attitude and post-attitude tests with the WebQuest than with the Online Treasure Hunt. This study provides significant learning experiences that help students build their knowledge, understand concepts clearly, exercise higher-order thinking skills, and relate Physics topics to real-life situations, thereby enabling in-depth learning about Motion in One Dimension. It also helps teachers enhance their teaching strategies, as the two instructional materials provide interesting, engaging, and innovative teaching-learning experiences that increase learners’ motivation and participation in learning Physics. In addition, it offers a reference for using technology in the classroom and for determining which of the two instructional materials, WebQuest or Online Treasure Hunt, is more suitable for the teaching-learning process in Motion in One Dimension.
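
The pretest/posttest gains reported above would typically be assessed with a paired test over per-student scores. The following sketch shows one way such a comparison could be run; the scores are toy placeholders standing in for the study’s data, not the authors’ results.

```python
# Minimal sketch (toy scores): paired comparison of pretest and posttest
# achievement results for the same students.
from scipy import stats

# Achievement test scores before and after the WebQuest activity (toy values).
pretest = [12, 15, 10, 14, 11, 13, 9, 16, 12, 14]
posttest = [18, 20, 15, 19, 17, 18, 14, 22, 17, 19]

t_stat, p_value = stats.ttest_rel(posttest, pretest)
gains = [post - pre for pre, post in zip(pretest, posttest)]
print(f"mean gain = {sum(gains) / len(gains):.1f} points")
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would support the claim that posttest scores are
# significantly higher than pretest scores.
```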

Keywords: ICT integration, motion in one dimension, online treasure hunt, Webquest

Procedia PDF Downloads 174
7389 The Interplay of Factors Affecting Learning of Introductory Programming: A Comparative Study of an Australian and an Indian University

Authors: Ritu Sharma, Haifeng Shen

Abstract:

Teaching introductory programming is a challenging task in tertiary education, and various factors are believed to influence students’ learning of programming. However, these factors have largely been studied independently, each within a single chosen context. This paper investigates whether interrelationships exist among the factors and whether those interrelationships are context-dependent. In this empirical study, two universities were chosen from two continents, representing the different cultures, teaching methodologies, assessment criteria, and languages used to teach programming in the Western and Eastern worlds, respectively. The results reveal that some interrelationships are common across the two contexts, while others appear context-dependent.
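
One common way to probe such interrelationships is to compute pairwise correlations among candidate factors separately for each cohort and compare them across contexts. The sketch below illustrates this with simulated data; the factor names and values are assumptions for illustration, not the study’s instruments or findings.

```python
# Minimal sketch (assumed factors, simulated data): comparing factor
# interrelationships between two cohorts from different contexts.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

def simulated_cohort(n: int) -> pd.DataFrame:
    """Toy survey/assessment data for one university's cohort."""
    prior_experience = rng.normal(0, 1, n)
    math_background = 0.4 * prior_experience + rng.normal(0, 1, n)
    self_efficacy = 0.5 * prior_experience + rng.normal(0, 1, n)
    exam_score = 0.3 * math_background + 0.5 * self_efficacy + rng.normal(0, 1, n)
    return pd.DataFrame(
        {
            "prior_experience": prior_experience,
            "math_background": math_background,
            "self_efficacy": self_efficacy,
            "exam_score": exam_score,
        }
    )

cohort_a = simulated_cohort(120)  # stand-in for one university's cohort
cohort_b = simulated_cohort(140)  # stand-in for the other university's cohort

# Interrelationships common to both contexts would appear as correlations
# with the same sign and similar magnitude in both matrices.
print("Cohort A:\n", cohort_a.corr().round(2), "\n")
print("Cohort B:\n", cohort_b.corr().round(2))
```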

Keywords: introductory programming, tertiary education, factors, interrelationships, context, empirical study

Procedia PDF Downloads 358