Search results for: distance learning
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8790

1980 Perceptions of Senior Academics in Teacher Education Colleges Regarding the Integration of Digital Games during the Pandemic

Authors: Merav Hayak, Orit Avidov-Ungar

Abstract:

The current study adopted an interpretive-constructivist approach to examine how senior academics from a large sample of Israeli teacher education colleges serving general or religious populations perceived the integration of digital games into their teacher instruction and what their policy and vision were in this regard in the context of the COVID-19 pandemic. Half the participants expressed a desire to integrate digital games into their teaching and learning but acknowledged that this practice was uncommon. Only a small minority believed they had achieved successful integration, with doubt and skepticism expressed by some religious colleges. Most colleges had policies encouraging technology integration supported by ongoing funding. Although a considerable gap between policy and implementation remained, the COVID-19 pandemic was viewed as having accelerated the integration of digital games into pre-service teacher instruction. The findings suggest that discussions around technology-related vision and policy and their translation into practice should relate to the specific cultural needs and academic preparedness of the population(s) served by the college.

Keywords: COVID-19, digital games, pedagogy, teacher education colleges

Procedia PDF Downloads 92
1979 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life

Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar

Abstract:

In recent years, thanks to global technological advances, big data and artificial intelligence technologies have been widely used across industries and fields, playing an important role in reducing costs and increasing efficiency. In the course of its progress, artificial intelligence has produced a further branch: distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems; it is characterized by its ability to exploit large-scale computation and the spatial distribution of resources, and can accordingly handle problems with large data sets. Distributed AI is now widely used in military, medical, and everyday applications, bringing great convenience and efficient operation to daily life. In this paper, we discuss distributed AI computing systems in three areas, vision processing, blockchain, and the smart home, to introduce the performance of distributed systems and the role of AI within them.

Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home

Procedia PDF Downloads 105
1978 Investigations of Protein Aggregation Using Sequence and Structure Based Features

Authors: M. Michael Gromiha, A. Mary Thangakani, Sandeep Kumar, D. Velmurugan

Abstract:

The main cause of several neurodegenerative diseases, such as Alzheimer's disease, Parkinson's disease, and the spongiform encephalopathies, is the formation of amyloid fibrils and plaques in proteins. We have analyzed different sets of proteins and peptides to understand the influence of sequence-based features on the protein aggregation process. A comparison of 373 pairs of homologous mesophilic and thermophilic proteins showed that aggregation-prone regions (APRs) are present in both, but that thermophilic protein monomers show a greater ability to 'stow away' the APRs in their hydrophobic cores and protect them from solvent exposure. A comparison of amyloid-forming and amorphous β-aggregating hexapeptides suggested distinct preferences for specific residues at the six positions, as well as for all possible combinations of nine residue pairs. The compositions of residues at different positions, and of residue pairs, were converted into energy potentials and used to distinguish between amyloid-forming and amorphous β-aggregating peptides. Our method correctly identified the amyloid-forming peptides with an accuracy of 95-100% across different peptide datasets.

Keywords: aggregation, amyloids, thermophilic proteins, amino acid residues, machine learning techniques

Procedia PDF Downloads 609
1977 Multi-Modal Visualization of Working Instructions for Assembly Operations

Authors: Josef Wolfartsberger, Michael Heiml, Georg Schwarz, Sabrina Egger

Abstract:

Growing individualization and higher numbers of variants in industrial assembly products raise the complexity of manufacturing processes. Technical assistance systems considering both procedural and human factors allow for an increase in product quality and a decrease in required learning times by supporting workers with precise working instructions. Due to varying needs of workers, the presentation of working instructions leads to several challenges. This paper presents an approach for a multi-modal visualization application to support assembly work of complex parts. Our approach is integrated within an interconnected assistance system network and supports the presentation of cloud-streamed textual instructions, images, videos, 3D animations and audio files along with multi-modal user interaction, customizable UI, multi-platform support (e.g. tablet-PC, TV screen, smartphone or Augmented Reality devices), automated text translation and speech synthesis. The worker benefits from more accessible and up-to-date instructions presented in an easy-to-read way.

Keywords: assembly, assistive technologies, augmented reality, manufacturing, visualization

Procedia PDF Downloads 162
1976 The Relationships between Second Language Proficiency (L2) and Interpersonal Relationships of Students and Teachers: Pilot Study in Wenzhou-Kean University

Authors: Hu Yinyao

Abstract:

Learning and using a second language have become increasingly common in daily life. Understanding the complexity of second language proficiency can help students develop their interpersonal relationships with friends and professors, and even enhance intimacy. This paper examines the second language proficiency and interpersonal relationships of Wenzhou-Kean University students. The purpose of the research was to explore the relationship between second language proficiency, degree of intimacy, and interpersonal relationships among 100 Wenzhou-Kean University students. A mixed methodology was used, with student respondents from Wenzhou-Kean University chosen by random sampling. The data analysis used descriptive statistics presented in figures and thematic data presented in tables. The researcher found that Wenzhou-Kean University students show a lower-intermediate level of second language proficiency and a moderate degree of intimacy when using a second language. In particular, when discussing sensitive topics, students tend not to use a second language because of low proficiency. This research project has strong implications for interpersonal relationships and second language proficiency. The outcome of the study should help enhance interpersonal relationships and intimacy between students, and between students and professors, who use a second language.

Keywords: interpersonal relationship, second language proficiency, intimacy, education, university students

Procedia PDF Downloads 37
1975 Lecturer’s Perception of the Role of Information and Communication Technology in Office Technology and Management Programme in Polytechnics in Nigeria

Authors: Felicia Kikelomo Oluwalola

Abstract:

This study examined lecturers' perception of the roles of Information and Communication Technology (ICT) in the Office Technology and Management (OTM) programme in polytechnics in South-West Nigeria. A descriptive survey design was adopted. Purposive sampling was used to select all OTM lecturers in the nine polytechnics in South-West Nigeria. A 4-point rating scale questionnaire titled 'Lecturers' Perception of the Roles of ICT in OTM Programme in Polytechnics', with a reliability index of 0.93, was used. Two research questions were answered, and one null hypothesis was tested. Data collected were analysed using descriptive statistics, the independent t-test, and one-way Analysis of Variance (ANOVA) at the 0.05 level of significance. The study revealed that lecturers have an accurate perception of the roles of ICT in the OTM programme in polytechnics. It also revealed no significant difference between the mean perceptions of male and female lecturers in office technology and management. Based on the findings, the study recommended, among others, that professionals in the field of ICT be recruited for effective teaching and learning, and that the OTM curriculum be reviewed regularly to incorporate ICT packages that are accepted globally.

Keywords: communication, information, perception, technology

Procedia PDF Downloads 452
1974 Indigenous Storytelling: Transformation for Health, Emotions and Spirituality

Authors: Annabelle Nelson

Abstract:

This literature review documents indigenous storytelling as it functions to help humans face adversity and find emotional strength by aligning with nature. From a Jungian perspective, the archetypes in stories can transform the inner world. Joseph Campbell's hero-heroine cycle depicts the structure of stories as including a call to adventure, tests, helpers, and a return, after which the transformed person can help him or herself and even their community. By showcasing character traits such as bravery, perseverance, or humility, stories give humans maps for facing adversity. The main characters, or archetypes, in stories, as Carl Jung posited, provide a vehicle that can open consciousness if a listener identifies with the character. As documented in the review, this has many benefits. First, it can open consciousness to the collective unconscious for insight and intuitive clarity, and it can heal and release emotional trauma. The resulting spacious quality of consciousness allows the spiritual self to present insights to conscious awareness. Research in applied youth development programs demonstrates the utility of storytelling in prompting healthy choices and transforming difficult life experiences into success.

Keywords: archetypes, learning, storytelling, transformation

Procedia PDF Downloads 184
1973 Influence of Radio Frequency Identification Technology at Cost of Supply Chain as a Driver for the Generation of Competitive Advantage

Authors: Mona Baniahmadi, Saied Haghanifar

Abstract:

Radio Frequency Identification (RFID) is regarded as a promising technology for the optimization of supply chain processes, since it improves manufacturing and retail operations, from forecasting demand and planning to managing inventory and distribution. This study aims to describe RFID technology, explain how it can concretely be used for supply chain management, and show how it can help improve the supply chain in the case of Hejrat Company, an Iranian distributor of medical drugs and cosmetics. The study uses statistical analysis to estimate the expected benefits of an integrated RFID system for the supply chain, where competitive advantage increases as cost factors decrease. It investigates how the cost of the storage process, labor cost, the cost of missing goods, inventory management optimization, on-time delivery, order cost, lost sales, and supply process optimization affect the performance of the integrated RFID supply chain with respect to cost factors and contribute to competitive advantage.

Keywords: cost, competitive advantage, radio frequency identification, supply chain

Procedia PDF Downloads 269
1972 Challenges in Promoting Software Usability and Applying Principles of Usage-Centred Design in Saudi Arabia

Authors: Kholod J. Alotaibi, Andrew M. Gravell

Abstract:

A study was conducted in which 212 software developers in higher education institutions in Saudi Arabia were surveyed to gauge their understanding of the concept of usability, their acceptance of its importance, and how well its principles are applied. Interviews were then held with 20 of these developers, and a demonstration of Usage-Centred Design, a highly usability-focused software development methodology, was carried out at one selected institution during the requirements-gathering phase of the redesign of an e-learning exam system interface. The study confirms the need to raise awareness of usability and its importance, for Usage-Centred Design to be applied in its entirety, and for greater consultation with potential end-users of software and more collaborative practices. The demonstration of Usage-Centred Design confirmed its ability to capture usability requirements more completely and precisely than would otherwise be the case, and hence its usefulness for developers concerned with improving software usability. The concluding discussion examines the challenges of promoting usability and Usage-Centred Design in light of the results and findings, and makes recommendations accordingly.

Keywords: usability, usage-centred design, Saudi Arabia

Procedia PDF Downloads 384
1971 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Researchers have introduced numerous techniques for handling missing data, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary or optimization algorithms, and hot deck imputation. Among these, imputation techniques play a positive role in filling missing values when all records in the data must be used and records with missing values cannot be discarded. In this paper, we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering using competitive learning and a self-stabilizing mechanism in a dynamic environment, without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
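The cluster-then-impute idea can be sketched in a few lines. The following is a simplified stand-in for ART2, not the authors' implementation: a single-pass competitive scheme in which a Euclidean vigilance radius decides whether the winning cluster absorbs the input or a new cluster is created, after which missing entries are filled from the nearest centroid. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

def art_like_cluster(rows, vigilance=1.0):
    """Assign each complete row to a cluster; open a new cluster when no
    centroid lies within the vigilance radius (a crude stand-in for
    ART2's vigilance test), else move the winning centroid toward the input."""
    centroids = []
    for x in rows:
        if centroids:
            d = [np.linalg.norm(x - c) for c in centroids]
            j = int(np.argmin(d))
        if not centroids or d[j] > vigilance:
            centroids.append(x.copy())            # resonance fails: new cluster
        else:
            centroids[j] = 0.5 * (centroids[j] + x)  # winner learns the input
    return np.array(centroids)

def impute(data, vigilance=1.0):
    """Fill NaNs with the corresponding component of the nearest centroid,
    measuring distance on the observed attributes only."""
    complete = data[~np.isnan(data).any(axis=1)]
    centroids = art_like_cluster(complete, vigilance)
    out = data.copy()
    for i, row in enumerate(out):
        miss = np.isnan(row)
        if miss.any():
            d = np.linalg.norm(centroids[:, ~miss] - row[~miss], axis=1)
            out[i, miss] = centroids[int(np.argmin(d)), miss]
    return out
```

With two well-separated clusters, a row such as [nan, 0.05] is filled from the centroid of whichever cluster its observed attributes fall into.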

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 271
1970 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics

Authors: Farhad Asadi, Mohammad Javad Mollakazemi

Abstract:

In this paper, Bayesian online inference models for data series are constructed using a change-point algorithm, which separates the observed time series into independent segments and studies changes in the regime of the data through the related statistical characteristics. Variations in the statistical characteristics of time series data often represent distinct phenomena in a dynamical system, such as a change of brain state reflected in EEG measurements, or a change in an important regime of the data in many other dynamical systems. A prediction algorithm for locating change points in time series data is simulated. It is verified that the pattern of the proposed data distribution is an important factor in obtaining simpler and smoother fluctuation of the hazard rate parameter and in better identifying change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated against different time series databases for several dynamical systems.
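The abstract does not specify the inference machinery; as one concrete, minimal illustration of Bayesian online change-point detection (assuming an Adams-MacKay style run-length recursion with Gaussian observations of known variance and a constant hazard rate, which may differ from the authors' setup):

```python
import numpy as np

def bocpd(data, hazard=1/50, mu0=0.0, var0=10.0, varx=1.0):
    """Minimal Bayesian online change-point detection: track the posterior
    over the run length (time since the last change point) and return its
    most probable value at each step."""
    T = len(data)
    R = np.zeros((T + 1, T + 1))              # run-length posterior per step
    R[0, 0] = 1.0
    mu, var = np.array([mu0]), np.array([var0])
    map_run = []
    for t, x in enumerate(data):
        # predictive probability of x under each current run length
        pred_var = var + varx
        pred = np.exp(-0.5 * (x - mu) ** 2 / pred_var) / np.sqrt(2 * np.pi * pred_var)
        growth = R[t, : t + 1] * pred * (1 - hazard)   # run continues
        cp = (R[t, : t + 1] * pred * hazard).sum()     # change point occurs now
        R[t + 1, 0] = cp
        R[t + 1, 1 : t + 2] = growth
        R[t + 1] /= R[t + 1].sum()
        map_run.append(int(np.argmax(R[t + 1, : t + 2])))
        # Bayesian update of the Gaussian mean, one posterior per run length
        new_mu = (mu * varx + x * var) / (var + varx)
        new_var = var * varx / (var + varx)
        mu = np.concatenate(([mu0], new_mu))
        var = np.concatenate(([var0], new_var))
    return map_run
```

On a series whose mean jumps, the most probable run length grows steadily and then collapses to near zero at the change point.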

Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm

Procedia PDF Downloads 421
1969 The Scientific Study of the Relationship Between Physicochemical and Microstructural Properties of Ultrafiltered Cheese: Protein Modification and Membrane Separation

Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh

Abstract:

The loss of curd cohesiveness and syneresis are two common problems in the ultrafiltered cheese industry. In this study, using membrane technology and protein modification, a modified cheese was developed and its properties were compared with a control sample. To decrease the lactose content and adjust the protein, acidity, dry matter, and milk minerals, a combination of ultrafiltration, nanofiltration, and reverse osmosis was employed. For protein modification, a two-stage chemical and enzymatic reaction was carried out before and after ultrafiltration. The physicochemical and microstructural properties of the modified ultrafiltered cheese were compared with those of the control. Results showed that the modified protein significantly enhanced the functional properties of the final cheese (p < 0.05), even though its protein content was 50% lower than that of the control. The modified cheese showed 21 ± 0.70, 18 ± 1.10, and 25 ± 1.65% higher hardness, cohesiveness, and water-holding capacity values, respectively, than the control sample. This behavior could be explained by the more developed microstructure of the gel network. Furthermore, chemical-enzymatic modification of the milk protein induced a significant change in the network parameters of the final cheese: the indices of network linkage strength, network linkage density, and time scale of junctions were 10.34 ± 0.52, 68.50 ± 2.10, and 82.21 ± 3.85% higher than in the control sample, whereas the distance between adjacent linkages was 16.77 ± 1.10% lower. These results were supported by the textural analysis. A non-linear viscoelastic study showed a triangular waveform stress in the cheese containing the modified protein, while the control sample showed a rectangular waveform stress, suggesting better sliceability of the modified cheese. Moreover, to study the shelf life of the products, the acidity and the mold and yeast populations were determined over 120 days.
It is worth mentioning that the lactose content of the modified cheese was adjusted to 2.5% before fermentation, while that of the control was 4.5%. The control sample showed a shelf life of 8 weeks, while the modified cheese lasted 18 weeks in the refrigerator. Over 18 weeks, the acidity of the modified and control samples increased from 82 ± 1.50 to 94 ± 2.20 °D and from 88 ± 1.64 to 194 ± 5.10 °D, respectively. The mold and yeast populations over time followed the semicircular shape model (R² = 0.92, R²adj = 0.89, RMSE = 1.25). Furthermore, the mold and yeast counts, and their growth rate, were lower in the modified cheese than in the control; this result could be explained by the shortage of an energy source for the microorganisms in the modified cheese. The lactose content of the modified sample was below 0.2 ± 0.05% at the end of fermentation, compared with 3.7 ± 0.68% in the control sample.

Keywords: non-linear viscoelastic, protein modification, semicircular shape model, ultrafiltered cheese

Procedia PDF Downloads 70
1968 Using Technology to Deliver and Scale Early Childhood Development Services in Resource Constrained Environments: Case Studies from South Africa

Authors: Sonja Giese, Tess N. Peacock

Abstract:

South Africa-based Innovation Edge is experimenting with technology to drive positive behavior change, enable data-driven decision making, and scale quality early-years services. This paper uses five case studies to illustrate how technology can be used in resource-constrained environments: first, to encourage parenting practices that build early language development (ChildConnect, a stage-based mobile messaging pilot); second, to improve the quality of ECD programs (CareUp, a mobile application); third, to affordably scale services for the early detection of visual and hearing impairments (HearX, a mobile tool); fourth, to build a transparent and accountable system for the registration and funding of ECD (Amply, a blockchain-enabled platform); and finally, to enable rapid data collection and feedback for quality enhancement of programs at scale (the Early Learning Outcomes Measure). ChildConnect and CareUp were both developed using a design-based iterative research approach, and their usage and uptake were evaluated with qualitative and quantitative methods; actual child outcomes were not measured in the initial pilots. Although parents who used and engaged with either platform felt more supported and informed, parent engagement and usage remain a challenge. This is in contrast to ECD practitioners using CareUp, who showed both sustained engagement and knowledge improvement. HearX is an easy-to-use tool for identifying hearing loss and visual impairment. The tool was tested with 10,000 children in an informal settlement, and the feasibility of cost-effectively decentralising screening services was demonstrated, although practical and financial barriers remain with respect to parental consent and successful referrals. Amply uses mobile and blockchain technology to increase the impact and accountability of public services.
In the pilot project, Amply is being used to replace an existing paper-based system for registering children for a government-funded pre-school subsidy in South Africa. The Early Learning Outcomes Measure (ELOM) defines what it means for a child to be developmentally 'on track' at age 50-69 months. ELOM is administered via a tablet, which allows easy and accurate data collection, transfer, analysis, and feedback, and it is being used extensively to drive quality enhancement of ECD programs across multiple modalities. ECD services in South Africa are largely provided by disconnected private individuals or non-governmental organizations (in contrast to basic education, which is publicly provided by the government). It is a disparate sector, which makes scaling successful interventions that much harder. All five interventions show the potential of technology to support and enhance a range of ECD services, but pathways to scale are still being tested.

Keywords: assessment, behavior change, communication, data, disabilities, mobile, scale, technology, quality

Procedia PDF Downloads 129
1967 Using Speech Emotion Recognition as a Longitudinal Biomarker for Alzheimer’s Diseases

Authors: Yishu Gong, Liangliang Yang, Jianyu Zhang, Zhengyu Chen, Sihong He, Xusheng Zhang, Wei Zhang

Abstract:

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that affects millions of people worldwide and is characterized by cognitive decline and behavioral changes. People living with Alzheimer’s disease often find it hard to complete routine tasks. However, there are limited objective assessments that aim to quantify the difficulty of certain tasks for AD patients compared to non-AD people. In this study, we propose to use speech emotion recognition (SER), especially the frustration level, as a potential biomarker for quantifying the difficulty patients experience when describing a picture. We build an SER model using data from the IEMOCAP dataset and apply the model to the DementiaBank data to detect the AD/non-AD group difference and perform longitudinal analysis to track the AD disease progression. Our results show that the frustration level detected from the SER model can possibly be used as a cost-effective tool for objective tracking of AD progression in addition to the Mini-Mental State Examination (MMSE) score.
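The longitudinal-analysis step can be made concrete with a small sketch. Given per-visit frustration scores produced by an SER model (represented here simply as an array of numbers; the SER model itself is out of scope), a per-subject least-squares slope gives a crude progression measure. This is an illustration with invented names, not the authors' pipeline.

```python
import numpy as np

def progression_slope(visit_times, frustration):
    """Least-squares slope of a subject's frustration scores over visits;
    a persistently positive slope would suggest the task is becoming
    harder for the subject over time."""
    t = np.asarray(visit_times, float)
    y = np.asarray(frustration, float)
    t = t - t.mean()                       # center times for a stable fit
    return float((t * (y - y.mean())).sum() / (t * t).sum())
```

For example, scores rising by 0.1 per visit yield a slope of 0.1, regardless of the baseline level.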

Keywords: Alzheimer’s disease, speech emotion recognition, longitudinal biomarker, machine learning

Procedia PDF Downloads 109
1966 Improving System Performance through User's Resource Access Patterns

Authors: K. C. Wong

Abstract:

This paper demonstrates a number of examples in the hope of shedding some light on the possibility of designing future operating systems in a more adaptation-based manner. A modern operating system, we argue, should possess the capability of 'learning', in the sense that it can dynamically adjust its services and behavior according to the current status of the environment in which it operates. In other words, a modern operating system should play a more proactive role while providing system services to users. As such, it is expected to create a computing environment in which users receive system services that better match their dynamically changing needs. The examples demonstrated in this paper show that a user's resource access patterns, learned and determined during a session, can be used to improve system performance and hence to provide users with a better and more effective computing environment. The paper also discusses how to use the frequency, continuity, and duration of resource accesses in a session to quantitatively measure and determine a user's resource access patterns for the examples shown.
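The three measures named above (frequency, continuity, duration) could be computed from a session's access log roughly as follows. The event format, the `gap` threshold, and the definition of continuity as the longest run of closely spaced accesses are assumptions made for illustration, not the paper's definitions.

```python
from collections import defaultdict

def access_patterns(log, gap=5.0):
    """Summarize per-resource access patterns from (time, resource) events.
    frequency: number of accesses; duration: span from first to last access;
    continuity: longest run of accesses separated by less than `gap` seconds."""
    times = defaultdict(list)
    for t, res in sorted(log):
        times[res].append(t)
    stats = {}
    for res, ts in times.items():
        run = best = 1
        for a, b in zip(ts, ts[1:]):
            run = run + 1 if b - a < gap else 1   # reset the run at a long gap
            best = max(best, run)
        stats[res] = {"frequency": len(ts),
                      "duration": ts[-1] - ts[0],
                      "continuity": best}
    return stats
```

A scheduler or cache policy could then favor resources whose continuity and frequency are high within the current session.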

Keywords: adaptation-based systems, operating systems, resource access patterns, system performance

Procedia PDF Downloads 132
1965 1D Convolutional Networks to Compute Mel-Spectrogram, Chromagram, and Cochleogram for Audio Networks

Authors: Elias Nemer, Greg Vines

Abstract:

Time-frequency transformations and spectral representations of audio signals are commonly used in machine learning applications. Training networks on frequency features such as the Mel-Spectrogram or Cochleogram has proven more effective and convenient than training on time samples. In practical realizations, these features are created on a different processor and/or pre-computed and stored on disk, requiring additional effort and making it difficult to experiment with different features. In this paper, we provide a PyTorch framework for creating various spectral features, as well as time-frequency transformations and time-domain filter banks, using the built-in trainable conv1d() layer. This allows these features to be computed on the fly as part of a larger network and enables easier experimentation with various combinations and parameters. Our work extends earlier work in the literature developed to that end, first by adding more of these features, and second by allowing the kernels either to start from initialized values or to be trained from random values. The code is written as a template of classes and scripts that users may integrate into their own PyTorch classes, or simply use as is and extend with more layers for various applications.
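The idea of initializing conv1d() kernels from a transform basis can be illustrated in plain NumPy (PyTorch-free for brevity). The rows of the cosine and sine matrices below are exactly the kernels one would load as the weights of two fixed (or trainable) convolution layers, and applying them to one frame reproduces the rFFT magnitude. Function names here are illustrative, not from the paper's code.

```python
import numpy as np

def dft_kernels(n_fft):
    """Cosine/sine kernels equivalent to the real and imaginary parts of an
    rFFT; in a PyTorch port these rows would initialize conv1d weights."""
    k = np.arange(n_fft // 2 + 1)[:, None]     # frequency bins
    n = np.arange(n_fft)[None, :]              # time samples
    ang = 2 * np.pi * k * n / n_fft
    return np.cos(ang), -np.sin(ang)

def frame_spectrum(frame):
    """Magnitude spectrum of one frame via the kernel matrices: each row
    product is what a stride-n_fft conv1d step would compute."""
    cos_k, sin_k = dft_kernels(len(frame))
    re, im = cos_k @ frame, sin_k @ frame
    return np.sqrt(re ** 2 + im ** 2)
```

Starting from this initialization and leaving the kernels trainable is what lets the network refine its own time-frequency front end.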

Keywords: neural networks, Mel-Spectrogram, chromagram, cochleogram, discrete Fourier transform, PyTorch conv1d()

Procedia PDF Downloads 226
1964 Functional Neural Network for Decision Processing: A Racing Network of Programmable Neurons Where the Operating Model Is the Network Itself

Authors: Frederic Jumelle, Kelvin So, Didan Deng

Abstract:

In this paper, we introduce a model of artificial general intelligence (AGI), the functional neural network (FNN), for modeling human decision-making processes. The FNN is composed of multiple artificial mirror neurons (AMN) racing in the network. Each AMN has a similar structure, programmed independently by the users, and is composed of an intention wheel, a motor core, and a sensory core racing at a specific velocity. We discuss the mathematics of the node's formulation and the racing mechanism of multiple nodes in the network, develop the group decision process with fuzzy logic, and show how these conceptual methods can be turned into practical methods for simulation and operations. Finally, we describe some possible future research directions in the fields of finance, education, and medicine, including the opportunity to design an intelligent learning agent with applications in AGI. We believe that the FNN has promising potential to transform the way decision-making can be computed and to lead to a new generation of AI chips for seamless human-machine interactions (HMI).

Keywords: neural computing, human-machine interaction, artificial general intelligence, decision processing

Procedia PDF Downloads 120
1963 Comprehensive Lifespan Support for Quality of Life

Authors: Joann Douziech

Abstract:

Individuals with intellectual and developmental disabilities (IDD) possess characteristics that present both challenges and gifts. Individuals with IDD require, and are worthy of, intentional, strategic, and specialized support throughout their lifespan to ensure optimal quality-of-life outcomes. The current global advocacy movement advancing the rights of individuals with IDD emphasizes a high degree of choice over life decisions. For some individuals, this degree of choice results in a variety of negative health and well-being outcomes. Improving quality-of-life outcomes requires combining a commitment to the rights of the individual with a responsibility to provide support and choice commensurate with individual capacity. A belief that individuals with IDD are capable of learning and worthy of being taught provides the foundation for a holistic model of lifespan support. This model is based on the three pillars of engineering the environment, promoting skill development and maintenance, and staff support. In an ever-changing world, supporting quality of life requires attention to moments, phases, and changes in stages throughout the lifespan. Balancing these complexities with strategic, responsive, and dynamic interventions enhances the quality of life of individuals with IDD throughout their lifespan.

Keywords: achieving optimum quality of life, comprehensive support, lifespan approach, philosophy and pedagogy

Procedia PDF Downloads 62
1962 The Impact of Technology on Sales Researches and Distribution

Authors: Nady Farag Faragalla Hanna

Abstract:

In the car dealership industry in Japan, the sales specialist is a key factor in the success of the company. I hypothesize that when a company understands the characteristics of sales professionals in its industry, it can recruit and train salespeople more effectively. Lean human resources management underpins the economic success and performance of companies, especially small and medium-sized ones. The purpose of this article is to determine the characteristics of sales specialists at small and medium-sized car dealerships using the chi-square test and the proximate variable model. The results show that career-change experience, learning ability, and product knowledge are important, while university education, career building through internal transfer, leadership experience, and people development are not important for becoming a sales professional. I also show that the characteristic traits of sales specialists are perseverance, humility, improvisation, and passion for the business.

Keywords: electronics engineering, marketing, sales, e-commerce digitalization, interactive systems, sales process, ARIMA models, sales demand forecasting, time series, R code, traits of sales professionals, variable precision rough sets theory, sales professionals

Procedia PDF Downloads 45
1961 E-teaching Barriers: A Survey from Shanghai Primary School Teachers

Authors: Liu Dan

Abstract:

Until last year, it was considered either unnecessary or impossible to implement online teaching for primary school students. A large number of e-learning and e-teaching studies have focused on adult learners, andragogy, and technology; primary school education, however, faces many problems that remain to be solved. This research is therefore aimed at exploring the barriers to and influential factors in online teaching for K-12 students from teachers' perspectives, and at discussing an e-pedagogy suitable for primary school students and teachers. Eight hundred and ninety-six teachers from 10 primary schools in Shanghai were invited to complete a questionnaire survey. Data were analysed by hierarchical regression, and the results highlight three significant barriers to online teaching reported by teachers: the existing systems are deficient in emotional interaction, teachers' attitudes towards the technology are negative, and current teacher training lacks systematic e-pedagogy guidance. The barriers identified by this study will help software designers (E-lab) develop tools that allow flexible and evolving pedagogical approaches while providing an easy entry point for cautious newcomers, so that teachers are free to engage in e-teaching at the pedagogical and disciplinary levels and to enhance their repertoire of teaching practices.

Keywords: online teaching barriers (OTB), e-teaching, primary school, teachers, technology

Procedia PDF Downloads 197
1960 Simulation of Flow through Dam Foundation by FEM and ANN Methods Case Study: Shahid Abbaspour Dam

Authors: Mehrdad Shahrbanozadeh, Gholam Abbas Barani, Saeed Shojaee

Abstract:

In this study, a finite element model (Seep3D) and an artificial neural network (ANN) model were developed to simulate flow through a dam foundation. The Seep3D model is capable of simulating three-dimensional flow through heterogeneous, anisotropic, saturated and unsaturated porous media. Flow through the foundation of the Shahid Abbaspour dam was used as a case study. The finite element method (FEM), with 24,960 triangular elements and 28,707 nodes, was applied to model flow through the foundation, with the mesh refined in the neighborhood of the curtain screen. The ANN model developed for the Shahid Abbaspour dam is a four-layer feedforward network employing the sigmoid activation function and the back-propagation algorithm for network learning. The water-level elevations upstream and downstream of the dam were used as input variables and the piezometric heads as the target outputs in the ANN model. The two models were calibrated and verified using the dam's piezometric data. The results of both models were compared with the piezometer measurements and are in good agreement. The results also revealed that the ANN model performed as well as, and in some cases better than, the FEM.

Keywords: seepage, dam foundation, finite element method, neural network, seep 3D model

Procedia PDF Downloads 467
1959 Reusing Assessments Tests by Generating Arborescent Test Groups Using a Genetic Algorithm

Authors: Ovidiu Domşa, Nicolae Bold

Abstract:

Applying Information and Communication Technology (ICT) notions to the three basic processes of education (teaching, learning and assessment) can benefit pupils and the professional development of teachers. Here we take two such notions from the informatics area, genetic algorithms and arborescent (tree) structures, and apply them to the specific process of assessment or evaluation. This paper uses these notions to generate subtrees from a main tree of tests related to one another by their degree of difficulty, where a node is a test and an edge is a direct connection between two tests whose difficulties differ by one degree. The generated subtrees must contain the highest number of connections between nodes and the lowest number of missing edges; in the particular case where no subtree with zero missing edges exists, the subtrees with the minimal number of missing edges between nodes are sought. The subtrees are represented as sequences. The tests themselves are unchanged (a number coding a test represents that test in every sequence) and are reused across sequences of tests.
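A minimal sketch of the genetic-algorithm idea: chromosomes are sequences of coded tests, and fitness counts the "missing edges", i.e., consecutive tests whose difficulties do not differ by exactly one degree. The test bank, sequence length, and GA parameters below are invented for illustration.

```python
import random

random.seed(42)

# Each test is coded by a number; DIFFICULTY maps it to a degree (1..5).
DIFFICULTY = {t: 1 + t % 5 for t in range(20)}  # invented test bank
LENGTH, POP, GENS = 8, 40, 200

def missing_edges(seq):
    """Count consecutive pairs that do NOT differ by one difficulty degree."""
    return sum(1 for a, b in zip(seq, seq[1:])
               if abs(DIFFICULTY[a] - DIFFICULTY[b]) != 1)

def crossover(p1, p2):
    cut = random.randrange(1, LENGTH)
    return p1[:cut] + p2[cut:]

def mutate(seq):
    seq = list(seq)
    seq[random.randrange(LENGTH)] = random.randrange(len(DIFFICULTY))
    return seq

population = [[random.randrange(len(DIFFICULTY)) for _ in range(LENGTH)]
              for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=missing_edges)          # fewer missing edges is fitter
    parents = population[:POP // 2]             # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = min(population, key=missing_edges)
print("missing edges in best sequence:", missing_edges(best))
```

The paper's full method additionally maximises connections within the main tree; this sketch shows only the chromosome encoding and the missing-edge fitness.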

Keywords: chromosome, genetic algorithm, subtree, test

Procedia PDF Downloads 321
1958 Digitization and Economic Growth in Africa: The Role of Financial Sector Development

Authors: Abdul Ganiyu Iddrisu, Bei Chen

Abstract:

Digitization is the process of transforming analog material into digital form, especially for storage and use in a computer. Significant development of information and communication technology (ICT) over the past years has encouraged many researchers to investigate its contribution to promoting economic growth and reducing poverty. Yet compelling empirical evidence on the effects of digitization on economic growth remains weak, particularly in Africa, because the extant studies that explicitly evaluate the digitization-economic growth nexus are mostly reports and desk reviews. This points to an empirical knowledge gap in the literature. Hypothetically, digitization influences financial sector development, which in turn influences economic growth. Digitization has changed the financial sector and its operating environment: obstacles to access to financing, such as physical distance, minimum balance requirements, and low income flows, can be circumvented. Savings have increased, micro-savers have opened bank accounts, and banks are now able to price short-term loans. This has the potential to develop the financial sector. However, empirical evidence on the digitization-financial development nexus is scarce. On the other hand, a number of studies maintain that financial sector development greatly influences the growth of economies. We therefore argue that financial sector development is one of the transmission mechanisms through which digitization affects economic growth. Employing macro country-level data from African countries and using fixed effects, random effects and Hausman-Taylor estimation approaches, this paper contributes to the literature by analysing economic growth in Africa, focusing on the role of digitization and financial sector development. First, we assess how digitization influences financial sector development in Africa.
From an economic policy perspective, it is important to identify the digitization determinants of financial sector development so that action can be taken to reduce the economic shocks associated with financial sector distortions; this nexus is rarely examined empirically in the literature. Second, we examine the effect on economic growth of domestic credit to the private sector and stock market capitalization as a percentage of GDP, used to proxy financial sector development. Digitization is represented by the volume of digital/ICT equipment imported, and GDP growth is used to proxy economic growth. Finally, we examine the effect of digitization on economic growth in the light of financial sector development. The following key results were found. First, digitization propels financial sector development in Africa. Second, financial sector development enhances economic growth. Finally, contrary to our expectation, the results also indicate that digitization conditioned on financial sector development tends to reduce economic growth in Africa. However, the net-effect results suggest that digitization, overall, improves economic growth in Africa. We therefore conclude that digitization in Africa not only develops the financial sector but also unconditionally contributes to the growth of the continent's economies.
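The fixed-effects estimator named in the abstract can be sketched with the standard "within" transformation: demeaning each country's series sweeps out its unobserved country effect before running OLS. The panel below is synthetic; the coefficient, country count, and noise level are invented, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic panel: 10 "countries" observed over 8 "years".
countries, years = 10, 8
country_effect = rng.normal(0, 1, countries)       # unobserved heterogeneity
x = rng.normal(0, 1, (countries, years))           # e.g. a digitization proxy
beta_true = 0.6
y = (beta_true * x + country_effect[:, None]
     + rng.normal(0, 0.1, (countries, years)))     # outcome, e.g. growth

# Fixed-effects (within) estimator: demean x and y by country, which
# removes the country effects, then run OLS on the demeaned data.
x_w = x - x.mean(axis=1, keepdims=True)
y_w = y - y.mean(axis=1, keepdims=True)
beta_fe = float((x_w * y_w).sum() / (x_w ** 2).sum())
print(f"within estimate of beta: {beta_fe:.3f} (true 0.6)")
```

Random-effects and Hausman-Taylor estimation differ in how they treat the country effects, but the within transformation above is the core of the fixed-effects approach the paper lists first.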

Keywords: digitalization, financial sector development, Africa, economic growth

Procedia PDF Downloads 135
1957 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach

Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann

Abstract:

Generating automatic image descriptions in natural language is a challenging task. Image captioning describes an image coherently by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures: Convolutional Neural Networks (CNN) extract the characteristics of the images, and Recurrent Neural Networks (RNN) generate the descriptive sentences. However, cutting-edge approaches still suffer from generating incorrect captions and from accumulating errors in the decoders. To address this problem, we propose a model based on the encoder-decoder structure that introduces a module generating a weight for each word, according to its importance in forming the sentence, using part-of-speech (PoS) tags. The results demonstrate that our model surpasses state-of-the-art models.
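The weighting idea can be illustrated outside any neural network: score a caption's negative log-likelihood with each word's contribution scaled by a PoS-dependent weight. The weight table and caption below are invented for illustration and are not the paper's learned module.

```python
import math

# Invented PoS importance weights: content words matter more than function
# words when scoring a generated caption (a stand-in for a learned module).
POS_WEIGHT = {"NOUN": 1.0, "VERB": 0.9, "ADJ": 0.7, "DET": 0.2, "ADP": 0.2}

def weighted_caption_loss(tokens, probs):
    """PoS-weighted negative log-likelihood of a caption.

    tokens: list of (word, pos_tag) pairs; probs: model probability of
    each word. Returns the weight-normalised loss.
    """
    total, weight_sum = 0.0, 0.0
    for (word, pos), p in zip(tokens, probs):
        w = POS_WEIGHT.get(pos, 0.5)   # default weight for unlisted tags
        total += -w * math.log(p)
        weight_sum += w
    return total / weight_sum

caption = [("a", "DET"), ("dog", "NOUN"), ("runs", "VERB"), ("on", "ADP"),
           ("grass", "NOUN")]
probs = [0.9, 0.6, 0.5, 0.9, 0.4]
print(f"weighted NLL: {weighted_caption_loss(caption, probs):.3f}")
```

A mispredicted noun thus costs the model five times as much as a mispredicted determiner, which is the intuition behind weighting words by PoS.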

Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech

Procedia PDF Downloads 95
1956 Telemedicine Versus Face-to-Face Follow up in General Surgery: A Randomized Controlled Trial

Authors: Teagan Fink, Lynn Chong, Michael Hii, Brett Knowles

Abstract:

Background: Telemedicine is a rapidly advancing field providing healthcare to patients at a distance from their treating clinician. There is a paucity of high-quality evidence on the safety and acceptability of telemedicine for postoperative outpatient follow-up. This randomized controlled trial, conducted prior to the COVID-19 pandemic, aimed to assess the patient satisfaction and safety (as determined by readmission, reoperation and complication rates) of telephone compared to face-to-face clinic follow-up after uncomplicated general surgical procedures. Methods: Patients who had undergone uncomplicated laparoscopic appendicectomy or cholecystectomy, or laparoscopic or open umbilical or inguinal hernia repair, were randomized to telephone or face-to-face outpatient clinic follow-up. Data points including patient demographics, perioperative details and postoperative outcomes (e.g., wound healing complications, pain scores, unplanned readmission to hospital and return to daily activities) were compared between groups. Patients also completed a Likert patient satisfaction survey following their consultation. Results: 103 patients were recruited over a 12-month period (21 laparoscopic appendicectomies, 65 laparoscopic cholecystectomies, nine open umbilical hernia repairs, six laparoscopic inguinal hernia repairs and two laparoscopic umbilical hernia repairs). Baseline patient demographics and operative interventions were the same in both groups. Patient- or clinician-reported concerns regarding postoperative pain, use of analgesia, wound healing complications and return to daily activities at clinic follow-up did not differ significantly between the two groups. Of the 58 patients randomized to the telemedicine arm, 40% reported high and 60% reported very high patient satisfaction.
Mean telemedicine consultation times were significantly shorter than face-to-face consultation times (telemedicine 10.3 +/- 7.2 minutes, face-to-face 19.2 +/- 23.8 minutes, p-value = 0.014). Rates of failure to attend clinic were not significantly different (telemedicine 3%, control 6%). There was no increased rate of postoperative complications in patients followed up by telemedicine compared with in person, and there were no unplanned readmissions, returns to theatre, or mortalities in this study. Conclusion: Telemedicine follow-up of patients undergoing uncomplicated general surgery is safe and does not result in missed diagnoses or higher rates of complications. Telemedicine provides high patient satisfaction, and steps to implement this modality in outpatient care should be undertaken.

Keywords: general surgery, telemedicine, patient satisfaction, patient safety

Procedia PDF Downloads 115
1955 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach

Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta

Abstract:

Chronic diseases are widespread across the globe, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition even though early intervention could prevent such tragedies; however, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system for predicting the presence of heart disease using advanced techniques, empowering individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach involves meticulous data preprocessing and the development of predictive models using classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model with metrics such as accuracy, ensuring that we select the most reliable option, and we conduct a thorough data analysis to reveal the importance of the different attributes. Among the models considered, Random Forest emerges as the standout performer, with an accuracy of 96.04% in our study.
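The evaluation harness described (train several classifiers, score each on held-out data, pick the best by accuracy) can be sketched with scikit-learn. The synthetic dataset below merely stands in for a heart-disease table; the model set and split ratio are illustrative choices, and the resulting accuracies will not match the paper's 96.04%.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a heart-disease table: 500 patients, 10 features.
X, y = make_classification(n_samples=500, n_features=10, n_informative=6,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

models = {
    "SVM": SVC(),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
}
# Fit each model and record its held-out accuracy.
accuracy = {name: m.fit(X_train, y_train).score(X_test, y_test)
            for name, m in models.items()}
for name, acc in sorted(accuracy.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```

Feature-importance analysis, as in the study, would follow naturally from the fitted Random Forest's `feature_importances_` attribute.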

Keywords: support vector machines, decision tree, random forest

Procedia PDF Downloads 33
1954 Comparative Analysis of Predictive Models for Customer Churn Prediction in the Telecommunication Industry

Authors: Deepika Christopher, Garima Anand

Abstract:

To determine the best model for churn prediction in the telecom industry, this paper compares 11 machine learning algorithms, namely Logistic Regression, Support Vector Machine, Random Forest, Decision Tree, XGBoost, LightGBM, CatBoost, AdaBoost, Extra Trees, Deep Neural Network, and a Hybrid Model (MLPClassifier). It also aims to pinpoint the top three factors that lead to customer churn and conducts customer segmentation to identify vulnerable groups. According to the data, the Logistic Regression model performs best, with an F1 score of 0.6215, 81.76% accuracy, 68.95% precision, and 56.57% recall. The top three attributes that cause churn are found to be tenure, Internet Service Fiber optic, and Internet Service DSL; the three best-performing models are Logistic Regression, Deep Neural Network, and AdaBoost. The K-means algorithm is applied to establish and analyze four different customer clusters. This study effectively identifies customers at risk of churn and may be used to develop and execute strategies that lower customer attrition.
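The segmentation step, K-means with four clusters, can be sketched from scratch with Lloyd's algorithm. The two features (tenure, monthly charge) and the four behavioural profiles below are invented for illustration, not the paper's data.

```python
import random

random.seed(0)

def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm on 2-D points (tenure, monthly charge)."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared distance).
            idx = min(range(k), key=lambda i: (p[0] - centers[i][0]) ** 2
                                              + (p[1] - centers[i][1]) ** 2)
            clusters[idx].append(p)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Invented customers drawn around four behavioural profiles.
profiles = [(5, 80), (60, 80), (5, 25), (60, 25)]
customers = [(t + random.gauss(0, 3), c + random.gauss(0, 3))
             for t, c in profiles for _ in range(25)]
centers, clusters = kmeans(customers, k=4)
print("cluster sizes:", sorted(len(c) for c in clusters))
```

Inspecting the resulting centers (e.g., low tenure with high charges) is how such a segmentation flags groups vulnerable to churn.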

Keywords: attrition, retention, predictive modeling, customer segmentation, telecommunications

Procedia PDF Downloads 53
1953 KCBA, A Method for Feature Extraction of Colonoscopy Images

Authors: Vahid Bayrami Rad

Abstract:

In recent years, the use of artificial intelligence techniques, tools, and methods for processing medical images and health-related applications has attracted considerable research attention. Colonoscopy and the diagnosis of colon lesions are one case in which image processing and artificial intelligence algorithms can improve the diagnostic process and greatly assist doctors. Owing to the lack of accurate measurements and the variety of injuries visible in colonoscopy images, diagnosing the type of lesion is difficult even for expert doctors. Appropriate software and image processing can therefore help doctors increase the accuracy of their observations and ultimately improve their diagnoses, and automatic methods can further improve disease-type diagnosis. In this paper, a deep learning framework called KCBA is proposed to classify colonoscopy lesions; it is composed of several methods, including K-means clustering, a bag of features, and a deep auto-encoder. According to the experimental results, the proposed method's performance in classifying colonoscopy images is reported in terms of accuracy.
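The bag-of-features component named in the abstract can be sketched independently of the rest of the pipeline: each patch descriptor is assigned to its nearest codeword (codewords would come from the K-means step), and the image becomes a normalised histogram of codeword frequencies. The 2-D codebook and patch descriptors below are invented toy values.

```python
import math

# Invented codebook; in KCBA the codewords would be K-means cluster centers.
CODEBOOK = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]

def bag_of_features(descriptors, codebook=CODEBOOK):
    """Histogram of nearest-codeword assignments, L1-normalised."""
    histogram = [0] * len(codebook)
    for d in descriptors:
        nearest = min(range(len(codebook)),
                      key=lambda i: math.dist(d, codebook[i]))
        histogram[nearest] += 1
    total = sum(histogram)
    return [count / total for count in histogram]

# Invented patch descriptors from one colonoscopy frame.
patches = [(0.1, 0.1), (0.9, 0.1), (0.95, 0.05), (0.2, 0.9), (0.8, 0.9)]
print(bag_of_features(patches))
```

The fixed-length histogram is what a downstream classifier (or, in KCBA, the auto-encoder stage) consumes, regardless of how many patches the original image had.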

Keywords: colorectal cancer, colonoscopy, region of interest, narrow band imaging, texture analysis, bag of feature

Procedia PDF Downloads 49
1952 Investigations on Pyrolysis Model for Radiatively Dominant Diesel Pool Fire Using Fire Dynamic Simulator

Authors: Siva K. Bathina, Sudheer Siddapureddy

Abstract:

Pool fires are formed when a flammable liquid accidentally spills on the ground or water and ignites. A pool fire is a kind of buoyancy-driven diffusion flame. Many pool fire accidents have occurred during the processing, handling and storing of liquid fuels in the chemical and oil industries, causing enormous damage to property as well as loss of lives. Pool fires are complex in nature due to the strong interaction among combustion, heat and mass transfer, and pyrolysis at the fuel surface. Moreover, experimental study of such large, complex fires involves fire safety issues and practical difficulties. In the present work, large eddy simulations of such fire scenarios are performed using a fire dynamics simulator. A 1 m diesel pool fire is considered for the studied cases; diesel is chosen as it is the fuel most commonly involved in fire accidents. Fire simulations are performed with two different boundary conditions: in one, the fuel is in the liquid state and a pyrolysis model is invoked; in the other, the fuel is assumed to be initially in the vapor state and the mass loss rate is prescribed. A domain of size 11.2 m × 11.2 m × 7.28 m with a uniform structured grid is chosen for the numerical simulations. A grid sensitivity analysis is performed, and a non-dimensional grid size of 12, corresponding to an 8 cm grid size, is adopted. Flame properties such as the mass burning rate, irradiance, and time-averaged axial flame temperature profile are predicted. The predicted steady-state mass burning rate is 40 g/s, within the uncertainty limits of the previously reported experimental data (39.4 g/s). The profile of irradiance with height at a distance from the fire is somewhat in line with the experimental data, though the location of the maximum irradiance is shifted to a higher position.
This may be due to the lack of sophisticated models for species transport, combustion and radiation in the continuous zone. Furthermore, the axial temperatures are not predicted well in any of the zones for either boundary condition. The present study shows that the existing models are not sufficient for modeling blended fuels like diesel, and that the predictions depend strongly on the experimental values of the soot yield. Future experiments are necessary for generalizing the soot yield for different fires.
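The non-dimensional grid size quoted in the abstract is the standard FDS resolution metric D*/dx, where D* is the characteristic fire diameter computed from the heat release rate and ambient conditions. The calculation below uses the study's 40 g/s burning rate together with assumed values for diesel's heat of combustion and ambient air, so the resulting ratio illustrates the method rather than reproducing the paper's exact figure of 12.

```python
# Non-dimensional grid resolution D*/dx used in FDS grid-sensitivity studies.
# Characteristic fire diameter:
#   D* = (Q / (rho_inf * cp * T_inf * sqrt(g))) ** (2/5)
# All numerical values below except the burning rate are assumptions.
import math

mass_burning_rate = 0.040       # kg/s, steady-state value reported in the study
heat_of_combustion = 44_400.0   # kJ/kg, a typical value for diesel (assumed)
Q = mass_burning_rate * heat_of_combustion  # total heat release rate, kW

rho_inf, cp, T_inf, g = 1.204, 1.005, 293.0, 9.81  # assumed ambient air values
D_star = (Q / (rho_inf * cp * T_inf * math.sqrt(g))) ** (2 / 5)

dx = 0.08  # grid size used in the study, m
print(f"Q = {Q:.0f} kW, D* = {D_star:.2f} m, D*/dx = {D_star / dx:.1f}")
```

Ratios of roughly 10-20 are commonly cited as adequately resolving the plume, which is consistent with the study's choice of 12.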

Keywords: burning rate, fire accidents, fire dynamic simulator, pyrolysis

Procedia PDF Downloads 192
1951 Chinese Language Teaching as a Second Language: Immersion Teaching

Authors: Lee Bih Ni, Kiu Su Na

Abstract:

This paper discusses the teaching of Chinese as a second language, focusing on immersion teaching. The researchers used a narrative literature review to describe the current state of both art and science in the focused areas of inquiry and to build a scientific knowledge base, collecting the important points of discussion and presenting them with reference to the specific field on which this paper is based. Immersion teaching comes with a standard that teachers must reliably meet. Chinese language-immersion instruction consists of language and content lessons, including functional usage of the language, academic language, authentic language, and correct Chinese sociocultural language. The findings show that a Chinese-language immersion classroom is not like a standard foreign-language classroom: the immersion setting provides more opportunities to teach students colloquial language than academic language. Immersion techniques also introduce a language's cultural and social contexts in a meaningful and memorable way, and it is particularly important that immersion teachers connect classwork with real-life experiences. Immersion also includes more elements of discovery and inquiry-based learning than other kinds of instructional practice, with students constantly and consistently interpreting conclusions and context clues for themselves.

Keywords: a second language, Chinese language teaching, immersion teaching, instructional strategies

Procedia PDF Downloads 446