Search results for: learning assessment
7529 An In-Depth Inquiry into the Impact of Poor Teacher-Student Relationships on Chronic Absenteeism in Secondary Schools of West Java Province, Indonesia
Authors: Yenni Anggrayni
Abstract:
Awareness of the significant prevalence of school absenteeism in Indonesia, which ultimately results in high rates of school dropout, remains low, and the issue is unresolved. Therefore, this study investigates the root causes of chronic absenteeism (absence for any reason) in secondary schools, both qualitatively and quantitatively, using the bioecological systems paradigm. An open-ended questionnaire was used to collect data from 1,148 students in six districts/cities of West Java Province. Univariate and stepwise multiple logistic regression analyses produced a prediction model for the components. The analysis shows that poor teacher-student relationships, bullying by peers or teachers, negative perceptions of education, and lack of parental involvement in learning activities are the leading causes of chronic absenteeism. The study also recommends promoting home-school partnerships to improve school climate and parental involvement in learning as a means of addressing chronic absenteeism.
Keywords: bullying, chronic absenteeism, dropout of school, home-school partnerships, parental involvement
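As a hedged illustration of the kind of logistic-regression analysis the abstract describes, the sketch below fits a binary logistic regression on simulated data and reports odds ratios per predictor. The predictor names, sample size, and effect sizes are assumptions for illustration only, not the study's actual data or model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Hypothetical binary predictors (1 = factor present for a given student)
poor_relationship = rng.integers(0, 2, n)
bullying = rng.integers(0, 2, n)
parental_involvement = rng.integers(0, 2, n)

# Simulate chronic absenteeism with higher odds when risk factors are present
logit = -1.5 + 1.2 * poor_relationship + 0.9 * bullying - 0.8 * parental_involvement
absent = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([poor_relationship, bullying, parental_involvement])
model = LogisticRegression().fit(X, absent)

# Odds ratio for each predictor is exp(coefficient)
odds_ratios = np.exp(model.coef_[0])
for name, or_ in zip(["poor relationship", "bullying", "parental involvement"],
                     odds_ratios):
    print(f"{name}: OR = {or_:.2f}")
```

An odds ratio above 1 marks a factor associated with higher odds of absenteeism, below 1 a protective factor, mirroring how such models are typically reported.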
Procedia PDF Downloads 73
7528 Analyzing Changes in Runoff Patterns Due to Urbanization Using SWAT Models
Authors: Asawari Ajay Avhad
Abstract:
The Soil and Water Assessment Tool (SWAT) is a hydrological model designed to predict the complex interactions within natural and human-altered watersheds. This research applies the SWAT model to the Ulhas River basin, a small watershed undergoing urbanization and characterized by bowl-like topography. Three simulation scenarios (LC17, LC22, and LC27) are investigated, each representing a different land use and land cover (LULC) configuration, to assess the impact of urbanization on runoff. The LULC for the year 2027 is generated using the MOLUSCE plugin of QGIS, incorporating spatial factors such as the DEM, distance from roads, distance from the river, slope, and distance from settlements. Future climate data are simulated within the SWAT model using 30 years of historical data. A runoff susceptibility map for the basin is created, classifying runoff into five susceptibility levels ranging from very low to very high. Sub-basins corresponding to major urban settlements are identified as highly susceptible to runoff, and under future climate projections, a slight increase in runoff is forecast. The methodology was validated against sub-basins with a track record of severe flood events, which the susceptibility map correctly identified as highly susceptible to runoff, reinforcing the credibility of the assessment. This study suggests that the methodology employed could serve as a valuable tool in flood management planning.
Keywords: future land use impact, flood management, runoff prediction, ArcSWAT
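The abstract does not state how the five susceptibility levels were derived; the sketch below assumes a simple equal-interval classification of simulated sub-basin runoff, purely as an illustration of binning model output into the "very low" to "very high" classes described.

```python
import numpy as np

# Hypothetical simulated surface runoff (mm) for 20 sub-basins
runoff = np.array([12.0, 45.5, 78.1, 23.4, 91.0, 55.2, 33.3, 67.8, 15.6, 88.4,
                   41.0, 72.5, 29.9, 60.1, 19.2, 83.7, 50.0, 38.6, 95.3, 26.7])

labels = ["very low", "low", "moderate", "high", "very high"]

# Equal-interval class edges between the minimum and maximum runoff
edges = np.linspace(runoff.min(), runoff.max(), num=6)
# digitize against the 4 interior edges gives class indices 0..4
classes = np.digitize(runoff, edges[1:-1])

for basin_id, (r, c) in enumerate(zip(runoff, classes), start=1):
    print(f"sub-basin {basin_id:2d}: {r:5.1f} mm -> {labels[c]}")
```

Quantile-based or natural-breaks classification would be drop-in alternatives; the choice changes which sub-basins fall in the "very high" class.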
Procedia PDF Downloads 55
7527 Arabic Light Word Analyser: Roles with Deep Learning Approach
Authors: Mohammed Abu Shquier
Abstract:
This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of such morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also addresses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which together justify updating this system. Most Arabic word analysis systems are based on phonotactic morpho-syntactic analysis of a word using lexical rules, as mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. It is therefore necessary to have an automatic analysis tool that takes into account the word sense and not only the morpho-syntactic category. Moreover, such systems are also based on statistical/stochastic models. These stochastic models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc.
As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage has been very low for dialectal Arabic, it is important to investigate how dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that modeling dialectal variability can help improve analysis.
Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN
Procedia PDF Downloads 47
7526 The Overlooked Problem Among Surgical Patients: Preoperative Anxiety at Ethiopian University Hospital
Authors: Yohtahe Woldegerima Berhe, Tadesse Belayneh Melkie, Girmay Fitiwi Lema, Marye Getnet, Wubie Birlie Chekol
Abstract:
Introduction: Anxiety has repeatedly been reported as the worst aspect of the perioperative period. The objective of this study was to assess the prevalence of preoperative anxiety among adult surgical patients at the University of Gondar Comprehensive Specialized Hospital (UoGCSH), Northwest Ethiopia. Methodology: A hospital-based cross-sectional study was conducted among surgical patients at the university hospital. After obtaining ethical approval, 407 surgical patients were approached during the preoperative period. Preoperative anxiety was assessed with the State-Trait Anxiety Inventory (STAI). The association between variables was determined using binary logistic regression analysis. The strength of association was described with adjusted odds ratios (AOR), and a p-value < 0.05 at a 95% confidence interval was considered statistically significant. Results: A total of 400 patients were included in this study, a 98.3% response rate. Preoperative anxiety was observed in 237 (59.3%) patients, and the median (IQR) STAI score was 50 (40 – 56.7). Age ≥ 60 years (AOR: 5.7, CI: 1.6 – 20.4, P: 0.007), emergency surgery (AOR: 2.5, CI: 1.3 – 4.7, P: 0.005), preoperative pain (AOR: 2.6, CI: 1.2 – 5.4, P: 0.005), and rural residency (AOR: 1.8, CI: 1.1 – 2.9, P: 0.031) were found to be significantly associated with preoperative anxiety. Conclusions: The prevalence of preoperative anxiety among surgical patients was high. Older age (≥ 60 years), emergency surgery, preoperative pain, and rural residency were significantly associated with preoperative anxiety. Assessment for preoperative anxiety should be a routine component of the preoperative assessment of both elective and emergency surgical patients. Preoperative pain should be appropriately managed, as this can help reduce preoperative anxiety.
Optimal anxiety reduction methods should be investigated and implemented in the hospital.
Keywords: preoperative anxiety, anxiety, anxiety of anesthesia and surgery, state-trait anxiety inventory, preoperative care
Procedia PDF Downloads 25
7525 Arabic as a Foreign Language in the Curriculum of Higher Education in Nigeria: Problems, Solutions, and Prospects
Authors: Kazeem Oluwatoyin Ajape
Abstract:
The study is concerned with how to improve the teaching of Arabic as a foreign language in the Nigerian higher education system. The paper traces the historical background of Arabic education in Nigeria and outlines the problems facing the language in Nigerian institutions. It lays down some of the essential foundation work necessary for bringing about systematic and constructive improvements in the Teaching of Arabic as a Foreign Language (TAFL) by answering the following research questions: What is the appropriate medium of instruction in teaching a foreign or second language? What is the position of the English language in the teaching and learning of Arabic/Islamic education? What is the relevance of the present curriculum of Arabic/Islamic education in Nigerian institutions to contemporary society? A survey of the literature indicates that a revolution is currently taking place in FL teaching and that a new approach, known as the Communicative Approach (CA), has begun to emerge and influence the teaching of FLs in general over the last decade or so. Since the CA is currently being adapted to the teaching of most major FLs and since this revolution has not yet had much impact on TAFL, the study explores the possibility of applying the CA to the teaching of Arabic as a living language and makes recommendations towards the development of the language in Nigerian institutions of higher learning.
Keywords: Arabic Language, foreign language, Nigerian institutions, curriculum, communicative approach
Procedia PDF Downloads 616
7524 Assessment of Environmental Risk Factors of Railway Using Integrated ANP-DEMATEL Approach in Fuzzy Conditions
Authors: Mehrdad Abkenari, Mehmet Kunt, Mahdi Nourollahi
Abstract:
Evaluating environmental risk factors is part of analyzing transportation effects. Various definitions of risk can be found in different scientific sources, each depending on a particular perspective or dimension, yet all share a uniform concept: two aspects, loss and unreliability, are pointed out in every definition of the term, while a third aspect, selection (how one perceives the risk), is usually only implied. The effects of potential risks along the newly proposed routes and existing infrastructure of large transportation projects such as railways should be studied within comprehensive engineering frameworks. Conducting engineering studies on the environmental effects of railway projects has become obligatory under the Environmental Assessment Act in developing countries. Considering the longitudinal nature of these projects and the probable passage of railways through various ecosystems, scientific research on the environmental risk of such projects has become of great interest. Although many areas of expertise, such as road construction in developing countries, have not yet seriously committed to these studies, attention to these subjects has become an inseparable part of this wave of research. The present study takes the environmental risks identified in previous studies as its starting point. The second step proposes a new hybrid approach combining the analytic network process (ANP) and DEMATEL under fuzzy conditions for assessing the identified risks. Since evaluating the identified risks is not straightforward, a network (mesh) structure is an appropriate approach for analyzing such complex systems and was accordingly employed for problem description and modeling.
Researchers faced a shortage of real-world data, and because of the ambiguity of experts' opinions and judgments, these were expressed as linguistic variables instead of numerical ones. Since fuzzy logic is suited to ambiguity and uncertainty, formulating the experts' opinions as fuzzy numbers was an appropriate approach. The fuzzy DEMATEL method was used to extract the relations between major and minor risk factors. Considering the internal relations of the major risk factors and their sub-factors in the fuzzy network analysis, the weights of the main risk factors and sub-factors were determined. In general, the findings of the present study, in which effective railway environmental risk indicators were identified and rated through the first use of a combined DEMATEL and fuzzy network analysis model, indicate that environmental risks can be evaluated more accurately and employed in railway projects.
Keywords: DEMATEL, ANP, fuzzy, risk
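As a rough sketch of the DEMATEL step described above, shown here in crisp rather than fuzzy form (the fuzzy variant would collect triangular fuzzy scores and defuzzify them before this point), the expert direct-influence matrix is normalized and the total-relation matrix T = X(I - X)^-1 is computed; the influence scores below are hypothetical.

```python
import numpy as np

# Hypothetical direct-influence matrix among four risk factors
# (entry [i, j] = how strongly factor i influences factor j, on a 0-4 scale)
A = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Normalize by the largest row sum so the series sum X + X^2 + ... converges
X = A / A.sum(axis=1).max()

# Total-relation matrix: T = X (I - X)^-1
T = X @ np.linalg.inv(np.eye(4) - X)

# Prominence (D + R) and relation (D - R) for each factor
D = T.sum(axis=1)  # total influence a factor exerts
R = T.sum(axis=0)  # total influence a factor receives
for i in range(4):
    print(f"factor {i}: prominence = {D[i] + R[i]:.2f}, relation = {D[i] - R[i]:.2f}")
```

A positive relation value marks a factor as a net cause, a negative one as a net effect, which is how DEMATEL separates driving risks from driven ones.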
Procedia PDF Downloads 418
7523 Linguistic Accessibility and Audiovisual Translation: Corpus Linguistics as a Tool for Analysis
Authors: Juan-Pedro Rica-Peromingo
Abstract:
The important changes taking place in the media and the audiovisual world in Europe need to benefit all populations, in particular those with special needs, such as the deaf and hard-of-hearing population (served by subtitling for the deaf and hard of hearing, SDH) and the blind and partially-sighted population (served by audio description, AD). This recent interest in the field of audiovisual translation (AVT) can be observed in the teaching and learning of the different modes of AVT in degree and postgraduate courses at Spanish universities, which expand the interest in and practice of AVT linguistic accessibility. We present a research project conducted at the UCM which compiles AVT activities for teaching purposes and analyzes the creation and reception of SDH and AD: the AVLA Project (Audiovisual Learning Archive), which includes audiovisual materials produced by university students in different AVT modes, together with evaluations from blind and deaf informants. In this study, we present the materials created by the students. A group of deaf and blind participants has been in charge of testing the students' SDH and AD corpus of audiovisual materials through questionnaires used to evaluate the students' production. These questionnaires cover the reception of the subtitles and the audio descriptions from linguistic and technical points of view. With all the materials compiled in the research project, a corpus containing both the students' production and the recipients' evaluations is being built: the CALING (Corpus de Accesibilidad Lingüística) corpus. Preliminary results will be presented with respect to the aspects, difficulties, and deficiencies of the SDH and AD included in the corpus, specifically regarding the length of subtitles, the position of contextual information on the screen, and the text included in the audio descriptions and the tone of voice used. These results may suggest changes and improvements in the quality of the SDH and AD analyzed.
Finally, suggestions will be made regarding the demand for teaching and learning of AVT and linguistic accessibility at the university level and important changes to the norms that regulate SDH and AD nationally and internationally.
Keywords: audiovisual translation, corpus linguistics, linguistic accessibility, teaching
Procedia PDF Downloads 87
7522 Risk Assessment of Lead Element in Red Peppers Collected from Marketplaces in Antalya, Southern Turkey
Authors: Serpil Kilic, Ihsan Burak Cam, Murat Kilic, Timur Tongur
Abstract:
Interest in lead (Pb) has increased considerably in recent years due to knowledge about the potential toxic effects of this element. Exposure to heavy metals above acceptable limits affects human health. Indeed, Pb accumulates through food chains up to toxic concentrations and can therefore pose a potential threat to human health. In the present study, a sensitive and reliable method for the determination of Pb in red pepper was developed. Samples (33 red pepper products of different brands) were purchased from different markets in Turkey. The selected method validation criteria (linearity, limit of detection, limit of quantification, recovery, and trueness) were demonstrated. Recovery values close to 100% showed adequate precision and accuracy for the analysis. According to the results of the red pepper analysis, lead was determined at various concentrations in all of the tested samples. A Perkin-Elmer ELAN DRC-e model ICP-MS system was used for the detection of Pb. Organic red pepper was used as the matrix for all method validation studies. The certified reference material, FAPAS chili powder, was digested and analyzed together with the different sample batches. Three replicates from each sample were digested and analyzed. The exposure levels were discussed considering the scientific opinions of the European Food Safety Authority (EFSA), the European Union's (EU) risk assessment source for food safety. The Target Hazard Quotient (THQ), described by the United States Environmental Protection Agency (USEPA), was used for the calculation of potential health risks associated with long-term exposure to chemical pollutants. The THQ calculation incorporates the intake of elements, exposure frequency and duration, body weight, and the oral reference dose (RfD).
A THQ value lower than one means that the exposed population is assumed to be safe, while 1 < THQ < 5 means that the exposed population is at a level of concern. In this study, the THQ of Pb was obtained as < 1. The THQ calculations showed that the values were below one for all samples tested, meaning the samples did not pose a health risk to the local population. This work was supported by The Scientific Research Projects Coordination Unit of Akdeniz University, Project Number FBA-2017-2494.
Keywords: lead analyses, red pepper, risk assessment, daily exposure
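A minimal sketch of a USEPA-style THQ calculation is shown below; the input values (Pb concentration, intake rate, exposure assumptions, and the oral RfD) are illustrative placeholders, not the study's measurements.

```python
def target_hazard_quotient(c_mg_per_kg, intake_g_per_day, ef_days_per_year,
                           ed_years, rfd_mg_per_kg_day, body_weight_kg):
    """USEPA-style THQ for long-term oral exposure to a contaminant.

    The 1e-3 factor converts the ingestion rate from g/day to kg/day.
    Averaging time is taken as ED * 365 days (non-carcinogenic exposure).
    """
    averaging_time_days = ed_years * 365
    numerator = ef_days_per_year * ed_years * intake_g_per_day * c_mg_per_kg
    denominator = rfd_mg_per_kg_day * body_weight_kg * averaging_time_days
    return (numerator / denominator) * 1e-3

# Illustrative (not the study's) inputs: Pb at 0.2 mg/kg in red pepper,
# 5 g/day intake, daily exposure over 30 years, 70 kg adult, and an
# assumed provisional oral RfD for Pb of 0.0035 mg/kg/day
thq = target_hazard_quotient(0.2, 5, 365, 30, 0.0035, 70)
print(f"THQ = {thq:.4f}")
```

With these placeholder inputs, the quotient stays well below 1, matching the abstract's "assumed to be safe" interpretation of THQ < 1.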
Procedia PDF Downloads 172
7521 Case Study of Mechanised Shea Butter Production in South-Western Nigeria Using the LCA Approach from Gate-to-Gate
Authors: Temitayo Abayomi Ewemoje, Oluwamayowa Oluwafemi Oluwaniyi
Abstract:
Agriculture and the food processing industry are among the largest industrial sectors using large amounts of energy; thus, large amounts of gases from their fuel combustion technologies are released into the environment. The choice of input energy supply not only directly affects the environment but also poses a threat to human health. The study was therefore designed to assess each unit production process in order to identify hotspots, using the life cycle assessment (LCA) approach, in South-western Nigeria. Data such as machine power ratings, operation durations, and the inputs and outputs of shea butter materials for unit processes obtained on site were used to model the Life Cycle Impact Analysis in the GaBi6 (Holistic Balancing) software. Four scenarios were drawn up for the impact assessments: material sourcing from Kaiama (Scenarios 1 and 3) or Minna (Scenarios 2 and 4), with different heat supply sources (liquefied petroleum gas, LPG, in Scenarios 1 and 2; a 10.8 kW diesel heater in Scenarios 3 and 4). Shea butter production was modelled in GaBi6 for a functional unit of 1 kg of shea butter produced, and the Tool for the Reduction and Assessment of Chemical and other Environmental Impacts (TRACI) midpoint assessment was used to analyse the life cycle inventories of the four scenarios. Eight impact categories were observed in all four scenarios, of which three, Global Warming Potential (GWP) (0.613, 0.751, 0.661, 0.799) kg CO2-Equiv., Acidification Potential (AP) (0.112, 0.132, 0.129, 0.149) kg H+ moles-Equiv., and Smog (0.044, 0.059, 0.049, 0.063) kg O3-Equiv., had the greatest impacts on the environment in Scenarios 1-4, respectively. Transportation activities were also seen to contribute strongly to these impact categories due to the large volume of petrol combusted, leading to releases of gases such as CO2, CH4, N2O, SO2, and NOx into the environment during the transportation of the purchased raw shea kernels.
The ratio of the transportation distance to the production site from Minna versus Kaiama was approximately 3.5. The unit processes with the greatest impacts in all categories, in ascending order of magnitude, were packaging, milling, and churning; these were identified as hotspots that may require attention. From the 1 kg shea butter functional unit, it was inferred that locating the production site at the shortest travelling distance to raw material sourcing and using LPG combustion for heating would reduce all the assessed impact categories.
Keywords: GaBi6, life cycle assessment, shea butter production, TRACI
Procedia PDF Downloads 332
7520 The Library as a Metaphor: Perceptions, Evolution, and the Shifting Role in Society Through a Librarian's Lens
Authors: Nihar Kanta Patra, Akhtar Hussain
Abstract:
This comprehensive study, through the perspective of librarians, explores the library as a metaphor and its profound significance in representing knowledge and learning. It delves into how librarians perceive the library as a metaphor and the ways in which it symbolizes the acquisition, preservation, and dissemination of knowledge. The research investigates the most common metaphors used to describe libraries, as witnessed by librarians, and analyzes how these metaphors reflect the evolving role of libraries in society. Furthermore, the study examines how the library metaphor influences the perception of librarians regarding academic libraries as physical places and academic library websites as virtual spaces, exploring their potential for learning and exploration. It investigates the evolving nature of the library as a metaphor over time, as seen by librarians, considering the changing landscape of information and technology. The research explores the ways in which the library metaphor has expanded beyond its traditional representation, encompassing digital resources, online connectivity, and virtual realms, and provides insights into its potential evolution in the future. Drawing on the experiences of librarians in their interactions with library users, the study uncovers any specific cultural or generational differences in how people interpret or relate to the library as a metaphor. It sheds light on the diverse perspectives and interpretations of the metaphor based on cultural backgrounds, educational experiences, and technological familiarity. Lastly, the study investigates the evolving roles of libraries as observed by librarians and explores how these changing roles can influence the metaphors we use to represent them. It examines the dynamic nature of libraries as they adapt to societal needs, technological advancements, and new modes of information dissemination. 
By analyzing these various dimensions, this research provides a comprehensive understanding of the library as a metaphor through the lens of librarians, illuminating its significance, evolution, and its transformative impact on knowledge, learning, and the changing role of libraries in society.
Keywords: library, librarians, metaphor, perception
Procedia PDF Downloads 103
7519 Evaluation of Modern Natural Language Processing Techniques via Measuring a Company's Public Perception
Authors: Burak Oksuzoglu, Savas Yildirim, Ferhat Kutlu
Abstract:
Opinion mining (OM) is one of the natural language processing (NLP) problems: determining the polarity of opinions, mostly represented on a positive-neutral-negative axis. The data for OM are usually collected from various social media platforms. In an era where social media has considerable influence over companies' futures, it is worth understanding social media and taking action accordingly. OM comes to the fore here as the scale of the discussion about companies increases and it becomes unfeasible to gauge opinion at the individual level. Thus, companies opt to automate this process by applying machine learning (ML) approaches to their data. For the last two decades, OM, or sentiment analysis (SA), has mainly been performed by applying ML classification algorithms such as support vector machines (SVM) and Naïve Bayes to bag-of-n-gram representations of textual data. With the advent of deep learning and its apparent success in NLP, traditional methods have become obsolete. The transfer learning paradigm, commonly used in computer vision (CV) problems, has lately begun to shape NLP approaches and language models (LMs). This gave a sudden rise to the usage of pretrained language models (PTMs), which contain language representations obtained by training on large datasets with self-supervised learning objectives. PTMs are further fine-tuned on a specialized downstream task dataset to produce efficient models for various NLP tasks such as OM, Named-Entity Recognition (NER), Question Answering (QA), and so forth. In this study, traditional and modern NLP approaches have been evaluated for OM using a sizable corpus belonging to a large private company, containing about 76,000 comments in Turkish: SVM with a bag of n-grams, and two chosen pretrained models, the multilingual universal sentence encoder (MUSE) and bidirectional encoder representations from transformers (BERT).
The MUSE model is a multilingual model that supports 16 languages, including Turkish, and is based on convolutional neural networks. BERT, a monolingual model in our case, is based on transformer neural networks; it uses masked language modeling and next-sentence prediction tasks that allow bidirectional training of the transformers. During the training phase, pre-processing operations such as morphological parsing, stemming, and spelling correction were not used, since experiments showed that their contribution to model performance was insignificant, even though Turkish is a highly agglutinative and inflective language. The results show that deep learning methods with pretrained models and fine-tuning achieve about an 11% improvement over SVM for OM. The BERT model achieved around 94% prediction accuracy, while the MUSE model achieved around 88% and SVM around 83%. The multilingual MUSE model performs better than SVM but still worse than the monolingual BERT model.
Keywords: BERT, MUSE, opinion mining, pretrained language model, SVM, Turkish
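The SVM-with-n-grams baseline described above can be sketched as a scikit-learn pipeline; the toy English comments below stand in for the study's roughly 76,000 Turkish comments and are assumptions for illustration only.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Tiny illustrative corpus (the study used a much larger Turkish corpus)
comments = [
    "the service was excellent and fast",
    "great product, very satisfied",
    "terrible support, never again",
    "awful experience and rude staff",
    "delivery arrived on time, happy with it",
    "the product broke after one day",
]
labels = ["positive", "positive", "negative", "negative", "positive", "negative"]

# Bag of word n-grams (unigrams + bigrams) weighted by TF-IDF, fed to a linear SVM
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LinearSVC(),
)
model.fit(comments, labels)

print(model.predict(["excellent service, very happy"]))
```

The PTM approaches replace this feature-engineering step entirely: MUSE encodes each comment as a fixed sentence embedding fed to a classifier head, while BERT is fine-tuned end-to-end on the labeled comments.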
Procedia PDF Downloads 151
7518 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion
Authors: Ali Kazemi
Abstract:
In today's volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning strategies have made significant strides in forecasting market movements; however, financial data's complex and networked nature calls for more sophisticated approaches. This study offers a method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by significant volatility and transformation in financial markets, affords a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that accurately reflects market intricacies. We collect daily opening, closing, high, and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and widespread cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into the market's buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate critical macroeconomic indicators, including interest rates, inflation rates, GDP growth, and unemployment rates, into our model. Our GCN algorithm is adept at learning the relational patterns among financial instruments represented as nodes in a comprehensive market graph.
Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling our model to grasp the complex network of influences governing market movements. Complementing this, our LSTM algorithm is trained on sequences of the spatial-temporal representation learned by the GCN, enriched with historical price and volume data. This lets the LSTM capture and predict temporal market trends accurately. In the comprehensive evaluation of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting day-by-day price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's predictive performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that predict the direction of price moves. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework. Our findings promise to revolutionize investment strategies and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets.
Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting
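The evaluation metrics reported above (percentage MAE, percentage RMSE, and directional accuracy) can be computed as in the sketch below; the short price series is hypothetical and serves only to show the arithmetic.

```python
import numpy as np

# Hypothetical daily closing prices: actual vs. model predictions
actual = np.array([100.0, 101.5, 100.8, 102.3, 103.0, 102.1, 104.0])
predicted = np.array([100.2, 101.1, 101.0, 102.0, 103.4, 102.5, 103.6])

# Mean Absolute Error and Root Mean Squared Error, as percentages of price
mae_pct = np.mean(np.abs(predicted - actual) / actual) * 100
rmse_pct = np.sqrt(np.mean(((predicted - actual) / actual) ** 2)) * 100

# Directional accuracy: fraction of days where the predicted move
# (up/down relative to the previous actual close) matches the actual move
actual_dir = np.sign(np.diff(actual))
pred_dir = np.sign(predicted[1:] - actual[:-1])
directional_accuracy = np.mean(actual_dir == pred_dir) * 100

print(f"MAE:  {mae_pct:.2f}%")
print(f"RMSE: {rmse_pct:.2f}%")
print(f"Directional accuracy: {directional_accuracy:.1f}%")
```

Because RMSE squares the errors before averaging, it penalizes large misses more than MAE does, which is why the abstract reports it separately for volatile markets.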
Procedia PDF Downloads 71
7517 Focus Group Discussion (FGD) Strategy in Teaching Sociolinguistics to Enhance Students' Mastery: A Survey Research in Sanata Dharma ELESP Department
Authors: Nugraheni Widianingtyas, Niko Albert Setiawan
Abstract:
For the ELESP Teachers’ College, teaching-learning strategies such as presentation and group discussion are the classical ones implemented in class. In order to create a breakthrough that can bring about more positive advancement in the learning process, a Focus Group Discussion (FGD) approach is being offered and implemented in certain classes. Interestingly, FGD is frequently used in social-business inquiries, such as for recruiting employees. It is therefore interesting to investigate FGD when it is implemented in an educational scope, especially in the Sociolinguistics class, which is regarded as one of the most arduous subjects in this study program. Thus, this study focused on how FGD enhances students’ Sociolinguistics mastery. In response, a quantitative survey research design was used, in which observation, a questionnaire, and interviews (triangulation method) were the instruments. The respondents were 29 sixth-semester students taking Sociolinguistics at ELESP, Sanata Dharma University, in 2017. The findings indicated that FGD could help students enhance their Sociolinguistics mastery. In addition, the findings revealed that FGD fostered students’ logical thinking, English communication skills, and decision-making.
Keywords: focus group discussion, material mastery, sociolinguistics, teaching strategy
Procedia PDF Downloads 217
7516 Synthetic Classicism: A Machine Learning Approach to the Recognition and Design of Circular Pavilions
Authors: Federico Garrido, Mostafa El Hayani, Ahmed Shams
Abstract:
The exploration of the potential of artificial intelligence (AI) in architecture is still embryonic; however, its latent capacity to change design disciplines is significant. 'Synthetic Classicism' is a research project that questions the underlying aspects of classically organized architecture, not just in aesthetic terms but also from a geometrical and morphological point of view, intending to generate new architectural information using historical examples as source material. The main aim of this paper is to explore the uses of artificial intelligence and machine learning algorithms in architectural design while creating a coherent narrative to be contained within a design process. The purpose is twofold: on one hand, to develop and train machine learning algorithms to produce architectural information for small pavilions, and on the other, to synthesize new information from previous architectural drawings. These algorithms are intended to 'interpret' graphical information from each pavilion and then generate new information from it. Once these algorithms are trained, the procedure is the following: starting from a line profile, a synthetic 'front view' of a pavilion is generated; then, using it as source material, an isometric view is created from it, and finally, a top view is produced. Thanks to GAN algorithms, it is also possible to generate front and isometric views without any graphical input. The final intention of the research is to produce isometric views out of historical information, such as the pavilions of Sebastiano Serlio, James Gibbs, or John Soane. The idea is to create and interpret new information not just in terms of historical reconstruction but also to explore AI as a novel tool in the narrative of a creative design process.
This research also challenges the idea of the role of algorithmic design being associated with efficiency or fitness, while embracing the possibility of a creative collaboration between artificial intelligence and a human designer. Hence the double feature of this research, both analytical and creative: first by synthesizing images based on a given dataset, and then by generating new architectural information from historical references. We find that the possibility of creatively understanding and manipulating historic (and synthetic) information will be a key feature in future innovative design processes. Finally, the main question that we propose is whether an AI could be used not just to create an original and innovative group of simple buildings but also to explore the possibility of fostering a novel architectural sensibility grounded in the specificities of the architectural dataset, whether historic, human-made or synthetic.
Keywords: architecture, central pavilions, classicism, machine learning
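The staged generation described above (a profile mapped to a synthetic view) rests on adversarial training between a generator and a discriminator. The toy NumPy sketch below illustrates that mechanism only: a one-layer "generator" maps profile vectors to synthetic "view" vectors against a linear "discriminator". All shapes, data, and names here are invented for illustration; this is not the project's actual model.

```python
import numpy as np

# Minimal adversarial-training sketch on toy 1-D "drawings" (hypothetical
# stand-ins for rasterized pavilion views); illustrative only.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

W_g = rng.normal(0, 0.1, (8, 8))  # generator: profile -> synthetic view
w_d = rng.normal(0, 0.1, 8)       # discriminator: view -> real/fake score

def generate(profile):
    return np.tanh(profile @ W_g)

def discriminate(view):
    return sigmoid(view @ w_d)

real_views = rng.normal(1.0, 0.1, (32, 8))   # toy "real" drawings
profiles   = rng.normal(0.0, 1.0, (32, 8))   # toy input profiles

lr = 0.05
for _ in range(200):
    fake = generate(profiles)
    # Discriminator ascends log D(real) + log(1 - D(fake))
    d_real, d_fake = discriminate(real_views), discriminate(fake)
    grad_d = (real_views * (1 - d_real)[:, None]).mean(0) \
             - (fake * d_fake[:, None]).mean(0)
    w_d += lr * grad_d
    # Generator ascends log D(fake) (non-saturating loss)
    fake = generate(profiles)
    d_fake = discriminate(fake)
    grad_out = (1 - d_fake)[:, None] * w_d[None, :]
    grad_g = profiles.T @ (grad_out * (1 - fake ** 2)) / len(profiles)
    W_g += lr * grad_g

synthetic_front_view = generate(profiles[:1])
print(synthetic_front_view.shape)  # (1, 8)
```

In the research itself, the vectors would be raster images and the two networks deep convolutional models; the adversarial update loop, however, has the same shape.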
Procedia PDF Downloads 144
7515 Using Technology to Deliver and Scale Early Childhood Development Services in Resource Constrained Environments: Case Studies from South Africa
Authors: Sonja Giese, Tess N. Peacock
Abstract:
South African-based Innovation Edge is experimenting with technology to drive positive behavior change, enable data-driven decision making, and scale quality early-years services. This paper uses five case studies to illustrate how technology can be used in resource-constrained environments: first, to encourage parenting practices that build early language development (using a stage-based mobile messaging pilot, ChildConnect); secondly, to improve the quality of ECD programs (using a mobile application, CareUp); thirdly, to affordably scale services for the early detection of visual and hearing impairments (using a mobile tool, HearX); fourthly, to build a transparent and accountable system for the registration and funding of ECD (using a blockchain-enabled platform, Amply); and finally, to enable rapid data collection and feedback to facilitate quality enhancement of programs at scale (the Early Learning Outcomes Measure). ChildConnect and CareUp were both developed using a design-based iterative research approach. The usage and uptake of ChildConnect and CareUp were evaluated with qualitative and quantitative methods. Actual child outcomes were not measured in the initial pilots. Although parents who used and engaged with either platform felt more supported and informed, parent engagement and usage remain a challenge. This is in contrast to ECD practitioners, whose usage of CareUp showed both sustained engagement and knowledge improvement. HearX is an easy-to-use tool to identify hearing loss and visual impairment. The tool was tested with 10,000 children in an informal settlement. The feasibility of cost-effectively decentralising screening services was demonstrated. Practical and financial barriers remain with respect to parental consent and successful referrals. Amply uses mobile and blockchain technology to increase the impact and accountability of public services.
In the pilot project, Amply is being used to replace an existing paper-based system to register children for a government-funded pre-school subsidy in South Africa. The Early Learning Outcomes Measure (ELOM) defines what it means for a child to be developmentally ‘on track’ at age 50-69 months. ELOM administration is enabled via a tablet, which allows for easy and accurate data collection, transfer, analysis, and feedback. ELOM is being used extensively to drive quality enhancement of ECD programs across multiple modalities. The nature of ECD services in South Africa is that they are in large part provided by disconnected private individuals or non-governmental organizations (in contrast to basic education, which is publicly provided by the government). It is a disparate sector, which means that scaling successful interventions is that much harder. All five interventions show the potential of technology to support and enhance a range of ECD services, but pathways to scale are still being tested.
Keywords: assessment, behavior change, communication, data, disabilities, mobile, scale, technology, quality
Procedia PDF Downloads 137
7514 A Convolution Neural Network Approach to Predict Pes-Planus Using Plantar Pressure Mapping Images
Authors: Adel Khorramrouz, Monireh Ahmadi Bani, Ehsan Norouzi, Morvarid Lalenoor
Abstract:
Background: Plantar pressure distribution measurement has long been used to assess foot disorders. Plantar pressure is an important component affecting foot and ankle function, and changes in plantar pressure distribution can indicate various foot and ankle disorders. The morphologic and mechanical properties of the foot may be important factors affecting the plantar pressure distribution. Accurate and early measurement may help to reduce the prevalence of pes planus. With recent developments in technology, new techniques such as machine learning have been used to assist clinicians in identifying patients with foot disorders. Significance of the study: This study proposes a neural-network-based flat foot classification methodology using static foot pressure distribution. Methodologies: Data were collected from 895 patients who were referred to a foot clinic due to foot disorders. Patients with pes planus were labeled by an experienced physician based on clinical examination. Then all subjects (with and without pes planus) were evaluated for static plantar pressure distribution. Patients who were diagnosed with flat foot in both feet were included in the study. In the next step, the leg length was normalized and the network was trained on the plantar pressure mapping images. Findings: Of a total of 895 images, 581 were labeled as pes planus. A convolutional neural network (CNN) was run to evaluate the performance of the proposed model. The prediction accuracy of the basic CNN-based model was measured, and the prediction model was derived through the proposed methodology. In the basic CNN model, the training accuracy was 79.14%, and the test accuracy was 72.09%. Conclusion: This model can be easily used by patients with pes planus and doctors to predict the classification of pes planus and to prescreen for possible musculoskeletal disorders related to this condition.
However, more models need to be considered and compared for higher accuracy.
Keywords: foot disorder, machine learning, neural network, pes planus
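The core operation such a CNN applies to a pressure-mapping image is a 2-D convolution followed by pooling. The plain-NumPy sketch below shows that single step on a toy 8x8 "pressure map"; the map, kernel, and sizes are illustrative and not the study's actual data or architecture.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation, the building block of a CNN layer."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool2(x):
    """2x2 max pooling, halving each spatial dimension."""
    h, w = x.shape[0] // 2, x.shape[1] // 2
    return x[:2*h, :2*w].reshape(h, 2, w, 2).max(axis=(1, 3))

pressure_map = np.random.default_rng(1).random((8, 8))  # toy pressure image
edge_kernel = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]])  # vertical edges

# One conv layer with ReLU, then pooling: (8, 8) -> (6, 6) -> (3, 3)
features = max_pool2(np.maximum(conv2d(pressure_map, edge_kernel), 0))
print(features.shape)  # (3, 3)
```

A full classifier would stack several such layers (with learned kernels) and end in a dense layer producing the pes planus / normal prediction.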
Procedia PDF Downloads 369
7513 A Worldwide Assessment of Geothermal Energy Policy: Systematic, Qualitative and Critical Literature Review
Authors: Diego Moya, Juan Paredes, Clay Aldas, Ramiro Tite, Prasad Kaparaju
Abstract:
Globally, energy policy for geothermal development is addressed in different forms, depending on the economy, resources, country development, environmental aspects and technology access. Although some countries have established strong regulations and standards for geothermal exploration, exploitation and sustainable use at the policy level (government departments and institutions), others have discussed geothermal laws at the legal level (congress, a national legislative body of a country). Appropriate regulations are needed not only to meet local and international funding requirements but also to avoid speculation in the use of the geothermal resource. In this regard, this paper presents the results of a systematic, qualitative and critical literature review of geothermal energy policy worldwide, addressing two scenarios: the policy and legal levels. First, literature is collected and classified from scientific and government sources regarding the geothermal energy policy of the most advanced geothermal-producing countries, including Iceland, New Zealand, Mexico, the USA, Central America, Italy, Japan, the Philippines, Indonesia, Kenya, and Australia. This is followed by a systematic review of the literature aiming to identify best geothermal practices and what remains uncertain regarding geothermal policy implementation. This analysis is made considering the stages of geothermal production. Furthermore, a qualitative analysis is conducted comparing the findings across the geothermal policies of the countries mentioned above. Then, a critical review aims to identify significant items in the field to be applied in countries with geothermal potential but with no or weak geothermal policies. Finally, patterns and relationships are detected, and conclusions are drawn.
Keywords: assessment, geothermal, energy policy, worldwide
Procedia PDF Downloads 392
7512 Kuwait Environmental Remediation Program: Fresh Groundwater Risk Assessment from Tarcrete Material across the Raudhatain and Sabriyah Oil Fields, North Kuwait
Authors: Nada Al-Qallaf, Aisha Al-Barood, Djamel Lekmine, Srinivasan Vedhapuri
Abstract:
Kuwait Oil Company (KOC), under the supervision of the Kuwait National Focal Point (KNFP), is planning to remediate 26 million m3 of oil-contaminated soil in the oil fields of Kuwait, a direct and indirect fallout of the Gulf War of 1990-1991. This project is funded by the United Nations Compensation Commission (UNCC) under the Kuwait Environmental Remediation Program (KERP). Oil contamination of the soil occurred due to the destruction of the oil wells, which spilled crude oil across the land surface and created ‘oil lakes’ in low-lying land. Aerial fall-out from oil spray and combustion products from oil fires combined with the sand and gravel on the ground surface to form a layer of hardened ‘tarcrete’. The unique fresh groundwater lenses present in the Raudhatain and Sabriyah subsurface areas were impacted by the discharge and/or spills of dissolved petroleum constituents. These fresh groundwater aquifers had been used for drinking water purposes until 1990, prior to the invasion. The contamination significantly damaged and altered the landscape, ecology and habitat of the flora and fauna of the Kuwait desert. Under KERP, KOC is fully responsible for the planning and execution of the remediation and restoration projects in KOC oil fields. After the initial recommendation of the UNCC to construct engineered landfills for the containment and disposal of heavily contaminated soils, two landfills were constructed, one in North Kuwait and another in South East Kuwait, with capacities of 1.7 million m3 and 0.5 million m3, respectively. KOC further developed the Total Remediation Strategy (TRS) in conjunction with KNFP and has obtained UNCC approval. The TRS comprises elements such as a Risk-Based Approach (RBA), bioremediation of soil with low contamination levels, remediation treatment technologies, sludge disposal via beneficial recycling or re-use, and engineered landfills for the containment of untreatable materials.
Risk-based assessment is a key component to avoid unnecessary remedial works where it can be demonstrated that human health and the environment are sufficiently protected in the absence of active remediation. This study assesses the risks posed by tarcrete materials, spread over an area of 20 km2, to the fresh groundwater lenses/catchment located beneath the Sabriyah and Raudhatain oil fields in North Kuwait. KOC’s primary objective is to provide justification for using the RBA, to support a case with the Kuwaiti regulators to leave the tarcrete material in place rather than undertake large-scale removal and remediation. This is motivated by the large-scale coverage of the tarcrete in the oil fields and the perception that the residual contamination associated with this source is present in an environmentally sensitive area, essentially a groundwater resource. As part of this assessment, a conceptual site model (CSM) and complete risk-based and fate-and-transport modelling were carried out, including the derivation of site-specific assessment criteria (SSAC) and the quantification of the risk posed by tarcrete-impacted areas to the identified water-resource receptors. The outcome of this assessment was that the residual tarcrete deposits across the site area should not create risks to fresh groundwater resources, and remedial action to remove and remediate the surficial tarcrete deposits is not warranted.
Keywords: conceptual site model, fresh groundwater, oil-contaminated soil, tarcrete, risk based assessment
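Fate-and-transport screening of dissolved constituents of the kind described above typically solves a one-dimensional advection-dispersion equation. A generic textbook form (illustrative only; the study's actual model and parameters are not stated here) is:

```latex
% 1-D advection-dispersion equation with linear retardation and
% first-order decay (generic screening form, not the study's model)
R \frac{\partial C}{\partial t}
  = D \frac{\partial^{2} C}{\partial x^{2}}
  - v \frac{\partial C}{\partial x}
  - \lambda R C
```

Here C is the dissolved concentration, v the seepage velocity, D the dispersion coefficient, R the retardation factor, and λ a first-order decay rate; SSAC derivation then back-calculates an acceptable source concentration from a receptor-point criterion.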
Procedia PDF Downloads 180
7511 Circular Tool and Dynamic Approach to Grow the Entrepreneurship of Macroeconomic Metabolism
Authors: Maria Areias, Diogo Simões, Ana Figueiredo, Anishur Rahman, Filipa Figueiredo, João Nunes
Abstract:
It is expected that close to 7 billion people will live in urban areas by 2050. In order to improve the sustainability of territories and their transition towards a circular economy, it is necessary to understand their metabolism and to promote and guide the entrepreneurial response. The study of a macroeconomic metabolism involves the quantification of the inputs, outputs and storage of energy, water, materials and wastes for an urban region. This quantification and analysis represent an opportunity for the promotion of green entrepreneurship. There are several methods to assess the environmental impacts of an urban territory, such as human and environmental risk assessment (HERA), life cycle assessment (LCA), ecological footprint assessment (EF), material flow analysis (MFA), physical input-output tables (PIOT), ecological network analysis (ENA), and multicriteria decision analysis (MCDA), among others. However, no consensus exists about which of these assessment methods is best for analyzing the sustainability of these complex systems. Taking into account the weaknesses and needs identified, the CiiM - Circular Innovation Inter-Municipality project aims to define a uniform and globally accepted methodology, through the integration of various methodologies and dynamic approaches, to increase the efficiency of macroeconomic metabolisms and promote entrepreneurship in a circular economy. The pilot territory considered in the CiiM project has a total area of 969,428 ha and a total of 897,256 inhabitants (about 41% of the population of the Center Region). The main economic activities in the pilot territory, which contribute to a gross domestic product of 14.4 billion euros, are: social support activities for the elderly; construction of buildings; road transport of goods; retailing in supermarkets and hypermarkets; mass production of other garments; inpatient health facilities; and the manufacture of other components and accessories for motor vehicles.
The region’s business network is mostly constituted of micro and small companies (similar to the Central Region of Portugal as a whole), with a total of 53,708 companies identified in the CIM Region of Coimbra (39 large companies), 28,146 in the CIM Viseu Dão Lafões (22 large companies) and 24,953 in the CIM Beiras and Serra da Estrela (13 large companies). The database was constructed using data available from the National Institute of Statistics (INE), the General Directorate of Energy and Geology (DGEG), Eurostat, Pordata, the Strategy and Planning Office (GEP), the Portuguese Environment Agency (APA), the Commission for Coordination and Regional Development (CCDR) and the Inter-municipal Community (CIM), as well as dedicated databases. In addition to the collection of statistical data, it was necessary to identify and characterize the different stakeholder groups in the pilot territory that are relevant to the different metabolism components under analysis. The CiiM project also adds the potential of a Geographic Information System (GIS) so that it is possible to obtain geospatial results for the territorial metabolisms (rural and urban) of the pilot region. This platform will be a powerful tool for visualizing the flows of products/services that occur within the region and will support the stakeholders, improving their circular performance and identifying new business ideas and symbiotic partnerships.
Keywords: circular economy tools, life cycle assessment, macroeconomic metabolism, multicriteria decision analysis, decision support tools, circular entrepreneurship, industrial and regional symbiosis
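The MFA-style accounting the abstract describes reduces to a mass balance per flow category: inputs minus outputs equals stock change. A toy sketch, with invented figures that are not the pilot territory's data:

```python
# MFA-style mass balance per flow category: stock change = inputs - outputs.
# All figures are illustrative placeholders (ktonnes/year).
flows = {
    "materials": (120.0, 85.0),   # (inputs, outputs)
    "water":     (900.0, 870.0),
    "energy":    (60.0, 60.0),    # energy carriers, in mass terms
}

stock_change = {name: inp - out for name, (inp, out) in flows.items()}
print(stock_change["materials"])  # 35.0
```

In a real territorial account, each category would be further disaggregated by sector and geo-referenced so the GIS layer can map the flows.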
Procedia PDF Downloads 107
7510 Improved Super-Resolution Using Deep Denoising Convolutional Neural Network
Authors: Pawan Kumar Mishra, Ganesh Singh Bisht
Abstract:
Super-resolution is a technique used in computer vision to construct high-resolution images from a single low-resolution image. It is used to increase the frequency content, recover lost details, and remove the downsampling artifacts and noise introduced by the camera during image acquisition. High-resolution images and videos are a desired part of all image processing tasks and their analysis in most digital imaging applications. The goal of super-resolution is to combine non-redundant information from single or multiple low-resolution frames to generate a high-resolution image. Many methods have been proposed in which multiple images of the same scene, differing by some transformation, are used as the low-resolution inputs; this is called multi-image super-resolution. Another family of methods is single-image super-resolution, which tries to learn the redundancy present in an image and reconstruct the lost information from a single low-resolution image. The use of deep learning is currently one of the state-of-the-art methods for reconstructing high-resolution images. In this research, we propose Deep Denoising Super Resolution (DDSR), a deep neural network that effectively reconstructs a high-resolution image from a low-resolution image.
Keywords: resolution, deep-learning, neural network, de-blurring
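Single-image super-resolution is often framed as a cheap upsampling plus a learned correction for the detail the upsampling cannot recover. The sketch below shows only that framing: nearest-neighbour upsampling plus a "residual" array standing in for what a trained DDSR-style network would predict; the arrays are toy data, not the paper's method.

```python
import numpy as np

def upsample_nearest(img, scale=2):
    """Nearest-neighbour upsampling: each pixel becomes a scale x scale block."""
    return img.repeat(scale, axis=0).repeat(scale, axis=1)

low_res = np.arange(16.0).reshape(4, 4)      # toy low-resolution image
coarse = upsample_nearest(low_res)           # (8, 8) blocky estimate

# A trained network would predict this correction; here it is a fixed stub.
residual = np.zeros_like(coarse)
residual[::2, ::2] = 0.25

high_res = coarse + residual                 # refined high-resolution output
print(high_res.shape)  # (8, 8)
```

The learning problem is then to regress the residual from the coarse estimate, which is easier to optimize than regressing the full high-resolution image directly.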
Procedia PDF Downloads 519
7509 Influence of Instrumental Playing on Attachment Type of Musicians and Music Students Using Adult Attachment Scale-R
Authors: Sofia Serra-Dawa
Abstract:
Adult relationships build on a variety of past social experiences, intentions, and emotions that might predispose and influence the approach to, and construction of, subsequent relationships. Adult Attachment Theory (AAT) proposes four types of adult attachment, where attachment is built over the two dimensions of anxiety and avoidance: secure, anxious-preoccupied, dismissive-avoidant, and fearful-avoidant. The AAT has been studied in multiple settings, such as personal and therapeutic relationships, educational settings, sexual orientation, health, and religion. In music scholarship, the AAT has been used to frame the class learning of student singers and to study the relational behavior between voice teachers and students. Building on this work, the present inquiry studies how attachment types might characterize the learning relationships of music students (in the Western conservatory tradition), and whether particular instrumental experiences might correlate with given attachment styles. Given certain cohesive behavioral features of established traditions of instrumental playing and performance modes, it is hypothesized that student musicians will display specific characteristics correlated with instrumental traditions, demonstrating a clear tendency toward a particular attachment style, which in turn has implications for subsequent professional interactions. This study is informed by the methodological framework of the Adult Attachment Scale-R (Collins and Read, 1990), which was chosen in particular for its non-invasive questions and classificatory validation. It is further hypothesized that the analytical comparison of musicians’ profiles has the potential to serve as the baseline for other comparative behavioral observation studies [this component is expected to be verified and completed well before the conference meeting].
This research may have implications for practitioners concerned with matching and improving musical teaching and learning relationships in (professional and amateur) long-term musical settings.
Keywords: adult attachment, music education, musicians attachment profile, musicians relationships
Procedia PDF Downloads 160
7508 A Unified Deep Framework for Joint 3D Pose Estimation and Action Recognition from a Single Color Camera
Authors: Huy Hieu Pham, Houssam Salmane, Louahdi Khoudour, Alain Crouzil, Pablo Zegers, Sergio Velastin
Abstract:
We present a deep learning-based multitask framework for joint 3D human pose estimation and action recognition from color video sequences. Our approach proceeds in two stages. In the first, we run a real-time 2D pose detector to determine the precise pixel locations of important key points of the body. A two-stream neural network is then designed and trained to map the detected 2D keypoints into 3D poses. In the second, we deploy the Efficient Neural Architecture Search (ENAS) algorithm to find an optimal network architecture that is used for modeling the spatio-temporal evolution of the estimated 3D poses via an image-based intermediate representation and performing action recognition. Experiments on the Human3.6M, Microsoft Research Redmond (MSR) Action3D, and Stony Brook University (SBU) Kinect Interaction datasets verify the effectiveness of the proposed method on the targeted tasks. Moreover, we show that our method requires a low computational budget for training and inference.
Keywords: human action recognition, pose estimation, D-CNN, deep learning
Procedia PDF Downloads 149
7507 An Investigation into the Use of an Atomistic, Hermeneutic, Holistic Approach in Education Relating to the Architectural Design Process
Authors: N. Pritchard
Abstract:
Within architectural education, students arrive forearmed with their life experience, knowledge gained from subject-based learning, and their brains, more specifically their imaginations. The learning-by-doing that they embark on in studio-based/project-based learning calls for supervision that allows the student to proactively undertake research and experimentation with design solution possibilities. The degree to which this supervision includes direction is subject to debate and differing opinion. It can be argued that if the student is to learn by doing, then design decision-making within the design process needs to be instigated and owned by the student, so that they have the ability to personally reflect on and evaluate those decisions. Within this premise lies the problem that the student's endeavours can become unstructured and unfocused as they work their way into a new and complex activity. A resultant weakness can be that the design activity is compartmentalized rather than holistic or comprehensive, and therefore the student's reflections are impoverished in terms of providing a positive, informative feedback loop. The construct proffered in this paper is that a supportive 'armature' or 'Heuristic-Framework' can be developed that facilitates a holistic approach and reflective learning. The normal explorations of architectural design comprise analysing the site and context, reviewing building precedents, and assimilating the briefing information. However, the student can still be compromised by 'not knowing what they need to know'. The long-serving triad 'Firmness, Commodity and Delight' provides a broad-brush framework of considerations to explore and integrate into good design.
If this were further atomised into subdivisions formed from the disparate aspects of architectural design that need to be considered within the design process, then the student could sieve through the facts more methodically and reflectively, considering their interrelationships, conflicts and alliances. The words FACTS and SIEVE hold the acronym of the aspects that form the Heuristic-Framework: Function, Aesthetics, Context, Tectonics, Spatial, Servicing, Infrastructure, Environmental, Value and Ecological issues. The Heuristic could be used as a hermeneutic model, with each aspect of design being focused on and considered in abstraction, and then considered in its relation to the other aspects and to the design proposal as a whole. Importantly, the heuristic could be used as a method for gathering information and enhancing the design brief. The more poetic, mysterious, intuitive, unconscious processes should still be able to occur for the student. The Heuristic-Framework should not be seen as comprehensive, prescriptive, formulaic, or inhibiting to the wide exploration of possibilities and solutions within the architectural design process.
Keywords: atomistic, hermeneutic, holistic approach, architectural design studio education
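The FACTS SIEVE mnemonic can be expressed as a simple checklist structure; the few lines below do nothing more than encode the ten aspects named in the text and verify the acronym, as a sketch of how a studio tool might iterate over them.

```python
# The ten Heuristic-Framework aspects from the text, as an ordered checklist.
FACTS_SIEVE = [
    "Function", "Aesthetics", "Context", "Tectonics", "Spatial",
    "Servicing", "Infrastructure", "Environmental", "Value", "Ecological",
]

# First letters spell out the mnemonic the text describes.
acronym = "".join(aspect[0] for aspect in FACTS_SIEVE)
print(acronym)  # FACTSSIEVE
```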
Procedia PDF Downloads 264
7506 Semi-Supervised Learning for Spanish Speech Recognition Using Deep Neural Networks
Authors: B. R. Campomanes-Alvarez, P. Quiros, B. Fernandez
Abstract:
Automatic Speech Recognition (ASR) is a machine-based process of decoding and transcribing oral speech. A typical ASR system receives acoustic input from a speaker or an audio file, analyzes it using algorithms, and produces an output in the form of text. Some speech recognition systems use Hidden Markov Models (HMMs) to deal with the temporal variability of speech, and Gaussian Mixture Models (GMMs) to determine how well each state of each HMM fits a short window of frames of coefficients that represents the acoustic input. Another way to evaluate the fit is to use a feed-forward neural network that takes several frames of coefficients as input and produces posterior probabilities over HMM states as output. Deep neural networks (DNNs) that have many hidden layers and are trained using new methods have been shown to outperform GMMs on a variety of speech recognition systems. Acoustic models for state-of-the-art ASR systems are usually trained on massive amounts of data. However, audio files with their corresponding transcriptions can be difficult to obtain, especially in the Spanish language. Hence, in these low-resource scenarios, building an ASR model is a complex task due to the lack of labeled data, resulting in an under-trained system. Semi-supervised learning approaches arise as a necessary option given the high cost of transcribing audio data. The main goal of this proposal is to develop a procedure based on acoustic semi-supervised learning for Spanish ASR systems using DNNs. This semi-supervised learning approach consists of: (a) training a seed ASR model with a DNN using a set of audios and their respective transcriptions. A DNN with one hidden layer was initialized, and the number of hidden layers was increased during training to five. A refinement, consisting of updates to the weight matrix plus bias term, and Stochastic Gradient Descent (SGD) training were also performed. The objective function was the cross-entropy criterion.
(b) Decoding/testing a set of unlabeled data with the obtained seed model. (c) Selecting a suitable subset of the validated data to retrain the seed model, thereby improving its performance on the target test set. To choose the most precise transcriptions, three confidence scores or metrics based on the lattice concept (the graph cost, the acoustic cost, and a combination of both) were used as the selection technique. The performance of the ASR system is measured by means of the Word Error Rate (WER). The test dataset was renewed in order to exclude the new transcriptions added to the training dataset. Several experiments were carried out in order to select the best ASR results. A comparison between a GMM-based model without retraining and the proposed DNN system was also made under the same conditions. Results showed that the semi-supervised ASR model based on DNNs outperformed the GMM model, in terms of WER, in all tested cases. The best result obtained an improvement of 6% in relative WER. Hence, these promising results suggest that the proposed technique could be suitable for building ASR models in low-resource environments.
Keywords: automatic speech recognition, deep neural networks, machine learning, semi-supervised learning
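Steps (a)-(c) above form a generic self-training loop: train a seed model on labelled data, decode unlabelled data, keep only high-confidence predictions, and retrain. The toy NumPy sketch below shows that loop with a nearest-centroid classifier standing in for the DNN acoustic model and a distance margin standing in for the lattice-based confidence scores; everything here is illustrative, not the proposal's system.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 2-class "acoustic features": labelled and unlabelled sets.
X_lab = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(3, 0.5, (20, 2))])
y_lab = np.array([0] * 20 + [1] * 20)
X_unl = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])

def fit(X, y):
    """Nearest-centroid 'model': one centroid per class."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    """Return (labels, confidences); confidence = margin between distances."""
    d = np.linalg.norm(X[:, None] - centroids[None], axis=2)  # (n, 2)
    return d.argmin(axis=1), np.abs(d[:, 0] - d[:, 1])

centroids = fit(X_lab, y_lab)                  # (a) seed model
for _ in range(3):
    pred, conf = predict(centroids, X_unl)     # (b) decode unlabelled data
    keep = conf > 1.0                          # (c) confidence-based selection
    X_new = np.vstack([X_lab, X_unl[keep]])
    y_new = np.concatenate([y_lab, pred[keep]])
    centroids = fit(X_new, y_new)              # retrain on augmented set

final_pred, _ = predict(centroids, X_unl)
accuracy = (final_pred == np.array([0] * 50 + [1] * 50)).mean()
print(accuracy >= 0.9)  # True on this well-separated toy data
```

In the real system the "decode" step produces lattices, the confidence is the graph/acoustic cost, and the quality metric is WER rather than accuracy; the control flow is the same.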
Procedia PDF Downloads 344
7505 The Mentoring in Professional Development of University Teachers
Authors: Nagore Guerra Bilbao, Clemente Lobato Fraile
Abstract:
Mentoring is provided by professionals with a higher level of experience and competence as part of the professional development of university faculty. This paper explores the characteristics of the mentoring provided by teachers participating in the development of an active methodology program run at the University of the Basque Country: it examines and analyzes the mentors’ performance with the aim of providing empirical evidence regarding its value as a lifelong learning strategy for teaching staff. A total of 183 teachers were trained during the first three programs. The analysis method uses a coding technique and is based on flexible, systematic guidelines for gathering and analyzing qualitative data. The results have confirmed the conception of mentoring as a methodological innovation in higher education. In short, university teachers in general assessed the mentoring they received positively, considering it to be a valid, useful strategy in their professional development. They highlighted the methodological expertise of their mentors and underscored how mentors monitored the learning process of the active method and provided guidance and advice when necessary. They also drew attention to traits such as availability, personal commitment and flexibility. However, a minority criticized some aspects of the performance of some mentors.
Keywords: higher education, mentoring, professional development, university teachers
Procedia PDF Downloads 245
7504 Empowering Girls and Youth in Bangladesh: Importance of Creating Safe Digital Space for Online Learning and Education
Authors: Md. Rasel Mia, Ashik Billah
Abstract:
The empowerment of girls and youth in Bangladesh is a pressing issue in today's digital age, where online learning and education have become integral to personal and societal development. This abstract explores the critical importance of creating a secure online environment for girls and youth in Bangladesh, emphasizing the transformative impact it can have on their access to education and knowledge. Bangladesh, like many developing nations, faces gender inequalities in education and access to digital resources. The creation of a safe digital space not only mitigates the gender digital divide but also fosters an environment where girls and youth can thrive academically and professionally. This manuscript presents a mixed-method study assessing the current digital landscape in Bangladesh, revealing disparities in phone and internet access, online practices, and awareness of cyber security among diverse demographic groups. Moreover, the study unveils the varying levels of familial support and the barriers encountered by girls and youth in their quest for digital literacy. It emphasizes the need for tailored training programs that address specific learning needs, while also advocating for enhanced internet accessibility, safe online practices, and inclusive online platforms. The manuscript culminates in a call for collaborative efforts among stakeholders, including NGOs, government agencies, and telecommunications companies, to implement targeted interventions that bridge the gender digital divide and pave the way for a brighter, more equitable future for girls and youth in Bangladesh.
In conclusion, this research highlights the undeniable significance of creating a safe digital space as a catalyst for the empowerment of girls and youth in Bangladesh, ensuring that they not only access but excel in the online space, thereby contributing to their personal growth and the advancement of society as a whole.
Keywords: collaboration, cyber security, digital literacy, digital resources, inclusiveness
Procedia PDF Downloads 64
7503 Neural Reshaping: The Plasticity of Human Brain and Artificial Intelligence in the Learning Process
Authors: Seyed-Ali Sadegh-Zadeh, Mahboobe Bahrami, Sahar Ahmadi, Seyed-Yaser Mousavi, Hamed Atashbar, Amir M. Hajiyavand
Abstract:
This paper presents an investigation into the concept of neural reshaping, which is crucial for achieving strong artificial intelligence through the development of AI algorithms with very high plasticity. By examining the plasticity of both human and artificial neural networks, the study uncovers groundbreaking insights into how these systems adapt to new experiences and situations, ultimately highlighting the potential for creating advanced AI systems that closely mimic human intelligence. The uniqueness of this paper lies in its comprehensive analysis of the neural reshaping process in both human and artificial intelligence systems. This comparative approach enables a deeper understanding of the fundamental principles of neural plasticity, thus shedding light on the limitations and untapped potential of both human and AI learning capabilities. By emphasizing the importance of neural reshaping in the quest for strong AI, the study underscores the need for developing AI algorithms with exceptional adaptability and plasticity. The paper's findings have significant implications for the future of AI research and development. By identifying the core principles of neural reshaping, this research can guide the design of next-generation AI technologies that can enhance human and artificial intelligence alike. These advancements will be instrumental in creating a new era of AI systems with unparalleled capabilities, paving the way for improved decision-making, problem-solving, and overall cognitive performance. In conclusion, this paper makes a substantial contribution by investigating the concept of neural reshaping and its importance for achieving strong AI. 
Through its in-depth exploration of neural plasticity in both human and artificial neural networks, the study unveils vital insights that can inform the development of innovative AI technologies with high adaptability and potential for enhancing human and AI capabilities alike.
Keywords: neural plasticity, brain adaptation, artificial intelligence, learning, cognitive reshaping
Procedia PDF Downloads 56
7502 Gamification in Education: A Case Study on the Use of Serious Games
Authors: Maciej Zareba, Pawel Dawid
Abstract:
This article provides a case study exploring the use of serious games in educational settings, indicating their potential to transform conventional teaching methods into interactive and engaging learning experiences. By incorporating game elements such as points, leaderboards, and progress indicators, serious games establish clear goals, provide real-time feedback, and give a sense of progress. These elements enable students to solve complex problems in simulated environments, fostering critical thinking, creativity, and contextual learning. The case study examines the feasibility of using the 4FACTORY Manager serious game in a selected educational context, demonstrating its effectiveness in increasing student motivation, improving academic performance, and promoting knowledge consolidation. The study and presentation are based on the results of industrial research and development work conducted as part of the project titled (4FM) 4FACTORY Manager – an innovative simulation game for managing real production processes using a novel gameplay model based on the interaction between the virtual and real worlds, applying the Industry 4.0 concept (Project number: POIR.01.02.00-00-0057/19).
Keywords: gamification, serious games, education, e-learning
Procedia PDF Downloads 12
7501 The Degree Project-Course in Swedish Teacher Education – Deliberative and Transformative Perspectives on the Formative Assessment Practice
Authors: Per Blomqvist
Abstract:
The overall aim of this study is to highlight how the degree project-course in teacher education has developed over time at Swedish universities, particularly regarding changes in the formative assessment practices in relation to students' opportunities to take part in writing processes that can develop their independent critical thinking, subject knowledge, and academic writing skills. Theoretically, the study is based on deliberative and transformative perspectives on teaching academic writing in higher education. The deliberative perspective is motivated by the fact that universities and their departments are responsible for giving students opportunities to develop their academic writing skills, while there is little guidance on how this can be implemented. The transformative perspective is motivated by the fact that education needs to be adapted to students' prior knowledge and developed in relation to the student group. Given the academisation of teacher education and the new student groups, this is a necessity. The empirical data consist of video recordings of teacher groups' conversations at three Swedish universities. The conversations were conducted as so-called collective remembering interviews, a method for stimulating the participants' memory through social interaction, and focused on how the degree project-course in teacher education has changed over time. Topic analysis was used to analyze the conversations in order to identify common descriptions and expressions among the teachers. The results highlight strong similarities in how the degree project-course has changed over time, from both a deliberative and a transformative perspective. The course is characterized by a "strong framing," in which the teachers exercise great control over the work through detailed instructions for the writing process and detailed templates for the text.
This is justified by the need to adapt the education to student teachers' lack of prior subject knowledge. The strong framing places high demands on continuous discussions between teachers about, for example, which tools the students bring with them and which linguistic and textual tools the education offers. The teachers describe that such governance often leads to conflicts between teachers from different departments, because reading and writing are always part of cultural contexts and are linked to different knowledge, traditions, and values. The problem made visible in this study raises questions about how students' opportunities to develop independence and make critical judgments in academic writing are affected if the writing becomes too controlled and if passing students becomes the main goal of the education.
Keywords: formative assessment, academic writing, degree project, higher education, deliberative perspective, transformative perspective
Procedia PDF Downloads 68
7500 Flow-Oriented Incentive Spirometry in the Reversal of Diaphragmatic Dysfunction in Bariatric Surgery Postoperative Period
Authors: Eli Maria Forti-Pazzianotto, Carolina Moraes Da Costa, Daniela Faleiros Berteli Merino, Maura Rigoldi Simões Da Rocha, Irineu Rasera-Junior
Abstract:
There is no conclusive evidence to support the use of one type or brand of incentive spirometer over others. The decision as to which device is best has been based on empirical assessment of patient acceptance, ease of use, and cost. The aim was to evaluate the effects of two breathing-exercise methodologies, performed with flow-oriented incentive spirometers, on the reversal of diaphragmatic dysfunction after bariatric surgery. Thirty-eight morbidly obese women were selected. Respiratory muscle strength was evaluated through nasal inspiratory pressure (NIP), and respiratory muscle endurance through an incremental test measuring sustained maximal inspiratory pressure (SMIP). The volunteers were randomized into two groups: (1) Respiron® Classic (RC), in which inspirations were slow, deep, and sustained for as long as possible (at least 5 s); (2) Respiron® Athletic 1 (RA1), in which inspirations were quick, intense, and explosive, raising the device's balls in an explosive manner. Both groups performed six sets of 15 repetitions with intervals of 30 to 60 seconds. At the end of the intervention program (second postoperative day), the volunteers were reevaluated. The groups were homogeneous with regard to the initial assessment. On reevaluation, however, the RC group showed a significant decline in NIP (p < 0.0001) and SMIP (p = 0.0004). In the RA1 group, SMIP was maintained (p = 0.5076) after surgery. The use of the Respiron® Athletic 1, together with the methodology applied, can help preserve inspiratory muscle endurance and mitigate diaphragmatic dysfunction in the postoperative period.
Keywords: bariatric surgery, incentive spirometry, respiratory muscle, physiotherapy
Procedia PDF Downloads 374