Search results for: hybrid learning
3790 Intensification of Wet Air Oxidation of Landfill Leachate Reverse Osmosis Concentrates
Authors: Emilie Gout, Mathias Monnot, Olivier Boutin, Pierre Vanloot, Philippe Moulin
Abstract:
Water is a precious resource. Treating industrial wastewater remains a considerable technical challenge of our century. The effluent considered in this study is landfill leachate treated by reverse osmosis (RO). Nowadays, in most developed countries, sanitary landfilling is the main method of dealing with municipal solid waste. Rainwater percolates through solid waste, generating leachates mostly comprised of organic and inorganic matter. As leachate ages, its composition varies, becoming more and more bio-refractory. RO is already used for landfill leachates as it generates good-quality permeate. However, its main drawback is the production of highly polluted concentrates that cannot be discharged into the environment or reused, which is an important industrial issue. It is against this background that the study of coupling RO with wet air oxidation (WAO) was set up to intensify and optimize processes to meet current regulations for water discharge into the environment. WAO is widely studied for effluents containing bio-refractory compounds. Oxidation consists of a destruction reaction capable, when complete, of mineralizing the recalcitrant organic fraction of the pollution into carbon dioxide and water. The WAO process in subcritical conditions requires high energy consumption, but it can be autothermic in a certain range of chemical oxygen demand (COD) concentrations (10-100 g.L⁻¹). Appropriate COD concentrations are reached in landfill leachate RO concentrates. Therefore, the purpose of this work is to report the performance of mineralization during WAO on RO concentrates. The coupling of RO/WAO has shown promising results in previous works on both synthetic and real effluents in terms of total organic carbon (TOC) reduction by WAO and retention by RO. Non-catalytic WAO with air as the oxidizer was performed in a lab-scale stirred autoclave (1 L) on landfill leachate RO concentrates collected in different seasons at a sanitary landfill in southern France. 
The yield of WAO depends on operating parameters such as total pressure, temperature, and time. The composition of the effluent is also an important aspect for process intensification. An experimental design methodology was used to minimize the number of experiments whilst finding the operating conditions achieving the best pollution reduction. The simulation led to a set of 18 experiments, and the responses used to highlight process efficiency are pH, conductivity, turbidity, COD, TOC, and inorganic carbon. A 70% oxygen excess was chosen for all the experiments. First experiments showed that COD and TOC abatements of at least 70% were obtained after 90 min at 300°C and 20 MPa, which attested to the feasibility of treating RO leachate concentrates with WAO. In order to meet French regulations and validate process intensification with industrial effluents, continuous experiments in a bubble column are foreseen, and further analyses will be performed, such as biological oxygen demand and study of gas composition. Meanwhile, other industrial effluents are being treated to compare RO-WAO performances. These effluents, coming from the pharmaceutical, petrochemical, and tertiary wastewater industries, present different specific pollutants that will provide a better comprehension of the hybrid process and prove the intensification and feasibility of the process at an industrial scale. Acknowledgments: This work has been supported by the French National Research Agency (ANR) for the Project TEMPO under the reference number ANR-19-CE04-0002-01.
Keywords: hybrid process, landfill leachates, process intensification, reverse osmosis, wet air oxidation
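The abatement figures reported above reduce to simple arithmetic; the following minimal sketch (in Python, with hypothetical concentrations, not measurements from the study) shows how a COD or TOC abatement would be checked against the 70% threshold:

```python
def abatement(initial, final):
    """Fractional pollution abatement, e.g. for COD or TOC in g/L."""
    if initial <= 0:
        raise ValueError("initial concentration must be positive")
    return (initial - final) / initial

# Hypothetical example: a concentrate at 45 g/L COD oxidized down to 11 g/L
# exceeds the 70% abatement threshold reported above.
print(abatement(45.0, 11.0) > 0.70)
```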
Procedia PDF Downloads 137
3789 A Conundrum of Teachability and Learnability of Deaf Adult English as Second Language Learners in Pakistani Mainstream Classrooms: Integration or Elimination
Authors: Amnah Moghees, Saima Abbas Dar, Muniba Saeed
Abstract:
Teaching a second language to deaf learners has always been a challenge in Pakistan. Different approaches and strategies have been followed, but they have resulted in partial or complete failure. The study aims to investigate the language problems faced by adult deaf learners of English as a second language in mainstream classrooms. Moreover, the study also determines the factors which are heavily involved in language teaching and learning in mainstream classes. To investigate the language problems, data will be collected through writing samples of ten deaf adult learners and ten hearing ESL learners of the same class; whereas observation in inclusive language teaching classrooms and interviews with five ESL teachers of inclusive classes will be conducted to identify the factors which are directly or indirectly involved in inclusive language education. In keeping with this aim, a qualitative research paradigm will be applied to analyse the corpus. The study figures out that deaf ESL learners face severe language issues, such as odd sentence structures, subject-verb agreement violations, and misappropriation of verb forms and tenses, as compared to hearing ESL learners. The study also predicts that in mainstream classrooms there are multiple factors which affect the smoothness of the teaching and learning procedure: the role of the mediator, the level of deaf learners, the empathy of hearing learners towards deaf learners, and language teachers' training.
Keywords: deaf English language learner, empathy, mainstream classrooms, previous language knowledge of learners, role of mediator, language teachers' training
Procedia PDF Downloads 166
3788 A Socio-Cultural Approach to Implementing Inclusive Education in South Africa
Authors: Louis Botha
Abstract:
Since the presentation of South Africa’s inclusive education strategy in Education White Paper 6 in 2001, very little has been accomplished in terms of its implementation. The failure to achieve the goals set by this policy document is related to teachers lacking confidence and knowledge about how to enact inclusive education, as well as challenges of inflexible curricula, limited resources in overcrowded classrooms, and so forth. This paper presents a socio-cultural approach to addressing these challenges of implementing inclusive education in the South African context. It takes its departure from the view that inclusive education has been adequately theorized and conceptualized in terms of its philosophical and ethical principles, especially in South African policy and debates. What is missing, however, are carefully theorized, practically implementable research interventions which can address the concerns mentioned above. Drawing on socio-cultural principles of learning and development and on cultural-historical activity theory (CHAT) in particular, this paper argues for the use of formative interventions which introduce appropriately constructed mediational artifacts that have the potential to initiate inclusive practices and pedagogies within South African schools and classrooms. It makes use of Vygotsky’s concept of double stimulation to show how the proposed artifacts could instigate forms of transformative agency which promote the adoption of inclusive cultures of learning and teaching.
Keywords: cultural-historical activity theory, double stimulation, formative interventions, transformative agency
Procedia PDF Downloads 233
3787 Satellite Connectivity for Sustainable Mobility
Authors: Roberta Mugellesi Dow
Abstract:
As the climate crisis becomes unignorable, it is imperative that new services are developed addressing not only the needs of customers but also their impact on the environment. The Telecommunication and Integrated Application (TIA) Directorate of ESA is supporting the green transition with particular attention to sustainable mobility. “Accelerating the shift to sustainable and smart mobility” is at the core of the European Green Deal strategy, which seeks a 90% reduction in transport-related emissions by 2050. Transforming the way that people and goods move is essential to increasing mobility while decreasing environmental impact, and transport must be considered holistically to produce a shared vision of green intermodal mobility. The use of space technologies, integrated with terrestrial technologies, is an enabler of smarter traffic management and increased transport efficiency for automated and connected multimodal mobility. Satellite connectivity, including future 5G networks, and digital technologies such as digital twins, AI, machine learning, and cloud-based applications are key enablers of sustainable mobility. SatCom is essential to ensure that connectivity is ubiquitously available, even in remote and rural areas or in case of a failure, through the convergence of terrestrial and SatCom connectivity networks. This is especially crucial when there are risks of network failures or cyber-attacks targeting terrestrial communication. SatCom ensures communication network robustness and resilience. The combination of terrestrial and satellite communication networks is making possible intelligent and ubiquitous V2X systems and PNT services with significantly enhanced reliability and security, hyper-fast wireless access, and much more seamless communication coverage. SatNav is essential in providing accurate tracking and tracing capabilities for automated vehicles and in guiding them to target locations. 
SatNav can also enable location-based services like car-sharing applications, parking assistance, and fare payment. In addition to GNSS receivers, wireless connections, radar, lidar, and other installed sensors can enable automated vehicles to monitor their surroundings, to ‘talk to each other’ and with infrastructure in real time, and to respond to changes instantaneously. SatEO can be used to provide the maps required for traffic management, as well as to evaluate conditions on the ground, assess changes, and provide key data for monitoring and forecasting air pollution and other important parameters. Earth-observation-derived data are used to provide meteorological information such as wind speed and direction, humidity, and other parameters that must be incorporated into models contributing to traffic management services. The paper will provide examples of services and applications that have been developed with the aim of identifying innovative solutions and new business models enabled by new digital technologies, engaging the space and non-space ecosystems together to deliver value and provide innovative, greener solutions in the mobility sector. Examples include connected autonomous vehicles, electric vehicles, green logistics, and others. Relevant technologies include hybrid SatCom and 5G providing ubiquitous coverage, IoT integration with non-space technologies, as well as navigation and PNT technology, and other space data.
Keywords: sustainability, connectivity, mobility, satellites
Procedia PDF Downloads 133
3786 Maker-Based Learning in Secondary Mathematics: Investigating Students’ Proportional Reasoning Understanding through Digital Making
Authors: Juan Torralba
Abstract:
Student digital artifacts were investigated, utilizing a qualitative exploratory research design, to understand the ways in which students represented their knowledge of seventh-grade proportionality concepts as they participated in maker-based activities that culminated in the creation of digital 3-dimensional models of their dream homes. Representations of the geometric and numeric dimensions of proportionality were analyzed in the written, verbal, and visual data collected from the students. A directed content analysis approach was utilized in the data analysis, as this work aimed to build upon existing research in the field of maker-based STEAM education. The results from this work show that students can represent their understanding of proportional reasoning through open-ended written responses more accurately than through verbal descriptions or digital artifacts. The geometric and numeric dimensions of proportionality, and their respective components of attributes-of-similarity representation and percents, rates, and ratios representations, were represented by the students more than any others across the data, suggesting that a maker-based instructional approach to teaching proportionality in the middle grades may be promising in helping students gain a solid foundation in those components. Recommendations for practice and research are discussed.
Keywords: learning through making, maker-based education, maker education in the middle grades, making in mathematics, the maker movement
Procedia PDF Downloads 71
3785 Learning English from Movies: An Exploratory Study
Authors: Yasamiyan Alolaywi
Abstract:
The sources of second language acquisition vary and depend on a learner’s preferences and choices; however, undoubtedly, the most effective methods provide authentic language input. This current study explores the effectiveness of watching movies as a means of English language acquisition. It explores university students’ views on the impact of this method in improving English language skills. The participants in this study were 74 students (25 males and 49 females) from the Department of English Language and Translation at Qassim University, Saudi Arabia. Data for this research were collected from questionnaires and individual interviews with several selected students. The findings of this study showed that many students watch movies frequently and for various purposes, the most important of which is entertainment. The students also admitted that movies help them acquire a great deal of vocabulary and develop their listening and writing skills. Also, the participants believed that exposure to a target language by native speakers helps enhance language fluency and proficiency. The students learn not only linguistic aspects from films but also other aspects, such as culture, lifestyle, and ways of thinking, in addition to learning other languages such as Spanish. In light of these results, some recommendations are proposed, such as verifying the feasibility of integrating media into a foreign language classroom. While this study covers aspects of the relationship between watching movies and English language acquisition, knowledge gaps remain that need to be filled by further research, such as on incorporating media into the educational process and how movie subtitles can improve learners’ language skills.
Keywords: language acquisition, English movies, EFL learners, perceptions
Procedia PDF Downloads 101
3784 Learning from Dendrites: Improving the Point Neuron Model
Authors: Alexander Vandesompele, Joni Dambre
Abstract:
The diversity in dendritic arborization, as first illustrated by Santiago Ramon y Cajal, has always suggested a role for dendrites in the functionality of neurons. In the past decades, thanks to new recording techniques and optical stimulation methods, it has become clear that dendrites are not merely passive electrical components. They are observed to integrate inputs in a non-linear fashion and actively participate in computations. Regardless, in simulations of neural networks, dendritic structure and functionality are often overlooked. Especially in a machine learning context, when designing artificial neural networks, point neuron models such as the leaky-integrate-and-fire (LIF) model are dominant. These models mimic the integration of inputs at the neuron soma, and ignore the existence of dendrites. In this work, the LIF point neuron model is extended with a simple form of dendritic computation. This gives the LIF neuron increased capacity to discriminate spatiotemporal input sequences, a dendritic functionality as observed in another study. Simulations of the spiking neurons are performed using the Bindsnet framework. In the common LIF model, incoming synapses are independent. Here, we introduce a dependency between incoming synapses such that the post-synaptic impact of a spike is not only determined by the weight of the synapse, but also by the activity of other synapses. This is a form of short-term plasticity where synapses are potentiated or depressed by the preceding activity of neighbouring synapses. This is a straightforward way to prevent inputs from simply summing linearly at the soma. To implement this, each pair of synapses on a neuron is assigned a variable representing the synaptic relation. This variable determines the magnitude of the short-term plasticity. These variables can be chosen randomly or, more interestingly, can be learned using a form of Hebbian learning. 
We use spike-timing-dependent plasticity (STDP), commonly used to learn synaptic strength magnitudes. If all neurons in a layer receive the same input, they tend to learn the same patterns through STDP. Adding inhibitory connections between the neurons creates a winner-take-all (WTA) network. This causes the different neurons to learn different input sequences. To illustrate the impact of the proposed dendritic mechanism, even without learning, we attach five input neurons to two output neurons. One output neuron is a regular LIF neuron; the other output neuron is a LIF neuron with dendritic relationships. Then, the five input neurons are allowed to fire in a particular order. The membrane potentials are reset, and subsequently the five input neurons are fired in the reversed order. As the regular LIF neuron linearly integrates its inputs at the soma, the membrane potential response to both sequences is similar in magnitude. In the other output neuron, due to the dendritic mechanism, the membrane potential response is different for the two sequences. Hence, the dendritic mechanism improves the neuron’s capacity for discriminating spatiotemporal sequences. Dendritic computations improve LIF neurons even if the relationships between synapses are established randomly. Ideally, however, a learning rule is used to improve the dendritic relationships based on input data. It is possible to learn synaptic strength with STDP, to make a neuron more sensitive to its input. Similarly, it is possible to learn dendritic relationships with STDP, to make the neuron more sensitive to spatiotemporal input sequences. Feeding structured data to a WTA network with dendritic computation leads to a significantly higher number of discriminated input patterns. Without the dendritic computation, output neurons are less specific and may, for instance, be activated by a sequence in reverse order.
Keywords: dendritic computation, spiking neural networks, point neuron model
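The synapse-interaction idea described above can be sketched compactly. The following Python snippet is an illustrative stand-in, not the authors' Bindsnet implementation: each pair of synapses carries a relation value, and a spike's post-synaptic impact is scaled by the decaying activity traces of the other synapses. All names and constants are assumptions.

```python
import numpy as np

class DendriticLIF:
    """Leaky integrator whose synapses interact: a spike's impact is scaled
    by the recent activity traces of the other synapses via a relation
    matrix R. A minimal sketch of the mechanism described above; the leak
    and trace constants are illustrative, not taken from the paper."""

    def __init__(self, n_syn, weights, relations, leak=0.9, tau=0.9):
        self.w = np.asarray(weights, float)    # synaptic weights
        self.R = np.asarray(relations, float)  # pairwise synapse relations
        self.trace = np.zeros(n_syn)           # decaying pre-synaptic traces
        self.leak = leak                       # membrane leak per step
        self.tau = tau                         # trace decay per step
        self.v = 0.0                           # membrane potential

    def step(self, spikes):
        spikes = np.asarray(spikes, float)
        # each incoming spike is potentiated/depressed by neighbouring traces
        gain = 1.0 + self.R @ self.trace
        self.v = self.leak * self.v + np.sum(self.w * gain * spikes)
        self.trace = self.tau * self.trace + spikes
        return self.v
```

With `relations` set to zero the neuron reduces to a plain leaky integrator and responds identically to an input sequence and its reverse; a non-symmetric relation matrix breaks that symmetry, which is the order-discrimination effect described above.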
Procedia PDF Downloads 133
3783 Accurate Mass Segmentation Using U-Net Deep Learning Architecture for Improved Cancer Detection
Authors: Ali Hamza
Abstract:
Accurate segmentation of breast ultrasound images is of paramount importance in enhancing the diagnostic capabilities of breast cancer detection. This study presents an approach utilizing the U-Net architecture for segmenting breast ultrasound images, aimed at improving the accuracy and reliability of mass identification within the breast tissue. The proposed method encompasses a multi-stage process. Initially, preprocessing techniques are employed to refine image quality and diminish noise interference. Subsequently, the U-Net architecture, a deep learning convolutional neural network (CNN), is employed for pixel-wise segmentation of regions of interest corresponding to potential breast masses. The U-Net's distinctive architecture, characterized by a contracting and an expansive pathway, enables accurate boundary delineation and detailed feature extraction. To evaluate the effectiveness of the proposed approach, an extensive dataset of breast ultrasound images encompassing diverse cases is employed. Quantitative performance metrics such as the Dice coefficient, Jaccard index, sensitivity, specificity, and Hausdorff distance are used to comprehensively assess the segmentation accuracy. Comparative analyses against traditional segmentation methods showcase the superiority of the U-Net architecture in capturing intricate details and accurately segmenting breast masses. The outcomes of this study emphasize the potential of the U-Net-based segmentation approach in bolstering breast ultrasound image analysis. The method's ability to reliably pinpoint mass boundaries holds promise for aiding radiologists in precise diagnosis and treatment planning. However, further validation and integration within clinical workflows are necessary to ascertain the method's practical clinical utility and facilitate seamless adoption by healthcare professionals. 
In conclusion, leveraging the U-Net architecture for breast ultrasound image segmentation showcases a robust framework that can significantly enhance diagnostic accuracy and advance the field of breast cancer detection. This approach represents a pivotal step towards empowering medical professionals with a more potent tool for early and accurate breast cancer diagnosis.
Keywords: image segmentation, U-Net, deep learning, breast cancer detection, diagnostic accuracy, mass identification, convolutional neural network
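Two of the overlap metrics named above have standard definitions that are easy to state precisely. A minimal NumPy sketch (textbook formulas, not the authors' evaluation code):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice overlap between binary masks: 2*|A∩B| / (|A| + |B|)."""
    pred, target = np.asarray(pred, bool), np.asarray(target, bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def jaccard_index(pred, target, eps=1e-7):
    """Jaccard index (IoU) between binary masks: |A∩B| / |A∪B|."""
    pred, target = np.asarray(pred, bool), np.asarray(target, bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (inter + eps) / (union + eps)
```

The small `eps` guards against division by zero when both masks are empty; identical non-empty masks score 1.0 under both metrics.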
Procedia PDF Downloads 84
3782 A Machine Learning Approach for Earthquake Prediction in Various Zones Based on Solar Activity
Authors: Viacheslav Shkuratskyy, Aminu Bello Usman, Michael O’Dea, Saifur Rahman Sabuj
Abstract:
This paper examines relationships between solar activity and earthquakes by applying four machine learning techniques: k-nearest neighbour, support vector regression, random forest regression, and a long short-term memory network. Data from the SILSO World Data Center, the NOAA National Center, the GOES satellite, NASA OMNIWeb, and the United States Geological Survey were used for the experiment. The 23rd and 24th solar cycles, daily sunspot number, solar wind velocity, proton density, and proton temperature were all included in the dataset. The study also examined sunspots, solar wind, and solar flares, which all reflect solar activity, and the earthquake frequency distribution by magnitude and depth. The findings showed that the long short-term memory network model predicts earthquakes more accurately than the other models applied in the study, and that solar activity is more likely to affect earthquakes of lower magnitude and shallow depth than earthquakes of magnitude 5.5 or larger at intermediate and deep depths.
Keywords: k-nearest neighbour, support vector regression, random forest regression, long short-term memory network, earthquakes, solar activity, sunspot number, solar wind, solar flares
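Of the four models compared, k-nearest neighbour regression is the simplest to sketch: predict the mean target of the k closest training points. A pure-NumPy illustration (the study's actual features, distance metric, and hyperparameters are not reproduced here):

```python
import numpy as np

def knn_regress(X_train, y_train, x, k=3):
    """k-nearest-neighbour regression: average the targets of the k
    training samples closest (Euclidean distance) to query point x."""
    d = np.linalg.norm(np.asarray(X_train, float) - np.asarray(x, float),
                       axis=1)
    idx = np.argsort(d)[:k]          # indices of the k nearest neighbours
    return float(np.mean(np.asarray(y_train, float)[idx]))
```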
Procedia PDF Downloads 73
3781 Determining Variables in Mathematics Performance According to Gender in Mexican Elementary School
Authors: Nora Gavira Duron, Cinthya Moreda Gonzalez-Ortega, Reyna Susana Garcia Ruiz
Abstract:
The objective of this paper is to analyze mathematics performance in the Learning Evaluation National Plan (PLANEA, for its Spanish initials: Plan Nacional para la Evaluación de los Aprendizajes), applied to Mexican students enrolled in the last elementary-school year during the 2017-2018 academic year. The test was conducted nationwide in 3,573 schools, using a sample of 108,083 students, whose average score in mathematics, on a scale of 0 to 100, was 45.6 points. 75% of the sample analyzed did not reach the sufficiency level (60 points). It should be noted that only 2% scored 90 or higher. Performance is analyzed while considering whether there are differences in gender, marginalization level, public or private school enrollment, parents’ academic background, and whether students live with their parents. Likewise, the impact of these variables (among others) on school performance by gender is evaluated using multivariate logistic (Logit) regression analysis. The results show there are no significant differences in mathematics performance regarding gender in elementary school; nevertheless, the impact exerted by mothers who studied at least to high school level is of great relevance for students, particularly for girls. Other determining variables are students’ resilience, their parents’ economic status, and attendance at private schools, strengthened by the mother's education.
Keywords: multivariate regression analysis, academic performance, learning evaluation, mathematics result per gender
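The multivariate Logit model used above can be sketched in a few lines. The following gradient-ascent fit is illustrative only; the variable coding, sample, and estimation details are assumptions, not the authors':

```python
import numpy as np

def fit_logit(X, y, lr=0.1, steps=5000):
    """Minimal logistic (Logit) regression fitted by gradient ascent on the
    log-likelihood; y is a 0/1 outcome such as reaching the 60-point
    sufficiency level, X the explanatory variables."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, float)])  # intercept
    y = np.asarray(y, float)
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ beta))       # predicted probabilities
        beta += lr * X.T @ (y - p) / len(y)       # likelihood gradient step
    return beta
```

The fitted coefficients are log-odds effects; exponentiating them gives the odds ratios typically reported for this kind of analysis.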
Procedia PDF Downloads 147
3780 Encephalon-An Implementation of a Handwritten Mathematical Expression Solver
Authors: Shreeyam, Ranjan Kumar Sah, Shivangi
Abstract:
Recognizing and solving handwritten mathematical expressions can be a challenging task, particularly the segmentation and classification of individual characters. This project proposes a solution that uses a Convolutional Neural Network (CNN) and image processing techniques to accurately solve various types of equations, including arithmetic, quadratic, and trigonometric equations, as well as logical operations like logical AND, OR, NOT, NAND, XOR, and NOR. The proposed solution also provides a graphical solution, allowing users to visualize equations and their solutions. In addition to equation solving, the platform, called CNNCalc, offers a comprehensive learning experience for students. It provides educational content, a quiz platform, and a coding platform for practicing programming skills in different languages like C, Python, and Java. This all-in-one solution makes the learning process engaging and enjoyable for students. The proposed methodology includes horizontal compact projection analysis and survey for segmentation and binarization, as well as connected component analysis and integrated connected component analysis for character classification. The compact projection algorithm compresses the horizontal projections to remove noise and obtain a clearer image, contributing to the accuracy of character segmentation. Experimental results demonstrate the effectiveness of the proposed solution in solving a wide range of mathematical equations. CNNCalc provides a powerful and user-friendly platform for solving equations, learning, and practicing programming skills. With its comprehensive features and accurate results, CNNCalc is poised to revolutionize the way students learn and solve mathematical equations. The platform utilizes a custom-designed Convolutional Neural Network (CNN) with image processing techniques to accurately recognize and classify symbols within handwritten equations. 
The compact projection algorithm effectively removes noise from horizontal projections, leading to clearer images and improved character segmentation. Experimental results demonstrate the accuracy and effectiveness of the proposed solution in solving a wide range of equations, including arithmetic, quadratic, trigonometric, and logical operations. CNNCalc features a user-friendly interface with a graphical representation of the equations being solved, making it an interactive and engaging learning experience for users. The platform also includes tutorials, testing capabilities, and programming features in languages such as C, Python, and Java. Users can track their progress and work towards improving their skills. CNNCalc is poised to revolutionize the way students learn and solve mathematical equations with its comprehensive features and accurate results.
Keywords: AI, ML, handwritten equation solver, maths, computer, CNNCalc, convolutional neural networks
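The projection-based segmentation step described above can be illustrated simply: sum the ink along one axis and cut the image wherever the projection falls to (near) zero. This sketch is a simplified stand-in for the compact projection algorithm, which additionally compresses the projections to suppress noise:

```python
import numpy as np

def segment_by_projection(img, thresh=0):
    """Split a binary image into horizontal bands whose row-wise ink count
    (the projection) exceeds `thresh`; returns (start, end) row ranges."""
    proj = np.asarray(img).sum(axis=1)   # ink per row
    bands, start = [], None
    for i, v in enumerate(proj):
        if v > thresh and start is None:
            start = i                    # band opens
        elif v <= thresh and start is not None:
            bands.append((start, i))     # band closes
            start = None
    if start is not None:
        bands.append((start, len(proj)))
    return bands
```

In a full pipeline, each band (or, with column sums, each character) would then be cropped, normalized, and passed to the CNN classifier.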
Procedia PDF Downloads 123
3779 Like Life Itself: Elemental Affordances in the Creation of Transmedia Storyworlds-The Four Broken Hearts Case Study
Authors: Muhammad Babar Suleman
Abstract:
Transgressing the boundaries of the real and the virtual, the temporal and the spatial and the personal and the political, Four Broken Hearts is a hybrid storyworld encompassing film, live performance, location-based experiences and social media. The project is scheduled for launch early next year and is currently a work-in-progress undergoing initial user testing. The story of Four Broken Hearts is being told by taking each of the classic elements of fiction- character, setting, exposition, climax and denouement - and bringing them ‘to life’ in the medium that conveys them to the highest degree of mimesis: Characters are built and explored through social media, Setting is experienced through location-based storytelling, the Backstory is fleshed out using film and the Climax is performed as an immersive drama. By taking advantage of what each medium does best while complementing the other mediums, Four Broken Hearts is presented in the form of a rich transmedia experience that allows audiences to explore the story world across many different platforms while still tying it all together within a cohesive narrative. This article presents an investigation of the project’s narrative outputs produced so far.
Keywords: narratology, storyworld, transmedia, narrative, storytelling
Procedia PDF Downloads 312
3778 Presentation of a Mix Algorithm for Estimating the Battery State of Charge Using Kalman Filter and Neural Networks
Authors: Amin Sedighfar, M. R. Moniri
Abstract:
Determination of the state of charge (SOC) has become an increasingly important issue in all applications that include a battery. In fact, estimation of the SOC is a fundamental need for the battery, which is the most important energy storage component in hybrid electric vehicles (HEVs), smart grid systems, drones, UPS, and so on. For those applications, the SOC estimation algorithm is expected to be precise and easy to implement. This paper presents an online method for the estimation of the SOC of valve-regulated lead-acid (VRLA) batteries. The proposed method uses the well-known Kalman filter (KF) and neural networks (NNs), and all of the simulations have been done with MATLAB software. The NN is trained offline using data collected from the battery discharging process. A generic cell model is used, whose underlying dynamic behavior uses two capacitors (bulk and surface) and three resistors (terminal, surface, and end), where the SOC determined from the voltage represents the bulk capacitor. The aim of this work is to compare the performance of conventional integration-based SOC estimation methods with the mixed algorithm. Moreover, by including the effect of temperature, the final result becomes more accurate.
Keywords: Kalman filter, neural networks, state-of-charge, VRLA battery
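A one-dimensional Kalman filter over the SOC conveys the core of the approach: coulomb counting serves as the process model, and an externally supplied SOC reading (here standing in for the NN- or voltage-derived estimate) serves as the measurement. This is an illustrative sketch; the noise variances and the cell model are assumptions, not the paper's values:

```python
def kalman_soc(soc0, currents, dt, capacity, soc_meas, q=1e-5, r=1e-2):
    """Scalar Kalman filter for SOC (0..1). currents: discharge current in
    A per step; dt: step in s; capacity: cell capacity in ampere-seconds;
    soc_meas: per-step SOC measurements; q, r: process/measurement noise."""
    soc, p = soc0, 1.0                  # state estimate and its variance
    history = []
    for i, z in zip(currents, soc_meas):
        # predict: coulomb counting drains the estimate
        soc -= i * dt / capacity
        p += q
        # update: blend in the measured SOC via the Kalman gain
        k = p / (p + r)
        soc += k * (z - soc)
        p *= (1.0 - k)
        history.append(soc)
    return history
```

In the mixed algorithm described above, the measurement channel would come from the offline-trained NN rather than a fixed sensor reading, letting the filter correct the drift that pure integration-based (coulomb-counting) methods accumulate.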
Procedia PDF Downloads 192
3777 Reconstructability Analysis for Landslide Prediction
Authors: David Percy
Abstract:
Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot-encoding-type schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy but with the advantage of a completely transparent model. The result of an RA session with a data set is a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of states can be examined.
Keywords: reconstructability analysis, machine learning, landslides, raster analysis
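The binning step that RA requires for continuous layers can be sketched directly. Equal-width binning is only one of several possible discretizations and is assumed here purely for illustration:

```python
import numpy as np

def bin_layer(values, n_bins):
    """Discretize a continuous raster layer (e.g. porosity) into equal-width
    bins, yielding integer classes 0..n_bins-1 that can enter an RA model
    alongside inherently categorical layers like bedrock type."""
    values = np.asarray(values, float)
    edges = np.linspace(values.min(), values.max(), n_bins + 1)
    # digitize against the interior edges; the maximum value falls on the
    # rightmost edge, so clip folds it into the last bin
    return np.clip(np.digitize(values, edges[1:-1]), 0, n_bins - 1)
```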
Procedia PDF Downloads 66
3776 Development of Advanced Virtual Radiation Detection and Measurement Laboratory (AVR-DML) for Nuclear Science and Engineering Students
Authors: Lily Ranjbar, Haori Yang
Abstract:
Online education has been around for several decades, but its importance became evident after the COVID-19 pandemic. Even though the online delivery approach works well for knowledge building through content delivery and oversight processes, it has limitations in developing hands-on laboratory skills, especially in the STEM fields. During the pandemic, many educational institutions faced numerous challenges in delivering lab-based courses. Many students worldwide were also unable to practice working with lab equipment due to social distancing or the significant cost of highly specialized equipment. The laboratory plays a crucial role in nuclear science and engineering education: it can engage students and improve their learning outcomes. In addition, online education and virtual labs have gained substantial popularity in engineering and science education. Therefore, developing virtual labs is vital for institutions to deliver high-class education to their students, including their online students. The School of Nuclear Science and Engineering (NSE) at Oregon State University, in partnership with the SpectralLabs company, has developed an Advanced Virtual Radiation Detection and Measurement Lab (AVR-DML) to offer a fully online Master of Health Physics (MHP) program. It was essential to use a system that could simulate nuclear modules that accurately replicate the underlying physics, the nature of radiation and radiation transport, and the mechanics of the instrumentation used in a real radiation detection lab. This was accomplished using the Realistic, Adaptive, Interactive Learning System (RAILS). RAILS is a comprehensive simulation-based learning system for use in training. It comprises a web-based learning management system located on a central server and a 3D simulation package downloaded locally to user machines.
Users will find that the graphics, animations, and sounds in RAILS create a realistic, immersive environment in which to practice detecting different radiation sources. These features allow students to interact and engage with a real STEM lab in all its dimensions: they feel as if they are in a real lab environment and see the same system they would in a physical lab. Unique interactive interfaces were designed and developed by integrating all the tools and equipment needed to run each lab. These interfaces provide students with full functionality for changing the experimental setup and for live data collection with real-time updates for each experiment. Students can perform all experimental setups and parameter changes in this lab manually. Experimental results can then be tracked and analyzed in an oscilloscope, a multi-channel analyzer, or a single-channel analyzer (SCA). The advanced virtual radiation detection and measurement laboratory developed in this study enabled the NSE school to offer a fully online MHP program. This flexibility of course modality helped attract more non-traditional students, including international students. It is a valuable educational tool, as students can walk around the virtual lab, make mistakes, and learn from them, and they have an unlimited amount of time to repeat and engage in experiments. This lab will also help speed up training in nuclear science and engineering.
Keywords: advanced radiation detection and measurement, virtual laboratory, realistic adaptive interactive learning system (rails), online education in stem fields, student engagement, stem online education, stem laboratory, online engineering education
Procedia PDF Downloads 90
3775 Glaucoma Detection in Retinal Tomography Using the Vision Transformer
Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan
Abstract:
Glaucoma is a chronic eye condition that causes irreversible vision loss. Early detection and treatment are critical to prevent vision loss because the disease can be asymptomatic. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire highly expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data is generated by augmenting the ocular images. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this task, as its self-attention mechanism supports structural modeling across the whole image. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning
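A minimal numpy sketch of the two operations at the heart of a ViT classifier as described above: splitting an image into patches and letting every patch attend to every other patch via self-attention. The image, dimensions, and random weights are illustrative only, not a trained glaucoma model.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.uniform(0, 1, (64, 64))        # toy grayscale stand-in for a fundus image
P, D = 16, 32                            # patch size, embedding dimension

# 1) Patchify: (64/16)^2 = 16 patches, each flattened to 256 pixel values
patches = img.reshape(4, P, 4, P).transpose(0, 2, 1, 3).reshape(16, P * P)
W_embed = rng.normal(0, 0.02, (P * P, D))
tokens = patches @ W_embed               # (16, D) patch embeddings

# 2) Single-head self-attention: every patch attends to every other patch,
#    which is how a ViT encodes long-range dependencies across the image
Wq, Wk, Wv = (rng.normal(0, 0.02, (D, D)) for _ in range(3))
Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
scores = Q @ K.T / np.sqrt(D)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)  # softmax over patches
out = attn @ V                           # (16, D) context-mixed tokens
```

In a full ViT, this block is repeated with multiple heads, residual connections, and MLP layers, and a classification token feeds the final normal/glaucoma decision.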
Procedia PDF Downloads 191
3774 Load Bearing Capacity and Operational Effectiveness of Single Shear Joints of CFRP Composite Laminate with Spread Tow Thin Plies
Authors: Tabrej Khan, Tamer A. Sebaey, Balbir Singh, M. A. Umarfarooq
Abstract:
Spread-tow thin-ply technology has enabled the development of optimized reinforced composite plies with ultra-low thicknesses. Composite bolted joints are widely used in the aircraft industry for load-bearing structures, and they are regarded as a primary source of stress concentration. The purpose of this study is to investigate the bearing strength and structural performance of single shear bolt joint configurations in composite laminates that combine conventional thin plies and thick plies in specific stacking sequences. The effect of the placement of the thin plies within the configured stack on bearing strength, as well as the potential damage, was investigated. Mechanical tests were used to understand the damage mechanisms of the plies and their interaction, as well as to characterize the single shear bolt joint properties and load-bearing capacity. The results showed that changing the configuration of the laminates by inserting the thin plies improved the bearing strength by up to 19%.
Keywords: hybrid composites, delamination, stress concentrations, mechanical testing, single bolt joint, thin-plies
Procedia PDF Downloads 64
3773 Lipid-polymer Nanocarrier Platform Enables X-Ray Induced Photodynamic Therapy against Human Colorectal Cancer Cells
Authors: Rui Sang, Fei Deng, Alexander Engel, Ewa M. Goldys, Wei Deng
Abstract:
In this study, we brought together X-ray induced photodynamic therapy (X-PDT) and the chemotherapy drug 5-FU for the treatment of colorectal cancer cells. This was achieved by developing a lipid-polymer hybrid nanoparticle delivery system (FA-LPNPs-VP-5-FU), prepared by incorporating a photosensitizer (verteporfin), a chemotherapy drug (5-FU), and a targeting moiety (folic acid) into one platform. The average size of these nanoparticles was around 100 nm with low polydispersity. When exposed to a clinical dose of 4 Gy X-ray radiation, FA-LPNPs-VP-5-FU generated sufficient amounts of reactive oxygen species, triggering the apoptosis and necrosis pathways of cancer cells. Our combined X-PDT and chemo-drug strategy was effective in inhibiting the growth and proliferation of cancer cells. Cell cycle analyses revealed that the treatment induced G2/M and S phase arrest in HCT116 cells. Our results indicate that this combined treatment provides a better antitumour effect in colorectal cancer cells than either of these modalities alone. This may offer a novel approach for effective colorectal cancer treatment with reduced off-target effects and drug toxicity.
Keywords: pdt, targeted lipid-polymer nanoparticles, verteporfin, colorectal cancer
Procedia PDF Downloads 76
3772 An Adaptive Hybrid Surrogate-Assisted Particle Swarm Optimization Algorithm for Expensive Structural Optimization
Authors: Xiongxiong You, Zhanwen Niu
Abstract:
Choosing an appropriate surrogate model plays an important role in surrogate-assisted evolutionary algorithms (SAEAs), since surrogate models come in many types and with different kernel functions. In this paper, a method for adaptively selecting the most suitable surrogate model is proposed to solve different kinds of expensive optimization problems. Firstly, according to the prediction residual error sum of squares (PRESS) and different model selection strategies, the best individual surrogate models are integrated into multiple ensemble models in each generation. Then, based on the minimum root mean square error (RMSE), the most suitable surrogate model is selected dynamically. Secondly, two methods with a dynamic number of models and selection strategies are designed, which are used to show the influence of the number of individual models and the selection strategy. Finally, comparative studies are carried out on several commonly used benchmark problems, as well as a rotor system optimization problem. The results demonstrate the accuracy and robustness of the proposed method.
Keywords: adaptive selection, expensive optimization, rotor system, surrogates assisted evolutionary algorithms
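The PRESS criterion mentioned above can be sketched with a leave-one-out loop: each candidate surrogate is refit with one sample held out, and the squared prediction errors are summed. The candidates below are simple polynomial fits standing in for the RBF/Kriging-type surrogates a real SAEA would use; the test function and sample sizes are illustrative.

```python
import numpy as np

def press(x, y, degree):
    """Prediction residual error sum of squares for a polynomial surrogate."""
    total = 0.0
    for i in range(len(x)):
        mask = np.arange(len(x)) != i              # leave sample i out
        coeffs = np.polyfit(x[mask], y[mask], degree)
        pred = np.polyval(coeffs, x[i])
        total += (y[i] - pred) ** 2
    return float(total)

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 30)
y = x**3 - x + rng.normal(0, 0.1, x.size)          # "expensive" function, sampled

candidates = [1, 2, 3, 5]                          # candidate model complexities
scores = {d: press(x, y, d) for d in candidates}
best = min(scores, key=scores.get)                 # lowest PRESS is kept
```

In the paper's setting this selection would run in every generation, with the retained surrogates combined into ensembles before the final RMSE-based choice.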
Procedia PDF Downloads 141
3771 Interaction between River and City Morphology
Authors: Ehsan Abshirini
Abstract:
Rivers, as one of the most important topographic factors, have played a strategic role not only in the appearance of cities but also in their structure and morphology. In this paper, the author intends to find out how a city, in its physical network, interacts with a river flowing inside it. The pilot study is Angers, a city in western France that is influenced by the Maine River. To this purpose, the space syntax method, integrated with GIS, is used to extract the properties of the physical form of cities in terms of global and local integration value, accessibility, and choice value. Simulating the absence of the river in this city and comparing the result to the current state of the city, according to the effect of the river on the morphology of the areas located on its different banks, is also of interest in this paper. The results show that although a river is not comparable to the city in terms of size and occupied area, it has a significant effect on the form of the city in both global and local properties. In addition, this study suggests that tracking the effect of rivers on cities and their interaction in a hybrid of space syntax and GIS may help researchers improve their interpretation of the physical form of these types of cities.
Keywords: river-cities, physical form, space syntax properties, GIS, topographic factor
Procedia PDF Downloads 427
3770 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information: for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data is used. Usually, however, the proportion of complete data is rather small, which leads to most information being neglected; moreover, the complete subset might be strongly distorted. In addition, the reason that data is missing might itself contain information, which is ignored with that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete data (the baseline), and how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set (the complete data) vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data.
After the optimal parameter set for each algorithm has been found, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on the imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Demand estimates derived from the whole data set are also much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
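The survival-function step above can be illustrated in a few lines: given a distribution of the maximum prices searchers are willing to pay, the survival function at an asking price is the share of searchers still "in the market" at that price. The lognormal parameters and the 3-room-flat framing are assumed for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
# Assumed lognormal willingness-to-pay for a flat with given characteristics (CHF)
max_prices = rng.lognormal(mean=np.log(2500), sigma=0.25, size=5000)

def survival(asking_price, wtp):
    """Share of searchers whose stated budget is at least the asking price."""
    return float(np.mean(wtp >= asking_price))

demand_at_2000 = survival(2000, max_prices)   # most searchers can afford this
demand_at_3000 = survival(3000, max_prices)   # well above the median budget
```

Repeating this per characteristic bundle (region, room count, ...) gives the functional relationship between characteristics and selling probabilities described above.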
Procedia PDF Downloads 285
3769 Interspecific Hybridization in Natural Sturgeon Populations of the Eastern Black Sea: The Consequence of Drastic Population Decline
Authors: Tamar Beridze, Elisa Boscari, Fleur Scheele, Tamari Edisherashvili, Cort Anderson, Leonardo Congiu
Abstract:
The eastern part of the Black Sea and its tributaries are suitable habitats for several sturgeon species, among which Acipenser gueldenstaedtii, A. stellatus, A. nudiventris, A. persicus, A. sturio, and H. huso are well documented. However, different threats have led these species into a dramatic decline; all of them are currently listed as Critically Endangered, and some are locally extinct in that area. We tested 94 wild sturgeon samples from the Black Sea and Rioni River by analyzing the mitochondrial Control Region and nuclear markers for hybrid identification. The data analyses (1) assessed mitochondrial diversity among samples, (2) identified their species, and (3) detected instances of hybridization. The data collected, besides confirming a sharp decrease in catches of Beluga and Stellate sturgeon in recent years, also revealed four juvenile hybrids between Russian and Stellate sturgeon, providing the first evidence of natural interspecific hybridization in the Rioni. The present communication raises concerns about the status of sturgeon species in this area and underlines the urgent need for conservation programs to restore self-sustaining populations.
Keywords: black sea, sturgeon, Rioni river, interspecific hybridization
Procedia PDF Downloads 136
3768 Education Delivery in Youth Justice Centres: Inside-Out Prison Exchange Program Pedagogy in an Australian Context
Authors: Tarmi A'Vard
Abstract:
This paper discusses the transformative learning experience of students participating in the Inside-Out Prison Exchange Program (Inside-Out) and explores the value this pedagogical approach may have in youth justice centres. Inside-Out is a semester-long university course which is unique in that it takes 15 university students, with their textbook and theory-based knowledge, behind the walls to study alongside 15 incarcerated students, who have lived experience of the criminal justice system. Inside-Out is currently offered in three Victorian prisons, expanding to five in 2020. The Inside-Out pedagogy, which is based on transformative dialogic learning, relies on participants sharing knowledge and experiences to develop an understanding and appreciation of the diversity and uniqueness of one another. Inside-Out offers the class an opportunity to create its own guidelines for dialogue, which can foster the students' sense of equality, something fundamental to the success of this program. Dialogue allows active participation by all parties in reconciling differences, collaborating on ideas, critiquing and developing hypotheses and public policies, and encouraging self-reflection and exploration. The structure of the program incorporates circular seating (where the students alternate between inside and outside), activities, individual reflective tasks, group work, and theory analysis. In this circle everyone is equal, including the educator, who serves as a facilitator more than in the traditional teacher role. A significant function of the circle is to develop a group consciousness, allowing the whole class to see itself as a collective in which no one person holds a superior role. This also encourages participants to be responsible and accountable for their behaviour and contributions. Research indicates that completing academic courses like Inside-Out contributes positively to reducing recidivism.
Inside-Out's benefits and success in many adult correctional institutions have been outlined in evaluation reports and scholarly articles. The key findings cover the learning experiences of the students in terms of both academic capability and professional practice and development. Furthermore, stereotypes and pre-determined ideas are challenged, critical thinking is promoted, and there is evidence of self-discovery and growth. There is empirical data supporting positive outcomes of education in youth justice centres in reducing recidivism and increasing the likelihood of returning to education upon release. Hence, this research could provide an opportunity to increase young people's engagement in education, which is a known protective factor for assisting young people to move away from criminal behaviour. In 2016, Tarmi completed the Inside-Out educator training in Philadelphia, Pennsylvania, and has developed an interest in exploring the pedagogy of Inside-Out, specifically targeting young offenders in a Youth Justice Centre.
Keywords: dialogic transformative learning, inside-out prison exchange program, prison education, youth justice
Procedia PDF Downloads 126
3767 Health Trajectory Clustering Using Deep Belief Networks
Authors: Farshid Hajati, Federico Girosi, Shima Ghassempour
Abstract:
We present a Deep Belief Network (DBN) method for clustering health trajectories. A Deep Belief Network is a deep architecture that consists of a stack of Restricted Boltzmann Machines (RBMs). In a deep architecture, each layer learns more complex features than the previous layers. The proposed method relies on the DBN for clustering without using the back propagation learning algorithm. The proposed DBN performs better than a deep neural network because of the initialization of the connecting weights. We use the Contrastive Divergence (CD) method for training the RBMs, which increases the performance of the network. The performance of the proposed method is evaluated extensively on the Health and Retirement Study (HRS) database. The University of Michigan Health and Retirement Study is a nationally representative longitudinal study that has surveyed more than 27,000 elderly and near-elderly Americans since its inception in 1992. Participants are interviewed every two years, and data are collected on physical and mental health, insurance coverage, financial status, family support systems, labor market status, and retirement planning. The dataset is publicly available, and we use the RAND HRS version L, which is an easy-to-use and cleaned-up version of the data. The size of the sample data set is 268, and the length of the trajectories is equal to 10. The trajectories do not stop when a patient dies and represent 10 different interviews of live patients. Compared to state-of-the-art benchmarks, the experimental results show the effectiveness and superiority of the proposed method in clustering health trajectories.
Keywords: health trajectory, clustering, deep learning, DBN
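A minimal sketch of one Contrastive Divergence (CD-1) update for a single RBM, the building block that is stacked to form the DBN described above. Layer sizes, learning rate, and data are toy values, not the HRS trajectories.

```python
import numpy as np

rng = np.random.default_rng(3)
n_visible, n_hidden = 10, 6              # e.g. a length-10 binary trajectory
W = rng.normal(0, 0.1, (n_visible, n_hidden))
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, lr=0.1):
    """One CD-1 step on a batch of visible vectors v0, shape (batch, n_visible)."""
    global W, b_v, b_h
    ph0 = sigmoid(v0 @ W + b_h)                       # P(h = 1 | v0)
    h0 = (rng.uniform(size=ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b_v)                     # one-step reconstruction
    ph1 = sigmoid(pv1 @ W + b_h)
    # Gradient approximation: <v h>_data - <v h>_reconstruction
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b_v += lr * (v0 - pv1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)
    return float(np.mean((v0 - pv1) ** 2))            # reconstruction error

data = (rng.uniform(size=(64, n_visible)) < 0.3).astype(float)
errors = [cd1_update(data) for _ in range(200)]       # error should trend down
```

In the DBN, each trained RBM's hidden activations become the visible input of the next RBM, so the whole stack is trained greedily with CD rather than back propagation.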
Procedia PDF Downloads 369
3766 Cyber Violence Behaviors Among Social Media Users in Ghana: An Application of Self-Control Theory and Social Learning Theory
Authors: Aisha Iddrisu
Abstract:
The proliferation of cyberviolence in the wave of increased social media consumption calls for immediate attention at both the local and global levels. With over 4.70 billion social media users worldwide and 8.8 million social media users in Ghana, various forms of violence have become the order of the day in most countries and communities. Cyberviolence is defined as producing, retrieving, and sharing hurtful or dangerous online content to cause emotional, psychological, or physical harm. The urgency and severity of cyberviolence have led to the enactment of laws in various countries, though much still needs to be done, especially in Ghana. In Ghana, studies on cyberviolence have not been extensive. Existing studies concentrate on only one form or another of cyberviolence, thus cybercrime or cyberbullying. Also, most studies in Africa have not explored the forms of cyberviolence using empirical theories; the few that exist were qualitative, whereas others examine the effects of cyberviolence rather than examining why those who engage in it behave the way they do. It is against this backdrop that this study aims to examine various cyberviolence behaviours among social media users in Ghana by applying self-control theory and social learning theory. This study is important for the following reasons. The outcome of this research will help at both the national and international levels of policymaking by adding to the knowledge and understanding of cyberviolence and why people engage in its various forms. It will also help expose other ways by which such behaviours are reinforced, thereby serving as a guide in the enactment of appropriate rules and laws to curb them, and it will add to the literature on the consequences of new media. This study seeks to confirm or reject the following research hypotheses:
H1: Social media usage has a direct significant effect on cyberviolence behaviours.
H2: Ineffective parental management has a direct significant positive relation to low self-control.
H3: Low self-control has a direct significant positive effect on cyberviolence behaviours among social media users in Ghana.
H4: Differential association has a significant positive effect on cyberviolence behaviour among social media users in Ghana.
H5: Definitions have a significant positive effect on cyberviolence behaviour among social media users in Ghana.
H6: Imitation has a significant positive effect on cyberviolence behaviour among social media users in Ghana.
H7: Differential reinforcement has a significant positive effect on cyberviolence behaviour among social media users in Ghana.
H8: Differential association has a significant positive effect on definitions.
H9: Differential association has a significant positive effect on imitation.
H10: Differential association has a significant positive effect on differential reinforcement.
H11: Differential association has significant indirect positive effects on cyberviolence through the learning process.
Keywords: cyberviolence, social media users, self-control theory, social learning theory
Procedia PDF Downloads 86
3765 Numerical Study of Dynamic Buckling of Fiber Metal Laminates's Profile
Authors: Monika Kamocka, Radoslaw Mania
Abstract:
The design of Fiber Metal Laminates (FMLs), combining thin aluminum sheets and prepreg layers, allows the creation of a hybrid structure with a high strength-to-weight ratio. This feature makes FMLs very attractive for the aerospace industry, where thin-walled structures are commonly used. Nevertheless, those structures are prone to the buckling phenomenon. Buckling can occur under static loads as well as dynamic pulse loads. In this paper, the problem of dynamic buckling of open cross-section FML profiles under axial dynamic compression in the form of a pulse load of finite duration is investigated. In the numerical model, the material properties of the FML constituents were assumed to be nonlinear elastic-plastic for the aluminum and linear-elastic for the glass-fiber-reinforced composite. The influence of the pulse shape was investigated: sinusoidal and rectangular pulse loads of finite duration were compared in two ways, i.e., with respect to magnitude and with respect to impulse. The dynamic critical buckling load was determined based on the Budiansky-Hutchinson, Ari-Gur, and Simonetta dynamic buckling criteria.
Keywords: dynamic buckling, dynamic stability, Fiber Metal Laminate, Finite Element Method
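The two ways of comparing pulses can be made concrete with a short numerical sketch (illustrative numbers only, not the paper's finite element loads): a half-sine and a rectangular pulse of the same duration can be matched either by peak magnitude, in which case the rectangular pulse carries a larger impulse, or by impulse, which requires scaling the rectangular amplitude by 2/pi.

```python
import numpy as np

T, F_peak = 0.01, 1000.0                 # pulse duration [s], peak force [N]
t = np.linspace(0.0, T, 2001)

def impulse(f, t):
    """Time integral of a sampled force history (trapezoidal rule)."""
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

sine_pulse = F_peak * np.sin(np.pi * t / T)       # half-sine pulse
rect_same_magnitude = np.full_like(t, F_peak)     # same peak, same duration

impulse_sine = impulse(sine_pulse, t)             # analytically F_peak * 2T / pi
impulse_rect = impulse(rect_same_magnitude, t)    # analytically F_peak * T

# Matching by impulse instead: scale the rectangular amplitude by 2/pi
rect_same_impulse = np.full_like(t, F_peak * 2.0 / np.pi)
impulse_rect_matched = impulse(rect_same_impulse, t)
```

Under either matching rule, the resulting force histories feed the transient analysis, and the dynamic critical load is then read off via the criteria named above.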
Procedia PDF Downloads 194
3764 Waterproofing Agent in Concrete for Tensile Improvement
Authors: Muhamad Azani Yahya, Umi Nadiah Nor Ali, Mohammed Alias Yusof, Norazman Mohamad Nor, Vikneswaran Munikanan
Abstract:
In construction, concrete is one of the materials commonly used for structural elements. Concrete consists of cement, sand, aggregate, and water. Admixtures can be added to concrete in the wet condition to suit the design purpose, for example, to prolong the setting time or to improve workability. For strength improvement, concrete is combined with other hybrid materials; this is because the tensile strength of concrete is very low in comparison to its compressive strength. This paper shows the usage of a waterproofing agent in concrete to enhance the tensile strength. High tensile concrete is expensive because the concrete mix needs fiber and also a high cement content. High tensile concrete is used for structures subjected to high-impact dynamic loads, such as blast loading. High tensile concrete can be defined as a concrete mix design that achieves a tensile strength of 30-40% of its compressive strength. This research evaluates the usage of a waterproofing agent in a concrete mix as an element of reinforcement to enhance the tensile strength. According to the compression and tensile tests, the concrete mix with a waterproofing agent shows enhanced mechanical properties. The results also show that the composite concrete with waterproofing is a high tensile concrete, because its tensile strength is between 30% and 40% of its compressive strength. This mix is economical because it can produce high tensile concrete at low cost.
Keywords: high tensile concrete, waterproofing agent, concrete, rheology
Procedia PDF Downloads 328
3763 Fostering Resilience in Early Adolescents: A Canadian Evaluation of the HEROES Program
Authors: Patricia L. Fontanilla, David Nordstokke
Abstract:
Introduction: Today's children and youth face increasing social and behavioural challenges, leading to delays in social development and greater mental health needs. Early adolescents (aged 9 to 14) are experiencing a rise in mental health symptoms and diagnoses. This study examines the impact of HEROES, a social-emotional learning (SEL) program, on resilience and academic outcomes in early adolescents. The HEROES program is designed to enhance resilience, the ability to adapt and thrive in the face of adversity, equipping youth to navigate developmental transitions and challenges. This study's objective was to evaluate the program's long-term effectiveness by measuring changes in resilience and academic resilience across 10 months. Methodology: This study collected data from 21 middle school students (grades 7 to 9) in a rural Canadian school. Quantitative data were gathered at four intervals: pre-intervention, post-intervention, and at 2- and 4-month follow-ups. Data were analyzed with linear mixed models (LMM). Results: Findings showed statistically significant increases in academic resilience over time and significant increases in resilience from pre-intervention to 2 and 4 months later. Limitations included a small sample size, which may affect generalizability. Conclusion: The HEROES program demonstrates promise in increasing resilience and academic resilience among early adolescents through SEL skill development.
Keywords: academic resilience, early adolescence, resilience, SEL, social-emotional learning program
Procedia PDF Downloads 11
3762 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation
Authors: Jonathan Gong
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has shown that artificial neural networks may successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4,035 images and validated on 807 separate images. An important feature of the images used to train the classification model is that they are cropped beforehand to eliminate distractions during training. The image segmentation model uses an improved U-Net architecture and is used to extract the lung mask from the chest X-ray image. It is trained on 8,577 images and validated on a 20% validation split. The models' accuracy, precision, recall, F1-score, IOU, and loss are calculated on an external validation dataset. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays.
The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to improve the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
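The IOU (intersection over union) metric used to score the segmentation model above is easy to state exactly; a minimal sketch on toy binary masks (not real lung segmentations):

```python
import numpy as np

def iou(pred, target):
    """Intersection over union between two binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 1.0   # two empty masks count as a perfect match

a = np.zeros((8, 8), dtype=int); a[2:6, 2:6] = 1   # 16-pixel square
b = np.zeros((8, 8), dtype=int); b[3:7, 3:7] = 1   # same square shifted by one
score = iou(a, b)                                  # overlap 9 pixels, union 23
```

An IOU of 0.928, as reported for the U-Net, means the predicted and ground-truth lung masks overlap on about 93% of their combined area.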
Procedia PDF Downloads 130
3761 Working within the Zone of Proximal Development: Does It Help for Reading Strategy?
Authors: Mahmood Dehqan, Peyman Peyvasteh
Abstract:
In recent years, there has been growing interest in issues concerning the impact of sociocultural theory (SCT) of learning on different aspects of second/foreign language learning. This study aimed to find the possible effects of sociocultural teaching techniques on the reading strategy use of EFL learners. Specifically, the present research compared the impact of peer and teacher scaffolding on EFL learners' reading strategy use across two proficiency levels. To this end, a pre-test post-test quasi-experimental research design was used, and two instruments were utilized to collect the data: the Nelson English language test and a reading strategy questionnaire. Ninety-five university students participated in this study and were divided into two groups of teacher and peer scaffolding. The teacher scaffolding group received scaffolded help from the teacher based on three mechanisms of effective help within the ZPD: graduated, contingent, and dialogic. In contrast, learners in the peer scaffolding group were released from the teacher-fronted classroom, as they were asked to carry out the reading comprehension tasks with the feedback they provided for each other. Results obtained from ANOVA revealed that the teacher scaffolding group outperformed the peer scaffolding group in terms of reading strategy use. This means the teacher's scaffolded help provided within the learners' ZPD led to better reading strategy improvement compared with peer scaffolded help. However, the interaction effect between the proficiency factor and teaching technique was non-significant, leading to the conclusion that strategy use of the learners was not affected by their proficiency level in either the teacher or peer scaffolding group.
Keywords: peer scaffolding, proficiency level, reading strategy, sociocultural theory, teacher scaffolding
Procedia PDF Downloads 381