Search results for: innovative learning
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8593

3373 Critical Thinking in the Moroccan Textbooks of English: Ticket to English as a Case Study

Authors: Mohsine Jebbour

Abstract:

The ultimate aim of this study was to analyze a second-year baccalaureate textbook of English to determine the extent to which it includes elements of critical thinking. A further purpose was to assess the extent to which teachers' teaching practices help students develop some degree of critical thinking. The literature on critical thinking indicates that writers agree that critical thinking is both skill- and disposition-oriented, and most definitions highlight the skill and disposition to select, collect, analyze, and evaluate information effectively. Two instruments were used in this study, namely content analysis and a questionnaire, to ensure validity and reliability. The first sample of this study was a second-year textbook of English, Ticket to English; the data were collected by designing a checklist to analyze the textbook. The second sample consisted of high school students (second baccalaureate grade) and teachers of English. Two questionnaires were administered: one was completed by 28 high school teachers (18 males and 10 females), and the other by 51 students (26 males and 25 females) from Fez, Morocco. The questionnaire items elicited both qualitative and quantitative data. An attempt was made to answer two research questions. The first pertained to the extent to which the textbook of English contains critical thinking elements (critical thinking skills and dispositions, types of questions, language learning strategies, classroom activities); the second concerned whether the teaching practices of teachers of English help improve students' critical thinking. The results demonstrated that the textbook includes elements of critical thinking and that the teachers' teaching practices help students develop some degree of critical thinking. Yet, the textbook does not include problem-solving activities or media analysis, and 86% of the teacher-respondents tended to skip activities in the textbook, mainly the units dealing with Project Work and Study Skills, which are necessary for enhancing critical thinking among students. Therefore, the textbook needs to be designed around additional activities, and teachers are required to cover the skipped units so as to make the teaching of critical thinking effective.

Keywords: critical thinking, language learning strategies, language proficiency, teaching practices

Procedia PDF Downloads 594
3372 Convolutional Neural Network Based on Random Kernels for Analyzing Visual Imagery

Authors: Ja-Keoung Koo, Kensuke Nakamura, Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Byung-Woo Hong

Abstract:

Machine learning techniques based on convolutional neural networks (CNNs) have been actively developed and successfully applied to a variety of image analysis tasks, including reconstruction, noise reduction, resolution enhancement, segmentation, motion estimation, and object recognition. Classical visual information processing, ranging from low-level to high-level tasks, has been widely developed in the deep learning framework. Deriving visual interpretation from high-dimensional imagery data is generally considered a challenging problem. A CNN is a class of feed-forward artificial neural network that usually consists of deep layers whose connections are established by a series of non-linear operations. The CNN architecture is known to be shift invariant due to its shared weights and translation invariance characteristics. However, it is often computationally intractable to optimize the network, particularly with a large number of convolution layers, because of the large number of unknowns to be optimized with respect to a training set that generally needs to be large enough to effectively generalize the model under consideration. It is also necessary to limit the size of the convolution kernels due to the computational expense, despite the recent development of effective parallel processing machinery, which leads to the use of consistently small convolution kernels throughout a deep CNN architecture. However, it is often desirable to consider different scales in the analysis of visual features at different layers in the network. Thus, we propose a CNN model in which convolution kernels of different sizes are applied at each layer based on random projection. We apply random filters of varying sizes and associate the filter responses with scalar weights that correspond to the standard deviations of the random filters. This allows us to use a large number of random filters at the cost of one scalar unknown per filter. The computational cost of the back-propagation procedure does not increase with larger filters, even though additional cost is incurred when computing convolutions in the feed-forward procedure. The use of random kernels with varying sizes makes it possible to analyze image features effectively at multiple scales, leading to better generalization. The robustness and effectiveness of the proposed CNN based on random kernels are demonstrated by numerical experiments that quantitatively compare well-known CNN architectures with our models, which simply replace the convolution kernels with random filters. The experimental results indicate that our model achieves better performance with fewer unknown weights. The proposed algorithm has high potential for application to a variety of visual tasks based on the CNN framework. Acknowledgement: This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by IITP, and NRF-2014R1A2A1A11051941, NRF2017R1A2B4006023.
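
As a concrete illustration of this idea, here is a minimal PyTorch sketch (our own reconstruction, not the authors' code) of a convolution layer built on fixed random kernels of several sizes, where only one scalar weight per filter is trainable; the class name and hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RandomKernelConv(nn.Module):
    """Multi-scale convolution with fixed random kernels.

    Each filter is frozen; only one scalar weight per filter is learned,
    so a large multi-scale filter bank costs one unknown per filter.
    """
    def __init__(self, in_ch, filters_per_size=8, sizes=(3, 5, 7)):
        super().__init__()
        self.sizes = sizes
        self.scales = nn.ParameterList()
        for i, k in enumerate(sizes):
            bank = torch.randn(filters_per_size, in_ch, k, k)
            bank = bank / bank.flatten(1).norm(dim=1).view(-1, 1, 1, 1)
            self.register_buffer(f"bank{i}", bank)  # fixed random kernels
            self.scales.append(nn.Parameter(torch.ones(filters_per_size)))

    def forward(self, x):
        outs = []
        for i, k in enumerate(self.sizes):
            bank = getattr(self, f"bank{i}")
            w = bank * self.scales[i].view(-1, 1, 1, 1)  # scale fixed filters
            outs.append(F.conv2d(x, w, padding=k // 2))  # keep spatial size
        return torch.cat(outs, dim=1)  # concatenate scales along channels

layer = RandomKernelConv(in_ch=3)
y = layer(torch.randn(1, 3, 32, 32))  # -> shape (1, 24, 32, 32)
```

Because the kernels are registered as buffers rather than parameters, back-propagation only touches the per-filter scalars, mirroring the claim that larger filters add no trainable unknowns.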

Keywords: deep learning, convolutional neural network, random kernel, random projection, dimensionality reduction, object recognition

Procedia PDF Downloads 278
3371 Achieving Maximum Performance through the Practice of Entrepreneurial Ethics: Evidence from SMEs in Nigeria

Authors: S. B. Tende, H. L. Abubakar

Abstract:

It is acknowledged that small and medium enterprises (SMEs) may encounter different ethical issues and pressures that could affect the way in which they strategize or make decisions concerning the outcome of their business. Therefore, this research aimed at assessing entrepreneurial ethics in the business of SMEs in Nigeria. Secondary data were adopted as the corpus for the analysis. The findings conclude that a sound entrepreneurial ethics system has a significant effect on the level of performance of SMEs in Nigeria. The Nigerian Government needs to provide both guiding and physical structures, as well as learning systems that could inculcate these entrepreneurial ethics.

Keywords: culture, entrepreneurial ethics, performance, SME

Procedia PDF Downloads 368
3370 Examining the Teaching and Learning Needs of Science and Mathematics Educators in South Africa

Authors: M. Shaheed Hartley

Abstract:

There has been increasing pressure on education researchers and practitioners at higher education institutions to focus on the development of South Africa's rural and peri-urban communities and on improving their quality of life. Many tertiary institutions are obliged to review their outreach interventions in schools. To ensure that the support provided to schools is still relevant, a systemic evaluation of science educators' needs is central to this process. These prioritised needs will serve as a guide not only for the outreach projects of tertiary institutions, but also for service providers in general, so that the process of addressing educators' needs becomes coordinated, organised and delivered in a systematic manner. This paper describes one area of a broader needs assessment exercise to collect data regarding the needs of educators in a district of 45 secondary schools in the Western Cape Province of South Africa. The research focuses on the needs and challenges faced by science educators at these schools as articulated by the relevant stakeholders. The objectives of this investigation are two-fold: (1) to create a database that captures the needs and challenges identified by science educators of the selected secondary schools; and (2) to develop a needs profile for each of the participating secondary schools that will serve as a strategic asset to be shared with the various service providers, as part of a community of practice whose core business is to support science educators and science education at large. The data were collected by means of a needs assessment questionnaire (NAQ), which was developed in both actual and preferred versions. An open-ended questionnaire was also administered, which allowed teachers to express their views. The categories of the questionnaire were predetermined by participating researchers, educators and education department officials. Group interviews were also held with the science teachers at each of the schools. An analysis of the data revealed important trends in science educator needs and identified schools that can be clustered around priority needs, logistical considerations and educator profiles. The needs database also provides an opportunity for the community of practice to strategise and coordinate their interventions.

Keywords: needs assessment, science and mathematics education, evaluation, teaching and learning, South Africa

Procedia PDF Downloads 166
3369 Innovation Management: A Comparative Analysis among Organizations from United Arab Emirates, Saudi Arabia, Brazil and China

Authors: Asmaa Abazaid, Maram Al-Ostah, Nadeen Abu-Zahra, Ruba Bawab, Refaat Abdel-Razek

Abstract:

Innovation audit is defined as a tool that can be used to reflect on how innovation is managed in an organization. The aim of this study is to audit innovation in the second-ranked engineering firm in the world and in one of the small and medium enterprise (SME) companies operating in the United Arab Emirates (UAE). The obtained results are then compared with those of international companies from Saudi Arabia, Brazil and China. The Diamond model has been used to audit innovation in the two companies in the UAE, to evaluate their innovation management and to identify each company's strengths and weaknesses from an innovation perspective. The comparison between the two companies (Jacobs and Hyper General Contracting) revealed that Jacobs has support for innovation, its innovation processes are well managed, the company is committed to the development of its employees worldwide, and its innovation system is flexible. Jacobs performed best in all innovation management dimensions: strategy, process, organization, linkages and learning, while Hyper General Contracting did not score as highly as Jacobs in any of the innovation dimensions. Furthermore, the audit results of both companies were compared with those of international companies to examine how well the two construction companies in the UAE manage innovation relative to SABIC (a Saudi company), Poly Easy and Arnious (Brazilian companies), and Huagong Tools and Guizohou Yibai (Chinese companies). The results revealed that Jacobs performed best in the learning and organization dimensions, while Poly Easy and Jacobs were equal in the linkage dimension. Huagong Tools achieved the highest score in the process dimension among all the compared companies, whereas the highest score in the strategy dimension went to Poly Easy. On the other hand, Hyper General Contracting scored lowest in all of the innovation management dimensions. It needs to improve its management of all the innovation management dimensions, with special attention given to strategy, process, and linkage, as they scored below 4 out of 7 compared with the other dimensions. Jacobs scored highest in three innovation management dimensions relative to the six compared companies. However, its strategy dimension is considered low, and special attention is needed in this dimension.

Keywords: Brazil, China, innovation audit, innovation evaluation, innovation management, Saudi Arabia, United Arab Emirates

Procedia PDF Downloads 274
3368 Contextualization and Localization: Acceptability of the Developed Activity Sheets in Science 5 Integrating Climate Change Adaptation

Authors: Kim Alvin De Lara

Abstract:

The research aimed to assess the level of acceptability of the developed activity sheets in Science 5 integrating climate change adaptation, as rated by Grade 5 science teachers in the District of Pililla during school year 2016-2017. In this research, participants were able to recognize and understand the importance of environmental education in improving basic education and of integrating it into lessons through localization and contextualization. The researcher conducted the study to develop a material for use by Grade 5 Science teachers, which also serves as a self-learning resource for students. The respondents of the study were the thirteen Grade 5 teachers teaching Science 5 in the District of Pililla, selected purposively and identified by the researcher. A descriptive method of research was utilized. The main instrument was a checklist that included items on the objectives, content, tasks, contextualization and localization of the developed activity sheets. The researcher developed a 2-week lesson in Science 5 for the 4th Quarter, based on the curriculum guide, with integration of climate change adaptation. The findings revealed that the majority of respondents were female, aged 31 years and above, had more than 10 years of experience teaching science, and held units toward a master's degree. With regard to the level of acceptability, the study revealed that the developed activity sheets in Science 5 are very much acceptable. In view of the findings, lessons in Science 5 must be contextualized and localized so that the curriculum responds, conforms, reflects and remains flexible to the needs of the learners, especially 21st-century learners, who need to be holistically and skillfully developed. As revealed by the findings, it is more acceptable to localize and contextualize the learning materials for pupils. Policy formation and re-organization of the lessons and competencies in Science must be reviewed and re-evaluated. Lessons in Science must also be integrated with climate change adaptation since, nowadays, people are experiencing changes in climate due to global warming and other factors. Through the developed activity sheets, the researcher strongly supports environmental education and believes they can serve as a way to instill environmental literacy in students.

Keywords: activity sheets, climate change adaptation, contextualization, localization

Procedia PDF Downloads 315
3367 Smartphone-Based Human Activity Recognition by Machine Learning Methods

Authors: Yanting Cao, Kazumitsu Nawata

Abstract:

As smartphones are upgraded, their software and hardware become smarter, so smartphone-based human activity recognition can become more refined, complex, and detailed. In this context, we analyzed a set of experimental data obtained by observing and measuring 30 volunteers performing six activities of daily living (ADL). Due to the large sample size, and especially the 561-feature vector of time- and frequency-domain variables, cleaning these intractable features and training a proper model becomes extremely challenging. After a series of feature selection and parameter adjustment steps, a well-performing SVM classifier was trained.
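
As an illustration of such a pipeline, the following scikit-learn sketch (ours, not the authors' code) combines univariate feature selection with a grid search over SVM parameters; the stand-in data and grid values are illustrative assumptions.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for the real data: 561-feature vectors, six ADL class labels.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(300, 561)), rng.integers(0, 6, 300)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif)),  # keep the most informative features
    ("svm", SVC()),
])
grid = GridSearchCV(pipe, param_grid={
    "select__k": [100, 200, 400],
    "svm__C": [1, 10, 100],
    "svm__gamma": ["scale", 0.01],
}, cv=5)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```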

Keywords: smart sensors, human activity recognition, artificial intelligence, SVM

Procedia PDF Downloads 137
3366 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models

Authors: V. Mantey, N. Findlay, I. Maddox

Abstract:

The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to examine manually. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery, with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in densely populated regions. The primary challenge in detecting small buildings in densely populated regions lies in both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials are difficult to separate due to their similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that training models until they overfit the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires less input data. The model developed for this study has also been fine-tuned using existing, open-source building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span. Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher-quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality-check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.
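
The keyword list names Mask R-CNN, so a plausible starting point for the described fine-tuning is the torchvision detection API. The sketch below (our reconstruction, not the authors' pipeline) swaps the heads of a pre-trained model for a single building class and deliberately forgoes the usual overfitting safeguards.

```python
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

# Start from a generalized pre-trained model, then fine-tune on a small,
# region-specific sample until it overfits that area.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
num_classes = 2  # background + building

in_feats = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_feats, num_classes)

in_ch = model.roi_heads.mask_predictor.conv5_mask.in_channels
model.roi_heads.mask_predictor = MaskRCNNPredictor(in_ch, 256, num_classes)

# Overfitting is the point here: train for many epochs on the target region,
# with no early stopping and no held-out check of generalization.
```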

Keywords: building detection, disaster relief, mask-RCNN, satellite mapping

Procedia PDF Downloads 163
3365 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling

Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal

Abstract:

Datasets or collections are becoming important assets in themselves, and they can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context in which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of such data is a notoriously tedious, time-consuming process. In addition, it requires experts in the area, who are mostly unavailable. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-label sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using well-known measurements (Accuracy, Hamming Loss, Micro-F, and Macro-F). The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods tested, while the Classifier Chains method showed the worst performance. To recap, the benchmark has achieved promising results by utilizing preliminary exploratory data analysis performed on the collection, proposing new trends for research and providing a baseline for future studies.
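
For readers unfamiliar with these techniques, the sketch below shows two of the benchmarked methods (Binary Relevance and Classifier Chains) and the overlapping metrics using plain scikit-learn; the stand-in data, base learner and label count are illustrative assumptions rather than details from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score, hamming_loss
from sklearn.multioutput import ClassifierChain, MultiOutputClassifier

# Stand-in data: feature vectors for objective statements (e.g. tf-idf),
# and a binary indicator matrix over a hypothetical set of 7 outcomes.
rng = np.random.default_rng(0)
X, Y = rng.normal(size=(120, 50)), rng.integers(0, 2, (120, 7))
X_tr, X_te, Y_tr, Y_te = X[:100], X[100:], Y[:100], Y[100:]

models = {
    "binary relevance": MultiOutputClassifier(LogisticRegression(max_iter=1000)),
    "classifier chain": ClassifierChain(LogisticRegression(max_iter=1000)),
}
for name, model in models.items():
    Y_hat = model.fit(X_tr, Y_tr).predict(X_te)
    print(f"{name}: acc={accuracy_score(Y_te, Y_hat):.2f} "
          f"hamming={hamming_loss(Y_te, Y_hat):.2f} "
          f"micro-F={f1_score(Y_te, Y_hat, average='micro'):.2f} "
          f"macro-F={f1_score(Y_te, Y_hat, average='macro'):.2f}")
```

Here accuracy_score on a multi-label indicator matrix gives the strict subset accuracy, which is why Hamming loss and the averaged F-measures are usually reported alongside it.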

Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-label classification, text mining

Procedia PDF Downloads 161
3364 Accelerating Decision-Making in Oil and Gas Wells: 'A Digital Transformation Journey for Rapid and Precise Insights from Well History Data'

Authors: Linung Kresno Adikusumo, Ivan Ramos Sampe Immanuel, Liston Sitanggang

Abstract:

An excellent well work program in the oil and gas industry can have numerous positive business impacts, contributing to operational efficiency, increased production, enhanced safety, and improved financial performance. In short, an excellent well work program not only ensures the immediate success of specific projects but also has a broader positive impact on the overall business performance and reputation of the oil and gas company, positioning it for long-term success in a competitive and dynamic industry. Nevertheless, a number of challenges were encountered when developing a good well work program, such as the poor quality and lack of integration of well documentation, the incompleteness of the well history, and the low accessibility of well documentation. As a result, the well work program was delivered less accurately, and well damage was managed slowly. Our solution, implementing digital technology through a web-based database and application, not only solves those issues but also provides easy-to-access reports and a user-friendly display for management as well as engineers to analyze the reports' content. This application aims to revolutionize the documentation of well history in the field of oil and gas exploration and production. The current lack of a streamlined and comprehensive system for capturing, organizing, and accessing well-related data presents challenges in maintaining accurate and up-to-date records. Our innovative solution introduces a user-friendly and efficient platform designed to capture well history documentation seamlessly.

Keywords: digital, drilling, well work, application

Procedia PDF Downloads 63
3363 Influence of Spelling Errors on English Language Performance among Learners with Dysgraphia in Public Primary Schools in Embu County, Kenya

Authors: Madrine King'endo

Abstract:

This study dealt with the influence of spelling errors on English language performance among learners with dysgraphia in public primary schools in West Embu, Embu County, Kenya. The study aimed to investigate the influence of spelling errors on English language performance among class three pupils with dysgraphia in public primary schools. The objectives of the study were to identify the spelling errors that learners with dysgraphia make when writing English words and to classify the spelling errors they make. Further, the study sought to establish how the spelling errors affect language performance among the study participants and to suggest remediation strategies that teachers could use to address the errors. The study could provide stakeholders with relevant information on writing skills that could help in developing a responsive curriculum to accommodate the teaching and learning needs of learners with dysgraphia, and help ensure that training in teacher training colleges is tailored to the writing needs of pupils with dysgraphia. The study was carried out in Embu County because the related literature review revealed no study conducted in the area concerning the influence of spelling errors on English language performance among learners with dysgraphia in public primary schools. Moreover, besides being populous enough to provide the study sample, the area was fairly cosmopolitan, allowing a generalization of the study findings. The study assumed that the sampled schools would have class three pupils with dysgraphia who exhibited written spelling errors. The study was guided by two spelling approaches, the connectionist simulation of the spelling process and the orthographic autonomy hypothesis, with a view to explaining how participants with learning disabilities spell written words. Data were collected through interviews, pupils' exercise books and progress records, and a spelling test made by the researcher based on the spelling scope set for class three pupils by the Ministry of Education in the primary education syllabus. The study relied on random sampling techniques in identifying general and specific participants. Since the study used schoolchildren as participants, voluntary consent was sought from the children themselves, their teachers and the school head teachers, who were their caretakers in the school setting.

Keywords: dysgraphia, writing, language, performance

Procedia PDF Downloads 146
3362 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields. These artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management, and it enables historical disaster information to be retrieved using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from occurrence through progress, response, and planning. However, information about status control, response, and recovery from natural and social disaster events is mainly managed in structured and unstructured reports, which exist as handouts or hard copies. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data for disaster information. In this paper, the Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images and reports, whether printed or generated by scanners, into electronic documents. The converted disaster data are then organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information from unstructured data based on OCR is an important element of smart disaster management. In this paper, a character recognition rate of over 90% for Korean characters was achieved using an upgraded OCR. The recognition rate depends on the fonts, size, and special symbols of the characters; we improved it through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system, which ensures that structured information can be stored and retrieved across the entire disaster cycle, covering historical disaster progress, damages, response, and recovery. The expected outcome of this research is its application to smart disaster management and decision-making by combining artificial intelligence technologies with historical big data.
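
A minimal sketch of the ingestion step might look like the following; it assumes the Tesseract engine via pytesseract (with Korean language data installed), and the file name and code table are illustrative stand-ins, not the paper's disaster code system.

```python
from PIL import Image
import pytesseract

# Convert a scanned hard-copy disaster report into machine-readable text.
# 'report_page.png' is a hypothetical input file.
text = pytesseract.image_to_string(Image.open("report_page.png"), lang="kor")

# Toy mapping from Korean disaster terms (heavy rain, typhoon, earthquake)
# to illustrative disaster codes; the real code system is far richer.
DISASTER_CODES = {"호우": "D-01", "태풍": "D-02", "지진": "D-03"}
codes = {code for word, code in DISASTER_CODES.items() if word in text}
print(codes)  # store alongside the raw text in the disaster database
```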

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 116
3361 Sustainability Assessment of Food Delivery with Last-Mile Delivery Droids, A Case Study at the European Commission's JRC Ispra Site

Authors: Ada Garus

Abstract:

This paper presents the outcomes of a sustainability assessment of food delivery with a last-mile delivery droid service introduced in a real-world case study. The methodology used in the sustainability assessment integrates multi-criteria decision-making analysis, sustainability pillars, and scenario analysis to best reflect the conflicting needs of stakeholders involved in the last-mile delivery system. The case study applies the framework to the food delivery system of the Joint Research Centre of the European Commission, where three alternative solutions were analyzed: (i) the existing state, in which individuals visit the local canteen or pick up their food using their preferred mode of transport; (ii) a hypothetical scenario in which individuals can only order their food using the delivery droid system; and (iii) a scenario in which the droid-based food delivery system is introduced as a supplement to the current system. The environmental indices are calculated using a simulation study in which the food delivery decision is predicted using a multinomial logit model. A vehicle dynamics model is used to predict the fuel consumption of the conventional combustion engine vehicles used by canteen goers and the electricity consumption of the droid. The sustainability assessment allows for the evaluation of the economic, environmental, and social aspects of food delivery, making it an apt input for policymakers. Moreover, the assessment is one of the first studies to investigate automated delivery droids, which could become a frequent addition to the urban landscape in the near future.
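
As a toy illustration of the choice model, the snippet below computes multinomial logit probabilities over three lunch options; all coefficients and attribute values are invented for the sketch and are not the study's estimates.

```python
import numpy as np

# Illustrative utility coefficients for travel time (min) and cost (EUR).
beta_time, beta_cost = -0.08, -0.4
alternatives = {
    "walk_to_canteen":  {"time": 15, "cost": 0.0},
    "drive_to_canteen": {"time": 5,  "cost": 1.2},
    "droid_delivery":   {"time": 25, "cost": 0.5},
}
v = np.array([beta_time * a["time"] + beta_cost * a["cost"]
              for a in alternatives.values()])
p = np.exp(v) / np.exp(v).sum()  # multinomial logit choice probabilities
for name, prob in zip(alternatives, p):
    print(f"{name}: {prob:.2f}")
```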

Keywords: innovations in transportation technologies, behavioural change and mobility, urban freight logistics, innovative transportation systems

Procedia PDF Downloads 186
3360 Quality Assurance in Higher Education: Doha Institute for Graduate Studies as a Case Study

Authors: Ahmed Makhoukh

Abstract:

Quality assurance (QA) has recently become a common practice endorsed by most higher education (HE) institutions worldwide, due to the pressure of internal and external forces. One of the aims of this quality movement is to make the contribution of university education to socio-economic development highly significant. This entails that graduates are currently required to have a high-quality profile, i.e., to be competent and to master the 21st-century skills needed in the labor market. This wave of change, mostly imposed by globalization, means that university education should be learner-centered in order to satisfy the different needs of students and meet the expectations of other stakeholders. Such a shift of focus to student learning outcomes has led HE institutions to reconsider their strategic planning, their mission, the curriculum and the pedagogical competence of the academic staff, among other elements. To ensure that overall institutional performance is on the right track, a QA system should be established to check regularly the extent to which the set evaluation standards are respected. This QA operation has the advantage of demonstrating the accountability of the institution, gaining the trust of the public through transparency and earning international recognition. This is the case of the Doha Institute (DI) for Graduate Studies, in Qatar, the object of the present study. The significance of this contribution is to show that the conception of quality has changed in this digital age, and that a department responsible for QA needs to be integrated into every HE institution to ensure educational quality, enhance learning and achieve academic leadership. Thus, to address the issue of QA in the DI for Graduate Studies, an elite university (in the academic sense) that focuses on a small and selected number of students, a qualitative method is adopted in the description and analysis of the data (document analysis). In an attempt to investigate the extent to which QA is achieved in the Doha Institute for Graduate Studies, three broad indicators are evaluated (input, process and learning outcomes). This investigation is carried out in line with the UK Quality Code for Higher Education, represented by the Quality Assurance Agency (QAA).

Keywords: accreditation, higher education, quality, quality assurance, standards

Procedia PDF Downloads 142
3359 Subtitling in the Classroom: Combining Language Mediation, ICT and Audiovisual Material

Authors: Rossella Resi

Abstract:

This paper describes a project carried out in an Italian school with English-learning pupils, combining three didactic tools that are attested to be relevant to the success of a young learner's language curriculum: the use of technology, intralingual and interlingual mediation (according to the CEFR), and the cultural dimension. The aim of this project was to test a technological, hands-on translation activity like subtitling in a formal teaching context and to exploit its potential as a motivational tool for developing listening, writing, translation, and cross-cultural skills among language learners. The activities proposed involved the use of professional subtitling software called Aegisub and culture-specific films. The workshop was optional, so motivation was based entirely on the pleasure of engaging with a realistic subtitling program and on the challenge of meeting the constraints that a real-life or work situation might involve. Twelve pupils aged between 16 and 18 attended the afternoon workshop. The workshop was organized in three parts: (i) an introduction, in which the learners were introduced to the concept and constraints of subtitling and provided with a few basic rules on spotting and segmentation; during this session, learners also had time to familiarize themselves with the main software features. (ii) The second part involved three subtitling activities, in plenum or in groups. In the first activity, the learners experienced the technical dimensions of subtitling: they were provided with a short video segment together with its transcription, to be segmented and time-spotted. The second activity also involved oral comprehension: learners had to understand and transcribe a video segment before subtitling it. The third activity embedded a translation task based on a provided transcription, including segmentation and spotting of subtitles. (iii) The workshop ended with a small final project, at which point learners were able to master a short subtitling assignment (transcription, translation, segmenting and spotting) on their own with a similar video interview. The results of these assignments were above expectations, since the learners were highly motivated by the authentic and original nature of the assignment. The subtitled videos were evaluated and watched in the regular classroom together with the other students who did not take part in the workshop.

Keywords: ICT, L2, language learning, language mediation, subtitling

Procedia PDF Downloads 407
3358 Novel Point of Care Test for Rapid Diagnosis of COVID-19 Using Recombinant Nanobodies against SARS-CoV-2 Spike1 (S1) Protein

Authors: Manal Kamel, Sara Maher, Hanan El Baz, Faten Salah, Omar Sayyouh, Zeinab Demerdash

Abstract:

In the recent COVID-19 pandemic, public health experts have emphasized testing, tracking infected people, and tracing their contacts as an effective strategy to reduce the spread of the virus. The development of rapid and sensitive diagnostic assays to replace reverse transcription polymerase chain reaction (RT-PCR) is mandatory. Our innovative test strip relies on the application of nanoparticles conjugated to recombinant nanobodies against the SARS-CoV-2 spike protein (S1) and angiotensin-converting enzyme 2 (the receptor responsible for viral entry into host cells) for rapid detection of the SARS-CoV-2 spike protein (S1) in saliva or sputum specimens. Comparative tests with RT-PCR were held to estimate the effect of using COVID-19 nanobodies, for the first time, in the development of a lateral flow test strip. The SARS-CoV-2 S1 protein (3 ng of recombinant protein) was detected by our developed LFIA in saliva specimens of COVID-19 patients. No cross-reaction was detected with Middle East respiratory syndrome coronavirus (MERS-CoV) or SARS-CoV antigens. Our developed system revealed 96% sensitivity and 100% specificity for saliva samples, compared to 89% sensitivity and 100% specificity for nasopharyngeal swabs, providing a reliable alternative to the painful and uncomfortable nasopharyngeal swab process and the complex, time-consuming PCR test. An increase in testing compliance is to be expected.

Keywords: COVID 19, diagnosis, LFIA, nanobodies, ACE2

Procedia PDF Downloads 121
3357 Comparison between Approaches Used in Two Walk About Projects

Authors: Derek O Reilly, Piotr Milczarski, Shane Dowdall, Artur Hłobaż, Krzysztof Podlaski, Hiram Bollaert

Abstract:

Learning through the creation of contextual games is a very promising tool for interdisciplinary and international group projects. During 2013 and 2014, we took part in and organized two intensive student projects under different conditions. The projects enrolled 68 students and 12 mentors from 5 countries. In this paper, we want to share our experience of how to strengthen the chances of success in short (12-15 day) student projects. In our case, almost all teams prepared a working prototype, and the results were highly appreciated by external experts.

Keywords: contextual games, mobile games, GGULIVRR, walkabout, Erasmus intensive programme

Procedia PDF Downloads 491
3356 Understanding Tourism Innovation through Fuzzy Measures

Authors: Marcella De Filippo, Delio Colangelo, Luca Farnia

Abstract:

In recent decades, the hyper-competitive tourism scenario has driven many businesses to maturity, attributing a central role to innovative processes and their dissemination in company management. At the same time, it has created the need to monitor the application of innovations in order to govern and improve the performance of companies and destinations. This study aims to analyze and define innovation in the tourism sector. The research involved, on the one hand, in-depth interviews with experts, identifying innovation in terms of process and product, digitalization, and sustainability policies, and, on the other hand, an evaluation of the interaction between these factors, in terms of substitutability and complementarity in management scenarios, in order to identify which are essential for competitiveness in the global scenario. Fuzzy measures and the Choquet integral were used to elicit experts' preferences. This method makes it possible to evaluate not only the relative importance of each pillar but also, more interestingly, the level of interaction between pairs of factors, ranging from complementarity to substitutability. The results of the survey are the following: in terms of Shapley values, the experts assert that innovation is the most important factor (32.32), followed by digitalization (31.86), network (20.57) and sustainability (15.25). In terms of interaction indices, given the low degree of consensus among experts, the interaction between pairs of criteria could on average be ignored; however, it is worth noting that innovation and digitalization are the factors for which experts express the highest degree of interaction. Some experts consider these factors moderately complementary (with a peak of 57.14), while others consider them moderately substitutable (with a peak of -39.58). Another example, although an outlier, is the interaction between network and digitalization, which one expert considered markedly substitutable (-77.08).
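
For readers unfamiliar with the method, the sketch below implements a discrete Choquet integral over a toy two-pillar fuzzy measure; the capacity and score values are invented for illustration and are not the elicited measures from the study.

```python
def choquet(scores, mu):
    """Discrete Choquet integral of criterion scores w.r.t. fuzzy measure mu.

    mu maps frozensets of criteria to capacities in [0, 1],
    with mu(empty set) = 0 and mu(all criteria) = 1.
    """
    items = sorted(scores.items(), key=lambda kv: kv[1])  # ascending scores
    total, prev = 0.0, 0.0
    remaining = set(scores)
    for crit, val in items:
        total += (val - prev) * mu[frozenset(remaining)]
        prev = val
        remaining.remove(crit)
    return total

# Superadditive capacities (mu(A u B) > mu(A) + mu(B)) model complementarity.
mu = {frozenset(): 0.0,
      frozenset({"innovation"}): 0.45,
      frozenset({"digitalization"}): 0.40,
      frozenset({"innovation", "digitalization"}): 1.0}
print(choquet({"innovation": 0.6, "digitalization": 0.8}, mu))  # -> 0.68
```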

Keywords: innovation, business model, tourism, fuzzy

Procedia PDF Downloads 255
3355 Near-Infrared Optogenetic Manipulation of a Channelrhodopsin via Upconverting Nanoparticles

Authors: Kanchan Yadav, Ai-Chuan Chou, Rajesh Kumar Ulaganathan, Hua-De Gao, Hsien-Ming Lee, Chien-Yuan Pan, Yit-Tsong Chen

Abstract:

Optogenetics is an innovative technology now widely adopted by researchers in different fields of the biological sciences. However, due to the weak tissue penetration capability of the short wavelengths used to activate light-sensitive proteins, an invasive light guide has been used in animal studies for photoexcitation of target tissues. Upconverting nanoparticles (UCNPs), which transform near-infrared (NIR) light into short-wavelength emissions, can help address this issue. To improve optogenetic performance, we enhance the target selectivity of optogenetic controls by specifically conjugating the UCNPs with light-sensitive proteins at the molecular level, which shortens the distance and enhances the efficiency of energy transfer. We tagged V5 and Lumio epitopes to the extracellular N-terminus of channelrhodopsin-2, with mCherry conjugated at the intracellular C-terminus (VL-ChR2m), and then bound NeutrAvidin-functionalized UCNPs (NAv-UCNPs) to the VL-ChR2m via a biotinylated antibody against V5 (bV5-Ab). We observed an apparent energy transfer from the excited UCNP (donor) to the bound VL-ChR2m (acceptor) by measuring emission-intensity changes at the donor-acceptor complex. The successful patch-clamp electrophysiological test and the intracellular Ca2+ elevation observed in the designed UCNP-ChR2 system under optogenetic manipulation confirmed the practical utility of UCNP-assisted NIR optogenetics. This work represents a significant step toward improving therapeutic optogenetics.

Keywords: Channelrhodopsin-2, near infrared, optogenetics, upconverting nanoparticles

Procedia PDF Downloads 269
3354 Emulsified Oil Removal in Produced Water by Graphite-Based Adsorbents Using Adsorption Coupled with Electrochemical Regeneration

Authors: Zohreh Fallah, Edward P. L. Roberts

Abstract:

One of the big challenges in produced water treatment is removing oil from water in the form of emulsified droplets, which are not easily separated. An attractive approach is adsorption, as it is a simple and effective process. However, adsorbents must be regenerated in order to make the process cost-effective. Several sorbents have been tested for treating oily wastewater, but issues such as the high energy consumption of thermal regeneration of activated carbon have been reported. Due to their significant electrical conductivity, Graphite Intercalation Compounds (GIC) were found to be suitable for electrochemical regeneration. They are non-porous materials with low surface area and fast adsorption kinetics, useful for the removal of low concentrations of organics. An innovative adsorption/regeneration process has been developed at the University of Manchester, in which organics are adsorbed by a patented GIC adsorbent coupled with subsequent electrochemical regeneration. The oxidation of the adsorbed organics enables 100% regeneration, so that the adsorbent can be reused over multiple adsorption cycles. GIC adsorbents are capable of removing a wide range of organics and pollutants; however, no comparable report is available on the removal of emulsified oil from produced water using the abovementioned process. In this study, the performance of this technology for the removal of emulsified oil from wastewater was evaluated. Batch experiments were carried out to determine the adsorption kinetics and equilibrium isotherm for both real produced water and model emulsions. The amount of oil in the wastewater was measured by toluene extraction/fluorescence analysis before and after the adsorption and electrochemical regeneration cycles. It was found that the oil-in-water emulsion could be successfully treated by the process, and more than 70% of the oil was removed.

Keywords: adsorption, electrochemical regeneration, emulsified oil, produced water

Procedia PDF Downloads 576
3353 Towards Learning Query Expansion

Authors: Ahlem Bouziri, Chiraz Latiri, Eric Gaussier

Abstract:

The steady growth in the size of textual document collections is a key progress driver for modern information retrieval techniques, whose effectiveness and efficiency are constantly challenged. Given a user query, the number of retrieved documents can be overwhelmingly large, hampering their efficient exploitation by the user. In addition, retaining only relevant documents in a query answer is of paramount importance for effectively meeting the user's needs. In this situation, the query expansion technique offers an interesting solution for obtaining a complete answer while preserving the quality of retained documents. This mainly relies on an accurate choice of the terms added to the initial query. Interestingly enough, query expansion takes advantage of large text volumes by extracting statistical information about index term co-occurrences and using it to make user queries better fit the real information needs. In this respect, a promising track consists in the application of data mining methods to extract dependencies between terms, namely a generic basis of association rules between terms. The key feature of our approach is a better trade-off between the size of the mining result and the conveyed knowledge. Thus, faced with the huge number of derived association rules, and in order to select the optimal combination of query terms from the generic basis, we propose to model the problem as a classification problem and solve it using a supervised learning algorithm such as SVM or k-means. For this purpose, we first generate a training set using a genetic-algorithm-based approach that explores the association rule space in order to find an optimal set of expansion terms, improving the MAP of the search results. The experiments were performed on the SDA 95 collection, a data collection for information retrieval. The results were better in terms of both MAP and NDCG. The main observation is that hybridizing text mining techniques and query expansion in an intelligent way allows us to incorporate the good features of each. As this is a preliminary attempt in this direction, there is large scope for enhancing the proposed method.
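
To make the association-rule idea concrete, here is a toy sketch (ours, not the authors' system) that derives expansion candidates from term co-occurrence confidence; the corpus and threshold are illustrative.

```python
from collections import Counter
from itertools import combinations

# Toy "index": each document is a set of terms (stand-in for a real collection).
docs = [{"solar", "energy", "panel"}, {"solar", "energy", "storage"},
        {"wind", "energy", "turbine"}, {"solar", "panel", "roof"}]

pair_counts, term_counts = Counter(), Counter()
for d in docs:
    term_counts.update(d)
    pair_counts.update(combinations(sorted(d), 2))

def expansion_candidates(query_term, min_conf=0.5):
    """Terms t with rule confidence(query_term -> t) above a threshold."""
    out = {}
    for (a, b), n in pair_counts.items():
        if query_term in (a, b):
            other = b if a == query_term else a
            out[other] = n / term_counts[query_term]  # rule confidence
    return {t: c for t, c in out.items() if c >= min_conf}

print(expansion_candidates("solar"))  # 'energy' and 'panel' pass the threshold
```

In the paper's setting, such candidate sets would then be filtered by the trained classifier rather than a fixed confidence threshold.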

Keywords: supervised learning, classification, query expansion, association rules

Procedia PDF Downloads 314
3352 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on the k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists among accuracy, computing resources and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
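
The k-mer representation itself is simple to reproduce; the sketch below (an illustration, not the study's pipeline) counts overlapping k-mers in a sequence, and vectors of such counts are what classification models of this kind would be trained on.

```python
from collections import Counter

def kmer_profile(seq, k=10):
    """Count the overlapping k-mers of length k in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# Toy sequence; the real input would be a whole MTB genome read from FASTA.
profile = kmer_profile("ATGCGATACGCTTGA", k=4)
print(profile.most_common(3))
```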

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 157
3351 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach

Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini

Abstract:

Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, which calls for novel data processing and analytic methods, as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on the k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence data of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with the analysis of whole genome sequence data in producing interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer), with an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction. The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay that exists among accuracy, computing resources and explainability of classification results. Moreover, the analysis provides a new way to elucidate genetic information from genomic data and identify phenotype relationships, which is important especially in explaining complex biological mechanisms.

Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing

Procedia PDF Downloads 144
3350 A Surgical Correction and Innovative Splint for Swan Neck Deformity in Hypermobility Syndrome

Authors: Deepak Ganjiwale, Karthik Vishwanathan

Abstract:

Objective: Splinting is a great domain of the occupational therapy profession. The design of a splint depends on the patient's needs and the nature of the problems and deformities. Swan neck deformity of the finger is not very common; it may occur after various diseases. Conservative treatment of swan neck deformity is available using different static splints only. There are very few reports of surgical correction of swan-neck deformity in benign hypermobility syndrome. Method: This case report describes the result of surgical intervention and hand splinting in a twenty-year-old lady with a past history of cardiovascular stroke with no residual neurological deficit. She presented with a correctable swan neck deformity and had failed to improve with static ring splints intended to correct the deformity. She was noted to have hyperlaxity (Ehlers-Danlos type), with a modified Beighton score of 5/9. She underwent volar plate plication of the proximal interphalangeal joint of the left ring finger, along with hemitenodesis of the ulnar slip of the flexor digitorum superficialis (FDS) tendon, whereby the ulnar slip of the FDS was passed through a small surgically created rent in the A2 pulley and sutured back to itself. Result: Postoperatively, the patient was referred to occupational therapy for splinting, with the instruction that the splint would function at times as a static and at times as a dynamic splint for positioning and correction of the finger. Conclusion: After occupational therapy intervention and splinting, the patient had full correction of the swan-neck deformity, with near-full flexion of the operated finger, and is able to work independently.

Keywords: swan neck, finger, deformity, splint, hypermobility

Procedia PDF Downloads 245
3349 Framework for Explicit Social Justice Nursing Education and Practice: A Constructivist Grounded Theory Research

Authors: Victor Abu

Abstract:

Background: Social justice ideals are considered the foundation of nursing practice. These ideals are not always clearly integrated into nursing professional standards or curricula, which hinders concerted global nursing agendas for becoming aware of social injustice or engaging in action for social justice to improve the health of individuals and groups. Aim and objectives: The aim was to create an educational framework for empowering nursing students for social justice awareness and action. This purpose was attained by understanding the meaning of social justice, the effect of social injustice, the visibility of social justice learning, and ways of integrating social justice in nursing education and practice. Methods: Critical interpretive methodologies and constructivist grounded theory research designs guided the processes of recruiting nursing students (n = 11) and nurse educators (n = 11) at a London nursing university to participate in interviews and focus groups, which were analysed using coding systems. Findings: Firstly, social justice was described as ethical practice that enables individuals and groups to have good access to health resources. Secondly, social injustice was understood as unfair practice that causes minimal access to resources, social deprivation, and poor health. Thirdly, social justice learning was considered to be invisible in nursing education due to a lack of explicit modules, educator knowledge, and organisational support. Lastly, explicit modules, educating educators, and attracting leaders' support were suggested as approaches for the visible integration of social justice in nursing education and practice. Discussion: This research proposes approaches for nursing awareness and action through the development of the critically active nurse-learner, the critically conscious nurse-educator, and the servant nurse-leader. The framework of Awareness for Social Justice Action (ASJA) created in this research is an approach for empowering nursing students for social justice practice. Conclusion: This research contributes to, and advocates for, greater nursing scholarship to shine a spotlight on social justice in the profession.

Keywords: social justice, nursing practice, nursing education, nursing curriculum, social justice awareness, social justice action, constructivist grounded theory

Procedia PDF Downloads 38
3348 Sustainable Radiation Curable Palm Oil-Based Products for Advanced Materials Applications

Authors: R. Tajau, R. Rohani, M. S. Alias, N. H. Mudri, K. A. Abdul Halim, M. H. Harun, N. Mat Isa, R. Che Ismail, S. Muhammad Faisal, M. Talib, M. R. Mohamed Zin

Abstract:

Bio-based polymeric materials are increasingly used for a variety of applications, including surface coatings, drug delivery systems, and tissue engineering. These polymeric materials are ideal for the aforementioned applications because they are derived from natural resources and are non-toxic, low-cost, biocompatible, and biodegradable, with promising thermal and mechanical properties. The nature of their hydrocarbon chains, carbon-carbon double bonds, and ester bonds allows various edible oil sources, such as soy, sunflower, olive, and oil palm, to have their particular structures fine-tuned in the development of innovative materials. Palm oil can be the most eminent raw material for manufacturing new and advanced natural polymeric materials by radiation techniques, such as coating resins, nanoparticles, scaffolds, nanotubes, nanocomposites, and lithography, for different branches of industry in countries where oil palm is abundant. Radiation processing is among the most versatile, cost-effective, simple, and efficient methods. Its mechanisms include crosslinking, reversible addition-fragmentation chain transfer (RAFT) polymerisation, grafting, and degradation, driven by exposure to gamma, electron beam (EB), UV, or laser irradiation, all of which are commonly used in the development of polymeric materials. Therefore, this review focuses on current radiation processing technologies for the development of various radiation-curable bio-based polymeric materials with a promising future in biomedical and industrial applications. The key focus of this review is on radiation-curable palm oil-based products, which have been reported frequently in recent studies.

Keywords: palm oil, radiation processing, surface coatings, VOC

Procedia PDF Downloads 177
3347 Compaction of Municipal Solid Waste

Authors: Jovana Jankovic Pantic, Dragoslav Rakic, Tina Djuric, Irena Basaric Ikodinovic, Snezana Bogdanovic

Abstract:

Regardless of the numerous activities undertaken to reduce municipal solid waste, its annual volumes continue to grow. In Serbia, the most common, and indeed the only, form of waste disposal is at municipal landfills with daily compaction and soil covering. Compaction is one of the basic components of the municipal waste disposal process: well-compacted waste takes up less volume and allows much safer storage. In order to better predict the behavior of municipal waste at landfills, it is necessary to define the compaction parameters: the maximum dry unit weight and the optimal moisture content. In current geotechnical practice, these parameters are most commonly determined by the standard method used in soil mechanics (the Proctor compaction test), with an eventual reduction of compaction energy. Although this methodology is accepted in the newer geotechnical discipline of "waste mechanics", the different treatments of municipal waste at the landfill itself (including pretreatment) indicate the need to change this classical approach. The main reason is the need to simulate the operation of compactors (hedgehogs) at the landfill. Therefore, various innovative solutions were introduced during the research, such as modifying the classic flat Proctor hammer by adding spikes, whose function, in addition to compaction, is the destruction and shredding of municipal waste. The paper presents the behavior of municipal waste for four synthetic waste samples with different waste compositions (Plandište landfill). The samples were tested in the standard Proctor apparatus at the same compaction energy but with two different hammers: a standard flat hammer and a hammer with spikes.
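
For context, the sketch below (with assumed, purely illustrative readings, not data from this study) shows how the two compaction parameters are derived from Proctor test measurements: each dry unit weight follows from a wet unit weight and moisture content pair, and the peak of the resulting compaction curve gives the maximum dry unit weight and the optimal moisture content.

```python
# Minimal sketch with hypothetical readings: deriving the Proctor compaction
# parameters from (moisture content, wet unit weight) pairs.
# Relationship: gamma_dry = gamma_wet / (1 + w), with w as a decimal fraction.

moisture_contents = [0.20, 0.30, 0.40, 0.50, 0.60]   # hypothetical values
wet_unit_weights = [7.6, 8.4, 9.1, 8.9, 8.3]         # kN/m^3, hypothetical

dry_unit_weights = [g / (1 + w)
                    for g, w in zip(wet_unit_weights, moisture_contents)]

# The peak of the compaction curve gives both parameters at once.
i_peak = max(range(len(dry_unit_weights)), key=dry_unit_weights.__getitem__)
print(f"Maximum dry unit weight: {dry_unit_weights[i_peak]:.2f} kN/m^3")
print(f"Optimal moisture content: {moisture_contents[i_peak]:.0%}")
```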

Keywords: compaction, hammer with spikes, landfill, municipal solid waste, Proctor compaction test

Procedia PDF Downloads 210
3346 Microbial Dynamics and Sensory Traits of Spanish- and Greek-Style Table Olives (Olea europaea L. cv. Ascolana tenera) Fermented with Sea Fennel (Crithmum maritimum L.)

Authors: Antonietta Maoloni, Federica Cardinali, Vesna Milanović, Andrea Osimani, Ilario Ferrocino, Maria Rita Corvaglia, Luca Cocolin, Lucia Aquilanti

Abstract:

Table olives (Olea europaea L.) are among the most important fermented vegetables worldwide, while sea fennel (Crithmum maritimum L.) is an emerging food crop with interesting nutritional and sensory traits. Both are characterized by the presence of several bioactive compounds with potential beneficial health effects, thus representing two valuable substrates for the manufacture of innovative vegetable-based preserves. Given these premises, the present study aimed to explore the co-fermentation of table olives and sea fennel to produce new high-value preserves. Spanish-style and Greek-style processing methods and the use of a multiple-strain starter were explored. The preserves were evaluated for their microbial dynamics and key sensory traits. During the fermentation, a progressive pH reduction was observed. Mesophilic lactobacilli, mesophilic lactococci, and yeasts were the main microbial groups at the end of the fermentation, whereas Enterobacteriaceae decreased during fermentation. An evolution of the microbiota was revealed by metataxonomic analysis, with Lactiplantibacillus plantarum dominating in the late stage of fermentation, irrespective of the processing method and the use of the starter. Greek-style preserves turned out crunchier and less fibrous than the Spanish-style ones and were preferred by the trained panelists.

Keywords: lactic acid bacteria, Lactiplantibacillus plantarum, metataxonomy, panel test, rock samphire

Procedia PDF Downloads 121
3345 Extended Knowledge Exchange with Industrial Partners: A Case Study

Authors: C. Fortin, D. Tokmeninova, O. Ushakova

Abstract:

Among 500 Russian universities, Skolkovo Institute of Science and Technology (Skoltech) is one of the youngest (established in 2011), quite small, and highly international, with 20 percent international students and 70 percent of faculty having significant academic experience at top-100 universities (QS, THE). The institute emerged from close collaboration with MIT and leading Russian universities, and it is an entirely English-speaking environment. The curricula of Skoltech's ten Master's programs are based on the CDIO learning outcomes model. However, despite the Institute's unique focus on industrial innovations and startups, one of the main challenges has been that nearly half of MSc graduates enter PhD programs at Skoltech or other universities rather than industry or entrepreneurship. In order to increase the share of students joining the industrial sector after graduation, Skoltech started implementing a number of unique practices that incorporate employers' expectations into the curriculum redesign. In this sense, extended knowledge exchange with industrial partners via collaboration in learning activities, industrial projects, and assessments became essential for students' progression into industrial and entrepreneurial pathways. The current academic curriculum includes the following components based on extended knowledge exchange with industrial partners: the innovation workshop, industrial immersion, special industrial tracks, and MSc defenses. The innovation workshop is a four-week, full-time immersion in Skoltech's vibrant ecosystem designed to foster innovators; it focuses on teamwork and group projects and sparks entrepreneurial instincts from the very first days of study. Since 2019, the number of mentors from industry and startups has increased significantly to guide students across these sectors' demands. Industrial immersion is an exclusive part of the Skoltech curriculum in which students, after the first year of study, spend eight weeks in an industrial company carrying out an individual or team project under the joint guidance of Skoltech and company supervisors. The aim of the industrial immersion is to familiarize students with the relevant needs of Russian industry and to prepare graduates for job placement. During the immersion, the company plays the role of a challenge provider for the students. Skoltech has also started a special industrial track built on deep collaboration with IPG Photonics, a leading R&D company and manufacturer of high-performance fiber lasers and amplifiers for diverse applications. The track is aimed at training a new cohort of engineers and includes a variety of activities for students within the "Photonics" MSc program. It is expected to be a success story and to serve as an example for similar initiatives with other Russian high-tech companies. A further pathway of extended knowledge exchange with industrial partners is the active involvement of potential employers in MSc Defense Committees, where they review and assess MSc thesis projects and participate in defense procedures. The paper evaluates the effects and results of the measures described above.

Keywords: curriculum redesign, knowledge exchange model, learning outcomes framework, stakeholder engagement

Procedia PDF Downloads 72
3344 Modeling the Human Harbor: An Equity Project in New York City, New York USA

Authors: Lauren B. Birney

Abstract:

The envisioned long-term outcome of this three-year research and implementation plan is for 1) teachers and students to design and build their own computational models of real-world environmental-human health phenomena occurring within the context of the "Human Harbor" and 2) project researchers to evaluate the degree to which these integrated Computer Science (CS) education experiences in New York City (NYC) public school classrooms (PreK-12) impact students' computational-technical skill development, job readiness, career motivations, and measurable abilities to understand, articulate, and solve the underlying phenomena at the center of their models. This effort builds on the partnership's successes over the past eight years in developing a benchmark model of restoration-based Science, Technology, Engineering, and Math (STEM) education for urban public schools and achieving relatively broad-based implementation in the nation's largest public school system. The Billion Oyster Project Curriculum and Community Enterprise for Restoration Science (BOP-CCERS STEM + Computing) curriculum, teacher professional development programs, and community engagement programs have reached more than 200 educators and 11,000 students at 124 schools, with 84 waterfront locations and Out of School Time (OST) programs. The BOP-CCERS partnership is poised to develop a more refined focus on integrating computer science across the STEM domains; teaching industry-aligned computational methods and tools; and explicitly preparing students from the city's most under-resourced and underrepresented communities for upwardly mobile careers in NYC's ever-expanding "digital economy," in which jobs require computational thinking and an increasing percentage require discrete computer science technical skills. The project objectives are as follows: 1. Computational Thinking (CT) Integration: integrate computational thinking core practices across the existing middle and high school BOP-CCERS STEM curriculum as a means of scaffolding toward long-term computer science and computational modeling outcomes. 2. Data Science and Data Analytics: enable researchers to conduct interviews with teachers, students, community members, partners, stakeholders, and STEM industry professionals, alongside collaborative analysis and data collection. As a centerpiece, the BOP-CCERS partnership will expand to include a dedicated computer science education partner: New York City Department of Education (NYCDOE) Computer Science for All (CS4ALL) NYC will serve as the dedicated Computer Science (CS) lead, advising the consortium on integration and curriculum development and working in tandem with it. The BOP-CCERS Model™ also validates that, with the appropriate application of technical infrastructure, intensive teacher professional development, and curricular scaffolding, socially connected science learning can be mainstreamed in the nation's largest urban public school system. This is evidenced and substantiated in the initial phases of BOP-CCERS™. The BOP-CCERS™ student curriculum and teacher professional development have been implemented in approximately 24% of NYC public middle schools, reaching more than 250 educators and 11,000 students directly. BOP-CCERS™ is a fully scalable and transferable educational model, adaptable to all American school districts.
In all settings of the proposed Phase IV initiative, the primary beneficiary group will be underrepresented NYC public school students who live in high-poverty neighborhoods and are traditionally underrepresented in the STEM fields, including African Americans, Latinos, English language learners, and children from economically disadvantaged households. In particular, BOP-CCERS Phase IV will explicitly prepare underrepresented students for skilled positions within New York City's expanding digital economy, computer science, computational information systems, and innovative technology sectors.
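
To make concrete what a student-built computational model of an environmental-human health phenomenon in the harbor might look like, the following is an illustrative sketch only; every parameter, value, and the model form itself are assumptions for demonstration, not project data or curriculum content. It is a simple discrete-time simulation of an oyster reef filtering suspended sediment from a fixed volume of harbor water.

```python
# Hypothetical classroom-style computational model (all parameters assumed
# for illustration only): a discrete-time simulation of an oyster reef
# reducing suspended sediment in a fixed volume of harbor water.

WATER_VOLUME_L = 1_000_000        # assumed volume of the study area, liters
FILTER_RATE_L_PER_DAY = 190       # assumed filtration per adult oyster per day
N_OYSTERS = 2_000                 # assumed reef population
REMOVAL_EFFICIENCY = 0.25         # assumed fraction of sediment removed per pass

sediment_mg_per_l = 30.0          # assumed starting concentration
for day in range(1, 31):
    # Fraction of the water column the reef filters each day (capped at 1).
    fraction_filtered = min(1.0, N_OYSTERS * FILTER_RATE_L_PER_DAY / WATER_VOLUME_L)
    sediment_mg_per_l *= 1 - REMOVAL_EFFICIENCY * fraction_filtered
    if day % 10 == 0:
        print(f"Day {day:2d}: {sediment_mg_per_l:.2f} mg/L suspended sediment")
```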

Keywords: computer science, data science, equity, diversity and inclusion, STEM education

Procedia PDF Downloads 47