Search results for: computing curricula
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1192

772 Application of Griddization Management to Construction Hazard Management

Authors: Lingzhi Li, Jiankun Zhang, Tiantian Gu

Abstract:

Hazard management, which can prevent fatal accidents and property losses, is a fundamental process during the construction stage of buildings. However, due to a lack of safety supervision resources and to operational pressures, hazard management is implemented poorly and ineffectively in China. In order to improve the quality of construction safety management, it is critical to explore the use of information technologies to ensure that the hazard management process is efficient and effective. After examining the existing problems of construction hazard management in China, this paper develops a griddization management model for construction hazard management. First, following the knowledge grid infrastructure, a griddization computing infrastructure for construction hazard management is designed that includes five layers: a resource entity layer, an information management layer, a task management layer, a knowledge transformation layer and an application layer. This infrastructure serves as the technical support for realizing grid management. Second, the study divides construction hazards into grids at the city, district and construction site levels according to grid principles. Third, a griddization management process comprising hazard identification, assessment and control is developed, in which all stakeholders of construction safety management, such as owners, contractors, supervision organizations and government departments, take their corresponding responsibilities. Finally, a case study based on actual construction hazard identification, assessment and control is used to validate the effectiveness and efficiency of the proposed griddization management model. The advantage of the designed model is that it realizes information sharing and cooperative management between the various safety management departments.

Keywords: construction hazard, griddization computing, grid management, process

Procedia PDF Downloads 257
771 A Comparative Case Study on Teaching Romanian Language to Foreign Students: Swedes in Lund versus Arabs in Alba Iulia

Authors: Lucian Vasile Bagiu, Paraschiva Bagiu

Abstract:

The study is a contrastive essay on language acquisition and learning, following the outcomes of teaching Romanian to foreign students both at Lund University, Sweden (from 2014 to 2017) and at the ‘1 Decembrie 1918’ University in Alba Iulia, Romania (2017-2018). Having employed the same teaching methodology (on campus, with the same curricula) for the same level of study (beginners’ level: A1-A2), the essay focuses on the written exam at the end of the semester. The study discusses grammar exercises concerning: the indefinite and the definite article; the conjugation of verbs in the present indicative; the possessive; verbs in the past tense; the subjunctive; and the degrees of comparison for adjectives. Identifying similar errors in identical grammar exercises solved by different groups of foreign students is an opportunity to emphasize the major challenges any foreigner has to face and overcome when trying to acquire Romanian. The conclusion draws attention to the complexity of Romanian morphology in several key elements, which may prove insurmountable for a foreign speaker regardless of whether language acquisition takes place in a foreign country or at a Romanian university.

Keywords: Arab students, morphological errors, Romanian language, Swedish students, written exam

Procedia PDF Downloads 238
770 Cloud Resources Utilization and Science Teacher’s Effectiveness in Secondary Schools in Cross River State, Nigeria

Authors: Michael Udey Udam

Abstract:

Background: This study investigated the impact of cloud resources, a component of cloud computing, on science teachers’ effectiveness in secondary schools in Cross River State. Three (3) research questions and three (3) alternative hypotheses guided the study. Method: The descriptive survey design was adopted for the study. The population of the study comprised 1209 science teachers in public secondary schools of Cross River State. Sample: A sample of 487 teachers was drawn from the population using a stratified random sampling technique. A researcher-made structured questionnaire with 18 items was used for data collection. Research question one was answered using the Pearson Product Moment Correlation, while research question two and the hypotheses were tested using Analysis of Variance (ANOVA) in the Statistical Package for the Social Sciences (SPSS) at a 0.05 level of significance. Results: The results revealed that there is a positive correlation between the utilization of cloud resources in teaching and teaching effectiveness among science teachers in secondary schools in Cross River State; that there is a negative correlation between gender and the utilization of cloud resources; and that there is a significant correlation between teaching experience and the utilization of cloud resources among these teachers. Conclusion: The study supports the Cross River State government’s policy of introducing cloud computing into the education sector. The study recommends that the policy be sustained.
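As a rough illustration of the statistics the abstract names, a Pearson product-moment correlation and a one-way ANOVA can be run in Python with scipy rather than SPSS. The data below are synthetic stand-ins, not the study’s questionnaire responses, and the three groups are arbitrary splits standing in for experience strata.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical Likert-scale scores for illustration only (n = 487,
# matching the sample size reported in the abstract).
cloud_use = rng.integers(1, 6, size=487).astype(float)
effectiveness = cloud_use + rng.normal(0.0, 1.0, size=487)

# Research question one: Pearson product-moment correlation.
r, p_r = stats.pearsonr(cloud_use, effectiveness)

# Hypothesis testing: one-way ANOVA across three stand-in strata.
groups = np.array_split(cloud_use, 3)
f, p_f = stats.f_oneway(*groups)

alpha = 0.05
# By construction the toy data yield a positive correlation; the ANOVA
# groups come from one distribution, so p_f is not expected to be small.
print(r > 0, p_r < alpha)
```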

Keywords: cloud resources, science teachers, effectiveness, secondary school

Procedia PDF Downloads 54
769 An Adjoint-Based Method to Compute Derivatives with Respect to Bed Boundary Positions in Resistivity Measurements

Authors: Mostafa Shahriari, Theophile Chaumont-Frelet, David Pardo

Abstract:

Resistivity measurements are used to characterize the Earth’s subsurface. They are categorized into two different groups: (a) those acquired on the Earth’s surface, for instance, controlled source electromagnetic (CSEM) and Magnetotellurics (MT), and (b) those recorded with borehole logging instruments such as Logging-While-Drilling (LWD) devices. LWD instruments are mostly used for geo-steering purposes, i.e., to adjust dip and azimuthal angles of a well trajectory to drill along a particular geological target. Modern LWD tools measure all nine components of the magnetic field corresponding to three orthogonal transmitter and receiver orientations. In order to map the Earth’s subsurface and perform geo-steering, we invert measurements using a gradient-based method that utilizes the derivatives of the recorded measurements with respect to the inversion variables. For resistivity measurements, these inversion variables are usually the constant resistivity value of each layer and the bed boundary positions. It is well-known how to compute derivatives with respect to the constant resistivity value of each layer using semi-analytic or numerical methods. However, similar formulas for computing the derivatives with respect to bed boundary positions are unavailable. The main contribution of this work is to provide an adjoint-based formulation for computing derivatives with respect to the bed boundary positions. The key idea to obtain the aforementioned adjoint state formulations for the derivatives is to separate the tangential and normal components of the field and treat them differently. This formulation allows us to compute the derivatives faster and more accurately than with traditional finite differences approximations. In the presentation, we shall first derive a formula for computing the derivatives with respect to the bed boundary positions for the potential equation. Then, we shall extend our formulation to 3D Maxwell’s equations. 
Finally, by considering a 1D domain and reducing the dimensionality of the problem, which is a common practice in the inversion of resistivity measurements, we shall derive a formulation to compute the derivatives of the measurements with respect to the bed boundary positions using a 1.5D variational formulation. We shall then illustrate the accuracy and convergence properties of our formulations by comparing numerical results with the analytical derivatives for the potential equation. For the 1.5D Maxwell’s system, we shall compare numerical results based on the proposed adjoint-based formulation versus those obtained with a traditional finite difference approach. These results show that the proposed adjoint-based technique produces solutions of enhanced accuracy at negligible cost, as opposed to the finite difference approach, which requires the solution of one additional problem per derivative.
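To make the cost argument concrete, the toy model below (not the authors’ formulation) treats a 1D stack of two resistive layers in series and differentiates a voltage “measurement” with respect to the bed boundary position. Each finite-difference derivative costs two extra forward solves, which is precisely the overhead an adjoint formulation avoids; all parameter values are hypothetical.

```python
# Toy 1D layered model: current I flows through two resistive layers in
# series; the measurement is the total voltage drop, and the inversion
# variable of interest is the bed boundary position z_b.
I = 2.0                    # source current (A), hypothetical
rho = (10.0, 50.0)         # layer resistivities (ohm*m), hypothetical
z_top, z_bot = 0.0, 100.0  # fixed top/bottom of the modelled interval (m)

def measurement(z_b):
    """Forward solve: voltage drop across the stack for boundary z_b."""
    h1, h2 = z_b - z_top, z_bot - z_b
    return I * (rho[0] * h1 + rho[1] * h2)

z_b = 40.0
# Analytic derivative with respect to the bed boundary position.
d_exact = I * (rho[0] - rho[1])
# Central finite difference: two additional forward solves per derivative.
eps = 1e-3
d_fd = (measurement(z_b + eps) - measurement(z_b - eps)) / (2 * eps)
print(d_exact, d_fd)
```

In this linear toy problem the finite difference matches the analytic value; in realistic Maxwell-based models the agreement degrades with the step size, which motivates the adjoint approach.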

Keywords: inverse problem, bed boundary positions, electromagnetism, potential equation

Procedia PDF Downloads 164
768 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments. It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
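The workload-placement and cost-modeling ideas the abstract describes can be sketched with a toy cost model that trades off egress pricing against transfer time. The provider names, per-GB prices, and bandwidths below are entirely hypothetical; real multi-cloud pricing is dynamic and far more complex.

```python
# Hypothetical per-provider transfer characteristics (illustrative only).
providers = {
    "cloud_a": {"egress_usd_per_gb": 0.09, "bandwidth_gbps": 5.0},
    "cloud_b": {"egress_usd_per_gb": 0.12, "bandwidth_gbps": 10.0},
    "cloud_c": {"egress_usd_per_gb": 0.05, "bandwidth_gbps": 1.0},
}

def placement_cost(p, data_gb, usd_per_hour=2.0):
    """Egress cost plus a compute-time cost implied by transfer speed."""
    transfer_hours = data_gb * 8 / (p["bandwidth_gbps"] * 3600)
    return data_gb * p["egress_usd_per_gb"] + transfer_hours * usd_per_hour

def cheapest_provider(data_gb):
    """Greedy placement: pick the provider minimizing total modeled cost."""
    return min(providers, key=lambda name: placement_cost(providers[name], data_gb))

print(cheapest_provider(500.0))
```

For large transfers the cheaper egress rate dominates the slower link in this model; changing `usd_per_hour` shifts the balance, which is the kind of sensitivity a predictive cost model would capture.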

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 52
767 Yawning Computing Using Bayesian Networks

Authors: Serge Tshibangu, Turgay Celik, Zenzo Ncube

Abstract:

Road crashes kill more than a million people every year and leave millions more injured or permanently disabled. Various annual reports reveal that the percentage of fatal crashes due to fatigue or to the driver falling asleep comes directly after the percentage of fatal crashes due to intoxicated drivers, and is higher than the combined percentage of fatal crashes due to illegal/unsafe U-turns and illegal/unsafe reversing. Although a relatively small percentage of police reports on road accidents highlights drowsiness and fatigue, the importance of these factors is greater than we might think, hidden by the undercounting of such events. Some scenarios show that these factors are significant in accidents with killed and injured people. There is thus a need for an automatic driver fatigue detection system in order to considerably reduce the number of accidents owing to fatigue. This research approaches the driver fatigue detection problem in an innovative way by combining cues collected from both temporal analysis of drivers’ faces and the environment. Monotony in the driving environment is inter-related with visual symptoms of fatigue on drivers’ faces to achieve fatigue detection. Optical and infrared (IR) sensors are used to analyse the monotony of the driving environment and to detect the visual symptoms of fatigue on the human face. Internal cues from drivers’ faces and external cues from the environment are combined using machine learning algorithms to automatically detect fatigue.
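A minimal sketch of the cue-fusion idea: two binary cues, one facial and one environmental, combined under a conditional-independence (naive Bayes) assumption. All priors and likelihoods below are invented for illustration and are not values from the study, whose full system uses richer Bayesian networks over sensor data.

```python
# Hypothetical probabilities, for illustration only.
p_fatigue = 0.1                       # prior P(fatigued)
p_yawn = {True: 0.7, False: 0.1}      # P(yawning observed | fatigued?)
p_monotony = {True: 0.6, False: 0.3}  # P(monotonous road | fatigued?)

def posterior_fatigue(yawning: bool, monotonous: bool) -> float:
    """P(fatigued | cues), assuming the cues are conditionally independent."""
    def joint(fatigued: bool) -> float:
        ly = p_yawn[fatigued] if yawning else 1 - p_yawn[fatigued]
        lm = p_monotony[fatigued] if monotonous else 1 - p_monotony[fatigued]
        prior = p_fatigue if fatigued else 1 - p_fatigue
        return ly * lm * prior
    num = joint(True)
    return num / (num + joint(False))

# Both cues present should raise the fatigue posterior well above the prior.
print(round(posterior_fatigue(True, True), 3))
```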

Keywords: intelligent transportation systems, bayesian networks, yawning computing, machine learning algorithms

Procedia PDF Downloads 443
766 FRATSAN: A New Software for Fractal Analysis of Signals

Authors: Hamidreza Namazi

Abstract:

Fractal analysis assesses the fractal characteristics of data. It comprises several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion and networks. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed which extracts information from signals through three measures: the fractal dimension, Jeffrey’s measure and the Hurst exponent. After computing these measures, the software plots a graph for each measure. Besides computing the three measures, the software can classify whether or not a signal is fractal. The software uses a dynamic method of analysis for all measures: a sliding window with a length equal to 10% of the total number of data entries is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. In order to test the performance of the software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but can also be used for other purposes. For instance, by analysing the Hurst exponent plot of the EEG signal of a patient with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
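FRATSAN itself is Visual C++ software whose internals are not public; the sketch below only illustrates the sliding-window scheme the abstract describes, using a crude rescaled-range (R/S) Hurst estimate in Python. The scales, signal, and step size are all illustrative choices.

```python
import numpy as np

def rs_hurst(x):
    """Crude rescaled-range (R/S) Hurst estimate for one window."""
    log_n, log_rs = [], []
    for n in (8, 16, 32):  # sub-scales within the window (illustrative)
        rs = []
        for i in range(0, len(x) - n + 1, n):
            c = x[i:i + n]
            dev = np.cumsum(c - c.mean())
            r, s = dev.max() - dev.min(), c.std()
            if s > 0:
                rs.append(r / s)
        if rs:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_n, log_rs, 1)[0]  # log-log slope ~ Hurst exponent

rng = np.random.default_rng(1)
signal = rng.normal(size=1000)

# Sliding window of 10% of the series, as in FRATSAN; stepped by 25
# samples here for brevity (FRATSAN advances one entry at a time).
w = len(signal) // 10
hurst_track = [rs_hurst(signal[i:i + w]) for i in range(0, len(signal) - w + 1, 25)]
```

Tracking the estimate window by window is what makes the method sensitive to local changes, e.g. the sudden Hurst-exponent shifts described for seizure onset.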

Keywords: EEG signals, fractal analysis, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 451
765 Integrating Sustainable Construction Principles into Curriculum Design for Built Environment Professional Programs in Nigeria

Authors: M. Yakubu, M. B. Isah, S. Bako

Abstract:

This paper presents the findings of research which sought to investigate the readiness to integrate sustainable construction principles into curriculum design for built environment professional programmes in Nigerian universities. Developing the knowledge and understanding that construction professionals acquire of sustainable construction practice leads to considerable improvement in the environmental performance of the construction sector. Integrating sustainable environmental issues within built environment education curricula provides the basis of this research. An integration of sustainable development principles into the universities’ built environment professional programmes is carried out with a view to finding solutions to the key issues identified. The perspectives of academia have been assessed and the findings tested for validity through the analysis of the primary quantitative data collected. The secondary data generated show that there are significant differences in the approach to curriculum design within built environment professional programmes, revealing that there is no clearly identifiable ‘best practice’. Consequently, this research reveals that engaging all stakeholders would be a useful component of built environment curriculum development, and that the curriculum should be negotiated with interested parties. These parties have been identified as academia, government, the construction industry and built environment professionals.

Keywords: built environment, curriculum development, sustainable construction, sustainable development

Procedia PDF Downloads 406
764 A Cross-Sectional Examination of Children’s Developing Understanding of the Rainbow

Authors: Michael Hast

Abstract:

Surprisingly little is known from a research perspective about children’s understanding of rainbows and rainbow formation, and how this understanding changes with increasing age. Yet this kind of research is useful when conceptualizing pedagogy, lesson plans, or more general curricula. The present study aims to rectify this shortcoming. In a cross-sectional approach, children of three different age groups (4-5, 7-8 and 10-11 years) were asked to draw pictures that included rainbows. The pictures will be evaluated according to their scientific representation of rainbows, such as the order of colors, as well as according to any non-scientific conceptions, such as solidity. In addition to the drawings, the children took part in small focus groups where they had to discuss various questions about rainbows and rainbow formation. Similar to the drawings, these conversations will be evaluated around the degree of scientific accuracy of the children’s explanations. Gaining a complete developmental picture of children’s understanding of the rainbow may have important implications for pedagogy in early science education. Many other concepts in science, while not explicitly linked to rainbows and rainbow formation, can benefit from the use of rainbows as illustrations – such as understanding light and color, or the use of prisms. Even in non-science domains, such as art and even storytelling, recognizing the differentiation between fact and myth in relation to rainbows could be of value. In addition, research has pointed out that teachers tend to overestimate the proportion of students’ correct answers, so clarifying the actual level of conceptual understanding is crucial in this respect.

Keywords: conceptual development, cross-sectional research, primary science education, rainbows

Procedia PDF Downloads 204
763 Correlation between Entrepreneur's Perception of Human Resource Function and Company's Growth

Authors: Ivan Todorović, Stefan Komazec, Jelena Anđelković-Labrović, Ondrej Jaško, Miha Marič

Abstract:

Micro, small and medium enterprises (MSMEs) are important factors in the economy of every country. Recent years have brought an increased number and greater sophistication of scientific studies related to numerous aspects of entrepreneurship. Various authors try to find a positive correlation between an entrepreneur’s personal characteristics, skills and knowledge on one hand, and company growth and small business success on the other. Different models recognize staff as one of the key elements in every organizational system. A human resource (HR) function is present in almost all large companies, regardless of geographical location or industry. Small and medium enterprises also often have separate positions or even departments for HR administration. However, in the early stages of the organizational life cycle, human resources are usually managed by the founder, the entrepreneur. In this paper we question whether companies whose founder recognizes the significance of human capital in the organization and understands the importance of HR management have higher growth rates and better business results. The findings of this research can be applied in practice, but also in academia, for improving curricula related to MSMEs and entrepreneurship.

Keywords: entrepreneurship, MSME, micro small and medium enterprises, company growth, human resources, HR management

Procedia PDF Downloads 340
762 Synchronous Versus Asynchronous Telecollaboration in Intercultural Communication

Authors: Vita Kalnberzina, Lauren Miller Anderson

Abstract:

The aim of the paper is to report on the results of a telecollaboration project carried out between students of the University of Latvia, National Louis University in the US, and Austral University in Chile during an Intercultural Communication course. The objectives of the study are 1) to compare different forms of student telecollaboration and virtual exchange, 2) to collect and analyse student feedback on the telecollaboration project, and 3) to evaluate the products (films) produced during the telecollaboration project. The research methods used are as follows: a survey of student feedback after the project, video text analysis of the films produced by the students, and interviews with the students participating in the project. We compare the results of a three-year collaboration project in which we tried out both synchronous and asynchronous collaboration. The variables observed were the impact of different time zones, different language proficiency levels of students, and the different curricula developed for collaboration. The main findings suggest that the effort students spend organizing meetings across time zones and getting to know each other diminishes the quality of the product developed and thus reduces the students’ feeling of accomplishment. We therefore propose that asynchronous collaboration, where national teams work on a film project developed specifically by the students of one university for the students of another, results in a better quality film, which in turn appeals more to the students of the other university and creates a deeper intercultural bond between the collaborating students.

Keywords: telecollaboration, intercultural communication, synchronous collaboration, asynchronous collaboration

Procedia PDF Downloads 85
761 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world’s top five gold and rare earth mining organizations by production, reserves and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 listed organization Insight Enterprises, to standardize machine learning solutions which process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through the utilization of cloud software architecture and edge computing, these technological developments enable standardized machine learning applications to inform the strategic optimization of mineral processing. Target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired and utilized for predictive modelling are processed through edge computing and collectively stored within a data lake. Being involved in this digital transformation has necessitated a standardized software architecture to manage the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for the purposes of improved worker safety and production efficiency through big data applications.
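The edge-computing step can be pictured as simple pre-aggregation before data lands in the central lake: raw high-rate sensor readings are reduced to compact summaries so only a fraction of the volume crosses the network. The record shape and sensor rate below are illustrative only, not Newcrest’s actual pipeline.

```python
from statistics import mean

def aggregate_minute(readings):
    """Collapse one minute of raw readings into a compact summary record."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

# e.g. a hypothetical 10 Hz sensor over one minute: 600 raw readings
raw = [20.0 + 0.01 * i for i in range(600)]
summary = aggregate_minute(raw)
print(summary["count"])
```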

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 97
760 Integrating Generic Skills into Disciplinary Curricula

Authors: Sitalakshmi Venkatraman, Fiona Wahr, Anthony de Souza-Daw, Samuel Kaspi

Abstract:

There is a growing emphasis on generic skills in higher education to match the changing skill-set requirements of the labour market. However, researchers and policy makers have not arrived at a consensus on the generic skills that actually contribute towards workplace employability and performance and that complement and/or underpin discipline-specific graduate attributes. In order to strengthen the qualifications framework, a range of ‘generic’ learning outcomes have been considered for students undergoing higher education programs, among them fundamental generic skills such as literacy and numeracy at a level appropriate to the qualification type. This calls for curriculum design approaches that contextualize the form and scope of these fundamental generic skills to support both students’ learning engagement in the course and the graduate attributes required for employability and for progression within their chosen profession. Little research has been reported on integrating such generic skills into discipline-specific learning outcomes. This paper explores the literature on the generic skills required of graduates from the discipline of Information Technology (IT) in relation to an Australian higher education institution. The paper presents the rationale of a proposed Bachelor of IT curriculum designed to contextualize the learning of these generic skills within the students’ discipline studies.

Keywords: curriculum, employability, generic skills, graduate attributes, higher education, information technology

Procedia PDF Downloads 243
759 Human Resource Development in Sri Lankan Universities: An Analysis of the Staff Development Programme at the University of Kelaniya, Sri Lanka

Authors: Chamindi Dilkushi Senaratne

Abstract:

Staff development, both formal and informal, structured and unstructured, is universally accepted as fundamental to the growth of individuals and institutions. This study is based on feedback summaries collected from 2014 to 2017 from 240 participants in the staff development programme for probationary lecturers at the University of Kelaniya, Sri Lanka. It also contains data from interviews conducted with the resource persons in the programme. The study further includes observations from experts involved in staff training in higher education institutions in Sri Lanka. The data reveal that, though the programme has many aspects that can be improved, the selected topics in the curriculum and the new topics that were incorporated had a positive impact on the continuing professional development of staff in Sri Lankan universities. The participants also believe that the programme has an impact on professional development, teaching, the management of classroom and curricula, and research skills. Based on the findings, the study recommends the addition of new topics to the curriculum, such as continuing professional development, codes of conduct in universities, gender awareness and the green concept. The study further recommends programmes for senior academic staff in universities to assist them in reaching higher levels in their careers by focusing on areas such as teaching, research, and administrative skills.

Keywords: staff development, higher education, curriculum, research

Procedia PDF Downloads 207
758 Evaluation of the Nursing Management Course in Undergraduate Nursing Programs of State Universities in Turkey

Authors: Oznur Ispir, Oya Celebi Cakiroglu, Esengul Elibol, Emine Ceribas, Gizem Acikgoz, Hande Yesilbas, Merve Tarhan

Abstract:

This study was conducted to evaluate the academic staff teaching the ‘Nursing Management’ course in the undergraduate nursing programs of the state universities in Turkey and to assess the current content of the course. The design of the study is descriptive. The population of the study consists of the seventy-eight undergraduate nursing programs in the state universities in Turkey. A questionnaire/survey prepared by the researchers was used as the data collection tool. The data were obtained by screening the content of the websites of the nursing education programs between March and May 2016. Descriptive statistics were used to analyze the data. The research indicated that 58% of the undergraduate nursing programs from which data were derived were part of a school of health, 81% of the academic staff had graduated from undergraduate nursing programs, 40% worked as lecturers and 37% specialized in a field other than nursing. The research also showed that the course was included in 98% of the programs for which it was possible to obtain data. The full name of the course was ‘Nursing Management’ in 95% of the programs, and in 98% the course was compulsory. Theory and application hours were 3.13 and 2.91, respectively. Moreover, the content of the course was not shared in 65% of the programs reviewed. This study demonstrated that the experience and expertise of the academic staff teaching the ‘Nursing Management’ course were not sufficient in the management area, and that the schedule and content of the course were insufficient, even though many nursing education programs provided the course. Comparison between the curricula of the course revealed significant differences.

Keywords: nursing, nursing management, nursing management course, undergraduate program

Procedia PDF Downloads 344
757 Integrating Experiential Real-World Learning in Undergraduate Degrees: Maximizing Benefits and Overcoming Challenges

Authors: Anne E. Goodenough

Abstract:

One of the most important roles of higher education professionals is to ensure that graduates have excellent employment prospects. This means providing students with the skills necessary to be immediately effective in the workplace. Increasingly, universities are seeking to achieve this by moving from lecture-based and campus-delivered curricula to more varied delivery, which takes students out of their academic comfort zone and allows them to engage with, and be challenged by, real world issues. One popular approach is the integration of problem-based learning (PBL) projects into curricula. However, although the potential benefits of PBL are considerable, it can be difficult to devise projects that are genuinely meaningful rather than mere ‘hoop jumping’ exercises. This study examines three-way partnerships between academics, students, and external link organizations. It studied the experiences of all partners involved in different collaborative projects to identify how benefits can be maximized and challenges overcome. Focal collaborations included: (1) development of real-world modules with novel assessment whereby the organization became the ‘client’ for student consultancy work; (2) frameworks where students collected and analyzed data for link organizations in research methods modules; (3) placement-based internships and dissertations; (4) immersive fieldwork projects in novel locations; and (5) students working as partners on staff-led research with link organizations. Focus groups, questionnaires and semi-structured interviews were used to identify opportunities and barriers, while quantitative analysis of students’ grades was used to determine academic effectiveness. Common challenges identified by academics were finding suitable link organizations and devising projects that simultaneously provided educational opportunities and tangible benefits.
There was no ‘one size fits all’ formula for success, but careful planning and ensuring clarity of roles/responsibilities were vital. Students were very positive about collaboration projects. They identified benefits to confidence, time-keeping and communication, as well as conveying their enthusiasm when their work was of benefit to the wider community. They frequently highlighted employability opportunities that collaborative projects opened up, and analysis of grades demonstrated the potential for such projects to increase attainment. Organizations generally recognized the value of project outputs, but often required considerable assistance to put the right scaffolding in place to ensure projects worked. Benefits were maximized by ensuring projects were well-designed, innovative, and challenging. Co-publication of projects in peer-reviewed journals sometimes gave additional benefits for all involved, being especially beneficial for students’ curricula vitae. PBL and student projects are by no means new pedagogic approaches: the novelty here came from creating meaningful three-way partnerships between academics, students, and link organizations at all undergraduate levels. Such collaborations can allow students to make a genuine contribution to knowledge, answer real questions, and solve actual problems, all while providing tangible benefits to organizations. Because the projects are actually needed, students tend to engage with learning at a deep level. This enhances student experience, increases attainment, encourages development of subject-specific and transferable skills, and promotes networking opportunities. Such projects frequently rely upon students and staff working collaboratively, thereby also acting to break down the traditional teacher/learner division that is typically unhelpful in developing students as advanced learners.

Keywords: higher education, employability, link organizations, innovative teaching and learning methods, interactions between enterprise and education, student experience

Procedia PDF Downloads 175
756 An Evolutionary Approach for QAOA for Max-Cut

Authors: Francesca Schiavello

Abstract:

This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm for Max-Cut. Whilst classical algorithms have since improved and returned to being faster and more efficient, this was a huge milestone for quantum computing, and the work is often used as a benchmarking tool and a foundation for exploring variants of QAOA. This, alongside other famous algorithms like Grover’s or Shor’s, highlights the potential that quantum computing holds. It also presents the reality of a real quantum advantage where, if the hardware continues to improve, this could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search of the solution space through a population of solutions, it can also be parallelized to speed up the search and optimization. 
The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using the COBYLA optimizer, a linear approximation based method, and in some instances it can even find a better Max-Cut. Whilst the final objective of the work is to create an algorithm that can consistently beat the original QAOA, or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
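The gradient-free, population-based search described above can be pictured with a minimal classical sketch. In the paper the EA evolves QAOA's variational angles and the cost is the measured quantum expectation value; here, purely as an assumed stand-in, a simple elitist EA evolves candidate cut assignments for a toy 5-node ring directly. The graph, population size, and mutation rate are illustrative choices, not taken from the abstract.

```python
import random

# Toy graph: a 5-node ring; edges as (u, v, weight).
EDGES = [(0, 1, 1), (1, 2, 1), (2, 3, 1), (3, 4, 1), (4, 0, 1)]
N = 5

def cut_value(bits):
    """Total weight of edges crossing the partition encoded by `bits`."""
    return sum(w for u, v, w in EDGES if bits[u] != bits[v])

def evolve(pop_size=20, generations=50, mutation_rate=0.2, seed=0):
    """Elitist EA: keep the best half each generation, mutate it to refill.

    No gradients are used anywhere -- selection pressure alone drives the
    population toward better cuts, which is the property the authors exploit
    to sidestep barren plateaus.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cut_value, reverse=True)
        parents = pop[: pop_size // 2]
        children = [[1 - b if rng.random() < mutation_rate else b for b in p]
                    for p in parents]
        pop = parents + children
    return max(pop, key=cut_value)

best = evolve()
print(cut_value(best))  # the maximum cut of a 5-ring is 4
```

Because the population members are evaluated independently, the fitness loop is exactly the part that the abstract proposes to parallelize.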

Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization

Procedia PDF Downloads 44
755 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack

Authors: Vincent Andrew Cappellano

Abstract:

In the early phases of critical infrastructure system design, translating distributed computing requirements into an architecture carries risk, given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system’s intended operations. However, architected systems may meet those availability requirements only during normal operations, and not during component failures or during outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., from normal operations all the way down to significant damage to communications or physical nodes). This increases the risk of poor selection of a candidate architecture, due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric that assesses these architectures over a spectrum of degradations to aid in selecting appropriately resilient architectures. To accomplish this effort, a set of simulation and evaluation efforts are undertaken that will process, in an automated way, a set of sample requirements into a set of potential architectures in which system functions and capabilities are distributed across nodes. Nodes and links will have specific characteristics and, based on sampled requirements, contribute to the overall system functionality, such that as they are impacted or degraded, the resulting functional availability of the system can be determined. 
A reinforcement learning agent will structurally impact the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, we can create a structured method of evaluating the performance of candidate architectures against each other, yielding a metric that rates their resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation against the baseline requirements, with existing multi-factor computing architecture selection processes. It is intended that this additional data will improve the matching of resilient critical infrastructure system requirements to the correct architectures and implementations, supporting improved operation during times of system degradation due to failures and infrastructure attacks.
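One way to picture the proposed degradation sweep is a Monte Carlo sketch: an architecture is described by which nodes host each system function, nodes are failed in increasing numbers, and the mean functional availability is recorded at each degradation level. The node counts, function placements, and uniform random failure model below are illustrative assumptions; the actual effort uses a reinforcement learning agent to choose attacks rather than random sampling.

```python
import random

def availability(nodes, functions, failed):
    """Fraction of system functions still hosted on at least one live node."""
    live = nodes - failed
    return sum(any(n in live for n in hosts)
               for hosts in functions.values()) / len(functions)

def degrade_curve(nodes, functions, trials=200, seed=1):
    """Mean availability at each degradation level (number of failed nodes)."""
    rng = random.Random(seed)
    curve = []
    for k in range(len(nodes) + 1):
        samples = [availability(nodes, functions,
                                set(rng.sample(sorted(nodes), k)))
                   for _ in range(trials)]
        curve.append(sum(samples) / trials)
    return curve

nodes = {0, 1, 2, 3}
# Two candidate architectures for the same three functions:
centralized = {"auth": [0], "storage": [0], "compute": [0]}   # all on node 0
replicated  = {"auth": [0, 1], "storage": [1, 2], "compute": [2, 3]}
print(degrade_curve(nodes, centralized))
print(degrade_curve(nodes, replicated))
```

Comparing the two curves is the metric idea in miniature: both architectures are fully available with zero failures, but the replicated one degrades more gracefully, which is what a designer would read off when selecting a resilient candidate.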

Keywords: architecture, resiliency, availability, cyber-attack

Procedia PDF Downloads 84
754 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve a wide range of consumer problems, from home automation systems to aiding autonomous vehicles, through the deployment of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to aid the vehicle in reaching its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for their data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they may also store sensitive information required by certain applications. Data management in fog nodes is strenuous because fog networks are dynamic in terms of their availability and hardware capability. It becomes more challenging when nodes in the network are also short-lived, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary, because the conventional static way of managing data does not work in fog networks. The proposed solution discusses a protocol that acts by defining sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in a fog node by applying reinforcement learning, so that access to the data is determined dynamically based on the requests.
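As a sketch of the sensitivity-level idea, the read/write check can be reduced to comparing a requester's clearance against each item's label. The tier numbering, the `FogNode` class, and the static comparison rule here are all illustrative assumptions; in the proposed model the access decision itself would be learned dynamically by a reinforcement learning policy rather than fixed as below.

```python
from dataclasses import dataclass, field

# Hypothetical sensitivity tiers: 0 = public ... 3 = restricted.
@dataclass
class FogNode:
    store: dict = field(default_factory=dict)   # key -> (value, sensitivity)

    def write(self, key, value, sensitivity, clearance):
        """Accept a write only if the writer's clearance covers the label."""
        if clearance < sensitivity:
            return False
        self.store[key] = (value, sensitivity)
        return True

    def read(self, key, clearance):
        """Return the value only to requesters cleared for its label."""
        if key not in self.store:
            return None
        value, sensitivity = self.store[key]
        return value if clearance >= sensitivity else None

node = FogNode()
node.write("road_temp", 4.2, sensitivity=1, clearance=2)
print(node.read("road_temp", clearance=0))  # None: insufficient clearance
print(node.read("road_temp", clearance=1))  # 4.2
```

In the full protocol, each fog node would run this gatekeeping locally over its replicated shard, keeping the access mechanism decentralized even as nodes join and leave.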

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 42
753 Raising Awareness of Education for Sustainable Development Oriented School Programs and Curriculum

Authors: Dina L. DiSantis

Abstract:

The Japan-U.S. Teacher Exchange Program for Education for Sustainable Development (ESD) provides an opportunity for teachers from the United States and Japan to travel to each other’s countries in order to experience and learn how each country is implementing efforts to educate for sustainability. Through such programs, teachers from both countries become more aware of what ESD school programs and curricula are being implemented in each country, and they gain a greater sense of global interconnectedness when given the opportunity to share in each other’s culture and life. The primary objectives of the program are to foster a mutual exchange between teachers in the United States and Japan, to increase understanding of culture and educational systems, to give teachers opportunities to collaborate on lessons and projects in areas of sustainability, and to enhance professional development opportunities for both U.S. and Japanese teachers. The two areas of focus for teachers are food education and environmental education. Teachers from both countries collaborate on and design curricula and projects for their students in order to help them become more aware of the importance of global sustainability. An overview of the program and the results of an international collaborative project, encouraging local eating and forging a cultural connection to food, will be presented.

Keywords: education for sustainable development, environmental education, food education, international collaboration

Procedia PDF Downloads 145
752 High Performance Computing Enhancement of Agent-Based Economic Models

Authors: Amit Gill, Lalith Wijerathne, Sebastian Poledna

Abstract:

This research presents the details of the implementation of a high-performance computing (HPC) extension of agent-based economic models (ABEMs) to simulate hundreds of millions of heterogeneous agents. ABEMs offer an alternative approach to studying the economy as a dynamic system of interacting heterogeneous agents and are gaining popularity as an alternative to standard economic models. Over the last decade, ABEMs have been increasingly applied to study various problems related to monetary policy, bank regulations, etc. When it comes to predicting the effects of local economic disruptions, such as major disasters, policy changes, or exogenous shocks, on the economy of a country or region, it is pertinent to study how the disruptions cascade through every single economic entity, affecting its decisions and interactions, and eventually affect the macroeconomic parameters. However, such simulations with hundreds of millions of agents are hindered by the lack of HPC-enhanced ABEMs. In order to address this, a scalable Distributed Memory Parallel (DMP) implementation of ABEMs has been developed using the Message Passing Interface (MPI). A balanced distribution of computational load among MPI processes (i.e., CPU cores) of computer clusters, while taking all the interactions among agents into account, is a major challenge for scalable DMP implementations. Economic agents interact on several random graphs, some of which are centralized (e.g., credit networks), whereas others are dense with random links (e.g., consumption markets). The agents are partitioned into mutually exclusive subsets based on a representative employer-employee interaction graph, while the remaining graphs are made available at a minimum communication cost. To minimize the number of communications among MPI processes, real-life solutions such as the introduction of recruitment agencies, sales outlets, local banks, and local branches of government in each MPI process are adopted. 
Efficient communication among MPI processes is achieved by combining MPI derived data types with the newer features of the latest MPI functions. Most communications are overlapped with computations, thereby significantly reducing the communication overhead. The current implementation is capable of simulating a small open economy. As an example, a single time step of a 1:1 scale model of Austria (i.e., about 9 million inhabitants and 600,000 businesses) can be simulated in 15 seconds. The implementation is being further enhanced to simulate a 1:1 model of the Euro zone (i.e., 322 million agents).
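The balanced distribution of agents over MPI processes can be sketched as a bin-packing heuristic: when agents are partitioned by employer, each firm (with its employees) is greedily assigned to the currently least-loaded process, largest firms first. The firm sizes and process count below are illustrative inventions; the actual implementation partitions on a representative employer-employee graph and must additionally minimize traffic on the other interaction networks.

```python
import heapq

def partition_firms(firm_sizes, n_procs):
    """Greedy longest-processing-time partition: assign each firm (and its
    employees) to the least-loaded MPI process, in decreasing size order."""
    heap = [(0, p) for p in range(n_procs)]   # (current load, process id)
    heapq.heapify(heap)
    assignment = {}
    for firm, size in sorted(firm_sizes.items(), key=lambda kv: -kv[1]):
        load, proc = heapq.heappop(heap)
        assignment[firm] = proc
        heapq.heappush(heap, (load + size, proc))
    return assignment

# Hypothetical employee counts per firm, spread over 3 processes.
firms = {"A": 900, "B": 850, "C": 300, "D": 250, "E": 200, "F": 100}
assign = partition_firms(firms, 3)
loads = {}
for firm, proc in assign.items():
    loads[proc] = loads.get(proc, 0) + firms[firm]
print(sorted(loads.values()))  # per-process loads, close to the 866.7 ideal
```

Keeping whole firms on one process is what makes the dense employer-employee interactions local; the cross-process "recruitment agency" and "sales outlet" proxies mentioned above then absorb the remaining inter-partition traffic.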

Keywords: agent-based economic model, high performance computing, MPI-communication, MPI-process

Procedia PDF Downloads 113
751 Local Homology Modules

Authors: Fatemeh Mohammadi Aghjeh Mashhad

Abstract:

In this paper, we give several ways for computing generalized local homology modules by using Gorenstein flat resolutions. Also, we find some bounds for vanishing of generalized local homology modules.

Keywords: a-adic completion functor, generalized local homology modules, Gorenstein flat modules

Procedia PDF Downloads 402
750 Heat Transfer and Diffusion Modelling

Authors: R. Whalley

Abstract:

The heat transfer modelling for a diffusion process will be considered. Difficulties in computing the time-distance dynamics of the representation will be addressed. Incomplete and irrational Laplace functions will be identified as the computational issue. Alternative approaches to the response evaluation process will be provided. An illustrative application problem will be presented. Graphical results confirming the theoretical procedures employed will be provided.
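Where incomplete or irrational Laplace functions make an exact inverse transform awkward, a standard alternative response evaluation is direct time stepping of the diffusion equation. A minimal explicit finite-difference sketch follows; the grid size, boundary temperatures, and diffusivity are illustrative choices, not values from the paper.

```python
def diffuse(u, alpha, dx, dt, steps):
    """Explicit FTCS update for u_t = alpha * u_xx with fixed end values.
    The scheme is stable only when r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    for _ in range(steps):
        u = ([u[0]]
             + [u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
                for i in range(1, len(u) - 1)]
             + [u[-1]])
    return u

# Bar held at 100 degrees on the left end, 0 on the right, initially cold.
u0 = [100.0] + [0.0] * 9
profile = diffuse(u0, alpha=1.0, dx=1.0, dt=0.4, steps=500)
print([round(x, 1) for x in profile])  # approaches the linear steady state
```

Run long enough, the profile relaxes to the linear steady-state distribution between the two held end temperatures, which is the kind of time-distance response the transform approach evaluates analytically.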

Keywords: heat, transfer, diffusion, modelling, computation

Procedia PDF Downloads 540
749 Creativity and Intelligence: Psychoeducational Connections

Authors: Cristina Costa-Lobo, Carla B. Vestena, Filomena E. Ponte

Abstract:

Creativity and intelligence are concepts that have aroused considerable interest in the educational sciences and in psychological science since the middle of the last century, since they have a great impact on the potential and well-being of individuals. Moreover, owing to progress in cognitive and positive psychology, there has been growing interest in the psychoeducational domain of intelligence and creativity over the last decade. In this theoretical work, the theoretical models that relate intelligence and creativity are comparatively analyzed, several psychoeducational intervention programs implemented to promote creativity are examined, and possibilities, realities, and ironies surrounding the psychological evaluation of intelligence and creativity are signaled. In order to reach a broad perspective on creativity, evidence is presented pointing to the need to evaluate different psychological domains. The psychoeducational intervention programs addressed have, as a common characteristic, the full stimulation of the creative potential of the participants, assumed to be a highly valued capacity at the present time. The results point to two guiding principles shared by all interventions in the ambit of creativity: all individuals can be creative, and creativity is a capacity that can be stimulated. This work underscores the importance of stimulating creativity in educational contexts and the usefulness and pertinence of creating, implementing, and monitoring flexible curricula, adapted to the educational needs of students, that promote collaborative work among teachers, parents, students, psychologists, managers, and educational administrators.

Keywords: creativity, intelligence, psychoeducational intervention programs, psychological evaluation, educational contexts

Procedia PDF Downloads 389
748 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

The abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, “reskilling and upskilling” employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), artificial intelligence (AI), and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in the ever-evolving industry.

Keywords: master data management, IoT, AI & ML, cloud computing, data optimization

Procedia PDF Downloads 54
747 Introduction, Implementation and Challenges Facing Competency Based Curriculum in Kenya, a Case Study for Developing Countries

Authors: Hannah Wamaitha Irungu

Abstract:

Educational reforms have been made from time to time since independence in Kenya. Kenya previously had a curriculum system known as 8.4.4, in which learners go through 8 years of primary, 4 years of secondary, and 4 years of tertiary or college education. The 8.4.4 system was very theoretical and examination oriented, lacked career guidance and I.C.T. infrastructure, and placed pressure on exam grades as the sole means of moving to the next level. Kenya is now implementing a Competency Based Curriculum (C.B.C.) system of education. C.B.C., by contrast, is learner based. It focuses mainly on the abilities of the learners and their strengths and interests, not on systematically training them to pass exams only for progression. The academic pressure is eased, which gives all learners a chance to pursue their fields of strength, not only those endowed academically or theoretically. With C.B.C., each learner’s progress is nurtured and monitored over a period of 14 years, divided into four major levels (2-6-3-3): 1. Pre-primary education [PP1 and PP2], 2 years; 2. Lower primary [grades 1-6], 6 years; 3. Junior secondary [grades 7-9], 3 years; 4. Senior secondary [grades 10-12], 3 years. In this paper, we look at the following aspects of C.B.C.: what necessitates it, its key strengths/benefits, and its application in a developing country; implementation, including what has worked and what is not working with the approach taken by Kenyan education stakeholders; stakeholders, i.e., who should be involved in and own the process; and conclusion, covering lessons learned, current status, and recommendations going forward.

Keywords: benefits, challenges, competency, curricula, Kenya, successes

Procedia PDF Downloads 82
746 Immersing Socio-Affective Instruction within the Constructs of the Academic Curriculum: A Study of Gifted and Talented Programs

Authors: R. Granger-Ellis, R. B. Speaker, Jr., P. J. Austin

Abstract:

This research study examined more than 340 gifted and talented students enrolled in various gifted and talented programs in a large southeastern United States metropolitan area (creative arts, urban charters, suburban public schools). It focused on students receiving distinctive gifted and talented curricula (creative arts, arts-integrated, and academic acceleration) and analyzed (1) their socio-affective development levels and (2) whether a particular curriculum encouraged developmental growth. Research questions guiding the study: (1) How do academically and artistically gifted 10th and 11th grade students perform on psychological scales of social and emotional intelligence? (2) Do adolescents receiving distinctive gifted and talented curricula differ in their socio-affective developmental profiles? Students’ performances on psychometric scales were compared over time and by curriculum type. Over the first semester of the academic year, participants took pre- and post-tests assessing socio-affective intelligence (BarOn EQ-i:YV). Differences in growth on these psychological scales (for individuals and programs) were examined. Program artifacts provided insight for curriculum correlation.

Keywords: gifted and talented curriculum, social and emotional development, moral development, socio-affective curriculum

Procedia PDF Downloads 351
745 Developing an Indigenous Mathematics, Science and Technology Education Master’s Program: A Three Universities Collaboration

Authors: Mishack Thiza Gumbo

Abstract:

The participatory action research study reported in this paper aims to explore indigenous mathematics, science, and technology, with the ultimate goal of developing an indigenous Mathematics, Science and Technology Education Master’s Programme. The study is based on an ongoing collaborative project between the Mathematics, Science and Technology Education Departments of the University of South Africa, the University of Botswana, and Chinhoyi University of Technology. The study targets Mathematics, Science and Technology Education Master’s students and indigenous knowledge holders in these three contexts as research participants. They will be interviewed; documents of existing Mathematics, Science and Technology Education Master’s Programmes will be analysed; and mathematics, science and technology related artefacts will also be collected and analysed. Mathematics, Science, and Technology Education are traditionally referred to as gateway subjects because the world economy revolves around them. Scores of scholars call for the indigenisation of research and methodologies so that research can suit and advance indigenous knowledge and sustainable development. Ethnomathematics, ethnoscience and ethnotechnology exist in indigenous contexts such as blacksmithing, woodcarving, and textile weaving and dyeing, but the current curricula and research in institutions of learning reflect Western notions of these subjects. Indigenisation of the academic programme contributes toward the decolonisation of education. Hence, the development of an indigenous Mathematics, Science and Technology Education Master’s Programme, jointly offered by the three universities mentioned above, will contribute to the transformation of higher education in this sense.

Keywords: indigenous, mathematics, science, technology, master's program, universities, collaboration

Procedia PDF Downloads 139
744 Detailed Quantum Circuit Design and Evaluation of Grover's Algorithm for the Bounded Degree Traveling Salesman Problem Using the Q# Language

Authors: Wenjun Hou, Marek Perkowski

Abstract:

The Traveling Salesman Problem is famous in computing and graph theory. In short, it asks for the Hamiltonian cycle of least total weight in a given graph with N nodes. All variations on this problem, such as those with K-bounded-degree nodes, are classified as NP-complete in classical computing. Although several papers propose theoretical high-level designs of quantum algorithms for the Traveling Salesman Problem, to the best of our knowledge no quantum circuit implementation of these algorithms has been created. In contrast to previous papers, the goal of this paper is not to optimize some abstract complexity measure based on the number of oracle iterations, but to evaluate the real circuit and time costs on a quantum computer. Using the emerging quantum programming language Q# developed by Microsoft, which runs quantum circuits in a quantum computer simulation, an implementation of the bounded-degree problem and its respective quantum circuit were created. To apply Grover’s algorithm to this problem, a quantum oracle was designed that evaluates the cost of a particular set of edges in the graph as well as its validity as a Hamiltonian cycle. Repeating the Grover algorithm with an oracle that finds a successively lower cost each time allows one to transform the decision problem into an optimization problem, finding the minimum cost of Hamiltonian cycles. N log₂ K qubits are put into an equal superposition by applying the Hadamard gate to each qubit. Within these N log₂ K qubits, the method uses an encoding in which every node is mapped to a set of its encoded edges. The oracle consists of several blocks of circuits: a custom-written edge weight adder, node index calculator, uniqueness checker, and comparator, which were all created using only quantum Toffoli gates, including their special forms, the Feynman (CNOT) and Pauli X gates. 
The oracle begins by using the edge encodings specified by the qubits to calculate each node that the path visits, adding up the edge weights along the way. Next, the oracle uses the nodes calculated in the previous step and checks that all the nodes are unique. Finally, the oracle checks that the calculated cost is less than the previously calculated cost. By performing the oracle an optimal number of times, a correct answer can be generated with very high probability. The oracle of the Grover algorithm is then modified using the recalculated minimum cost value, and this procedure is repeated until the cost cannot be further reduced. This algorithm and circuit design have been verified, using several datasets, to generate correct outputs.
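The oracle's check-and-tighten loop can be emulated classically. The sketch below replaces the Grover iterations with exhaustive search over candidate tours, but keeps the structure described above: an oracle that marks a candidate only if it is a valid Hamiltonian cycle cheaper than the current threshold, and an outer loop that lowers the threshold until no cheaper cycle exists. The 4-node weighted graph is an illustrative assumption.

```python
from itertools import permutations

WEIGHTS = {  # symmetric toy graph on 4 nodes: (u, v) -> edge weight
    (0, 1): 2, (0, 2): 9, (0, 3): 4,
    (1, 2): 3, (1, 3): 7, (2, 3): 5,
}

def edge_w(u, v):
    return WEIGHTS[(min(u, v), max(u, v))]

def tour_cost(tour):
    """Cost of the closed cycle tour[0] -> ... -> tour[-1] -> tour[0]."""
    return sum(edge_w(tour[i], tour[(i + 1) % len(tour)])
               for i in range(len(tour)))

def oracle(tour, threshold):
    """Classical stand-in for the quantum oracle: mark a candidate iff all
    its nodes are unique (a valid cycle) and its cost beats the threshold."""
    return len(set(tour)) == len(tour) and tour_cost(tour) < threshold

def grover_minimum(n):
    """Threshold-lowering loop, with exhaustive search standing in for the
    amplitude-amplified Grover search over the candidate register."""
    threshold, best = float("inf"), None
    while True:
        marked = [t for t in permutations(range(n)) if oracle(t, threshold)]
        if not marked:
            return best, threshold   # no cheaper cycle: threshold is minimal
        best = min(marked, key=tour_cost)
        threshold = tour_cost(best)

best, cost = grover_minimum(4)
print(cost)  # minimum Hamiltonian cycle weight of the toy graph
```

On a quantum device, each pass of this loop would be a Grover search whose oracle compares the adder's accumulated edge weight against the hardwired threshold, exactly the comparator block described above.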

Keywords: quantum computing, quantum circuit optimization, quantum algorithms, hybrid quantum algorithms, quantum programming, Grover’s algorithm, traveling salesman problem, bounded-degree TSP, minimal cost, Q# language

Procedia PDF Downloads 171
743 A Cross-Disciplinary Educational Model in Biomanufacturing to Sustain a Competitive Workforce Ecosystem

Authors: Rosa Buxeda, Lorenzo Saliceti-Piazza, Rodolfo J. Romañach, Luis Ríos, Sandra L. Maldonado-Ramírez

Abstract:

Biopharmaceutical manufacturing is one of the major economic activities worldwide. Ninety-three percent of the workforce in a biomanufacturing environment is concentrated in production-related areas. As a result, strategic collaborations between industry and academia are crucial to ensure the availability of the knowledgeable workforce an economic region needs to become competitive in biomanufacturing. In the past decade, our institution has been a key strategic partner with multinational biotechnology companies in supplying science and engineering graduates in the field of industrial biotechnology. Initiatives addressing all levels of the educational pipeline, from K-12 to college to continued education for company employees, have been established over a ten-year span. The Amgen BioTalents Program was designed to provide undergraduate science and engineering students with training in biomanufacturing. The areas targeted by this educational program enhance their academic development, since these topics are not part of their traditional science and engineering curricula. The educational curriculum covered the full process of producing a biomolecule, from the genetic engineering of cells, through expression and purification of a targeted polypeptide, to quality control and validation. This paper will report and describe the implementation details and outcomes of the first sessions of the program.

Keywords: biomanufacturing curriculum, interdisciplinary learning, workforce development, industry-academia partnering

Procedia PDF Downloads 270