Search results for: computing curricula
1090 Navigating Cyber Attacks with Quantum Computing: Leveraging Vulnerabilities and Forensics for Advanced Penetration Testing in Cybersecurity
Authors: Sayor Ajfar Aaron, Ashif Newaz, Sajjat Hossain Abir, Mushfiqur Rahman
Abstract:
This paper examines the transformative potential of quantum computing in the field of cybersecurity, with a focus on advanced penetration testing and forensics. It explores how quantum technologies can be leveraged to identify and exploit vulnerabilities more efficiently than traditional methods and how they can enhance the forensic analysis of cyber-attacks. Through theoretical analysis and practical simulations, this study highlights the enhanced capabilities of quantum algorithms in detecting and responding to sophisticated cyber threats, providing a pathway for developing more resilient cybersecurity infrastructures.
Keywords: cybersecurity, cyber forensics, penetration testing, quantum computing
Procedia PDF Downloads 67

1089 Method and Apparatus for Optimized Job Scheduling in the High-Performance Computing Cloud Environment
Authors: Subodh Kumar, Amit Varde
Abstract:
Typical on-premises high-performance computing (HPC) environments consist of a fixed number and a fixed set of computing hardware. During the design of the HPC environment, the hardware components, including but not limited to CPU, memory, GPU, and networking, are carefully chosen from select vendors for optimal performance. The high capital cost of building the environment is a prime factor influencing its design. A class of software called “job schedulers” is critical to maximizing these resources and running multiple workloads to extract the maximum value from the high capital cost. In principle, schedulers work by preventing workloads and users from monopolizing the finite hardware resources by queuing jobs in a workload. A cloud-based HPC environment does not have these limitations on the type and quantity of hardware resources. In theory, users and workloads could spin up any number and type of hardware resource. This paper discusses the limitations of using traditional scheduling algorithms for cloud-based HPC workloads. It proposes a new set of features, called “HPC optimizers,” for maximizing the benefits of the elasticity and scalability of the cloud with the goal of cost-performance optimization of the workload.
Keywords: high performance computing, HPC, cloud computing, optimization, schedulers
Procedia PDF Downloads 93

1088 Crow Search Algorithm-Based Task Offloading Strategies for Fog Computing Architectures
Authors: Aniket Ganvir, Ritarani Sahu, Suchismita Chinara
Abstract:
The rapid digitization of various aspects of life is leading to the creation of smart IoT ecosystems, where interconnected devices generate significant amounts of valuable data. However, these IoT devices face constraints such as limited computational resources and bandwidth. Cloud computing emerges as a solution by offering ample resources for offloading tasks, though it introduces latency issues, especially for time-sensitive applications. Fog computing (FC) addresses these latency concerns by bringing computation and storage closer to the network edge, minimizing data travel distance and enhancing efficiency. Offloading tasks to fog nodes or the cloud can conserve energy and extend IoT device lifespan. The offloading process is intricate, with tasks categorized as full or partial, and its optimization presents an NP-hard problem. Traditional greedy search methods struggle to address the complexity of task offloading efficiently. To overcome this, the efficient crow search algorithm (ECSA) has been proposed as a meta-heuristic optimization algorithm. ECSA aims to effectively optimize computation offloading, providing solutions to this challenging problem.
Keywords: IoT, fog computing, task offloading, efficient crow search algorithm
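A minimal sketch of how a crow-search-style metaheuristic can map tasks to fog or cloud nodes (the task sizes, node speeds, latencies, fitness function, and parameter values below are illustrative assumptions, not the authors' ECSA):

```python
import random

# Illustrative problem data (assumed): task CPU demands, node speeds, link latencies.
tasks = [4.0, 7.0, 2.5, 9.0, 5.5, 3.0, 6.0, 8.0]   # work units per task
speed = [1.0, 1.2, 4.0]                             # fog node 0, fog node 1, cloud
latency = [0.1, 0.1, 1.5]                           # network delay per offloaded task

N_NODES = len(speed)

def makespan(assign):
    """Fitness: completion time of the busiest node."""
    load = [0.0] * N_NODES
    for t, n in zip(tasks, assign):
        load[n] += t / speed[n] + latency[n]
    return max(load)

def decode(pos):
    """Map a real-valued crow position to a discrete task-to-node assignment."""
    return [min(N_NODES - 1, max(0, int(round(v)))) for v in pos]

def crow_search(n_crows=20, iters=200, fl=2.0, ap=0.1):
    dim = len(tasks)
    pos = [[random.uniform(0, N_NODES - 1) for _ in range(dim)] for _ in range(n_crows)]
    mem = [p[:] for p in pos]                # each crow remembers its best position
    for _ in range(iters):
        for i in range(n_crows):
            j = random.randrange(n_crows)    # crow i tries to follow crow j
            if random.random() > ap:         # j unaware: i moves toward j's memory
                pos[i] = [p + random.random() * fl * (m - p)
                          for p, m in zip(pos[i], mem[j])]
            else:                            # j aware: i is misled to a random spot
                pos[i] = [random.uniform(0, N_NODES - 1) for _ in range(dim)]
            if makespan(decode(pos[i])) < makespan(decode(mem[i])):
                mem[i] = pos[i][:]           # update memory only on improvement
    best = min(mem, key=lambda m: makespan(decode(m)))
    return decode(best), makespan(decode(best))

assignment, span = crow_search()
print("assignment:", assignment, "makespan:", round(span, 2))
```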
Procedia PDF Downloads 58

1087 Importance of Standards in Engineering and Technology Education
Authors: Ahmed S. Khan, Amin Karim
Abstract:
During the past several decades, the economy of each nation has been significantly affected by globalization and technology. Government regulations and private sector standards affect a majority of world trade. Countries have been working together to establish international standards in almost every field. As a result, workers in all sectors need to have an understanding of standards. Engineering and technology students must not only possess an understanding of engineering standards and applicable government codes, but also learn to apply them in designing, developing, testing and servicing products, processes and systems. Accreditation Board for Engineering & Technology (ABET) criteria for engineering and technology education require students to learn and apply standards in their class projects. This paper is a follow-up of a 2006-2009 NSF initiative awarded to IEEE to help develop tutorials and case study modules for students and encourage standards education at college campuses. It presents the findings of a faculty/institution survey conducted through various U.S.-based listservs representing the major engineering and technology disciplines. The intent of the survey was to gauge the status of the use of standards and regulations in engineering and technology coursework and to identify benchmark practices. In light of the survey findings, recommendations are made to standards development organizations, industry, and academia to help enhance the use of standards in engineering and technology curricula.
Keywords: standards, regulations, ABET, IEEE, engineering, technology curricula
Procedia PDF Downloads 288

1086 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6
Authors: M. Moslehpour, S. Khorsandi
Abstract:
Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEND protocol was proposed to automatically protect the auto-configuration process; it secures the neighbor discovery and address resolution processes. To defend against threats to NDP’s integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Besides the advantages of SEND, its disadvantages, such as the heavy computation of the CGA algorithm and the sequential nature of CGA generation, are considerable. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare CGA generation time between self-computing and distributed-computing processes. We focus on the impact of malicious nodes on CGA generation time in the network. According to the results, even when malicious nodes participate in the generation process, CGA generation time is lower than when it is computed on a single node. With a Trust Management System, detecting and isolating malicious nodes becomes easier.
Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing
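For context, the sequential bottleneck the paper targets is the Sec-dependent modifier search of CGA generation (RFC 3972). A simplified sketch of that search, parallelized across worker processes by partitioning the modifier space (the public-key bytes and the partitioning scheme are assumptions):

```python
import hashlib
from multiprocessing import Pool

SEC = 1                              # Sec parameter: require 16*Sec leading zero bits
PUBKEY = b"dummy-public-key-bytes"   # stand-in for the DER-encoded public key

def hash2_ok(modifier: int) -> bool:
    """RFC 3972 Hash2 test: SHA-1 over modifier || 9 zero octets || public key."""
    digest = hashlib.sha1(modifier.to_bytes(16, "big") + b"\x00" * 9 + PUBKEY).digest()
    return int.from_bytes(digest, "big") >> (160 - 16 * SEC) == 0

def search_range(args):
    """Each worker scans its own interleaved slice of the modifier space
    (this loop is the part that is distributed in the paper)."""
    start, step = args
    m = start
    while not hash2_ok(m):
        m += step
    return m

if __name__ == "__main__":
    workers = 4
    with Pool(workers) as pool:
        # The first worker to find a valid modifier wins; leaving the with-block
        # terminates the others (a real system would cancel them explicitly).
        for modifier in pool.imap_unordered(search_range,
                                            [(i, workers) for i in range(workers)]):
            print("found modifier:", modifier)
            break
```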
Procedia PDF Downloads 278

1085 Accelerating Side Channel Analysis with Distributed and Parallelized Processing
Authors: Kyunghee Oh, Dooho Choi
Abstract:
Although a cryptographic algorithm may have no theoretical weakness, Side Channel Analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing, power consumption, electromagnetic leaks, or even sound, which can be exploited to break the system. Differential Power Analysis is one of the most popular analyses; it computes statistical correlations between hypothesized secret keys and measured power consumption. It usually requires processing huge amounts of data and takes a long time; for some devices with countermeasures, it may take several weeks. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
Keywords: DPA, distributed computing, parallelized processing, side channel analysis
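A toy sketch of the parallelization idea: in correlation-based DPA, each key guess is an independent hypothesis, so the 256 guesses can be farmed out to separate processes. The traces below are simulated with an assumed Hamming-weight leakage model rather than measured:

```python
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(0)
N_TRACES, TRUE_KEY = 2000, 0x3C
plaintexts = rng.integers(0, 256, N_TRACES)

def hw(x):
    """Hamming weight of each byte in an array (the assumed leakage model)."""
    return np.unpackbits(x.astype(np.uint8)[:, None], axis=1).sum(axis=1)

# Simulated power traces: leakage of (plaintext XOR key) plus Gaussian noise.
traces = hw(plaintexts ^ TRUE_KEY) + rng.normal(0, 1.0, N_TRACES)

def correlation_for_guess(k):
    """One DPA hypothesis: correlate predicted leakage for key guess k with
    the measured traces. Guesses are independent, hence parallelizable."""
    predicted = hw(plaintexts ^ k)
    return k, np.corrcoef(predicted, traces)[0, 1]

if __name__ == "__main__":
    with Pool() as pool:                      # distribute the guesses over cores
        results = pool.map(correlation_for_guess, range(256))
    best_key, best_corr = max(results, key=lambda r: r[1])
    print(f"recovered key: {best_key:#04x} (correlation {best_corr:.3f})")
```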
Procedia PDF Downloads 428

1084 Parallel Computing: Offloading Matrix Multiplication to GPU
Authors: Bharath R., Tharun Sai N., Bhuvan G.
Abstract:
This project focuses on developing a parallel computing method aimed at optimizing matrix multiplication through GPU acceleration. Addressing algorithmic challenges, GPU programming intricacies, and integration issues, the project aims to enhance efficiency and scalability. The methodology involves algorithm design, GPU programming, and optimization techniques. Future plans include advanced optimizations, extended functionality, and integration with high-level frameworks. User engagement is emphasized through user-friendly interfaces, open-source collaboration, and continuous refinement based on feedback. The project's impact extends to significantly improving matrix multiplication performance in scientific computing and machine learning applications.
Keywords: matrix multiplication, parallel processing, CUDA, performance boost, neural networks
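A CPU-side model of the tiling scheme a CUDA matrix-multiplication kernel typically uses, with NumPy slices standing in for shared-memory tiles (the tile size and shapes are illustrative assumptions, not the project's actual kernel):

```python
import numpy as np

TILE = 32  # mirrors a 32x32 CUDA thread block staging tiles in shared memory

def tiled_matmul(A, B):
    """CPU model of GPU tiling: each (i, j) tile of C accumulates partial
    products of matching A and B tiles, the way a CUDA block would after
    loading those tiles into shared memory."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m), dtype=A.dtype)
    for i0 in range(0, n, TILE):
        for j0 in range(0, m, TILE):
            acc = np.zeros((min(TILE, n - i0), min(TILE, m - j0)), dtype=A.dtype)
            for k0 in range(0, k, TILE):
                a = A[i0:i0 + TILE, k0:k0 + TILE]   # "shared memory" tile of A
                b = B[k0:k0 + TILE, j0:j0 + TILE]   # "shared memory" tile of B
                acc += a @ b                        # per-block multiply-accumulate
            C[i0:i0 + TILE, j0:j0 + TILE] = acc
    return C

A = np.random.rand(100, 80).astype(np.float32)
B = np.random.rand(80, 120).astype(np.float32)
assert np.allclose(tiled_matmul(A, B), A @ B, atol=1e-3)
```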
Procedia PDF Downloads 58

1083 A Study on How to Develop the Usage Metering Functions of BIM (Building Information Modeling) Software under Cloud Computing Environment
Authors: Kim Byung-Kon, Kim Young-Jin
Abstract:
As project opportunities for the Architecture, Engineering and Construction (AEC) industry have grown more complex and larger, the utilization of BIM (Building Information Modeling) technologies for 3D design and simulation practices has been increasing significantly; typical applications of BIM technologies include clash detection and design alternatives based on 3D planning, which have been extended to construction management in the AEC industry for virtual design and construction. As of now, commercial BIM software has been operated under a single-user environment, which is why the initial costs of its introduction are very high. Cloud computing, one of the most promising next-generation Internet technologies, enables simple Internet devices to use the services and resources provided with BIM software. Recently in Korea, studies linking BIM and cloud computing technologies have been directed toward saving the costs of building BIM-related infrastructure and providing various BIM services for small- and medium-sized enterprises (SMEs). This study addressed how to develop the usage metering functions of BIM software under a cloud computing architecture in order to archive and use BIM data and create an optimal revenue structure so that BIM services may grow spontaneously, considering the demand for cloud resources. To this end, the author surveyed relevant cases and then analyzed needs and requirements from the AEC industry. Based on the results and findings of the foregoing survey and analysis, the author proposes how to optimally develop the usage metering functions of cloud BIM software.
Keywords: construction IT, BIM (Building Information Modeling), cloud computing, BIM-based cloud computing, 3D design, cloud BIM
Procedia PDF Downloads 506

1082 ACO-TS: An ACO-Based Algorithm for Optimizing Cloud Task Scheduling
Authors: Fahad Y. Al-dawish
Abstract:
A large number of organizations and individuals are currently moving to cloud computing; many consider it a significant shift in the field of computing. Cloud computing environments are distributed and parallel systems consisting of a collection of interconnected physical and virtual machines. With the increasing demand for, and profit from, cloud computing infrastructure, diverse computing processes can be executed in the cloud environment. Many organizations and individuals around the world depend on cloud computing infrastructure to carry their applications, platforms, and infrastructure. One of the major and essential issues in this environment is allocating incoming tasks to suitable virtual machines (cloud task scheduling). Cloud task scheduling is classified as an optimization problem, and several meta-heuristic algorithms have been proposed to solve and optimize it. A good task scheduler should adapt its scheduling technique to the changing environment and the types of incoming task sets. In this research project, a cloud task scheduling methodology based on the ant colony optimization (ACO) algorithm, called ACO-TS (Ant Colony Optimization for Task Scheduling), has been proposed and compared with different scheduling algorithms (Random; First Come First Serve, FCFS; and Fastest Processor to the Largest Task First, FPLTF). ACO is a random optimization search method used to assign incoming tasks to available virtual machines (VMs). The main role of the proposed algorithm is to minimize the makespan of a given task set and to maximize resource utilization by balancing the load among virtual machines. The proposed scheduling algorithm was evaluated using the CloudSim toolkit framework. After analyzing and evaluating the experimental results, we find that ACO-TS performs better than the Random, FCFS, and FPLTF algorithms in terms of both makespan and resource utilization.
Keywords: cloud task scheduling, ant colony optimization (ACO), CloudSim, cloud computing
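A minimal sketch of the ACO construction-and-update loop for task-to-VM assignment (the instance data, pheromone parameters, and heuristic are illustrative assumptions, not the evaluated ACO-TS implementation):

```python
import random

# Illustrative instance (assumed): task lengths and VM processing speeds.
tasks = [12, 7, 25, 9, 18, 5, 30, 14]
vm_speed = [1.0, 2.0, 3.5]
N_T, N_V = len(tasks), len(vm_speed)

ALPHA, BETA, RHO, Q = 1.0, 2.0, 0.4, 100.0  # pheromone/heuristic weights, evaporation, deposit

def makespan(assign):
    load = [0.0] * N_V
    for length, v in zip(tasks, assign):
        load[v] += length / vm_speed[v]
    return max(load)

def build_solution(pher):
    """One ant assigns every task to a VM, biased by pheromone and a greedy
    heuristic (earlier expected finish => more attractive)."""
    assign, load = [], [0.0] * N_V
    for t, length in enumerate(tasks):
        weights = [(pher[t][v] ** ALPHA) *
                   ((1.0 / (load[v] + length / vm_speed[v])) ** BETA)
                   for v in range(N_V)]
        v = random.choices(range(N_V), weights=weights)[0]
        assign.append(v)
        load[v] += length / vm_speed[v]
    return assign

def aco_ts(n_ants=15, iters=100):
    pher = [[1.0] * N_V for _ in range(N_T)]
    best, best_span = None, float("inf")
    for _ in range(iters):
        solutions = [build_solution(pher) for _ in range(n_ants)]
        for t in range(N_T):                       # evaporation
            for v in range(N_V):
                pher[t][v] *= (1.0 - RHO)
        for s in solutions:                        # deposit proportional to quality
            span = makespan(s)
            if span < best_span:
                best, best_span = s, span
            for t, v in enumerate(s):
                pher[t][v] += Q / span
    return best, best_span

best, span = aco_ts()
print("task -> VM:", best, "makespan:", round(span, 2))
```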
Procedia PDF Downloads 421

1081 A Type-2 Fuzzy Model for Link Prediction in Social Networks
Authors: Mansoureh Naderipour, Susan Bastani, Mohammad Fazel Zarandi
Abstract:
Predicting links that may occur in the future, as well as missing links, in social networks is an attractive problem in social network analysis. Granular computing can help us model the relationships between human-based systems and the social sciences in this field. In this paper, we present a model based on a granular computing approach and Type-2 fuzzy logic to predict links with respect to nodes’ activity and the relationship between two nodes. Our model is tested on collaboration networks. It is found that the accuracy of prediction is significantly higher than that of the Type-1 fuzzy and crisp approaches.
Keywords: social network, link prediction, granular computing, type-2 fuzzy sets
Procedia PDF Downloads 326

1080 Integrating and Evaluating Computational Thinking in an Undergraduate Marine Science Course
Authors: Dana Christensen
Abstract:
Undergraduate students, particularly in the environmental sciences, have difficulty displaying quantitative skills in their laboratory courses. Students spend time sampling in the field, often using new methods, and are expected to make sense of the data they collect. Computational thinking may be used to navigate these new experiences. We developed a curriculum for the marine science department at a small liberal arts college in the Northeastern United States based on previous computational thinking frameworks. This curriculum incorporates marine science data sets with specific objectives and topics selected by the faculty at the college. The curriculum was distributed to all students enrolled in introductory marine science classes as a mandatory module. Two pre-tests and post-tests will be used to quantitatively assess student progress on both content-based and computational principles. Student artifacts are being collected with each lesson to be coded for content-specific and computational-specific items in a qualitative assessment. There is an overall gap in marine science education research, especially for curricula that focus on computational thinking and the associated quantitative assessment. The curriculum itself, the assessments, and our results may be modified and applied to other environmental science courses due to the nature of the inquiry-based laboratory components that use quantitative skills to understand nature.
Keywords: marine science, computational thinking, curriculum assessment, quantitative skills
Procedia PDF Downloads 59

1079 Closing the Assessment Loop: Case Study in Improving Outcomes for Online College Students during Pandemic
Authors: Arlene Caney, Linda Fellag
Abstract:
To counter the adverse effect of Covid-19 on college student success, two faculty members at a US community college have used web-based assessment data to improve curricula and, thus, student outcomes. This case study exemplifies how “closing the loop” by analyzing outcome assessments in real time can improve student learning for academically underprepared students struggling during the pandemic. The purpose of the study was to develop ways to mitigate the negative impact of Covid-19 on the success of underprepared college students. Using the Assessment, Evaluation, Feedback and Intervention System (AEFIS) and other assessment tools provided by the college’s Office of Institutional Research, an English professor and a Music professor collected data in skill areas related to their curricula over four semesters, gaining insight into specific course sections and learners’ performance across different Covid-driven course formats: face-to-face, hybrid, synchronous, and asynchronous. Real-time data collection allowed faculty to shorten and close the assessment loop, and prompted faculty to enhance their curricula with engaging material, student-centered activities, and a variety of tech tools. Frequent communication, individualized study, constructive criticism, and encouragement were among other measures taken to enhance teaching and learning. As a result, even while student success rates were declining college-wide, student outcomes in these faculty members’ asynchronous and synchronous online classes improved or remained comparable to student outcomes in hybrid and face-to-face sections. These practices have demonstrated that even high-risk students who enter college with remedial-level language and mathematics skills, interrupted education, work and family responsibilities, and language and cultural diversity can maintain positive outcomes in college across semesters, even during the pandemic.
Keywords: AEFIS, assessment, distance education, institutional research center
Procedia PDF Downloads 87

1078 Indigenous Knowledge and Nature of Science Interface: Content Considerations for Science, Technology, Engineering, and Mathematics Education
Authors: Mpofu Vongai, Vhurumuku Elaosi
Abstract:
Many African countries, such as Zimbabwe and South Africa, have curriculum reform agendas that include the incorporation of Indigenous Knowledge and the Nature of Science (NOS) into school Science, Technology, Engineering and Mathematics (STEM) education. It is argued that at the high school level, STEM learning that incorporates understandings of indigenous science and NOS has the potential to provide a strong foundation for culturally embedded scientific knowledge essential for advancement in science and technology. Globally, investment in STEM education is recognized as essential for economic development. For this reason, developing countries such as Zimbabwe and South Africa have been investing in training specialized teachers in the natural sciences and technology. However, in many cases this training has been detached from the cultural realities and contexts of indigenous learners. As a result, STEM curriculum reform has presented implementation challenges to teachers. An issue of major concern is teachers’ pedagogical content knowledge (PCK), which is essential for the effective implementation of these STEM curricula. Well-developed teacher PCK includes an understanding of both the nature of indigenous knowledge (NOIK) and NOS. This paper reports the results of a study that investigated the development of three South African and three Zimbabwean in-service teachers’ abilities to integrate NOS and NOIK as part of their PCK. A participatory action research design was utilized. The main focus was on capturing, determining and developing teachers’ STEM knowledge for integrating NOIK and NOS in science classrooms. Their use of indigenous games was used to determine how their subject knowledge for STEM and their pedagogical abilities could be developed. Qualitative data were gathered through dialogues between the researchers and the in-service teachers, as well as through interviews with the participating teachers. Analysis of the data provides a methodological window through which in-service teachers’ PCK can be STEMitized and their abilities to integrate NOS and NOIK developed. Implications are raised for developing teachers’ STEM education in universities and teacher training colleges.
Keywords: indigenous knowledge, nature of science, pedagogical content knowledge, STEM education
Procedia PDF Downloads 279

1077 Using High Performance Computing for Online Flood Monitoring and Prediction
Authors: Stepan Kuchar, Martin Golasowski, Radim Vavrik, Michal Podhoranyi, Boris Sir, Jan Martinovic
Abstract:
The main goal of this article is to describe the online flood monitoring and prediction system Floreon+, primarily developed for the Moravian-Silesian region in the Czech Republic, and the basic process it uses for running automatic rainfall-runoff and hydrodynamic simulations along with their calibration and uncertainty modeling. Executing such a process sequentially takes too long to be acceptable in an online scenario, so the use of a high-performance computing environment is proposed for all parts of the process to shorten its duration. Finally, a case study on the Ostravice river catchment is presented that shows the actual durations and the gain from the parallel implementation.
Keywords: flood prediction process, high performance computing, online flood prediction system, parallelization
Procedia PDF Downloads 493

1076 DNA Multiplier: A Design Architecture of a Multiplier Circuit Using DNA Molecules
Authors: Hafiz Md. Hasan Babu, Khandaker Mohammad Mohi Uddin, Nitish Biswas, Sarreha Tasmin Rikta, Nuzmul Hossain Nahid
Abstract:
Nanomedicine and bioengineering use biological systems that can perform computing operations. In a biocomputational circuit, different types of biomolecules and DNA (deoxyribonucleic acid) are used as active components. DNA computing has the capability of performing parallel processing and a large storage capacity, which makes it distinct from other computing systems. In most processors, the multiplier is treated as a core hardware block, and multiplication is one of the most time-consuming and lengthy tasks. In this paper, cost-effective DNA multipliers are designed using algorithms of molecular DNA operations and compared with conventional ones. The speed and storage capacity of a DNA multiplier are also much higher than those of a traditional silicon-based multiplier.
Keywords: biological systems, DNA multiplier, large storage, parallel processing
Procedia PDF Downloads 214

1075 Hierarchical Queue-Based Task Scheduling with CloudSim
Authors: Wanqing You, Kai Qian, Ying Qian
Abstract:
The concepts of cloud computing provide users with infrastructure, platform and software as a service, making those services more accessible via the Internet. To better analyze the performance of cloud computing provisioning policies as well as resource allocation strategies, a toolkit named CloudSim was proposed. With CloudSim, a cloud computing environment can be easily constructed by modelling and simulating cloud computing components such as datacenters, hosts, and virtual machines. A good scheduling strategy is the key to achieving load balancing among different machines as well as to improving the utilization of basic resources. Existing scheduling algorithms may work well in some presumptive cases on a single machine; however, they are unable to make the best decision for the unforeseen future. In a real-world scenario, there would be numbers of tasks as well as several virtual machines working in parallel. Based on the concept of multiple queues, this paper presents a new scheduling algorithm to schedule tasks with CloudSim by taking into account several parameters: the machines’ capacity, the priority of tasks, and the history log.
Keywords: hierarchical queue, load balancing, CloudSim, information technology
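CloudSim itself is a Java toolkit; the following is a language-neutral Python sketch of the multi-queue idea described above, draining higher-priority queues first and picking VMs by estimated finish time (the class and field names are assumptions):

```python
from collections import defaultdict

class MultiQueueScheduler:
    """Sketch: one queue per priority level; tasks drain from the highest
    priority queue first and go to the VM with the earliest estimated finish
    time, derived from its capacity and a history log of dispatched work."""

    def __init__(self, vm_capacity):
        self.vm_capacity = vm_capacity                 # instructions/sec per VM
        self.queues = defaultdict(list)                # priority level -> FIFO
        self.finish_time = [0.0] * len(vm_capacity)    # when each VM frees up
        self.history = []                              # (task, vm, start, end)

    def submit(self, task_id, length, priority):
        self.queues[priority].append((task_id, length))

    def dispatch(self):
        for prio in sorted(self.queues, reverse=True):   # highest priority first
            while self.queues[prio]:
                task_id, length = self.queues[prio].pop(0)
                # Pick the VM that would finish this task soonest.
                vm = min(range(len(self.vm_capacity)),
                         key=lambda v: self.finish_time[v] + length / self.vm_capacity[v])
                start = self.finish_time[vm]
                end = start + length / self.vm_capacity[vm]
                self.finish_time[vm] = end
                self.history.append((task_id, vm, start, end))
        return self.history

sched = MultiQueueScheduler(vm_capacity=[1000, 2500])
for i, (length, prio) in enumerate([(8000, 1), (3000, 2), (5000, 2), (1000, 1)]):
    sched.submit(i, length, prio)
for task_id, vm, start, end in sched.dispatch():
    print(f"task {task_id} on VM {vm}: {start:.1f}s -> {end:.1f}s")
```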
Procedia PDF Downloads 422

1074 Performance Analysis of Elliptic Curve Cryptography Using Onion Routing to Enhance the Privacy and Anonymity in Grid Computing
Authors: H. Parveen Begam, M. A. Maluk Mohamed
Abstract:
Grid computing is an environment that allows the sharing and coordinated use of diverse resources in dynamic, heterogeneous and distributed environments using a Virtual Organization (VO). Security is a critical issue due to the open nature of the wireless channels in grid computing, which requires three fundamental services: authentication, authorization, and encryption. Privacy and anonymity are considered important factors when communicating over publicly spanned networks like the web. To ensure a high level of security, we explored an extension of onion routing, which has been used with dynamic token exchange along with protection of the privacy and anonymity of individual identities. To improve the performance of encrypting the layers, elliptic curve cryptography is used. Compared to traditional cryptosystems like RSA (Rivest-Shamir-Adleman), ECC (Elliptic Curve Cryptosystem) offers equivalent security with smaller key sizes, which results in faster computations, lower power consumption, and memory and bandwidth savings. This paper presents an estimation of the performance improvements of onion routing using ECC, as well as a comparison graph between the performance levels of RSA and ECC.
Keywords: grid computing, privacy, anonymity, onion routing, ECC, RSA
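A hedged micro-benchmark sketch of the key-size argument, comparing a 256-bit ECC key with a 3072-bit RSA key of roughly equivalent strength; it relies on the third-party cryptography package, and the iteration count is arbitrary:

```python
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, rsa, padding

MSG = b"grid job token"

def bench(label, gen, sign):
    """Time key generation and a batch of signatures for one algorithm."""
    t0 = time.perf_counter()
    key = gen()
    t1 = time.perf_counter()
    for _ in range(20):
        sign(key)
    t2 = time.perf_counter()
    print(f"{label:12s} keygen {t1 - t0:7.3f}s   20 signs {t2 - t1:7.3f}s")

# NIST puts 256-bit ECC at roughly the same strength as 3072-bit RSA.
bench("ECC P-256",
      lambda: ec.generate_private_key(ec.SECP256R1()),
      lambda k: k.sign(MSG, ec.ECDSA(hashes.SHA256())))
bench("RSA-3072",
      lambda: rsa.generate_private_key(public_exponent=65537, key_size=3072),
      lambda k: k.sign(MSG, padding.PKCS1v15(), hashes.SHA256()))
```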
Procedia PDF Downloads 398

1073 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is an outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Various parameters for analyzing the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer approach that combines the join-idle-queue and join-shortest-queue approaches. The authors have used the CloudAnalyst simulator to test the proposed two-level load balancer approach. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling
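A minimal sketch of such a two-level balancer: join-idle-queue as the first level, with join-shortest-queue as the fallback when no server is idle (the dispatch interface is an illustrative assumption):

```python
from collections import deque

class TwoLevelBalancer:
    """First level: join-idle-queue -- the dispatcher keeps a queue of idle
    servers and routes there at O(1) cost. Second level: if no server is
    idle, fall back to join-shortest-queue across all servers."""

    def __init__(self, n_servers):
        self.queue_len = [0] * n_servers
        self.idle = deque(range(n_servers))   # I-queue of idle server ids

    def route(self, task_id):
        if self.idle:                          # level 1: JIQ
            server = self.idle.popleft()
        else:                                  # level 2: JSQ fallback
            server = min(range(len(self.queue_len)), key=self.queue_len.__getitem__)
        self.queue_len[server] += 1
        return server

    def complete(self, server):
        self.queue_len[server] -= 1
        if self.queue_len[server] == 0:
            self.idle.append(server)           # server reports itself idle again

lb = TwoLevelBalancer(3)
for t in range(7):
    print(f"task {t} -> server {lb.route(t)}")
lb.complete(0)
print("after a completion, task 7 -> server", lb.route(7))
```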
Procedia PDF Downloads 431

1072 Secure Hashing Algorithm and Advanced Encryption Algorithm in Cloud Computing
Authors: Jaimin Patel
Abstract:
Cloud computing is one of the most significant movements in computing technology. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, such as security. Because a cloud is a shared server environment, security is a major concern; it is important to protect users’ private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard (AES), their vulnerabilities, risks of attacks, optimal time and complexity management, and a comparison with other algorithms based on software implementation are presented. Encryption techniques to improve the performance of AES and to reduce risk are given. Secure Hash Algorithms, their vulnerabilities, software implementations, and risks of attacks are covered, along with a comparison with other hashing algorithms and the advantages and disadvantages of hashing versus encryption.
Keywords: cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man in middle attack
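An illustrative sketch of how the two primitives discussed here can be combined for cloud storage: AES-GCM for encryption plus a SHA-256 digest for an independent client-side integrity check (uses the third-party cryptography package; the scheme is a generic pattern, not the paper's proposal):

```python
import os, hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def protect_for_upload(plaintext: bytes, key: bytes):
    """Encrypt with AES-GCM (confidentiality plus a built-in integrity tag),
    then compute a SHA-256 digest the client keeps to verify what the
    cloud later returns."""
    nonce = os.urandom(12)                    # 96-bit nonce; never reuse per key
    blob = nonce + AESGCM(key).encrypt(nonce, plaintext, None)
    return blob, hashlib.sha256(blob).hexdigest()

def verify_and_decrypt(blob: bytes, key: bytes, expected_digest: str) -> bytes:
    if hashlib.sha256(blob).hexdigest() != expected_digest:
        raise ValueError("stored object was modified")   # hash check failed
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)  # also checks GCM tag

key = AESGCM.generate_key(bit_length=256)
blob, digest = protect_for_upload(b"customer record #42", key)
print(verify_and_decrypt(blob, key, digest))
```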
Procedia PDF Downloads 280

1071 MLProxy: SLA-Aware Reverse Proxy for Machine Learning Inference Serving on Serverless Computing Platforms
Authors: Nima Mahmoudi, Hamzeh Khazaei
Abstract:
Serving machine learning inference workloads on the cloud is still a challenging task at the production level. The optimal configuration of an inference workload to meet SLA requirements while optimizing infrastructure costs is highly complicated due to the complex interaction between batch configuration, resource configuration, and the variable arrival process. Serverless computing has emerged in recent years to automate most infrastructure management tasks. Workload batching has revealed the potential to improve the response time and cost-effectiveness of machine learning serving workloads. However, it has not yet been supported out of the box by serverless computing platforms. Our experiments have shown that for various machine learning workloads, batching can hugely improve the system’s efficiency by reducing the processing overhead per request. In this work, we present MLProxy, an adaptive reverse proxy to support efficient machine learning serving workloads on serverless computing systems. MLProxy supports adaptive batching to ensure SLA compliance while optimizing serverless costs. We performed rigorous experiments on Knative to demonstrate the effectiveness of MLProxy. We showed that MLProxy can reduce the cost of serverless deployment by up to 92% while reducing SLA violations by up to 99%, results that can be generalized across state-of-the-art model serving frameworks.
Keywords: serverless computing, machine learning, inference serving, Knative, Google Cloud Run, optimization
Procedia PDF Downloads 179

1070 Educating for Acceptance or Action: Bachelor of Social Work Education in Canada
Authors: Elizabeth Radian
Abstract:
In a challenging era of neoliberalism and managerialism in social services, the status of Canadian social work education at the Bachelor of Social Work (BSW) level was examined to determine how prepared students are to practice in a time of resource cutbacks and insecurity. Curricula in BSW programs were the focus, as this generalist degree produces the greatest number of social work graduates in Canada, most of whom work at the front lines of service delivery. The study reviewed the practice frameworks that students in BSW programs were exposed to. Traditionally, schools of social work have embraced two major practice frameworks. The person-in-environment framework is a well-established practice framework taught in most schools. This framework offers some focus on smaller-scale social change, tweaking existing arrangements, and is more accepting of the status quo. An alternate practice framework, taught in fewer schools, has been described as a structural, progressive or anti-oppressive framework. This latter framework challenges the status quo and is focused on social justice and social transformation, often incorporating social action strategies to ensure marginalized voices are heard. Using a content analysis methodology of keywords and phrases to delineate framework orientation, the practice frameworks articulated in the curricula were determined by reviewing the missions/mandates of schools offering a BSW degree, their core course outlines, and their core course textbooks. Social action, as one strategy for initiating social change and transformation, was considered. Initial research covering 28 schools was completed in 2000, with follow-up replications of the initial study in 2005 and 2014. These earlier studies showed that the dominant practice framework taught in BSW programs was the person-in-environment framework; a lesser number of schools were categorized as primarily offering a structural, progressive or anti-oppressive framework. The findings from the current study of 39 Canadian schools of social work are considered to determine how prominent structural, progressive and anti-oppressive frameworks are in current BSW curricula. This study can assist in contemplating the question: are we educating future practitioners for acceptance or action?
Keywords: social work education and pedagogy, social change, social justice, social services
Procedia PDF Downloads 192

1069 Classification of Attacks Over Cloud Environment
Authors: Karim Abouelmehdi, Loubna Dali, Elmoutaoukkil Abdelmajid, Hoda Elsayed, Eladnani Fatiha, Benihssane Abderahim
Abstract:
The security of cloud services is a primary concern of cloud service providers. In this paper, we review the different classifications of cloud attacks put forward by specialized organizations; each agency has its own classification with well-defined properties. The purpose is to present a high-level classification of current research in cloud computing security. This classification is organized around attack strategies and the corresponding defenses.
Keywords: cloud computing, classification, risk, security
Procedia PDF Downloads 548

1068 On the Factors Affecting Computing Students’ Awareness of the Latest ICTs
Authors: O. D. Adegbehingbe, S. D. Eyono Obono
Abstract:
The education sector is constantly faced with rapid changes in technologies, both in terms of ensuring that the curriculum is up to date and in terms of making sure that students are aware of these technological changes. This challenge can be seen as the motivation for this study, which is to examine the factors affecting computing students’ awareness of the latest Information and Communication Technologies (ICTs). The aim of this study is divided into two sub-objectives: the selection of relevant theories and the design of a conceptual model to support them, and the empirical testing of the designed model. The first objective is achieved through a review of existing literature on technology adoption theories and models. The second objective is achieved through a survey of computing students in the four universities of the KwaZulu-Natal province of South Africa. Data collected from this survey are analyzed using the Statistical Package for the Social Sciences (SPSS) with descriptive statistics, ANOVA and Pearson correlations. The main hypothesis of this study is that there is a relationship between the demographics and prior conditions of computing students and their awareness of general ICT trends and of the Digital Switch Over (DSO), a new technology involving the change from analog to digital television broadcasting in order to achieve improved spectrum efficiency. The prior conditions of the computing students considered in this study are students’ perceived exposure to career guidance and students’ perceived curriculum currency. The results of this study confirm that gender, ethnicity, and a high school computing course affect students’ perceived curriculum currency, while high school location affects students’ awareness of DSO. The results also confirm that there is a relationship between students’ prior conditions and their awareness of general ICT trends and of DSO in particular.
Keywords: education, information technologies, IDT, awareness
Procedia PDF Downloads 357

1067 Efficient Semi-Systolic Finite Field Multiplier Using Redundant Basis
Authors: Hyun-Ho Lee, Kee-Won Kim
Abstract:
Arithmetic operations over GF(2^m) have been extensively used in error-correcting codes and public-key cryptography schemes. Finite field arithmetic includes addition, multiplication, division and inversion operations. Addition is very simple and can be implemented with an extremely simple circuit; the other operations are much more complex. Multiplication is the most important operation for cryptosystems such as the elliptic curve cryptosystem, since exponentiation, division, and multiplicative inversion can all be performed by computing multiplications iteratively. In this paper, we present a parallel computation algorithm that performs Montgomery multiplication over a finite field using a redundant basis. Based on this multiplication algorithm, we also present an efficient semi-systolic multiplier over the finite field. The multiplier has lower space and time complexity than related multipliers. Compared to the corresponding existing structures, the multiplier saves at least 5% area, 50% time, and 53% area-time (AT) complexity. Accordingly, it is well suited to VLSI implementation and can easily be applied as a basic component for computing complex operations over a finite field, such as inversion and division.
Keywords: finite field, Montgomery multiplication, systolic array, cryptography
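For background, a plain bit-serial version of GF(2^m) multiplication and the iterated-multiplication idea mentioned above (shown for m = 8 with the AES polynomial; this is the schoolbook method, not the paper's redundant-basis semi-systolic design):

```python
M = 8
IRRED = 0b1_0001_1011   # x^8 + x^4 + x^3 + x + 1 (the AES field polynomial)

def gf_mul(a: int, b: int) -> int:
    """Shift-and-add multiplication in GF(2^m): XOR replaces addition, and
    overflow past degree m-1 is reduced by the field polynomial."""
    result = 0
    for _ in range(M):
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a >> M:              # degree reached m: reduce modulo IRRED
            a ^= IRRED
    return result

def gf_pow(a: int, e: int) -> int:
    """Square-and-multiply exponentiation, illustrating why fast
    multiplication dominates the cost of inversion and division."""
    r = 1
    while e:
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)
        e >>= 1
    return r

# Inversion via Fermat: a^(2^m - 2) is the multiplicative inverse of a.
a = 0x57
inv = gf_pow(a, 2**M - 2)
assert gf_mul(a, inv) == 1
print(f"inverse of {a:#04x} is {inv:#04x}")
```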
Procedia PDF Downloads 294

1066 Proposed Solutions Based on Affective Computing
Authors: Diego Adrian Cardenas Jorge, Gerardo Mirando Guisado, Alfredo Barrientos Padilla
Abstract:
A system based on affective computing can detect and interpret human information such as voice, facial expressions and body movement to detect emotions and execute a corresponding response. These data are important because a person can communicate more effectively with emotions than is possible with words alone. This information can be processed through technological components like facial recognition, gait recognition or gesture recognition. As of now, solutions proposed using this technology only consider one component at a given moment. This research proposes two solutions based on affective computing that take into account more than one component for emotion detection. The proposals reflect the levels of dependency between hardware devices and software, as well as the interaction process between the system and the user, which implies the development of scenarios where both proposals will be put to the test in a live environment. Both solutions are to be developed in code by software engineers to prove their feasibility. To validate the impact on society and business interest, interviews with stakeholders are conducted with an investment mindset, where each solution is rated on a scale of 1 through 5, with 1 being the minimum possible investment and 5 the maximum.
Keywords: affective computing, emotions, emotion detection, face recognition, gait recognition
Procedia PDF Downloads 369

1065 Arithmetic Operations in Deterministic P Systems Based on the Weak Rule Priority
Authors: Chinedu Peter, Dashrath Singh
Abstract:
Membrane computing is a computability model that abstracts its structures and functions from the biological cell. The main ingredient of membrane computing is the notion of a membrane structure, which consists of several cell-like membranes recurrently placed inside a unique skin membrane. The emergence of several variants of membrane computing gives rise to the notion of a P system. This paper presents a variant of P systems for arithmetic operations on non-negative integers based on weak priorities for rule application. Consequently, we obtain deterministic P systems. Two membranes suffice, and there are at most four objects for multiplication and five objects for division throughout the computation processes. The model is simple and has the potential for extension to negative integers and real numbers in general.
Keywords: P system, binary operation, determinism, weak rule priority
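A toy simulation of rule application in a single membrane under weak priority, computing an addition by multiset rewriting (the encoding and rules are illustrative assumptions, not the paper's construction):

```python
from collections import Counter

# Toy deterministic P system sketch: one membrane computes m + n by
# rewriting every 'a' and every 'b' into 'c'. Rules are (priority,
# consumed multiset, produced multiset); under WEAK priority a lower-
# priority rule may still fire on objects the higher-priority rules
# left unused within the same maximally parallel step.
RULES = [
    (2, Counter({"a": 1}), Counter({"c": 1})),
    (1, Counter({"b": 1}), Counter({"c": 1})),
]

def step(contents: Counter):
    produced = Counter()
    fired = False
    for _, lhs, rhs in sorted(RULES, key=lambda r: -r[0]):
        # Fire this rule as many times as the remaining objects allow.
        times = min(contents[o] // n for o, n in lhs.items())
        if times:
            fired = True
            for o, n in lhs.items():
                contents[o] -= n * times
            for o, n in rhs.items():
                produced[o] += n * times
    contents.update(produced)
    return contents if fired else None

membrane = Counter({"a": 3, "b": 4})        # encode 3 and 4 as object counts
while True:
    nxt = step(membrane)
    if nxt is None:                         # halting: no rule applicable
        break
    membrane = nxt
print("result c-count:", membrane["c"])     # 3 + 4 = 7
```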
Procedia PDF Downloads 445

1064 Systems Thinking in Practice Supporting Competence and Sustainable Development Goal Implementation Capability in Student Teaching
Authors: Anette Hay, Zama Simamane
Abstract:
Capacity-building and the integration of practical activities are among the key aims of the SDGs of the 2030 Agenda for Sustainable Development. This paper focuses on SDG #17 – “the means of implementation” – and the role of systems thinking in practice (STiP) in supporting both competence and SDG implementation capability in teacher education curricula at North-West University, South Africa. The “Environmental Management for Sustainability” module (EDTM 312), which is compulsory for all students enrolled in the education program at North-West University, is used as a case study. There is a need for higher education to implement and practically integrate the SDGs into curricula, and one way to achieve this is through the development of competencies. Education for Sustainable Development (ESD) has the potential to offer approaches that can be useful in the development of capacity-building activities to foster sustainability. The methodological approach adopted is based on a participatory paradigm carried out over two cycles with reflection. This paper focuses on systems thinking in practice, demonstrating how students apply competencies to situations and reflect on them, and how praxis captures their actual experiences. The results of this research indicate how to re-orientate the EDTM 312 curriculum to include an environmental justice focus. This research shares practical knowledge of systems thinking as a sustainability competency.
Keywords: education for sustainable development, environmental justice competencies, sustainable development goals, systems thinking in practice
Procedia PDF Downloads 64

1063 An Effective Route to Control of the Safety of Accessing and Storing Data in the Cloud-Based Data Base
Authors: Omid Khodabakhshi, Amir Rozdel
Abstract:
Cloud computing security research faces a number of challenges because data centers hold complex private information and are always exposed to risks of information disclosure through hacker attacks or internal adversaries. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. So far, there are many software solutions for improving security in virtual machines, but software alone is not enough to solve security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, this article proposes a structure for the execution of highly isolated security-sensitive code using secure computing hardware in virtual environments. It also allows remote validation of code together with its inputs and outputs. These security features are provided even in situations where the BIOS, the operating system, and even the hypervisor are infected. To achieve these goals, we use the hardware support provided by recent Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the trusted computing base (TCB) to the security-sensitive code.
Keywords: code, cloud computing, security, virtual machines
Procedia PDF Downloads 191

1062 Design and Implementation of Remote Application Virtualization in Cloud Environments
Authors: Shuen-Tai Wang, Ying-Chuan Chen, Hsi-Ya Chang
Abstract:
Cloud computing is a paradigm that shifts the way computing has been done in the past: users can use cloud resources, such as application software or storage space, without needing to own them. This paper focuses on solutions that introduce the IaaS idea to build cloud-based services and enable individual remote users’ applications in cloud environments, which appear as if they are running on the end user’s local computer. The features of the application delivery solution have been developed based on our previous research on virtualization technology to offer applications independent of location, so that users can work online or offline, anywhere and at any time, with an appropriate device. This proposed effort has the potential to provide an efficient, resilient and elastic environment for cloud services. Users no longer need to burden system managers, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible remote application virtualization service represents the next significant step toward the mobile workplace, letting users access their applications remotely through cloud services from anywhere. This is also made possible by low administrative costs as well as relatively inexpensive end-user terminals and reduced energy expenses.
Keywords: cloud computing, IaaS, virtualization, application delivery
Procedia PDF Downloads 281

1061 Modernization of Translation Studies Curriculum at Higher Education Level in Armenia
Authors: A. Vahanyan
Abstract:
The paper touches upon the problem of revising and modernizing the current curricula on translation studies at Armenian Higher Education Institutions (HEIs). In a contemporary world where the quality and speed of services are most valued, certain higher education centers in Armenia do not demonstrate enough flexibility in revising and amending the courses taught. This issue is present in various curricula at the university level, and in Translation Studies curricula in particular. Technological innovations that are of great help to translators have long since been smoothly integrated into the global translation industry. According to the European Master's in Translation (EMT) framework, translation service provision comprises linguistic, intercultural, information mining, thematic, and technological competencies. Therefore, to form the competencies mentioned above, the curriculum should be seriously restructured to meet modern education and job market requirements, and relevant courses should be proposed. New courses, in particular, should focus on the formation of technological competencies. These suggestions are based upon the author’s research of the problem across various HEIs in Armenia. The updated curricula should include courses aimed at familiarization with various computer-assisted translation (CAT) tools (MemoQ, Trados, OmegaT, Wordfast, etc.) in the translation process and the creation of glossaries and termbases compatible with different platforms, which will ensure consistency in the translation of similar texts and speed up the translation process itself. Another aspect that may be strengthened via curriculum modification is the introduction of interdisciplinary and Project-Based Learning courses, which will develop the information mining and thematic competencies that are also of great importance. Of course, the amendment of the existing curriculum with the mentioned courses will require corresponding faculty development via training, workshops, and seminars. Finally, the provision of extensive internships with translation agencies is strongly recommended, as it will ensure the synthesis of the theoretical background and practical skills highly required for this specific area. Summing up, the restructuring and modernization of the existing curricula on Translation Studies should focus on three major aspects: the introduction of new courses that meet global quality standards of education, professional development for faculty, and the integration of extensive internships supervised by experts in the field.
Keywords: competencies, curriculum, modernization, technical literacy, translation studies
Procedia PDF Downloads 131