Search results for: resilient cloud computing
1189 A Parallel Algorithm for Solving the PFSP on the Grid
Authors: Samia Kouki
Abstract:
Solving NP-hard combinatorial optimization problems with exact search methods, such as Branch-and-Bound, may degenerate into complete enumeration. For that reason, exact approaches are limited to small or moderately sized problem instances, owing to the exponential growth in CPU time as problem size increases. One of the most promising ways to significantly reduce the computational burden of sequential Branch-and-Bound is to design parallel versions of these algorithms that employ several processors. This paper describes a parallel Branch-and-Bound algorithm called GALB for solving the classical permutation flowshop scheduling problem, together with its implementation on a Grid computing infrastructure. The experimental study of our distributed parallel algorithm gives promising results and clearly shows the benefit of the parallel paradigm for solving large-scale instances in moderate CPU time.
Keywords: grid computing, permutation flow shop problem, branch and bound, load balancing
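The abstract does not reproduce GALB itself; as a rough illustration of the underlying search scheme, the sketch below implements a plain sequential branch-and-bound for the permutation flowshop makespan problem with a simple machine-based lower bound. The grid-level parallelism and load balancing described by the author are not reproduced, and all function names and data are ours.

```python
def makespan(perm, p):
    """Completion time of the last job on the last machine for a (partial) permutation.
    p[j][m] = processing time of job j on machine m."""
    m_count = len(p[0])
    c = [0] * m_count
    for j in perm:
        c[0] += p[j][0]
        for m in range(1, m_count):
            c[m] = max(c[m], c[m - 1]) + p[j][m]
    return c[-1]

def lower_bound(perm, remaining, p):
    """Weak but valid bound: completion on the last machine so far
    plus the total remaining work on that machine."""
    last = len(p[0]) - 1
    return makespan(perm, p) + sum(p[j][last] for j in remaining)

def branch_and_bound(p):
    n = len(p)
    best_perm, best_val = None, float("inf")
    stack = [((), frozenset(range(n)))]
    while stack:
        perm, remaining = stack.pop()
        if not remaining:
            val = makespan(perm, p)
            if val < best_val:
                best_val, best_perm = val, perm
            continue
        for j in remaining:
            child = perm + (j,)
            rest = remaining - {j}
            if lower_bound(child, rest, p) < best_val:  # prune dominated branches
                stack.append((child, rest))
    return best_perm, best_val

# processing times for 4 jobs on 3 machines (toy instance)
p = [[5, 4, 4], [2, 3, 7], [6, 2, 3], [4, 4, 1]]
print(branch_and_bound(p))
```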
Procedia PDF Downloads 284
1188 Fire and Explosion Consequence Modeling Using Fire Dynamic Simulator: A Case Study
Authors: Iftekhar Hassan, Sayedil Morsalin, Easir A Khan
Abstract:
Accidents involving fire occur frequently, and their causes show such a great deal of variety that the intervention methods and risk assessment strategies required are unique in each case. On September 4, 2020, a fire and explosion occurred in a confined space, caused by a methane gas leak from an underground pipeline, in the Baitus Salat Jame mosque during night (Esha) prayer in Narayanganj District, Bangladesh, killing 34 people. In this research, the incident is simulated using the Fire Dynamics Simulator (FDS) software to analyze and understand the nature of the accident and its associated consequences. FDS is an advanced computational fluid dynamics (CFD) code for fire-driven fluid flow that numerically solves a large eddy simulation form of the Navier–Stokes equations to simulate fire and smoke spread and to predict thermal radiation, toxic substance concentrations, and other relevant fire parameters. This study focuses on understanding the nature of the fire and on evaluating the consequences of the thermal radiation caused by the vapor cloud explosion. An evacuation model was constructed to visualize the effect of evacuation time and the fractional effective dose (FED) for different types of agents. The results are presented as 3D animations, sliced pictures, and graphs to illustrate the fire hazards caused by thermal radiation and smoke due to the vapor cloud explosion. This study will help to design and develop appropriate response strategies for preventing similar accidents.
Keywords: consequence modeling, fire and explosion, fire dynamics simulation (FDS), thermal radiation
Procedia PDF Downloads 230
1187 Creation of a Realistic Railway Simulator Developed on a 3D Graphic Game Engine Using a Numerical Computing Programming Environment
Authors: Kshitij Ansingkar, Yohei Hoshino, Liangliang Yang
Abstract:
Advances in algorithms related to autonomous systems have made it possible to research improving the accuracy of a train’s location. This has the potential to increase the throughput of a railway network without the need to create additional infrastructure. To develop such a system, the railway industry requires data to test sensor fusion theories or to implement simultaneous localization and mapping (SLAM) algorithms. Although simulation data and ground-truth datasets are available for testing vehicle automation algorithms, regulations and economic considerations mean that such datasets are scarce in the railway industry. There is therefore a need for a simulation environment that can generate realistic synthetic datasets. This paper proposes (1) leveraging the capabilities of open-source 3D graphic rendering software to create a visualization of the environment, (2) utilizing open-source 3D geospatial data for accurate visualization, and (3) integrating the graphic rendering software with a programming language and numerical computing platform. To build such an integrated platform, this paper combines the computing platform’s advanced sensor models, such as LIDAR, camera, IMU, or GPS, with the 3D rendering of the game engine to generate high-quality synthetic data. These datasets can then be used to train railway models and improve the accuracy of a train’s location.
Keywords: 3D game engine, 3D geospatial data, dataset generation, railway simulator, sensor fusion, SLAM
Procedia PDF Downloads 18
1186 A TiO₂-Based Memristor Reliable for Neuromorphic Computing
Authors: X. S. Wu, H. Jia, P. H. Qian, Z. Zhang, H. L. Cai, F. M. Zhang
Abstract:
Bipolar resistance switching behaviour is observed for a Ti/TiO2-x/Au memristor device fabricated by mask-assisted magnetic sputtering. The current–voltage characteristic shows that the curve changes slowly and continuously. When voltage pulses are applied to the device, the set and reset processes remain linear, which is used to emulate synapses. We argue that the conduction mechanism of the device follows the oxygen-vacancy channel model, and that the device resistance changes slowly owing to the reaction between the titanium electrode and the intermediate layer and to the large number of oxygen vacancies in that layer. A Hopfield neural network is then constructed to simulate the behaviour of a neural network in image processing, reaching an accuracy of more than 98%. This shows that the titanium dioxide memristor has broad application prospects in high-performance neural network simulation.
Keywords: memristor fabrication, neuromorphic computing, bionic synaptic application, TiO₂-based
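The abstract reports a Hopfield-network simulation without implementation details; the sketch below is a generic Hopfield network (Hebbian training, synchronous recall) written for illustration only; it does not model the memristor device physics or the 98% image-processing result.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: W = sum of outer products of bipolar (+/-1) patterns, zero diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x)
    np.fill_diagonal(W, 0)
    return W / patterns.shape[0]

def recall(W, x, steps=20):
    """Synchronous updates until convergence (or a step limit)."""
    x = x.copy()
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# two orthogonal 8-pixel bipolar patterns (tiny stand-ins for images)
patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                     [1, -1, 1, -1, 1, -1, 1, -1]])
W = train_hopfield(patterns)
noisy = patterns[0].copy()
noisy[:2] *= -1                  # corrupt two pixels
print(recall(W, noisy))          # should match patterns[0]
```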
Procedia PDF Downloads 94
1185 Automated Detection of Targets and Retrieve the Corresponding Analytics Using Augmented Reality
Authors: Suvarna Kumar Gogula, Sandhya Devi Gogula, P. Chanakya
Abstract:
Augmented reality is defined as the collection of the digital (or) computer generated information like images, audio, video, 3d models, etc. and overlay them over the real time environment. Augmented reality can be thought as a blend between completely synthetic and completely real. Augmented reality provides scope in a wide range of industries like manufacturing, retail, gaming, advertisement, tourism, etc. and brings out new dimensions in the modern digital world. As it overlays the content, it makes the users enhance the knowledge by providing the content blended with real world. In this application, we integrated augmented reality with data analytics and integrated with cloud so the virtual content will be generated on the basis of the data present in the database and we used marker based augmented reality where every marker will be stored in the database with corresponding unique ID. This application can be used in wide range of industries for different business processes, but in this paper, we mainly focus on the marketing industry which helps the customer in gaining the knowledge about the products in the market which mainly focus on their prices, customer feedback, quality, and other benefits. This application also focuses on providing better market strategy information for marketing managers who obtain the data about the stocks, sales, customer response about the product, etc. In this paper, we also included the reports from the feedback got from different people after the demonstration, and finally, we presented the future scope of Augmented Reality in different business processes by integrating with new technologies like cloud, big data, artificial intelligence, etc.Keywords: augmented reality, data analytics, catch room, marketing and sales
Procedia PDF Downloads 240
1184 2016 Taiwan's 'Health and Physical Education Field of 12-Year Basic Education Curriculum Outline (Draft)' Reform and Its Implications
Authors: Hai Zeng, Yisheng Li, Jincheng Huang, Chenghui Huang, Ying Zhang
Abstract:
When children are strong, the country is strong; developing children's basketball is therefore a strategic advantage. Common forms of basketball equipment have had difficulty meeting the needs of teaching the game to young children. Developing teaching aids appropriate for children aged 3-6 is a way to break through the bottlenecks of teaching children's basketball, a critical path to making teaching more enjoyable, and a necessary requirement for the development of early childhood basketball. Using literature review, questionnaires, focus group interviews, and comparative analysis, this study examines 12 kinds of basketball teaching aids used at home and abroad (cloud computing MINI basketball, adjustable MINI basketball, MINI basketball court, shooting-assist paw-print ball, dribble goggles, dribbling machine, cartoon shooting machine, rebounding machine, blocking mat, elastic belt, ladder, and fitness ball), investigating how they improve enjoyment and young children's shooting, dribbling, offensive and defensive rebounding, and transition techniques. The results show that using appropriate forms of children's basketball teaching aids can effectively increase children's enjoyment of the game and improve targeted techniques, with different types of aids enriching the content of the children's basketball game from different perspectives. It is recommended that aids be produced from cartoon-styled, environmentally friendly materials suited to children's colour psychology, that research on children's basketball aids be increased, and that physical education teachers be encouraged to apply such aids.
Keywords: health and physical education field of curriculum outline, health fitness, sports and health curriculum reform, Taiwan, twelve years basic education
Procedia PDF Downloads 397
1183 Resourcing for Post-Disaster Housing Reconstruction: The Case of Cyclone Sidr and Aila in Bangladesh
Authors: Zahidul Islam
Abstract:
This study investigates the effectiveness of resourcing in post-disaster housing reconstruction with reference to Cyclones Sidr and Aila in Bangladesh. Through evaluating three key theories- Build Back Better approach, Balance Scorecard approach and Dynamic Competency theories, the synthesis of literature, and empirical fieldwork, this research develops a dynamic theoretical framework that moves the trajectory of post-disaster housing reconstruction towards the reconstruction of more resilient houses. The ultimate goal of any post-disaster housing reconstruction project is to provide quality houses and to achieve high levels of satisfaction for beneficiaries. However, post-disaster reconstruction projects often fail in their stated objectives; only 10-20% housing needs are met, with most houses constructed on a temporary rather than permanent basis. A number of scholars have argued that access to resources can significantly increase the capacity and capability of disaster victims to rebuild their lives, including the construction of new homes. This study draws on structured interviews of 285 villagers affected by cyclones to investigate the effectiveness of resourcing in rebuilding houses after Cyclone Sidr in 2007 and Cyclone Aila in 2009. Furthermore, semi-structured interviews were conducted with 20 key stakeholders in UNDP, Oxfam, government officials, and national and international NGOs. The results of this study show that recovery rate of cyclone resilient houses that can withstand cyclone is very low and majority of the population are still vulnerable. Furthermore, hierarchical regression of survey data and thematic analyses of qualitative data indicate that access to resources, level of education, quality of building materials and income generating activities of the respondents are critical for effective post-disaster recovery. Conversely, resource availability, lack of coordination among participant organisations, corruption and lack of access to appropriate land constituted significant obstacles to livelihood recovery. Finally, this study makes significant theoretical contributions to theories of post-disaster recovery by introducing new variables and measures for evaluating the quality and effectiveness of post-disaster housing.Keywords: disaster, resourcing, housing, resilience
Procedia PDF Downloads 149
1182 Model and Algorithm for Dynamic Wireless Electric Vehicle Charging Network Design
Authors: Trung Hieu Tran, Jesse O'Hanley, Russell Fowler
Abstract:
When in-wheel wireless charging technology for electric vehicles matures, the development of an integrated network of such charging stations will be essential. In this paper, we therefore investigate the optimisation problem of in-wheel wireless electric vehicle charging network design. A mixed-integer linear programming model is formulated to solve the problem to optimality. In addition, a meta-heuristic algorithm is proposed to solve large-sized instances efficiently within a reasonable computation time. A parallel computing strategy is integrated into the algorithm to speed up its computation. Experimental results on benchmark instances show that our model and algorithm can find optimal solutions and demonstrate their potential for practical applications.
Keywords: electric vehicle, wireless charging station, mathematical programming, meta-heuristic algorithm, parallel computing
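The paper's MILP formulation is not given in the abstract; as a minimal stand-in, the sketch below poses a toy set-covering siting model in PuLP, with all data and variable names invented for illustration.

```python
# pip install pulp
import pulp

# hypothetical candidate road segments and the demand points each one can charge
candidates = ["s1", "s2", "s3", "s4"]
cost = {"s1": 120, "s2": 90, "s3": 150, "s4": 60}          # installation cost per segment
covers = {"s1": {"d1", "d2"}, "s2": {"d2", "d3"},
          "s3": {"d1", "d3", "d4"}, "s4": {"d4"}}
demands = {"d1", "d2", "d3", "d4"}

prob = pulp.LpProblem("wireless_charging_siting", pulp.LpMinimize)
x = {s: pulp.LpVariable(f"build_{s}", cat="Binary") for s in candidates}

# objective: total installation cost
prob += pulp.lpSum(cost[s] * x[s] for s in candidates)

# every demand point must be covered by at least one built segment
for d in demands:
    prob += pulp.lpSum(x[s] for s in candidates if d in covers[s]) >= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([s for s in candidates if x[s].value() == 1])   # cheapest cover here is s2 + s3
```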
Procedia PDF Downloads 85
1181 Design and Implementation of Security Middleware for Data Warehouse Signature, Framework
Authors: Mayada Al Meghari
Abstract:
Grid middleware has recently enabled the large-scale, integrated use of network resources, such as shared data and CPUs, to form a virtual supercomputer. In this work, we present the design and implementation of the middleware for the Data Warehouse Signature (DWS) framework. The aim of using middleware in our DWS framework is to achieve high performance through parallel computing. The middleware is developed on the Alchemi.Net framework and increases security among the network nodes through an authentication and group-key distribution model. This model secures the keys and prevents man-in-the-middle attacks on the middleware. The paper presents the flow-process structures of the middleware design and describes how security is implemented in the DWS middleware through the authentication and group-key distribution model. Finally, an analysis of other middleware approaches shows that the developed DWS middleware provides the most complete coverage of the security issues considered.
Keywords: middleware, parallel computing, data warehouse, security, group-key, high performance
Procedia PDF Downloads 122
1180 Training for Digital Manufacturing: A Multilevel Teaching Model
Authors: Luís Rocha, Adam Gąska, Enrico Savio, Michael Marxer, Christoph Battaglia
Abstract:
The changes observed in the last years in the field of manufacturing and production engineering, popularly known as "Fourth Industry Revolution", utilizes the achievements in the different areas of computer sciences, introducing new solutions at almost every stage of the production process, just to mention such concepts as mass customization, cloud computing, knowledge-based engineering, virtual reality, rapid prototyping, or virtual models of measuring systems. To effectively speed up the production process and make it more flexible, it is necessary to tighten the bonds connecting individual stages of the production process and to raise the awareness and knowledge of employees of individual sectors about the nature and specificity of work in other stages. It is important to discover and develop a suitable education method adapted to the specificities of each stage of the production process, becoming an extremely crucial issue to exploit the potential of the fourth industrial revolution properly. Because of it, the project “Train4Dim” (T4D) intends to develop complex training material for digital manufacturing, including content for design, manufacturing, and quality control, with a focus on coordinate metrology and portable measuring systems. In this paper, the authors present an approach to using an active learning methodology for digital manufacturing. T4D main objective is to develop a multi-degree (apprenticeship up to master’s degree studies) and educational approach that can be adapted to different teaching levels. It’s also described the process of creating the underneath methodology. The paper will share the steps to achieve the aims of the project (training model for digital manufacturing): 1) surveying the stakeholders, 2) Defining the learning aims, 3) producing all contents and curriculum, 4) training for tutors, and 5) Pilot courses test and improvements.Keywords: learning, Industry 4.0, active learning, digital manufacturing
Procedia PDF Downloads 103
1179 Pervasive Computing: Model to Increase Arable Crop Yield through Detection Intrusion System (IDS)
Authors: Idowu Olugbenga Adewumi, Foluke Iyabo Oluwatoyinbo
Abstract:
There is currently much discussion of food security and of increasing the yield of arable crops throughout the world. This article briefly presents research efforts to create digital interfaces to nature, in particular in the area of crop production, with the aim of increasing yield through pervasive computing. The approach goes beyond the use of sensor networks for environmental monitoring by emphasizing the development of a system architecture that detects intruders whose activity reduces the farmer's yield between planting and harvest. The objective of the work is to define a model for a hand-held or portable device for increasing the quality and quantity of arable crops. The system incorporates an infrared motion image sensor with a security alarm that can emit a noise signal to scare intruders on the farm. By monitoring or scaring off humans, rodents, birds, and even pests, this portable image-sensing device will reduce post-harvest losses and thus increase farm yield. Nano-intelligence technology is proposed to combat and minimize the intrusion processes that usually lead to low quality and quantity of farm produce. An intranet will be put in place, comprising a wireless network (WLAN), router, server, and client computers or hand-held devices such as PDAs or mobile phones. This approach enables the development of hybrid systems that will be effective as an on-farm security measure. Precision agriculture has developed alongside the computerization of agricultural production systems and the networking of computerized control systems; in the intelligent plant production systems of controlled greenhouses, information on plant responses, measured by sensors, is used to optimize the system. Further work must be carried out on modeling in a pervasive computing environment to solve problems in agriculture, as the use of electronics in agriculture will attract more young people to the industry.
Keywords: pervasive computing, intrusion detection, precision agriculture, security, arable crop
Procedia PDF Downloads 409
1178 Communication of Sensors in Clustering for Wireless Sensor Networks
Authors: Kashish Sareen, Jatinder Singh Bal
Abstract:
The use of wireless sensor networks (WSNs) has grown enormously in the last decade, highlighting the crucial need for scalable and energy-efficient routing, data gathering, and aggregation protocols in large-scale environments. Wireless sensor networks have recently emerged as an important computing platform and continue to grow in diverse areas, providing new opportunities for networking and services. However, the energy constraints and limited computing resources of the sensor nodes present major challenges for data gathering. The sensors collect data about their surroundings and forward it to a command centre through a base station. The past few years have witnessed increased interest in the potential use of WSNs, as they are very useful for target detection and other applications. Hierarchical clustering protocols have been the most widely used approach for improving overall system lifetime, scalability, and energy efficiency. This paper surveys the state of the art in hierarchical clustering approaches for large-scale WSN environments.
Keywords: clustering, DLCC, MLCC, wireless sensor networks
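As an illustration of the hierarchical clustering idea surveyed here, the sketch below implements a generic LEACH-style round-based cluster-head election; it is our own example and not the DLCC or MLCC protocols named in the keywords.

```python
import random

def elect_cluster_heads(nodes, p=0.1, round_no=0, rng=random):
    """LEACH-style threshold election: each alive node that has not yet served as a
    cluster head in the current epoch becomes one with probability T(n)."""
    period = int(1 / p)
    threshold = p / (1 - p * (round_no % period))
    heads = []
    for node in nodes:
        if node["alive"] and not node["was_head"] and rng.random() < threshold:
            node["was_head"] = True
            heads.append(node["id"])
    return heads

nodes = [{"id": i, "alive": True, "was_head": False} for i in range(100)]
for r in range(5):
    print(f"round {r}: heads = {elect_cluster_heads(nodes, p=0.1, round_no=r)}")
```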
Procedia PDF Downloads 486
1177 Applications of AI, Machine Learning, and Deep Learning in Cyber Security
Authors: Hailyie Tekleselase
Abstract:
Deep learning is increasingly used as a building block of security systems. However, neural networks are hard to interpret and typically opaque to the practitioner. This paper presents a detailed survey of computing methods in cyber security and analyzes the prospects for enhancing cyber security capabilities by increasing the intelligence of security systems. There are many AI-based applications used in industrial scenarios such as the Internet of Things (IoT), smart grids, and edge computing. Machine learning technologies require a training process, which introduces protection problems in the training data and algorithms. We present machine learning techniques currently applied to the detection of intrusion, malware, and spam. Our conclusions are based on an extensive review of the literature as well as on experiments performed on real enterprise systems and network traffic. We conclude that these problems can be solved successfully only when methods of artificial intelligence are used alongside human experts or operators.
Keywords: artificial intelligence, machine learning, deep learning, cyber security, big data
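As a minimal, hedged illustration of the kind of machine-learning intrusion detection discussed, the sketch below trains a random-forest classifier on synthetic flow features; the data and feature names are invented and do not come from the enterprise experiments mentioned in the abstract.

```python
# pip install scikit-learn
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# synthetic flow records: [duration, bytes_sent, bytes_received, packet_rate]
normal = rng.normal([2.0, 5e4, 8e4, 50], [1.0, 2e4, 3e4, 20], size=(500, 4))
attack = rng.normal([0.2, 1e3, 5e2, 900], [0.1, 5e2, 2e2, 200], size=(500, 4))
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 500)   # 0 = benign, 1 = intrusion

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), target_names=["benign", "intrusion"]))
```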
Procedia PDF Downloads 130
1176 Digital Homeostasis: Tangible Computing as a Multi-Sensory Installation
Authors: Andrea Macruz
Abstract:
This paper explores computation as a process for design by examining how computers can become more than an operative strategy in a designer's toolkit. It documents this, building upon concepts of neuroscience and Antonio Damasio's Homeostasis Theory, which is the control of bodily states through feedback intended to keep conditions favorable for life. To do this, it follows a methodology through algorithmic drawing and discusses the outcomes of three multi-sensory design installations, which culminated from a course in an academic setting. It explains both the studio process that took place to create the installations and the computational process that was developed, related to the fields of algorithmic design and tangible computing. It discusses how designers can use computational range to achieve homeostasis related to sensory data in a multi-sensory installation. The outcomes show clearly how people and computers interact with different sensory modalities and affordances. They propose using computers as meta-physical stabilizers rather than tools.Keywords: algorithmic drawing, Antonio Damasio, emotion, homeostasis, multi-sensory installation, neuroscience
Procedia PDF Downloads 111
1175 First Systematic Review on Aerosol Bound Water: Exploring the Existing Knowledge Domain Using the CiteSpace Software
Authors: Kamila Widziewicz-Rzonca
Abstract:
The presence of PM bound water as an integral chemical compound of suspended aerosol particles (PM) has become one of the hottest issues in recent years. The UN climate summits on climate change (COP24) indicate that PM of anthropogenic origin (released mostly from coal combustion) is directly responsible for climate change. Chemical changes at the particle-liquid (water) interface determine many phenomena occurring in the atmosphere such as visibility, cloud formation or precipitation intensity. Since water-soluble particles such as nitrates, sulfates, or sea salt easily become cloud condensation nuclei, they affect the climate for example by increasing cloud droplet concentration. Aerosol water is a master component of atmospheric aerosols and a medium that enables all aqueous-phase reactions occurring in the atmosphere. Thanks to a thorough bibliometric analysis conducted using CiteSpace Software, it was possible to identify past trends and possible future directions in measuring aerosol-bound water. This work, in fact, doesn’t aim at reviewing the existing literature in the related topic but is an in-depth bibliometric analysis exploring existing gaps and new frontiers in the topic of PM-bound water. To assess the major scientific areas related to PM-bound water and clearly define which among those are the most active topics we checked Web of Science databases from 1996 till 2018. We give an answer to the questions: which authors, countries, institutions and aerosol journals to the greatest degree influenced PM-bound water research? Obtained results indicate that the paper with the greatest citation burst was Tang In and Munklewitz H.R. 'water activities, densities, and refractive indices of aqueous sulfates and sodium nitrate droplets of atmospheric importance', 1994. The largest number of articles in this specific field was published in atmospheric chemistry and physics. An absolute leader in the quantity of publications among all research institutions is the National Aeronautics Space Administration (NASA). Meteorology and atmospheric sciences is a category with the most studies in this field. A very small number of studies on PM-bound water conduct a quantitative measurement of its presence in ambient particles or its origin. Most articles rather point PM-bound water as an artifact in organic carbon and ions measurements without any chemical analysis of its contents. This scientometric study presents the current and most actual literature regarding particulate bound water.Keywords: systematic review, aerosol-bound water, PM-bound water, CiteSpace, knowledge domain
Procedia PDF Downloads 128
1174 An Exploratory Case Study of the Transference of Skills and Dispositions Used by a Newly Qualified Teacher
Authors: Lynn Machin
Abstract:
Using the lens of a theoretical framework relating to learning to learn the intention of the case study was to explore how transferable the teaching and learning skills of a newly qualified teacher (post-compulsory education) were when used in an overseas, unfamiliar and challenging post-compulsory educational environment. Particularly, the research sought to explore how this newly qualified teacher made use of the skills developed during their teacher training and to ascertain if, and what, other skills were necessary in order for them to have a positive influence on their learners and for them to be able to thrive within a different country and learning milieu. This case study looks at the experience of a trainee teacher who recently qualified in the UK to teach in post compulsory education (i.e. post 16 education). Rather than gaining employment in a UK based academy or college of further education this newly qualified teacher secured her first employment as a teacher in a province in China. Moreover, the newly qualified teacher had limited travel experience and had never travelled to Asia. She was one of the quieter and more reserved members on the one year teacher training course and was the least likely of the group to have made the decision to work abroad. How transferable the pedagogical skills that she had gained during her training would be when used in a culturally different and therefore (to her, challenging) environment was a key focus of the study. Another key focus was to explore the dispositions being used by the newly qualified teacher in order for her to teach and to thrive in an overseas educational environment. The methodological approach used for this study was both interpretative and qualitative. Associated methods were: Observation: observing the wider and operational practice of the newly qualified teacher over a five day period, and their need, ability and willingness to be reflective, resilient, reciprocal and resourceful. Interview: semi-structured interview with the newly qualified teacher following the observation of her practice. Findings from this case study illuminate the modifications made by the newly qualified teacher to her bank of teaching and learning strategies as well as the essentiality of dispositions used by her to know how to learn and also, crucially, to be ready and willing to do so. Such dispositions include being resilient, resourceful, reciprocal and reflective; necessary in order to adapt to the emerging challenges encountered by the teacher during their first months of employment in China. It is concluded that developing the skills to teach is essential for good teaching and learning practices. Having dispositions that enable teachers to work in ever changing conditions and surroundings is, this paper argues, essential for transferability and longevity of use of these skills.Keywords: learning, post-compulsory, resilience, transferable
Procedia PDF Downloads 299
1173 Variants of Mathematical Induction as Strong Proof Techniques in Theory of Computing
Authors: Ahmed Tarek, Ahmed Alveed
Abstract:
In the theory of computing, there is a wide variety of direct and indirect proof techniques. However, mathematical induction (MI) stands out as one of the most powerful techniques for proving hypotheses, theorems, and new results. There are variations of mathematical-induction-based proof techniques, broadly classified into three categories: structural induction, weak induction, and strong induction. In this expository paper, several variants of mathematical induction are explored, and specific scenarios are discussed in which a particular induction technique is more advantageous than the other strategies. The essential differences among the variants of mathematical induction are also examined. The points of separation among mathematical induction, recursion, and logical deduction are analyzed precisely, and the relationship between variations of recurrence relations and mathematical induction is explored. In this context, the application of recurrence relations and mathematical induction is considered within a single framework for codewords over a given alphabet.
Keywords: alphabet, codeword, deduction, mathematical induction, recurrence relation, strong induction, structural induction, weak induction
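As a worked example of why strong induction can succeed where weak induction does not (our illustration, not one taken from the paper), consider the prime-factorization claim:

```latex
\textbf{Claim.} Every integer $n \ge 2$ can be written as a product of primes.

\textbf{Proof (strong induction).}
\emph{Base case:} $n = 2$ is itself prime.
\emph{Inductive step:} assume the claim holds for \emph{all} integers $k$ with
$2 \le k < n$ (the strong hypothesis). If $n$ is prime, we are done. Otherwise
$n = ab$ with $2 \le a, b < n$; by the hypothesis both $a$ and $b$ are products
of primes, hence so is $n$. \qed

% Weak induction assumes only the single predecessor $P(n-1)$, which is not
% enough here: knowing that $n-1$ factors into primes says nothing about the
% factors $a$ and $b$ of $n$.
```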
Procedia PDF Downloads 167
1172 Effects of Research-Based Blended Learning Model Using Adaptive Scaffolding to Enhance Graduate Students' Research Competency and Analytical Thinking Skills
Authors: Panita Wannapiroon, Prachyanun Nilsook
Abstract:
This paper is a report on the findings of a Research and Development (R&D) aiming to develop the model of Research-Based Blended Learning Model Using Adaptive Scaffolding (RBBL-AS) to enhance graduate students’ research competency and analytical thinking skills, to study the result of using such model. The sample consisted of 10 experts in the fields during the model developing stage, while there were 23 graduate students of KMUTNB for the RBBL-AS model try out stage. The research procedures included 4 phases: 1) literature review, 2) model development, 3) model experiment, and 4) model revision and confirmation. The research results were divided into 3 parts according to the procedures as described in the following session. First, the data gathering from the literature review were reported as a draft model; followed by the research finding from the experts’ interviews indicated that the model should be included 8 components to enhance graduate students’ research competency and analytical thinking skills. The 8 components were 1) cloud learning environment, 2) Ubiquitous Cloud Learning Management System (UCLMS), 3) learning courseware, 4) learning resources, 5) adaptive Scaffolding, 6) communication and collaboration tolls, 7) learning assessment, and 8) research-based blended learning activity. Second, the research finding from the experimental stage found that there were statistically significant difference of the research competency and analytical thinking skills posttest scores over the pretest scores at the .05 level. The Graduate students agreed that learning with the RBBL-AS model was at a high level of satisfaction. Third, according to the finding from the experimental stage and the comments from the experts, the developed model was revised and proposed in the report for further implication and references.Keywords: research based learning, blended learning, adaptive scaffolding, research competency, analytical thinking skills
Procedia PDF Downloads 424
1171 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)-Graphics Processing Unit (GPU) Heterogeneous Computing
Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou
Abstract:
The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, the embedded system-on-a-chip (SoC) contains coarse granularity multi-core CPU (central processing unit) and mobile GPU (graphics processing unit) that can be used as general-purpose accelerators. The motivation is that algorithms of various parallel characteristics can be efficiently mapped to the heterogeneous architecture coupled with these three processors. The CPU and GPU offload partial computationally intensive tasks from the FPGA to reduce the resource consumption and lower the overall cost of the system. However, in present common scenarios, the applications always utilize only one type of accelerator because the development approach supporting the collaboration of the heterogeneous processors faces challenges. Therefore, a systematic approach takes advantage of write-once-run-anywhere portability, high execution performance of the modules mapped to various architectures and facilitates the exploration of design space. In this paper, A servant-execution-flow model is proposed for the abstraction of the cooperation of the heterogeneous processors, which supports task partition, communication and synchronization. At its first run, the intermediate language represented by the data flow diagram can generate the executable code of the target processor or can be converted into high-level programming languages. The instantiation parameters efficiently control the relationship between the modules and computational units, including two hierarchical processing units mapping and adjustment of data-level parallelism. An embedded system of a three-dimensional waveform oscilloscope is selected as a case study. The performance of algorithms such as contrast stretching, etc., are analyzed with implementations on various combinations of these processors. The experimental results show that the heterogeneous computing system with less than 35% resources achieves similar performance to the pure FPGA and approximate energy efficiency.Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation
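Contrast stretching, mentioned as one of the case-study algorithms, is a purely data-parallel kernel, which is what makes it a natural candidate for GPU or FPGA offload; the NumPy sketch below shows the kernel itself and is our illustration, not the authors' intermediate-language implementation.

```python
import numpy as np

def contrast_stretch(img, out_min=0, out_max=255):
    """Linear min-max contrast stretching; every pixel is independent,
    so the loop-free formulation maps directly onto a GPU or FPGA pipeline."""
    lo, hi = img.min(), img.max()
    if hi == lo:                       # flat image: nothing to stretch
        return np.full_like(img, out_min, dtype=np.uint8)
    stretched = (img.astype(np.float32) - lo) * (out_max - out_min) / (hi - lo) + out_min
    return stretched.astype(np.uint8)

frame = np.random.randint(80, 170, size=(480, 640), dtype=np.uint8)  # low-contrast frame
out = contrast_stretch(frame)
print(frame.min(), frame.max(), "->", out.min(), out.max())
```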
Procedia PDF Downloads 121
1170 Personalizing Human Physical Life Routines Recognition over Cloud-based Sensor Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) using body-worn sensors such as MEMS-based technologies. The use of these technologies for human activity recognition is progressively increasing. At the same time, personalizing human life routines using numerous machine-learning techniques has always been an intriguing topic. Various methods have demonstrated the ability to recognize basic movement patterns; however, they still need improvement in anticipating the dynamics of human living patterns. This study introduces state-of-the-art techniques for recognizing static and dynamic patterns and forecasting challenging activities from multi-fused sensors. Numerous MEMS signals are extracted from one self-annotated dataset (IM-WSHA) and two benchmark datasets. First, the acquired raw data are filtered with z-normalization and denoising methods. Then, statistical, local binary pattern, auto-regressive model, and intrinsic time-scale decomposition features are extracted from different domains. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR). Finally, an artificial neural network is applied to analyze the performance of the whole system. As a result, we attained a 90.27% recognition rate on the self-annotated dataset, while HARTH and KU-HAR achieved 83% on nine living activities and 90.94% on 18 static and dynamic routines, respectively. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against other methods in the literature.
Keywords: artificial intelligence, machine learning, gait analysis, local binary pattern (LBP), statistical features, micro-electro-mechanical systems (MEMS), maximum relevance and minimum redundancy (mRMR)
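The full pipeline (LBP, auto-regressive and ITD features, mRMR selection) is not reproduced here; the sketch below shows only the general shape of such a system, statistical features over synthetic accelerometer windows fed to a small neural network, with all data and names invented for illustration.

```python
# pip install scikit-learn
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def statistical_features(window):
    """A few of the time-domain statistics mentioned in the abstract."""
    return [window.mean(), window.std(), window.min(), window.max(),
            np.percentile(window, 25), np.percentile(window, 75)]

def make_window(activity, length=128):
    """Synthetic 3-axis accelerometer window: 'walking' is periodic, 'sitting' is near-constant."""
    t = np.arange(length)
    if activity == "walking":
        sig = np.sin(2 * np.pi * t / 16)[:, None] + rng.normal(0, 0.3, (length, 3))
    else:
        sig = rng.normal(0, 0.05, (length, 3))
    return np.concatenate([statistical_features(sig[:, ax]) for ax in range(3)])

X = np.array([make_window(a) for a in ["walking", "sitting"] * 200])
y = np.array([0, 1] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=1).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```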
Procedia PDF Downloads 25
1169 Spherical Harmonic Based Monostatic Anisotropic Point Scatterer Model for RADAR Applications
Authors: Eric Huang, Coleman DeLude, Justin Romberg, Saibal Mukhopadhyay, Madhavan Swaminathan
Abstract:
High-performance computing (HPC) based emulators can be used to model the scattering from multiple stationary and moving targets for RADAR applications. These emulators rely on the radar cross section (RCS) of the targets being available in complex scenarios. Representing the RCS using tables generated from electromagnetic (EM) simulations is often cumbersome and leads to large storage requirements. This paper proposes a spherical-harmonic-based anisotropic scatterer model to represent the RCS of complex targets. The problem of finding the locations and reflection profiles of all scatterers can be formulated as a linear least-squares problem with a special sparsity constraint, which this paper solves using a modified Orthogonal Matching Pursuit algorithm. The results show that the spherical-harmonic-based scatterer model can effectively represent the RCS data of complex targets.
Keywords: RADAR, RCS, high performance computing, point scatterer model
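The authors use a modified Orthogonal Matching Pursuit whose modification is not described in the abstract; for reference, the sketch below implements the standard OMP on a synthetic sparse-scatterer recovery problem.

```python
import numpy as np

def omp(A, y, n_nonzero):
    """Standard Orthogonal Matching Pursuit: greedily pick the column most
    correlated with the residual, then re-fit on the selected support."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(n_nonzero):
        corr = np.abs(A.T @ residual)
        corr[support] = 0                      # do not reselect columns
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(2)
A = rng.normal(size=(60, 200))                 # dictionary (e.g. candidate scatterer responses)
x_true = np.zeros(200)
x_true[[5, 77, 150]] = [1.5, -2.0, 0.8]        # three "scatterers"
y = A @ x_true + rng.normal(0, 0.01, 60)
x_hat = omp(A, y, n_nonzero=3)
print(np.nonzero(x_hat)[0], x_hat[np.nonzero(x_hat)])
```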
Procedia PDF Downloads 194
1168 Using the M-Learning to Support Learning of the Concept of the Derivative
Authors: Elena F. Ruiz, Marina Vicario, Chadwick Carreto, Rubén Peredo
Abstract:
One of the main obstacles in Mexico’s engineering programs is math comprehension, especially of the concept of the derivative. We therefore present a case study relating mobile computing and classroom learning at the Escuela Superior de Cómputo, based on the educational model of the Instituto Politécnico Nacional (competence-based work and problem solving), in which we propose apps and activities to teach the concept of the derivative. M-learning is emphasized as one of its lines, the objective being the use of mobile devices running an app that exploits their components, such as sensors, screen, camera, and processing power, in classroom work. In this work we employed augmented reality (ARRoC), based on the good results this technology has shown in the field of learning. The proposal was developed using a qualitative research methodology supported by quantitative research. The methodological instruments used were observation, questionnaires, interviews, and evaluations. We obtained positive results: a 40% increase using M-learning, compared with a 20% increase using traditional means.
Keywords: augmented reality, classroom learning, educational research, mobile computing
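As a minimal numerical illustration of the derivative concept itself (our example, unrelated to the ARRoC app), a central-difference approximation can be compared with the analytic derivative:

```python
def central_difference(f, x, h=1e-5):
    """Approximate f'(x) with the symmetric difference quotient."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 3           # f(x) = x^3, so f'(x) = 3x^2
for x in [0.5, 1.0, 2.0]:
    approx = central_difference(f, x)
    exact = 3 * x ** 2
    print(f"x = {x}: approx = {approx:.6f}, exact = {exact:.6f}")
```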
Procedia PDF Downloads 363
1167 Approach on Conceptual Design and Dimensional Synthesis of the Linear Delta Robot for Additive Manufacturing
Authors: Efrain Rodriguez, Cristhian Riano, Alberto Alvares
Abstract:
In recent years, robots manipulators with parallel architectures are used in additive manufacturing processes – 3D printing. These robots have advantages such as speed and lightness that make them suitable to help with the efficiency and productivity of these processes. Consequently, the interest for the development of parallel robots for additive manufacturing applications has increased. This article deals with the conceptual design and dimensional synthesis of the linear delta robot for additive manufacturing. Firstly, a methodology based on structured processes for the development of products through the phases of informational design, conceptual design and detailed design is adopted: a) In the informational design phase the Mudge diagram and the QFD matrix are used to aid a set of technical requirements, to define the form, functions and features of the robot. b) In the conceptual design phase, the functional modeling of the system through of an IDEF0 diagram is performed, and the solution principles for the requirements are formulated using a morphological matrix. This phase includes the description of the mechanical, electro-electronic and computational subsystems that constitute the general architecture of the robot. c) In the detailed design phase, a digital model of the robot is drawn on CAD software. A list of commercial and manufactured parts is detailed. Tolerances and adjustments are defined for some parts of the robot structure. The necessary manufacturing processes and tools are also listed, including: milling, turning and 3D printing. Secondly, a dimensional synthesis method applied on design of the linear delta robot is presented. One of the most important key factors in the design of a parallel robot is the useful workspace, which strongly depends on the joint space, the dimensions of the mechanism bodies and the possible interferences between these bodies. The objective function is based on the verification of the kinematic model for a prescribed cylindrical workspace, considering geometric constraints that possibly lead to singularities of the mechanism. The aim is to determine the minimum dimensional parameters of the mechanism bodies for the proposed workspace. A method based on genetic algorithms was used to solve this problem. The method uses a cloud of points with the cylindrical shape of the workspace and checks the kinematic model for each of the points within the cloud. The evolution of the population (point cloud) provides the optimal parameters for the design of the delta robot. The development process of the linear delta robot with optimal dimensions for additive manufacture is presented. The dimensional synthesis enabled to design the mechanism of the delta robot in function of the prescribed workspace. Finally, the implementation of the robotic platform developed based on a linear delta robot in an additive manufacturing application using the Fused Deposition Modeling (FDM) technique is presented.Keywords: additive manufacturing, delta parallel robot, dimensional synthesis, genetic algorithms
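The genetic-algorithm synthesis over a cylindrical point cloud can be sketched as below; note that the reachability test is a crude placeholder standing in for the linear delta inverse kinematics, and all parameters are invented, so this only illustrates the optimization loop, not the paper's actual kinematic model.

```python
import numpy as np

rng = np.random.default_rng(3)

# cylindrical workspace point cloud (radius 80 mm, height 150 mm) to be covered
n_pts = 300
r = 80 * np.sqrt(rng.random(n_pts))
theta = 2 * np.pi * rng.random(n_pts)
z = 150 * rng.random(n_pts)
cloud = np.stack([r * np.cos(theta), r * np.sin(theta), z], axis=1)

def reachable(params, pts):
    """Placeholder feasibility test standing in for the inverse kinematics:
    a point counts as reachable if it lies inside a cylinder whose radius and
    height grow with the arm length and rail travel. Replace with the real
    linear-delta IK to reproduce the method described in the abstract."""
    arm, travel = params
    inside_r = np.hypot(pts[:, 0], pts[:, 1]) <= 0.6 * arm
    inside_z = pts[:, 2] <= travel
    return inside_r & inside_z

def fitness(params):
    """Penalize uncovered workspace points first, then prefer smaller dimensions."""
    uncovered = (~reachable(params, cloud)).sum()
    return uncovered * 1e4 + params.sum()

def genetic_search(pop_size=40, generations=60, bounds=(50.0, 400.0)):
    pop = rng.uniform(*bounds, size=(pop_size, 2))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]          # truncation selection
        children = parents[rng.integers(len(parents), size=pop_size - len(parents))]
        children = children + rng.normal(0, 5, children.shape)      # Gaussian mutation
        pop = np.vstack([parents, np.clip(children, *bounds)])
    return pop[np.argmin([fitness(ind) for ind in pop])]

best = genetic_search()
print("arm length, rail travel:", best,
      "uncovered points:", int((~reachable(best, cloud)).sum()))
```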
Procedia PDF Downloads 193
1166 Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor
Authors: Mansouri Nabila, Ben Jemaa Yousra, Motamed Cina, Watelain Eric
Abstract:
Research and development efforts in intelligent advanced driver assistance systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in reasonably short time, which above all requires a powerful dynamic car-detection model. This paper presents an optimized HOG process based on the fusion of shape and motion parameters. Our proposed approach computes block-wise HOG features from foreground blobs using a configurable search window and pathway, in order to overcome the shortcoming of the HOG descriptor in terms of computing time and to improve its performance in dynamic applications. We show that the block-wise HOG descriptor combined with motion parameters is a very suitable car detector that reaches, in record time, a satisfactory recognition rate in dynamic outdoor areas and outperforms several popular works without using sophisticated and expensive architectures such as GPUs and FPGAs.
Keywords: car detector, HOG, motion, computing time
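The optimized block-wise computation and the motion-parameter fusion are not reproduced here; the sketch below simply computes a standard block-normalized HOG descriptor on a candidate foreground window with scikit-image, as a baseline for what the authors accelerate.

```python
# pip install scikit-image
import numpy as np
from skimage.feature import hog

# a candidate foreground window (e.g. a 64x64 blob cut out by background subtraction)
window = np.random.default_rng(4).integers(0, 256, size=(64, 64)).astype(np.uint8)

features = hog(window,
               orientations=9,
               pixels_per_cell=(8, 8),
               cells_per_block=(2, 2),
               block_norm="L2-Hys",
               feature_vector=True)

# 7x7 block positions x (2x2 cells) x 9 orientations = 1764 values
print(features.shape)
```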
Procedia PDF Downloads 325
1165 Some Conjectures and Programs about Computing the Detour Index of Molecular Graphs of Nanotubes
Authors: Shokofeh Ebrtahimi
Abstract:
Let G be the chemical graph of a molecule. The matrix D = [dij] is called the detour matrix of G if dij is the length of the longest path between atoms i and j. The sum of all entries above the main diagonal of D is called the detour index of G. Chemical graph theory is the branch of mathematical chemistry that applies graph theory to the mathematical modelling of chemical phenomena [1]; its pioneers include Alexandru Balaban, Ante Graovac, Ivan Gutman, Haruo Hosoya, Milan Randić and Nenad Trinajstić. In this paper, a new program for computing the detour index of molecular graphs of nanotubes covered by heptagons is determined. Some conjectures about the detour index of molecular graphs of nanotubes are included.
Keywords: chemical graph, detour matrix, detour index, carbon nanotube
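A brute-force reference computation of the detour matrix and index, feasible only for small molecular graphs (not the nanotube-scale program described), can be sketched as follows:

```python
# pip install networkx
import networkx as nx

def detour_index(G):
    """Sum over unordered atom pairs of the longest simple path length between them."""
    nodes = list(G.nodes)
    total = 0
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            longest = max(len(path) - 1 for path in nx.all_simple_paths(G, u, v))
            total += longest
    return total

# cyclobutane-like 4-cycle: every adjacent pair has a detour of 3,
# every opposite pair a detour of 2
G = nx.cycle_graph(4)
print(detour_index(G))   # 4 * 3 + 2 * 2 = 16
```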
Procedia PDF Downloads 294
1164 Teaching Computer Programming to Diverse Students: A Comparative, Mixed-Methods, Classroom Research Study
Authors: Almudena Konrad, Tomás Galguera
Abstract:
Lack of motivation and interest is a serious obstacle to students’ learning computing skills. A need exists for a knowledge base on effective pedagogy and curricula to teach computer programming. This paper presents results from research evaluating a six-year project designed to teach complex concepts in computer programming collaboratively, while supporting students to continue developing their computer thinking and related coding skills individually. Utilizing a quasi-experimental, mixed methods design, the pedagogical approaches and methods were assessed in two contrasting groups of students with different socioeconomic status, gender, and age composition. Analyses of quantitative data from Likert-scale surveys and an evaluation rubric, combined with qualitative data from reflective writing exercises and semi-structured interviews yielded convincing evidence of the project’s success at both teaching and inspiring students.Keywords: computational thinking, computing education, computer programming curriculum, logic, teaching methods
Procedia PDF Downloads 316
1163 Resilience and Urban Transformation: A Review of Recent Interventions in Europe and Turkey
Authors: Bilge Ozel
Abstract:
Cities are high-complex living organisms and are subjects to continuous transformations produced by the stress that derives from changing conditions. Today the metropolises are seen like “development engines” of the countries and accordingly they become the centre of better living conditions that encourages demographic growth which constitutes the main reason of the changes. Indeed, the potential for economic advancement of the cities directly represents the economic status of their countries. The term of “resilience”, which sees the changes as natural processes and represents the flexibility and adaptability of the systems in the face of changing conditions, becomes a key concept for the development of urban transformation policies. The term of “resilience” derives from the Latin word ‘resilire’, which means ‘bounce’, ‘jump back’, refers to the ability of a system to withstand shocks and still maintain the basic characteristics. A resilient system does not only survive the potential risks and threats but also takes advantage of the positive outcomes of the perturbations and ensures adaptation to the new external conditions. When this understanding is taken into the urban context - or rather “urban resilience” - it delineates the capacity of cities to anticipate upcoming shocks and changes without undergoing major alterations in its functional, physical, socio-economic systems. Undoubtedly, the issue of coordinating the urban systems in a “resilient” form is a multidisciplinary and complex process as the cities are multi-layered and dynamic structures. The concept of “urban transformation” is first launched in Europe just after World War II. It has been applied through different methods such as renovation, revitalization, improvement and gentrification. These methods have been in continuous advancement by acquiring new meanings and trends over years. With the effects of neoliberal policies in the 1980s, the concept of urban transformation has been associated with economic objectives. Subsequently this understanding has been improved over time and had new orientations such as providing more social justice and environmental sustainability. The aim of this research is to identify the most applied urban transformation methods in Turkey and its main reasons of being selected. Moreover, investigating the lacking and limiting points of the urban transformation policies in the context of “urban resilience” in a comparative way with European interventions. The emblematic examples, which symbolize the breaking points of the recent evolution of urban transformation concepts in Europe and Turkey, are chosen and reviewed in a critical way.Keywords: resilience, urban dynamics, urban resilience, urban transformation
Procedia PDF Downloads 267
1162 The Effect of Initial Sample Size and Increment in Simulation Samples on a Sequential Selection Approach
Authors: Mohammad H. Almomani
Abstract:
In this paper, we investigate the effect of the initial sample size and of the increment in simulation samples on the performance of a sequential approach used to select the top m designs when the number of alternative designs is very large. The sequential approach consists of two stages. In the first stage, ordinal optimization is used to select a subset that overlaps with the set of the actual best k% of designs with high probability. In the second stage, optimal computing budget allocation is used to select the top m designs from the selected subset. We apply the selection approach to a generic example under several parameter settings, with different choices of initial sample size and increment in simulation samples, to explore the impact on the performance of the approach. The results show that the choice of initial sample size and increment in simulation samples does affect the performance of the selection approach.
Keywords: large-scale problems, optimal computing budget allocation, ordinal optimization, simulation optimization
Procedia PDF Downloads 357
1161 Continuous Functions Modeling with Artificial Neural Network: An Improvement Technique to Feed the Input-Output Mapping
Authors: A. Belayadi, A. Mougari, L. Ait-Gougam, F. Mekideche-Chafa
Abstract:
The artificial neural network is one of the interesting techniques that have been used advantageously to deal with modeling problems. In this study, computing with an artificial neural network (CANN) is proposed. The model is applied to modulate the information processing of a one-dimensional task. We aim to integrate a new method based on a new coding approach for generating the input-output mapping, which relies on increasing the number of neuron units in the last layer. To show the efficiency of the approach under study, a comparison is made between the proposed method of generating the input-output set and the conventional method. The results illustrate that increasing the neuron units in the last layer allows the optimal network parameters that fit the mapping data to be found. Moreover, it reduces the training time during the computation process, which avoids the need for computers with large memory.
Keywords: neural network computing, continuous functions generating the input-output mapping, decreasing the training time, machines with big memories
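The paper's coding approach of enlarging the output layer is not specified in enough detail to reproduce; as a conventional baseline of the kind it is compared against, the sketch below fits a one-dimensional mapping with a small multilayer perceptron.

```python
# pip install scikit-learn
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
x = np.linspace(-np.pi, np.pi, 400).reshape(-1, 1)
y = np.sin(x).ravel() + rng.normal(0, 0.05, 400)   # noisy samples of the target mapping

net = MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                   max_iter=5000, random_state=0)
net.fit(x, y)

x_test = np.array([[-1.0], [0.0], [1.0]])
print(net.predict(x_test))        # should be close to sin(-1), sin(0), sin(1)
```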
Procedia PDF Downloads 286
1160 Variable Renewable Energy Droughts in the Power Sector – A Model-based Analysis and Implications in the European Context
Authors: Martin Kittel, Alexander Roth
Abstract:
The continuous integration of variable renewable energy sources (VRE) in the power sector is required for decarbonizing the European economy. Power sectors become increasingly exposed to weather variability, as the availability of VRE, i.e., mainly wind and solar photovoltaic, is not persistent. Extreme events, e.g., long-lasting periods of scarce VRE availability (‘VRE droughts’), challenge the reliability of supply. Properly accounting for the severity of VRE droughts is crucial for designing a resilient renewable European power sector. Energy system modeling is used to identify such a design. Our analysis reveals the sensitivity of the optimal design of the European power sector towards VRE droughts. We analyze how VRE droughts impact optimal power sector investments, especially in generation and flexibility capacity. We draw upon work that systematically identifies VRE drought patterns in Europe in terms of frequency, duration, and seasonality, as well as the cross-regional and cross-technological correlation of most extreme drought periods. Based on their analysis, the authors provide a selection of relevant historical weather years representing different grades of VRE drought severity. These weather years will serve as input for the capacity expansion model for the European power sector used in this analysis (DIETER). We additionally conduct robustness checks varying policy-relevant assumptions on capacity expansion limits, interconnections, and level of sector coupling. Preliminary results illustrate how an imprudent selection of weather years may cause underestimating the severity of VRE droughts, flawing modeling insights concerning the need for flexibility. Sub-optimal European power sector designs vulnerable to extreme weather can result. Using relevant weather years that appropriately represent extreme weather events, our analysis identifies a resilient design of the European power sector. Although the scope of this work is limited to the European power sector, we are confident that our insights apply to other regions of the world with similar weather patterns. Many energy system studies still rely on one or a limited number of sometimes arbitrarily chosen weather years. We argue that the deliberate selection of relevant weather years is imperative for robust modeling results.Keywords: energy systems, numerical optimization, variable renewable energy sources, energy drought, flexibility
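A minimal illustration of identifying a VRE drought as a sufficiently long run of hours with the capacity factor below a threshold can be sketched on a synthetic series; the paper's systematic identification across historical weather years, regions and technologies is of course far richer.

```python
import numpy as np

def find_droughts(capacity_factor, threshold=0.1, min_hours=24):
    """Return (start, length) of every run of consecutive hours below the threshold
    that lasts at least min_hours."""
    below = capacity_factor < threshold
    droughts, start = [], None
    for t, flag in enumerate(below):
        if flag and start is None:
            start = t
        elif not flag and start is not None:
            if t - start >= min_hours:
                droughts.append((start, t - start))
            start = None
    if start is not None and len(below) - start >= min_hours:
        droughts.append((start, len(below) - start))
    return droughts

rng = np.random.default_rng(6)
cf = rng.uniform(0, 0.6, 24 * 365)        # synthetic hourly wind+solar capacity factor
cf[2000:2060] = 0.02                      # inject a 60-hour "dark doldrums" event
print(find_droughts(cf, threshold=0.1, min_hours=24))
```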
Procedia PDF Downloads 81