Search results for: Cloud Computing
1083 Design and Application of NFC-Based Identity and Access Management in Cloud Services
Authors: Shin-Jer Yang, Kai-Tai Yang
Abstract:
In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming more important, especially with Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag; this type of verification supports neither mobile device login nor user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) that addresses this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that the NFC-IAM not only takes less time in identity identification but also cuts two-factor authentication time by 80% and improves verification accuracy to 99.9% or better. In functional performance analyses, the NFC-IAM performed better in scalability and portability. The NFC-IAM App (Application Software) and back-end system, to be developed and deployed on mobile devices, support the IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, our NFC-IAM can be applied in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.
Keywords: cloud service, multi-tenancy, NFC, IAM, mobile device
Procedia PDF Downloads 435
1082 A Study on the HTML5 Based Multi Media Contents Authority Tool
Authors: Heesuk Seo, Yongtae Kim
Abstract:
Online learning began in the 1990s with the spread of the Internet; since then, online education has moved from the e-learning paradigm into the era of smart learning. Reflecting the anytime-anywhere nature of mobile devices, learning can now also take this form, with learning content and interaction available on the move. We are developing a cloud system, 'TLINKS CLOUD', that allows a smart-learning environment to be configured without the need for additional infrastructure. Using big-data analysis of e-learning contents, we provide an integrated e-learning solution tailored to individual study.
Keywords: authority tool, big data analysis, e-learning, HTML5
Procedia PDF Downloads 407
1081 Flexible 3D Virtual Desktop Using Handles for Cloud Environments
Abstract:
Due to the improvement in computer hardware performance and the development of operating systems, multi-tasking across several programs has become one of the basic functions expected by computer users. It is natural for computer users to want more functional, convenient, and visual GUI (Graphical User Interface) functions. In this paper, a 3D virtual desktop system is proposed to meet users’ requirements for cloud environments, such as a virtual desktop function in the Windows environment. The proposed system uses the handles of the windows to hide or restore several windows. It connects the list of task spaces using a circular doubly linked list to manage the handles, and each handle list is registered in the corresponding task space being executed. The 3D virtual desktop is efficient and flexible in handling the number of task spaces and can help users work in more comfortable environments. Acknowledgment: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korea government (MSIP) (NRF-2015R1D1A1A01057680).
Keywords: virtual desktop, GUI, cloud, virtualization
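To make the handle-management idea concrete, the following minimal Python sketch (all class names and handle values are illustrative, not taken from the paper) chains task spaces in a circular doubly linked list and walks it to hide one group of window handles and restore the next:

```python
class TaskSpace:
    """One virtual desktop; stores the window handles registered to it."""
    def __init__(self, name):
        self.name = name
        self.handles = []          # window handles shown in this task space
        self.prev = self.next = self

class TaskSpaceRing:
    """Circular doubly linked list of task spaces."""
    def __init__(self):
        self.current = None

    def add(self, space):
        if self.current is None:
            self.current = space
            return
        last = self.current.prev
        last.next = space
        space.prev = last
        space.next = self.current
        self.current.prev = space

    def switch(self, forward=True):
        """Hide the current task space's windows and restore the next one's."""
        hidden = self.current.handles          # would be minimized here
        self.current = self.current.next if forward else self.current.prev
        restored = self.current.handles        # would be restored here
        return hidden, restored

ring = TaskSpaceRing()
for name, handles in [("work", [0x1A, 0x2B]), ("mail", [0x3C]), ("dev", [0x4D, 0x5E])]:
    space = TaskSpace(name)
    space.handles.extend(handles)   # register handles in this task space
    ring.add(space)

print(ring.switch())   # hides the 'work' handles, restores the 'mail' handles
```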
Procedia PDF Downloads 210
1080 An Industrial Wastewater Management Using Cloud Based IoT System
Authors: Kaarthik K., Harshini S., Karthika M., Kripanandhini T.
Abstract:
Water is an essential part of living organisms. Major water pollution is caused by the discharge of industrial wastewater into rivers. The most important step in bringing wastewater contaminants down to levels that are safe for nature is wastewater treatment. The contamination of river water harms both the humans who consume it and the aquatic life that lives there. We introduce a new cloud-based industrial IoT paradigm in this work for real-time control and monitoring of wastewater. The proposed system prevents prohibited entry of industrial wastewater into the plant by monitoring the temperature, pH, CO₂, and turbidity of the wastewater input that the treatment facility will process. Real-time sensor values are collected and uploaded to the cloud by the system using an IoT Wi-Fi module. By doing so, contamination of the river by industrial wastewater can be prevented earlier, and the necessary actions can be taken by the users. The proposed system is 90% efficient, preventing water pollution due to industry and protecting human lives.
Keywords: sensors, pH, CO₂, temperature, turbidity
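As an illustration of the monitoring loop described above, here is a minimal Python sketch; the threshold values, the sensor-reading stub, and the upload endpoint are hypothetical placeholders, not the authors' firmware:

```python
import json
import random
import urllib.request

# Hypothetical acceptance limits for inflow to the treatment plant.
LIMITS = {"temperature": (10, 45), "ph": (5.5, 9.0), "co2": (0, 1000), "turbidity": (0, 150)}
CLOUD_URL = "https://example.com/api/wastewater"   # placeholder endpoint

def read_sensors():
    """Stand-in for real sensor drivers; returns one sample per parameter."""
    return {
        "temperature": random.uniform(15, 60),   # degrees Celsius
        "ph": random.uniform(4.0, 10.0),
        "co2": random.uniform(200, 1500),        # ppm
        "turbidity": random.uniform(5, 300),     # NTU
    }

def within_limits(sample):
    return all(lo <= sample[k] <= hi for k, (lo, hi) in LIMITS.items())

def upload(sample):
    """Push one JSON sample to the cloud dashboard."""
    data = json.dumps(sample).encode()
    req = urllib.request.Request(CLOUD_URL, data=data,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    sample = read_sensors()
    sample["inflow_permitted"] = within_limits(sample)
    print(sample)            # gate-valve / alert logic would act on this flag
    # upload(sample)         # enabled on the real Wi-Fi-connected device
```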
Procedia PDF Downloads 110
1079 AMBICOM: An Ambient Computing Middleware Architecture for Heterogeneous Environments
Authors: Ekrem Aksoy, Nihat Adar, Selçuk Canbek
Abstract:
Ambient Computing, or Ambient Intelligence (AmI), is an emerging area in computer science aiming to create intelligently connected environments and the Internet of Things. In this paper, we propose a communication middleware architecture for AmI. This middleware architecture addresses the problems of communication, networking, and abstraction of applications, although there are other aspects (e.g., HCI and security) within a general AmI framework. Within this middleware architecture, any application developer may address HCI and security issues through the extensibility features of the platform.
Keywords: AmI, ambient computing, middleware, distributed-systems, software-defined networking
Procedia PDF Downloads 285
1078 Bioethanol Production from Wild Sorghum (Sorghum arundinacieum) and Spear Grass (Heteropogon contortus)
Authors: Adeyinka Adesanya, Isaac Bamgboye
Abstract:
There is a growing need to develop processes for producing renewable fuels and chemicals due to the economic, political, and environmental concerns associated with fossil fuels. Lignocellulosic biomass is an excellent renewable feedstock because it is both abundant and inexpensive. This project aims at producing bioethanol from lignocellulosic plants (Sorghum arundinacieum and Heteropogon contortus) by biochemical means, computing the energy audit of the process, and determining the fuel properties of the produced ethanol. Acid pretreatment (0.5% H2SO4 solution) and enzymatic hydrolysis (using malted barley as the enzyme source) were employed. The ethanol yield of wild sorghum was found to be 20%, while that of spear grass was 15%. The fuel properties of the bioethanol from wild sorghum are 1.227 centipoise for viscosity, 1.10 g/cm3 for density, 0.90 for specific gravity, 78 °C for boiling point, and a cloud point below -30 °C. Those of spear grass are 1.206 centipoise for viscosity, 0.93 g/cm3 for density, 1.08 for specific gravity, 78 °C for boiling point, and a cloud point also below -30 °C. The energy audit shows that about 64% of the total energy was used during pretreatment, while product recovery, which was done manually, demanded about 31% of the total energy. Enzymatic hydrolysis, fermentation, and distillation accounted for 1.95%, 1.49%, and 1.04% of the total energy input, respectively. The alcoholometric strength of the bioethanol from wild sorghum was found to be 47%, and that of the bioethanol from spear grass was 72%. Also, the energy efficiency of the bioethanol production for both grasses was 3.85%.
Keywords: lignocellulosic biomass, wild sorghum, spear grass, biochemical conversion
Procedia PDF Downloads 236
1077 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets
Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille
Abstract:
3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e., its color), beyond the geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to harmonize the different data sets, to improve the chromatic quality, and to highlight further details by balancing the point colors is presented.
Keywords: color models, cultural heritage, laser scanner, photogrammetry
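One simple way to balance point colors between two such data sets is to match per-channel color statistics; the following Python sketch (with synthetic colors, not the paper's own methodology) illustrates the idea:

```python
import numpy as np

def match_color_statistics(colors_src, colors_ref):
    """Shift and scale each RGB channel of colors_src so its mean and standard
    deviation match colors_ref; a simple global color-balancing step."""
    src = colors_src.astype(np.float64)
    ref = colors_ref.astype(np.float64)
    mu_s, sd_s = src.mean(axis=0), src.std(axis=0) + 1e-9
    mu_r, sd_r = ref.mean(axis=0), ref.std(axis=0)
    balanced = (src - mu_s) / sd_s * sd_r + mu_r
    return np.clip(balanced, 0, 255).astype(np.uint8)

# Illustrative data: photogrammetric colors slightly darker and bluer than the scan.
rng = np.random.default_rng(0)
laser_rgb = rng.integers(90, 200, size=(1000, 3))
photo_rgb = np.clip(laser_rgb + rng.normal([-25, -10, 15], 8, (1000, 3)), 0, 255)

photo_balanced = match_color_statistics(photo_rgb, laser_rgb)
print(photo_rgb.mean(axis=0), photo_balanced.mean(axis=0), laser_rgb.mean(axis=0))
```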
Procedia PDF Downloads 280
1076 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy
Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay
Abstract:
The paper intends to highlight the significance of a Digital Manufacturing (DM) strategy in supporting and achieving the business strategy and goals of any manufacturing organization. Towards this end, DM initiatives have been given a process perspective, while not undermining their technological significance, with a view to linking their benefits directly with the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model has been proposed to categorize digitally enabled organizational processes with a view to creating synergistic groups, which adopt and use digital tools having similar characteristics and functionalities. This will open future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort has been made to apply the “what” and “how” features of the Quality Function Deployment (QFD) framework to establish the relationship between customers’ needs, both for external and internal customers, and the features of the various digital processes which support the achievement of these customer expectations. The paper finally concludes that in the present highly competitive environment, business organizations cannot thrive or sustain themselves unless they understand the significance of digital strategy and integrate it with their business strategy with a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders appreciate its value propositions and its direct link to the organization’s competitiveness.
Keywords: knowledge management, cloud computing, knowledge management approaches, cloud-based knowledge management
Procedia PDF Downloads 309
1075 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning
Authors: Yangzhi Li
Abstract:
Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to autonomous robot recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and deep learning on large datasets, have not yet been explored in intelligent construction, and the relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous visual guidance technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance for industrial robots faces a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies in industrial production imposes strict accuracy requirements; the precision challenges of visual recognition systems therefore directly impact the effectiveness and standard of industrial production, necessitating a strengthening of research on positioning precision in recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting boundary and corner information in a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. Inclination detection on the RGB image and verification against the depth image are used to determine a component's present posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
Keywords: robotic construction, robotic assembly, visual guidance, machine learning
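As a rough illustration of deriving a component's aspect ratio from a dense point cloud, the following Python sketch uses PCA-based oriented extents on synthetic data; it is an assumption-laden simplification, not the study's actual pipeline:

```python
import numpy as np

def component_aspect_ratio(points):
    """Estimate the oriented extents of a component from its dense point cloud
    using PCA, then return the aspect ratio of the two largest extents."""
    centered = points - points.mean(axis=0)
    _, _, axes = np.linalg.svd(centered, full_matrices=False)   # principal axes
    projected = centered @ axes.T
    extents = np.sort(projected.max(axis=0) - projected.min(axis=0))[::-1]
    return extents[0] / extents[1]

# Illustrative "plank" component: 2.0 m x 0.5 m x 0.05 m, arbitrarily rotated.
rng = np.random.default_rng(1)
plank = rng.uniform([-1.0, -0.25, -0.025], [1.0, 0.25, 0.025], size=(5000, 3))
theta = np.deg2rad(30)
rot = np.array([[np.cos(theta), -np.sin(theta), 0],
                [np.sin(theta),  np.cos(theta), 0],
                [0, 0, 1]])

ratio = component_aspect_ratio(plank @ rot.T)
print(f"aspect ratio ~ {ratio:.2f}")   # expected near 2.0 / 0.5 = 4
```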
Procedia PDF Downloads 86
1074 A Novel Meta-Heuristic Algorithm Based on Cloud Theory for Redundancy Allocation Problem under Realistic Condition
Authors: H. Mousavi, M. Sharifi, H. Pourvaziri
Abstract:
The Redundancy Allocation Problem (RAP) is a well-known mathematical problem for modeling series-parallel systems. It is a combinatorial optimization problem which focuses on determining an optimal assignment of components in a system design. In this paper, to be more practical, we have considered the redundancy allocation problem for a series system with interval-valued component reliabilities. Therefore, during the search process, the reliability of each component is treated as a stochastic variable with lower and upper bounds. In order to optimize the problem, we propose a simulated annealing algorithm based on cloud theory (CBSAA). Also, Monte Carlo simulation (MCS) is embedded in the CBSAA to handle the random component reliabilities. This novel approach has been investigated on numerical examples, and the experimental results have shown that the CBSAA combined with MCS is an efficient tool for solving the RAP of systems with interval-valued component reliabilities.
Keywords: redundancy allocation problem, simulated annealing, cloud theory, Monte Carlo simulation
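The following Python sketch illustrates the general idea of embedding Monte Carlo sampling of interval-valued reliabilities inside a simulated annealing search for a series system; it uses plain simulated annealing with illustrative costs and intervals, not the paper's cloud-theory-controlled variant:

```python
import math
import random

# Illustrative data: per-subsystem component cost and interval-valued reliability.
COST = [3, 4, 2, 5]
REL_INTERVAL = [(0.80, 0.90), (0.70, 0.85), (0.85, 0.95), (0.75, 0.88)]
BUDGET = 40
MC_SAMPLES = 200

def mc_reliability(n):
    """Monte Carlo estimate of series-system reliability when each component
    reliability is only known as an interval (sampled uniformly here)."""
    total = 0.0
    for _ in range(MC_SAMPLES):
        r_sys = 1.0
        for n_i, (lo, hi) in zip(n, REL_INTERVAL):
            r = random.uniform(lo, hi)
            r_sys *= 1.0 - (1.0 - r) ** n_i       # parallel redundancy in stage i
        total += r_sys
    return total / MC_SAMPLES

def cost(n):
    return sum(c * n_i for c, n_i in zip(COST, n))

def anneal(iters=2000, t0=1.0, alpha=0.995):
    """Plain simulated annealing over redundancy levels under a cost budget."""
    n = [1, 1, 1, 1]
    best, best_rel = n[:], mc_reliability(n)
    cur_rel, t = best_rel, t0
    for _ in range(iters):
        cand = n[:]
        i = random.randrange(len(cand))
        cand[i] = max(1, cand[i] + random.choice((-1, 1)))
        if cost(cand) > BUDGET:
            continue
        cand_rel = mc_reliability(cand)
        if cand_rel > cur_rel or random.random() < math.exp((cand_rel - cur_rel) / t):
            n, cur_rel = cand, cand_rel
            if cur_rel > best_rel:
                best, best_rel = n[:], cur_rel
        t *= alpha
    return best, best_rel

print(anneal())
```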
Procedia PDF Downloads 412
1073 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation
Authors: Abdal-Hafeez Alhussein
Abstract:
Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models—especially machine learning and natural language processing—are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.
Keywords: artificial intelligence, information technology, automation, scalability
Procedia PDF Downloads 17
1072 Defining a Reference Architecture for Predictive Maintenance Systems: A Case Study Using the Microsoft Azure IoT-Cloud Components
Authors: Walter Bernhofer, Peter Haber, Tobias Mayer, Manfred Mayr, Markus Ziegler
Abstract:
Current preventive maintenance measures are cost-intensive and inefficient. With the sensor data available from state-of-the-art Internet of Things devices, new possibilities for automated data processing emerge. Current advances in data science and machine learning enable new, so-called predictive maintenance technologies, which empower data scientists to forecast possible system failures. The goal of this approach is to cut expenses in preventive maintenance by automating the detection of possible failures and to improve the efficiency and quality of maintenance measures. Additionally, a centralization of the sensor data monitoring can be achieved with this approach. This paper describes the approach of three students to define a reference architecture for a predictive maintenance solution in the Internet of Things domain with a connected smartphone app for service technicians. The reference architecture is validated by a case study, which is implemented with current Microsoft Azure cloud technologies. The results of the case study show that the reference architecture is valid and can be used to achieve a system for predictive maintenance execution with the cloud components of Microsoft Azure. The concepts used are technology-platform agnostic and can be reused on many different cloud platforms. The reference architecture is valid and can be used in many use cases, such as gas station maintenance, elevator maintenance, and many more.
Keywords: case study, internet of things, predictive maintenance, reference architecture
Procedia PDF Downloads 252
1071 Towards Natively Context-Aware Web Services
Authors: Hajer Taktak, Faouzi Moussa
Abstract:
With the emergence of ubiquitous computing and the evolution of enterprises’ needs, one of the main challenges is to build context-aware applications based on Web services. These applications have become particularly relevant in the pervasive computing domain. In this paper, we introduce our approach, which optimizes the use of Web services with context notions when dealing with contextual environments. We focus particularly on making Web services autonomous and natively context-aware. We implement and evaluate the proposed approach with a pedagogical example of a context-aware Web service handling temperature values.
Keywords: context-aware, CXF framework, ubiquitous computing, web service
Procedia PDF Downloads 361
1070 Design of Quality Assessment System for On-Orbit 3D Printing Based on 3D Reconstruction Technology
Authors: Jianning Tang, Trevor Hocksun Kwan, Xiaofeng Wu
Abstract:
With the increasing demand for space use in multiple sectors (navigation, telecommunication, imagery, etc.), the deployment and maintenance demands of satellites are growing. Considering the high launch cost and the restrictions on the weight and size of the payload when using a launch vehicle, the technique of on-orbit manufacturing has attracted more attention because of its significant potential to support future space missions. 3D printing is the most promising manufacturing technology that could be applied in space. However, due to the lack of autonomous quality assessment, the operation of conventional 3D printers still relies on human presence to supervise the printing process. This paper proposes an automatic 3D reconstruction system aimed at detecting failures on 3D-printed objects through the application of point cloud technology. Based on the data obtained from the point cloud, the 3D printer could locate and repair the failure. The system will increase automation and make 3D printing more feasible for space use without human interference.
Keywords: 3D printing, quality assessment, point cloud, on-orbit manufacturing
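A minimal Python sketch of the failure-detection step, comparing a scanned point cloud against the reference model by nearest-neighbor distance (synthetic data and tolerance, not the authors' implementation):

```python
import numpy as np

def deviation_map(scanned, reference, tol=0.5):
    """For each scanned point, distance to the nearest reference point
    (brute force); points beyond `tol` are flagged as print failures."""
    d2 = ((scanned[:, None, :] - reference[None, :, :]) ** 2).sum(axis=2)
    nearest = np.sqrt(d2.min(axis=1))
    return nearest, nearest > tol

# Illustrative data: the reference model is a 10 x 10 mm plate sampled on a grid;
# the "scan" has noise plus a defect region bulging 2 mm upward.
gx, gy = np.meshgrid(np.linspace(0, 10, 40), np.linspace(0, 10, 40))
reference = np.column_stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)])
scanned = reference + np.random.default_rng(2).normal(0, 0.05, reference.shape)
scanned[:30, 2] += 2.0                              # simulated defect region

dist, failed = deviation_map(scanned, reference, tol=0.5)
print(f"{failed.sum()} of {len(scanned)} scanned points exceed tolerance")
```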
Procedia PDF Downloads 120
1069 Risk Assessment of Natural Gas Pipelines in Coal Mined Gobs Based on Bow-Tie Model and Cloud Inference
Authors: Xiaobin Liang, Wei Liang, Laibin Zhang, Xiaoyan Guo
Abstract:
Pipelines inevitably pass through coal mined gobs in mining areas, and the stability of these gobs has a great influence on the safety of the pipelines. After an extensive literature study and field research, it was found that few risk assessment methods exist for coal mined gob pipelines and that there is a lack of data on the gob sites. Therefore, the fuzzy comprehensive evaluation method is widely used based on expert opinions. However, the subjective opinions or lack of experience of individual experts may lead to inaccurate evaluation results, so the accuracy of the results needs to be further improved. This paper presents a comprehensive approach to achieve this purpose by combining the bow-tie model and cloud inference. The specific evaluation process is as follows. First, a bow-tie model composed of a fault tree and an event tree is established to graphically illustrate the probability and consequence indicators of pipeline failure. Second, interval estimation allows the indicators to be scored in the form of intervals to improve the accuracy of the results, and a censored mean algorithm is used to remove the maximum and minimum scores to improve the stability of the results; the golden section method is used to determine the weights of the indicators and reduce the subjectivity of the index weights. Third, the failure probability and failure consequence scores of the pipeline are converted into three numerical characteristics by using cloud inference, which better describes the ambiguity and volatility of the results and, in turn, of the risk level. Finally, cloud drop graphs of failure probability and failure consequences can be drawn, intuitively and accurately illustrating the ambiguity and randomness of the results. A case study of a coal mine gob pipeline carrying natural gas has been investigated to validate the utility of the proposed method. The evaluation results of this case show that the probability of failure of the pipeline is very low while the consequences of failure are more serious, which is consistent with reality.
Keywords: bow-tie model, natural gas pipeline, coal mine gob, cloud inference
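For readers unfamiliar with cloud models, the Python sketch below shows the standard backward cloud generator that converts expert scores into the three numerical characteristics (Ex, En, He), plus a forward generator producing the cloud drops mentioned above; the scores are illustrative, not the case-study data:

```python
import numpy as np

def backward_cloud(scores):
    """Backward cloud generator: turn expert scores into the three numerical
    characteristics of a normal cloud model (Ex, En, He)."""
    x = np.asarray(scores, dtype=float)
    ex = x.mean()
    en = np.sqrt(np.pi / 2.0) * np.mean(np.abs(x - ex))
    he = np.sqrt(abs(x.var(ddof=1) - en ** 2))
    return ex, en, he

def forward_cloud(ex, en, he, n=1000, seed=0):
    """Forward cloud generator: produce cloud drops (value, membership) that
    could be plotted as a cloud drop graph."""
    rng = np.random.default_rng(seed)
    en_i = rng.normal(en, he, n)                 # entropy disturbed by He
    x = rng.normal(ex, np.abs(en_i))
    mu = np.exp(-(x - ex) ** 2 / (2 * en_i ** 2))
    return x, mu

# Illustrative interval-style expert scores for one failure-probability index.
scores = [0.22, 0.30, 0.25, 0.28, 0.35, 0.27, 0.24, 0.31]
ex, en, he = backward_cloud(scores)
drops, membership = forward_cloud(ex, en, he)
print(f"Ex={ex:.3f}, En={en:.3f}, He={he:.3f}")
```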
Procedia PDF Downloads 250
1068 Exploring MPI-Based Parallel Computing in Analyzing Very Large Sequences
Authors: Bilal Wajid, Erchin Serpedin
Abstract:
The health industry is aiming towards personalized medicine. If a patient’s genome needs to be sequenced, it is important that the entire analysis be completed quickly. This paper explores the use of parallel computing to analyze very large sequences. Two cases have been considered. In the first case, the sequence is kept constant and the effect of increasing the number of MPI-based processes is evaluated in terms of execution time, speed, and efficiency. In the second case, the number of MPI-based processes has been kept constant, whereas the length of the sequence was increased.
Keywords: parallel computing, alignment, genome assembly
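A minimal mpi4py sketch of the first case, splitting one long sequence across a fixed set of MPI processes and reducing a per-chunk result (GC counting stands in for the actual analysis, which the abstract does not specify):

```python
# Run with, e.g.:  mpiexec -n 4 python gc_count.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    # Stand-in for a very large sequence read from disk on the root process.
    sequence = "ACGT" * 2_500_000
    chunk = len(sequence) // size
    # Split into near-equal chunks, one per MPI process (last chunk takes the remainder).
    pieces = [sequence[i * chunk: (i + 1) * chunk if i < size - 1 else None]
              for i in range(size)]
else:
    pieces = None

local = comm.scatter(pieces, root=0)             # each rank gets its slice
local_gc = local.count("G") + local.count("C")   # per-rank partial result
total_gc = comm.reduce(local_gc, op=MPI.SUM, root=0)

if rank == 0:
    print(f"GC bases: {total_gc} of {4 * 2_500_000}")
```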
Procedia PDF Downloads 276
1067 Students’ Willingness to Use Public Computing Facilities at a Library
Authors: Norbayah Mohd Suki, Norazah Mohd Suki
Abstract:
This study aims to examine the relationships of attitude, self-efficacy, and subjective norm with students’ behavioural intention to use public computing facilities at a library. Data were collected from 200 undergraduate students enrolled at a higher learning institution in the Federal Territory of Labuan, Malaysia, via a structured questionnaire comprising closed-ended questions, and analyzed using multiple regression analysis. The results show that students’ behavioural intention to use public computing facilities at the library is significantly affected by the subjective norm factor, i.e., the influence of the support of family members, friends, and neighbours. The findings of this study provide a better understanding of the factors likely to influence students’ behavioural intention to use public computing facilities at a library. They also offer valuable insights into the factors which university librarians need to focus on to improve students’ behavioural intention to actively use public computing facilities at a library for quality information retrieval. Directions for future research are also presented.
Keywords: attitude, self-efficacy, subjective norm, behavioural intention
Procedia PDF Downloads 446
1066 Movement Optimization of Robotic Arm Movement Using Soft Computing
Authors: V. K. Banga
Abstract:
Robots are now playing a very promising role in industries. They are commonly used in applications involving repetitive operations or where operation by a human is either risky or not feasible. In most industrial applications, robotic arm manipulators are widely used. Robotic arm manipulators with two-link or three-link structures are common due to their low degrees-of-freedom (DOF) movement; as the DOF of a robotic arm increases, complexity increases. The instrumentation involved in robotics plays a very important role in interacting with the outer environment. In this work, optimal control of the movement of the various DOFs of a robotic arm using various soft computing techniques is presented. We discuss different robotic structures with various DOFs of arm movement. Further stress is placed on the kinematics of the arm structures, i.e., forward kinematics and inverse kinematics. Trajectory planning of robotic arms using soft computing techniques demonstrates the flexibility of this approach. The performance is optimized for all possible input values and results in optimized movement as the resultant output. In conclusion, soft computing plays a very important role in achieving optimized movement of a robotic arm, and it requires only very limited knowledge of the system to implement.
Keywords: artificial intelligence, kinematics, robotic arm, neural networks, fuzzy logic
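For a two-link planar arm, the forward and inverse kinematics mentioned above have closed forms; the Python sketch below shows them with assumed link lengths (the soft computing part of the paper is not reproduced here):

```python
import math

def forward_kinematics(theta1, theta2, l1=1.0, l2=0.7):
    """End-effector position of a planar two-link arm (angles in radians)."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y, l1=1.0, l2=0.7):
    """Closed-form inverse kinematics (one of the two elbow solutions)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(c2) > 1:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

x, y = forward_kinematics(math.radians(30), math.radians(45))
print((x, y), inverse_kinematics(x, y))   # recovers approximately (30 deg, 45 deg)
```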
Procedia PDF Downloads 297
1065 A New Distributed Computing Environment Based On Mobile Agents for Massively Parallel Applications
Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah
Abstract:
In this paper, we propose a new distributed environment for High Performance Computing (HPC) based on mobile agents. It allows parallel programs to be executed in a distributed manner over a flexible grid constituted by a cooperative team of mobile agents. The distributed program to be performed is encapsulated in a team leader agent, which deploys its team workers as Agent Virtual Processing Units (AVPUs). Each AVPU is asked to perform its assigned tasks and to provide the computational results; this makes the management of the data and the team's tasks difficult for the team leader agent and influences the computing performance. In this work, we focused on the implementation of the Mobile Provider Agent (MPA) in order to manage the distribution of data and instructions and to ensure a load balancing model. It also grants some interesting mechanisms for managing the other computing challenges, thanks to the mobile agents’ several skills.
Keywords: image processing, distributed environment, mobile agents, parallel and distributed computing
Procedia PDF Downloads 410
1064 Resource Orchestration Based on Two-Sides Scheduling in Computing Network Control Systems
Authors: Li Guo, Jianhong Wang, Dian Huang, Shengzhong Feng
Abstract:
Computing networks, as a new network architecture, have shown great promise in boosting the utilization of different resources, such as computing, caching, and communications. To maximise the efficiency of resource orchestration in computing network control systems (CNCSs), this work proposes a dynamic orchestration strategy for different resources based on the task requirements of computing power requestors (CPRs). Specifically, computing power providers (CPPs) in CNCSs can share information with each other, especially their current idle resources, through communication channels on the basis of blockchain technology. This dynamic process is modeled as a cooperative game in which the CPPs share the target of maximising long-term rewards by improving the resource utilization ratio. Meanwhile, the task requirements from CPRs, including size, deadline, and calculation, are simultaneously considered in this paper. According to the task requirements, the proposed orchestration strategy can schedule the best-fitting resources in CNCSs, achieving the maximum long-term rewards of the CPPs and the best quality of experience (QoE) of the CPRs at the same time. Based on the EdgeCloudSim simulation platform, the efficiency of the proposed strategy is demonstrated from the sides of both CPRs and CPPs. Besides, experimental results show that the proposed strategy outperforms the other comparisons in all cases.
Keywords: computing network control systems, resource orchestration, dynamic scheduling, blockchain, cooperative game
Procedia PDF Downloads 114
1063 Collision Detection Algorithm Based on Data Parallelism
Authors: Zhen Peng, Baifeng Wu
Abstract:
Modern computing technology has entered the era of parallel computing, with a trend towards sustainable and scalable parallelism. Single Instruction Multiple Data (SIMD) is an important way to go along with this trend: it is able to gather more and more computing ability by increasing the number of processor cores without the need to modify the program. Meanwhile, in the fields of scientific computing and engineering design, many computation-intensive applications are facing the challenge of increasingly large amounts of data, and data parallel computing will be an important way to further improve the performance of these applications. In this paper, we take accurate collision detection in building information modeling as an example and demonstrate a model for constructing a data parallel algorithm. According to the model, a complex object is decomposed into sets of simple objects, and collision detection among complex objects is converted into collision detection among simple objects. The resulting algorithm is a typical SIMD algorithm, and its advantages in parallelism and scalability are unmatched by the traditional algorithms.
Keywords: data parallelism, collision detection, single instruction multiple data, building information modeling, continuous scalability
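A small Python/NumPy sketch of the data-parallel idea: once complex objects are decomposed into simple objects with axis-aligned bounding boxes, all candidate pairs can be tested in one vectorized pass (the box data here is synthetic):

```python
import numpy as np

def aabb_overlaps(boxes_a, boxes_b):
    """Data-parallel axis-aligned bounding box test: boxes are (N, 6) arrays of
    [xmin, ymin, zmin, xmax, ymax, zmax]; one vectorized pass checks all pairs."""
    a_min, a_max = boxes_a[:, None, :3], boxes_a[:, None, 3:]
    b_min, b_max = boxes_b[None, :, :3], boxes_b[None, :, 3:]
    return np.all((a_min <= b_max) & (b_min <= a_max), axis=2)   # (N_a, N_b) mask

# Illustrative data: simple boxes decomposed from two building components.
rng = np.random.default_rng(3)
origins = rng.uniform(0, 50, size=(200, 3))
boxes_a = np.hstack([origins, origins + rng.uniform(1, 3, (200, 3))])
origins = rng.uniform(0, 50, size=(300, 3))
boxes_b = np.hstack([origins, origins + rng.uniform(1, 3, (300, 3))])

mask = aabb_overlaps(boxes_a, boxes_b)
print(f"{mask.sum()} potentially colliding simple-object pairs")
```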
Procedia PDF Downloads 289
1062 Providing Reliability, Availability and Scalability Support for Quick Assist Technology Cryptography on the Cloud
Authors: Songwu Shen, Garrett Drysdale, Veerendranath Mannepalli, Qihua Dai, Yuan Wang, Yuli Chen, David Qian, Utkarsh Kakaiya
Abstract:
Hardware accelerators have been a promising solution for reducing the cost of cloud data centers. This paper investigates the QoS enhancement of accelerating an important datacenter workload: the webserver (or proxy), which faces high computational consumption originating from secure sockets layer (SSL) or transport layer security (TLS) processing in the cloud environment. Our study reveals that for accelerator maintenance cases, such as a driver/firmware upgrade or a hardware reset due to a hardware hang, we can still provide cryptography services by switching to software during the maintenance phase and then switching back to the accelerator after maintenance. The switching is seamless to server applications such as Nginx that run inside a VM on top of the server. To achieve this high availability goal, we propose a comprehensive fallback solution based on Intel® QuickAssist Technology (QAT). This approach introduces an architecture that involves collaboration between the physical function (PF) and virtual function (VF), and collaboration among the VF, OpenSSL, and the web application Nginx. The evaluation shows that our solution can provide highly reliable, available, and scalable (RAS) hardware cryptography service in a 7x24x365 manner in the cloud environment.
Keywords: accelerator, cryptography service, RAS, secure sockets layer/transport layer security, SSL/TLS, virtualization fallback architecture
Procedia PDF Downloads 159
1061 The Impact of Vertical Velocity Parameter Conditions and Its Relationship with Weather Parameters in the Hail Event
Authors: Nadine Ayasha
Abstract:
Hail fell in Sukabumi (August 23, 2020), Sekadau (August 22, 2020), and Bogor (September 23, 2020), with this extreme weather phenomenon occurring in the dry season. This study uses ERA5 reanalysis model data to examine the impact of vertical velocity on hail occurrence in the dry season, as well as its relation to other weather parameters such as relative humidity, streamlines, and wind velocity. Moreover, HCAI product satellite data are used as supporting data for the convective cloud development analysis. Based on the graphs, contours, and Hovmoller vertical cross-sections from the ERA5 model, the vertical velocity values in the 925-300 mb layer in Sukabumi, Sekadau, and Bogor before the hail events ranged between -1.2 and -0.2, -1.5 and -0.2, and -1 and 0 Pa/s, respectively. A negative value indicates upward motion of the air mass, which triggers the convective cloud growth that produces hail; this is evidenced by the presence of cumulonimbus cloud in the HCAI product when the hail fell. Therefore, the vertical velocity has a significant effect on the hail events. In addition, the relative humidity in the 850-700 mb layer is quite moist, ranging from 80-90%. Meanwhile, the streamlines and wind velocity in the three regions show convergence, with wind velocity slowing to 2-4 knots. These results show that the upward motion indicated by the vertical velocity, together with the moist atmosphere and the convergence, is sufficient for the growth of the convective clouds that produce hail in the dry season.
Keywords: hail, extreme weather, vertical velocity, relative humidity, streamline
Procedia PDF Downloads 159
1060 The Right to Data Portability and Its Influence on the Development of Digital Services
Authors: Roman Bieda
Abstract:
The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller, in a structured, commonly used and machine-readable format, and to transmit those data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also make it easier to change service providers (e.g., to change a bank or a cloud computing service provider). It will therefore contribute to the development of competition and of the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.
Keywords: data portability, digital market, GDPR, personal data
Procedia PDF Downloads 473
1059 Rainwater Harvesting and Management of Ground Water (Case Study Weather Modification Project in Iran)
Authors: Samaneh Poormohammadi, Farid Golkar, Vahideh Khatibi Sarabi
Abstract:
Climate change and consecutive droughts have increased the importance of using rainwater harvesting methods. One of the methods of rainwater harvesting, and in other words of managing atmospheric water resources, is the use of weather modification technologies. Weather modification (also known as weather control) is the act of intentionally manipulating or altering the weather. The most common form of weather modification is cloud seeding, which increases rain or snow, usually for the purpose of increasing the local water supply. Cloud seeding operations have been carried out in central Iran since 1999 with the aim of harvesting rainwater and reducing the effects of drought. In this research, we analyze the results of cloud seeding operations in the Simindasht plain in northern Iran. Rainwater harvesting with the help of cloud seeding technology has been evaluated through its effects on surface water and underground water. Two different methods have been used to estimate runoff: the first is the US Soil Conservation Service (SCS) curve number method; the other is known as the reasoning method. In order to determine the infiltration rate into underground water, the balance reports of the country's comprehensive water plan have been used. In this regard, the study areas located in the target area of each province have been extracted by drawing maps of the influence coefficients of each area in GIS software; the infiltration coefficients themselves were taken from the balance sheet reports of the country's comprehensive water plan. Then, based on the area of each study area, the weighted average infiltration coefficient of the study areas located in the target area of each province is taken as the infiltration coefficient of that province. Results show that the amount of water harvested from rain with the help of the cloud seeding projects in Simindasht is as follows: an increase in runoff of 63.9 million cubic meters (with the SCS equation) or 51.2 million cubic meters (with the logical equation), and an increase in groundwater resources of 40.5 million cubic meters.
Keywords: rainwater harvesting, ground water, atmospheric water resources, weather modification, cloud seeding
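For reference, the standard SCS curve number runoff relation used by the first method can be written as in the Python sketch below; the rainfall, curve number, and catchment area are illustrative values, not the project's calibrated inputs:

```python
def scs_runoff_depth(rain_mm, cn, ia_ratio=0.2):
    """Direct runoff depth (mm) from the SCS curve number method:
    S = 25400/CN - 254, Ia = ia_ratio * S, Q = (P - Ia)^2 / (P - Ia + S)."""
    s = 25400.0 / cn - 254.0
    ia = ia_ratio * s
    if rain_mm <= ia:
        return 0.0
    return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

# Illustrative values only (not the project's inputs).
rain_mm = 30.0            # one seeded storm event
cn = 80                   # composite curve number of the catchment
area_km2 = 1500.0

q_mm = scs_runoff_depth(rain_mm, cn)
volume_m3 = q_mm / 1000.0 * area_km2 * 1e6       # depth (m) times area (m^2)
print(f"runoff depth {q_mm:.1f} mm ~ {volume_m3 / 1e6:.1f} million m^3")
```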
Procedia PDF Downloads 105
1058 Optimization of Cloud Classification Using Particle Swarm Algorithm
Authors: Riffi Mohammed Amine
Abstract:
A cloud is made up of small particles of liquid water or ice suspended in the atmosphere, which generally do not reach the ground. Various methods are used to classify clouds. This article focuses specifically on a technique known as particle swarm optimization (PSO), an AI approach inspired by the collective behaviors of animals living in groups, such as schools of fish and flocks of birds, and used to solve complex classification and optimization problems with approximate solutions. The proposed technique was evaluated using a series of second-generation Meteosat images taken by the MSG satellite. The results indicate that the proposed method gives acceptable performance.
Keywords: remote sensing, particle swarm optimization, clouds, meteorological image
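A minimal Python sketch of the PSO update rule (personal-best and global-best attraction); the toy objective merely stands in for whatever cluster-separation criterion would be optimized on the satellite imagery:

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: velocity and position updates driven by
    each particle's personal best and the swarm's global best."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy objective standing in for a cluster-separation criterion on image features.
centroid_target = np.array([0.3, 0.7, 0.5])
best, best_val = pso(lambda c: np.sum((c - centroid_target) ** 2), dim=3)
print(best, best_val)
```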
Procedia PDF Downloads 17
1057 Security Design of Root of Trust Based on RISC-V
Authors: Kang Huang, Wanting Zhou, Shiwei Yuan, Lei Li
Abstract:
As information technology develops rapidly, security has become an increasingly critical issue for computer systems. In particular, as cloud computing and the Internet of Things (IoT) continue to gain widespread adoption, computer systems face new security threats and attacks. The Root of Trust (RoT) is the foundation for providing basic trusted computing and is used to verify the security and trustworthiness of other components. Designing a reliable Root of Trust and guaranteeing its own security are essential for improving the overall security and credibility of computer systems. In this paper, we discuss the implementation of self-security technology for a RISC-V-based Root of Trust at the hardware level. To effectively safeguard the security of the Root of Trust, security safeguard technologies for the Root of Trust have been studied. First, a lightweight and secure boot framework is proposed as a protection mechanism. Second, two kinds of memory protection mechanisms are built to defend against memory attacks. Moreover, the hardware implementation of the proposed method has also been investigated. A series of experiments and tests have been carried out to verify the effectiveness of the proposed method. The experimental results demonstrate that the proposed approach is effective in verifying the integrity of the Root of Trust’s own boot ROM, user instructions, and data, ensuring authenticity and enabling the secure boot of the Root of Trust’s own system. Additionally, our approach provides memory protection against certain types of memory attacks, such as cache leaks and tampering, and ensures the security of Root-of-Trust sensitive information, including keys.
Keywords: root of trust, secure boot, memory protection, hardware security
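As a conceptual illustration of the integrity check behind a secure boot, the Python sketch below measures a stand-in boot image and compares it against a provisioned golden digest; it is only an illustration of the principle, not the RISC-V hardware implementation described in the paper:

```python
import hashlib
import hmac

# Stand-in boot ROM contents (repeated RISC-V NOP encoding); the golden digest
# would normally be provisioned at manufacturing time, not recomputed here.
boot_rom = b"\x13\x00\x00\x00" * 64
GOLDEN_DIGEST = hashlib.sha256(boot_rom).hexdigest()

def verify_boot_image(image, golden_digest):
    """Measure the image and compare against the provisioned golden digest;
    a constant-time comparison avoids leaking how many bytes matched."""
    measured = hashlib.sha256(image).hexdigest()
    return hmac.compare_digest(measured, golden_digest)

if verify_boot_image(boot_rom, GOLDEN_DIGEST):
    print("integrity check passed: continue boot")
else:
    print("integrity check failed: halt and report")

tampered = bytearray(boot_rom)
tampered[8] ^= 0xFF                            # simulate a single-byte modification
print(verify_boot_image(bytes(tampered), GOLDEN_DIGEST))   # False
```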
Procedia PDF Downloads 216
1056 Virtualizing Attendance and Reducing Impacts on the Environment with a Mobile Application
Authors: Paulo R. M. Andrade, Adriano B. Albuquerque, Otávio F. Frota, Robson V. Silveira, Fátima A. da Silva
Abstract:
Information technology has been gaining more and more space, whether in industry, commerce, or even personal use, but its misuse brings harm to the environment and human health as a result. Contributing to the sustainability of the planet means compensating the environment, in whole or in part, for what is withdrawn from it. Green computing proposes practices for using IT in an environmentally correct way, in aid of strategic management and communication. This work focuses on showing how a mobile application can help businesses reduce costs and the environmental impacts caused by their processes, through a case study of a public company in Brazil.
Keywords: green computing, information technology, e-government, sustainable development, mobile computing
Procedia PDF Downloads 419
1055 Cloud-Based Multiresolution Geodata Cube for Efficient Raster Data Visualization and Analysis
Authors: Lassi Lehto, Jaakko Kahkonen, Juha Oksanen, Tapani Sarjakoski
Abstract:
The use of raster-formatted data sets in geospatial analysis is increasing rapidly. At the same time, geographic data are being introduced into disciplines outside the traditional domain of geoinformatics, like climate change, intelligent transport, and immigration studies. These developments call for better methods to deliver raster geodata in an efficient and easy-to-use manner. Data cube technologies have traditionally been used in the geospatial domain for managing Earth Observation data sets that have strict requirements for effective handling of time series. The same approach and methodologies can also be applied in managing other types of geospatial data sets. A cloud service-based geodata cube, called GeoCubes Finland, has been developed to support online delivery and analysis of the most important geospatial data sets with national coverage. The main target group of the service is the academic research institutes in the country. The most significant aspects of the GeoCubes data repository include the use of multiple resolution levels, a cloud-optimized file structure, and a customized, flexible content access API. Input data sets are pre-processed while being ingested into the repository to bring them into a harmonized form in aspects like georeferencing, sampling resolutions, spatial subdivision, and value encoding. All the resolution levels are created using an appropriate generalization method, selected depending on the nature of the source data set. Multiple pre-processed resolutions enable new kinds of online analysis approaches to be introduced. Analysis processes based on interactive visual exploration can be effectively carried out, as the level of resolution closest to the visual scale can always be used. In the same way, statistical analysis can be carried out on resolution levels that best reflect the scale of the phenomenon being studied. Access times remain close to constant, independent of the scale applied in the application. The cloud service-based approach applied in the GeoCubes Finland repository enables analysis operations to be performed on the server platform, thus making high-performance computing facilities easily accessible. The developed GeoCubes API supports this kind of approach for online analysis. The use of cloud-optimized file structures in data storage enables the fast extraction of subareas. The access API allows for the use of vector-formatted administrative areas and user-defined polygons as definitions of subareas for data retrieval. Administrative areas of the country in four levels are readily available from the GeoCubes platform. In addition to direct delivery of raster data, the service also supports a so-called virtual file format, in which only a small text file is first downloaded. The text file contains links to the raster content on the service platform. The actual raster data is downloaded on demand, for the spatial area and at the resolution level required in each stage of the application. Through the geodata cube approach, pre-harmonized geospatial data sets are made accessible to new categories of inexperienced users in an easy-to-use manner. At the same time, the multiresolution nature of the GeoCubes repository facilitates expert users in introducing new kinds of interactive online analysis operations.
Keywords: cloud service, geodata cube, multiresolution, raster geodata
Procedia PDF Downloads 136
1054 Implementing a Neural Network on a Low-Power and Mobile Cluster to Aid Drivers with Predictive AI for Traffic Behavior
Authors: Christopher Lama, Alix Rieser, Aleksandra Molchanova, Charles Thangaraj
Abstract:
New technologies like Tesla’s Dojo have made high-performance embedded computing more available. Although automobile computing has developed and benefited enormously from these more recent technologies, the costs are still high, prohibitively so in some cases for broader adoption, particularly in the after-market and enthusiast markets. This project aims to implement a Raspberry Pi-based, low-power (under one hundred watts), highly mobile computing cluster for a neural network. The computing cluster, built from off-the-shelf components, is more affordable and therefore makes wider adoption possible. The paper describes the design of the neural network, the Raspberry Pi-based cluster, and the applications the cluster will run. The neural network will use input data from sensors and cameras to project a live view of the road state as the user drives. The neural network will be trained to predict traffic behavior and generate warnings when potentially dangerous situations are predicted. The significant outcomes of this study will be twofold: first, to implement and test the low-cost cluster, and second, to ascertain the effectiveness of the predictive AI implemented on the cluster.
Keywords: CS pedagogy, student research, cluster computing, machine learning
Procedia PDF Downloads 102