Search results for: cloud computing application
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9047

8807 Frequency- and Content-Based Tag Cloud Font Distribution Algorithm

Authors: Ágnes Bogárdi-Mészöly, Takeshi Hashimoto, Shohei Yokoyama, Hiroshi Ishikawa

Abstract:

The spread of Web 2.0 has caused an explosion of user-generated content. Users can tag resources to describe and organize them. Tag clouds provide a rough impression of the relative importance of each tag within the overall cloud in order to facilitate browsing among numerous tags and resources. The goal of our paper is to enrich the visualization of tag clouds. A font distribution algorithm is proposed to calculate a novel metric based on frequency and content, and to assign tags to font-size classes from this metric based on a power-law distribution and percentages. The suggested algorithm has been validated and verified on the tag cloud of a real-world thesis portal.
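
As a rough illustration of the classification step, the sketch below assigns tags to font-size classes from a combined frequency/content score using power-law class boundaries. The exact metric, the weight alpha, the exponent gamma, and the class percentages here are illustrative assumptions, not the paper's published formula.

```python
from collections import Counter

def font_classes(tags, content_scores, n_classes=5, alpha=0.5):
    """Assign each distinct tag to a font-size class (1 = smallest)
    from a combined frequency/content metric, using power-law
    class boundaries so only a few tags reach the largest classes."""
    freq = Counter(tags)  # tags: iterable of tag occurrences
    max_f = max(freq.values())
    max_c = max(content_scores.values(), default=0.0) or 1.0
    # Combined metric: weighted mix of normalized frequency and content score.
    metric = {t: alpha * freq[t] / max_f
                 + (1 - alpha) * content_scores.get(t, 0.0) / max_c
              for t in freq}
    # Power-law boundaries: class k covers metrics up to (k / n_classes)^gamma.
    gamma = 2.0
    bounds = [(k / n_classes) ** gamma for k in range(1, n_classes + 1)]
    return {t: next(k for k, b in enumerate(bounds, start=1) if m <= b)
            for t, m in metric.items()}
```

Since real tag frequencies are heavy-tailed, most tags land in the small classes, matching the look of a typical tag cloud.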

Keywords: tag cloud, font distribution algorithm, frequency-based, content-based, power law

Procedia PDF Downloads 476
8806 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload work from mobile devices to dedicated rendering servers that are far more powerful. But this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted, so it can be reduced by compressing the frames before sending. Standard compression algorithms like JPEG yield only a minor size reduction. Since the images to be compressed are consecutive camera frames, there won't be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but the WebGL implementation limits the precision of floating-point numbers to 16 bits on most devices. This can introduce noise to the image due to rounding errors, which add up over time. This can be solved using an improved inter-frame compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame. This eliminates the need for floating-point subtraction, thereby cutting down on noise. Change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference; the kernel weights for this comparison can be fine-tuned to match the type of image being compressed. 2) Dynamic load distribution: Conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach can hurt bandwidth usage and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and network conditions. The protocol is responsible for dynamically partitioning the tasks. Special flags communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to effectively spread the load and thereby scale horizontally; this is achieved by isolating client connections into different processes.
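
The following sketch illustrates the inter-frame change-detection idea described above: integer pixel differences (avoiding floating-point rounding noise) are smoothed with a weighted kernel, and only changed pixels are transmitted. The 3x3 kernel, the threshold, and the grayscale-frame assumption are illustrative; the paper's WebGL implementation is not reproduced here.

```python
import numpy as np
from scipy.signal import convolve2d

def changed_mask(prev, curr, kernel=None, threshold=12.0):
    """Flag pixels whose weighted-neighbourhood difference from the
    previous frame exceeds a threshold; unchanged pixels can be
    reused from the previous frame instead of being retransmitted."""
    # Integer subtraction sidesteps the floating-point rounding noise.
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16)).astype(np.float32)
    if kernel is None:
        # Weighted average difference: the centre pixel counts most.
        kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], np.float32)
    kernel = kernel / kernel.sum()
    score = convolve2d(diff, kernel, mode='same', boundary='symm')
    return score > threshold

def encode_frame(prev, curr, **kw):
    """Transmit only coordinates and values of changed pixels."""
    mask = changed_mask(prev, curr, **kw)
    ys, xs = np.nonzero(mask)
    return ys, xs, curr[ys, xs]

def decode_frame(prev, ys, xs, values):
    """Rebuild the frame server-side from the previous frame plus deltas."""
    out = prev.copy()
    out[ys, xs] = values
    return out
```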

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application

Procedia PDF Downloads 49
8805 A Review Paper on Data Security in Precision Agriculture Using Internet of Things

Authors: Tonderai Muchenje, Xolani Mkhwanazi

Abstract:

Precision agriculture uses a number of technologies, devices, protocols, and computing paradigms to optimize agricultural processes. Big data, artificial intelligence, cloud computing, and edge computing are all used to handle the huge amounts of data generated by precision agriculture. However, precision agriculture is still emerging and has a low level of security features. Furthermore, future solutions will demand data availability and accuracy as key points to help farmers, and security is important to build robust and efficient systems. Since precision agriculture comprises a wide variety and quantity of resources, security addresses issues such as compatibility, constrained resources, and massive data. Moreover, conventional protection schemes used in the traditional internet may not be useful for agricultural systems, creating extra demands and opportunities. Therefore, this paper aims at reviewing the state of the art of precision agriculture security, particularly in open-field agriculture, discussing its architecture, describing security issues, and presenting the major challenges and future directions.

Keywords: precision agriculture, security, IoT, EIDE

Procedia PDF Downloads 62
8804 Exploiting Non-Uniform Utility of Computing: A Case Study

Authors: Arnab Sarkar, Michael Huang, Chuang Ren, Jun Li

Abstract:

The increasing importance of computing in modern society has brought substantial growth in the demand for more computational power. In some problem domains, such as scientific simulations, available computational power still sets a limit on what can be practically explored in computation. For many types of code, there is non-uniformity in the utility of computation. That is, not every piece of computation contributes equally to the quality of the result. If this non-uniformity is understood well and exploited effectively, we can utilize available computing power much more effectively. In this paper, we discuss a case study of exploring such non-uniformity in a particle-in-cell simulation platform. We find both that significant non-uniformity exists and that it is generally straightforward to exploit. We show the potential of an order-of-magnitude effective performance gain while keeping comparable output quality. We also discuss some challenges in both the practical application of the idea and the evaluation of its impact.
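
To make the idea concrete, here is a minimal, self-contained analogy of exploiting non-uniform utility (our own construction, not the paper's particle-in-cell experiment): estimating an integral whose value is dominated by a small region. A cheap pilot pass estimates each cell's contribution, and the remaining computational effort (samples) is allocated proportionally.

```python
import numpy as np

rng = np.random.default_rng(0)

def integrand(x):
    # Sharply peaked: almost all of the "utility" of computation
    # is concentrated near x = 0.5.
    return np.exp(-((x - 0.5) / 0.01) ** 2)

def uniform_estimate(n):
    """Baseline: every part of the domain gets equal work."""
    return integrand(rng.uniform(0, 1, n)).mean()

def adaptive_estimate(n, cells=50, pilot=5):
    """Non-uniform allocation: a cheap pilot pass per cell estimates
    its contribution, then the n samples are assigned proportionally."""
    edges = np.linspace(0, 1, cells + 1)
    pilot_x = edges[:-1, None] + rng.uniform(0, 1, (cells, pilot)) / cells
    weight = integrand(pilot_x).mean(axis=1) + 1e-12
    alloc = np.maximum(1, (weight / weight.sum() * n).astype(int))
    return sum(integrand(rng.uniform(lo, hi, m)).mean() * (hi - lo)
               for lo, hi, m in zip(edges[:-1], edges[1:], alloc))
```

For the same total sample budget, the adaptive estimator spends almost all of its work in the peak and reaches a far lower variance than the uniform one.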

Keywords: approximate computing, Landau damping, non-uniform utility computing, particle-in-cell

Procedia PDF Downloads 229
8803 Exploring Data Stewardship in Fog Networking Using Blockchain Algorithm

Authors: Ruvaitha Banu, Amaladhithyan Krishnamoorthy

Abstract:

IoT networks today solve various consumer problems, from home automation systems to aiding the driving of autonomous vehicles, through the cooperation of multiple devices. For example, in an autonomous vehicle environment, multiple sensors are available on roads to monitor weather and road conditions and interact with each other to help the vehicle reach its destination safely and on time. IoT systems are predominantly dependent on the cloud environment for data storage and computing needs, which results in latency problems. With the advent of fog networks, some of this storage and computing is pushed to the edge/fog nodes, saving network bandwidth and reducing latency proportionally. Managing the data stored in these fog nodes becomes crucial, as they may also store sensitive information required for a certain application. Data management in fog nodes is strenuous because fog networks are dynamic in terms of their availability and hardware capability. It becomes more challenging when the nodes in the network are also short-lived, detaching and joining frequently. When an end user or fog node wants to access, read, or write data stored in another fog node, a new protocol becomes necessary to access and manage the data stored in the fog devices, as a conventional static way of managing the data doesn't work in fog networks. The proposed solution discusses a protocol that acts by defining sensitivity levels for the data being written and read. Additionally, a distinct data distribution and replication model among the fog nodes is established to decentralize the access mechanism. In this paper, the proposed model implements stewardship of the data stored in the fog node by applying reinforcement learning, so that access to the data is determined dynamically based on the requests.
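
A minimal sketch of the dynamic access idea, assuming a tabular Q-learning stand-in for the paper's reinforcement learning component; the sensitivity levels, the trust signal, and the reward design below are illustrative.

```python
import random
from collections import defaultdict

SENSITIVITY = {"telemetry": 0, "location": 1, "credentials": 2}  # illustrative levels
ACTIONS = ("deny", "read", "read_write")

class StewardshipPolicy:
    """Tabular Q-learning over (sensitivity, requester-trust) states:
    a toy stand-in for an RL-based dynamic access policy."""
    def __init__(self, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = defaultdict(float)
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def decide(self, sensitivity, trust):
        state = (sensitivity, trust)
        if random.random() < self.eps:          # occasional exploration
            return random.choice(ACTIONS)
        return max(ACTIONS, key=lambda a: self.q[state, a])

    def update(self, sensitivity, trust, action, reward):
        state = (sensitivity, trust)
        best_next = max(self.q[state, a] for a in ACTIONS)
        self.q[state, action] += self.alpha * (
            reward + self.gamma * best_next - self.q[state, action])
```

A deployment would reward serving legitimate requests and heavily penalize granting access to high-sensitivity data for low-trust requesters, so the learned policy tightens with sensitivity.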

Keywords: IoT, fog networks, data stewardship, dynamic access policy

Procedia PDF Downloads 26
8802 Distributed System Computing Resource Scheduling Algorithm Based on Deep Reinforcement Learning

Authors: Yitao Lei, Xingxiang Zhai, Burra Venkata Durga Kumar

Abstract:

As the quantity and complexity of computing in large-scale software systems increase, distributed system computing becomes increasingly important. A distributed system realizes high-performance computing through collaboration between different computing resources. Without efficient resource scheduling, distributed computing may waste resources and incur high costs. However, resource scheduling is usually an NP-hard problem, so no general solution exists, although optimization algorithms such as genetic algorithms and ant colony optimization do. The large scale of distributed systems makes these traditional optimization algorithms challenging to apply, so heuristic and machine learning algorithms are usually employed to ease the computing load. We therefore review traditional resource scheduling optimization algorithms and introduce a deep reinforcement learning method that combines the perceptual ability of neural networks with the decision-making ability of reinforcement learning. Using this machine learning method, we try to find the important factors that influence the performance of distributed system computing and help the distributed system schedule its computing resources efficiently. This paper surveys the application of deep reinforcement learning to distributed system computing resource scheduling, proposes a deep reinforcement learning method that uses a recurrent neural network to optimize resource scheduling, and presents the challenges and improvement directions for DRL-based resource scheduling algorithms.
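
As a sketch of the proposed direction, the code below pairs a small recurrent (GRU) policy network with a REINFORCE-style update for task-to-node assignment. The state encoding, the reward (negative makespan), and the network sizes are our illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class SchedulerPolicy(nn.Module):
    """Recurrent policy for assigning incoming tasks to compute nodes.
    state = per-node load features; the recurrence carries the
    recent scheduling history across decisions."""
    def __init__(self, n_nodes, feat_dim=4, hidden=64):
        super().__init__()
        self.gru = nn.GRU(n_nodes * feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_nodes)

    def forward(self, states, h=None):
        # states: (batch, time, n_nodes * feat_dim)
        out, h = self.gru(states, h)
        return torch.softmax(self.head(out), dim=-1), h  # node probabilities

def reinforce_update(policy, optimizer, states, actions, reward):
    """One policy-gradient step; reward could be the negative makespan
    of the produced schedule."""
    probs, _ = policy(states)
    logp = torch.log(probs.gather(-1, actions.unsqueeze(-1)).squeeze(-1))
    loss = -(reward * logp.sum())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```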

Keywords: resource scheduling, deep reinforcement learning, distributed system, artificial intelligence

Procedia PDF Downloads 82
8801 Stackelberg Security Game for Optimizing Security of Federated Internet of Things Platform Instances

Authors: Violeta Damjanovic-Behrendt

Abstract:

This paper presents an approach for optimal cyber security decisions to protect instances of a federated Internet of Things (IoT) platform in the cloud. The presented solution implements the repeated Stackelberg Security Game (SSG) and a model called Stochastic Human behaviour model with AttRactiveness and Probability weighting (SHARP). SHARP employs the Subjective Utility Quantal Response (SUQR) for formulating a subjective utility function, which is based on the evaluations of alternative solutions during decision-making. We augment the repeated SSG (including SHARP and SUQR) with a reinforcement learning algorithm called Naïve Q-Learning. Naïve Q-Learning belongs to the category of active and model-free Machine Learning (ML) techniques in which the agent (either the defender or the attacker) attempts to find an optimal security solution. In this way, we combine game theory (GT) and ML algorithms for discovering optimal cyber security policies. The proposed security optimization components will be validated in a collaborative cloud platform that is based on the Industrial Internet Reference Architecture (IIRA) and its recently published security model.
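
A toy sketch of the defender-side learning loop, assuming a bandit-style "naive" Q-update and a SUQR-like attacker response. The SUQR weights, the payoffs, and the single-guard coverage model are illustrative simplifications of the paper's repeated SSG.

```python
import numpy as np

rng = np.random.default_rng(1)

def suqr_attack_probs(coverage, reward_a, penalty_a, w=(-9.0, 0.8, 0.6)):
    """Subjective Utility Quantal Response: the attacker picks target t
    with probability proportional to exp(w1*c_t + w2*R_t - w3*P_t)."""
    u = w[0] * coverage + w[1] * reward_a - w[2] * penalty_a
    e = np.exp(u - u.max())
    return e / e.sum()

n_targets = 5
reward_a = rng.uniform(1, 10, n_targets)    # attacker reward per target
penalty_a = rng.uniform(1, 10, n_targets)   # attacker penalty if caught
q = np.zeros(n_targets)                     # defender Q-value per covered target
alpha, eps = 0.1, 0.2

for episode in range(5000):
    # Epsilon-greedy choice of which single target to cover this round.
    guard = rng.integers(n_targets) if rng.random() < eps else int(q.argmax())
    coverage = np.eye(n_targets)[guard]
    attack = rng.choice(n_targets, p=suqr_attack_probs(coverage, reward_a, penalty_a))
    # Defender reward: catch the attacker, or suffer a scaled loss.
    r = 1.0 if attack == guard else -reward_a[attack] / 10.0
    q[guard] += alpha * (r - q[guard])      # naive (stateless) Q-update
```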

Keywords: security, internet of things, cloud computing, stackelberg game, machine learning, naive q-learning

Procedia PDF Downloads 326
8800 Design of Quality Assessment System for On-Orbit 3D Printing Based on 3D Reconstruction Technology

Authors: Jianning Tang, Trevor Hocksun Kwan, Xiaofeng Wu

Abstract:

With the increasing demand for space use in multiple sectors (navigation, telecommunication, imagery, etc.), the deployment and maintenance demands of satellites are growing. Considering the high launch cost and the restrictions on the weight and size of the payload when using a launch vehicle, on-orbit manufacturing has attracted more attention because of its significant potential to support future space missions. 3D printing is the most promising manufacturing technology that could be applied in space. However, due to the lack of autonomous quality assessment, the operation of conventional 3D printers still relies on human presence to supervise the printing process. This paper develops an automatic 3D reconstruction system aimed at detecting failures on 3D printed objects through the application of point cloud technology. Based on the data obtained from the point cloud, the 3D printer can locate and repair the failure. The system will increase automation and make 3D printing more feasible for space use without human intervention.
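
A minimal sketch of the point-cloud comparison such a quality assessment could perform, assuming a reference model cloud is available: nearest-neighbour distances flag excess material in the scan and missing material in the reference. The tolerance and defect criteria are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def print_defects(reference, scanned, tol=0.5):
    """Compare a scanned point cloud of the printed part against the
    reference model cloud (both (N, 3) arrays, same units, e.g. mm);
    points deviating by more than `tol` are candidate print failures."""
    # Scanned points far from any reference point: excess material / misprint.
    dist, _ = cKDTree(reference).query(scanned)
    defects = scanned[dist > tol]
    # Reference regions with no nearby scanned point: missing material.
    dist_back, _ = cKDTree(scanned).query(reference)
    missing = reference[dist_back > tol]
    return defects, missing
```

The centroids of the `defects` and `missing` clusters give the printer the locations to revisit for repair.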

Keywords: 3D printing, quality assessment, point cloud, on-orbit manufacturing

Procedia PDF Downloads 90
8799 Modelling Mode Choice Behaviour Using Cloud Theory

Authors: Leah Wright, Trevor Townsend

Abstract:

Mode choice models are crucial instruments in the analysis of travel behaviour. These models show the relationship between an individual's choice of transportation mode for a given O-D pair and the individual's socioeconomic characteristics, such as household size, income level, age and/or gender, and the features of the transportation system. The most popular functional forms of these models are based on Utility-Based Choice Theory, which addresses the uncertainty in the decision-making process with the use of an error term. However, with the development of artificial intelligence, many researchers have started to take a different approach to travel demand modelling. In recent times, researchers have looked at using neural networks, fuzzy logic, and rough set theory to develop improved mode choice formulas. The concept of cloud theory has recently been introduced to model decision-making under uncertainty. Unlike the previously mentioned theories, cloud theory recognises a relationship between randomness and fuzziness, two of the most common types of uncertainty. This research aims to investigate the use of cloud theory in mode choice models. This paper highlights the conceptual framework of a mode choice model using cloud theory. Merging decision-making under uncertainty with mode choice models is state of the art. The cloud theory model is expected to address the issues and concerns with the nested logit model and improve the design of mode choice models and their use in travel demand analysis.
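
For reference, the forward normal cloud generator at the heart of cloud theory can be sketched as follows: the entropy En captures randomness, the hyper-entropy He captures fuzziness about that randomness, and each generated "drop" carries a membership degree. Applying it so that each mode's perceived utility is a cloud, with the highest-membership drop driving the choice, is our illustration, not the paper's final specification.

```python
import numpy as np

def normal_cloud(ex, en, he, n=1000, rng=None):
    """Forward normal cloud generator: each 'cloud drop' is a value x
    with a membership degree mu, combining randomness (en) and
    fuzziness about that randomness (he)."""
    rng = rng or np.random.default_rng()
    # Per-drop entropy: the spread itself is uncertain (hyper-entropy).
    en_prime = np.abs(rng.normal(en, he, n)) + 1e-12
    x = rng.normal(ex, en_prime)              # drop position
    mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))
    return x, mu
```

In a mode choice sketch, drawing one drop per mode from its (Ex, En, He) utility cloud and choosing the mode with the largest sampled value reproduces both the random and the fuzzy components of the decision.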

Keywords: cloud theory, decision-making, mode choice models, travel behaviour, uncertainty

Procedia PDF Downloads 351
8798 AER Model: An Integrated Artificial Society Modeling Method for Cloud Manufacturing Service Economic System

Authors: Deyu Zhou, Xiao Xue, Lizhen Cui

Abstract:

With the increasing collaboration among various services and the growing complexity of user demands, there are more and more factors affecting the stable development of the cloud manufacturing service economic system (CMSE). This poses new challenges to the evolution analysis of the CMSE. Many researchers have modeled and analyzed the evolution process of CMSE from the perspectives of individual learning and internal factors influencing the system, but without considering other important characteristics of the system's individuals (such as heterogeneity, bounded rationality, etc.) and the impact of external environmental factors. Therefore, this paper proposes an integrated artificial social model for the cloud manufacturing service economic system, which considers both the characteristics of the system's individuals and the internal and external influencing factors of the system. The model consists of three parts: the Agent model, environment model, and rules model (Agent-Environment-Rules, AER): (1) the Agent model considers important features of the individuals, such as heterogeneity and bounded rationality, based on the adaptive behavior mechanisms of perception, action, and decision-making; (2) the environment model describes the activity space of the individuals (real or virtual environment); (3) the rules model, as the driving force of system evolution, describes the mechanism of the entire system's operation and evolution. Finally, this paper verifies the effectiveness of the AER model through computational and experimental results.
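
A minimal agent-environment-rules skeleton in the spirit of the AER decomposition, with heterogeneity (per-agent quality) and bounded rationality (noisy partial adjustment); the price-adjustment rule is an illustrative stand-in for the paper's service-market mechanisms.

```python
import random

class Agent:
    """Bounded-rational service agent: perceives the local market,
    decides with limited information, and acts by repricing."""
    def __init__(self, quality):
        self.quality = quality                  # heterogeneity across agents
        self.price = random.uniform(1, 10)

    def perceive(self, env):
        return env.mean_price

    def decide(self, mean_price):
        # Bounded rationality: partial adjustment toward the market
        # mean, shifted by own quality, perturbed by noise.
        self.price += (0.1 * (mean_price - self.price)
                       + 0.05 * self.quality + random.gauss(0, 0.2))

class Environment:
    """Activity space of the agents (here: a single shared market)."""
    def __init__(self, agents):
        self.agents = agents

    @property
    def mean_price(self):
        return sum(a.price for a in self.agents) / len(self.agents)

def rules_step(env):
    """Rules model: one round of system evolution."""
    for a in env.agents:
        a.decide(a.perceive(env))

env = Environment([Agent(random.random()) for _ in range(100)])
for _ in range(50):
    rules_step(env)
```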

Keywords: cloud manufacturing service economic system (CMSE), AER model, artificial social modeling, integrated framework, computing experiment, agent-based modeling, social networks

Procedia PDF Downloads 49
8797 Analysis of the Strategic Value at the Usage of Green IT Application for the Organizational Product or Service in Order to Gain the Competitive Advantage; Case: E-Money of a Telecommunication Firm in Indonesia

Authors: I Putu Deny Arthawan Sugih Prabowo, Eko Nugroho, Rudy Hartanto

Abstract:

Green IT is a concept of using technology (IT) wisely, efficiently, and in an environmentally friendly manner; it exists as a consequence of the current rapid growth of technology (especially IT). Beyond the environmental benefits, the usage of Green IT applications, e.g., cloud computing (cloud storage) and E-Money (e-cash), also benefits an organization's business strategy (especially its product/service strategy) in gaining competitive advantage (becoming the market leader). This paper takes as its case E-Money as a value-added service (VAS) of a telecommunication firm in Indonesia, which competes with competitors' similar products (services). Although E-Money has been a popular telecommunication product/service, its strategic values for the organization are still unknown; therefore, the aim of this paper is to analyze its strategic values for gaining organizational competitive advantage. The strategic value analysis considers how to assess the strategic benefits and how to manage the challenges or risks of implementing E-Money at the organization as an organizational product/service. The paper uses a research model to investigate the influence of both perceived risks and organizational culture on the usage of the Green IT application at the organization, and the influence of both that usage and the threats/challenges of the organizational products/services on the competitive advantage of those products/services. The paper uses a quantitative research method (collecting information from field respondents via research questionnaires); the primary data are analyzed with both descriptive and inferential statistics, using SmartPLS. Besides the quantitative method, the paper also uses qualitative methods, such as interviewing field respondents and/or direct field observation, to confirm in depth the quantitative analysis results in certain domains, e.g., the organizational culture and internal processes that support the usage of Green IT applications for the organizational product/service (E-Money in this case). The paper is still at an early stage of in-progress research. Its results may serve as a reference for organizations in developing business strategies, especially for products/services that relate to Green IT applications. It may also motivate future studies, e.g., on the influence of knowledge transfer about E-Money and/or other Green IT application-based products/services on organizational service performance in gaining competitive advantage.

Keywords: Green IT, competitive advantage, strategic value, organization (firm or company), organizational product (service)

Procedia PDF Downloads 279
8796 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning

Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park

Abstract:

The goal of this research is to estimate structural shape change using terrestrial laser scanning. The study develops a data-reduction and shape-change estimation algorithm for large-capacity scan data. The point cloud of the scan data is converted to voxels and sampled. The shape estimation technique is studied to detect changes in structural patterns, such as skyscrapers, bridges, and tunnels, based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The point cloud data provide the relative representative values of the shape information and are used as a model for detecting point cloud changes in a data structure. The aim of the shape estimation model is to develop a technology that can detect not only ordinary but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
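
A minimal sketch of the voxel-based data reduction step, replacing all points in each voxel with their centroid; the voxel size and the centroid as the representative value are illustrative choices. Change detection can then compare the representative values of corresponding voxels (or octree cells) between scanning epochs.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Reduce a terrestrial-laser-scan point cloud ((N, 3) array) by
    replacing all points in each voxel with their centroid."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0, return_inverse=True,
                                   return_counts=True)
    inverse = inverse.reshape(-1)
    # Sum the points of each voxel, then divide by the voxel's count.
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]
```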

Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement

Procedia PDF Downloads 200
8795 Factors Affecting U-Computing Use

Authors: Shui Lien Chen, Chen-Yin Kuo

Abstract:

U-computing use brings many new commerce services that can provide new experiences for customers. Location Based Services (LBS) are one such u-computing service. With the increase in smartphone and mobile internet users, many small and medium-sized enterprises (SMEs) in Taiwan adopt LBS in their marketing strategies. For example, they provide a Facebook check-in benefit (e.g., a discount, free dessert, or coupon) to attract customers. Therefore, this study seeks to understand which factors affect SMEs' adoption of u-computing and their performance after adoption. This study collected 187 usable responses that were analyzed with SmartPLS 2.0 software. The results of this study are as follows. First, entrepreneurial orientation and market orientation positively affect innovation. Second, business resources and innovation positively affect u-computing use. Finally, u-computing use positively affects both business value and customer value.

Keywords: entrepreneurial orientation, market orientation, innovation, business resources, u-computing use, LBS

Procedia PDF Downloads 548
8794 A Review of In-Vehicle Network for Cloud Connected Vehicle

Authors: Hanbhin Ryu, Ilkwon Yun

Abstract:

The automotive industry aims to improve safety and convenience by realizing fully autonomous vehicles. Toward partially realizing fully automated driving, current vehicles already feature a variety of advanced driver assistance systems (ADAS) for safety and infotainment systems for the driver's convenience. This paper presents the Cloud Connected Vehicle (CCV), which connects vehicles to a cloud data center via the access network to control the vehicle toward the next form of autonomous driving, and describes its features. This paper also describes the shortcomings of the existing in-vehicle network (IVN) as a next-generation IVN for the CCV, and organizes the research issues related to 802.3 Ethernet, the next generation of IVN, to verify the feasibility of using Ethernet. Finally, this paper discusses additional considerations for adopting an Ethernet-based IVN for the CCV.

Keywords: autonomous vehicle, cloud connected vehicle, ethernet, in-vehicle network

Procedia PDF Downloads 441
8793 Clouds Influence on Atmospheric Ozone from GOME-2 Satellite Measurements

Authors: S. M. Samkeyat Shohan

Abstract:

This study is mainly focused on the determination and analysis of the photolysis rate of atmospheric, specifically tropospheric, ozone as a function of cloud properties throughout the year 2007. The observational basis for ozone concentrations and cloud properties is the measurement data set of the Global Ozone Monitoring Experiment-2 (GOME-2) sensor on board the polar orbiting Metop-A satellite. Two different spectral ranges are used; ozone total columns are calculated from the wavelength window 325 – 335 nm, while cloud properties, such as cloud top height (CTH) and cloud optical thickness (COT), are derived from the absorption band of molecular oxygen centered at 761 nm. Cloud fraction (CF) is derived from measurements in the ultraviolet, visible, and near-infrared range of GOME-2. First, ozone concentrations above clouds are derived from ozone total columns, subtracting the contribution of stratospheric ozone and filtering out those satellite measurements which have thin and low clouds. Then, the values of ozone photolysis derived from observations are compared with theoretical modeled results, in the latitudinal belts 5˚N - 5˚S and 20˚N - 20˚S, as a function of CF and COT. In general, good agreement is found between the data and the model, proving both the quality of the space-borne ozone and cloud properties as well as the modeling theory of the ozone photolysis rate. The discrepancies found can, however, amount to approximately 15%. Latitudinal seasonal changes of the photolysis rate of ozone are found to be negatively correlated to changes in upper-tropospheric ozone concentrations only in the autumn and summer months within the northern and southern tropical belts, respectively. This fact points to the entangled roles of temperature and nitrogen oxides in ozone production, which are superimposed on its sole photolysis induced by thick and high clouds in the tropics.

Keywords: cloud properties, photolysis rate, stratospheric ozone, tropospheric ozone

Procedia PDF Downloads 183
8792 Three-Dimensional Positioning Method of Indoor Personnel Based on Millimeter Wave Radar Sensor

Authors: Chao Wang, Zuxue Xia, Wenhai Xia, Rui Wang, Jiayuan Hu, Rui Cheng

Abstract:

Aiming at the application of indoor personnel positioning under smog conditions, this paper proposes a 3D positioning method based on the IWR1443 millimeter wave radar sensor. The problem that millimeter-wave radar cannot effectively form contours in 3D point cloud imaging is solved. The results show that the method can effectively achieve indoor positioning and scene construction, with a maximum positioning error of 0.130 m.

Keywords: indoor positioning, millimeter wave radar, IWR1443 sensor, point cloud imaging

Procedia PDF Downloads 66
8791 EECS: Reimagining the Future of Technology Education through Electrical Engineering and Computer Science Integration

Authors: Yousef Sharrab, Dimah Al-Fraihat, Monther Tarawneh, Aysh Alhroob, Ala’ Khalifeh, Nabil Sarhan

Abstract:

This paper explores the evolution of Electrical Engineering (EE) and Computer Science (CS) education in higher learning, examining the feasibility of unifying them into Electrical Engineering and Computer Science (EECS) for the technology industry. It delves into the historical reasons for their separation and underscores the need for integration. Emerging technologies such as AI, Virtual Reality, IoT, Cloud Computing, and Cybersecurity demand an integrated EE and CS program to enhance students' understanding. The study evaluates curriculum integration models, drawing from prior research and case studies, demonstrating how integration can provide students with a comprehensive knowledge base for industry demands. Successful integration necessitates addressing administrative and pedagogical challenges. For academic institutions considering merging EE and CS programs, the paper offers guidance, advocating for a flexible curriculum encompassing foundational courses and specialized tracks in computer engineering, software engineering, bioinformatics, information systems, data science, AI, robotics, IoT, virtual reality, cybersecurity, and cloud computing. Elective courses are emphasized to keep pace with technological advancements. Implementing this integrated approach can prepare students for success in the technology industry, addressing the challenges of a technologically advanced society reliant on both EE and CS principles. Integrating EE and CS curricula is crucial for preparing students for the future.

Keywords: electrical engineering, computer science, EECS, curriculum integration of EE and CS

Procedia PDF Downloads 22
8790 A Novel Computer-Generated Hologram (CGH) Achieved Scheme Generated from Point Cloud by Using a Lens Array

Authors: Wei-Na Li, Mei-Lan Piao, Nam Kim

Abstract:

We propose a novel computer-generated hologram (CGH) generation scheme, wherein the CGH is generated from a point cloud that is transformed, by a mapping relationship, from a series of elemental images captured from a real three-dimensional (3D) object using a lens array. This scheme is composed of three procedures: mapping from elemental images to a point cloud, hologram generation, and hologram display. A mapping method is devised to obtain virtual volume data (a point cloud) from a series of elemental images. This mapping method consists of two steps. First, the coordinate (x, y) pairs and their numbers of appearances are calculated from the series of sub-images, which are generated from the elemental images. Second, a series of corresponding coordinates (x, y, z) are calculated from the elemental images. A hologram is then generated from the volume data calculated in the previous two steps. Eventually, a spatial light modulator (SLM) and a green laser beam are utilized to display this hologram and reconstruct the original 3D object. In this paper, in order to achieve a more autostereoscopic display of a real 3D object, we successfully obtained the actual depth data of every discrete point of the real 3D object and overcame the inherent drawbacks of the depth camera by obtaining the point cloud from the elemental images.
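
For orientation, the classical point-source CGH computation that such a pipeline ends with can be sketched as follows: each point of the cloud contributes a spherical wave to the hologram plane, and the summed field's phase is written to the SLM. The wavelength, pixel pitch, resolution, and depth offset below are illustrative parameters, not the paper's setup.

```python
import numpy as np

def point_cloud_hologram(points, amplitudes, wavelength=532e-9,
                         pitch=8e-6, res=(512, 512), z_offset=0.2):
    """Sum the spherical waves from every point-cloud point on the
    hologram plane (point-source CGH). points: (N, 3) in metres, with
    z measured from the hologram plane; amplitudes: (N,)."""
    k = 2 * np.pi / wavelength
    ys, xs = np.indices(res)
    u = (xs - res[1] / 2) * pitch            # hologram-plane coordinates
    v = (ys - res[0] / 2) * pitch
    field = np.zeros(res, dtype=np.complex128)
    for (px, py, pz), a in zip(points, amplitudes):
        r = np.sqrt((u - px) ** 2 + (v - py) ** 2 + (pz + z_offset) ** 2)
        field += a * np.exp(1j * k * r) / r  # spherical wave of one point
    # Phase pattern to be written to a phase-only SLM.
    return np.angle(field)
```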

Keywords: elemental image, point cloud, computer-generated hologram (CGH), autostereoscopic display

Procedia PDF Downloads 550
8789 Multimodal Biometric Cryptography Based Authentication in Cloud Environment to Enhance Information Security

Authors: D. Pugazhenthi, B. Sree Vidya

Abstract:

Cloud computing is one of the emerging technologies that enables end users to use cloud services on a 'pay per usage' strategy. This technology is growing at a fast pace, and so is its security threat. Among the various services provided by the cloud is storage, where security is a vital factor both for authenticating legitimate users and for protecting information. This paper presents efficient ways of authenticating users as well as securing information on the cloud. The initial phase proposed in this paper deals with an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. Behaviour-based biometrics with unique identification offer more advanced reliability than conventional password authentication. With biometric systems, accounts are accessed only by a legitimate user and not by an impostor. The biometric templates employed here include not a single trait but multiple ones, viz., iris and fingerprints. The coordinating stage of the authentication system uses an ensemble Support Vector Machine (SVM), optimized by assembling the weights of the base SVMs after each individual SVM of the ensemble is trained by the Artificial Fish Swarm Algorithm (AFSA). This helps generate a user-specific secure cryptographic key from the multimodal biometric template by a fusion process. The data security problem is averted, and an enhanced security architecture is proposed using an encryption and decryption system with double-key cryptography based on a Fuzzy Neural Network (FNN) for data storage and retrieval in cloud computing. The proposed scheme aims to protect records from hackers by preventing the ciphertext from being broken back into the original text. This improves authentication performance: the proposed double cryptographic key scheme provides better user authentication and better security, distinguishing between genuine and fake users. There are three important modules in this proposed work: 1) feature extraction, 2) multimodal biometric template generation, and 3) cryptographic key generation. The feature and texture properties are first extracted from the respective fingerprint and iris images. Finally, with the help of the fuzzy neural network and a symmetric cryptography algorithm, a double-key encryption technique is developed. As the proposed approach is based on neural networks, it has the advantage that even if the data are stolen, the hacker cannot decrypt them. The results prove that the authentication process is optimal and the stored information is secured.
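
As a deliberately simplified illustration of turning a fused multimodal template into a symmetric key (the full pipeline above uses SVM ensembles, AFSA, and an FNN, none of which is reproduced here), one can quantize the fused feature vector and hash it. Note that a practical system needs noise tolerance, e.g., fuzzy extractors, since repeated biometric captures never match exactly.

```python
import hashlib
import numpy as np

def biometric_key(finger_feats, iris_feats, bits_per_dim=4):
    """Toy fusion-to-key sketch: quantize the fused feature vector and
    hash it into a 256-bit symmetric key. Illustrative only: a real
    system must add error tolerance (fuzzy extractors), because two
    captures of the same finger/iris never produce identical features."""
    fused = np.concatenate([finger_feats, iris_feats])
    lo, hi = fused.min(), fused.max()
    # Coarse quantization absorbs a little capture noise.
    q = np.round((fused - lo) / (hi - lo) * (2 ** bits_per_dim - 1)).astype(np.uint8)
    return hashlib.sha256(q.tobytes()).digest()
```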

Keywords: artificial fish swarm algorithm (AFSA), biometric authentication, decryption, encryption, fingerprint, fusion, fuzzy neural network (FNN), iris, multi-modal, support vector machine classification

Procedia PDF Downloads 227
8788 Cloud & Natural Language Processing (NLP) to Solve the Problem of Service Continuity

Authors: Mohammed Tou, Adel Toumoh

Abstract:

The availability of IT services within organizations has become increasingly important; however, in an interconnected world favoring the distribution and offshoring of organizational information system components, availability depends directly on the constancy and uninterrupted flow of the Internet: Internet availability guarantees the continuity of IT services. In this communication, we introduce paradigms around the concept of service continuity, as well as the technical approaches and methodologies leading to its resolution. As the heart of the problem is the discontinuity of service, we first frame the notion of continuity in the context of services offered by the information system and identify the failures resulting from discontinuity; we then draw on related research to extract the tools and technological paradigms allowing the implementation of solutions that guarantee a minimum of service continuity. If the main element threatening continuity is the availability of the Internet, it is natural to look for an alternative path, namely the conventional PSTN telephone network. To complete the chain of solutions, we mainly use concepts such as voice and speech recognition, AI, NLP, and cloud computing. The research led us to introduce an important element between the user and the service: the request, represented by a voice message. A broker guarantees the delivery of the message to the right recipient service, as well as the response to the user. All of these elements are orchestrated by a pipeline that guarantees the integrity of the request and response. Speech recognition initiates the solution's process, NLP (with both its statistical and neural-network approaches) interprets the request, and cloud technology secures the solution in both directions. The targeted solution does not fully replace the default availability of the service; rather, our research aims for a minimum of continuity by preventing the organizational information system from going into total shutdown.

Keywords: cloud, PSTN, NLP, NLU, AI, MTTR, MTBF, RPO, RTO, SLA, SLO, LSR, SRS

Procedia PDF Downloads 0
8787 Domestic Led Lighting Designs Using Internet of Things

Authors: Gouresh Singhal, Rajib Kumar Panigrahi

Abstract:

In this paper, we examine historical and technological changes in the lighting industry. We propose a (proto) technical solution at the block diagram and circuit level. Untapped and upcoming technologies such as cloud computing and 6LoWPAN are further explored. The paper presents a robust, realistic hardware design. A mobile application provides the last-mile user interface. The paper highlights the current challenges to be faced and concludes with a pragmatic view of the lighting industry.

Keywords: 6LoWPAN, internet of things, mobile application, LED

Procedia PDF Downloads 549
8786 A Pervasive System Architecture for Smart Environments in Internet of Things Context

Authors: Patrick Santos, João Casal, João Santos Luis Varandas, Tiago Alves, Carlos Romeiro, Sérgio Lourenço

Abstract:

Nowadays, technology makes it possible, on the one hand, to communicate with various objects of daily life through the Internet and, on the other, to have these objects interact with each other through this channel. Simultaneously, with the rise of smartphones as the most ubiquitous technology in people's lives, new agents emerge for these devices: intelligent personal assistants. These agents have the goal of helping users manage and organize their information, as well as supporting them in their day-to-day tasks. Another emergent concept is cloud computing, which allows computation and storage to move off the user's device, bringing benefits in terms of performance, security, interoperability, and more. Connecting these three paradigms, in this work we propose an architecture for an intelligent system that provides an interface to assist the user in smart environments, informing, suggesting actions, and allowing the user to manage the objects of his/her daily life.

Keywords: internet of things, cloud, intelligent personal assistant, architecture

Procedia PDF Downloads 480
8785 BigCrypt: A Probable Approach of Big Data Encryption to Protect Personal and Business Privacy

Authors: Abdullah Al Mamun, Talal Alkharobi

Abstract:

As data sizes grow, people increasingly store large amounts of secret information in cloud storage, and companies routinely need to transfer massive business files from one end to another. Privacy is lost if files are transmitted as-is, and repeating this scenario without securing the communication mechanism through proper encryption compounds the risk. Although asymmetric-key encryption solves the key-distribution problem of symmetric-key encryption, it can only encrypt data of limited size, which makes it inapplicable to large data. In this paper we propose a Pretty Good Privacy-style approach for encrypting big data using both symmetric and asymmetric keys. Our goal is to encrypt huge collections of information and transmit them through a secure communication channel, preserving business and personal privacy. To justify our method, experimental datasets from three different platforms are provided. We show that our approach works efficiently and reliably for various data of massive size.
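
The symmetric/asymmetric combination described above is the classic hybrid (PGP-style) construction, sketched below with the Python cryptography library: the bulk data is encrypted under a fresh symmetric key, and only that small key is RSA-encrypted. This is a generic sketch of the construction, not the authors' BigCrypt implementation.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_big(data: bytes, receiver_public_key):
    """Hybrid encryption: fast symmetric cipher for the bulk data,
    slow asymmetric cipher only for the small symmetric key."""
    sym_key = Fernet.generate_key()
    ciphertext = Fernet(sym_key).encrypt(data)        # any size
    wrapped_key = receiver_public_key.encrypt(sym_key, OAEP)  # small input
    return wrapped_key, ciphertext

def decrypt_big(wrapped_key, ciphertext, receiver_private_key):
    sym_key = receiver_private_key.decrypt(wrapped_key, OAEP)
    return Fernet(sym_key).decrypt(ciphertext)

# Round-trip check with a freshly generated RSA key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
wrapped, ct = encrypt_big(b"massive business file ...", private_key.public_key())
assert decrypt_big(wrapped, ct, private_key) == b"massive business file ..."
```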

Keywords: big data, cloud computing, cryptography, hadoop, public key

Procedia PDF Downloads 297
8784 DAG Design and Tradeoff for Full Live Virtual Machine Migration over XIA Network

Authors: Dalu Zhang, Xiang Jin, Dejiang Zhou, Jianpeng Wang, Haiying Jiang

Abstract:

The traditional TCP/IP network is showing many shortcomings, and research on future networks is becoming a hotspot. FIA (Future Internet Architecture) and FIA-NP (Next Phase) are supported by the US NSF for future Internet design. Moreover, virtual machine migration is a significant technique in cloud computing. As a network application, it should also be supported in XIA (expressive Internet Architecture), which is part of both the FIA and FIA-NP projects. This paper is an experimental study aiming to verify the feasibility of VM migration over XIA. We present three ways to maintain VM connectivity and communication states, concerning DAG design and routing table modification. VM migration experiments are conducted intra-AD and inter-AD with KVM instances. The procedure is achieved by a migration control protocol suited to the characteristics of XIA. Evaluation results show that our solutions support full live VM migration over the XIA network, keeping services seamless.

Keywords: DAG, downtime, virtual machine migration, XIA

Procedia PDF Downloads 817
8783 A Conceptual Framework of Digital Twin for Homecare

Authors: Raja Omman Zafar, Yves Rybarczyk, Johan Borg

Abstract:

This article proposes a conceptual framework for the application of digital twin technology in home care. The main goal is to bridge the gap between advanced digital twin concepts and their practical implementation in home care. This study uses a literature review and thematic analysis approach to synthesize existing knowledge and proposes a structured framework suitable for homecare applications. The proposed framework integrates key components such as IoT sensors, data-driven models, cloud computing, and user interface design, highlighting the importance of personalized and predictive homecare solutions. This framework can significantly improve the efficiency, accuracy, and reliability of homecare services. It paves the way for the implementation of digital twins in home care, promoting real-time monitoring, early intervention, and better outcomes.

Keywords: digital twin, homecare, older adults, healthcare, IoT, artificial intelligence

Procedia PDF Downloads 22
8782 Cloud-Based Multiresolution Geodata Cube for Efficient Raster Data Visualization and Analysis

Authors: Lassi Lehto, Jaakko Kahkonen, Juha Oksanen, Tapani Sarjakoski

Abstract:

The use of raster-formatted data sets in geospatial analysis is increasing rapidly. At the same time, geographic data are being introduced into disciplines outside the traditional domain of geoinformatics, like climate change, intelligent transport, and immigration studies. These developments call for better methods to deliver raster geodata in an efficient and easy-to-use manner. Data cube technologies have traditionally been used in the geospatial domain for managing Earth Observation data sets that have strict requirements for effective handling of time series. The same approach and methodologies can also be applied in managing other types of geospatial data sets. A cloud service-based geodata cube, called GeoCubes Finland, has been developed to support online delivery and analysis of most important geospatial data sets with national coverage. The main target group of the service is the academic research institutes in the country. The most significant aspects of the GeoCubes data repository include the use of multiple resolution levels, cloud-optimized file structure, and a customized, flexible content access API. Input data sets are pre-processed while being ingested into the repository to bring them into a harmonized form in aspects like georeferencing, sampling resolutions, spatial subdivision, and value encoding. All the resolution levels are created using an appropriate generalization method, selected depending on the nature of the source data set. Multiple pre-processed resolutions enable new kinds of online analysis approaches to be introduced. Analysis processes based on interactive visual exploration can be effectively carried out, as the level of resolution most close to the visual scale can always be used. In the same way, statistical analysis can be carried out on resolution levels that best reflect the scale of the phenomenon being studied. Access times remain close to constant, independent of the scale applied in the application. The cloud service-based approach, applied in the GeoCubes Finland repository, enables analysis operations to be performed on the server platform, thus making high-performance computing facilities easily accessible. The developed GeoCubes API supports this kind of approach for online analysis. The use of cloud-optimized file structures in data storage enables the fast extraction of subareas. The access API allows for the use of vector-formatted administrative areas and user-defined polygons as definitions of subareas for data retrieval. Administrative areas of the country in four levels are available readily from the GeoCubes platform. In addition to direct delivery of raster data, the service also supports the so-called virtual file format, in which only a small text file is first downloaded. The text file contains links to the raster content on the service platform. The actual raster data is downloaded on demand, from the spatial area and resolution level required in each stage of the application. By the geodata cube approach, pre-harmonized geospatial data sets are made accessible to new categories of inexperienced users in an easy-to-use manner. At the same time, the multiresolution nature of the GeoCubes repository facilitates expert users to introduce new kinds of interactive online analysis operations.
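
A minimal sketch of the two ideas at the core of such a repository: multiple pre-generalized resolution levels and scale-based level selection. Here 2x2 block averaging stands in for the per-dataset generalization methods, and the function and parameter names are our own, not the GeoCubes API.

```python
import numpy as np

def build_pyramid(raster, levels=4):
    """Pre-compute coarser resolution levels by 2x2 block averaging
    at ingestion time, so later access never resamples on the fly."""
    pyramid = [raster]
    for _ in range(levels - 1):
        r = pyramid[-1]
        h, w = (r.shape[0] // 2) * 2, (r.shape[1] // 2) * 2
        coarse = r[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(coarse)
    return pyramid

def pick_level(pyramid, base_cell_size, target_cell_size):
    """Return the level whose cell size is closest to the display or
    analysis resolution, keeping access time near constant."""
    sizes = [base_cell_size * 2 ** i for i in range(len(pyramid))]
    best = min(range(len(sizes)), key=lambda i: abs(sizes[i] - target_cell_size))
    return pyramid[best]
```

Block averaging suits continuous rasters; a categorical data set (e.g., land cover) would instead use a mode or nearest-neighbour generalization, mirroring the per-dataset choice described above.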

Keywords: cloud service, geodata cube, multiresolution, raster geodata

Procedia PDF Downloads 104
8781 Energy Efficient Assessment of Energy Internet Based on Data-Driven Fuzzy Integrated Cloud Evaluation Algorithm

Authors: Chuanbo Xu, Xinying Li, Gejirifu De, Yunna Wu

Abstract:

Energy Internet (EI) is a new form that deeply integrates the Internet with the entire energy process, from production to consumption. The assessment of energy-efficiency performance is of vital importance for the long-term sustainable development of an EI project. Although the newly proposed fuzzy integrated cloud evaluation algorithm considers the randomness of uncertainty, it relies too much on the experience and knowledge of experts. Fortunately, the enrichment of EI data has enabled the utilization of data-driven methods. Therefore, the main purpose of this work is to assess the energy efficiency of a park-level EI by combining a data-driven method with the fuzzy integrated cloud evaluation algorithm. Firstly, the indicators for energy efficiency are identified through a literature review. Secondly, an artificial neural network (ANN)-based data-driven method is employed to cluster the values of the indicators. Thirdly, the energy efficiency of the EI project is calculated through the fuzzy integrated cloud evaluation algorithm. Finally, the applicability of the proposed method is demonstrated by a case study.

Keywords: energy efficiency, energy internet, data-driven, fuzzy integrated evaluation, cloud model

Procedia PDF Downloads 172
8780 A Qualitative Study of Children's Growth in Creative Dance: An Example of Cloud Gate Dance School in Taiwan

Authors: Chingwen Yeh, Yu Ru Chen

Abstract:

This paper explores the growth and development of children in the creative dance class of the Cloud Gate Dance School in Taichung, Taiwan. Professor Chingwen Yeh's qualitative research method was applied in this study. First, the application of Dalcroze eurhythmics teaching materials, such as music, teaching aids, and spoken language in the classroom situation, was collected and examined. Second, in-class observations of the young children's participation and learning were recorded both in writing and on video as research data. Finally, the data analysis was categorized into the following aspects: children's body movement coordination, children's concentration and imagination, and children's verbal expression. In-depth interviews with the in-class teachers, parents of participating children, and other in-class observers were conducted from time to time. This research found that the children's body rhythm, language skills, and social learning improved to a certain degree through the creative dance training. The authors hope the study can serve as a reference for further research on related topics.

Keywords: Cloud Gate Dance School, creative dance, Dalcroze, Eurhythmic

Procedia PDF Downloads 270
8779 Application of Semantic Technologies in Rapid Reconfiguration of Factory Systems

Authors: J. Zhang, K. Agyapong-Kodua

Abstract:

The digital factory, based on visual design and simulation, has emerged as a mainstream approach to reducing the digital development life cycle. Some basic industrial systems are being integrated via semantic modelling, and products (P) matching process (P) and resource (R) requirements are designed to fulfill current customer demands. Nevertheless, product design is still limited to fixed product models and the known knowledge of product engineers. Therefore, this paper presents a rapid reconfiguration method based on semantic technologies with PPR ontologies to reuse both known and unknown knowledge. To cope with the influence of big data, our system uses a cloud manufactory and a distributed database to improve the efficiency of querying for matching PPR requirements.
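
A small sketch of how PPR knowledge can be queried with semantic technologies, here using rdflib and SPARQL over an illustrative ppr# namespace with toy triples; the ontology terms are assumptions for demonstration, not the paper's schema.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF

PPR = Namespace("http://example.org/ppr#")   # illustrative namespace

g = Graph()
# Minimal PPR facts: a product requires a process; a process needs a resource.
g.add((PPR.gearbox, RDF.type, PPR.Product))
g.add((PPR.gearbox, PPR.requiresProcess, PPR.milling))
g.add((PPR.milling, PPR.needsResource, PPR.cnc_mill_01))
g.add((PPR.cnc_mill_01, PPR.capacity, Literal(5)))

# Find every resource (with capacity) needed to manufacture the product.
q = """
PREFIX ppr: <http://example.org/ppr#>
SELECT ?process ?resource ?cap WHERE {
  ppr:gearbox ppr:requiresProcess ?process .
  ?process ppr:needsResource ?resource .
  ?resource ppr:capacity ?cap .
}
"""
for process, resource, cap in g.query(q):
    print(process, resource, cap)
```

In a reconfiguration scenario, the same query run against an updated ontology immediately reflects new processes or resources, without changing application code.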

Keywords: semantic technologies, factory system, digital factory, cloud manufactory

Procedia PDF Downloads 459
8778 Grid Computing for Multi-Objective Optimization Problems

Authors: Aouaouche Elmaouhab, Hassina Beggar

Abstract:

Solving multi-objective discrete optimization applications has always been limited by the resources of one machine: by computing power or by memory, most often both. To speed up the calculations, grid computing represents a primary solution for the treatment of these applications through the parallelization of the resolution methods. In this work, we study methods for solving the multi-objective integer linear programming problem based on Branch-and-Bound, together with grid computing technology. This study allowed us to propose an implementation of the method of Abbas et al. on the grid, reducing the execution time. To support our contribution, the main results are presented.
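
As a minimal illustration of the parallelization pattern (not the Branch-and-Bound method of Abbas et al. itself), the sketch below splits a toy bi-objective integer search space across worker processes, computes local Pareto fronts, and merges them; in a grid setting, each chunk would instead be a subproblem dispatched to a grid node.

```python
from multiprocessing import Pool
from itertools import product

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

def evaluate_chunk(chunk):
    """Worker task: evaluate both objectives on a chunk of the integer
    search space and keep only its local Pareto-optimal points."""
    pts = [(3 * x - 2 * y, -x + 4 * y) for x, y in chunk]  # two toy objectives
    return pareto_front(pts)

if __name__ == "__main__":
    grid = list(product(range(50), repeat=2))   # toy integer feasible set
    chunks = [grid[i::4] for i in range(4)]     # one chunk per worker/node
    with Pool(4) as pool:
        local_fronts = pool.map(evaluate_chunk, chunks)
    # Merge: the global front is the front of the union of local fronts.
    global_front = pareto_front([p for f in local_fronts for p in f])
```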

Keywords: multi-objective optimization, integer linear programming, grid computing, parallel computing

Procedia PDF Downloads 455