Search results for: approximate computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1331

1151 The Improved Laplace Homotopy Perturbation Method for Solving Non-integrable PDEs

Authors: Noufe H. Aljahdaly

Abstract:

The Laplace homotopy perturbation method (LHPM) is an approximate method for computing approximate solutions of partial differential equations. The method has been used to solve several problems in science. It requires an initial condition, so it solves initial value problems. In physics, when some important terms are taken into account, we may obtain non-integrable partial differential equations that do not have analytical integrals. This type of PDE has no exact solution; therefore, we need to compute the solution without an initial condition. In this work, we improve the LHPM so that it can solve non-integrable problems, especially damped PDEs, i.e., PDEs that include a damping term that makes them non-integrable. We improve the LHPM by setting the perturbation parameter and the embedding parameter equal to the damping parameter and by using the initial condition of the damped PDE as the initial condition of the non-damped PDE.
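
For readers unfamiliar with the construction, the generic homotopy used in HPM/LHPM (not the paper's specific damped-PDE modification) can be sketched as follows, where L is the linear part, N the nonlinear part, u_0 an initial guess, and p the embedding parameter:

```latex
H(v,p) = (1-p)\,\big[L(v) - L(u_0)\big] + p\,\big[L(v) + N(v) - f\big] = 0,
\qquad v = \sum_{k \ge 0} p^{k} v_k .
```

Collecting powers of p yields a sequence of linear subproblems; in the LHPM each one is solved by applying the Laplace transform and its inverse, and the approximate solution is recovered as u = lim_{p→1} v = Σ_k v_k.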

Keywords: non-integrable PDEs, modified Kawahara equation, Laplace homotopy perturbation method, damping term

Procedia PDF Downloads 100
1150 Classification of Attacks Over Cloud Environment

Authors: Karim Abouelmehdi, Loubna Dali, Elmoutaoukkil Abdelmajid, Hoda Elsayed, Eladnani Fatiha, Benihssane Abderahim

Abstract:

The security of cloud services is a major concern of cloud service providers. In this paper, we review different classifications of cloud attacks proposed by specialized organizations. Each agency has its own classification with well-defined properties. The purpose is to present a high-level classification of current research in cloud computing security, organized around attack strategies and the corresponding defenses.

Keywords: cloud computing, classification, risk, security

Procedia PDF Downloads 548
1149 On the Factors Affecting Computing Students’ Awareness of the Latest ICTs

Authors: O. D. Adegbehingbe, S. D. Eyono Obono

Abstract:

The education sector is constantly faced with rapid changes in technology, both in terms of ensuring that the curriculum is up to date and in terms of making sure that students are aware of these technological changes. This challenge is the motivation for this study, which examines the factors affecting computing students’ awareness of the latest Information and Communication Technologies (ICTs). The aim of this study is divided into two sub-objectives: the selection of relevant theories and the design of a conceptual model to support them, and the empirical testing of the designed model. The first objective is achieved by a review of existing literature on technology adoption theories and models. The second objective is achieved through a survey of computing students in the four universities of the KwaZulu-Natal province of South Africa. Data collected from this survey are analyzed with the Statistical Package for the Social Sciences (SPSS) using descriptive statistics, ANOVA and Pearson correlations. The main hypothesis of this study is that there is a relationship between the demographics and prior conditions of computing students and their awareness of general ICT trends and of Digital Switch Over (DSO), a new technology that involves the change from analog to digital television broadcasting in order to achieve improved spectrum efficiency. The prior conditions of the computing students considered in this study are students’ perceived exposure to career guidance and students’ perceived curriculum currency. The results of this study confirm that gender, ethnicity, and high school computing course affect students’ perceived curriculum currency, while high school location affects students’ awareness of DSO. The results also confirm that there is a relationship between students’ prior conditions and their awareness of general ICT trends and of DSO in particular.
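
As a rough illustration of the analysis pipeline described above, the following Python sketch shows how ANOVA and Pearson correlations of this kind can be computed with SciPy; the synthetic data frame and its column names are hypothetical placeholders, not the study's actual survey data.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Placeholder data standing in for the survey responses (hypothetical column names).
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "ethnicity": rng.choice(["A", "B", "C"], size=300),
    "curriculum_currency": rng.normal(3.5, 0.8, size=300),
    "career_guidance": rng.normal(3.0, 1.0, size=300),
    "ict_awareness": rng.normal(3.2, 0.9, size=300),
})

# One-way ANOVA: does perceived curriculum currency differ across ethnic groups?
groups = [g["curriculum_currency"].to_numpy() for _, g in df.groupby("ethnicity")]
f_stat, p_anova = stats.f_oneway(*groups)

# Pearson correlation: career-guidance exposure vs. awareness of general ICT trends.
r, p_corr = stats.pearsonr(df["career_guidance"], df["ict_awareness"])

print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.3f};  Pearson: r={r:.2f}, p={p_corr:.3f}")
```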

Keywords: education, information technologies, IDT, awareness

Procedia PDF Downloads 357
1148 Efficient Semi-Systolic Finite Field Multiplier Using Redundant Basis

Authors: Hyun-Ho Lee, Kee-Won Kim

Abstract:

Arithmetic operations over GF(2^m) are extensively used in error-correcting codes and public-key cryptography schemes. Finite field arithmetic includes addition, multiplication, division and inversion. Addition is very simple and can be implemented with an extremely simple circuit; the other operations are much more complex. Multiplication is the most important operation for cryptosystems such as the elliptic curve cryptosystem, since exponentiation, division, and multiplicative inversion can be performed by iterated multiplication. In this paper, we present a parallel computation algorithm that performs Montgomery multiplication over a finite field using a redundant basis. Based on this multiplication algorithm, we also present an efficient semi-systolic multiplier over the finite field. The multiplier has lower space and time complexity than related multipliers. Compared to the corresponding existing structures, it saves at least 5% area, 50% time, and 53% area-time (AT) complexity. Accordingly, it is well suited for VLSI implementation and can easily be applied as a basic component for computing complex operations over the finite field, such as inversion and division.
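
To make the underlying operation concrete, here is a minimal bit-serial sketch of Montgomery multiplication in GF(2^m) in a polynomial basis (the paper's redundant-basis, semi-systolic formulation is not reproduced); the field polynomial and operands below are only examples.

```python
def mont_mul_gf2m(a, b, f, m):
    """Bit-serial Montgomery multiplication in GF(2^m): returns a*b*x^(-m) mod f.
    a, b: field elements as integers (bit i = coefficient of x^i).
    f: irreducible polynomial of degree m (bit m and bit 0 set)."""
    c = 0
    for i in range(m):
        if (a >> i) & 1:
            c ^= b        # C <- C + a_i * B
        if c & 1:
            c ^= f        # clear the constant term (f(0) = 1) before dividing by x
        c >>= 1           # C <- C / x
    return c

# Example in GF(2^4) with f(x) = x^4 + x + 1 (0b10011); values are illustrative.
m, f = 4, 0b10011
a, b = 0b1011, 0b0110                    # x^3 + x + 1  and  x^2 + x
print(bin(mont_mul_gf2m(a, b, f, m)))    # a*b*x^(-4) mod f  ->  0b101
```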

Keywords: finite field, Montgomery multiplication, systolic array, cryptography

Procedia PDF Downloads 294
1147 Proposed Solutions Based on Affective Computing

Authors: Diego Adrian Cardenas Jorge, Gerardo Mirando Guisado, Alfredo Barrientos Padilla

Abstract:

A system based on Affective Computing can interpret human signals such as voice, facial expressions and body movement to detect emotions and execute a corresponding response. This data is important because a person can communicate more effectively with emotions than with words alone. The information can be processed through technological components such as facial recognition, gait recognition or gesture recognition. So far, solutions based on this technology consider only one component at a time. This research proposes two solutions based on Affective Computing that take more than one component into account for emotion detection. The proposals reflect the levels of dependency between hardware devices and software, as well as the interaction process between the system and the user, which implies the development of scenarios in which both proposals will be tested in a live environment. Both solutions are to be implemented by software engineers to prove their feasibility. To validate the impact on society and the business interest, interviews with stakeholders are conducted with an investment mindset, where each solution is rated on a scale of 1 to 5, 1 being the minimum possible investment and 5 the maximum.

Keywords: affective computing, emotions, emotion detection, face recognition, gait recognition

Procedia PDF Downloads 369
1146 Arithmetic Operations in Deterministic P Systems Based on the Weak Rule Priority

Authors: Chinedu Peter, Dashrath Singh

Abstract:

Membrane computing is a computability model that abstracts its structures and functions from the biological cell. Its main ingredient is the notion of a membrane structure, which consists of several cell-like membranes recursively placed inside a unique skin membrane. The emergence of several variants of membrane computing gives rise to the notion of a P system. This paper presents a variant of P systems for arithmetic operations on non-negative integers based on weak priorities for rule application. Consequently, we obtain deterministic P systems. Two membranes suffice, and there are at most four objects for multiplication and five objects for division throughout the computation. The model is simple and has the potential to be extended to integers and real numbers in general.
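
As a rough illustration only (not the paper's actual membrane structure or rule set), the following Python sketch applies multiset rewriting rules in a maximally parallel step under one common reading of weak priority, here computing truncated subtraction in a single step:

```python
from collections import Counter

def step(multiset, rules):
    """One maximally parallel step; rules are listed from highest to lowest priority,
    and a lower-priority rule only consumes objects left over by higher-priority ones."""
    ms = Counter(multiset)
    produced = Counter()
    for lhs, rhs in rules:
        times = min(ms[obj] // n for obj, n in lhs.items())  # max applications of this rule
        for obj, n in lhs.items():
            ms[obj] -= n * times
        for obj, n in rhs.items():
            produced[obj] += n * times
    return ms + produced

# Truncated subtraction m - n in one step:  r1: a b -> (nothing)  has priority over  r2: a -> d
rules = [(Counter({"a": 1, "b": 1}), Counter()),      # r1: annihilate an a-b pair
         (Counter({"a": 1}), Counter({"d": 1}))]      # r2: surviving a's become d's
m, n = 7, 3
print(step(Counter({"a": m, "b": n}), rules)["d"])    # -> 4
```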

Keywords: P system, binary operation, determinism, weak rule priority

Procedia PDF Downloads 445
1145 An Effective Route to Control of the Safety of Accessing and Storing Data in the Cloud-Based Data Base

Authors: Omid Khodabakhshi, Amir Rozdel

Abstract:

Cloud computing security research raises a number of challenges because data centers hold complex private information and constantly face the risk of information disclosure through hacker attacks or malicious insiders. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. Many software solutions exist for improving the security of virtual machines, but software alone is not enough to solve the security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, this article proposes a structure for running highly isolated security-sensitive code using secure computing hardware in virtual environments. It also allows remote validation of code together with its inputs and outputs. We provide these security features even in situations where the BIOS, the operating system, and even the hypervisor are compromised. To achieve these goals, we use the hardware support provided by recent Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the trusted computing base (TCB) to the security-sensitive code.

Keywords: code, cloud computing, security, virtual machines

Procedia PDF Downloads 191
1144 Design and Implementation of Remote Application Virtualization in Cloud Environments

Authors: Shuen-Tai Wang, Ying-Chuan Chen, Hsi-Ya Chang

Abstract:

Cloud computing is a paradigm that shifts the way computing has been done in the past. Users can use cloud resources such as application software or storage space from the cloud without needing to own them. This paper focuses on solutions that apply the IaaS idea to build cloud-based services and enable individual remote users' applications in cloud environments, so that they appear as if they were running on the end user's local computer. The features of the application delivery solution have been developed based on our previous research on virtualization technology to offer applications independent of location, so that users can work online, offline, anywhere, with an appropriate device and at any time. This proposed effort has the potential to provide an efficient, resilient and elastic environment for cloud services. Users no longer need to burden system managers, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible remote application virtualization service represents the next significant step toward the mobile workplace, letting users access their applications remotely through cloud services from anywhere. This is also made possible by low administrative costs as well as relatively inexpensive end-user terminals and reduced energy expenses.

Keywords: cloud computing, IaaS, virtualization, application delivery

Procedia PDF Downloads 281
1143 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics

Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood

Abstract:

We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2) to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.
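
DFORC2 itself is not reproduced here, but as a rough illustration of how evidence-file processing can be parallelized on a Spark cluster of the kind described, consider the following sketch; the evidence mount point and partition count are placeholders.

```python
import hashlib
import os
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("forensic-hashing-sketch").getOrCreate()
sc = spark.sparkContext

def sha256_of(path):
    """Hash one evidence file in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return path, h.hexdigest()

evidence_dir = "/mnt/evidence"   # placeholder: mount point of the imaged drive, visible to all workers
paths = [os.path.join(root, name)
         for root, _, names in os.walk(evidence_dir) for name in names]

# Distribute the file list across the cluster and hash the files in parallel.
digests = sc.parallelize(paths, numSlices=64).map(sha256_of).collect()
spark.stop()
```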

Keywords: digital forensics, cloud computing, cyber security, spark, Kubernetes, Kafka

Procedia PDF Downloads 393
1142 A Performance Analysis Study for Cloud Based ERP Systems

Authors: Burak Erkayman

Abstract:

Manufacturing and service organizations need ERP systems to integrate many functions, from purchasing to storage and from production planning to cost calculation. By integrating information, ERP systems provide companies with remarkable advantages in terms of profitability, productivity and process efficiency. Cloud computing is one of the most significant changes in information and communication technology, and its developments attract the business world to take advantage of this field. Cloud computing means much more storage space, greater cost savings and faster data transfer rates. In addition, it presents new business models, new fields of study and practicable solutions for anyone's use. These developments make the migration of ERP systems to cloud environments inevitable. In this study, the performance of ERP systems in cloud environments is analyzed through various performance criteria, and a comparison between traditional and cloud ERP systems is presented. At the end of the study, the transformation and the future of ERP systems are discussed.

Keywords: cloud-ERP, ERP system performance, information system transformation

Procedia PDF Downloads 529
1141 Optimizing Parallel Computing Systems: A Java-Based Approach to Modeling and Performance Analysis

Authors: Maher Ali Rusho, Sudipta Halder

Abstract:

The purpose of the study is to develop optimal solutions for models of parallel computing systems using the Java language. During the study, programmes were written for the examined models of parallel computing systems. The result of the parallel sorting code is the output of a sorted array of random numbers. When processing data in parallel, the time spent on processing and the first elements of the list of squared numbers are displayed. When processing requests asynchronously, processing completion messages are displayed for each task with a slight delay. The main results include the development of optimisation methods for algorithms and processes, such as the division of tasks into subtasks, the use of non-blocking algorithms, effective memory management, and load balancing, as well as the construction of diagrams and comparison of these methods by characteristics, including descriptions, implementation examples, and advantages. In addition, various specialised libraries were analysed to improve the performance and scalability of the models. The results of the work performed showed a substantial improvement in response time, bandwidth, and resource efficiency in parallel computing systems. Scalability and load analysis assessments were conducted, demonstrating how the system responds to an increase in data volume or the number of threads. Profiling tools were used to analyse performance in detail and identify bottlenecks in models, which improved the architecture and implementation of parallel computing systems. The obtained results emphasise the importance of choosing the right methods and tools for optimising parallel computing systems, which can substantially improve their performance and efficiency.
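
The study's own code is written in Java and is not reproduced here; as a loose, language-neutral illustration of the three workloads described (parallel sorting, parallel data processing, and asynchronous request handling), the following Python sketch uses a process pool and asyncio.

```python
import asyncio
import random
from concurrent.futures import ProcessPoolExecutor

def square_chunk(chunk):
    """CPU-bound stage: square a chunk of numbers."""
    return [x * x for x in chunk]

async def handle_request(task_id):
    """Asynchronous request: completes with a slight, random delay."""
    await asyncio.sleep(random.uniform(0.01, 0.1))
    return f"task {task_id} done"

def main():
    data = [random.randint(0, 10_000) for _ in range(100_000)]
    chunks = [data[i::4] for i in range(4)]

    # Parallel sorting and parallel data processing across worker processes
    # (the final merge is done with sorted() for brevity).
    with ProcessPoolExecutor(max_workers=4) as pool:
        sorted_all = sorted(x for chunk in pool.map(sorted, chunks) for x in chunk)
        squares = [x for chunk in pool.map(square_chunk, chunks) for x in chunk]
    print(sorted_all[:5], squares[:5])

    # Asynchronous processing of independent requests.
    async def run_requests():
        return await asyncio.gather(*(handle_request(i) for i in range(5)))
    print(asyncio.run(run_requests()))

if __name__ == "__main__":
    main()
```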

Keywords: algorithm optimisation, memory management, load balancing, performance profiling, asynchronous programming

Procedia PDF Downloads 12
1140 Reconfigurable Ubiquitous Computing Infrastructure for Load Balancing

Authors: Khaled Sellami, Lynda Sellami, Pierre F. Tiako

Abstract:

Ubiquitous computing helps make data and services available to users anytime and anywhere, which makes the cooperation of devices a crucial need. In turn, such cooperation can overload the devices and/or the network, resulting in malfunction and suspension of its activities. Our goal in this paper is to propose an approach to device reconfiguration that helps reduce energy consumption in ubiquitous environments. The idea is that when high energy consumption is detected, we change the distribution of components across the devices in order to reduce and/or balance the energy consumption. We also investigate the possibility of detecting high energy consumption of the devices or network based on device capabilities. As a result, our idea realizes a reconfiguration of devices aimed at reducing energy consumption and/or balancing load in ubiquitous environments.

Keywords: ubiquitous computing, load balancing, device energy consumption, reconfiguration

Procedia PDF Downloads 275
1139 Cloud Computing: Deciding Whether It Is Easier or Harder to Defend Against Cyber Attacks

Authors: Emhemed Shaklawoon, Ibrahim Althomali

Abstract:

We propose to identify the different defense mechanisms that were used before the introduction of the cloud and to compare whether, and to what degree, their protection mechanisms are still valuable. Note that in order to defend against a vulnerability, we must know how that vulnerability is abused in an attack. Only then will we be able to recognize whether it is easier or harder to defend against cyber attacks.

Keywords: cloud computing, privacy, cyber attacks, defend the cloud

Procedia PDF Downloads 422
1138 Accelerating the Uptake of Smart City Applications through Cloud Computing

Authors: Panagiotis Tsarchopoulos, Nicos Komninos, Christina Kakderi

Abstract:

Smart cities are high on the political agenda around the globe. However, planning smart cities and deploying applications that deal with the complex problems of the urban environment is a very challenging task that is difficult for cities to undertake on their own. We argue that the uptake of smart city strategies is facilitated, first, through the development of smart city application repositories allowing the re-use of already developed and tested software, and, second, through cloud computing, which frees city authorities from resource constraints, technical or financial, and has a greater effect at the city level. The combination of these two solutions allows city governments and municipalities to select and deploy a large number of applications dedicated to different city functions, which collectively could create a multiplier effect with a greater impact on the urban environment.

Keywords: smart cities, applications, cloud computing, migration to the cloud, application repositories

Procedia PDF Downloads 356
1137 Quantum Computing with Qudits on a Graph

Authors: Aleksey Fedorov

Abstract:

Building a scalable platform for quantum computing remains one of the most challenging tasks in quantum science and technology. However, the implementation of the most important quantum operations with qubits (quantum analogues of classical bits), such as the multiqubit Toffoli gate, requires either a polynomial number of operations or a linear number of operations with the use of ancilla qubits. Therefore, reducing the number of operations while preserving scalability is a crucial goal in quantum information processing. One of the most elegant ideas in this direction is to use qudits (multilevel systems) instead of qubits and to rely on the additional levels of the qudits instead of ancillas. Although some of the results obtained so far demonstrate a reduction in the number of operations, they suffer from high complexity and/or a lack of scalability. We show a strong reduction in the number of operations for the realization of the Toffoli gate by using qudits in a scalable multi-qudit processor. This is done on the basis of a general relation, which we derive, between the dimensionality of the qudits and the topology of their connections.
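
As background for the ancilla-free idea, the following numpy sketch checks the well-known construction in which a Toffoli gate on two control qubits and a target is built from three two-body gates by temporarily using a third level of one control; this is only a textbook illustration, not the paper's general qudit-graph scheme.

```python
import numpy as np
from functools import reduce

def kron(*ops):
    return reduce(np.kron, ops)

I3, I2 = np.eye(3), np.eye(2)
X = np.array([[0, 1], [1, 0]])                     # qubit flip
S12 = np.array([[1, 0, 0], [0, 0, 1], [0, 1, 0]])  # swap levels |1> and |2> of a qutrit
P1 = np.diag([0., 1., 0.])                         # projector on |1> of control 1
P2 = np.diag([0., 0., 1.])                         # projector on |2> of control 2

# Gate A: if control 1 is in |1>, swap levels 1 and 2 of control 2.
A = kron(I3 - P1, I3, I2) + kron(P1, S12, I2)
# Gate B: if control 2 is in |2>, flip the target qubit.
B = kron(I3, I3 - P2, I2) + kron(I3, P2, X)

U = A @ B @ A    # applied to a state: A, then B, then A

def idx(c1, c2, t):          # basis-state index in the 3 x 3 x 2 space
    return (c1 * 3 + c2) * 2 + t

ok = all(
    np.isclose(U[idx(c1, c2, t ^ (c1 & c2)), idx(c1, c2, t)], 1.0)
    for c1 in (0, 1) for c2 in (0, 1) for t in (0, 1)
)
print("three two-qudit gates reproduce the Toffoli on the qubit subspace:", ok)
```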

Keywords: quantum computing, qudits, Toffoli gates, gate decomposition

Procedia PDF Downloads 146
1136 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea

Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das

Abstract:

This paper presents wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and their statistical nature has been compared with that of electrocardiograms of normal persons. Four heart diseases have been considered: myocardial ischemia (MI), congestive heart failure (CHF), arrhythmia and sleep apnea. The statistical nature of the electrocardiograms is characterized in each case by the kurtosis values of two types of wavelet coefficients: approximate and detail. Nine wavelet decomposition levels have been considered in each case, and the kurtosis of both the approximate and detail coefficients has been computed from decomposition level one to level nine. Based on significant differences, a few decomposition levels have been chosen and then used for classification.
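
A minimal sketch of how such features could be computed with PyWavelets and SciPy is shown below; the wavelet family and the synthetic signal are placeholders, since the paper does not fix them here.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

def wavelet_kurtosis_features(signal, wavelet="db4", max_level=9):
    """Kurtosis of the approximate and detail coefficients at levels 1..max_level."""
    features = {}
    for level in range(1, max_level + 1):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        approx, detail = coeffs[0], coeffs[1]     # cA_level and cD_level
        features[level] = (kurtosis(approx), kurtosis(detail))
    return features

# Placeholder ECG-like signal (a real study would load recorded electrocardiograms).
t = np.linspace(0, 10, 4096)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)

for level, (k_a, k_d) in wavelet_kurtosis_features(ecg).items():
    print(f"level {level}: kurtosis(approx)={k_a:.2f}, kurtosis(detail)={k_d:.2f}")
```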

Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea

Procedia PDF Downloads 134
1135 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it has been postulated that more than one billion bases will be produced per year in 2020. The growth rate of the produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, for example storing the data, searching for information, and finding hidden information. An analysis platform for genomics big data is therefore required. The recent development of cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and forms a core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in the implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We discuss our algorithm and its feasibility.
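
The paper's hybrid algorithm is not reproduced here, but as a rough, single-machine illustration of the MapReduce-plus-fuzzy-logic idea, the following sketch counts k-mers with a map/shuffle/reduce pattern and then assigns each k-mer a fuzzy abundance membership; the k-mer length, the reads and the membership breakpoints are arbitrary.

```python
from collections import defaultdict

def mapper(read, k=4):
    """Map: emit (k-mer, 1) pairs from one sequencing read."""
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def reducer(key, values):
    """Reduce: sum the counts for one k-mer."""
    return key, sum(values)

def fuzzy_abundance(count, low=2, high=6):
    """Linear fuzzy membership in the 'abundant' set: 0 below low, 1 above high."""
    if count <= low:
        return 0.0
    if count >= high:
        return 1.0
    return (count - low) / (high - low)

reads = ["ACGTACGTGG", "ACGTTTACGT", "GGGGACGTAC"]   # placeholder reads

# Map phase (each read could be processed in parallel on a cluster).
mapped = [pair for read in reads for pair in mapper(read)]

# Shuffle phase: group emitted values by key.
groups = defaultdict(list)
for kmer, one in mapped:
    groups[kmer].append(one)

# Reduce phase, followed by the fuzzy labelling.
for kmer, values in sorted(groups.items()):
    kmer, count = reducer(kmer, values)
    print(kmer, count, f"abundant-membership={fuzzy_abundance(count):.2f}")
```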

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 299
1134 Data Security and Privacy Challenges in Cloud Computing

Authors: Amir Rashid

Abstract:

Cloud computing frameworks empower organizations to cut expenses by outsourcing computation resources on demand. At present, customers of cloud service providers have no means of verifying the privacy and ownership of their information and data. To address this issue, we propose a Trusted Cloud Computing Platform (TCCP). TCCP empowers Infrastructure as a Service (IaaS) providers, for example Amazon EC2, to offer a closed-box execution environment that guarantees confidential execution of guest virtual machines. It also permits clients to attest the IaaS provider and determine whether the service is secure before they launch their virtual machines. This paper proposes a TCCP for guaranteeing the privacy and integrity of computed data that are outsourced to IaaS providers. The TCCP provides the abstraction of a closed-box execution environment for a client's VM, guaranteeing that no privileged administrator of the cloud provider can inspect or tamper with its data. Furthermore, before launching the VM, the TCCP permits a client to reliably and remotely verify that the provider backend is running a trusted TCCP. This capability extends the attestation to the whole service and hence permits a client to verify that its data are handled in a secure mode.

Keywords: cloud security, IaaS, cloud data privacy and integrity, hybrid cloud

Procedia PDF Downloads 299
1133 A Cloud Computing System Using Virtual Hyperbolic Coordinates for Services Distribution

Authors: Telesphore Tiendrebeogo, Oumarou Sié

Abstract:

Cloud computing technologies have attracted considerable interest in recent years and have become important for many existing database applications. Cloud computing provides a new mode of using and offering IT resources in general; such resources can be used on demand by anybody who has access to the internet. In particular, the Cloud platform provides an easy-to-use interface between providers and users, allowing providers to develop and deliver software and databases to users across locations. Currently, many Cloud platform providers support large-scale database services. However, most of them only support simple keyword-based queries and cannot answer complex queries efficiently due to the lack of efficient multi-attribute indexing techniques. Existing Cloud platform providers therefore seek to improve the performance of indexing techniques for complex queries. In this paper, we define a new cloud computing architecture based on a Distributed Hash Table (DHT) and design a prototype system. We then implement and evaluate our cloud computing indexing structure, which is based on a hyperbolic tree using virtual coordinates taken in the hyperbolic plane. Our experimental results, compared with other cloud systems, show that our solution ensures consistency and scalability for the Cloud platform.
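
To make the notion of virtual hyperbolic coordinates concrete, here is a small sketch (assuming a Poincaré-disk model, which the abstract does not specify) of the hyperbolic distance and of greedy forwarding toward the node closest to a key's coordinates; the node coordinates and neighbour table are made up.

```python
import math

def hyperbolic_distance(u, v):
    """Distance between two points of the Poincare disk (|u|, |v| < 1)."""
    du2 = (u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2
    nu, nv = u[0] ** 2 + u[1] ** 2, v[0] ** 2 + v[1] ** 2
    return math.acosh(1.0 + 2.0 * du2 / ((1.0 - nu) * (1.0 - nv)))

def greedy_next_hop(current, neighbours, key_coord, coords):
    """Forward to the neighbour whose virtual coordinates are closest to the key."""
    best = min(neighbours[current], key=lambda n: hyperbolic_distance(coords[n], key_coord))
    if hyperbolic_distance(coords[best], key_coord) < hyperbolic_distance(coords[current], key_coord):
        return best
    return None   # local minimum: the current node is responsible for the key

# Made-up overlay: four nodes with virtual coordinates and a small neighbour table.
coords = {"n0": (0.0, 0.0), "n1": (0.5, 0.0), "n2": (0.0, 0.5), "n3": (0.6, 0.6)}
neighbours = {"n0": ["n1", "n2"], "n1": ["n0", "n3"], "n2": ["n0", "n3"], "n3": ["n1", "n2"]}

hop, key = "n0", (0.55, 0.55)     # route a lookup for a key hashed to these coordinates
while hop is not None:
    print("visiting", hop)
    hop = greedy_next_hop(hop, neighbours, key, coords)
```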

Keywords: virtual coordinates, cloud, hyperbolic plane, storage, scalability, consistency

Procedia PDF Downloads 425
1132 Memristor-A Promising Candidate for Neural Circuits in Neuromorphic Computing Systems

Authors: Juhi Faridi, Mohd. Ajmal Kafeel

Abstract:

The advancements in the field of Artificial Intelligence (AI) and technology have led to the evolution of an intelligent era. Neural networks, which have computational power and learning ability similar to the brain, are one of the key AI technologies. A neuromorphic computing system (NCS) consists of synaptic devices, neuronal circuits, and a neuromorphic architecture. Memristors are a promising candidate for neuromorphic computing systems, but the conductance behavior of synaptic and neuronal memristors needs to be studied thoroughly in order to connect it to the underlying neuroscience and computer science. Furthermore, more simulation work is needed to make use of existing device properties and to provide guidance for the development of future devices with different performance requirements. This work aims to provide insight into building neuronal circuits using memristors in order to achieve a memristor-based NCS. We shed light on the research conducted on memristors for building analog and digital circuits, in order to motivate research on NCS built from memristor-based neural circuits for advanced AI applications. This review is a step in that direction: we describe the key findings about memristors and the analog and digital circuits implemented with them over the years, which can be further utilized in implementing the neuronal circuits of an NCS. The work aims to help electronic circuit designers understand how memristor research has progressed and how these findings can be used in implementing the neuronal circuits needed for recent progress in NCS.
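
As a starting point for the kind of device-level simulation called for above, here is a minimal numpy sketch of the classic linear ion-drift memristor model (a textbook model, not a result of this paper); all parameter values are illustrative.

```python
import numpy as np

# Linear ion-drift model: M(x) = R_on*x + R_off*(1 - x), dx/dt = k*i(t),
# with the internal state x = w/D clipped to [0, 1] (no window function).
R_ON, R_OFF = 100.0, 16e3          # ohms
D, MU_V = 10e-9, 1e-14             # device thickness (m), ion mobility (m^2 V^-1 s^-1)
K = MU_V * R_ON / D**2

dt = 1e-4
t = np.arange(0.0, 2.0, dt)
v = 0.5 * np.sin(2 * np.pi * 2.0 * t)    # sinusoidal drive voltage

x = 0.1                                  # initial state
current = np.empty_like(t)
for n, vn in enumerate(v):
    m = R_ON * x + R_OFF * (1.0 - x)     # state-dependent memristance
    current[n] = vn / m
    x = min(1.0, max(0.0, x + K * current[n] * dt))   # forward-Euler state update

# The i-v trajectory depends on the voltage history (and is pinched at the origin),
# which is the synaptic-weight-like behaviour exploited in neuromorphic circuits.
print(f"final state x = {x:.3f}, current range = [{current.min():.2e}, {current.max():.2e}] A")
```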

Keywords: analog circuits, digital circuits, memristors, neuromorphic computing systems

Procedia PDF Downloads 174
1131 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we analyze maximum precipitation data recorded during a particular period of time at different stations of the Global Historical Climatology Network of the USA. One important point to mention is that some stations are shut down on certain days for one reason or another; the maximum values are therefore recorded by excluding those readings. It is assumed that the number of stations in operation follows a zero-truncated Poisson distribution and that the daily precipitation follows a lognormal distribution. We call this model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters, and it can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, it is observed that the proposed model provides a better fit than some of the existing models.
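
The EM algorithm of the paper is not reproduced here, but the model itself is easy to write down: if N is zero-truncated Poisson(λ) and the individual readings are lognormal with CDF G and density g, the maximum M has density f(x) = λ g(x) exp(-λ(1 - G(x))) / (1 - exp(-λ)). The sketch below simulates such maxima and fits the three parameters by direct numerical maximum likelihood as a cross-check; the true parameter values and sample size are arbitrary.

```python
import numpy as np
from scipy.stats import lognorm
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulate(lam, mu, sigma, size):
    """Maxima of N lognormal readings, with N ~ zero-truncated Poisson(lam)."""
    maxima = []
    while len(maxima) < size:
        n = rng.poisson(lam)
        if n > 0:   # zero-truncation by rejection
            maxima.append(lognorm.rvs(sigma, scale=np.exp(mu), size=n, random_state=rng).max())
    return np.array(maxima)

def negloglik(theta, x):
    lam, mu, sigma = theta
    if lam <= 0 or sigma <= 0:
        return np.inf
    G = lognorm.cdf(x, sigma, scale=np.exp(mu))
    logg = lognorm.logpdf(x, sigma, scale=np.exp(mu))
    logf = np.log(lam) + logg - lam * (1.0 - G) - np.log1p(-np.exp(-lam))
    return -np.sum(logf)

x = simulate(lam=4.0, mu=1.0, sigma=0.5, size=500)
fit = minimize(negloglik, x0=np.array([2.0, 0.0, 1.0]), args=(x,), method="Nelder-Mead")
print("estimated (lambda, mu, sigma):", np.round(fit.x, 3))
```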

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 108
1130 Indium-Gallium-Zinc Oxide Photosynaptic Device with Alkylated Graphene Oxide for Optoelectronic Spike Processing

Authors: Seyong Oh, Jin-Hong Park

Abstract:

Recently, neuromorphic computing based on brain-inspired artificial neural networks (ANNs) has attracted a huge amount of research interest due to its ability to facilitate massively parallel, low-energy-consuming, and event-driven computing. In particular, research on artificial synapses that imitate the biological synapses responsible for human information processing and memory is in the spotlight. Here, we demonstrate a photosynaptic device wherein the synaptic weight is governed by a mixed spike consisting of a voltage spike and a light spike. Compared to the device operated only by the voltage spike, ∆G in the proposed photosynaptic device significantly increased from -2.32 nS to 5.95 nS with no degradation of nonlinearity (NL) (potentiation/depression values changed from 4.24/8 to 5/8). Furthermore, the Modified National Institute of Standards and Technology (MNIST) digit pattern recognition rates improved from 36% and 49% to 50% and 62% in ANNs consisting of the synaptic devices with 20 and 100 weight states, respectively. We expect that photosynaptic device technology driven by optoelectronic spikes will play an important role in implementing neuromorphic computing systems in the future.

Keywords: optoelectronic synapse, IGZO (Indium-Gallium-Zinc Oxide) photosynaptic device, optoelectronic spiking process, neuromorphic computing

Procedia PDF Downloads 173
1129 Selecting Skyline Mash-Ups under Uncertainty

Authors: Aymen Gammoudi, Hamza Labbaci, Nizar Messai, Yacine Sam

Abstract:

Web Service Composition (Mash-up) has been considered as a new approach to offer users a set of Web Services responding to their requests. Such approaches can return a set of similar Mash-ups in a given context, which makes it difficult for users to select the best one. Recent approaches focus on computing the skyline over a set of Quality of Service (QoS) attributes. However, these approaches are not sufficient in a dynamic web service environment where the QoS delivered by a Web service is inherently uncertain. In this paper, we first treat the problem of computing the skyline over a set of similar Mash-ups under certain dimension values. We generate dimensions for each Mash-up using aggregation operations applied to the QoS attributes. We then tackle the problem of computing the skyline under uncertain dimensions. We represent each dimension value of a Mash-up using a frame of discernment and introduce d-dominance based on Evidence Theory. Finally, we present experimental results that show both the effectiveness of the introduced skyline extensions and the efficiency of the proposed approaches.
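
For readers unfamiliar with skylines, the following sketch computes the classic deterministic skyline over certain QoS values (smaller is better in both dimensions); the Mash-up names and QoS numbers are made up, and the paper's d-dominance under uncertainty is not reproduced.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every QoS dimension and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def skyline(points):
    """Block-nested-loop skyline: keep the points dominated by no other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Made-up mash-ups with (latency in ms, cost) as the aggregated QoS dimensions.
mashups = {"m1": (120, 0.2), "m2": (90, 0.4), "m3": (150, 0.1), "m4": (200, 0.5)}
sky = skyline(list(mashups.values()))
print([name for name, qos in mashups.items() if qos in sky])   # -> ['m1', 'm2', 'm3']
```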

Keywords: web services, uncertain QoS, mash-ups, uncertain dimensions, skyline, evidence theory, d-dominance

Procedia PDF Downloads 234
1128 A Review Paper on Data Security in Precision Agriculture Using Internet of Things

Authors: Tonderai Muchenje, Xolani Mkhwanazi

Abstract:

Precision agriculture uses a number of technologies, devices, protocols, and computing paradigms to optimize agricultural processes. Big data, artificial intelligence, cloud computing, and edge computing are all used to handle the huge amounts of data generated by precision agriculture. However, precision agriculture is still emerging and has a low level of security features. Furthermore, future solutions will demand data availability and accuracy as key points to help farmers, and security is important for building robust and efficient systems. Since precision agriculture comprises a wide variety and quantity of resources, security must address issues such as compatibility, constrained resources, and massive data. Moreover, conventional protection schemes used in the traditional internet may not be useful for agricultural systems, creating extra demands and opportunities. Therefore, this paper aims at reviewing the state of the art of precision agriculture security, particularly in open-field agriculture, discussing its architecture, describing security issues, and presenting the major challenges and future directions.

Keywords: precision agriculture, security, IoT, EIDE

Procedia PDF Downloads 90
1127 Cloud Monitoring and Performance Optimization Ensuring High Availability and Security

Authors: Inayat Ur Rehman, Georgia Sakellari

Abstract:

Cloud computing has evolved into a vital technology for businesses, offering scalability, flexibility, and cost-effectiveness. However, maintaining high availability and optimal performance in the cloud is crucial for reliable services. This paper explores the significance of cloud monitoring and performance optimization in sustaining the high availability of cloud-based systems. It discusses diverse monitoring tools, techniques, and best practices for continually assessing the health and performance of cloud resources. The paper also delves into performance optimization strategies, including resource allocation, load balancing, and auto-scaling, to ensure efficient resource utilization and responsiveness. Addressing potential challenges in cloud monitoring and optimization, the paper offers insights into data security and privacy considerations. Through this thorough analysis, the paper aims to underscore the importance of cloud monitoring and performance optimization for ensuring a seamless and highly available cloud computing environment.

Keywords: cloud computing, cloud monitoring, performance optimization, high availability

Procedia PDF Downloads 65
1126 Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

Authors: Shuen-Tai Wang, Hsi-Ya Chang

Abstract:

Cloud virtualization technologies are becoming more and more prevalent, and cloud users often face the problem of how to access virtualized remote desktops easily over the web without installing special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access, from anywhere, a terminal running in our cloud platform. We implemented a sketch of the web interface following the cloud computing concept, which seeks to enable collaboration and communication among users for high performance computing. Given the development of remote desktop virtualization, the user's desktop can be shifted from the traditional PC environment to the cloud platform, where it is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient and elastic environment for online cloud services. This is also made possible by low administrative costs as well as relatively inexpensive end-user terminals and reduced energy expenses.

Keywords: virtualization, remote desktop, HTML5, cloud computing

Procedia PDF Downloads 339
1125 Cloud-Based Dynamic Routing with Feedback in Formal Methods

Authors: Jawid Ahmad Baktash, Mursal Dawodi, Tomokazu Nagata

Abstract:

With the rapid growth of Cloud Computing, Formal Methods have become a good choice for the refinement of message specifications and for the verification of Dynamic Routing in Cloud Computing. Cloud-based Dynamic Routing is becoming increasingly popular. We propose feedback in Formal Methods for Dynamic Routing and Cloud Computing; the model and topologies show how messages are sent formally from index zero to all other nodes. The responsibility of proper verification becomes crucial with Dynamic Routing in the cloud. Formal Methods can play an essential role in the routing and development of networks and in the testing of distributed systems. Event-B is a formal technique that consists of describing the problem rigorously and introducing solutions or details in refinement steps. Event-B is a variant of B designed for developing distributed systems and the message passing of dynamic routing. In Event-B and formal methods, events consist of guarded actions that occur spontaneously rather than being invoked.

Keywords: cloud, dynamic routing, formal method, Pro-B, event-B

Procedia PDF Downloads 423
1124 Performance Evaluation of Fingerprint, Auto-Pin and Password-Based Security Systems in Cloud Computing Environment

Authors: Emmanuel Ogala

Abstract:

Cloud computing has been envisioned as the next-generation architecture of the Information Technology (IT) enterprise. In contrast to traditional solutions, where IT services are under physical, logical and personnel controls, cloud computing moves the application software and databases to large data centres, where the management of the data and services may not be fully trustworthy. This is because the systems are open to the whole world, and as legitimate users access the system, many others try day in, day out to gain unauthorized access to it. This research contributes to the improvement of cloud computing security for better operation. The work is motivated by two problems: first, the observed ease of access to cloud computing resources and the complexity of attacks on a vital cloud computing data system such as NIC require that dynamic security mechanisms evolve to stay capable of preventing illegitimate access; second, the lack of a good methodology for performance testing and evaluation of biometric security algorithms for securing records in a cloud computing environment. The aim of this research was to evaluate the performance of an integrated security system (ISS) for securing examination records in a cloud computing environment. We designed and implemented an ISS consisting of three security mechanisms, biometric (fingerprint), auto-PIN and password, combined into one stream of access control and used for securing examination records at Kogi State University, Anyigba. The system we built has been able to overcome the guessing abilities of hackers who guess people's passwords or PINs. We are certain about this because the added security mechanism (fingerprint) requires the presence of the user before login access can be granted: the user places a finger on the fingerprint biometric scanner for capture and verification, confirming the user's authenticity. The study adopted a quantitative design together with an object-oriented design methodology. In the analysis and design, PHP, HTML5, CSS, Visual Studio Java Script, and web 2.0 technologies were used to implement the model of the ISS for the cloud computing environment. PHP, HTML5 and CSS were used in conjunction with Visual Studio front-end design tools, MySQL + Access 7.0 was used for the backend engine, and Java Script was used for object arrangement and for validation of user input as a security check. Finally, the performance of the developed framework was evaluated by comparison with two other existing security systems (auto-PIN and password) within the school, and the results showed that the developed approach (fingerprint) overcomes the two main weaknesses of the existing systems and will work well if fully implemented.

Keywords: performance evaluation, fingerprint, auto-pin, password-based, security systems, cloud computing environment

Procedia PDF Downloads 140
1123 Governance, Risk Management, and Compliance Factors Influencing the Adoption of Cloud Computing in Australia

Authors: Tim Nedyalkov

Abstract:

A business decision to move to the cloud brings fundamental changes in how an organization develops and delivers its Information Technology solutions. The accelerated pace of digital transformation across businesses and government agencies increases the reliance on cloud-based services, and collecting, managing, and retaining large amounts of data in cloud environments makes information security and data privacy protection essential. It becomes even more important to understand which key factors drive successful cloud adoption following the commencement of the Privacy Amendment (Notifiable Data Breaches) (NDB) Act 2017 in Australia, as the regulatory changes impact many organizations and industries. This quantitative correlational research investigated the governance, risk management, and compliance factors contributing to cloud security success and influencing the adoption of cloud computing within an organizational context after the commencement of the NDB scheme. The results and findings demonstrated that corporate information security policies, data storage location, management understanding of data governance responsibilities, and regular compliance assessments are the factors influencing cloud computing adoption. The research has implications for organizations, future researchers, practitioners, policymakers, and cloud computing providers as they work to meet rapidly changing regulatory and compliance requirements.

Keywords: cloud compliance, cloud security, data governance, privacy protection

Procedia PDF Downloads 116
1122 Intrusion Detection in Cloud Computing Using Machine Learning

Authors: Faiza Babur Khan, Sohail Asghar

Abstract:

With the emergence of distributed environments, cloud computing is proving to be the most stimulating paradigm shift in computer technology, resulting in a spectacular expansion of the IT industry. Many companies have augmented their technical infrastructure by adopting cloud resource-sharing architectures. However, from a security viewpoint, clouds introduce an added level of risk, weakening protection mechanisms and making privacy, data security and on-demand service harder to guarantee. Issues of trust, confidentiality, and integrity are elevated due to the multi-tenant resource-sharing architecture of the cloud. Trust, or reliability, of the cloud refers to its capability of providing the needed services precisely and unfailingly. Confidentiality is the ability of the architecture to ensure that only authorized parties access their private data, and integrity protects the data from being fabricated by an unauthorized user. So, in order to assure the provision of a secure cloud, a roadmap or model is needed to analyze a security problem, design mitigation strategies, and evaluate solutions. The aim of the paper is twofold: first, to highlight the factors that make cloud security critical, along with alleviation strategies; and second, to propose an intrusion detection model that identifies attackers in a preventive way using a machine learning Random Forest classifier with an accuracy of 99.8%. This model uses a small number of features. A comparison with other classifiers is also presented.
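
The paper's dataset and feature selection are not reproduced here; as a rough sketch of the Random Forest classification step, the following code trains and evaluates such a classifier on a synthetic stand-in dataset (scikit-learn's make_classification), which is only a placeholder for real intrusion-detection traffic records.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Placeholder data: 10,000 "connection records" with 20 features, 2 classes (normal / attack).
X, y = make_classification(n_samples=10_000, n_features=20, n_informative=8,
                           weights=[0.8, 0.2], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
# Feature importances can guide the reduced feature set mentioned in the abstract.
print("top feature indices:", clf.feature_importances_.argsort()[::-1][:5])
```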

Keywords: cloud security, threats, machine learning, random forest, classification

Procedia PDF Downloads 320