Search results for: ambient computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1594

1414 Navigating Cyber Attacks with Quantum Computing: Leveraging Vulnerabilities and Forensics for Advanced Penetration Testing in Cybersecurity

Authors: Sayor Ajfar Aaron, Ashif Newaz, Sajjat Hossain Abir, Mushfiqur Rahman

Abstract:

This paper examines the transformative potential of quantum computing in the field of cybersecurity, with a focus on advanced penetration testing and forensics. It explores how quantum technologies can be leveraged to identify and exploit vulnerabilities more efficiently than traditional methods and how they can enhance the forensic analysis of cyber-attacks. Through theoretical analysis and practical simulations, this study highlights the enhanced capabilities of quantum algorithms in detecting and responding to sophisticated cyber threats, providing a pathway for developing more resilient cybersecurity infrastructures.

Keywords: cybersecurity, cyber forensics, penetration testing, quantum computing

Procedia PDF Downloads 14
1413 Method and Apparatus for Optimized Job Scheduling in the High-Performance Computing Cloud Environment

Authors: Subodh Kumar, Amit Varde

Abstract:

Typical on-premises high-performance computing (HPC) environments consist of a fixed quantity and a fixed set of computing hardware. During the design of the HPC environment, the hardware components, including but not limited to CPU, memory, GPU, and networking, are carefully chosen from select vendors for optimal performance. The high capital cost of building the environment is a prime factor influencing its design. A class of software called “job schedulers” is critical to maximizing these resources and running multiple workloads to extract the maximum value from the high capital cost. In principle, schedulers work by queuing jobs, preventing workloads and users from monopolizing the finite hardware resources. A cloud-based HPC environment does not have the limitations of fixed (in type and quantity) hardware resources; in theory, users and workloads can spin up any number and type of hardware resource. This paper discusses the limitations of using traditional scheduling algorithms for cloud-based HPC workloads. It proposes a new set of features, called “HPC optimizers,” for maximizing the benefits of the elasticity and scalability of the cloud with the goal of cost-performance optimization of the workload.
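
The queuing principle described in the abstract can be sketched as a minimal FIFO scheduler over a fixed pool of slots; the class name, job names, and the single-resource "slot" model are illustrative simplifications, not the authors' design.

```python
from collections import deque

class JobScheduler:
    """Minimal FIFO scheduler sketch: jobs queue for a fixed pool of slots,
    so no user or workload can monopolize the finite hardware."""
    def __init__(self, total_slots):
        self.free_slots = total_slots
        self.queue = deque()      # jobs waiting for resources
        self.running = []         # (job_name, slots) currently placed

    def submit(self, name, slots_needed):
        self.queue.append((name, slots_needed))
        self.dispatch()

    def dispatch(self):
        # Start queued jobs in arrival order while resources remain.
        while self.queue and self.queue[0][1] <= self.free_slots:
            name, slots = self.queue.popleft()
            self.free_slots -= slots
            self.running.append((name, slots))

    def finish(self, name):
        for job in self.running:
            if job[0] == name:
                self.running.remove(job)
                self.free_slots += job[1]
                break
        self.dispatch()

sched = JobScheduler(total_slots=4)
sched.submit("sim-a", 2)
sched.submit("sim-b", 2)
sched.submit("sim-c", 1)   # queued: the fixed pool is exhausted
```

A cloud-based "HPC optimizer" in the paper's sense would instead react to the queued job by provisioning a new slot, which is exactly the behaviour a fixed-pool scheduler cannot express.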

Keywords: high performance computing, HPC, cloud computing, optimization, schedulers

Procedia PDF Downloads 60
1412 Carcinogenic Polycyclic Aromatic Hydrocarbons in Urban Air Particulate Matter

Authors: A. Szabó Nagy, J. Szabó, Zs. Csanádi, J. Erdős

Abstract:

An assessment of the air quality of Győr (Hungary) was performed by determining the ambient concentrations of PM10-bound carcinogenic polycyclic aromatic hydrocarbons (cPAHs) in different seasons. A high volume sampler was used for the collection of ambient aerosol particles, and the associated cPAH compounds (benzo[a]pyrene (BaP), benzo[a]anthracene, benzofluoranthene isomers, indeno[123-cd]pyrene and dibenzo[ah]anthracene) were analyzed by a gas chromatographic method. Higher mean concentrations of total cPAHs were detected in samples collected in winter (9.62 ng/m3) and autumn (2.69 ng/m3) compared to spring (1.05 ng/m3) and summer (0.21 ng/m3). The calculated BaP toxic equivalent concentrations have also reflected that the local population appears to be exposed to significantly higher cancer risk in the heating seasons. Moreover, the concentration levels of cPAHs determined in this study were compared to other Hungarian urban sites.
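
The BaP toxic-equivalent concentration mentioned above is a TEF-weighted sum over the individual cPAH concentrations. The sketch below uses illustrative toxic-equivalency factors and made-up concentrations; the actual TEF scheme and measured values are in the paper, not here.

```python
# Illustrative toxic-equivalency factors (TEFs) relative to BaP = 1.
# Published schemes differ; these values are for demonstration only.
TEF = {
    "BaP": 1.0,   # benzo[a]pyrene (reference compound)
    "BaA": 0.1,   # benzo[a]anthracene
    "BbF": 0.1,   # benzo[b]fluoranthene
    "BkF": 0.1,   # benzo[k]fluoranthene
    "IP": 0.1,    # indeno[1,2,3-cd]pyrene
    "DBA": 1.0,   # dibenz[a,h]anthracene
}

def bap_equivalent(concentrations_ng_m3):
    """BaP toxic-equivalent concentration: sum of c_i * TEF_i (ng/m3)."""
    return sum(c * TEF[name] for name, c in concentrations_ng_m3.items())

# Hypothetical winter sample (ng/m3), not the paper's data:
winter = {"BaP": 2.0, "BaA": 1.5, "BbF": 3.0, "BkF": 1.2, "IP": 1.5, "DBA": 0.4}
print(round(bap_equivalent(winter), 3))  # → 3.12
```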

Keywords: air, carcinogenic, polycyclic aromatic hydrocarbons (PAH), PM10

Procedia PDF Downloads 244
1411 Crow Search Algorithm-Based Task Offloading Strategies for Fog Computing Architectures

Authors: Aniket Ganvir, Ritarani Sahu, Suchismita Chinara

Abstract:

The rapid digitization of various aspects of life is leading to the creation of smart IoT ecosystems, where interconnected devices generate significant amounts of valuable data. However, these IoT devices face constraints such as limited computational resources and bandwidth. Cloud computing emerges as a solution by offering ample resources for offloading tasks efficiently, but it introduces latency issues, especially for time-sensitive applications. Fog computing (FC) addresses these latency concerns by bringing computation and storage closer to the network edge, minimizing data travel distance, and enhancing efficiency. Offloading tasks to fog nodes or the cloud can conserve energy and extend IoT device lifespan. The offloading process is intricate, with tasks categorized as full or partial, and its optimization presents an NP-hard problem. Traditional greedy search methods struggle to address the complexity of task offloading efficiently. To overcome this, the efficient crow search algorithm (ECSA) is proposed as a meta-heuristic optimization algorithm. ECSA aims to effectively optimize computation offloading, providing solutions to this challenging problem.
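
For readers unfamiliar with the underlying meta-heuristic, a generic crow search algorithm can be sketched as below; this is the standard CSA position/memory update applied to a toy cost function, not the authors' ECSA or their offloading cost model.

```python
import random

def crow_search(cost, dim, bounds, n_crows=10, iters=100,
                ap=0.1, fl=2.0, seed=42):
    """Generic crow search algorithm (CSA) sketch for minimising `cost`.

    Each crow remembers its best position; at every step it follows a
    random crow's memory (flight length `fl`) unless that crow is
    'aware' (awareness probability `ap`), in which case the follower is
    diverted to a random position."""
    rng = random.Random(seed)
    lo, hi = bounds
    rand_pos = lambda: [rng.uniform(lo, hi) for _ in range(dim)]
    pos = [rand_pos() for _ in range(n_crows)]
    mem = [p[:] for p in pos]                  # best position per crow
    mem_cost = [cost(p) for p in mem]

    for _ in range(iters):
        for i in range(n_crows):
            j = rng.randrange(n_crows)         # crow to follow
            if rng.random() >= ap:             # crow j is unaware
                r = rng.random()
                new = [pos[i][d] + r * fl * (mem[j][d] - pos[i][d])
                       for d in range(dim)]
            else:                              # crow j diverts the follower
                new = rand_pos()
            if all(lo <= x <= hi for x in new):  # keep feasible moves only
                pos[i] = new
                c = cost(new)
                if c < mem_cost[i]:            # memory only improves
                    mem[i], mem_cost[i] = new[:], c
    best = min(range(n_crows), key=lambda i: mem_cost[i])
    return mem[best], mem_cost[best]

# Toy stand-in for an offloading cost: squared distance from the ideal
# allocation vector (the zero vector here).
sphere = lambda x: sum(v * v for v in x)
best, best_cost = crow_search(sphere, dim=3, bounds=(-5.0, 5.0))
```

In a real offloading formulation the cost would combine latency and energy for a candidate task-to-fog-node assignment; the search mechanics stay the same.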

Keywords: IoT, fog computing, task offloading, efficient crow search algorithm

Procedia PDF Downloads 18
1410 Improved Reuse and Storage Performances at Room Temperature of a New Environmentally Friendly Lactate Oxidase Biosensor Made by Ambient Electrospray Deposition

Authors: Antonella Cartoni, Mattea Carmen Castrovilli

Abstract:

A biosensor for lactate detection has been developed using an environmentally friendly approach. The biosensor is based on lactate oxidase (LOX) and has remarkable capabilities for reuse and storage at room temperature. The manufacturing technique employed is ambient electrospray deposition (ESD), which enables efficient and sustainable immobilization of the LOX enzyme on a cost-effective commercial screen-printed Prussian blue/carbon electrode (PB/C-SPE). The study demonstrates that the ESD technology allows the biosensor to be stored at ambient pressure and temperature for extended periods without affecting the enzymatic activity. The biosensor can be stored for up to 90 days without requiring specific storage conditions, and it can be reused for up to 24 measurements on both freshly prepared electrodes and electrodes that are three months old. The LOX-based biosensor exhibits a linear range of lactate detection between 0.1 and 1 mM, with a limit of detection of 0.07±0.02 mM. Additionally, it does not exhibit any memory effects. The immobilization process does not involve the use of entrapment matrices or hazardous chemicals, making it environmentally sustainable and non-toxic compared to current methods. Furthermore, the application of an electrospray deposition cycle to previously used biosensors rejuvenates their performance, making them comparable to freshly made biosensors. This highlights the excellent recycling potential of the technique, eliminating the waste associated with disposable devices.

Keywords: green friendly, reuse, storage performance, immobilization, matrix-free, electrospray deposition, biosensor, lactate oxidase, enzyme

Procedia PDF Downloads 37
1409 A Distributed Cryptographically Generated Address Computing Algorithm for Secure Neighbor Discovery Protocol in IPv6

Authors: M. Moslehpour, S. Khorsandi

Abstract:

Due to the shortage of IPv4 addresses, the transition to IPv6 has gained significant momentum in recent years. Like the Address Resolution Protocol (ARP) in IPv4, the Neighbor Discovery Protocol (NDP) provides functions such as address resolution in IPv6. Despite its functionality, NDP is vulnerable to several attacks. To mitigate these attacks, Internet Protocol Security (IPsec) was introduced, but it was not efficient due to its limitations. Therefore, the SEcure Neighbor Discovery (SEND) protocol was proposed to automatically protect the auto-configuration process, securing neighbor discovery and address resolution. To defend against threats to NDP's integrity and identity, SEND uses Cryptographically Generated Addresses (CGA) and asymmetric cryptography. Despite the advantages of SEND, its disadvantages are considerable: the CGA computation is expensive, and the CGA generation algorithm runs sequentially. In this paper, we parallelize this process across network resources in order to improve it. In addition, we compare the CGA generation time between self-computing and distributed-computing processes, focusing on the impact of malicious nodes on the CGA generation time in the network. According to the results, even when malicious nodes participate in the generation process, the CGA generation time is less than when it is computed on a single node. With a trust management system, detecting and isolating malicious nodes becomes easier.
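
The expensive step being parallelized is the CGA hash-extension search. A minimal sketch in the style of RFC 3972 is shown below (field layout simplified; the real algorithm also involves the subnet prefix, a collision count, and a second hash for the interface identifier). The `start` parameter is what lets distributed workers scan disjoint slices of the modifier space.

```python
import hashlib

def find_modifier(pubkey: bytes, sec: int, start: int = 0):
    """Hash-extension step of CGA generation (sketch after RFC 3972):
    search for a 16-byte modifier whose SHA-1 over
    (modifier | 9 zero octets | public key) has 16*sec leading zero
    bits. This brute-force loop is the sequential bottleneck; each
    distributed worker can be given a different `start` offset."""
    zero_bits = 16 * sec
    modifier = start
    while True:
        digest = hashlib.sha1(
            modifier.to_bytes(16, "big") + b"\x00" * 9 + pubkey
        ).digest()
        # Accept when the leftmost 16*sec bits of the digest are zero.
        if int.from_bytes(digest, "big") >> (160 - zero_bits) == 0:
            return modifier
        modifier += 1
```

Because the expected work doubles with every extra 16-bit security increment, splitting the modifier range across nodes gives a near-linear speedup, which is the effect the paper measures.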

Keywords: NDP, IPsec, SEND, CGA, modifier, malicious node, self-computing, distributed-computing

Procedia PDF Downloads 260
1408 Accelerating Side Channel Analysis with Distributed and Parallelized Processing

Authors: Kyunghee Oh, Dooho Choi

Abstract:

Although there may be no theoretical weakness in a cryptographic algorithm, Side Channel Analysis can extract secret data from the physical implementation of a cryptosystem. The analysis is based on extra information such as timing, power consumption, electromagnetic leaks, or even sound, which can be exploited to break the system. Differential Power Analysis is one of the most popular such analyses; it computes statistical correlations between secret-key hypotheses and power consumption. It is usually necessary to process huge amounts of data, which takes a long time; the analysis may take several weeks for some devices with countermeasures. We suggest and evaluate methods to shorten the time needed to analyze cryptosystems. Our methods include distributed computing and parallelized processing.
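
The correlation step can be illustrated with a toy correlation-DPA on simulated traces: the leakage model, XOR intermediate, and noise parameters below are illustrative, not the paper's setup. The key-guess loop is what distributed/parallel processing would split across workers.

```python
import random

def hamming_weight(x):
    return bin(x).count("1")

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs) ** 0.5
    vy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def dpa_recover_key(plaintexts, traces):
    """Correlation-DPA sketch: for each key-byte guess, correlate the
    Hamming weight of (plaintext XOR guess) with the measured power
    samples and keep the guess with the strongest correlation. The
    256 guesses are independent, so they parallelize trivially."""
    best_guess, best_corr = None, -2.0
    for guess in range(256):
        model = [hamming_weight(p ^ guess) for p in plaintexts]
        c = pearson(model, traces)
        if c > best_corr:
            best_guess, best_corr = guess, c
    return best_guess

# Simulated leaky device: power = HW(plaintext XOR secret) + noise.
rng = random.Random(0)
secret = 0x3C
pts = [rng.randrange(256) for _ in range(500)]
trc = [hamming_weight(p ^ secret) + rng.gauss(0, 0.3) for p in pts]
```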

Keywords: DPA, distributed computing, parallelized processing, side channel analysis

Procedia PDF Downloads 396
1407 Parallel Computing: Offloading Matrix Multiplication to GPU

Authors: Bharath R., Tharun Sai N., Bhuvan G.

Abstract:

This project focuses on developing a parallel computing method aimed at optimizing matrix multiplication through GPU acceleration. Addressing algorithmic challenges, GPU programming intricacies, and integration issues, the project aims to enhance efficiency and scalability. The methodology involves algorithm design, GPU programming, and optimization techniques. Future plans include advanced optimizations, extended functionality, and integration with high-level frameworks. User engagement is emphasized through user-friendly interfaces, open-source collaboration, and continuous refinement based on feedback. The project's impact extends to significantly improving matrix multiplication performance in scientific computing and machine learning applications.
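
The property that makes matrix multiplication a natural GPU workload is that every output element (or row) can be computed independently. The sketch below shows that decomposition with a thread pool standing in for GPU threads; it illustrates the parallel structure only and is not the project's CUDA code.

```python
from concurrent.futures import ThreadPoolExecutor

def matmul_row(args):
    """Compute one output row. Rows (and elements) are independent,
    which is exactly the parallelism a GPU kernel exploits by giving
    each output element its own thread."""
    row, B = args
    cols = len(B[0])
    return [sum(row[k] * B[k][j] for k in range(len(B))) for j in range(cols)]

def parallel_matmul(A, B, workers=4):
    """Map independent row computations onto a pool of workers."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(matmul_row, ((row, B) for row in A)))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(parallel_matmul(A, B))  # → [[19, 22], [43, 50]]
```

On a real GPU the same decomposition is expressed as a kernel launched over a 2D grid of threads, with shared-memory tiling as the main optimization.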

Keywords: matrix multiplication, parallel processing, CUDA, performance boost, neural networks

Procedia PDF Downloads 19
1406 A Study on How to Develop the Usage Metering Functions of BIM (Building Information Modeling) Software under Cloud Computing Environment

Authors: Kim Byung-Kon, Kim Young-Jin

Abstract:

As project opportunities for the Architecture, Engineering and Construction (AEC) industry have grown more complex and larger, the utilization of BIM (Building Information Modeling) technologies for 3D design and simulation practices has been increasing significantly; typical applications of BIM technologies include clash detection and design alternatives based on 3D planning, which have been extended to construction management in the AEC industry for virtual design and construction. To date, commercial BIM software has operated in a single-user environment, which is why the initial costs of its introduction are very high. Cloud computing, one of the most promising next-generation Internet technologies, enables simple Internet devices to use services and resources provided with BIM software. Recently in Korea, studies linking BIM and cloud computing technologies have been directed toward saving the costs of building BIM-related infrastructure and providing various BIM services for small- and medium-sized enterprises (SMEs). This study addressed how to develop the usage metering functions of BIM software under a cloud computing architecture in order to archive and use BIM data and create an optimal revenue structure so that BIM services may grow spontaneously, considering the demand for cloud resources. To this end, the author surveyed relevant cases and then analyzed needs and requirements from the AEC industry. Based on the results and findings of this survey and analysis, the author proposes how to optimally develop the usage metering functions of cloud BIM software.

Keywords: construction IT, BIM (Building Information Modeling), cloud computing, BIM-based cloud computing, 3D design, cloud BIM

Procedia PDF Downloads 473
1408 ACO-TS: An ACO-Based Algorithm for Optimizing Cloud Task Scheduling

Authors: Fahad Y. Al-dawish

Abstract:

A large number of organizations and individuals are currently moving to cloud computing, which many consider a significant shift in the field of computing. Cloud computing environments are distributed and parallel systems consisting of a collection of interconnected physical and virtual machines. With the increasing demand for and adoption of cloud computing infrastructure, diverse computing processes can be executed in cloud environments, and many organizations and individuals around the world depend on them to host their applications, platforms, and infrastructure. One of the major and essential issues in this environment is allocating incoming tasks to suitable virtual machines (cloud task scheduling). Cloud task scheduling is classified as an optimization problem, and several meta-heuristic algorithms have been proposed to solve it. A good task scheduler should adapt its scheduling technique to a changing environment and to the types of incoming task sets. In this research project, a cloud task scheduling methodology based on the ant colony optimization (ACO) algorithm, called ACO-TS (Ant Colony Optimization for Task Scheduling), has been proposed and compared with different scheduling algorithms (Random, First Come First Serve (FCFS), and Fastest Processor to the Largest Task First (FPLTF)). ACO is a random optimization search method used here to assign incoming tasks to available virtual machines (VMs). The main role of the proposed algorithm is to minimize the makespan of a given task set and to maximize resource utilization by balancing the load among virtual machines. The proposed scheduling algorithm was evaluated using the CloudSim toolkit framework. After analyzing and evaluating the experimental results, we find that the proposed ACO-TS algorithm performs better than the Random, FCFS, and FPLTF algorithms in terms of both makespan and resource utilization.
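
A generic ACO task scheduler of the kind described can be sketched as follows; the pheromone update rule, parameters, and cost model below are one common textbook variant, not necessarily the exact ACO-TS formulation.

```python
import random

def aco_schedule(task_len, vm_speed, n_ants=10, iters=50,
                 evap=0.5, alpha=1.0, beta=2.0, seed=1):
    """ACO sketch for cloud task scheduling: ants build task-to-VM
    assignments guided by pheromone and a heuristic proportional to
    VM speed / task length, and the best-so-far schedule (minimum
    makespan) reinforces its pheromone trail each iteration."""
    rng = random.Random(seed)
    n_t, n_v = len(task_len), len(vm_speed)
    tau = [[1.0] * n_v for _ in range(n_t)]          # pheromone[task][vm]
    eta = [[vm_speed[v] / task_len[t] for v in range(n_v)]
           for t in range(n_t)]                      # heuristic desirability

    def makespan(sched):
        load = [0.0] * n_v
        for t, v in enumerate(sched):
            load[v] += task_len[t] / vm_speed[v]
        return max(load)

    best_sched, best_mk = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            sched = []
            for t in range(n_t):
                w = [tau[t][v] ** alpha * eta[t][v] ** beta
                     for v in range(n_v)]
                sched.append(rng.choices(range(n_v), weights=w)[0])
            mk = makespan(sched)
            if mk < best_mk:
                best_sched, best_mk = sched, mk
        # Evaporate all trails, then reinforce the best-so-far schedule.
        for t in range(n_t):
            for v in range(n_v):
                tau[t][v] *= (1 - evap)
            tau[t][best_sched[t]] += 1.0 / best_mk
    return best_sched, best_mk
```

For four tasks of lengths [4, 4, 2, 2] on two equal-speed VMs, the sketch converges on the balanced split with makespan 6.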

Keywords: cloud task scheduling, ant colony optimization (ACO), CloudSim, cloud computing

Procedia PDF Downloads 395
1404 Estimation of PM2.5 Emissions and Source Apportionment Using Receptor and Dispersion Models

Authors: Swetha Priya Darshini Thammadi, Sateesh Kumar Pisini, Sanjay Kumar Shukla

Abstract:

Source apportionment using Dispersion model depends primarily on the quality of Emission Inventory. In the present study, a CMB receptor model has been used to identify the sources of PM2.5, while the AERMOD dispersion model has been used to account for missing sources of PM2.5 in the Emission Inventory. A statistical approach has been developed to quantify the missing sources not considered in the Emission Inventory. The inventory of each grid was improved by adjusting emissions based on road lengths and deficit in measured and modelled concentrations. The results showed that in CMB analyses, fugitive sources - soil and road dust - contribute significantly to ambient PM2.5 pollution. As a result, AERMOD significantly underestimated the ambient air concentration at most locations. The revised Emission Inventory showed a significant improvement in AERMOD performance which is evident through statistical tests.

Keywords: CMB, GIS, AERMOD, PM₂.₅, fugitive, emission inventory

Procedia PDF Downloads 310
1403 A Type-2 Fuzzy Model for Link Prediction in Social Network

Authors: Mansoureh Naderipour, Susan Bastani, Mohammad Fazel Zarandi

Abstract:

Predicting links that may occur in the future, as well as missing links, is an attractive problem in social network analysis. Granular computing can help us model the relationships between human-based systems and the social sciences in this field. In this paper, we present a model based on a granular computing approach and Type-2 fuzzy logic to predict links with regard to nodes’ activity and the relationship between two nodes. Our model is tested on collaboration networks. It is found that the accuracy of prediction is significantly higher than that of Type-1 fuzzy and crisp approaches.

Keywords: social network, link prediction, granular computing, type-2 fuzzy sets

Procedia PDF Downloads 298
1402 Design Procedure of Cold Bitumen Emulsion Mixtures

Authors: Hayder Shanbara, Felicite Ruddock, William Atherton, Ali Al-Rifaie

Abstract:

In highway construction, Hot Mix Asphalt (HMA) has been used predominantly as a paving material for many years. Around 90 percent of the world's road network consists of flexible pavements. However, there are some drawbacks to paving with hot mix asphalt, such as high greenhouse gas emissions, rainy-season difficulties, fuel and energy consumption, and cost. Therefore, Cold Bitumen Emulsion Mixture (CBEM) is considered an alternative to HMA. CBEM, the most popular type of Cold Mix Asphalt (CMA), is an unheated mixture of bitumen emulsion, aggregate, and filler that can be prepared and mixed at ambient temperature. It is relatively easy to produce, but the design procedure provided by the Asphalt Institute (Manual Series 14, 1989) poses some issues in its practical application. This research presents a simple and more practicable design procedure for CBEM and discusses the limitations of this design.

Keywords: cold bitumen, emulsion mixture, design procedure, pavement

Procedia PDF Downloads 224
1401 Efficacy of Vitamins A, C and E on the Growth Performance of Broiler Chickens Subjected to Heat Stress

Authors: Desierin Rodrin, Magdalena Alcantara, Cristina Olo

Abstract:

The increase in environmental temperatures brought about by climate change negatively impacts the growth performance of broilers, a problem that may be addressed by manipulating the diet of the animals. Hence, this study was conducted to evaluate the effects of different vitamin supplements on the growth performance of broiler chickens subjected to ambient (31°C) and heat-stress (34°C) temperatures. The treatments were: I- Control (no vitamin supplement), II- Vitamin A (4.5 mg/kg of feed), III- Vitamin C (250 mg/kg of feed), IV- Vitamin E (250 mg/kg of feed), V- Vitamin C and E (250 mg/kg of feed and 250 mg/kg of feed), VI- Vitamin A and E (4.5 mg/kg of feed and 250 mg/kg of feed), VII- Vitamin A and C (4.5 mg/kg of feed and 250 mg/kg of feed), and VIII- Vitamin A, C and E (4.5 mg/kg of feed, 250 mg/kg of feed and 250 mg/kg of feed). The birds (n=240) were distributed randomly into the eight treatments replicated three times, with each replicate having five birds. Ambient temperature was maintained using a 25-watt bulb for every 20 birds, while the heat-stress condition was sustained at 34°C for about 9 hours daily using a 50-watt bulb per 5 birds. The interaction of vitamin supplements and temperatures did not significantly (P>0.05) affect body weight, average daily gain, feed consumption, or feed conversion efficiency throughout the growing period. Similarly, supplementation with the different vitamins did not improve (P>0.05) the overall production performance of the birds throughout the rearing period. Birds raised in the heat-stress (34°C) condition had significantly (P<0.05) lower body weight, average daily gain, and feed consumption than birds raised at ambient temperature at weeks 3, 4, and 5 of rearing. Supplementation of vitamins A, C, and E in the diet of broilers did not alleviate the effect of heat stress on their growth performance.

Keywords: broiler growth performance, heat stress, vitamin supplementation, vitamin A, vitamin C, vitamin E

Procedia PDF Downloads 269
1400 Using High Performance Computing for Online Flood Monitoring and Prediction

Authors: Stepan Kuchar, Martin Golasowski, Radim Vavrik, Michal Podhoranyi, Boris Sir, Jan Martinovic

Abstract:

The main goal of this article is to describe the online flood monitoring and prediction system Floreon+ primarily developed for the Moravian-Silesian region in the Czech Republic and the basic process it uses for running automatic rainfall-runoff and hydrodynamic simulations along with their calibration and uncertainty modeling. It takes a long time to execute such process sequentially, which is not acceptable in the online scenario, so the use of high-performance computing environment is proposed for all parts of the process to shorten their duration. Finally, a case study on the Ostravice river catchment is presented that shows actual durations and their gain from the parallel implementation.

Keywords: flood prediction process, high performance computing, online flood prediction system, parallelization

Procedia PDF Downloads 469
1399 DNA Multiplier: A Design Architecture of a Multiplier Circuit Using DNA Molecules

Authors: Hafiz Md. Hasan Babu, Khandaker Mohammad Mohi Uddin, Nitish Biswas, Sarreha Tasmin Rikta, Nuzmul Hossain Nahid

Abstract:

Nanomedicine and bioengineering use biological systems that can perform computing operations. In a biocomputational circuit, different types of biomolecules and DNA (Deoxyribonucleic Acid) are used as active components. DNA computing has the capability of performing parallel processing and offers a large storage capacity, which makes it distinct from other computing systems. In most processors, the multiplier is treated as a core hardware block, and multiplication is one of the more time-consuming and lengthy tasks. In this paper, cost-effective DNA multipliers are designed using algorithms of molecular DNA operations and compared with conventional designs. The speed and storage capacity of a DNA multiplier are also much higher than those of a traditional silicon-based multiplier.

Keywords: biological systems, DNA multiplier, large storage, parallel processing

Procedia PDF Downloads 171
1398 Human Intraocular Thermal Field in Action with Different Boundary Conditions Considering Aqueous Humor and Vitreous Humor Fluid Flow

Authors: Dara Singh, Keikhosrow Firouzbakhsh, Mohammad Taghi Ahmadian

Abstract:

In this study, a validated 3D finite volume model of the human eye is developed to study the fluid flow and heat transfer in the human eye at steady-state conditions. For this purpose, the discretized bio-heat transfer equation coupled with the Boussinesq equation is analyzed under different anatomical, environmental, and physiological conditions. It is demonstrated that fluid circulation forms as a result of thermal gradients in various regions of the eye. It is also shown that the posterior region of the human eye is less affected by ambient conditions than the anterior segment, which is sensitive to the ambient conditions and also to how the gravitational field is oriented relative to the geometry of the eye, making the circulation and the thermal field complicated in transient states. The effect of variation in material and boundary conditions leads to the conclusion that the thermal fields of a healthy and a non-healthy eye can be distinguished via computer simulations.

Keywords: bio-heat, boussinesq, conduction, convection, eye

Procedia PDF Downloads 316
1397 Analysis of Nanoscale Materials and Devices for Future Communication and Telecom Networks in the Gas Refinery

Authors: Mohamad Bagher Heidari, Hefzollah Mohammadian

Abstract:

New discoveries in materials on the nanometer-length scale are expected to play an important role in addressing ongoing and future challenges in the field of communication. Devices and systems for ultra-high-speed short- and long-range communication links, portable and power-efficient computing devices, high-density memory and logic, ultra-fast interconnects, and autonomous and robust energy scavenging devices for accessing ambient intelligence and needed information will critically depend on the success of next-generation emerging nanomaterials and devices. This article presents some exciting recent developments in nanomaterials that have the potential to play a critical role in the development and transformation of future intelligent communication and telecom networks in the gas refinery. The industry is benefiting from nanotechnology advances, with numerous applications including smarter sensors, logic elements, computer chips, memory storage devices, and optoelectronics.

Keywords: nanomaterial, intelligent communication, nanoscale, nanophotonic, telecom

Procedia PDF Downloads 296
1396 Hierarchical Queue-Based Task Scheduling with CloudSim

Authors: Wanqing You, Kai Qian, Ying Qian

Abstract:

The concept of cloud computing provides users with infrastructure, platform, and software as a service, making those services more accessible to people via the Internet. To better analyze the performance of cloud computing provisioning policies and resource allocation strategies, a toolkit named CloudSim was proposed. With CloudSim, a cloud computing environment can be easily constructed by modelling and simulating cloud computing components such as datacenters, hosts, and virtual machines. A good scheduling strategy is the key to achieving load balance among different machines and to improving the utilization of basic resources. Existing scheduling algorithms may work well in some presumed cases on a single machine; however, they are unable to make the best decisions for an unforeseen future. In a real-world scenario, there are numerous tasks as well as several virtual machines working in parallel. Based on the concept of multiple queues, this paper presents a new scheduling algorithm to schedule tasks with CloudSim by taking into account several parameters: the machines' capacity, the priority of tasks, and the history log.
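
The multi-queue idea can be sketched as priority buckets drained in order, with each dequeued task sent to the currently least-loaded VM; the data structures and tie-breaking below are illustrative, not the paper's algorithm, and the history-log parameter is omitted.

```python
import heapq
from collections import deque

def hierarchical_schedule(tasks, n_vms):
    """Hierarchical multi-queue scheduling sketch: tasks are bucketed
    by priority (0 = highest); the dispatcher drains queues in priority
    order and sends each task to the least-loaded VM, tracked with a
    running-load heap. `tasks` is a list of (name, priority, length)."""
    queues = {}
    for name, priority, length in tasks:
        queues.setdefault(priority, deque()).append((name, length))

    loads = [(0.0, v) for v in range(n_vms)]   # (current load, vm id)
    heapq.heapify(loads)
    placement = {}
    for priority in sorted(queues):            # highest priority first
        q = queues[priority]
        while q:
            name, length = q.popleft()
            load, vm = heapq.heappop(loads)    # least-loaded VM
            placement[name] = vm
            heapq.heappush(loads, (load + length, vm))
    return placement

tasks = [("t1", 1, 4.0), ("t2", 0, 2.0), ("t3", 0, 3.0), ("t4", 1, 1.0)]
# Priority-0 tasks t2 and t3 are placed before priority-1 tasks t1 and t4.
print(hierarchical_schedule(tasks, n_vms=2))
```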

Keywords: hierarchical queue, load balancing, CloudSim, information technology

Procedia PDF Downloads 396
1395 The Development and Testing of a Small Scale Dry Electrostatic Precipitator for the Removal of Particulate Matter

Authors: Derek Wardle, Tarik Al-Shemmeri, Neil Packer

Abstract:

This paper presents a small tube/wire type electrostatic precipitator (ESP). In the ESP's present form, the particle charging and collecting voltages and the airflow rates were individually varied throughout 200 ambient-temperature test runs, ranging from 10 to 30 kV in increments of 5 kV and from 0.5 m/s to 1.5 m/s, respectively. It was repeatedly observed that, at input air velocities of between 0.5 and 0.9 m/s and voltage settings of 20 kV to 30 kV, the collection efficiency remained above 95%. The outcomes of preliminary tests at combustion flue temperatures are, at present, inconclusive, although indications are that there is little or no drop in comparable performance under ideal test conditions. A limited set of similar tests was carried out during which the collecting electrode was grounded, having been disconnected from the static generator. The collecting efficiency fell significantly, and for that reason, this approach was not pursued further. The collection efficiencies during the ambient-temperature tests were determined by mass balance between incoming and outgoing dry PM. The efficiencies of the combustion-temperature runs were determined by analysing the difference in opacity of the flue gas at inlet and outlet compared to a reference light source. In addition, an array of Leit tabs (carbon-coated, electrically conductive adhesive discs) was placed at inlet and outlet for a number of four-day continuous ambient-temperature runs. Analysis of the discs' contamination was carried out using scanning electron microscopy and the ImageJ software, which confirmed collection efficiencies of over 99% and gave unequivocal support to all the previous tests. The average efficiency for these runs was 99.409%. Emissions collected from a woody biomass combustion unit, classified to a diameter of 100 µm, were used in all ambient-temperature test runs apart from two, which collected airborne dust from within the laboratory. Sawdust and wood pellets were chosen for the laboratory and field combustion trials. Video recordings were made of three ambient-temperature test runs in which the smoke from a wood smoke generator was drawn through the precipitator. Although these runs were visual indicators only, with no objective other than display, they provided a strong argument for the device's claimed efficiency, as no emissions were visible at exit when energised. The theoretical performance of ESPs, when applied to the geometry and configuration of the tested model, was compared to the actual performance and shown to be in good agreement with it.
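
The mass-balance efficiency used for the ambient-temperature runs is a one-line calculation; the masses below are illustrative figures chosen to reproduce the reported 99.409% average, not the paper's measurements.

```python
def collection_efficiency(mass_in_mg, mass_out_mg):
    """ESP collection efficiency by mass balance between incoming and
    outgoing dry particulate matter, in percent."""
    return 100.0 * (mass_in_mg - mass_out_mg) / mass_in_mg

# Hypothetical run: 1000 mg of PM enters, 5.91 mg escapes.
print(round(collection_efficiency(1000.0, 5.91), 3))  # → 99.409
```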

Keywords: electrostatic precipitators, air quality, particulate emissions, electron microscopy, ImageJ

Procedia PDF Downloads 231
1394 Performance Analysis of Elliptic Curve Cryptography Using Onion Routing to Enhance the Privacy and Anonymity in Grid Computing

Authors: H. Parveen Begam, M. A. Maluk Mohamed

Abstract:

Grid computing is an environment that allows sharing and coordinated use of diverse resources in dynamic, heterogeneous, and distributed environments using Virtual Organizations (VOs). Security is a critical issue due to the open nature of the wireless channels in grid computing, which requires three fundamental services: authentication, authorization, and encryption. Privacy and anonymity are important factors when communicating over a publicly spanned network like the web. To ensure a high level of security, we explored an extension of onion routing, used with dynamic token exchange along with protection of the privacy and anonymity of individual identities. To improve the performance of encrypting the layers, elliptic curve cryptography is used. Compared to traditional cryptosystems like RSA (Rivest–Shamir–Adleman), ECC (Elliptic Curve Cryptosystem) offers equivalent security with smaller key sizes, which results in faster computations, lower power consumption, and memory and bandwidth savings. This paper presents an estimation of the performance improvements of onion routing using ECC, as well as a comparison graph between the performance levels of RSA and ECC.

Keywords: grid computing, privacy, anonymity, onion routing, ECC, RSA

Procedia PDF Downloads 374
1393 A Two Level Load Balancing Approach for Cloud Environment

Authors: Anurag Jain, Rajneesh Kumar

Abstract:

Cloud computing is the outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation, and task migration. Various parameters for analyzing the performance of a load balancing approach are response time, cost, data processing time, and throughput. This paper demonstrates a two-level load balancer approach that combines the join-idle-queue and join-shortest-queue approaches. The authors have used the CloudAnalyst simulator to test the proposed two-level load balancer approach. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
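
The two-level dispatch decision can be sketched in a few lines: level one applies join-idle-queue (an idle server, if any, wins), and level two falls back to join-shortest-queue. This is a minimal sketch of the combined policy, not the authors' implementation, which also handles migration and CloudAnalyst-specific details.

```python
def two_level_dispatch(queue_lengths):
    """Two-level load-balancer sketch: join-idle-queue first (pick an
    idle server if one exists), then join-shortest-queue as fallback.
    `queue_lengths[i]` is the number of tasks queued at server i;
    returns the index of the chosen server."""
    idle = [i for i, q in enumerate(queue_lengths) if q == 0]
    if idle:
        return idle[0]                 # level 1: JIQ, an idle server wins
    # Level 2: JSQ, pick the server with the shortest queue.
    return min(range(len(queue_lengths)), key=lambda i: queue_lengths[i])

print(two_level_dispatch([3, 0, 2]))   # → 1 (idle server found)
print(two_level_dispatch([3, 1, 2]))   # → 1 (shortest queue)
```

JIQ keeps the common case cheap (no queue-length comparison when an idle server is known), while JSQ preserves balance under heavy load.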

Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling

Procedia PDF Downloads 400
1392 Secure Hashing Algorithm and Advance Encryption Algorithm in Cloud Computing

Authors: Jaimin Patel

Abstract:

Cloud computing is one of the most significant developments in computing technology. It provides flexibility to users, cost effectiveness, location independence, easy maintenance, multitenancy, drastic performance improvements, and increased productivity. On the other hand, there are also major issues, chief among them security. Because cloud resources are shared, it is important to protect users' private data, especially in e-commerce and social networks. In this paper, encryption algorithms such as the Advanced Encryption Standard (AES) are examined with respect to their vulnerabilities, risk of attack, optimal time and complexity management, and comparison with other algorithms based on software implementation. Techniques to improve the performance of AES and to reduce risk are given. Secure Hash Algorithms are likewise examined with respect to their vulnerabilities, software implementations, and risk of attack, and compared with other hashing algorithms, along with the advantages and disadvantages of hashing versus encryption.
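As a small illustration of the hashing side, Python's standard `hashlib` exposes SHA-256; the comments note the avalanche effect and the birthday bound that the abstract's birthday-attack discussion rests on (the sample messages are made up):

```python
import hashlib

msg = b"customer record #1001"
digest = hashlib.sha256(msg).hexdigest()

# A one-character change in the input flips roughly half the output bits
# (the avalanche effect), so the two digests share no useful structure.
digest2 = hashlib.sha256(b"customer record #1000").hexdigest()
assert digest != digest2

# Birthday bound: collisions in an n-bit hash become likely after about
# 2**(n/2) trials, so SHA-256 offers roughly 128-bit collision resistance,
# while a brute-force preimage search faces the full 2**256 space.
assert len(digest) == 64  # 256 bits rendered as 64 hex characters
```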

Keywords: cloud computing, encryption algorithm, secure hashing algorithm, brute force attack, birthday attack, plaintext attack, man-in-the-middle attack

Procedia PDF Downloads 253
1391 MLProxy: SLA-Aware Reverse Proxy for Machine Learning Inference Serving on Serverless Computing Platforms

Authors: Nima Mahmoudi, Hamzeh Khazaei

Abstract:

Serving machine learning inference workloads on the cloud is still a challenging task at the production level. Optimally configuring an inference workload to meet SLA requirements while minimizing infrastructure cost is highly complicated due to the complex interaction between batch configuration, resource configuration, and a variable arrival process. Serverless computing has emerged in recent years to automate most infrastructure management tasks. Workload batching has shown potential to improve the response time and cost-effectiveness of machine learning serving workloads, but it is not yet supported out of the box by serverless computing platforms. Our experiments have shown that, for various machine learning workloads, batching can substantially improve the system's efficiency by reducing the processing overhead per request. In this work, we present MLProxy, an adaptive reverse proxy to support efficient machine learning serving workloads on serverless computing systems. MLProxy supports adaptive batching to ensure SLA compliance while optimizing serverless costs. We performed rigorous experiments on Knative to demonstrate the effectiveness of MLProxy, showing that it can reduce the cost of serverless deployment by up to 92% while reducing SLA violations by up to 99%, results that can be generalized across state-of-the-art model serving frameworks.
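The core trade-off of adaptive batching, dispatching when the batch is full or when waiting any longer would break the oldest request's SLA, can be sketched as below. This is our simplified reading, not MLProxy's actual API; all names and thresholds are illustrative:

```python
from collections import deque

class AdaptiveBatcher:
    """Hypothetical sketch of SLA-aware batching: accumulate requests and
    dispatch when the batch is full OR the oldest request risks missing
    its deadline once per-batch processing overhead is accounted for."""

    def __init__(self, max_batch=8, sla_seconds=0.200, per_batch_overhead=0.050):
        self.max_batch = max_batch
        self.sla = sla_seconds
        self.overhead = per_batch_overhead
        self.pending = deque()  # (arrival_time, request)

    def submit(self, request, now):
        self.pending.append((now, request))
        return self.maybe_dispatch(now)

    def maybe_dispatch(self, now):
        if not self.pending:
            return None
        oldest_arrival, _ = self.pending[0]
        full = len(self.pending) >= self.max_batch
        # Dispatch early if waiting longer would break the oldest SLA.
        deadline_near = now - oldest_arrival + self.overhead >= self.sla
        if full or deadline_near:
            batch = [req for _, req in self.pending]
            self.pending.clear()
            return batch
        return None

b = AdaptiveBatcher(max_batch=3)
assert b.submit("r1", now=0.00) is None
assert b.submit("r2", now=0.01) is None
assert b.submit("r3", now=0.02) == ["r1", "r2", "r3"]  # batch is full
```

Larger batches amortize per-request overhead (lowering cost), while the deadline check bounds queuing delay, which mirrors the cost-versus-SLA balance the abstract describes.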

Keywords: serverless computing, machine learning, inference serving, Knative, Google Cloud Run, optimization

Procedia PDF Downloads 140
1390 Investigations on Geopolymer Concrete Slabs

Authors: Akhila Jose

Abstract:

The cement industry is one of the major contributors to global warming due to the release of greenhouse gases. The primary binder in conventional concrete is Ordinary Portland Cement (OPC), billions of tons of which are produced annually all over the world. An alternative binding material to OPC is needed to reduce the environmental impact of the cement manufacturing process. Geopolymer concrete is an ideal material to substitute for cement-based binders. A geopolymer is an inorganic alumino-silicate polymer. Geopolymer Concrete (GPC) is formed by the polymerization of aluminates and silicates produced by the reaction of solid aluminosilicates with alkali hydroxides or alkali silicates. Various industrial by-products, such as Fly Ash (FA), Rice Husk Ash (RHA), Ground Granulated Blast Furnace Slag (GGBFS), Silica Fume (SF), and Red Mud (RM), are rich in aluminates and silicates. Using by-products from other industries reduces carbon dioxide emissions, providing a sustainable way to lower greenhouse gas emissions and to dispose of the huge volumes of waste generated by major industries such as thermal and steel plants. Earlier research on geopolymers focused on heat-cured, fly-ash-based precast members, which limited their applications: the heat-curing process is cumbersome and costly, even though the resulting members possess high compressive strength, low drying shrinkage and creep, and good resistance to sulphate and acid environments. Developing GPC with strength and durability characteristics comparable to OPC concrete under ambient-cured conditions would make it a sustainable alternative for the future. In this paper, an attempt has been made to review and compare the feasibility of ambient-cured GPC against heat-cured GPC with respect to strength and serviceability characteristics. Variations in the behavior of structural members are also reviewed to identify research gaps for the future development of ambient-cured geopolymer concrete. The comparison and analysis of studies showed that GPC, most importantly the ambient-cured type, behaves comparably to OPC-based concrete in terms of strength and durability criteria.

Keywords: geopolymer concrete, oven heated, durability properties, mechanical properties

Procedia PDF Downloads 160
1389 Oxalate Content of Raw and Cooked Amaranth and Strawberry Spinach Grown in an Elevated CO₂ Atmosphere

Authors: Madhuri Kanala, Geoffrey Savage

Abstract:

Worldwide CO₂ levels are slowly rising, and this may affect the growth and nutritional composition of many food plants. The production of secondary metabolites such as oxalates has not been investigated in depth. The oxalate content of many food plants is known to have adverse nutritional effects on humans, so a reduction in the oxalate content of food plants is a very positive development. Recent studies have shown that the oxalate content of the leaves of spinach and silver beet was reduced when the plants were grown in an environment with elevated CO₂. The response of amaranth and strawberry spinach leaves to a high-CO₂ environment has not been studied, though the plants are known to contain appreciable amounts of oxalate. A study was conducted in which amaranth and strawberry spinach plants were grown in identical plant growth chambers under the same environmental conditions, except that one chamber was supplied with ambient air (CO₂ 405 ppm) while the other had its CO₂ level increased to 650 ppm. Following 6 weeks of growth, the total and soluble oxalate contents of the leaves of raw and cooked amaranth and strawberry spinach were determined by HPLC, and calcium levels were determined using ICP. The total oxalate content of the fresh leaves of amaranth and strawberry spinach was reduced by 29.5% and 24.6%, respectively, in the plants grown under increased CO₂ compared to ambient conditions. The soluble oxalate content of amaranth leaves grown under ambient and increased CO₂ conditions was further reduced by 42% and 26.8%, respectively, following cooking, as the soluble oxalate leached into the cooking water, which was discarded. The reduction of oxalate and calcium levels in raw and cooked amaranth and strawberry spinach leaves following an increase in atmospheric CO₂ is an interesting positive response to an otherwise significant environmental problem.

Keywords: amaranth, calcium oxalate, enriched CO₂, oxalates, strawberry spinach

Procedia PDF Downloads 166
1388 Effects of Environmental Parameters on Salmonella Contaminated in Harvested Oysters (Crassostrea lugubris and Crassostrea belcheri)

Authors: Varangkana Thaotumpitak, Jarukorn Sripradite, Saharuetai Jeamsripong

Abstract:

Environmental contamination from wastewater discharges originating from anthropogenic activities leads to the accumulation of enteropathogenic bacteria in aquatic animals, especially oysters, and in shellfish harvesting areas. The consumption of raw or partially cooked oysters can pose a risk of seafood-borne disease in humans. This study aimed to evaluate the relationship between the presence of Salmonella in oyster meat samples and environmental factors (ambient air temperature, relative humidity, gust wind speed, average wind speed, tidal condition, precipitation, and season) using principal component analysis (PCA). One hundred and forty-four oyster meat samples were collected from four oyster harvesting areas in Phang Nga province, Thailand, from March 2016 to February 2017. The prevalence of Salmonella in oyster meat ranged from 25.0% to 36.11% across sites. The PCA results showed that ambient air temperature, relative humidity, and precipitation were the main factors correlated with Salmonella detection in these oysters. A positive relationship was observed between Salmonella-positive oysters and both relative humidity (PC1 = 0.413) and precipitation (PC1 = 0.607), while a negative association was found with ambient air temperature (PC1 = 0.338). These results suggest that lower temperatures, higher precipitation, and higher relative humidity may promote Salmonella contamination of oyster meat. During such high-risk periods, harvesting of oysters should be prohibited to reduce pathogenic bacterial contamination and to minimize the salmonellosis hazard to humans.
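PCA of the kind used here can be reproduced with a covariance eigendecomposition of standardized variables; the environmental readings below are invented placeholders, not the study's data:

```python
import numpy as np

# Toy matrix of environmental observations: columns are air temperature
# (deg C), relative humidity (%), and precipitation (mm). Values are made up.
X = np.array([
    [30.1, 78.0, 12.0],
    [31.5, 70.0,  2.0],
    [28.0, 85.0, 40.0],
    [29.2, 80.0, 25.0],
    [32.0, 65.0,  0.0],
])

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable
cov = np.cov(Z, rowvar=False)              # covariance of standardized data
eigvals, eigvecs = np.linalg.eigh(cov)     # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]          # reorder to descending variance
loadings = eigvecs[:, order]               # column 0 holds the PC1 loadings
explained = eigvals[order] / eigvals.sum() # fraction of variance per component
```

The signs and magnitudes of the PC1 loadings play the same interpretive role as the PC1 coefficients quoted in the abstract: variables loading with the same sign vary together along the dominant axis.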

Keywords: oyster, Phang Nga Bay, principal component analysis, Salmonella

Procedia PDF Downloads 113
1387 An Intelligent Thermal-Aware Task Scheduler in Multiprocessor System on a Chip

Authors: Sina Saadati

Abstract:

Multiprocessor Systems-on-Chip (MPSoCs) are used widely in modern computers to execute sophisticated software and applications. These systems include different processors for distinct purposes. Most proposed task schedulers attempt to improve energy consumption; in some, the processor's temperature is considered in order to increase the system's reliability and performance. In this research, we propose a new method for thermal-aware task scheduling based on an artificial neural network (ANN). This method enables us to consider a variety of factors in the scheduling process. Factors such as ambient temperature, season (which is important for some embedded systems), processor speed, and the computational type of the tasks have a complex relationship with the final temperature of the system. This issue can be addressed using a machine learning algorithm. Another point is that our solution makes the system intelligent, so that it can be adaptive. We also show that the computational complexity of the proposed method is low; as a consequence, it is also suitable for battery-powered systems.
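A hypothetical sketch of the policy: a tiny feedforward network (untrained, with random weights, standing in for the paper's ANN) predicts the resulting temperature for each core, and the scheduler assigns the task to the coolest prediction. The feature set and sizes are our assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def temp_model(features, W1, b1, W2, b2):
    # One-hidden-layer network with ReLU activation; a stand-in for the
    # trained ANN temperature predictor described in the abstract.
    h = np.maximum(0.0, features @ W1 + b1)
    return h @ W2 + b2

# Assumed inputs per core: ambient temperature, core speed, current load,
# and a task-type code. Weights are random here; a real system would train.
n_features, hidden = 4, 8
W1 = rng.normal(size=(n_features, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(size=hidden);               b2 = 0.0

def schedule(task_features_per_core):
    # Thermal-aware rule: place the task on the core whose predicted
    # post-assignment temperature is lowest.
    preds = [temp_model(f, W1, b1, W2, b2) for f in task_features_per_core]
    return int(np.argmin(preds))

cores = [rng.normal(size=n_features) for _ in range(4)]
chosen = schedule(cores)
```

Because prediction is a fixed number of small matrix multiplications, each scheduling decision is constant-time, which is consistent with the abstract's claim of low computational complexity.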

Keywords: task scheduling, MPSoC, artificial neural network, machine learning, architecture of computers, artificial intelligence

Procedia PDF Downloads 75
1386 A Study on the Effectiveness of Alternative Commercial Ventilation Inlets That Improve Energy Efficiency of Building Ventilation Systems

Authors: Brian Considine, Aonghus McNabola, John Gallagher, Prashant Kumar

Abstract:

Passive air pollution control devices known as aspiration efficiency reducers (AER) have been developed using aspiration efficiency (AE) concepts. Their purpose is to reduce the concentration of particulate matter (PM) drawn into a building air handling unit (AHU) through alterations in the inlet design, thereby reducing energy consumption. In this paper, an examination is conducted into the effect of installing a deflector system around an AER-AHU inlet for both forward- and rear-facing orientations relative to the wind. The study found that these deflectors are an effective passive control method for reducing AE at various ambient wind speeds over a range of microparticles of varying diameter. At low ambient wind speeds, the deflector system induced a large wake zone for a rear-facing AER-AHU, resulting in significantly lower AE than without deflectors. As the wind speed increased, both configurations contained a wake zone, but concentration gradients were much lower with the deflectors. For the forward-facing models, the deflector system was preferred at low ambient wind speeds and higher Stokes numbers, with negligible difference as the Stokes number decreased; similarly, there was no significant difference at higher wind speeds across the Stokes number range tested. The results demonstrate that a deflector system is a viable passive control method for reducing ventilation energy consumption.
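The Stokes number that parameterizes these results is commonly defined as Stk = ρ_p d_p² U / (18 μ L), the ratio of a particle's inertial response time to the flow's characteristic time; high-Stk particles cannot follow streamlines into the inlet. The numerical values below are illustrative, not taken from the paper:

```python
def stokes_number(rho_p, d_p, velocity, mu, length):
    """Stk = rho_p * d_p**2 * U / (18 * mu * L).

    rho_p    particle density (kg/m^3)
    d_p      particle diameter (m)
    velocity characteristic flow speed (m/s)
    mu       dynamic viscosity of the fluid (Pa*s)
    length   characteristic length scale of the inlet (m)
    """
    return rho_p * d_p**2 * velocity / (18.0 * mu * length)

# Illustrative case: unit-density 10 micron particle in a 5 m/s wind,
# air viscosity ~1.81e-5 Pa*s, and an assumed 0.5 m inlet length scale.
stk = stokes_number(rho_p=1000.0, d_p=10e-6, velocity=5.0,
                    mu=1.81e-5, length=0.5)
```

Stk << 1 means the particle tracks the flow closely, which is why deflector effects fade at low Stokes numbers: all particles simply follow the diverted streamlines.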

Keywords: air handling unit, air pollution, aspiration efficiency, energy efficiency, particulate matter, ventilation

Procedia PDF Downloads 100
1385 Knowledge Reactor: A Contextual Computing Work in Progress for Eldercare

Authors: Scott N. Gerard, Aliza Heching, Susann M. Keohane, Samuel S. Adams

Abstract:

The worldwide population of people over 60 years of age is growing rapidly. This explosion is placing increasingly onerous demands on individual families, multiple industries, and entire countries. Current, human-intensive approaches to eldercare are not sustainable, but IoT and AI technologies can help. The Knowledge Reactor (KR) is a contextual data fusion engine built to address this and similar problems. It fuses and centralizes IoT and System of Record/Engagement data into a reactive knowledge graph, and cognitive applications and services are constructed with its multi-agent architecture. The KR can scale up and down because it exploits container-based, horizontally scalable services for graph storage (JanusGraph) and pub-sub (Kafka). While the KR can be applied to many domains that require IoT and AI technologies, this paper describes how it supports the challenging domain of cognitive eldercare. Rule- and machine-learning-based analytics infer activities of daily living from IoT sensor readings. KR scalability, adaptability, flexibility, and usability are demonstrated.
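The pub-sub backbone can be illustrated with an in-process stand-in for Kafka; the topic name and the missed-meal rule are invented examples of the rule-based analytics the paper describes, not the KR's actual interfaces:

```python
from collections import defaultdict

class MiniBus:
    """In-process stand-in for the pub-sub backbone (the KR uses Kafka):
    analytic agents subscribe to topics and react to fused sensor events."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.subscribers[topic]:
            handler(event)

bus = MiniBus()
alerts = []

# A toy rule-based analytic: infer a possible missed meal when the kitchen
# motion sensor reports no activity since morning. Field names are invented.
def meal_rule(event):
    if event["sensor"] == "kitchen_motion" and event["count_since_8am"] == 0:
        alerts.append("possible missed meal")

bus.subscribe("iot.readings", meal_rule)
bus.publish("iot.readings", {"sensor": "kitchen_motion", "count_since_8am": 0})
```

Decoupling producers (sensors) from consumers (analytic agents) through topics is what lets the KR scale each side horizontally and independently.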

Keywords: ambient sensing, AI, artificial intelligence, eldercare, IoT, internet of things, knowledge graph

Procedia PDF Downloads 150