Search results for: secure cloud computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1962

1692 IoT Based Information Processing and Computing

Authors: Mannan Ahmad Rasheed, Sawera Kanwal, Mansoor Ahmad Rasheed

Abstract:

The Internet of Things (IoT) has revolutionized the way we collect and process information, making it possible to gather data from a wide range of connected devices and sensors. This has led to the development of IoT-based information processing and computing systems that are capable of handling large amounts of data in real time. This paper provides a comprehensive overview of the current state of IoT-based information processing and computing, as well as the key challenges and gaps that need to be addressed. This paper discusses the potential benefits of IoT-based information processing and computing, such as improved efficiency, enhanced decision-making, and cost savings. Despite the numerous benefits of IoT-based information processing and computing, several challenges need to be addressed to realize the full potential of these systems. These challenges include security and privacy concerns, interoperability issues, scalability and reliability of IoT devices, and the need for standardization and regulation of IoT technologies. Moreover, this paper identifies several gaps in the current research related to IoT-based information processing and computing. One major gap is the lack of a comprehensive framework for designing and implementing IoT-based information processing and computing systems.

Keywords: IoT, computing, information processing, IoT computing

Procedia PDF Downloads 145
1691 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

This paper highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, “reskilling and upskilling” employees, and establishing robust data management training programs are an integral part of this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in an ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud computing, data optimization

Procedia PDF Downloads 40
1690 Cloud-Based Mobile-to-Mobile Computation Offloading

Authors: Ebrahim Alrashed, Yousef Rafique

Abstract:

Mobile devices have drastically changed the way we do things on the move. They are heavily relied on to perform tasks analogous to desktop computer capabilities. There has been a rapid increase in the computational power of these devices; however, battery technology remains the bottleneck of this evolution. The primary present-day approach to tackling this issue is offloading computation to the cloud, which proves to be latency-expensive and requires high network bandwidth. In this paper, we explore efforts to perform barter-based mobile-to-mobile offloading. We define a protocol and present an architecture to facilitate the development of such a system. We further highlight the deployment and security challenges.
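
As a rough illustration of the barter idea, a minimal sketch is given below; the message fields, credit rule, and peer API are illustrative assumptions, not the protocol defined in the paper, and sandboxing, peer discovery, and result verification are omitted.

```python
# Sketch of a barter-based mobile-to-mobile offloading exchange (all names
# and the credit rule are hypothetical, for illustration only).
from dataclasses import dataclass

@dataclass
class OffloadRequest:
    task_id: str
    payload: bytes        # serialized task: code reference + input data
    est_cycles: int       # advertised computational cost of the task

@dataclass
class Peer:
    name: str
    credits: int = 0      # barter balance earned by computing for others

    def execute(self, req: OffloadRequest) -> bytes:
        # Placeholder for sandboxed execution of the offloaded task.
        self.credits += req.est_cycles          # earn credit for work performed
        return str(hash(req.payload)).encode()  # stand-in for the real result

def offload(requester: "Peer", worker: "Peer", req: OffloadRequest) -> bytes:
    requester.credits -= req.est_cycles         # spend credit to have work done
    return worker.execute(req)

a, b = Peer("phone-A"), Peer("phone-B")
result = offload(a, b, OffloadRequest("t1", b"frame-filter(img.bin)", est_cycles=10))
print(result, a.credits, b.credits)             # requester pays, worker earns
```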

Keywords: computational offloading, power conservation, cloud, sandboxing

Procedia PDF Downloads 362
1689 Analyzing the Quality of Cloud-Based E-Learning Systems on the Perception of the Learners and the Teachers

Authors: R. W. C. Devindi, S. M. Buddika Harshanath

Abstract:

E-learning is a widely used technology for learning in the modern world, and its popularity has increased considerably with the pandemic. E-learning systems usually require both software and hardware resources, which most educational institutions find hard to afford, and with massive user loads they must also expand their server-side resources. Cloud computing has therefore been adopted to make e-learning systems more efficient. This study analyzes the quality of cloud-based e-learning systems as perceived by learners and teachers through hypothesis testing, and presents the results and discussion in this report, so that future research can take steps to further improve the quality of online learning systems. In the case of e-learning, quality assurance and cost-effectiveness are essential. A complex quality assurance system is used in the stated project, and there are no well-defined standard evaluation measures in this field; as a result, accurately assessing an e-learning system's overall quality is challenging. The analysis was carried out with the aid of standard statistical methods and software.

Keywords: LMS (learning management system), SPSS (Statistical Package for the Social Sciences), eigenvalue, hypothesis

Procedia PDF Downloads 83
1688 Comparison of Security Challenges and Issues of Mobile Computing and Internet of Things

Authors: Aabiah Nayeem, Fariha Shafiq, Mustabshra Aftab, Rabia Saman Pirzada, Samia Ghazala

Abstract:

In this modern era of technology, the concept of the Internet of Things is popular in every domain. It is a widely distributed system of things in which data collected from sensory devices are transmitted, analyzed locally or collectively, and then broadcast to a network where action can be taken remotely via mobile or web apps. Today's mobile computing is also gaining importance, as services are provided while users are on the move. Through mobile computing, data are transmitted via computer without being physically connected to a fixed point. The challenge is to provide services with high speed and security, and the data gathered from mobile devices must be processed securely. Mobile computing is strongly influenced by the Internet of Things. In this paper, we discuss the security issues and challenges of the Internet of Things and mobile computing and compare the two on the basis of their similarities and dissimilarities.

Keywords: embedded computing, internet of things, mobile computing, wireless technologies

Procedia PDF Downloads 289
1687 Mobile Cloud Application in Design Build Bridge Construction

Authors: Meng Sun, Bin Wei

Abstract:

In the past decades, design-build has become a popular project delivery system, especially for large-scale infrastructure projects in North America. It provides a one-stop shop for the client, thereby improving construction efficiency and reducing risks and overall cost. Compared to projects with traditional delivery methods, a design-build project requires the contractor and designer to work together efficiently to deliver best-value solutions throughout the construction process. How well the integration and interaction between contractor and designer are facilitated often affects the schedule, budget, and quality of construction and therefore becomes a key factor in the success of a design-build project. This paper presents a concept of using modern mobile cloud technology to provide an integrated solution during design-build construction. It uses a mobile cloud architecture to provide a platform for real-time field progress, change request approval, job progress logs, and project time entry, with device integration for field information and communications. The paper uses a real field change notice as an example to demonstrate how mobile cloud technology applies in a design-build project and how it can improve project efficiency.

Keywords: cloud, design-build, field change notice, mobile application

Procedia PDF Downloads 218
1686 Open Source Cloud Managed Enterprise WiFi

Authors: James Skon, Irina Beshentseva, Michelle Polak

Abstract:

WiFi solutions come in two major classes. The first is Small Office/Home Office (SOHO) WiFi, characterized by inexpensive WiFi routers with one or two service set identifiers (SSIDs) and a single shared passphrase. These access points provide no significant user management or monitoring, and no aggregation of monitoring and control across multiple routers. The other class is managed enterprise WiFi solutions, which involve expensive Access Points (APs) along with (also costly) local or cloud-based management components. These solutions typically provide portal-based login, per-user virtual local area networks (VLANs), and sophisticated monitoring and control across a large group of APs. The cost of deploying and managing such enterprise solutions is typically about tenfold that of inexpensive consumer APs. Low-revenue organizations, such as schools, non-profits, non-governmental organizations (NGOs), small businesses, and even homes, cannot easily afford quality enterprise WiFi solutions, though they may need to provide quality WiFi access to their population. Relying on available lower-cost WiFi solutions can significantly reduce their ability to provide reliable, secure network access. This project explored and created a new approach to providing secure managed enterprise WiFi based on low-cost hardware combined with both new and existing (but modified) open source software. The solution provides a cloud-based management interface that allows organizations to aggregate the configuration and management of small, medium, and large WiFi deployments. It utilizes a novel approach to user management, giving each user a unique passphrase. It provides unlimited SSIDs across an unlimited number of WiFi zones and the ability to place each user (and all their devices) on their own VLAN. With proper configuration it can even provide user-local services. It also allows users' usage and quality of service to be monitored, and users to be added, enabled, and disabled at will. As inferred above, the ultimate goal is to free organizations with limited resources from the expense of commercial enterprise WiFi while providing them with most of the qualities of such a more expensive managed solution at a fraction of the cost.
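
To illustrate the per-user passphrase idea, the sketch below maps each unique passphrase to a user and VLAN; the table layout, values, and function names are assumptions for illustration, not the project's actual code.

```python
# Illustrative per-user passphrase -> VLAN lookup, the core of giving each
# user a unique passphrase and an isolated VLAN (hypothetical data).
users = {
    # passphrase: (username, vlan_id, enabled)
    "blue-falcon-42": ("alice", 101, True),
    "green-otter-17": ("bob",   102, True),
    "red-badger-99":  ("carol", 103, False),   # disabled account
}

def authorize(passphrase: str):
    """Return the user's VLAN assignment, or None to reject the client."""
    entry = users.get(passphrase)
    if entry is None or not entry[2]:
        return None                      # unknown passphrase or disabled user
    username, vlan_id, _ = entry
    return {"user": username, "vlan": vlan_id}

print(authorize("blue-falcon-42"))       # {'user': 'alice', 'vlan': 101}
print(authorize("red-badger-99"))        # None (user disabled)
```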

Keywords: wifi, enterprise, cloud, managed

Procedia PDF Downloads 58
1685 VCloud: A Security Framework for VANET

Authors: Wiseborn Manfe Danquah, D. Turgay Altilar

Abstract:

Vehicular Ad-hoc Network (VANET) is an integral component of Intelligent Transport Systems (ITS) that has enjoyed a lot of attention from the research community and the automotive industry, mainly due to the opportunities and challenges it presents. Vehicular Ad-hoc Networks, being a class of Mobile Ad-hoc Networks (MANETs), have all the security concerns existing in traditional MANETs as well as new security and privacy concerns introduced by the unique vehicular communication environment. This paper provides a survey of the possible attacks in the vehicular environment, as well as security and privacy concerns in VANET. It also provides insight into the development of a comprehensive cloud framework to provide more robust and secure communication among vehicular nodes and roadside units. Our proposal, a Metropolitan-Based Public Interconnected Vehicular Cloud (MIVC) infrastructure, seeks to provide a more reliable and secure vehicular communication network.

Keywords: mobile Ad-hoc networks, vehicular ad hoc network, cloud, ITS, road side units (RSU), metropolitan interconnected vehicular cloud (MIVC)

Procedia PDF Downloads 323
1684 Enhancing Security and Privacy Protocols in Telehealth: A Comprehensive Approach across IoT/Fog/Cloud Environments

Authors: Yunyong Guo, Man Wang, Bryan Guo, Nathan Guo

Abstract:

This paper introduces an advanced security and privacy model tailored to telehealth systems, emphasizing end-to-end protection across IoT, fog, and cloud components. The proposed model integrates encryption, key management, intrusion detection, and privacy-preserving measures to safeguard patient data. A comprehensive simulation study evaluates the model's effectiveness in scenarios such as unauthorized access, physical breaches, and insider threats. Results indicate notable success in detecting and mitigating threats, yet underscore areas for refinement. The study contributes insights into the intricate balance between security and usability in telehealth environments, setting the stage for continued advancements.
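
As a minimal sketch of the encryption component only, assuming authenticated encryption with AES-GCM from the Python `cryptography` package; the field names and payload are illustrative, and the paper's key-management, intrusion-detection, and privacy-preserving layers are out of scope here.

```python
# Protect a sensor reading between an IoT device and a fog/cloud tier with
# AES-GCM; key distribution/rotation is assumed to be handled elsewhere.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # provided by the key-management layer
aead = AESGCM(key)

def protect(reading: bytes, patient_id: bytes):
    nonce = os.urandom(12)                                   # unique per message
    return nonce, aead.encrypt(nonce, reading, patient_id)   # patient_id bound as AAD

def unprotect(nonce: bytes, ciphertext: bytes, patient_id: bytes) -> bytes:
    return aead.decrypt(nonce, ciphertext, patient_id)       # raises if tampered

nonce, ct = protect(b'{"hr": 72, "spo2": 98}', b"patient-001")
print(unprotect(nonce, ct, b"patient-001"))
```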

Keywords: cloud, enhancing security, fog, IoT, telehealth

Procedia PDF Downloads 35
1683 MLOps: Scaling Machine Learning Lifecycle in an Industrial Setting

Authors: Yizhen Zhao, Adam S. Z. Belloum, Goncalo Maia Da Costa, Zhiming Zhao

Abstract:

Machine learning has evolved from an area of academic research to a real-world applied field. This change comes with challenges: gaps and differences exist between common practices in academic environments and those in production environments. Following the continuous integration, development, and delivery practices of software engineering, similar trends have emerged in machine learning (ML) systems under the name MLOps. In this paper we propose a framework that helps to streamline and introduce best practices that facilitate the ML lifecycle in an industrial setting. This framework can be used as a template and customized to implement various machine learning experiments. The proposed framework is modular and can be recomposed to adapt to various use cases (e.g., data versioning, remote training in the cloud). The framework inherits practices from DevOps and introduces other practices that are unique to machine learning systems (e.g., data versioning). Our MLOps practices automate the entire machine learning lifecycle and bridge the gap between development and operations.
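
As a small sketch of one practice named above, data versioning, the snippet below tags each training run with a content hash of its dataset so a model can be traced back to the exact data it saw; the file paths and registry format are illustrative assumptions, not the proposed framework itself.

```python
# Content-addressed dataset versioning plus a minimal run registry (sketch).
import datetime
import hashlib
import json
import pathlib

def dataset_version(path: str) -> str:
    """Short content-hash tag identifying a dataset file."""
    digest = hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()
    return digest[:12]

def log_run(registry: str, data_path: str, params: dict, metrics: dict) -> None:
    """Append one experiment record linking params, data version, and metrics."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "data_version": dataset_version(data_path),
        "params": params,
        "metrics": metrics,
    }
    with open(registry, "a") as fh:
        fh.write(json.dumps(record) + "\n")

# Example usage inside a training pipeline step (paths are hypothetical):
# log_run("runs.jsonl", "data/train.csv", {"lr": 0.01}, {"val_acc": 0.93})
```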

Keywords: cloud computing, continuous development, data versioning, DevOps, industrial setting, MLOps

Procedia PDF Downloads 230
1682 Observationally Constrained Estimates of Aerosol Indirect Radiative Forcing over Indian Ocean

Authors: Sofiya Rao, Sagnik Dey

Abstract:

Aerosol-cloud-precipitation interaction continues to be one of the largest sources of uncertainty in quantifying aerosol climate forcing, and the uncertainty increases from the global to the regional scale. This problem remains unresolved due to the large discrepancies in the representation of cloud processes in climate models. Most studies of aerosol-cloud-climate and aerosol-cloud-precipitation interactions over the Indian Ocean (e.g., the INDOEX and CAIPEEX campaigns) are restricted to a particular season or a particular region. Here we develop a theoretical framework to quantify aerosol indirect radiative forcing using 15 years (2000-2015) of Moderate Resolution Imaging Spectroradiometer (MODIS) aerosol and cloud products over the Indian Ocean. The framework relies on an observationally constrained estimate of the aerosol-induced change in cloud albedo. We partition the change in cloud albedo into the changes in Liquid Water Path (LWP) and cloud Effective Radius (Reff) in response to a change in aerosol optical depth (AOD). The cloud albedo response to an increase in AOD is most sensitive for LWP between 120 and 300 g/m² and Reff between 8 and 24 micrometers. Using this framework, the aerosol forcing during a transition from the indirect to the semi-direct effect is also calculated. The analysis gives the best results over the Arabian Sea, in comparison with the Bay of Bengal and the South Indian Ocean, because of the heterogeneity of aerosol species over the Arabian Sea: more absorbing aerosols dominate during the winter season, while dust (coarse-mode aerosol particles) dominates during the pre-monsoon. In winter and pre-monsoon, aerosol forcing dominates, while during the monsoon and post-monsoon seasons, meteorological forcing dominates. Over the South Indian Ocean, largely the same type of aerosol (sea salt) is present. Over the Arabian Sea, the aerosol indirect radiative forcing is about -5 ± 4.5 W/m² in the winter season and weaker in the other seasons. The results provide observationally constrained estimates of aerosol indirect forcing in the Indian Ocean, which can be helpful in evaluating climate model performance in the context of such complex interactions.
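
The partitioning described above can be sketched as follows, using the standard two-stream cloud albedo approximation and the usual relation between cloud optical thickness, LWP, and Reff; the study's exact notation and coefficients may differ:

\[
\frac{dA_c}{d\ln \tau_a} \;=\; \frac{\partial A_c}{\partial \ln \mathrm{LWP}}\,\frac{d\ln \mathrm{LWP}}{d\ln \tau_a} \;+\; \frac{\partial A_c}{\partial \ln r_{\mathrm{eff}}}\,\frac{d\ln r_{\mathrm{eff}}}{d\ln \tau_a},
\qquad
A_c \approx \frac{\tau_c}{\tau_c + 7.7},
\qquad
\tau_c \approx \frac{3\,\mathrm{LWP}}{2\,\rho_w\, r_{\mathrm{eff}}},
\]

where \(A_c\) is cloud albedo, \(\tau_c\) cloud optical thickness, \(\tau_a\) AOD, and \(\rho_w\) the density of liquid water, so that an AOD-driven increase in LWP raises \(A_c\) while an increase in \(r_{\mathrm{eff}}\) lowers it.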

Keywords: aerosol-cloud-precipitation interaction, aerosol-cloud-climate interaction, indirect radiative forcing, climate model

Procedia PDF Downloads 142
1681 Internet of Things, Edge and Cloud Computing in Rock Mechanical Investigation for Underground Surveys

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Rock mechanical investigation is one of the most crucial activities in underground operations, especially in surveys related to hydrocarbon exploration and production, geothermal reservoirs, energy storage, mining, and geotechnics. There is a wide range of traditional methods for deriving, collecting, and analyzing rock mechanics data; however, these approaches may not be suitable or work well in some situations, such as fractured zones. Cutting-edge technologies have been introduced to solve and optimize these issues. The Internet of Things (IoT), Edge Computing (ECt), and Cloud Computing (CCt) are among the newest and most widely used artificial intelligence methods employed for geomechanical studies. IoT devices act as sensors and cameras for real-time monitoring and mechanical-geological data collection of rocks, such as temperature, movement, pressure, or stress levels. Assessment of structural integrity, especially for cap rocks within hydrocarbon systems, and of rock mass behavior, in support of further activities such as enhanced oil recovery (EOR) and underground gas storage (UGS) or to improve safety risk management (SRM) and potential hazard identification (PHI), are other benefits of IoT technologies. Edge computing techniques can process, aggregate, and analyze the data collected by IoT devices immediately and in real time, providing detailed insights into the behavior of rocks in various situations (e.g., stress, temperature, and pressure), establishing patterns quickly, and detecting trends. This state-of-the-art and useful technology can therefore support autonomous systems in rock mechanical surveys, such as drilling and production (in hydrocarbon wells) or excavation (in the mining and geotechnics industries). Besides, ECt allows all rock-related operations to be controlled remotely and enables operators to apply changes or make adjustments; this feature is very important for environmental goals. More often than not, rock mechanical studies consist of different data, such as laboratory tests, field operations, and indirect information like seismic or well-logging data. CCt provides a useful platform for storing and managing a large volume of diverse information, which can be very useful in fractured zones. Additionally, CCt supplies powerful tools for predicting, modeling, and simulating rock mechanical information, especially in fractured zones within vast areas, and it is a suitable medium for sharing extensive information on rock mechanics, such as the direction and size of fractures in a large oil field or mine. The comprehensive review findings demonstrate that digital transformation through integrated IoT, Edge, and Cloud solutions is revolutionizing traditional rock mechanical investigation. These advanced technologies have empowered real-time monitoring, predictive analysis, and data-driven decision-making, culminating in noteworthy enhancements in safety, efficiency, and sustainability. By employing IoT, CCt, and ECt, underground operations have experienced a significant boost, allowing for timely and informed actions based on real-time data insights, and their successful implementation has led to safer, optimized processes and environmentally conscious approaches in underground geological endeavors.

Keywords: rock mechanical studies, internet of things, edge computing, cloud computing, underground surveys, geological operations

Procedia PDF Downloads 24
1680 Aerosol - Cloud Interaction with Summer Precipitation over Major Cities in Eritrea

Authors: Samuel Abraham Berhane, Lingbing Bu

Abstract:

This paper presents the spatiotemporal variability of aerosols, clouds, and precipitation within the major cities in Eritrea and investigates the relationship between aerosols, clouds, and precipitation over the study region. In Eritrea, inadequate water supplies will have both direct and indirect adverse impacts on sustainable development in areas such as health, agriculture, energy, communication, and transport. In addition, there is a gap in knowledge on suitable and potential areas for cloud seeding, and the inadequate understanding of aerosol-cloud-precipitation (ACP) interactions limits the success of weather modification aimed at improving freshwater sources, storage, and recycling. The spatiotemporal variability of aerosols, clouds, and precipitation is examined through spatial and time-series analysis based on trend and anomaly analysis, and a correlation coefficient is used to find the relationship between aerosols and clouds. The spatiotemporal analysis showed larger variations of aerosols within the last two decades, especially in Assab, indicating that aerosol optical depth (AOD) has increased over the surrounding Red Sea region. Rainfall was significantly low while AOD was significantly high during the 2011 monsoon season, whereas precipitation was high during 2007 over most parts of Eritrea. The correlation coefficient between AOD and rainfall was negative over Asmara and Nakfa. Cloud effective radius (CER) and cloud optical thickness (COT) exhibited a negative correlation with AOD over Nakfa within the June-July-August (JJA) season. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model, used to find the path and origin of the air masses reaching the study region, showed that the majority of aerosols made their way to the study region via the westerly and southwesterly winds.
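
As a sketch of the correlation analysis described above, the snippet below computes a Pearson correlation between co-located AOD and rainfall series; the numbers are invented placeholders, not the MODIS or station data used in the study.

```python
# Pearson correlation between monthly AOD and rainfall series (toy data).
import numpy as np
from scipy.stats import pearsonr

aod      = np.array([0.31, 0.42, 0.55, 0.61, 0.48, 0.39, 0.72, 0.66])  # JJA means
rainfall = np.array([92.0, 80.5, 61.2, 55.0, 70.3, 88.1, 40.7, 49.9])  # mm/month

r, p = pearsonr(aod, rainfall)
print(f"Pearson r = {r:.2f}, p-value = {p:.3f}")   # negative r, as reported over Asmara/Nakfa
```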

Keywords: aerosol-cloud-precipitation, aerosol optical depth, cloud effective radius, cloud optical thickness, HYSPLIT

Procedia PDF Downloads 104
1679 High Performance Computing and Big Data Analytics

Authors: Branci Sarra, Branci Saadia

Abstract:

Because of the rapid growth of data, many computer science tools have been developed to process and analyze Big Data. High-performance computing architectures have been designed to meet the processing needs of Big Data, from the standpoints of transaction processing and strategic and tactical analytics. The purpose of this article is to provide a historical and global perspective on recent trends in high-performance computing architectures, especially as they relate to analytics and data mining.

Keywords: high performance computing, HPC, big data, data analysis

Procedia PDF Downloads 484
1678 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload work from mobile devices to dedicated rendering servers that are far more powerful, but this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted and can therefore be reduced by compressing the frames before sending. Standard compression algorithms like JPEG result in only minor size reduction. Since the images to be compressed are consecutive camera frames, there will not be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but WebGL implementations limit the precision of floating-point numbers to 16 bits on most devices. This can introduce noise into the image due to rounding errors, which add up over time. This is solved using an improved inter-frame compression algorithm that detects changes between frames and reuses unchanged pixels from the previous frame, eliminating the need for floating-point subtraction and thereby cutting down on noise. Change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference; the kernel weights for this comparison can be fine-tuned to match the type of image being compressed. 2) Dynamic load distribution: Conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach can hurt bandwidth and server costs. The optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and network conditions, and it is responsible for dynamically partitioning the tasks. Special flags are used to communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client-agnostic: flags are available to the client for resetting the frame, indicating latency, switching modes, and so on. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to effectively spread the load and thereby scale horizontally, which is achieved by isolating client connections into different processes.
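
A rough sketch of the inter-frame change-detection idea is given below; it is written in NumPy rather than WebGL, and the 3x3 kernel weights and threshold are illustrative assumptions rather than the protocol's tuned values.

```python
# Weighted-average-difference change detection: send only pixels whose
# kernel-weighted neighbourhood changed since the previous frame.
import numpy as np
from scipy.ndimage import convolve

KERNEL = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=float)
KERNEL /= KERNEL.sum()

def changed_mask(prev: np.ndarray, curr: np.ndarray, thresh: float = 4.0) -> np.ndarray:
    """True where the weighted-average difference exceeds the threshold."""
    diff = convolve(curr.astype(float), KERNEL) - convolve(prev.astype(float), KERNEL)
    return np.abs(diff) > thresh

def compress(prev: np.ndarray, curr: np.ndarray):
    """Transmit only changed pixels; the receiver reuses the rest from `prev`."""
    mask = changed_mask(prev, curr)
    return mask, curr[mask]

def reconstruct(prev: np.ndarray, mask: np.ndarray, values: np.ndarray) -> np.ndarray:
    frame = prev.copy()
    frame[mask] = values        # unchanged pixels come straight from the previous frame
    return frame
```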

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application

Procedia PDF Downloads 49
1677 An Integrated Cloud Service of Application Delivery in Virtualized Environments

Authors: Shuen-Tai Wang, Yu-Ching Lin, Hsi-Ya Chang

Abstract:

Virtualization technologies are experiencing renewed interest as a way to improve system reliability and availability, reduce costs, and provide flexibility. This paper presents development work that leverages existing cloud infrastructure and virtualization tools. We adopted virtualization technologies that improve the portability, manageability, and compatibility of applications by encapsulating them from the underlying operating system on which they are executed. Application virtualization allows shifting users' applications from the traditional PC environment to a virtualized environment, which is stored on a remote virtual machine rather than locally. This proposed effort has the potential to provide an efficient, resilient, and elastic environment for online cloud services. Users no longer need to bear the burden of platform maintenance, and the overall cost of hardware and software licenses is drastically reduced. Moreover, this flexible, web-based application virtualization service represents the next significant step toward the mobile workplace, letting users execute their applications from virtually anywhere.

Keywords: cloud service, application virtualization, virtual machine, elastic environment

Procedia PDF Downloads 256
1676 Attachment and Memories: Activating Attachment in College Students through Narrative-Based Methods

Authors: Catherine Wright, Kate Luedke

Abstract:

This paper questions whether individuals who had been exposed to narratives describing secure and insecure-avoidant attachment styles experienced temporary changes in their attachment style when compared to individuals who had been exposed to neutral narratives. The Attachment Style Questionnaire (ASQ) developed by Feeney, Noller, and Hanrahan in 1994 was utilized to assess attachment style. Participants filled out a truncated version of the ASQ prior to reading the narratives assigned to their groups and filled out the entire ASQ after reading the narratives. Utilizing a one-way independent-groups ANOVA, researchers found that the group which read the insecure-avoidant narrative experienced a statistically significant decrease in secure attachment, as did the group which read the secure narrative; the control group, however, experienced a statistically significant increase in secure attachment. Based on these findings, researchers concluded that narratives may call attention to parental shortcomings that individuals have experienced, both by reminding individuals of positive experiences they were not able to have with their parental figures and by reminding them of the negative experiences they did have with them.

Keywords: attachment, insecure-avoidant, memory, secure

Procedia PDF Downloads 374
1675 Bridge Members Segmentation Algorithm of Terrestrial Laser Scanner Point Clouds Using Fuzzy Clustering Method

Authors: Donghwan Lee, Gichun Cha, Jooyoung Park, Junkyeong Kim, Seunghee Park

Abstract:

3D shape models of existing structures are required for many purposes, such as safety and operations management. Traditional 3D modeling methods are based on manual or semi-automatic reconstruction from close-range images, which incurs great expense and is time-consuming. The Terrestrial Laser Scanner (TLS) is a common survey technique for measuring a 3D shape model quickly and accurately, and it is used at construction sites and in cultural heritage management. However, there are many limitations in processing a TLS point cloud, because the raw point cloud is a massive volume of data, so the capability of carrying out useful analyses with unstructured 3D points is also limited. Thus, segmentation becomes an essential step whenever grouping of points with common attributes is required. In this paper, a member segmentation algorithm is presented to separate a raw point cloud that includes only 3D coordinates. The paper presents a clustering approach based on a fuzzy method for this objective: the Fuzzy C-Means (FCM) algorithm is reviewed and used in combination with a similarity-driven cluster merging method. It is applied to the point cloud acquired with a Leica ScanStation C10/C5 at the test bed. The test bed was a bridge connecting the 1st and 2nd engineering buildings at Sungkyunkwan University in Korea; it is about 32 m long and 2 m wide and is used as a pedestrian link between the two buildings. The 3D point cloud of the test bed was constructed from a TLS measurement, and the data were divided by the segmentation algorithm into individual members. Experimental analyses of the results from the proposed unsupervised segmentation process are shown to be promising, and the segmented point cloud can be further processed to manage the configuration of each member.
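
A compact Fuzzy C-Means sketch of the kind applied here is shown below (NumPy only); the cluster count, fuzzifier, and stopping rule are illustrative choices, and the similarity-driven cluster merging step is omitted.

```python
# Minimal Fuzzy C-Means clustering of an N x 3 point cloud (sketch).
import numpy as np

def fuzzy_c_means(points, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(points), c))
    u /= u.sum(axis=1, keepdims=True)                 # random fuzzy memberships
    for _ in range(n_iter):
        um = u ** m
        centers = um.T @ points / um.sum(axis=0)[:, None]
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / (d ** (2.0 / (m - 1.0)))        # inverse-distance memberships
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.abs(new_u - u).max() < tol:
            u = new_u
            break
        u = new_u
    return centers, u.argmax(axis=1)                  # hard label per 3D point

# cloud = np.loadtxt("bridge_scan.xyz")               # hypothetical N x 3 TLS file
# centers, labels = fuzzy_c_means(cloud, c=4)         # one label per candidate member
```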

Keywords: fuzzy c-means (FCM), point cloud, segmentation, terrestrial laser scanner (TLS)

Procedia PDF Downloads 206
1674 Cloud Effect on Power Generation of Grid-Connected Small PV Systems

Authors: Yehya Abdellatif, Ahmed Alsalaymeh, Iyad Muslih, Ali Alshduifat

Abstract:

Photovoltaic (PV) power generation systems, mainly small scale, are rapidly being deployed in Jordan. The impact of these systems on the grid has not been studied or analyzed. These systems can cause many technical problems such as reverse power flows and voltage rises in distribution feeders, and real and reactive power transients that affect the operation of the transmission system. To fully understand and address these problems, extensive research, simulation, and case studies are required. To this end, this paper studies the cloud shadow effect on the power generation of a ground mounted PV system installed at the test field of the Renewable Energy Center at the Applied Science University.

Keywords: photovoltaic, cloud effect, MPPT, power transients

Procedia PDF Downloads 573
1673 Lifting Wavelet Transform and Singular Values Decomposition for Secure Image Watermarking

Authors: Siraa Ben Ftima, Mourad Talbi, Tahar Ezzedine

Abstract:

In this paper, we present a technique for the secure watermarking of grayscale and color images. The technique consists of applying the Singular Value Decomposition (SVD) in the LWT (Lifting Wavelet Transform) domain in order to insert a grayscale watermark image into the host image (grayscale or color). It also uses a signature in the embedding and extraction steps. The technique is applied to a number of grayscale and color images, and its performance is demonstrated by PSNR (Peak Signal-to-Noise Ratio), MSE (Mean Square Error), and SSIM (structural similarity) computations.
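
The embedding idea can be sketched as below, assuming a one-level Haar lifting step and additive perturbation of the approximation band's singular values; the scaling factor, the choice of Haar lifting, and the omission of the signature and inverse-lifting steps are simplifications for illustration, not the paper's exact scheme.

```python
# LWT + SVD watermark embedding sketch (Haar lifting, even image dimensions assumed).
import numpy as np

def haar_lift_2d(img):
    """One lifting level: return the LL band plus the detail bands needed to invert."""
    even, odd = img[:, 0::2].astype(float), img[:, 1::2].astype(float)
    d_h = odd - even                   # predict (horizontal detail)
    s_h = even + d_h / 2               # update  (horizontal approximation)
    even_v, odd_v = s_h[0::2, :], s_h[1::2, :]
    d_v = odd_v - even_v               # predict (vertical detail)
    ll = even_v + d_v / 2              # update  -> LL approximation band
    return ll, (d_h, d_v)

def embed(host, watermark, alpha=0.05):
    """Perturb the LL band's singular values with the watermark's singular values."""
    ll, details = haar_lift_2d(host)
    Uh, Sh, Vh = np.linalg.svd(ll, full_matrices=False)
    Sw = np.linalg.svd(watermark, compute_uv=False)      # watermark no larger than LL band
    Sh_marked = Sh + alpha * np.pad(Sw, (0, len(Sh) - len(Sw)))
    ll_marked = Uh @ np.diag(Sh_marked) @ Vh
    return ll_marked, details          # inverse lifting (not shown) rebuilds the image

# host = np.array(Image.open("host.png").convert("L"))   # hypothetical grayscale host
# wm   = np.array(Image.open("logo.png").convert("L"))   # hypothetical grayscale watermark
```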

Keywords: lifting wavelet transform (LWT), sub-space vectorial decomposition, secure, image watermarking, watermark

Procedia PDF Downloads 233
1672 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data

Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju

Abstract:

Nowadays, multimedia data are used to store secure information. All previous methods allocate space in the image for data embedding after encryption. In this paper, we propose a novel method that reserves space in the image, surrounded by a boundary, before encryption with a traditional reversible data hiding (RDH) algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted images. The proposed method can achieve real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed in this paper, which improves efficiency tenfold compared to the other processes discussed.
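
As a toy sketch of embedding in a reserved region, the snippet below writes payload bits into the least significant bits of a block set aside as "room"; the region size and bit packing are illustrative, and the RDH bookkeeping and encryption steps from the method are omitted.

```python
# LSB embedding/extraction inside a reserved image region (sketch).
import numpy as np

def embed_lsb(region: np.ndarray, payload: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = region.flatten()
    assert bits.size <= flat.size, "reserved room too small for payload"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs with payload bits
    return flat.reshape(region.shape)

def extract_lsb(region: np.ndarray, n_bytes: int) -> bytes:
    bits = region.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

room = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # reserved block
stego = embed_lsb(room, b"secret")
print(extract_lsb(stego, 6))                                  # b'secret'
```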

Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding

Procedia PDF Downloads 375
1671 Comparison of Data Reduction Algorithms for Image-Based Point Cloud Derived Digital Terrain Models

Authors: M. Uysal, M. Yilmaz, I. Tiryakioğlu

Abstract:

A Digital Terrain Model (DTM) is a digital numerical representation of the Earth's surface. DTMs have been applied to a diverse range of tasks, such as urban planning, military applications, glacier mapping, and disaster management. Expressing the Earth's surface as a mathematical model would require an infinite number of point measurements; because this is impossible, points at regular intervals are measured to characterize the Earth's surface and generate a DTM. Hitherto, classical measurement techniques and photogrammetry have been widely used in the construction of DTMs, and at present RADAR, LiDAR, and stereo satellite images are also used. In recent years, especially because of its advantages, Airborne Light Detection and Ranging (LiDAR) has seen increased use in DTM applications; a 3D point cloud is created with LiDAR technology by obtaining numerous point measurements. More recently, with developments in image mapping methods, the use of unmanned aerial vehicles (UAVs) for photogrammetric data acquisition has increased DTM generation from image-based point clouds. The accuracy of a DTM depends on various factors, such as the data collection method, the distribution of elevation points, the point density, the properties of the surface, and the interpolation method. In this study, a random data reduction method is evaluated for DTMs generated from image-based point cloud data. The original image-based point cloud data set (100%) is reduced to a series of subsets by using a random algorithm, representing 75, 50, 25, and 5% of the original data set. Over the ANS campus of Afyon Kocatepe University as the test area, the DTM constructed from the original image-based point cloud data set is compared with DTMs interpolated from the reduced data sets by the kriging interpolation method. The results show that the random data reduction method can be used to reduce image-based point cloud data sets to a 50% density level while still maintaining the quality of the DTM.
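
The random reduction step can be sketched as below; the file name is a placeholder for the study's UAV-derived point cloud, and kriging of the resulting subsets is left to a separate interpolation step.

```python
# Draw 75/50/25/5% random subsets of an image-based point cloud (sketch).
import numpy as np

def random_subsets(points: np.ndarray, fractions=(0.75, 0.50, 0.25, 0.05), seed=42):
    """points: N x 3 array of (x, y, z); returns a dict of reduced clouds keyed by fraction."""
    rng = np.random.default_rng(seed)
    out = {}
    for f in fractions:
        idx = rng.choice(len(points), size=int(len(points) * f), replace=False)
        out[f] = points[idx]
    return out

# cloud = np.loadtxt("ans_campus_points.xyz")      # hypothetical original 100% data set
# subsets = random_subsets(cloud)                  # e.g., subsets[0.50] -> 50% density cloud
```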

Keywords: DTM, Unmanned Aerial Vehicle (UAV), uniform, random, kriging

Procedia PDF Downloads 124
1670 Conducting Computational Physics Laboratory Course Using Cloud Storage Space

Authors: Ajay Wadhwa

Abstract:

A laboratory course on computational physics is different from conventional lab courses on other topics of physics, such as mechanics, heat, and optics, because it involves active participation of the teacher as well as one-to-one interaction between teacher and student. The course content requires the teacher to teach a programming language as well as numerical methods along with their applications in physics. The task becomes more daunting when about 90% of the students in the class have no previous experience with any programming language. In the presented work, we describe a methodology for conducting the computational physics course using the Google Drive and Dropitto.me cloud storage services. We evaluated the performance in a class of sixty students by dividing them equally into four groups. One of the groups was made the peer group on which the presented methodology was tested, while the other groups were taught by the conventional method of classroom lectures. To assess our methodology, we analyzed the performance of students in four class tests. A study of statistical parameters such as the mean, standard deviation, and Z-test hypothesis revealed that the cyber methodology based on cloud storage is more efficient than the conventional method of teaching.
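
As a hedged sketch of the statistical comparison, the snippet below runs a two-sample Z-test between a cloud-taught peer group and a conventionally taught group; the scores are invented placeholders, not the study's class-test data.

```python
# Two-sample Z-test on class-test scores (toy data).
import numpy as np
from scipy.stats import norm

def two_sample_z(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    se = np.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    z = (x.mean() - y.mean()) / se
    return z, 2 * (1 - norm.cdf(abs(z)))          # two-tailed p-value

peer         = [72, 81, 77, 85, 90, 68, 74, 88, 79, 83, 76, 80, 86, 71, 78]
conventional = [65, 70, 62, 75, 68, 71, 60, 73, 66, 69, 64, 72, 67, 63, 70]
z, p = two_sample_z(peer, conventional)
print(f"z = {z:.2f}, p = {p:.4f}")                # small p suggests a real difference
```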

Keywords: computational physics, Z-test hypothesis, cloud storage, Google Drive

Procedia PDF Downloads 276
1669 Private Coded Computation of Matrix Multiplication

Authors: Malihe Aliasgari, Yousef Nejatbakhsh

Abstract:

The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data are dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers are a few slow or delay-prone processors that can bottleneck the entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as a sufficient number of processors, depending on the minimum distance of the code, have completed their operations. Matrix-matrix multiplication over practically large data sets faces computational and memory-related difficulties, which is why such operations are carried out on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y; this is a fundamental building block of many science and engineering fields, such as machine learning, image and signal processing, wireless communication, and optimization. Both non-secure and secure matrix multiplication are studied. We consider the setup in which the identity of the matrix of interest must be kept private from the workers, and we obtain the recovery threshold of the colluding model, that is, the number of workers that need to complete their tasks before the master server can recover the product W. We address the problem of secure and private distributed matrix multiplication W = XY in which the matrix X is confidential, while matrix Y is selected privately from a library of public matrices, and we present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by concatenating a robust PIR scheme for arbitrary colluding workers and private databases with the proposed SGPD code, which provides a smaller computational complexity at the workers.
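
The coded-computation idea can be illustrated with a small polynomial-code sketch, with X split into m = 2 row blocks and Y into n = 2 column blocks so that any m·n = 4 worker results suffice to recover W; the privacy and secrecy padding of the PSGPD scheme is omitted here, and the evaluation points are arbitrary.

```python
# Polynomial-coded distributed matrix multiplication W = XY (straggler-tolerant sketch).
import numpy as np

m = n = 2
X = np.random.randn(4, 6)
Y = np.random.randn(6, 4)
Xb = np.split(X, m, axis=0)                            # row blocks X_0, X_1
Yb = np.split(Y, n, axis=1)                            # column blocks Y_0, Y_1

def worker_task(t):
    Xt = sum(Xb[i] * t**i       for i in range(m))     # degree m-1 polynomial in t
    Yt = sum(Yb[j] * t**(j * m) for j in range(n))     # degrees 0, m, ...
    return Xt @ Yt                                     # one worker's computation

points  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # one evaluation point per worker
results = [worker_task(t) for t in points]             # worker 5 may straggle: not needed
fast    = list(range(m * n))                           # first m*n finishers are enough

# Interpolate the degree-(m*n - 1) matrix polynomial entrywise at the fast points.
R = np.stack([results[k] for k in fast])               # (m*n, 2, 2) worker outputs
V = np.vander(points[fast], m * n, increasing=True)    # V[k, d] = t_k**d
C = np.linalg.solve(V, R.reshape(m * n, -1)).reshape(m * n, *R.shape[1:])

# Coefficient of t**(i + j*m) is the block X_i @ Y_j, i.e. block (i, j) of W.
W = np.block([[C[i + j * m] for j in range(n)] for i in range(m)])
print(np.allclose(W, X @ Y))                           # True
```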

Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers

Procedia PDF Downloads 91
1668 Optimized Approach for Secure Data Sharing in Distributed Database

Authors: Ahmed Mateen, Zhu Qingsheng, Ahmad Bilal

Abstract:

In the current age of technology, information is the most precious asset of a company. Today, companies have large amounts of data, and as the data grow larger, access to particular pieces of information becomes slower day by day. Processing data quickly to shape it into information is the biggest issue. The major problems in distributed databases are the efficiency of data distribution and the response time of data distribution; the security of data distribution is also a big issue. For these problems, we propose a strategy that can maximize the efficiency of data distribution and also improve its response time. The technique gives better results for secure data distribution from multiple heterogeneous sources, and it enables companies to share data securely, efficiently, and quickly.

Keywords: ER-schema, electronic record, P2P framework, API, query formulation

Procedia PDF Downloads 300
1667 The Future of Reduced Instruction Set Computing and Complex Instruction Set Computing and Suggestions for Reduced Instruction Set Computing-V Development

Authors: Can Xiao, Ouanhong Jiang

Abstract:

Based on the two instruction sets of complex instruction set computing (CISC) and reduced instruction set computing (RISC), processors have developed in their respective fields of expertise. This paper summarizes research on the differences in performance and energy efficiency between CISC and RISC and strives to eliminate the influence of peripheral configuration factors. We discuss whether processor performance is centered on instruction sets or on implementation. In addition, the rapidly developing RISC-V poses a challenge to existing models. We analyze research results, assess the impact of the instruction sets themselves, and finally make suggestions for the development of RISC-V.

Keywords: ISA, RISC-V, ARM, X86, power, energy efficiency

Procedia PDF Downloads 57
1666 Researching Apache Hama: A Pure BSP Computing Framework

Authors: Kamran Siddique, Yangwoo Kim, Zahid Akhtar

Abstract:

In recent years, technological advancements have led to a deluge of data from distinct domains, and the development of solutions based on parallel and distributed computing still has a long way to go. That is why the research and development of massive computing frameworks is continuously growing. At this particular stage, highlighting a potential research area along with key insights could be an asset for researchers in the field. Therefore, this paper explores one of the emerging distributed computing frameworks, Apache Hama. It is a Top-Level Project under the Apache Software Foundation, based on the Bulk Synchronous Parallel (BSP) model. We present an unbiased and critical examination of Apache Hama and conclude with research directions in order to assist interested researchers.

Keywords: apache hama, bulk synchronous parallel, BSP, distributed computing

Procedia PDF Downloads 218
1665 A Framework of Virtualized Software Controller for Smart Manufacturing

Authors: Pin Xiu Chen, Shang Liang Chen

Abstract:

A virtualized software controller is developed in this research to replace traditional hardware control units. This virtualized software controller transfers motion interpolation calculations from the motion control units of end devices to edge computing platforms, thereby reducing the end devices' computational load and hardware requirements and making maintenance and updates easier. The study also applies the concept of microservices, dividing the control system into several small functional modules that are then deployed to a cloud data server. This reduces the interdependency among modules and enhances the overall system's flexibility and scalability. Finally, with containerization technology, the system can be deployed and started in a matter of seconds, which is more efficient than traditional virtual machine deployment methods. Furthermore, this virtualized software controller communicates with end control devices via wireless networks, making the placement of production equipment or the redesign of processes more flexible and no longer limited by physical wiring. To handle the large data flow and maintain low-latency transmission, this study integrates 5G technology, fully utilizing its high speed, wide bandwidth, and low latency to achieve rapid and stable remote machine control. An experimental setup is designed to verify the feasibility and test the performance of this framework. This study designs a smart manufacturing site with a 5G communication architecture, serving as a field for experimental data collection and performance testing. The smart manufacturing site includes one robotic arm, three Computer Numerical Control machine tools, several input/output ports, and an edge computing architecture. All machinery information is uploaded to edge computing servers and cloud servers via 5G communication and the Internet of Things framework. After analysis and computation, this information is converted into motion control commands, which are transmitted back to the relevant machinery through 5G communication. The communication time intervals at each stage are calculated using the C++ chrono library to measure the time difference for each command transmission. The relevant test results will be organized and presented in the full text.

Keywords: 5G, MEC, microservices, virtualized software controller, smart manufacturing

Procedia PDF Downloads 31
1664 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels

Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur

Abstract:

With upcoming threats in the digital world, we need to work continuously on security in all aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data by various entities lead to cybercrime and more attacks on the privacy and security of individuals. Cryptography has always been employed to prevent access to important data, using many processes. Symmetric-key and asymmetric-key cryptography have been used to keep data secret both at rest and in transit. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography and their applications and usefulness, with much emphasis given to elliptic curve cryptography and the algebraic mathematics it involves.
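
As a brief sketch of elliptic curve cryptography in practice, the snippet below performs an ECDH key agreement over curve P-256 with the Python `cryptography` package and derives a symmetric key via HKDF for protecting cloud data; the curve choice and info label are illustrative assumptions.

```python
# ECDH key agreement + HKDF-derived symmetric key (sketch).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

client_priv = ec.generate_private_key(ec.SECP256R1())
server_priv = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the other side's public key.
client_secret = client_priv.exchange(ec.ECDH(), server_priv.public_key())
server_secret = server_priv.exchange(ec.ECDH(), client_priv.public_key())
assert client_secret == server_secret

key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"cloud-storage-session").derive(client_secret)
print(len(key))    # 32-byte symmetric key for encrypting data at rest or in transit
```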

Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography

Procedia PDF Downloads 88
1663 Aliens in Space: Reflections on an Applied Theatre Project in a Medium Secure Hospital

Authors: Ashley Barnes

Abstract:

This paper considers the ways in which varied notions of space played a central role in a 12-week drama project with patients in a Medium Secure Hospital in the UK. In the project, the patients devised and performed a series of sketches, inspired by science fiction films, which echoed their own experience of alienation. During the project, the familiar and rigorously regulated activity room became a site of imagination, adventure, and laughter, transforming the atmosphere of the hospital and allowing the patients to be transported to another space entirely, a space that was as much in their heads as in the physical domain. It will be argued that, although work created in an institution such as a Medium Secure Hospital is infused with hegemonic associations and meanings, the starting point for such work should be to seek an empty space in which the participants can allow their imaginations to be released. This work sits within a range of contexts and is consciously interdisciplinary, drawing from human geography and criminology as well as performance and applied theatre literature. It is hoped that this paper will build upon the literature relating to the very particular environment of secure hospitals and provide a starting point for further practical exploration.

Keywords: criminal justice, mental health, science fiction films, space and place

Procedia PDF Downloads 191