Search results for: cloud data privacy and integrity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25339

24739 Integrating the Modbus SCADA Communication Protocol with Elliptic Curve Cryptography

Authors: Despoina Chochtoula, Aristidis Ilias, Yannis Stamatiou

Abstract:

Modbus is a protocol that enables communication among devices connected to the same network. It is often deployed to connect sensor and monitoring units to central supervisory servers in Supervisory Control and Data Acquisition (SCADA) systems. These systems monitor critical infrastructures, such as factories, power generation stations, and nuclear power reactors, in order to detect malfunctions and trigger alerts and corrective actions. However, due to their criticality, SCADA systems are vulnerable to attacks that range from simple eavesdropping on operation parameters, exchanged messages, and valuable infrastructure information to malicious modification of vital infrastructure data with the aim of inflicting damage. Thus, the SCADA research community has been active in strengthening SCADA systems with suitable data protection mechanisms based, to a large extent, on cryptographic methods for data encryption, device authentication, and message integrity protection. However, due to the limited computational power of many SCADA sensors and embedded devices, the usual public key cryptographic methods are not appropriate because of their high computational requirements. As an alternative, Elliptic Curve Cryptography has been proposed, which requires smaller key sizes and, thus, less demanding cryptographic operations. To the best of our knowledge, however, no such implementation has been proposed in the SCADA literature. In order to fill this gap, our methodology focused on integrating Modbus, a frequently used SCADA communication protocol, with Elliptic Curve based cryptography, and on developing a server/client application as a proof of concept. For the implementation we deployed two C language libraries, which were suitably modified in order to be successfully integrated: libmodbus (https://github.com/stephane/libmodbus) and ecc-lib (https://www.ceid.upatras.gr/webpages/faculty/zaro/software/ecc-lib/).
The first library provides a C implementation of the Modbus/TCP protocol, while the second offers the functionality needed to develop cryptographic protocols based on Elliptic Curve Cryptography. These two libraries were combined, after suitable modifications and enhancements, to produce a modified version of the Modbus/TCP protocol focused on the security of the data exchanged among the devices and the supervisory servers. The mechanisms we implemented include key generation, key exchange/sharing, message authentication, data integrity checking, and encryption/decryption of data. The key generation and key exchange protocols were implemented with Elliptic Curve Cryptography primitives. The keys established by each device are saved in local memory, retained for the whole communication session, and used to encrypt and decrypt exchanged messages as well as to authenticate entities and verify message integrity. Finally, the modified library was compiled for the Android environment in order to run the server application as an Android app. The client program runs on a regular computer. The communication between these two entities demonstrates the successful establishment of an Elliptic Curve Cryptography based, secure Modbus wireless communication session between a portable device acting as a supervisor station and a monitoring computer. Our first performance measurements are also very promising and demonstrate the feasibility of embedding Elliptic Curve Cryptography into SCADA systems, filling a gap in the relevant scientific literature.
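The Elliptic Curve Diffie-Hellman style of key agreement underlying such a key exchange can be sketched as follows. The tiny curve, generator, and private scalars below are invented for illustration only; real deployments use standardized curves and primitives such as those in ecc-lib.

```python
# Toy Elliptic Curve Diffie-Hellman key agreement, illustrating the kind of
# key-exchange primitive integrated into Modbus/TCP above. The curve and all
# parameters are hypothetical and far too small for real security.

# Curve: y^2 = x^3 + a*x + b over GF(p)
p, a, b = 97, 2, 3
G = (3, 6)  # generator point: 3^3 + 2*3 + 3 = 36 = 6^2 (mod 97)

def ec_add(P, Q):
    """Add two points on the curve (None is the point at infinity)."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def ec_mul(k, P):
    """Double-and-add scalar multiplication k*P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

# Each Modbus endpoint picks a private scalar and exchanges public points.
server_priv, client_priv = 11, 17
server_pub = ec_mul(server_priv, G)
client_pub = ec_mul(client_priv, G)

# Both sides derive the same shared point, usable as a session-key seed.
shared_server = ec_mul(server_priv, client_pub)
shared_client = ec_mul(client_priv, server_pub)
print(shared_server == shared_client)  # True
```

The shared point would then be hashed into a symmetric session key for encrypting the Modbus payloads.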

Keywords: elliptic curve cryptography, ICT security, modbus protocol, SCADA, TCP/IP protocol

Procedia PDF Downloads 243
24738 A Medical Vulnerability Scoring System Incorporating Health and Data Sensitivity Metrics

Authors: Nadir A. Carreon, Christa Sonderer, Aakarsh Rao, Roman Lysecky

Abstract:

With the advent of complex software and increased connectivity, the security of life-critical medical devices is becoming an increasing concern, particularly given their direct impact on human safety. Security is essential, but it is impossible to develop completely secure and impenetrable systems at design time. Therefore, it is important to assess the potential impact on security and safety of exploiting a vulnerability in such critical medical systems. The common vulnerability scoring system (CVSS) calculates the severity of exploitable vulnerabilities. However, for medical devices, it does not consider the unique challenges of impacts to human health and privacy. Thus, a medical device on which human life depends (e.g., pacemakers, insulin pumps) can score very low, while a system on which human life does not depend (e.g., hospital archiving systems) might score very high. In this paper, we propose a medical vulnerability scoring system (MVSS) that extends CVSS to address the health and privacy concerns of medical devices. We propose incorporating two new parameters, namely health impact and sensitivity impact. Sensitivity refers to the type of information that can be stolen from the device, and health represents the impact on the safety of the patient if the vulnerability is exploited (e.g., potential harm, life-threatening). We evaluate fifteen different known vulnerabilities in medical devices and compare MVSS against two state-of-the-art medical device-oriented vulnerability scoring systems and the foundational CVSS.
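The general idea of extending a CVSS-style base score with the two proposed parameters can be sketched as follows. The rating scales and weights below are hypothetical illustrations, not the actual MVSS formula.

```python
# Illustrative sketch of augmenting a CVSS-style base score with health
# impact and sensitivity impact. All scales and weights are invented.

HEALTH = {"none": 0.0, "harm": 0.5, "life-threatening": 1.0}
SENSITIVITY = {"none": 0.0, "personal": 0.5, "medical": 1.0}

def mvss(cvss_base, health, sensitivity, w_h=2.0, w_s=1.0):
    """Raise the score for vulnerabilities with health/privacy consequences,
    capping at 10.0 as CVSS does."""
    adjusted = cvss_base + w_h * HEALTH[health] + w_s * SENSITIVITY[sensitivity]
    return min(round(adjusted, 1), 10.0)

# A pacemaker flaw with a modest CVSS base but life-threatening health impact
# now outranks an archiving-system flaw with a higher base score.
pacemaker = mvss(5.0, "life-threatening", "medical")  # 8.0
archive = mvss(7.5, "none", "none")                   # 7.5
print(pacemaker, archive)
```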

Keywords: common vulnerability system, medical devices, medical device security, vulnerabilities

Procedia PDF Downloads 142
24737 Task Scheduling and Resource Allocation in the Cloud Based on the AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Scheduling of tasks and the optimal allocation of resources in the cloud are complicated by the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used in this field and are characterized by high processing power and storage demands. To increase their efficiency, it is necessary to schedule tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload, and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment, based on a modified AHP algorithm, is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resources are prioritized by the criteria of main memory size, processor speed, and bandwidth. To modify the AHP algorithm, this system adopts the Linear Max-Min and Linear Max normalization methods, which are the best choice for the algorithm and have a great impact on the ranking. The simulation results show a decrease in the average response time, turnaround time, and execution time of input tasks in the proposed method compared to similar (baseline) methods.
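The resource-prioritization step can be sketched as follows: virtual machines are scored on memory size, processor speed, and bandwidth, normalized with the Linear Max-Min method, and combined with AHP-style criterion weights. The weights and VM figures below are hypothetical.

```python
# Minimal sketch of Linear Max-Min normalization plus weighted AHP-style
# ranking of VMs. All weights and VM specifications are invented.

vms = {  # memory (GB), CPU speed (GHz), bandwidth (Mbps)
    "vm1": (4, 2.0, 100),
    "vm2": (8, 1.5, 250),
    "vm3": (16, 3.0, 150),
}
weights = (0.5, 0.3, 0.2)  # hypothetical AHP-derived criterion priorities

def max_min_normalize(values):
    """Linear Max-Min: map each value onto [0, 1] within its criterion."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

names = list(vms)
columns = list(zip(*vms.values()))            # one column per criterion
norm = [max_min_normalize(col) for col in columns]
scores = {
    n: sum(w * norm[c][i] for c, w in enumerate(weights))
    for i, n in enumerate(names)
}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # highest-priority VM first
```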

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 129
24736 Consortium Blockchain-based Model for Data Management Applications in the Healthcare Sector

Authors: Teo Hao Jing, Shane Ho Ken Wae, Lee Jin Yu, Burra Venkata Durga Kumar

Abstract:

Current distributed healthcare systems face the challenge of interoperability of health data. Storing electronic health records (EHR) in local databases causes them to be fragmented. This problem is aggravated as patients visit multiple healthcare providers in their lifetime. Existing solutions are unable to solve this issue and have burdened healthcare specialists and patients alike. Blockchain technology was found to be able to increase the interoperability of health data by implementing digital access rules, enabling a unified patient identity, and providing data aggregation. Consortium blockchains were found to have high read throughput, to be more trustworthy and more secure against external disruptions, and to accommodate transactions without fees. Therefore, this paper proposes a blockchain-based model for data management applications. In this model, a consortium blockchain is implemented using delegated proof of stake (DPoS) as its consensus mechanism. This blockchain allows collaboration between users from different organizations such as hospitals and medical bureaus. Patients serve as the owners of their information, and users from other parties require authorization from the patient to view it. Hospitals upload the hash value of patients’ generated data to the blockchain, whereas the encrypted information is stored in distributed cloud storage.
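The integrity scheme described above can be sketched as follows: the encrypted record is kept off-chain and only its hash is published to the ledger, so later tampering with the stored record is detectable. The in-memory list standing in for the consortium chain is purely illustrative.

```python
# Sketch of on-chain hash / off-chain data integrity checking. The `chain`
# list is a hypothetical stand-in for the consortium ledger.

import hashlib
import json

chain = []  # stand-in for the consortium blockchain

def publish(patient_id, encrypted_record: bytes):
    """Hospital publishes only the hash of the encrypted record."""
    digest = hashlib.sha256(encrypted_record).hexdigest()
    chain.append({"patient": patient_id, "hash": digest})
    return digest

def verify(patient_id, stored_record: bytes):
    """Anyone can check a retrieved record against the on-chain hash."""
    digest = hashlib.sha256(stored_record).hexdigest()
    return any(e["patient"] == patient_id and e["hash"] == digest
               for e in chain)

record = json.dumps({"ehr": "encrypted-blob"}).encode()
publish("patient-42", record)
print(verify("patient-42", record))          # intact record verifies
print(verify("patient-42", record + b"x"))   # tampered record does not
```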

Keywords: blockchain technology, data management applications, healthcare, interoperability, delegated proof of stake

Procedia PDF Downloads 121
24735 Video Analytics on Pedagogy Using Big Data

Authors: Jamuna Loganath

Abstract:

Education is the key to the development of any individual’s personality. Today’s students will be tomorrow’s citizens of the global society. The education of the student is the edifice on which his/her future will be built. Schools should therefore provide an all-round development of students so as to foster a healthy society. The behavior and attitude of students in school play an essential role in the success of the education process. Frequent reports of misbehaviors such as clowning, harassing classmates, and verbal insults are becoming common in schools today. If this issue is left unattended, it may foster negative attitudes and increase delinquent behavior. So, the need of the hour is to find a solution to this problem. To solve this issue, it is important to monitor students’ behavior in school, give necessary feedback, and mentor them to develop a positive attitude and become successful adults. Nevertheless, measuring students’ behavior and attitude is extremely challenging. No present technology has proven effective in this measurement process, because the actions, reactions, interactions, and responses of students are complex and rarely captured as usable data. The purpose of this proposal is to recommend an effective supervising system, after carrying out a feasibility study, by measuring the behavior of students. This can be achieved by equipping schools with CCTV cameras. These CCTV cameras, installed in various schools of the world, capture the facial expressions and interactions of the students inside and outside their classrooms. The real-time raw videos captured from the CCTV can be uploaded to the cloud over a network. The video feeds are distributed across nodes in the same rack, or on different racks in the same cluster, in Hadoop HDFS. The video feeds are converted into small frames and analyzed using various pattern recognition algorithms and the MapReduce algorithm. Then, the video frames are compared with a benchmarking database of good behavior. When misbehavior is detected, an alert message can be sent to the counseling department, which helps in mentoring the students. This will help in improving the effectiveness of the education process. As video feeds come from multiple geographical areas (schools from different parts of the world), BIG DATA helps in real-time analysis, as it computationally reveals patterns, trends, and associations, especially relating to human behavior and interactions. It also analyzes data that cannot be analyzed by traditional software applications such as RDBMS and OODBMS, and it has proven successful in handling human reactions with ease. Therefore, BIG DATA could certainly play a vital role in handling this issue. Thus, the effectiveness of the education process can be enhanced with the help of video analytics using the latest BIG DATA technology.
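The map/reduce stage of such a pipeline can be sketched on toy data: per-frame classifier labels are mapped to (behavior, 1) pairs and reduced to per-behavior counts that can trigger an alert. The behavior labels and alert threshold below are hypothetical.

```python
# Toy map/reduce pass over per-frame behavior labels, mirroring the
# MapReduce counting step described above. Labels and threshold are invented.

from functools import reduce

frames = ["attentive", "clowning", "attentive", "harassing", "clowning"]

mapped = [(label, 1) for label in frames]          # map phase

def reducer(counts, pair):                         # reduce phase
    key, n = pair
    counts[key] = counts.get(key, 0) + n
    return counts

counts = reduce(reducer, mapped, {})
alerts = [b for b, n in counts.items() if b != "attentive" and n >= 2]
print(counts, alerts)
```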

Keywords: big data, cloud, CCTV, education process

Procedia PDF Downloads 226
24734 Assessment of the Manner of Obtaining Hierarchy and Privacy at Traditional House Entrances in Providing a Safe Place - Case Study: Traditional Houses in Shiraz

Authors: Zahra A. Barzegar, Maryam B. Golboo

Abstract:

In this paper, the manner of obtaining hierarchy and privacy at the entrances of traditional houses in providing a safe place is evaluated by qualitative-descriptive methods, with six old houses in the city of Shiraz as the case studies. The houses of Shiraz, like houses in other cities of Iran, are a response to climate and physical features. The old part of Shiraz has a compressed and dense texture in which the houses stand on narrow and tight alleys. In this regard, the principles of traditional house entrance design are introduced. The results show that every house has a private entrance. Most entrances face south, with a turn toward the south-east. The entrance-to-yard path is in no case straight; instead, it is formed by 90-degree turns of the corridor leading to the yard. The vestibule provides a private space for the house, and the entrance stairway to the rooftop is located inside it.

Keywords: entrance, components of entrance, hierarchy, frontage, Shiraz houses

Procedia PDF Downloads 296
24733 Significance of Transient Data and Its Applications in Turbine Generators

Authors: Chandra Gupt Porwal, Preeti C. Porwal

Abstract:

Transient data reveals much about the machine's condition that steady-state data cannot. New technologies make this information much more available for evaluating the mechanical integrity of a machine train. Recent surveys at various stations indicate that simplicity is preferred over completeness in machine audits throughout the power generation industry. This is most clearly shown by the number of rotating machinery predictive maintenance programs in which only steady-state vibration amplitude is trended while important transient vibration data is not even acquired. Efforts have been made to explain what transient data is, its importance, the types of plots used for its display, and its effective utilization for analysis. In order to demonstrate the value of measuring transient data and its practical application in rotating machinery for resolving complex and persistent issues with turbine generators, the author presents a few case studies from Indian power plants: rotor instability caused by the shaft moving toward the bearing centre in a 100 MW LMZ unit in the Northern Capital Region (NCR); heavy misalignment, noticeable especially above 2993 rpm and caused by loose coupling bolts, which prevented a 250 MW KWU unit in the Western Region (WR) from being synchronized for more than four months; and heavy preload at the intermediate-pressure turbine (IPT) bearing near the HP-IP coupling, caused by high points on the coupling faces, at a 500 MW KWU unit in the Northern Region (NR).

Keywords: transient data, steady-state data, intermediate-pressure turbine, high points

Procedia PDF Downloads 43
24732 Virtual Computing Lab for Phonics Development among Deaf Students

Authors: Ankita R. Bansal, Naren S. Burade

Abstract:

The idea is to create a cloud-based virtual lab for deaf students, “A language acquisition program using Visual Phonics and Cued Speech”, using VMware Virtual Lab. This lab will demonstrate to students the sounds of letters associated with the language, building letter blocks, making words, etc. Virtual labs are used for demos and training, and for the lingual development of children in their vernacular language. The main potential benefits are reduced labour and hardware costs and faster response times for users. Virtual computing labs allow any of the software-as-a-service, virtualization, and terminal services solutions available today to be offered as a service on demand, where a single instance of the software runs on the cloud and serves multiple end users. VMware, XEN, MS Virtual Server, Virtuoso, and Citrix are typical examples.

Keywords: visual phonics, language acquisition, vernacular language, cued speech, virtual lab

Procedia PDF Downloads 583
24731 High Temperature Creep Analysis for Lower Head of Reactor Pressure Vessel

Authors: Dongchuan Su, Hai Xie, Naibin Jiang

Abstract:

Under severe accident cases, the nuclear reactor core may melt down inside the lower head of the reactor pressure vessel (RPV). Retaining the melt pool inside the RPV is an important strategy of severe accident management. During this process, the inner wall of the lower head is heated to a high temperature of around a thousand degrees centigrade, while the outer wall is immersed in a large amount of cooling water. The material of the lower head suffers serious creep damage under the high temperature and the temperature difference, and this poses a great threat to the integrity of the RPV. In this paper, the ANSYS program is employed to build a finite element method (FEM) model of the lower head. The creep phenomenon is simulated under a severe accident case, the time-dependent strain and stress distributions are obtained, and the creep damage of the lower head is investigated. The integrity of the RPV is evaluated, providing a theoretical basis for the optimized design and safety assessment of the RPV.
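A sketch of the kind of creep law such an FEM analysis rests on may help. Norton's power law, strain rate = A·σⁿ·exp(−Q/(R·T)), is one of the creep models commonly used in such simulations; the material constants below are invented for illustration, not RPV steel data.

```python
# Hedged sketch of a Norton power-law creep rate, showing how the rate
# responds to the inner/outer wall temperature difference. A, n, and Q
# are hypothetical, not material properties of the RPV lower head.

import math

A, n, Q = 1e-20, 5.0, 3.0e5   # invented Norton constants (Q in J/mol)
R = 8.314                     # universal gas constant, J/(mol K)

def creep_rate(sigma_pa, temp_k):
    """Secondary creep strain rate per Norton's law."""
    return A * sigma_pa**n * math.exp(-Q / (R * temp_k))

# Creep accelerates sharply with wall temperature at the same stress:
inner = creep_rate(50e6, 1273.0)   # hot inner wall (~1000 C)
outer = creep_rate(50e6, 423.0)    # water-cooled outer wall
print(inner > outer)
```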

Keywords: severe accident, lower head of RPV, creep, FEM

Procedia PDF Downloads 215
24730 A Comparison of Methods for Neural Network Aggregation

Authors: John Pomerat, Aviv Segev

Abstract:

Recently, deep learning has had many theoretical breakthroughs. For deep learning to be successful in the industry, however, there need to be practical algorithms capable of handling many real-world hiccups preventing the immediate application of a learning algorithm. Although AI promises to revolutionize the healthcare industry, getting access to patient data in order to train learning algorithms has not been easy. One proposed solution to this is data-sharing. In this paper, we propose an alternative protocol, based on multi-party computation, to train deep learning models while maintaining both the privacy and security of training data. We examine three methods of training neural networks in this way: transfer learning, average ensemble learning, and series network learning. We compare these methods to the equivalent model obtained through data-sharing across two different experiments. Additionally, we address the security concerns of this protocol. While the motivating example is healthcare, our findings regarding multi-party computation of neural network training are purely theoretical and have use-cases outside the domain of healthcare.
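The average-ensemble idea can be sketched in miniature: each party trains its own model locally and only the weights are averaged, so raw patient data never leaves its owner. Tiny linear models stand in for deep networks here, and all numbers are hypothetical.

```python
# Minimal sketch of average ensemble learning across two parties.
# Each "model" is just a weight vector; real deep networks would average
# layer by layer in the same fashion.

hospital_a = [0.20, -0.50, 0.90]   # weights learned on site A's data
hospital_b = [0.40, -0.30, 0.70]   # weights learned on site B's data

def average_weights(*models):
    """Element-wise mean of the parties' weight vectors."""
    return [sum(ws) / len(ws) for ws in zip(*models)]

def predict(weights, features):
    """Linear model output for one feature vector."""
    return sum(w * x for w, x in zip(weights, features))

merged = average_weights(hospital_a, hospital_b)
print(merged)
print(predict(merged, [1.0, 1.0, 1.0]))
```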

Keywords: neural network aggregation, multi-party computation, transfer learning, average ensemble learning

Procedia PDF Downloads 143
24729 Extracting Terrain Points from Airborne Laser Scanning Data in Densely Forested Areas

Authors: Ziad Abdeldayem, Jakub Markiewicz, Kunal Kansara, Laura Edwards

Abstract:

Airborne Laser Scanning (ALS) is one of the main technologies for generating high-resolution digital terrain models (DTMs). DTMs are crucial to several applications, such as topographic mapping, flood zone delineation, geographic information systems (GIS), hydrological modelling, spatial analysis, etc. A laser scanning system generates an irregularly spaced three-dimensional cloud of points. Raw ALS data contain both ground points (that represent the bare earth) and non-ground points (that represent buildings, trees, cars, etc.). Removing all the non-ground points from the raw data is referred to as filtering. Filtering heavily forested areas is considered a difficult and challenging task, as the canopy stops laser pulses from reaching the terrain surface. This research presents an approach for removing non-ground points from raw ALS data in densely forested areas. Smoothing splines are exploited to interpolate and fit the noisy ALS data. The presented filter utilizes a weight function to allocate weights to each point of the data. Furthermore, unlike most methods, the presented filtering algorithm is designed to be automatic. Three different forested areas in the United Kingdom are used to assess the performance of the algorithm. The results show that the DTMs generated from the filtered data are accurate (when compared against reference terrain data) and that the performance of the method is stable for all the heavily forested data samples. The average root mean square error (RMSE) value is 0.35 m.
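The weight-function idea can be illustrated in one dimension: a smooth surface is fitted to the points, points far above the fit (likely canopy) are down-weighted, and the fit is repeated until it settles onto the terrain. A straight-line fit stands in for the smoothing spline below, and the point data and threshold are invented.

```python
# Simplified 1-D sketch of iterative weighted ground filtering. A weighted
# least-squares line replaces the smoothing spline used in the paper.

def weighted_line_fit(xs, zs, ws):
    """Weighted least-squares fit z = slope*x + intercept."""
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw
    mz = sum(w * z for w, z in zip(ws, zs)) / sw
    var = sum(w * (x - mx) ** 2 for w, x in zip(ws, xs))
    cov = sum(w * (x - mx) * (z - mz) for w, x, z in zip(ws, xs, zs))
    slope = cov / var
    return slope, mz - slope * mx

# Flat terrain near z = 0 with two canopy returns well above it.
xs = [0, 1, 2, 3, 4, 5]
zs = [0.0, 0.1, 8.0, 0.0, 7.5, 0.1]
ws = [1.0] * len(xs)
for _ in range(5):                  # iterate: refit, then down-weight
    slope, icpt = weighted_line_fit(xs, zs, ws)
    ws = [0.01 if z - (slope * x + icpt) > 1.0 else 1.0
          for x, z in zip(xs, zs)]

ground = [i for i, w in enumerate(ws) if w == 1.0]
print(ground)   # canopy returns at indices 2 and 4 are rejected
```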

Keywords: airborne laser scanning, digital terrain models, filtering, forested areas

Procedia PDF Downloads 127
24728 Thermodynamics of Water Condensation on an Aqueous Organic-Coated Aerosol Aging via Chemical Mechanism

Authors: Yuri S. Djikaev

Abstract:

A large subset of aqueous aerosols can be initially (immediately upon formation) coated with various organic amphiphilic compounds whereof the hydrophilic moieties are attached to the aqueous aerosol core while the hydrophobic moieties are exposed to the air thus forming a hydrophobic coating thereupon. We study the thermodynamics of water condensation on such an aerosol whereof the hydrophobic organic coating is being concomitantly processed by chemical reactions with atmospheric reactive species. Such processing (chemical aging) enables the initially inert aerosol to serve as a nucleating center for water condensation. The most probable pathway of such aging involves atmospheric hydroxyl radicals that abstract hydrogen atoms from hydrophobic moieties of surface organics (first step), the resulting radicals being quickly oxidized by ubiquitous atmospheric oxygen molecules to produce surface-bound peroxyl radicals (second step). Taking these two reactions into account, we derive an expression for the free energy of formation of an aqueous droplet on an organic-coated aerosol. The model is illustrated by numerical calculations. The results suggest that the formation of aqueous cloud droplets on such aerosols is most likely to occur via Köhler activation rather than via nucleation. The model allows one to determine the threshold parameters necessary for their Köhler activation. Numerical results also corroborate previous suggestions that one can neglect some details of aerosol chemical composition in investigating aerosol effects on climate.
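For readers unfamiliar with the activation criterion, the classical Köhler curve and its critical radius can be sketched numerically. In classical Köhler theory the equilibrium supersaturation over a droplet of radius r obeys ln(S) = A/r − B/r³ (Kelvin term minus Raoult term), peaking at r_c = √(3B/A). The coefficients below are illustrative, not derived from the paper's model.

```python
# Sketch of the classical Kohler activation criterion. A and B are
# hypothetical Kelvin/Raoult coefficients, chosen only for illustration.

import math

A, B = 1.2e-9, 8.0e-26   # illustrative coefficients (units: m and m^3)

def ln_supersaturation(r):
    """ln(S) along the Kohler curve for droplet radius r (in metres)."""
    return A / r - B / r**3

r_c = math.sqrt(3 * B / A)          # radius of the Kohler-curve maximum
ln_S_c = ln_supersaturation(r_c)    # critical supersaturation for activation

# The maximum is genuinely at r_c: nearby radii sit below the peak.
print(ln_supersaturation(0.9 * r_c) < ln_S_c > ln_supersaturation(1.1 * r_c))
```

A droplet pushed past r_c by ambient supersaturation above exp(ln_S_c) grows spontaneously, which is the activation the conclusions refer to.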

Keywords: aqueous aerosols, organic coating, chemical aging, cloud condensation nuclei, Köhler activation, cloud droplets

Procedia PDF Downloads 374
24727 Relative Importance of Different Mitochondrial Components in Maintaining the Barrier Integrity of Retinal Endothelial Cells: Implications for Vascular-associated Retinal Diseases

Authors: Shaimaa Eltanani, Thangal Yumnamcha, Ahmed S. Ibrahim

Abstract:

Purpose: Mitochondrial dysfunction is central to breaking the barrier integrity of retinal endothelial cells (RECs) in various blinding eye diseases such as diabetic retinopathy and retinopathy of prematurity. Therefore, we aimed to dissect the role of different mitochondrial components, specifically those of oxidative phosphorylation (OxPhos), in maintaining the barrier function of RECs. Methods: Electric cell-substrate impedance sensing (ECIS) technology was used to assess in real time the role of different mitochondrial components in the total impedance (Z) of human RECs (HRECs) and its components, the capacitance (C) and the total resistance (R). HRECs were treated with specific mitochondrial inhibitors that target different steps in OxPhos: rotenone for complex I, oligomycin for ATP synthase, and FCCP for uncoupling OxPhos. Furthermore, the data were modeled to investigate the effects of these inhibitors on the three parameters that govern the total resistance of cells: cell-cell interactions (Rb), cell-matrix interactions (α), and cell membrane capacitance (Cm). Results: Rotenone (1 µM) produced the greatest reduction in Z, followed by FCCP (1 µM), whereas no reduction in Z was observed after treatment with oligomycin (1 µM). Following this, we deconvoluted the effect of these inhibitors on Rb, α, and Cm. Firstly, rotenone (1 µM) completely abolished the resistance contribution of Rb, as Rb became zero immediately after the treatment. Secondly, FCCP (1 µM) eliminated the resistance contribution of Rb only after 2.5 hours and increased Cm without a considerable effect on α. Lastly, oligomycin had the lowest impact among these inhibitors on Rb, which became similar to the control group by the end of the experiment, without noticeable effects on Cm or α. Conclusion: These results demonstrate differential roles for complex I, complex V, and the coupling of OxPhos in maintaining the barrier functionality of HRECs, with complex I being the most important component in regulating the barrier functionality and spreading behavior of HRECs. Such differences can be used in investigating gene expression as well as in screening selective agents that improve the functionality of complex I, for use in therapeutic approaches for treating REC-related retinal diseases.
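The impedance decomposition can be illustrated with a deliberately simplified circuit model: a series resistance and capacitance standing in for the junctional resistance Rb and membrane capacitance Cm. This is not the full ECIS model used in the paper, and all component values and frequencies below are invented.

```python
# Highly simplified stand-in for the ECIS readout: the cell layer as a
# resistance in series with a capacitance. A treatment that collapses the
# junctional resistance (as rotenone does above) lowers |Z|.

import math

def impedance_magnitude(r_ohm, c_farad, freq_hz):
    """|Z| of a series RC at the given measurement frequency."""
    omega = 2 * math.pi * freq_hz
    z = r_ohm + 1 / (1j * omega * c_farad)
    return abs(z)

healthy = impedance_magnitude(4000.0, 1e-9, 4000.0)  # intact junctions
treated = impedance_magnitude(500.0, 1e-9, 4000.0)   # junctional R collapsed
print(healthy > treated)
```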

Keywords: human retinal endothelial cells (HRECs), rotenone, oligomycin, FCCP, oxidative phosphorylation (OxPhos), capacitance, impedance, ECIS modeling, Rb resistance, α resistance, barrier integrity

Procedia PDF Downloads 82
24726 Impact of Audit Committee on Earning Quality of Listed Consumer Goods Companies in Nigeria

Authors: Usman Yakubu, Muktar Haruna

Abstract:

The paper examines the impact of the audit committee on the earning quality of the listed consumer goods sector in Nigeria. The study used data collected from the annual reports and accounts of the 13 sampled companies for the period 2007 to 2018. Data were analyzed by means of descriptive statistics to provide summary statistics for the variables; also, correlation analysis was carried out using the Pearson correlation technique for the correlation between the dependent and independent variables. Regression was employed using the Generalized Least Squares technique, since the data have both time-series and cross-sectional attributes (panel data). It was found that the audit committee had a positive and significant influence on earning quality in the listed consumer goods companies in Nigeria. Thus, the study recommends that competency and personal integrity should be key attributes considered when constituting the committee, as this could enhance the quality of accounting information. In addition, the majority of committee members should be independent directors, in order to allow a high level of independence to be exercised.

Keywords: earning quality, corporate governance, audit committee, financial reporting

Procedia PDF Downloads 152
24725 A Pervasive System Architecture for Smart Environments in Internet of Things Context

Authors: Patrick Santos, João Casal, João Santos, Luis Varandas, Tiago Alves, Carlos Romeiro, Sérgio Lourenço

Abstract:

Nowadays, technology makes it possible, on the one hand, to communicate with various objects of daily life through the Internet and, on the other, to have these objects interact with each other through this channel. Simultaneously, with the rise of smartphones as the most ubiquitous technology in people's lives, new agents emerge for these devices: Intelligent Personal Assistants. These agents aim to help users manage and organize their information, as well as to support them in their day-to-day tasks. Another emergent concept is Cloud Computing, which moves computation and storage off users' devices, bringing benefits in terms of performance, security, interoperability, and others. Connecting these three paradigms, in this work we propose an architecture for an intelligent system that provides an interface assisting the user in smart environments: informing, suggesting actions, and allowing the objects of his/her daily life to be managed.

Keywords: internet of things, cloud, intelligent personal assistant, architecture

Procedia PDF Downloads 494
24724 The Ephemeral Re-Use of Cultural Heritage: The Incorporation of the Festival Phenomenon Within Monuments and Archaeological Sites in Lebanon

Authors: Joe Kallas

Abstract:

It is now widely accepted that the preservation of cultural heritage must go beyond simple restoration and renovation actions. While some historic monuments have been preserved for millennia, many others, less important or simply neglected for lack of money, have disappeared. As a result, adapting monuments and archaeological sites to new functions allows them to 'survive'. Temporary activities, or 'ephemeral' re-use, are increasingly recognized as a means of vitalizing deprived areas and enhancing historic sites that have become obsolete. They have the potential to increase economic and cultural value while making the best use of existing resources. However, there are often conservation and preservation issues related to the implementation of this type of re-use, which can also threaten the integrity and authenticity of archaeological sites and monuments if it is not properly managed. This paper aims to build a better knowledge of the ephemeral re-use of heritage, and more specifically of the incorporation of the festival phenomenon within the monuments and archaeological sites of Lebanon, a topic that has not yet been sufficiently studied. The paper determines the elements that compose this phenomenon in order to analyze it and trace its good practices, comparing international case studies with important national ones: the International Festival of Baalbek, the International Festival of Byblos, and the International Festival of Beiteddine. Various factors were studied and analyzed in order to respond to the main problematic of this paper: 'How can we preserve the integrity of sites and monuments after the integration of an ephemeral function? And what preventive conservation measures should be taken when holding festivals in archaeological sites with fragile structures?' The impacts of the technical problems were first analyzed using various data, more particularly the effects of mass tourism, the integration of temporary installations, sound vibrations, and unstudied lighting, up to the mystification of heritage. Unfortunately, the DGA (General Directorate of Antiquities in Lebanon) does not specify any frequency limit for the sound vibrations emitted by loudspeakers during musical festivals. In addition, it imposes no requirements on the installation of lighting systems in historic monuments, and no monitoring is done in situ, owing to a lack of awareness of the impact that such interventions can generate and to a lack of the materials and tools needed for the monitoring process. The study and analysis of the various data mentioned above led to the main objective of this paper: the establishment of a list of recommendations. This list defines various preventive conservation measures to be taken during the holding of festivals within the cultural heritage sites of Lebanon. We strongly hope that this paper will serve as an awareness document, prompting consideration of several factors previously neglected, in order to improve conservation practices in archaeological sites and monuments during the incorporation of the festival phenomenon.

Keywords: archaeology, authenticity, conservation, cultural heritage, festival, historic sites, integrity, monuments, tourism

Procedia PDF Downloads 104
24723 Removal of an Acid Dye from Water Using Cloud Point Extraction and Investigation of Surfactant Regeneration by pH Control

Authors: Ghouas Halima, Haddou Boumedienne, Jean Peal Cancelier, Cristophe Gourdon, Ssaka Collines

Abstract:

This work concerns the coacervate extraction of an industrial dye, Bezanyl Green F2B, from an aqueous solution by the readily biodegradable nonionic surfactants Lutensol AO7 and TX-114. Binary water/surfactant and pseudo-binary (in the presence of solute) phase diagrams were plotted. The extraction results as a function of surfactant wt.% and temperature are expressed by four quantities: the percentage of solute extracted (E%), the residual concentrations of solute and surfactant in the dilute phase (Xs,w and Xt,w, respectively), and the volume fraction of coacervate at equilibrium (Φc). For each parameter, whose values are determined by a design of experiments, these results are subjected to empirical smoothing in three dimensions. The aim of this study is to find the best compromise between E% and Φc. E% increases with surfactant concentration and temperature; under optimal conditions, the extraction extent of the dye reaches 98% and 96% using TX-114 and Lutensol AO7, respectively. The effect of adding sodium sulfate or cetyltrimethylammonium bromide (CTAB) is also studied. Finally, the possibility of recycling the surfactant is demonstrated.
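As an illustration of the first of the four quantities above, the percentage of solute extracted follows directly from the initial and residual solute concentrations. The sketch below is a minimal illustration, not code from the study; the function name and units are assumptions.

```python
def extraction_percent(c_initial, c_residual):
    """Percentage of solute extracted into the coacervate phase:
    E% = 100 * (C0 - Cw) / C0, where C0 is the initial solute
    concentration and Cw the residual concentration in the dilute phase."""
    if c_initial <= 0:
        raise ValueError("initial concentration must be positive")
    return 100.0 * (c_initial - c_residual) / c_initial

# Example: a dye solution reduced from 50 mg/L to 1 mg/L in the dilute phase
print(round(extraction_percent(50.0, 1.0), 1))  # 98.0
```

A residual concentration of 1 mg/L from an initial 50 mg/L corresponds to the 98% extraction extent reported for TX-114 under optimal conditions.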

Keywords: extraction, cloud point, nonionic surfactant, Bezanyl Green

Procedia PDF Downloads 105
24722 Evaluation and Analysis of the Secure E-Voting Authentication Preparation Scheme

Authors: Nidal F. Shilbayeh, Reem A. Al-Saidi, Ahmed H. Alsswey

Abstract:

In this paper, we present an evaluation and analysis of the E-Voting Authentication Preparation Scheme (EV-APS). EV-APS applies several modified security mechanisms that strengthen its security measures and add a strong wall of protection, satisfying confidentiality, non-repudiation, and authentication requirements. These modified mechanisms include the Kerberos authentication protocol, the PVID scheme, responder certificate validation, and a converted Ferguson e-cash protocol. The authentication and privacy requirements have been evaluated and proved: authentication guarantees that only eligible and authorized voters are permitted to vote, and privacy guarantees that all votes are kept secret. An evaluation and analysis of these security requirements is given. The modified mechanisms help filter unauthorized votes from the counter buffer by ensuring that only authorized voters are permitted to vote.

Keywords: e-voting preparation stage, blind signature protocol, nonce-based authentication scheme, Kerberos authentication protocol, pseudo voter identity (PVID) scheme

Procedia PDF Downloads 277
24721 MULTI-FLGANs: Multi-Distributed Adversarial Networks for Non-Independent and Identically Distributed Distribution

Authors: Akash Amalan, Rui Wang, Yanqi Qiao, Emmanouil Panaousis, Kaitai Liang

Abstract:

Federated learning is an emerging concept in the domain of distributed machine learning. It has enabled Generative Adversarial Networks (GANs) to benefit from rich distributed training data while preserving privacy. However, in a non-IID setting, current federated GAN architectures are unstable, struggle to learn distinct features, and are vulnerable to mode collapse. In this paper, we propose MULTI-FLGAN, an architecture that addresses low-quality images, mode collapse, and instability on non-IID datasets. Our results show that MULTI-FLGAN is four times as stable and performant (i.e., achieves a higher inception score) on average over 20 clients compared to the baseline FLGAN.
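Federated GAN training rests on an aggregation step in which each client's locally trained parameters are combined on a server. The sketch below shows a FedAvg-style weighted average, a common choice for such aggregation; it is an illustrative assumption, not the paper's MULTI-FLGAN aggregation rule.

```python
def federated_average(client_weights, client_sizes):
    """FedAvg-style aggregation: weight each client's parameters by its
    local dataset size and average. In a federated GAN, the generator and
    discriminator parameters would each be aggregated this way per round."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    aggregated = [0.0] * n_params
    for weights, size in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            aggregated[i] += w * (size / total)
    return aggregated

# Three clients with different data volumes (a simple non-IID situation)
clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [100, 100, 200]
print(federated_average(clients, sizes))  # [3.5, 4.5]
```

The size weighting lets clients with more data contribute more to the global model, which matters precisely in the non-IID settings the paper targets.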

Keywords: federated learning, generative adversarial network, inference attack, non-IID data distribution

Procedia PDF Downloads 136
24720 Open Source Cloud Managed Enterprise WiFi

Authors: James Skon, Irina Beshentseva, Michelle Polak

Abstract:

WiFi solutions come in two major classes. The first is Small Office/Home Office (SOHO) WiFi, characterized by inexpensive WiFi routers with one or two service set identifiers (SSIDs) and a single shared passphrase. These access points provide no significant user management or monitoring, and no aggregation of monitoring and control across multiple routers. The second class is managed enterprise WiFi, which involves expensive access points (APs) along with costly local or cloud-based management components. These solutions typically provide portal-based login, per-user virtual local area networks (VLANs), and sophisticated monitoring and control across large groups of APs. The cost of deploying and managing such enterprise solutions is typically about tenfold that of inexpensive consumer APs. Low-revenue organizations such as schools, non-profits, non-governmental organizations (NGOs), small businesses, and even homes cannot easily afford quality enterprise WiFi, though they may need to provide quality WiFi access to their population; relying on lower-cost WiFi solutions can significantly reduce their ability to provide reliable, secure network access. This project explored and created a new approach for providing secure managed enterprise WiFi based on low-cost hardware combined with both new and existing (but modified) open source software. The solution provides a cloud-based management interface that allows organizations to aggregate the configuration and management of small, medium, and large WiFi deployments. It uses a novel approach to user management, giving each user a unique passphrase. It provides unlimited SSIDs across an unlimited number of WiFi zones and the ability to place each user (and all their devices) on their own VLAN. With proper configuration it can even provide user-local services.
It also allows users' usage and quality of service to be monitored, and users to be added, enabled, and disabled at will. As implied above, the ultimate goal is to free organizations with limited resources from the expense of commercial enterprise WiFi while providing most of the qualities of such a managed solution at a fraction of the cost.
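The per-user passphrase and per-user VLAN model described above can be sketched as a small directory that assigns each enrolled user a unique passphrase and a dedicated VLAN ID. All names and the enrollment flow below are illustrative assumptions, not the project's actual API.

```python
import secrets

class UserDirectory:
    """Minimal sketch of per-user passphrase -> VLAN assignment, the core
    idea behind the per-user WiFi model described in the abstract."""

    def __init__(self, base_vlan=100):
        self._users = {}          # username -> (passphrase, vlan)
        self._next_vlan = base_vlan

    def add_user(self, username):
        # Each user gets a unique, randomly generated passphrase and
        # a VLAN of their own for all of their devices.
        passphrase = secrets.token_urlsafe(12)
        vlan = self._next_vlan
        self._next_vlan += 1
        self._users[username] = (passphrase, vlan)
        return passphrase, vlan

    def vlan_for(self, username):
        return self._users[username][1]

    def disable_user(self, username):
        # Removing the entry revokes the passphrase, cutting off access.
        self._users.pop(username, None)

directory = UserDirectory()
directory.add_user("alice")
directory.add_user("bob")
print(directory.vlan_for("alice"), directory.vlan_for("bob"))  # 100 101
```

In a real deployment the passphrase-to-VLAN mapping would be pushed to the APs (for example via a per-user PSK file or RADIUS attributes); the sketch only captures the bookkeeping.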

Keywords: wifi, enterprise, cloud, managed

Procedia PDF Downloads 71
24719 Machinability Study of A201-T7 Alloy

Authors: Onan Kilicaslan, Anil Kabaklarli, Levent Subasi, Erdem Bektas, Rifat Yilmaz

Abstract:

Aluminum-copper casting alloys are well known for their high mechanical strength, especially compared to the more commonly used aluminum-silicon alloys. A201 has one of the best strength-to-weight ratios among aluminum alloys, which makes it suitable for premium-quality casting applications in the aerospace and automotive industries. A201 is reported to have low castability but to be easy to machine; nevertheless, the feasible machining process window still needs to be determined. This research investigates the machinability of the A201 alloy after T7 heat treatment in terms of chip/burr formation, surface roughness, hardness, and microstructure. The samples were cast using the low-pressure sand casting method, and milling experiments were performed with uncoated carbide tools at different cutting speeds and feeds. Statistical analysis was used to correlate the machining parameters with surface integrity. It was found that machinability depends strongly on the cutting conditions, and a process window was determined.

Keywords: A201-T7, machinability, milling, surface integrity

Procedia PDF Downloads 179
24718 Machine Learning Methods for Network Intrusion Detection

Authors: Mouhammad Alkasassbeh, Mohammad Almseidin

Abstract:

Network security engineers work to keep services available at all times by handling intruder attacks. An Intrusion Detection System (IDS) is one of the available mechanisms used to detect and classify abnormal actions. The IDS must therefore always be up to date with the latest intruder attack signatures to preserve the confidentiality, integrity, and availability of services. The speed of the IDS is as important as its ability to learn new attacks. This research work illustrates how the Knowledge Discovery in Databases (KDD) dataset is well suited for testing and evaluating different machine learning techniques. It focuses mainly on the KDD preprocessing step in order to prepare a decent and fair experimental dataset. The J48, MLP, and Bayes Network classifiers were chosen for this study. The J48 classifier achieved the highest accuracy in detecting and classifying all KDD dataset attack types: DOS, R2L, U2R, and PROBE.
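To illustrate the classification task, the sketch below fits a minimal Gaussian naive Bayes model to toy connection records. It stands in for the Bayes-family classifier evaluated in the paper; the toy features and labels are assumptions, not the KDD data or the paper's exact models.

```python
import math
from collections import defaultdict

def train_gaussian_nb(X, y):
    """Fit per-class feature means, variances, and priors: a minimal
    Gaussian naive Bayes on numeric connection features."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    n, stats = len(X), {}
    for cls, rows in by_class.items():
        cols = list(zip(*rows))
        means = [sum(c) / len(rows) for c in cols]
        varis = [max(sum((v - m) ** 2 for v in c) / len(rows), 1e-9)
                 for c, m in zip(cols, means)]
        stats[cls] = (means, varis, len(rows) / n)
    return stats

def predict(stats, x):
    """Pick the class with the highest log-posterior."""
    best, best_lp = None, -math.inf
    for cls, (means, varis, prior) in stats.items():
        lp = math.log(prior)
        for v, m, var in zip(x, means, varis):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = cls, lp
    return best

# Toy connection records: [duration, bytes_sent]; labels: normal vs. DOS
X = [[1, 100], [2, 120], [50, 9000], [60, 9500]]
y = ["normal", "normal", "dos", "dos"]
model = train_gaussian_nb(X, y)
print(predict(model, [55, 9200]))  # dos
```

In the study itself, classifiers of this family were trained on the preprocessed KDD records and compared against J48 and MLP by accuracy.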

Keywords: IDS, DDoS, MLP, KDD

Procedia PDF Downloads 222
24717 Risk Based Inspection and Proactive Maintenance for Civil and Structural Assets in Oil and Gas Plants

Authors: Mohammad Nazri Mustafa, Sh Norliza Sy Salim, Pedram Hatami Abdullah

Abstract:

Civil and structural assets normally have a design life of more than 30 years on average and are normally subject to a slow degradation process. Because repair and strengthening work for these assets generally does not depend on plant shutdown, their maintenance and integrity restoration have mostly been done on an 'as required' and 'run to failure' basis. Unlike in other industries, however, exposure in the oil and gas environment is harsher as a result of corrosive soil and groundwater, chemical spills, frequent wetting and drying, icing and de-icing, steam and heat, etc. Because of this exposure, and because structural defects and rectification increase in line with the age of plants, asset integrity assessment requires a more defined scope and procedures based on risk and asset criticality. This leads to the establishment of a risk-based inspection and proactive maintenance procedure for civil and structural assets. To date there is hardly any procedure or guideline for the integrity assessment and systematic inspection and maintenance of onshore civil and structural assets. Group Technical Solutions has developed a procedure and guideline that take into consideration credible failure scenarios, asset risk and criticality from process safety and structural engineering perspectives, structural importance, and modeling and analysis, among other factors. Detailed inspection that includes destructive and non-destructive tests (DT and NDT) and structural monitoring is also performed to quantify defects, assess their severity and impact on integrity, and identify the timeline for integrity restoration. Each defect and its credible failure scenario is assessed against the risk to people, the environment, reputation, and production loss. This technical paper shares the established procedure and guideline and their execution in oil and gas plants.
In line with the overall roadmap, the procedure and guideline will form part of specialized solutions to increase production and meet the 'Operational Excellence' target while extending the service life of civil and structural assets. As a result of implementation, civil and structural assets are now managed more systematically, and the 'fire-fighting' mode of maintenance is gradually being phased out in favor of a proactive and preventive approach. This paper also sets the criteria, and poses the challenge to the industry, for innovative repair and strengthening methods for civil and structural assets in the oil and gas environment, in line with safety, constructability, and the continuous modification and revamping of plant facilities to meet production demand.

Keywords: assets criticality, credible failure scenario, proactive and preventive maintenance, risk based inspection

Procedia PDF Downloads 381
24716 Development of a Bacterial Resistant Concrete for Use in Low Cost Kitchen Floors

Authors: S. S. Mahlangu, R. K. K. Mbaya, D. D. Delport, H. Van. Zyl

Abstract:

The degrading effect of bacterial growth on the structural integrity of concrete floor surfaces is predictable; it causes the development of surface micro-cracks through which organisms penetrate, resulting in surface spalling. Hence, there is a need to develop a mix design that meets the requirements of floor surfaces exposed to aggressive agents, improving certain material properties while retaining good workability, an extended lifespan, and low cost. In this work, tests were performed to examine microbial activity on kitchen floor surfaces and the effect of adding admixtures. The biochemical tests show the existence of microorganisms (E. coli, Streptococcus) on the newly cast structure. Porosity was reduced by up to 6%, and an improvement in structural integrity was observed, upon adding mineral admixtures to the concrete mortar. The SEM results after 84 days of curing show that chemical admixtures play a significant role in retarding bacterial penetration, and a good-quality structure is achieved.

Keywords: admixture, organisms, porosity, strength

Procedia PDF Downloads 224
24715 The Integration of Patient Health Record Generated from Wearable and Internet of Things Devices into Health Information Exchanges

Authors: Dalvin D. Hill, Hector M. Castro Garcia

Abstract:

A growing number of individuals use wearable devices on a daily basis. The usage and functionality of these devices vary from user to user. One popular use is to track health-related activities, which are typically stored in the device's memory or uploaded to an account in the cloud; following the current trend, the data accumulated by a wearable device are stored in a standalone location. In many cases, this health-related data is not considered in the holistic view of a user's health lifestyle or record. Yet the data generated by wearable and Internet of Things (IoT) devices can serve as empirical information for a medical provider, since the standalone data can add value to a patient's holistic health record. This paper proposes a solution that incorporates the data gathered from these wearable and IoT devices into a patient's Personal Health Record (PHR) stored within a Health Information Exchange (HIE).
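The proposed integration can be pictured as folding standalone device readings into a patient-centric record that an HIE could then expose. The schema below is an illustrative sketch only; real exchanges would use a standard such as HL7 FHIR, and all class and field names here are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WearableReading:
    """One measurement from a wearable/IoT device (illustrative schema)."""
    device_id: str
    metric: str      # e.g. "heart_rate", "steps"
    value: float
    timestamp: str   # ISO 8601 string

@dataclass
class PersonalHealthRecord:
    """A patient's PHR as held within an HIE (sketch, not a standard)."""
    patient_id: str
    observations: List[WearableReading] = field(default_factory=list)

    def ingest(self, reading: WearableReading) -> None:
        # Fold a standalone device reading into the holistic record,
        # so it sits alongside clinical data rather than in isolation.
        self.observations.append(reading)

phr = PersonalHealthRecord("patient-42")
phr.ingest(WearableReading("watch-1", "heart_rate", 72.0,
                           "2023-01-05T08:00:00Z"))
print(len(phr.observations), phr.observations[0].metric)  # 1 heart_rate
```

The key design point is that the reading keeps its device provenance (`device_id`, timestamp) when merged, so a provider can weigh wearable-sourced data appropriately.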

Keywords: electronic health record, health information exchanges, internet of things, personal health records, wearable devices, wearables

Procedia PDF Downloads 111
24714 EECS: Reimagining the Future of Technology Education through Electrical Engineering and Computer Science Integration

Authors: Yousef Sharrab, Dimah Al-Fraihat, Monther Tarawneh, Aysh Alhroob, Ala’ Khalifeh, Nabil Sarhan

Abstract:

This paper explores the evolution of Electrical Engineering (EE) and Computer Science (CS) education in higher education, examining the feasibility of unifying them into a single Electrical Engineering and Computer Science (EECS) program for the technology industry. It delves into the historical reasons for their separation and underscores the need for integration. Emerging technologies such as AI, virtual reality, IoT, cloud computing, and cybersecurity demand an integrated EE and CS program to enhance students' understanding. The study evaluates curriculum integration models, drawing from prior research and case studies, and demonstrates how integration can provide students with a comprehensive knowledge base for industry demands. Successful integration necessitates addressing administrative and pedagogical challenges. For academic institutions considering merging EE and CS programs, the paper offers guidance, advocating a flexible curriculum encompassing foundational courses and specialized tracks in computer engineering, software engineering, bioinformatics, information systems, data science, AI, robotics, IoT, virtual reality, cybersecurity, and cloud computing. Elective courses are emphasized to keep pace with technological advancements. Implementing this integrated approach can prepare students for success in a technologically advanced society that relies on both EE and CS principles.

Keywords: electrical engineering, computer science, EECS, curriculum integration of EE and CS

Procedia PDF Downloads 38
24713 Using a Robot Companion to Detect and Visualize the Indicators of Dementia Progression and Quality of Life of People Aged 65 and Older

Authors: Jeoffrey Oostrom, Robbert James Schlingmann, Hani Alers

Abstract:

This paper describes research into the indicators of dementia progression, the automation of quality-of-life assessments, and their visualization. To this end, the Smart Teddy project was initiated to build a smart companion that both monitors the senior citizen and processes the captured data into an insightful dashboard. With around 50 million diagnoses worldwide, dementia proves again and again to be a heavy strain on the lives of many individuals, their relatives, and society as a whole. In 2015, it was estimated that dementia care cost 818 billion U.S. dollars globally. The Smart Teddy project aims to take away a portion of the burden on caregivers by automating the collection of certain data, such as movement, geolocation, and sound levels. This paper shows that the Smart Teddy has the potential to become a useful tool for caregivers, though not a complete solution. The Smart Teddy still faces some problems in terms of emotional privacy, but its non-intrusive nature and diverse usability can make up for this.

Keywords: dementia care, medical data visualization, quality of life, smart companion

Procedia PDF Downloads 121
24712 AI Features in Netflix

Authors: Dona Abdulwassi, Dhaee Dahlawi, Yara Zainy, Leen Joharji

Abstract:

This paper discusses the relationship between Netflix and artificial intelligence. Netflix applies artificial intelligence, machine learning, and data science through effective and efficient approaches. It personalizes the experience for its users, recommending shows based on what those users have already watched. The researchers conducted an experiment to learn more about how Netflix is used and how AI affects the user experience. The main conclusions of this study are that Netflix offers a wide range of AI features, most users are happy with their Netflix subscriptions, and the majority prefer Netflix to alternative apps.

Keywords: easy accessibility, recommendations, accuracy, privacy

Procedia PDF Downloads 51
24711 Managing Data from One Hundred Thousand Internet of Things Devices Globally for Mining Insights

Authors: Julian Wise

Abstract:

Newcrest Mining is one of the world's top five gold and rare earth mining organizations by production, reserves, and market capitalization. This paper elaborates on the data acquisition processes employed by Newcrest, in collaboration with the Fortune 500 organization Insight Enterprises, to standardize machine learning solutions that process data from over one hundred thousand distributed Internet of Things (IoT) devices located at mine sites globally. Through cloud technologies and edge computing, these developments enable standardized machine learning applications that inform the strategic optimization of mineral processing. The target objectives of the machine learning optimizations include time savings in mineral processing, production efficiencies, risk identification, and increased production throughput. The data acquired for predictive modelling are processed through edge computing and collectively stored within a data lake. The digital transformation has necessitated standardizing the software architecture that manages the machine learning models submitted by vendors, to ensure effective automation and continuous improvement of the mineral process models. Operating at scale, the system processes hundreds of gigabytes of data per day from distributed mine sites across the globe, for improved worker safety and production efficiency through big data applications.

Keywords: mineral technology, big data, machine learning operations, data lake

Procedia PDF Downloads 94
24710 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions

Authors: Joel Niklaus, Matthias Sturmer

Abstract:

The publication of judicial proceedings is a cornerstone of many democracies. It enables the court system to be held accountable by ensuring that justice is administered in accordance with the law. Equally important is privacy, a fundamental human right (Article 12 of the Universal Declaration of Human Rights). It is therefore important that the parties involved in court decisions (especially minors, victims, or witnesses) be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceuticals, the pharmaceutical companies and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test of the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is also very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimate). Consequently, many Swiss courts publish only a fraction of their decisions. An automated anonymization system would reduce these costs substantially, freeing capacity to publish court decisions much more comprehensively. For the re-identification system, topic modeling with Latent Dirichlet Allocation is used to cluster over 500K Swiss court decisions into meaningful related categories.
A comprehensive knowledge base of publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identification. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model together with an identifier, in order to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium; both a legal analysis and the implementation of the proposed system design, ESRA, will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. First, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Second, the anonymization system substantially reduces the cost of manually anonymizing court decisions and thus enables a more comprehensive publication practice.
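The described replacement step, swapping each recognized entity for its category plus an identifier so that context is preserved, can be sketched as follows. The entity list stands in for NER output; the tag format and example text are assumptions, not ESRA's actual design.

```python
import re

def anonymize(text, entities):
    """Replace each recognized named entity with its category plus a
    stable identifier, so references to the same party stay linkable.
    `entities` mimics NER output: a list of (surface_form, category)."""
    ids, counters = {}, {}
    out = text
    for surface, category in entities:
        if surface not in ids:
            counters[category] = counters.get(category, 0) + 1
            ids[surface] = f"[{category}_{counters[category]}]"
        # Replace every occurrence of the entity's surface form.
        out = re.sub(re.escape(surface), ids[surface], out)
    return out

decision = "Hans Muster sued Acme AG. Hans Muster lost."
ner_output = [("Hans Muster", "PERSON"), ("Acme AG", "ORG")]
print(anonymize(decision, ner_output))
# [PERSON_1] sued [ORG_1]. [PERSON_1] lost.
```

Because the same surface form always maps to the same tag, a reader of the anonymized decision can still follow who did what, which is the "preserve context" property the abstract highlights.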

Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, privacy, topic modeling

Procedia PDF Downloads 134